Right now, I'm sitting in a beautiful part of the world: Franconia, a culturally and historically distinct region of Germany, mostly within the boundaries of modern Bavaria.
Or to cut a long story short, I'm on holiday in Germany, and people around me are speaking lots of German.
I've studied a little bit of German, having done most of the Michel Thomas course and 8 or 9 levels of the Duolingo course. When I'm in the shops, the bars and the streets, I keep hearing German, and although I don't understand it, I feel completely like I could. The sounds, the rhythms and even some of the words just feel natural to my ears.
And so I will learn German properly now. I'm not going to wait another month just for the sake of turning it into a New Year's resolution. Previously, I'd been using German as a mostly-unknown language so that I could get a feel for what Duolingo was doing (and I have lots of year-old notes and screenshots on a hard drive somewhere, prep work for a review that I never bothered to write up), so it wasn't a serious push. Time to change that.
This is a feeling I've had before, and it's always been followed by an intense period of focus, because my brain is just ready to soak up all that it needs.
I suppose it's a bit like cycling up a hill: you struggle up a steep bit, then hit a milder uphill that feels almost like a downhill. You're still pedalling, still pushing yourself up, but somehow it feels effortless by comparison with the previous slog.
Ich will Deutsch sprechen können.
(I hope that's the right word order...)
27 November 2014
12 November 2014
Language tends to deteriate dramasticly.
Johnson, the guy who compiled that famous early dictionary of English, once said
"Language is only the instrument of science, and words are but the signs of ideas: I wish, however, that the instrument might be less apt to decay."

There seems to be a value judgement there against the natural processes of language change, but it strikes me as far less clear than others make out.
Is he talking about how languages change in general? Was he lamenting the loss of conservative features such as "thee"/"thou" and the subjunctive conjugations?
Or was he talking about the loss of precision in language, such as the change in meaning of "decimation" from "killing one tenth" via "massacre" to the more general "destruction"?
Either way, language does most definitely change, and the two words in the title, which your spellchecker would probably say aren't words at all, are good examples that I hear with reasonable frequency in my own life.
I don't know how many of you will have worked out their meanings and/or origins from reading them, but please leave a comment to let me know.
Deteriate
Of the two, this is the one I hear most often: from my own mouth, from the mouths of friends, and often even on TV. You probably hear it too. It's simply a contraction of "deteriorate". It occurs in the derived noun too: "deteriation".

The loss of a vowel in the middle of a word is what we call "syncope". Syncope is particularly common where a vowel is sandwiched between two instances of the same consonant. Here, it's the repeated R that triggers the lost syllable.
A more topical example of the same mechanism is the word "quantitative", as in "quantitative easing". Listen to the news, and most reporters will pronounce it fully. In an unscripted interview, though, you may just hear "quantative". Discuss the economy in a bar, and after a couple of glasses, you'll all be saying it that way. Even that form still has two Ts, so one day it might just shrink to "quantive".
What's interesting about "deteriate", though, is the /i/ sound. We haven't lost the "io" from "deteriorate" and simply had the Rs collide; we've lost the "or". Hmmm...
Dramasticly
(Or possibly "dramastically".) This is something I'm not aware of hearing that much. I associate it particularly with my little sister (although I'm aware that several of us in the family have said it), and a few months ago I heard it in the pub in my parents' village, so maybe it's a local thing. I'll keep my ears open.

This word is a confusion between "dramatically" and "drastically", and I was always conscious of that fact. But that doesn't mean it's not a legitimate word. We have a lot of evidence of words "falling together" in multiple languages.
For example, the conjugations of the verb "to go" in Spanish, French and Italian are a mixture of forms from three Latin verbs: ire, vadere and ambulare. But now, in each language, they're just one verb.
A far more recent example of falling together is the term "nailed it".
Most of us would associate that with getting something right/doing it perfectly. In that sense, it derives from the phrase "to hit the nail on the head" and evolved from saying the perfect answer to doing something really well, like "nailing" a jump at a skate park.
On the other hand, we have the management version, where "nailing" something is just getting it finished. It probably derives from the phrase "to nail ((something)) to the wall". That's pretty much the opposite meaning, because that phrase is all about not doing things perfectly. The metaphor is a kitchen cabinet -- you don't care if the door is slightly squint, you just want it on the wall so that the job's finished and everyone can go home.
Both of these long proverbial forms have reduced to the same verb, which can cause misunderstandings.
This sort of change isn't uncommon, though, so you should always be careful about discounting any theory about the origin of a term because of some other theory. It could turn out that both are right....
01 November 2014
Newsflash: unquestioning praise considered harmful!
For years, I found myself defending the teaching profession against what I saw as unwarranted attacks. People accused the culture of praise in modern schooling of being namby-pamby liberal nonsense. I tried to explain that there was a good body of evidence behind it.
But news came this week of a report rubbishing the practice.
Perhaps my mistake was in trying to educate the lay people rather than the teachers. You see, most teachers didn't get it, not because individually they aren't bright, motivated professionals, but because the news came to them filtered through layers of management, focus groups and in-service training teams, each of which reinterpreted what the last had told them in a great game of academic Chinese whispers.
The message that reached the teachers wasn't the message that the psychological studies had given, but something completely different: "Always encourage! Never criticise!" I never believed that could work. I have always hated praise or encouragement when I don't understand something. The difference between me and a "bad learner" is that when this happens, I don't lose faith in myself -- I lose faith in the teacher. (And I have lost a lot of faith in a lot of teachers over the many years I've spent in full-time and part-time education.)
No, the advice was more subtle and nuanced than that.
As I recall it, the observation came first with misbehaving children. It was noted in classroom observations (carefully managed studies involving timing of teacher-pupil interactions) that teachers spent a lot more time berating misbehaving children than they did praising them when they behaved well. There is a school of thought that sees most children's misbehaviour as a call for attention, and by giving children more attention for misbehaving than behaving, on a certain level you reward the bad behaviour. The theory was that by increasing contact time during periods of good behaviour, you would reinforce the fact that good behaviour leads to adult approval.
But it went further than that. The observers noted that the teachers' response when a misbehaving child finally did behave just wasn't positive at all. There was visible relief, and the teachers would actually draw attention to the child's normal poor behaviour. By doing so, the researchers claimed, they were establishing the teachers' low expectations and undermining the pupils' confidence.
Now, can anyone really argue against the idea that we should show kids that we appreciate their good behaviour? Does anyone think that showing a kid that we adults have identified them as a "problem child" has any possible benefits? I doubt it.
So far, so uncontroversial.
But the follow-up to this was that researchers identified similar patterns with children who didn't necessarily misbehave, but just weren't doing well: criticism for getting it wrong, and implied criticism on the occasions they did get it right.
I still think we're in pretty uncontroversial territory here, because the advice is still pretty straightforward: when an underperforming child finally answers a question right, don't say "Thank God! At last you've got one right!"
And, in fact, the most uncontroversial advice from the experts was simply to smile, because they saw teachers who never smiled at a correct answer from an underperforming pupil.
The experts were not calling for uncritical, undeserved praise.
How does that translate into the classroom?
Well, a couple of years ago, I was teaching English in a French university. The French (like the Italians and the Spanish) believe themselves to be incapable of learning languages. I also had the challenge of an extremely mixed-ability first year group in the law faculty, everything from people with no previous experience of English to people I could have sat and talked to for an hour or two without problems. Imagine trying to teach a room of 20+ in that situation. It was not fun.
Often trying to get answers felt like trying to get blood from a stone, and when I finally got a good answer from some of them, the relief was palpable...
... and that was it. I had caught myself falling into the trap that the experts warned about. My attitude was more or less "why couldn't you have said that in the first place?!?" and it was all too obvious. The students were reluctant to give answers, and when they did give them, I did nothing to bolster their confidence.
What I had to do wasn't a simple matter of giving out "well done" stickers, but a complete change in my attitude to the students. I was seeing them as obstacles, as problems, when the problem was the circumstance, and my reaction to it. It was easier to see this as a situation that I could do nothing about than to actually do something.
So I set about the task of finding material that was suitable for everyone (and to a great extent succeeded), but more importantly, I changed my attitude to my students. Instead of feeling relieved when I finally got the correct answer, I felt happy. Instead of dropping my shoulders and saying "why didn't you say that earlier?" I smiled and said "of course! I told you you knew it"... until I had built such a rapport with them that I could start dropping my shoulders again and saying "why didn't you say that earlier?"
Which brings us back to the report, and the suggestion that invariable praise projects low expectations onto the students.
In the end, whatever I did was projecting my expectations onto the students. I always had higher expectations of the students than they had of themselves. But my projection had to satisfy two criteria for the students to accept it: it had to be realistic based on their ability, and it had to be close to their own expectations.
If a student's confidence is five steps behind their ability, there's no point in projecting a confidence that matches their ability -- you have to project one that's one step ahead of theirs, and slowly bring up their confidence.
05 October 2014
What's it like to lose language?
I recently started an online course about children's play with FutureLearn. One of the optional readings for week 1 was an interview with a practising psychologist who had a stroke [journalofplay.com], triggering aphasia from which she has never fully recovered.
There are a lot of intriguing ideas in there. It's a very long article, and it wasn't interesting enough for me to get to the very end, but I figure a few of my regulars would enjoy it anyway.
30 September 2014
Language shaped holes: looking back at my Gaelic learning
After my first couple of intensive (week-long) courses in Scottish Gaelic, I developed an analogy for how I thought language learning worked. Classes, I decided, don't put the language in your head, but instead drill "language-shaped holes" that you can later pour stuff into and, like jelly in a mould, what you get in the end is language.
A lot of my school-level language learning had worked that way too. I studied stuff but really had no conscious command of most of it, yet at some point later I mastered it through a combination of self-study, practice and exposure. Adherents of the "input hypothesis" would say that only the last one counts, but I don't buy it.
The process I went through with Gaelic was all about production. The first courses I took favoured production over input (the teachers were kind of old-school) and outside of those courses, I took part in discussions on internet forums and at a conversation circle.
My strategy at the conversation circle was to read a little of a coursebook every week before going. The book was ordered grammatically, rather than the thematic units of modern TY/Colloquial and the like. This meant that I could pick a feature, read it up, and then practise it as much as possible during the hour-or-so of conversation.
Quite often, these were features that I'd studied in class but forgotten, and I don't think the book explanations alone would have done the trick.
I went kind of blank on the conditional for a long time, but then I started noticing other people using it at the conversation circle. I could not have noticed it, though, if there hadn't been a conditional-shaped hole in my head.
Since then, I've decided that language-shaped holes are not the optimum manner of teaching, but as suboptimal goes, they're pretty good....
19 July 2014
iPad for teachers? No thanks.
My laptop is stuffed. There's some kind of fault in the power circuit and it keeps refusing the mains power. As the battery is nearly exhausted and holds about twenty minutes of charge at most, it's basically unusable.
As the problem worsened, I slowly migrated as much of my daily activity as possible to my iPad. First it was web browsing, then email, and eventually I bought a Python programming environment and the Pages word processor so I could continue developing software and producing material for my students.
There are lots of articles out there that will tell you how wonderful the iPad is for teaching, but these are often little more than superficial lists of frivolous apps for presentations, flashcards and the like.
As a language teacher, though, I find more fundamental aspects of the iPad are instantly a problem: audio, video and file access.
Sound files? No thanks!
I wanted to do an exam simulation using one of the practice papers at www.cityandguildsenglish.com, so I downloaded the paper, the answer scheme and the listening transcript onto my iPad. But all this became a bit futile when I rediscovered that the iPad will not let you download MP3s from websites, preferring to force you to use either the iTunes store or the iTunes app. With the files not being available on iTunes and my PC out of action, I have no way to get any audio or video files I need onto my iPad. For listening exercises, I am now forced to fall back on a rather old Android phone, as it allows me to download anything I want.
Why would you want to access your own files?
Apple have gone out of their way to prevent the iPad from being a computer. In one respect, this was a clever design decision: rather than having the abstract concept of "a file", most file types now exist as documents within their respective applications. There's less confusion for the user, and less danger of malicious or faulty software interfering with the files of other applications.
However, in my current job, I don't do my own printing and photocopying, so I'm always sending multiple worksheets to the course secretary. Without file browser access, I'm currently restricted to going into individual applications, and using the "share" function on individual files to send them as emails. Where once I had one email with 8 attachments, now I have 8 emails with 1 attachment each. This makes life hard for both me and the secretary, as there is a very good chance that one of us will forget something.
Overall
Feel free to tell me about the latest app that has made your life so much easier, but I will never be able to advise other teachers to use a device that complicates the very basics of digital technology for teachers. Most of those apps, or close equivalents, will be available for Android anyway, and Android gives you the power to do what you like with your own data.
Not only that, but the iPad is actually massively overpowered for the basic functions we teachers need (have you seen the complexity of some of the games?), so you're paying for more than you need.
Buy a cheap Android tablet instead - it'll save you money and time.
07 July 2014
Musings on confusings...
Since I first learned I had got the job in Sicily, my Spanish has suffered. The day after the job interview, I was at a Spanish/English language exchange, and I kept dropping words of Italian into my Spanish. The weird thing is that my Spanish was a million times stronger than my Italian then, but somehow my brain had switched "mode".
Obviously, living in Italy for four months has only served to intensify this, with my Spanish now being half-hidden behind fairly broken bits of Italian. My assumption for a long time was that my problem was in my accent -- I still speak Italian with a bit of a Spanish twang. This belief was bolstered by the fact that my Catalan, while being very, very weak from lack of use, didn't seem so badly affected. The Catalan accent is very, very different from Spanish and Italian.
However, I was at a Couchsurfing meeting on Friday night which changed my mind. There was an Andalusian tourist visiting, and when I spoke to her, my accent was more different from the one I use in Italian than I had expected. My brain started playing tricks on me, and I had difficulty speaking Italian when she was in my line-of-sight, and for a while I was wobbling between Italian and Spanish.
But that's not the important thing.
When I was speaking Italian, I got into much deeper and more complex conversations than I normally would, and rather than jamming up as I hit the limits of my Italian, I was automatically switching to Spanish to fill in the gaps. Now, I wasn't just importing words or grammar rules from Spanish into Italian -- no, I was switching into Spanish; conjugations, pronouns and all. As I became aware I was doing this, it dawned on me that I'd been doing it for my whole stay, but normally I'd just not thought about it too much and fallen back to English.
This is a bit of a new sensation... or actually, no. The only new thing is the fact that I was unaware of it. When it was Scottish Gaelic and French, for example, it would be instantly noticeable. The difference here is that the similarity of the languages (including, but not limited to, accent) allowed it to slip through the net on occasions.
The trigger mechanism is the same, regardless of language: hit a gap in your knowledge in one language and the brain will fall back on another. The only difference lies in detection.
This makes me wonder if the only option I have now to get my Spanish back is... to learn more Italian. My theory is that filling in the main gaps in my Italian will not only stop me falling back on Spanish when I run out of Italian, but that as a consequence of this, it will reduce the strength of the linkage between the two, allowing me to speak Spanish without Italian interrupting me.
It looks like I might be practising my Italian a lot, even once I leave Italy...
23 June 2014
Benny Lewis... fluent in 3 months...?
People who know me from sites such as HTLAL will know there's no love lost between me and Benny Lewis. The man always refuses to discuss anything, and takes mortal offence at anyone who doesn't agree unquestioningly with every single word he says.
This is a shame, because Benny has a wealth of experience in language learning, and being able to "mine" this experience would surely reveal a lot of good stuff. Sadly, though, Benny's refusal to engage in any critical analysis of his own performance has led him to maintain a blog that is full of positively charged platitudes and little of practical substance.
Years ago, I borrowed a copy of his "Language Hacking Guide" from a friend, and blitzed through it making notes for a review on this site, but in the end I decided to let it slide and never published anything.
Now, though, the guy has another book out, entitled Fluent in 3 Months and published by no less than Harper Collins, and the press seems to be lapping it up.
Has Benny grown a clue recently? Has he stopped and given any serious thought to the language learning process? The writing on his blog is as devoid of content as ever, so I doubt it.
I'm hoping to get a loan of a copy of the book in a few weeks' time to do my own review, but for now, you can read a very thorough review of the book by the user Big_Dog on the Polydog forum (accompanied by much discussion).
I have to say, most of what he describes strikes a chord with me as typical of Benny's style: woolly definitions, constantly moving goalposts, contradictions, overinterpretations, and claims that are often just downright wrong.
Does Benny say anything useful? Yes, he does, but then again, even a stopped clock is right twice a day.
14 June 2014
Let's be honest with ourselves....
Well, in the last few weeks it has dawned on me how little I have achieved in 3 months in Sicily.
I learned a few previously half-learned, half-forgotten verb suffixes and a handful of new words. I noticed a little bit about differences in vowels between Italian and other Romance languages. But that was about it.
So a couple of weeks ago I bought a pile of DVDs. I now have five TV series and a couple of films, totalling about 70 hours. I've watched two of the series and my brain feels far more comfortable with Italian already.
Why didn't I do this earlier?
Well, I was stressed out with work. But that's a weak excuse, because I still had loads of free time that I was wasting on the internet and silly computer games. I could have been spending that time far more productively, and I would have relaxed more.
I've now got a month and a half left, and I need to improve quickly, to make the most of my stay.
I never learned any Sicilian, and I think I'll have to give up hope on that front, because I was relying on my software to help me with that, and it's still not quite working.
But I will leave with better Italian, I have to.
30 May 2014
Return to forumland: Polydog.org
It's been over a year since I stopped using language learning forums on the internet. I was banned from one, and it was the right time for it to happen, as it had become my main method of procrastination.
But recently I got invited to join a new forum, polydog.org. It seemed like the right time to get back into the community proper, so I signed up. If I'm honest with myself, my posts have slowly got less and less interesting, because there's nothing to challenge my views if I don't get involved in discussions, and without that challenge, how are my views going to develop and change?
So it's time to talk. Lots.
28 May 2014
An observation on the order of teaching (from English)
As I recently said, I've been experimenting with Michel Thomas-like techniques in the classroom.
One of the crucial elements of Michel Thomas's teaching, seemingly forgotten by the teachers who came after him, is to address student errors in complicated sentences by reverting to simpler, related sentences and then rebuilding the complexity. This technique is effective because it builds and reinforces the underlying structural concepts in his students' minds, as opposed to just giving answers that don't train any linkage between structures.
But it turns out that doing this mindfully also presents the teacher with a hell of a lot of information about what is difficult and easy for students, and hence what order things should be taught in.
With my current MT-style student, my first divergence from standard teaching order was to focus on auxiliary verb-based tenses before the simple past and present, and this did seem effective.
Yesterday, I was trying to revise and solidify the whole pattern of positive declarative vs negative declarative vs interrogative. Now in English, there is only one main pattern, which has three variations: "to be" vs simple aspect vs auxiliaries.
I was eliciting each form from her to build up a table like this:
To be | Simple aspect | Auxiliary tenses |
I'm here. | I like it. | I'll do it. |
I'm not there. | I don't like it. | I won't do it. |
Are you there? | Do you like it? | Will you do it? |
I was here. | I liked it. | I can do it. |
I wasn't there. | I didn't like it. | I can't do it. |
Were you here? | Did you like it? | Can you do it? |
Nothing spectacular. I've always taught "two verbs in the negative and the question, unless the verb is 'be'," and that's what I was trying to show. What was different was my student's errors: she tried to say "I can to do it" -- a mistake I thought I'd wiped out ages ago. At that point, I instinctively moved on to the negative, because I subconsciously remembered that she didn't make this mistake in the negative: she knew the difference between "I don't like to"/"I don't want to" and "I can't"/"I won't".
What I realised (and scribbled down in about three places) was that despite constantly revisiting these structures, the original emphasis on the present had created an erroneous link between the two structures, as they look very, very similar; but since that similarity doesn't carry through to the negative form, the negative unlinks the two.
The result is that in future I intend to start by teaching negatives and interrogatives before introducing the positive forms, in order to force the students' brains to store the auxiliary verbs and verbs like "want" and "like" as different things.
Let me anticipate the first criticism: "students will end up overusing do/did in the positive." I don't think so. Yes, they will initially want to use an auxiliary, but I'll teach them not to -- that's my job after all, isn't it? Besides, the standard order of teaching leads to plenty of predictable, oft-repeated errors: do you can...? I can to do... I want do.... Even if I introduce one error, I'll be eliminating several more.
22 May 2014
Answer in sentences II: Death to abstraction!
A few weeks ago I wrote a post about how the standard practice of forcing beginners to answer "in sentences" seemed to strip the grammar of meaning.
It was inspired by trying to teach past continuous (he was running etc) and revise times (target: at five o'clock, he was running in the park, at half past eight, he was eating dinner in the dining room etc). Unfortunately, the weaker students would look at a time and say "it's five o'clock", and they would look at the action and say "he's running", and they would look at the location and say "he's in the park". The he's, it's etc structure was drummed so heavily into their heads that they didn't dare diverge from it.
This week, I observed a related problem, as I was giving an exam to some secondary school pupils.
They're preparing for a Trinity GESE exam (spoken English), and one of the features of the level they're at is that they're expected to respond to prompts that aren't actually questions. "Tell me about a time when you..." etc. This is considered a more advanced function because it requires abstraction.
But isn't it true that when we teach students to "answer in sentences", we do so by training them to recycle the words of the question...? Well, guess what. My students were regurgitating my words. I wrote Write about a past holiday, and many of the answers started In my past holiday... Obviously this is not natural English.
Is it the student's fault? Is the student incapable of abstraction? Certainly not! Instead, it seems to me that we as teachers actually train abstraction out of our students as soon as we start this "answer in sentences" thing, because it's a skill they have already learned in their first language, and we actively militate against them using it in the new language.
On the occasions where we do answer in sentences, natural language often uses non-symmetrical forms, such as answering What is your name? with I'm Niall.
Our second-language instruction, then, seems to teach language in a way that is quite contrary to nature....
19 May 2014
Looking to the future...
Last week marked the halfway point of my current contract, so in two months I will be out of a job. Since I got here, I haven't really learnt any Sicilian, and only a tiny amount of Russian from my flatmate. My Italian has improved, but not all that much, given that I've got a couple of flatmates whose English is better than their Italian (one's even Scottish).
What do I want to achieve in my remaining time? I think I really should be working on the Sicilian -- I'm in Sicily after all. And my flatmates are going to be disappearing slowly: the Russian one is planning on leaving for Germany in a couple of weeks. The Scottish one's contract ends a few weeks before mine. The Sicilian part-time flatmate's courses will be finishing in a few weeks' time. My world is gradually dissolving away, and it's forcing me to think about what comes after.
The default option was a return to Scotland, back to Edinburgh, to try to gather a few private students while working on a piece of software that I've been meaning to write for a long, long time.
Or do I just go out and find myself another teaching job? Wouldn't it be easier just to keep the money coming in, while doing a job that I find satisfying and rewarding?
Besides, if I go back to Scotland, I'd be giving up the chance to immerse myself in another language full-time, which would be a shame.
There is a compromise, of course. I could go overseas and work on the software in a foreign country. But that could potentially be a very lonely existence. Without a job, I wouldn't have colleagues to lean on for company.
Maybe I should go back to Corsica and finally get a proper grip on Corsican. But if I did, I would probably end up working in a language school anyway, and the software would suffer.
The question I'm asking myself is: what do I desire?
And I really do want to get this software written.
I want to produce something better than everything available to the language learner, and part of me thinks I can do it, but another part doubts my ability to do it, and that doubt results in a fear of failure. Fear of failure leads to inaction, because if you never try, you never fail.
This fear of failure is what cripples many language learners. They hit obstacles, and things get difficult. Suddenly they're faced with the fear that the problem lies within themselves, when typically the problem is nothing more than ineffective teaching.
I had those same doubts myself for a very long time, and they held me back and prevented me from fully committing myself to language study. I got over them for language. Now I need to get over them again in a different circumstance, and believe that I'm genuinely capable of doing something no-one else can do, and of writing something that is truly revolutionary.
And more importantly: something which removes the obstacles that cause most learners to believe that they're "just not good at" languages.
14 May 2014
Language books and forgetting the rules of teaching.
I can't remember where I first heard it, but this rule immediately struck me as one of the most sensible and important rules for setting classroom tasks:
Always write more questions than you need.
It's a fairly straightforward rule -- the stronger students will finish questions quicker than the weaker ones, so as a teacher you are left with a choice between stopping the exercise before all the students have finished, or leaving the faster students hanging around bored after they've finished.
We all typically compromise, leaving the stronger students waiting for a little while, but still not giving the weaker students the chance to finish everything. We formalise this with the immortal line "It doesn't matter if you haven't finished," but in reality, normally it does.
Have a look at most language exercises. You'll typically find that a great many question sets only cover a particular language point or case once, and if you don't answer that question, you don't practise that point. So yes, it does matter if you haven't finished.
The common-sense solution is to write a question set that covers all the points once, then add in additional questions that revisit the same points, but in a more complicated way. Say your minimal coverage of the grammar points can be done in 6 questions -- add another four to make it up to 10. Now you can watch for your weaker students finishing question 6 and declare in all truthfulness that it doesn't matter if they haven't finished... because actually, they have finished -- they just don't know that the remaining questions are primarily time-fillers.
But yes -- primarily time-fillers. As I said, they should also serve to practise more sophisticated uses of the point in question. This way every class is differentiated. All students cover the same basic material, but the advanced students get advanced practice.
Sadly, none of the materials I'm asked to use in class work this way, and I'm forced into the dishonest version of "it doesn't matter if you haven't finished." I would like to see books where the minimal coverage is marked with a line to indicate where "it doesn't matter..." becomes true, beyond which the questions are not strictly required.
This would be helpful. Shame they don't do it.
30 April 2014
Putting Michel Thomas into practice
So I've finally found myself using Michel Thomas's techniques (or at least my interpretation of them) in the language classroom.
When I started my current job, I made the decision that I was going to do exactly what the school wanted of me, and stop worrying about the ineffectiveness of standard techniques -- after all, standard techniques are what the school sells, and therefore what the students have bought. Basically, I figured that I needed to start looking at teaching as a job rather than some kind of holy calling, so that I could go home at the end of the day and switch off, rather than beating myself up until bedtime for the day's errors.
It's ironic, then, that the student who's getting the most "me" in my teaching is the second class I started. I went in prepared to teach the standard course, and with only a couple of hours until the class started, I discovered that the method materials were aimed at large classes and were inappropriate for a one-on-one class, so I went to the syllabus and wrote down the language features for the first few lessons. To be, to have, there is/are... OK, fine. The various things pointed in a direction I wasn't intending to head in, but there it was: Total Physical Response. For about half a dozen lessons we were putting things in a box, on a chair, on the floor... even on my head. I was lucky that I'd recently got into an online discussion about "Language Hunting", as I had the whole system analysed and fresh in my head.
The student was herself a teacher -- in fact, a headteacher -- and understood a lot about pedagogy. This was a double-edged sword, as while she was open to different ideas, she got kind of fixated on the "concreteness" of the TPR class, and was slightly reluctant to move on until I'd built up her confidence in herself and in me as a teacher. Once that confidence was there, it was time to switch, and switch we did.
Compliments from students are always welcome, but when they come from a trained and experienced teacher, they really feel good. Of course, at the same time, I recognise that it's the techniques drawing the praise, not me.
Teaching Michel Thomas-style is actually even more difficult than I had appreciated. The last time I tried it, I wasn't teaching full-time, and I had enough space in my head to balance the repetition of the various language points, but in these classes I frequently find myself leaving the revision of a particular word or structure too late, and the student has forgotten it. She blames herself, when of course it's my fault. I'm trying to develop ways of notating and mnemonicising for myself to keep track of what to revise and when.
But overall, it's clearly been very effective, and it's reinforced my belief that Thomas's core methodology is constructed on sound concepts, and that there was nothing miraculous about Thomas as a teacher.
If there's a stressful element in it, it's seeing how much the lesson suffers when the teacher's on an off-day, something that doesn't happen as much when you're hiding behind worksheets and programmed materials. On the flip-side, though, this connection between teacher performance and learning eliminates the great existential angst of the teacher: "Are they learning because of me or despite me?"
When I started my current job, I made the decision that I was going to do exactly what the school wanted of me, and stop worrying about the ineffectiveness of standard techniques -- after all, standard techniques are what the school sells, and therefore what the students have bought. Basically, I figured that I needed to start looking at teaching as a job rather than some kind of holy calling, so that I could go home at the end of the day and switch off, rather than beating myself up until bedtime for the day's errors.
It's ironic, then, that the student who's getting the most "me" in my teaching is the second class I started. I went in preparing to teach the standard course, and with only a couple of hours until the class started, I discovered that the method materials were aimed at large classes, and were inappropriate to a one-on-one class, so I went to the syllabus and wrote down the language features for the first few lessons. To be, to have, there is/are... OK, fine. The various things pointed in a direction I wasn't intending to head in, but there it was: Total Physical Response. For about half a dozen lessons we were putting things in a box, on a chair, on the floor... even on my head. I was lucky that I'd recently got in an online discussion about "Language Hunting" just before, as I had whole system analysed and fresh in my head.
The student was herself a teacher -- in fact, a headteacher -- and understood a lot about pedagogy. This was a double-edged sword, as while she was open to different ideas, she got kind of fixated on the "concreteness" of the TPR class, and was slightly reluctant to move on until I'd built up her confidence in herself and in me as a teacher. Once that confidence was there, it was time to switch, and switch we did.
Compliments from students are always welcome, but when they come from a trained and experienced teacher, they really feel good. Of course, at the same time, I recognise that it's the techniques drawing the praise, not me.
Teaching Michel Thomas style is actually even more difficult than I'd appreciated. The last time I tried it, I wasn't teaching full-time, and I had enough space in my head to balance the repetition of the various language points; in these classes I frequently find myself leaving revision of a particular word or structure too late, and the student has forgotten it. She blames herself, when of course it's my fault. I'm trying to develop ways of notating and mnemonicising for myself, to keep track of what to revise and when.
But overall, it's been clearly very effective, and it's reinforced my belief that Thomas's core methodology is constructed on sound concepts, and that there was nothing miraculous about Thomas as a teacher.
26 April 2014
Oversharing
Back in December 2012, I wrote a post entitled "Too many cooks spoil the net". The basic gist of it was that people are so keen to share everything that the internet is flooded with redundant and suboptimal resources, making it very difficult to identify the really useful stuff. When I was prepping a lesson on sports a few weeks ago, I went onto the net to search, and I came to a lesson plan uploaded with the following blurb:
"I have a lesson coming up tomorrow and I thought I would share this lesson that I have created over the evening."
This one sentence captures the problem beautifully: what we're looking at isn't a carefully considered, well-thought-out piece of work, but something thrown together the night before, as any teacher can do. This is surely what much of the material on the net is like.
Is such work really worthy of sharing? How much time does it save us as teachers to have such resources available? Is it worth it? In the end, we either spend as much time searching for something suitable as we would have spent writing our own resource, or we take the first thing we find without fully evaluating its effectiveness. Either way, we end up giving our students a worse experience, and if we can't do the best by our students, why are we even trying?
So I have a particular beef with the site ESLprintables.com. There seems to be a fair amount of good material on the site, but I have never been able to use it, because the policy there is that you can't download anything if you haven't uploaded anything. Really, who benefits from that policy? Do you really want a bunch of inexperienced gap year teachers flooding you with poor quality worksheets and drowning out the good stuff in the process? And are the people who are capable of producing good resources really going to be interested in using other people's materials rather than the stuff they've been refining and improving over the course of their careers? The whole thing seems like an exercise in futility to me.
16 April 2014
All teachers lie (or "The myth of the target-language-only classroom")
So last time I was talking about the danger of the philosophy of answer-in-sentences, but I mentioned the English-only-classroom in passing.
It's a commonly stated goal that we, as teachers, should be aiming to use only the target language in the classroom, and that we shouldn't be introducing "translation". In the guidelines of my school, translation is marked down as a "last resort".
However, most teachers kid themselves on. In a heterogeneous, mixed-mother-tongue class (foreigners studying in an English-speaking country), you obviously can't use the students' mother tongue, as there is no shared language; but once you go into a homogeneous class, translation becomes a matter of course. I've observed other teachers doing it, and I've even had advice from teachers that threw in translation as a completely standard tool. (E.g. a former boss in Spain who insisted that he taught in English only told me that the best way to correct Spanish kids saying "one car" instead of "a car" was to say "uno coche?", thus demonstrating that "one" was explicitly a number. He was completely unconscious of the contradiction between his philosophy and his practice.)
When I've witnessed this in observed classes, what I tend to experience is a careful, elaborate explanation of a task or grammar point in consciously graded English, followed by a very quick, concise summary in the students' native language. How are we to know whether it's the difficult-to-understand English or the exceptionally easy mother-tongue instruction that they're learning from? Common sense, perhaps? It strikes me as obvious which one would be more effective.
I've had three full-time jobs in different foreign countries, and in all of them, I've walked in and been told to pretend I don't know anything about the local language, and each time I've tried to do so. It has never worked. It's not a question of how well I grade my English, either -- it's simply that you cannot and will not stop the stronger students translating for the weaker ones, which means losing control of the classroom (particularly when the students doing the translating have failed to understand themselves). And this translation problem isn't limited to homogeneous classes: it crops up in any mixed-mother-tongue class where a large subset of the students already share a common language, as happens wherever there's a large immigrant population (I'm told there are a lot of South Americans in Italy, for example, and they would have had to learn Italian).
An English-only classroom presents a huge psychological barrier, as the students see themselves as unable to communicate with the teacher. English should be a means of communication, yet we present it to our students in such a way as to make it an impediment to communication. That can't be right, surely....
05 April 2014
Answer in sentences: death to meaning!
After my injury-induced half-year out of the classroom, I made myself a promise: I wasn't going to torture myself by wanting to undo all the mistakes of teaching orthodoxy in one go. It's not something I'm capable of doing, and in wanting to, I have ended up hampering my ability to get on with the task as requested by the people writing my paycheque. After all, while the orthodoxy may be far from perfect, it's at least tried and tested, and people have learned from it.
I have tried my best to stick to this philosophy since the start of my new employment, with a few particular exceptions where suitable materials weren't available in time and I had to improvise. You can hardly be expected to improvise in a style that isn't yours, after all.
The real danger in doing something someone else's way is that you might start to believe in it. I was never a fan of "answer in sentences" as it always seemed unnatural, but it's something I've come to rely on in class, and I was starting to view it uncritically, until I came face-to-face with the downside...
I was teaching a class of primary-age kids, and I was integrating times with the past tense of to be. The worksheet presented a clock representing the time, and a little picture of a location, to prompt sentences of the form "at five o'clock, he was in the kitchen," following a model example at the top of the first page. Some of the kids latched on to the point fairly quickly, but most needed repeated explanation and demonstration. (This is because I'm trying to stick to the orthodoxy of "the English only classroom" even though these kids don't speak English yet -- but that's a rant for another time.)
One in particular was having difficulties, as he's afraid of making mistakes: you can only fail if you try, so he's naturally afraid of trying. I led him through several questions directly, breaking the task into two parts: the time and the location. The problem was, when I pointed at the time, he would say "it's five o'clock", as per his answer-in-sentences training; and when I pointed at the location, he would say "he's in the kitchen". These kids have been trained (by myself and by other teachers) to never say any noun or adjective on its own, so that little contracted form it's has taken on a life that is divorced from its meaning, and appears in the language of many of the learners as little more than a particle that precedes certain words.
Basically, it has reawakened a long-held belief of mine: that the frequent repetition of words doesn't truly aid their memorisation, because the students simply aren't required to consider the context, and the language forms are devoid of all meaning.
But language is meaning.
21 March 2014
The borders of cultural references....
I'm currently teaching in Italy, and as they all speak an H-less language here, it's unsurprising that a lot of them make mistakes with Hs.
Many of them do try, but unfortunately this often leads to the phenomenon called "hypercorrection" -- i.e. they "correct" things that weren't wrong to start off with. In the case of H, this means Hs inserted into words that naturally start with vowels.
And so it was today that I was teaching the future tense, and when I was looking for the answer "I will", a couple of my students said "h-I'll".
Now how's a teacher supposed to react to that? I spun round from the board, straightening my arm and shouting "HEIL!". Now what I had forgotten was that what most of the world thinks of as the "Nazi salute" is actually a pan-fascist salute, a neo-imperial confection claimed by its originators to be a revival of Roman tradition. And of course, Italy had plenty of fascists of its own.
Oops.
06 March 2014
On the move...
Well there was a nice surprise... just as the window was closing on spring recruitment for TEFL posts, I got invited to a Skype interview and offered a job! At the moment, I'm packing for the flight to Sicily.
Between the job and my side project programming computer software, I'm going to be pretty busy from now until the summer, which can only be good.
From a language perspective, I'll be able to finally firm up my Italian, which I use so rarely that it's kind of stuttery and weak. It's better than I expected, though, and that's probably thanks to the Corsican I learnt last year. I even found myself speaking Spanish and Catalan last night, and accidentally throwing in a few Italianisms, so my brain's clearly getting prepped in the background.
Will I learn Sicilian while I'm there? Well, I'd certainly like to, but I'm going to have a pretty heavy workload to deal with, and there's also the problem of trying to manage two highly similar languages essentially simultaneously. Would I be able to keep the two separate...? I suppose I'll just have to try and find out.
It might be nice to go and visit the "Piana degli Albanesi", but I don't think I'll be able to dedicate time to studying the local Albanian tongue, sadly!
(Any pointers to good materials on learning Sicilian gratefully received.)
03 March 2014
Language learning software: a continual disappointment
If I've gone a bit quiet of late, it's because I'm getting less comfortable writing about other people and companies. You see, I've been trying on and off for a while to develop some language learning software, so there's a risk that anything I write could later be construed as just another company bad-mouthing the competition. On top of that, I'm now becoming a bit cautious about going into depth in my criticism and giving away too much of the thinking behind my software.
Anyhow, at the weekend I was having a break from coding and decided to have a little audit of my competition, including a lot of projects on sites like indiegogo and kickstarter. In recent months, the number of language learning applications being touted on these sites and elsewhere has skyrocketed, but I can't say I'm impressed with the quality of the proposals.
For the most part, these projects are touted as "revolutionary", but to anyone with any awareness of the language learning marketplace, it's just the same old stuff everyone else is already marketing. Among the main "differences" that people try to use to sell their wares are "real, conversational language" and "language to help you as a traveller in real situations". Tied to that is "no boring grammar". Whether this is the right way or not is not my point (not yet -- I'll come back to that) -- what's most depressing is that everyone is doing the same thing and calling it "new and exciting".
Now from listening to the pitches, I get the feeling that this isn't just shysterism -- they genuinely believe that they're doing something new and different, which means they can't have researched what's available, and they can't know much about teaching.
This does put them in good company, though. One of the world's biggest language software vendors was founded by someone with no background in teaching, but who had been overseas for a bit (putting him on the fringes of my "professional language learner" category).
Now, many of the proposals are too thin on detail to be investable, and true to expectations, these ones never pick up any meaningful number of pledges. It's all well and good to tell me it's going to be immersive and revolutionary, but what is it going to look like? What is it going to do?
Of those that go into a little more detail, many seem like glorified flashcards, and many take their cues directly from Rosetta Stone. Still others are just a blend of absolutely standard techniques -- translations, gapfills, word-rearranging, picture-word matching and so on.
It's these ones I would be most interested in, because if teachers have been using these techniques for all these years, they must have some merit, and if we can improve on them with technology, then we should do so. Sadly, once we've seen a demonstration (in those cases where development has progressed far enough for there to be some kind of prototype), there's no real innovation presented. A cute picture of an animal to go with your new vocabulary does not an innovative pedagogy make!
One product that looked reasonably unique among the crowd (not found on the crowdfunding sites) was one that used Microsoft's Kinect hardware to track the user, allowing more "active" engagement. While the presenter enthusiastically told us how this opens up all sorts of possibilities for "interactive environments", nothing in the demonstration was any more sophisticated than a Total Physical Response lesson, and while TPR is still used by some teachers, it never really took off, as it was always a very limited technique. Anything in this package that wasn't just eTPR (to coin a phrase) was essentially vocabulary practice -- selecting things that the computer asked for in the target language -- and I imagine most users would find it easier to point and click on the requested item with the mouse than to move about in front of a camera. Furthermore, as more and more of the software market migrates to tablets and smartphones, the touchscreen seems a much more mass-market technology to employ. The limitations of the prototype on display bear out Wilfried Decoo's observation that "the medium makes the method" -- i.e. that most new movements in language learning are just the most obvious application of a new technology to language learning.
I even came across one project for a learning game that recreated the "bad old days" of early "edutainment" software -- it was a fairly basic shoot-em-up game where the educational material wasn't really part of the game, instead being an interruption to gameplay: you walked up to a door and couldn't open it until you gave the correct translation.
But in my opinion, the single worst thing that I came across was perfectly harmless and ordinary until I got to the bit about what they wanted the money for: to translate the lessons. Yes, "template teaching" is alive and well in the computer world. Remember folks: every language is different!
19 February 2014
The persistence of origin myths
It's quite hard to challenge the orthodoxy in most fields, and language is no exception. This only seems fair - you need a lot of evidence to disprove an existing theory... or do you? I suppose that depends on what evidence there is for the existing orthodoxy.
The orthodoxy of language origin theories is often distinctly lacking in evidence, as in many cases it is tied into notions of ethnic origin, most of which are being proven wrong even as we speak.
A great many national ethnic origin myths are based on the idea of conquest as an annihilation of the existing population. For example, take the Anglo-Saxon invasion of England and south-east Scotland. Received wisdom is that the Norsemen started invading the territory roughly corresponding to modern Denmark, and started killing and displacing the locals, who fled across the North Sea and started killing and displacing the local Celtic Britons on the east coast of Great Britain.
That gives us the creation myth of England: a myth built on hearsay.
I mean... did the Romans kill and displace the locals? No, they ruled over them. We were asked to believe they were the exception.
But did the Greeks kill and displace the locals? No, they ruled over them. Another exception, perhaps? The Moguls in the Indian subcontinent? The Mongols?
And more recently the Ottoman Empire? Or indeed the French or British Empires?
The fact of the matter is that the overwhelming majority of well-documented "invasions" in history have simply involved the installation of a new elite over the existing population. The genocides perpetrated in the Americas and Australia are actually all but unique in history.
So why are we expected to assume that all the poorly-documented invasions are so different from the documented ones? Why shouldn't the Anglo-Saxon invasion be more like the Romans?
Well, thanks to DNA, that old orthodoxy has been turned on its head. Genetic testing suggests that there's a heck of a lot of Celtic ancestry where the old kill-them-all theory of invasion would have left only Anglo-Saxon blood. On top of this, the old Anglo-Saxon homeland around Denmark still shows predominantly Anglo-Saxon DNA, where we were supposed to believe that the Norsemen had killed or exiled all the Anglo-Saxons.
But we shouldn't have needed DNA evidence. We know that the Anglo-Saxon kings of England claimed descent from King Arthur, a figure not from Anglo-Saxon mythology but from British Celtic mythology. This alone should have proven conclusively that a huge portion of the English population was drawn from Celtic stock. Not to mention that the Danes have never looked all that similar to the Norwegians and Swedes: the Norse Vikings who invaded the west of Scotland were towering redheads, while the Danish Vikings who invaded Northumberland were of normal height and dark-haired.
This naturally has direct consequences in the origin myths of languages.
The best documented case of language birth brings us once again back to the Romans.
We know a heck of a lot about Classical Latin, but very little about the colloquial speech of the common man in ancient Rome. We know it was different, but not how different. We also know that when the Romans went out and conquered other nations, the locals tried to learn Latin, and did so imperfectly. We know that these imperfect forms of Latin eventually developed into the Romance language family we know today (French, Italian, Spanish etc.). Crucially, while we know very little about the early stages of the Romance languages, that is because Latin remained the language of the elite, so all literature was written in Latin.
This means that the single best-attested example of language evolution revolves around an elite language hiding changes in the common language.
We can compare this to creole languages. Even today, many creole-speaking countries are run by elites that prefer to speak in French or English, and resist efforts to raise the profile of the local creole language.
All our most reliable data on language birth comes from languages where a large population have imperfectly learned the language of the elite, and where the elite language and the common language have existed in parallel for a significant period of time. It stands to reason that this should be our default assumption for all languages.
And yet our standard model of language development is still based on time periods: Old followed by Middle followed by Modern.
Given all that, it came as no surprise to me that someone recently published a paper proposing that a lot of so-called "Middle English" was actually contemporaneous with "Old English", and that the loss of grammatical complexity wasn't due to the Norman invasion (the Normans were a tiny minority elite ruling over a huge population) but rather the result of the Britons imperfectly learning the Anglo-Saxon of their rulers. The reason he gave for the apparent abrupt change was quite simple: the Anglo-Saxon elite were disenfranchised and Norman French became the language of literature. Middle English literature was not written by the descendants of the old Anglo-Saxon elite, but by the descendants of the peasantry, or people of Norman descent who learned it from the peasantry.
Shouldn't that have been our default assumption all along...?
The orthodoxy of language origin theories is often distinctly lacking in evidence, as in many cases it is tied into notions of ethnic origin, most of which are being proven wrong even as we speak.
A great many national ethnic origin myths are based on the idea of conquest as the annihilation of the existing population. Take, for example, the Anglo-Saxon invasion of England and south-east Scotland. Received wisdom is that the Norsemen invaded the territory roughly corresponding to modern Denmark, killing and displacing the locals, who fled across the North Sea and in turn killed and displaced the Celtic Britons on the east coast of Great Britain.
That gives us the creation myth of England: a myth built on hearsay.
I mean... did the Romans kill and displace the locals? No, they ruled over them. We were asked to believe they were the exception.
But did the Greeks kill and displace the locals? No, they ruled over them. Another exception, perhaps? The Mughals in the Indian subcontinent? The Mongols?
And more recently the Ottoman Empire? Or indeed the French or British Empires?
The fact of the matter is that the overwhelming majority of well-documented "invasions" in history have simply involved the installation of a new elite over the existing population. The genocide perpetrated in the Americas and Australia is actually close to unique in history.
So why are we expected to assume that all the poorly-documented invasions are so different from the documented ones? Why shouldn't the Anglo-Saxon invasion be more like the Romans?
Well, thanks to DNA, that old orthodoxy has been turned on its head. Genetic testing suggests that there's a heck of a lot of Celtic ancestry where the old kill-them-all theory of historical invasion would have left us with only Anglo-Saxon blood. On top of this, in the old Anglo-Saxon homeland around Denmark, they still have predominantly Anglo-Saxon DNA, where we were supposed to believe that the Norsemen had killed or exiled all the Anglo-Saxons.
But we shouldn't have needed DNA evidence. We know that the Anglo-Saxon kings of England claimed descent from King Arthur, a figure not from Anglo-Saxon mythology, but from British Celtic mythology. This alone should have proven conclusively that a huge portion of the English population was drawn from Celtic stock. Not to mention that the Danes have never looked all that similar to the Norwegians and Swedes. The Norse Vikings that invaded the west of Scotland were towering red-heads, while the Danish Vikings that invaded Northumberland were of average height and had dark hair.
This naturally has direct consequences in the origin myths of languages.
The best documented case of language birth brings us once again back to the Romans.
We know a heck of a lot about Classical Latin, but we know very little about the colloquial speech of the common man in ancient Rome. We know it was different, but we don't know how much. We also know that when the Romans went out and conquered other nations, the locals tried to learn Latin, but did so imperfectly. We know that these imperfect varieties of Latin eventually developed into the Romance language family we know today (French, Italian, Spanish, etc.). Crucially, the reason we know so little about the early stages of the Romance languages is that Latin remained the language of the elite, so all literature was written in Latin.
This means that the single best-attested example of language evolution revolves around an elite language hiding changes in the common language.
We can compare this to creole languages. Even today, many creole-speaking countries are run by elites that prefer to speak in French or English, and resist efforts to raise the profile of the local creole language.
All our most reliable data on language birth comes from languages where a large population have imperfectly learned the language of the elite, and where the elite language and the common language have existed in parallel for a significant period of time. It stands to reason that this should be our default assumption for all languages.
And yet our standard model of language development is still based on time periods: Old followed by Middle followed by Modern.
Given all that, it came as no surprise to me that someone recently published a paper proposing that a lot of so-called "Middle English" was actually contemporaneous with "Old English", and that the loss of grammatical complexity wasn't due to the Norman invasion (the Normans were a tiny minority elite ruling over a huge population) but rather the result of the Britons imperfectly learning the Anglo-Saxon of their rulers. The reason he gave for the apparent abrupt change was quite simple: the Anglo-Saxon elite were disenfranchised and Norman French became the language of literature. Middle English literature was not written by the descendants of the old Anglo-Saxon elite, but by the descendants of the peasantry, or people of Norman descent who learned it from the peasantry.
Shouldn't that have been our default assumption all along...?
02 February 2014
Language learning professionals vs Professional language learners
The internet, they say, "democratises" human activity. We no longer need to go to the ivory towers of academia to learn; we no longer need experts as intermediaries: we can collaborate with one another.
This is true, certainly, but with it comes a certain set of dangers.
First, the "wisdom of the crowds" is generally fallacious, and we either get a mass of people who constantly contradict each other flat-out or we get little cliques that share and reinforce each other's views, to the exclusion of all new information.
More importantly, though, people want to defer to experts. This means that it's actually quite easy for someone with the right patter to set themself up as a "lay expert". Once they do so, they gather a little clique of the "wisdom of the crowds" type who will support and propound the self-appointed expert's proclamations.
There are many such "experts" in language learning. Typically they say they teach not on "dry, academic grounds" or the "received wisdom of the establishment", but "from experience". Their argument is simple and appealing: I have learned a language, therefore I know how to learn a language. But wait... haven't millions upon millions of people learned a language too? Why you and not them?
I call these people "professional language learners". Wouldn't we all like to learn languages as a job? Wouldn't that be great? I know I'd love it... except that's only of benefit to me, so really there's no reason for anyone else to pay me to do it. I find it difficult to stomach that there are people out there who make their entire living by writing and making videos about their language learning, and kidding themselves and their audiences on that they're giving some immensely valuable and unique insight into the learning process.
But they're not.
Their advice is at best vague, and very often even inconsistent and self-contradictory (eg Sid Efromovich's video that I discussed recently). Vague advice can be followed to the letter, and still have you doing something almost entirely the opposite of what was intended. As advice it's at best useless, at worst detrimental. Why am I failing? What am I doing wrong? Frustration sets in. Maybe I'm just no good at languages.
But why is the advice vague? Is it a fault in the author's use of English? His writing composition? Maybe, but mostly it seems to me that these people don't actually fully understand their own process. There is much to be learned from these people, but only if they're willing to discuss it, so that we can help each other tease out the details.
This is why we need to defer to language learning professionals, people who have trained, and studied, and (hopefully) taught. But most importantly, they are in a position to experiment. They can try something on one class, identify the weaknesses, then try it on another class in an adjusted form. Did it work better? Then it's better. A professional language learner only has a sample of one, and therefore cannot identify the changes that make things better.
That is not to say that all language learning professionals are always correct; sadly, language teaching standards are pretty poor at present. Many teachers and academics continue to parrot outdated and/or unproven theories as gospel, but if they can express their views more clearly, then at least you'll be better able to follow them if you choose to.
Of course, a lot of language teachers aren't really experts anyway. Most professional teachers of English as a foreign language have a four-week certificate that is essentially a walk-through of typical classroom techniques, with no real in-depth analysis of what works, when it works, why it works or how. Simply being a teacher does not make you an expert, and very few teachers would ever try to claim otherwise. A real expert is someone who has dedicated multiple years of their life to both academic study of the topic and real-life application.
It takes a lot of time and a lot of money to become an expert, and for those of us who are going through all the slog of trying to become genuine experts, it's kind of galling to see these guys walking the easy route and getting pretty handsomely rewarded for it.
21 January 2014
Video commentary: 5 techniques to speak any language
Sometimes it's nice to run with a theme, and although I've mixed in a few unrelated posts in between, I started discussing what we can learn from learners a couple of weeks ago, and followed up with a discussion on the limits of my own self-awareness. I figure it makes sense to build on this theme and have a critical look at what others say.
I came across the following video last month, where Sid Efromovich presents his "5 techniques to speak any language" at TEDx UpperEastSide. You can watch the video (~15 min) now to make up your own mind, or skip straight down to my analysis.
Now, the first thing to note is that the video blurb says that Sid grew up in Brazil, which tells us either that he is an exceptionally good learner (I never noticed any flaws in his English) or that he was a childhood bilingual -- unfortunately not even his personal website tells us which. Not knowing his starting point makes it quite difficult to evaluate how suitable he is as a model for any individual (if he's natively bilingual in Portuguese and English, that wouldn't be particularly unusual, but for those of us who were brought up monolingually, it doesn't match our world).
Now let's move on to the actual talk, and Sid's rules.
1: Make mistakes

My first reaction when he gave the title was "here we go again," as I expected the usual line about how we learn from our mistakes. Yes, we are more than capable of learning from our mistakes, from which the hardliners conclude that there is no learning without mistakes... and yet the things I've learned best were learned right first time.

Thankfully, that's not what Sid goes on to discuss. Instead, he talks about how we all have a "database" of sounds and structures that our brains identify as correct, and that everything outside that database is flagged as "wrong" by the brain.
This throws us into a little paradox where our fear of making mistakes causes us to make mistakes.
His example is the Spanish letter R, which he gives with the word "puerta". The Spanish R is markedly different from the English R, but close enough that a beginner's brain will try to replace the wrong-sounding Spanish R with a correct-sounding English one, which Sid refers to as the "closest relative sound".
In this rule we have something of real value to the learner. I've discussed the difficulty of dealing with foreign phoneme maps many times, but I never made that last logical step and talked about perceived wrongness and correctness, so a big thank you to Sid -- this idea alone made the video worth watching for me.
Caveat emptor

However, in that same example of puerta, Sid starts to show the limitations of his language awareness, as he appears to be claiming that it is only the Spanish R that isn't in the English "database", which is simply untrue.

The P, though close, is subtly different: in the stop pairs P/B, T/D, C/G, English marks the distinction largely by aspiration, while Spanish relies on voicing. It's a very subtle difference, but a difference nonetheless.
The Spanish UE doesn't exist in English, and any attempt at approximating it is going to be wrong -- one typical English pronunciation of Puerto Rico is /ˌpwɛərtə ˈriːkoʊ/, whereas the Spanish is /pʷeɾto ˈriko/.
Finally, there's that last syllable ta. This has a clear vowel, because all Spanish syllables do... but not all English ones. It is exceptionally difficult for an English speaker to pronounce a clear vowel in the syllable directly preceding or following the stressed syllable, because in English these are almost always reduced to schwa -- that feeble little "uh" sound.
So there's the first alarm bell... his pronunciation is good enough to show that he internally knows the difference, but he's not consciously aware of everything he does, which (as I keep saying) is the limitation of anyone who claims to speak from experience.
2: Scrap the foreign alphabet

Rule number 2 is a doozy. Sid reckons that the foreign alphabet "will give you wrong signals," at least for languages in the same script as your own, and he is correct. It is difficult for an English speaker to see the letter "I" as representing its so-called "cardinal" value (the Latin I), because that's not the sound it has in English.

So far, so logical. What is the alternative?
Some (not Sid) propose learning aurally first, ignoring writing completely. (I disagree, because there are differences that a beginner might not be able to hear in the spoken language, but they'll be seen in the written language.)
But Sid is in favour of writing things down. Does he suggest learning the IPA? No; instead he proposes using your own pseudo-phonetics, based on the sounds of your own language.
His example for this whole section is the Brazilian currency: the Real. It's a great example, because it is immediately misleading, having as it does the same spelling as an English word. His suggested phoneticisation, though, is very troubling: "hey ouch". In writing it this way, he's using the same "closest relative sounds" he warned against in rule 1, which is a total contradiction. A Brazilian Portuguese R is not like an English H: it's a guttural sound, more similar to the German/Scottish CH (Bach/loch)... but not even quite the same as that. The L isn't quite the w-glide of ow/ouch, and even if it were, you'd have to find a different notation for it in other situations (it can occur after vowels that an English W wouldn't appear after). And there are other problems too, but that'll do for now.
The main thing is that having presented only two rules, we already have a contradiction and fundamental incompatibility, throwing into doubt Sid's credibility as an instructor.
But setting that aside for a moment and looking at this rule in isolation:
Avoiding the native spelling is by no means a necessary step in learning a language.
Myself, I have at times learned to pronounce a language from its written form before making any serious attempt to learn the language, and to me that's far more useful, because I'm able to look at dictionaries and phrasebooks to learn new language, whereas if I'm stuck with my own idiosyncratic phonetics, I won't have access to any external sources whatsoever. I say learn the alphabet, but learn it right. Even if idiosyncratic phonetics did work (and I don't think they do), the cost in terms of isolation from materials is simply too high.
3: Find a stickler

He reckons you need someone who'll correct you, and you probably do, but he doesn't address the serious limitations of this approach, or how to avoid them.

First of all, correction is ad hoc. You say what you want, you get a correction. Fine, but language is a system, and therefore effective correction has to be systematic. A lot of the corrections you get from a native speaker will address something quite different from your actual mistake, and therefore don't really correct the source of the error, just the superficial form.
Secondly, if you make an error, you may be misunderstood. If someone says "I will do it yesterday," do they mean "I will do it tomorrow" or "I did it yesterday"? Where an error is ambiguous, the correction given has as much chance of being totally wrong as being right.
But worst of all, sticklers often teach things that are outdated. For example, the English you and I as subject isn't current in most parts of the English-speaking world, but a stickler will "correct" you and me even if he says it himself. Similarly for there's three things. Or he might "correct" can I...? to may I...?
How do you avoid these things? How do you choose your stickler? Well, for the first one, you're not looking for a language buddy, you're looking for a teacher. For the second and third, you're looking for a good teacher. You need someone who is an expert not only in the language, but in showing others how to become experts.
4: Shower conversations

Sid then suggests talking to yourself, a technique lots of people use. For many of us, that language practice takes the form of a silent internal monologue. The acknowledged limitation of this is that it's not practising pronunciation.

Sid's idea of having your conversations in the shower gives you a safe area to practise pronunciation, but the bigger point he raises is that by having a conversation as opposed to a monologue, you're more likely to identify gaps in your knowledge.
I'm not sure that this is actually true, and as a result he glides by what should have been his most important point, and worthy of being a rule in and of itself: mind your gaps.
When you're speaking, it will be your natural tendency to avoid and skirt round gaps in your target language knowledge, and doing so is part of being a successful user of the language. However, it is all too easy to become so proficient at avoiding gaps that you stop looking for them, and stop learning new things. The really successful learners continually seek out gaps in their knowledge and plug them with new information. Obviously it's less embarrassing to identify those gaps when you're on your own than when you're in a conversation, so it's a good starting point, but on the other hand, the reality is that you cannot say anything to yourself that you don't expect, so you can't find as many gaps speaking to yourself as you can when speaking to others. If you can be bothered, you can carry a notebook to jot down any gaps you come across, but while I had such a notebook for years (for Spanish), I only ever noted down 2 or 3 things in it...
5: Buddy Formula

Sid's final rule is a "formula" for finding the best "language buddy" for practice. Now, I object to the trite, twee abuse of the term "formula" here, because it's just a rule that he chooses to write with an equals sign: target language = best language in common. He raises a very good point in his justification for this: that clear communication is the motivation for learning a language, and that if you know you can get your message across more easily in another language, you're going to switch to that other language.

You will get no argument from me on that -- I often find myself banging into that wall, and I often caution against so-called "immersion" classes where the class has a common language anyway (eg. immersive Gaelic for English speakers), because it risks conditioning people to view the language as a barrier to communication instead of a means of communication.
But more than that, Sid offers no advice to dealing with the big problem that all learner-learner conversations carry: learner errors. If one English learner says pod-ae-to, and the other says pod-ah-to, they're both wrong, and there's no stimulus for correction. Or perhaps one of them gets it right, then the other gives a well-meaning "correction" that teaches them the wrong thing.
In short: plenty of speaking practice, but unfortunately, no feedback on errors.
But his rules aren't correct. Rule 2 is a dogmatic assertion that glosses over a very complex issue.
Neither rule 3 nor rule 5 is wrong per se, but neither is essential, and Sid's description of each is inadequate as advice for the learner, because it gives no concrete guidance on how to circumvent the problems a learner will face.
Rule 4 fails on similar, but slightly more interesting, grounds. Again, he has failed to really give enough advice on how to avoid potential pitfalls, but in rules 3 and 5, those were pitfalls I don't believe he'd really thought about himself. With rule 4, he does talk briefly about the pitfalls. He identified a problem, and identified a suitable solution. He has used that solution and it should be useful to many others. But there is nothing unique in the shower conversation that forces you to identify your language gaps. If his rule had been "Mind the gap", his shower conversation could have been given as one possible technique to address it, and then he would have been forced into providing a description of how to structure a shower conversation such that it addresses the goals of a "mind the gap" rule.
I came across the following video last month, where Sid Efromovich presents his "5 techniques to speak any language" at TEDx UpperEastSide. You can watch the video (~15 min) now to make up your own mind, or skip straight down to my analysis.
Now, the first thing to note is that the video blurb says that Sid grew up in Brazil, which tells us that either he is an exceptionally good learner (I never noticed any flaws in his English) or that he was a childhood bilingual -- unfortunately not even his personal website tells us which. Not knowing his starting point makes it quite difficult to evaluate how suitable he is as a model for any individual (if he's natively bilingual in Portuguese and English, that wouldn't be particularly unusual, but for those of us who were brought up monolingually, it doesn't match our world).
Now lets move on to the actual talk, and Sid's rules.
1: Make mistakes
My first reaction when he gave the title was "here we go again," as I expected the usual line about how we learn from our mistakes. Yes, we are more than capable from learning from our mistakes, from which the hardliners conclude that there is no learning without mistakes... and yet the things I've learned best were learned right first time.Thankfully, that's not what Eric goes on to discuss. Instead, he talks about how we all have a "database" of sounds and structures that our brains identify as correct, and that everything outside that database is flagged as "wrong" by the brain.
This throws us into a little paradox where our fear of making mistakes causes us to make mistakes.
His example is the Spanish letter R, which he gives with the word "Puerta". The Spanish R is markedly different from the English R, but close enough that a beginner's brain will try to replace the wrong-sounding Spanish R with a correct-sounding English one, which Eric refers to as the "closest relative sound".
In this rule we have something of real value to the learner. I've discussed the difficulty of dealing with foreign phoneme maps many times, but I never made that last logical step and talked about perceived wrongness and correctness, so a big thank you to Eric -- this idea alone made the video worth watching for me.
Caveat emptor
However, in that same example of puerta, Eric starts to show the limitations of his language awareness, as he appears to be claiming that it is only the Spanish R that isn't in the English "database", which is simply untrue.The P, though close, is subtly different, as Spanish is a language more typically distinguished by aspiration, and English by voicing (in the pairs P/B, T/D, C/G). It's a very subtle difference, but a difference nonetheless.
The Spanish UE doesn't exist in English, and any attempt at approximating it is going to be wrong -- one typical English pronunciation of Puerto Rico is /ˌpwɛərtə ˈriːkoʊ/, whereas the Spanish is /pʷeɾto ˈriko/.
Finally, there's that last syllable ta. This has a clear vowel, because all Spanish syllables do... but not all English ones. It is exceptionally difficult for an English speaker to pronounce a clear vowel in the syllable directly preceding or following the stressed syllable, because in English these are almost always reduced to schwa -- that feeble little "uh" sound.
So there's the first alarm bell... his pronunciation is good enough to show that he internally knows the difference, but he's not consciously aware of everything he does, which (as I keep saying) is the limitation of anyone who claims to speak from experience.
2 Scrap the foreign alphabet
Rule number 2 is a doozy. Eric reckons that the foreign alphabet "will give you wrong signals," at least for languages in the same script as your own, and he is correct. It is difficult for an English speaker to see the letter "I" as representing it's so called "cardinal" value (the Latin I) because that's not the sound it has in English.So far, so logical. What is the alternative?
Some (not Eric) propose learning aurally first, ignoring writing completely. (I disagree, because there are differences that a beginner might not be able to hear in the spoken language, but they'll be seen in the written language.)
But Eric is in favour of writing down. Does he suggest learning the IPA? No, instead he proposes using your own pseudo-phonetics, based on the sounds of your own language.
His example for this whole section is the Brazilian currency: the Real. It's a great example, because it is immediately misleading, having as it does the same spelling as an English word. His suggested phoneticisation, though, is very troubling: he
The main thing is that having presented only two rules, we already have a contradiction and fundamental incompatibility, throwing into doubt Sid's credibility as an instructor.
But setting that aside for a moment and looking at this rule in isolation:
Avoiding the native spelling is by no means a necessary step in learning a language.
Myself, I have at times learned to pronounce a language from its written form before making any serious attempt to learn the language, and to me that's far more useful, because I'm able to look at dictionaries and phrasebooks to learn new language, whereas if I'm stuck with my own idiosyncratic phonetics, I won't have access to any external sources whatsoever. I say learn the alphabet, but learn it right. Even if idiosyncratic phonetics did work (and I don't think they do), the cost in terms of isolation from materials is simply too high.
3 Find a stickler
He reckons you need someone who'll correct you, and you probably do, but he doesn't address the serious limitations of this, or how to avoid them.First of all, correction is ad hoc. You say what you want, you get a correction. Fine, but language and a system, and therefore effective correction has to be systematic. A lot of the corrections you get from a native speaker are going to be extremely different from what your mistake was, and therefore don't really correct the source of the error, just the superficial form.
Secondly, if you make an error, you may be misunderstood. If someone says "I will do it yesterday," do they mean "I will do it tomorrow" or "I did it yesterday"? Where an error is ambiguous, the correction given has as much chance of being totally wrong as being right.
But worst of all, sticklers often teach things that are outdated. For example, the English you and I as subject isn't current in most parts of the English speaking world, but a stickler will "correct" you and me even if says it himself. Similarly for there's three things. Or he might "correct" can I...? to may I...?
How do you avoid these things? How do you choose your stickler? Well, for the first one, you're not looking for a language buddy, you're looking for a teacher. For the second and third, you're looking for a good teacher. You need someone who is an expert not only in the language, but in showing others how to become experts.
4 Shower conversations
Eric then suggests talking to yourself, a technique lots of people use. For many of us, that language practice is in the form of a silent internal monologue. The acknowledged limitation with this is that it's not practising pronunciation.Eric's idea of having your conversations in the shower gives you a safe area to practice pronunciation, but the bigger point he raises is that by having a conversation as opposed to a monologue, you're more likely to identify gaps in your knowledge.
I'm not sure that this is actually true, and as a result he glides by what should have been his most important point, and worthy of being a rule in and of itself: mind your gaps.
When you're speaking, it will be your natural tendency to avoid and skirt round gaps in your target language knowledge, and doing so is part of being a successful user of the language. However, it is all too easy to become so proficient at avoiding gaps that you stop looking for them, and stop learning new things. The really successful learners continually seek out gaps in their knowledge and plug them with new information. Obviously it's less embarrassing to identify those gaps when you're on your own than when you're in a conversation, so it's a good starting point, but on the other hand, the reality is that you cannot say anything to yourself that you don't expect, so you can't find as many gaps speaking to yourself as you can when speaking to others. If you can be bothered, you can carry a notebook to jot down any gaps you come across, but while I had such a notebook for years (for Spanish), I only ever noted down 2 or 3 things in it...
5 Buddy Formula
Sid's final rule is a "formula" for finding the best "language buddy" for practice. Now I object to the trite, twee abuse of the term "formula" here, because it's just a rule that he chooses to write with an equals sign:

Target language = best language in common

He raises a very good point in his justification for this: that clear communication is the motivation for learning a language, and that if you know you can get your message across more easily in another language, you're going to switch to that other language.
You will get no argument from me on that -- I often find myself banging into that wall, and I often caution against so-called "immersion" classes where the class has a common language anyway (e.g. immersive Gaelic for English speakers) because it risks conditioning people to view the language as a barrier to communication instead of a means of communication.
But more than that, Sid offers no advice on dealing with the big problem that all learner-learner conversations carry: learner errors. If one English learner says pod-ae-to, and the other says pod-ah-to, they're both wrong, and there's no stimulus for correction. Or perhaps one of them gets it right, then the other gives a well-meaning "correction" that teaches them the wrong thing.
A language buddy, then, gives you plenty of practice but, unfortunately, no feedback on errors.
Overall
So what can we take away from Sid's advice? Even if his rules were all unarguably correct, they alone are not going to teach you a language -- in fact, they're pretty peripheral to the main learning process.

But his rules aren't correct. Rule 2 is a dogmatic assertion that glosses over a very complex issue.
Neither rule 3 nor rule 5 is wrong per se, but neither is essential, and Sid's description is inadequate as advice for the learner, because it doesn't give any concrete advice on how to circumvent any of the problems a learner will face.
Rule 4 fails on similar, but slightly more interesting, grounds. Again, he has failed to really give enough advice on how to avoid potential pitfalls, but in rules 3 and 5, those were pitfalls I don't believe he'd really thought about himself. With rule 4, he does talk briefly about the pitfalls. He identified a problem, and identified a suitable solution. He has used that solution and it should be useful to many others. But there is nothing unique in the shower conversation that forces you to identify your language gaps. If his rule had been "Mind the gap", his shower conversation could have been given as one possible technique to address it, and then he would have been forced into providing a description of how to structure a shower conversation such that it addresses the goals of a "mind the gap" rule.
16 January 2014
Is language like science...?
Quite often, when I talk about the rules of language, I find I get hit with the response "Language isn't like science!" When I talk about teaching language systematically, people say "Language isn't like science." When I talk about language in schools, I'm told it's destined to fail because "Language isn't like science."
Well, I'm in the middle of trying to sort through a lot of old stuff that had been stored in the loft, and I came across a piece of paper on which I had hastily scrawled the following:
Language isn't like science.
Why?
It's about choosing the rules, not knowing the rules.

Now this is not a statement of my belief; rather it's my attempt to understand the logic behind the statement, so for anyone other than myself to get my full meaning requires a bit more explanation.
The reason people say "language isn't like science" is because of their misconception of the nature of science. To them, science has been presented as a series of rules to be memorised. They have been conditioned to think that the end goal of science is to be able to regurgitate the rules on demand, because that's all that was required of them in school.
That is not science.
Science is the art of investigating natural phenomena and finding explanations and models for them. These explanations and models are mostly a combination and application of existing scientific rules, and sometimes of identifying and creating new rules.
Or to put it another way: science is not about the knowledge of rules, it's about the application of rules.
But would the same statement not hold for language too? Language is not about the knowledge of rules (we can all agree on that) but the application of rules, surely?
Science is very often taught badly, in that there is such a focus on the rules themselves that students never get the chance to integrate those rules into a working body of scientific knowledge. This leaves the student able to recall the rule or law by name, but not recall the rule when faced with a problem that requires that rule in order to reach a solution. You cannot solve a useful scientific problem this way -- the only type of problem that can tell you explicitly which rules are required to solve it is a problem that has already been solved, and science is about creating new knowledge, not repeating the known ad nauseam.
A good course in science will instead train the student in identifying the characteristics of a problem domain and noticing patterns that relate to particular laws or rules: they will teach them how to select the appropriate rule for the given situation.
That, I contend, is the very same process we go through when we try to formulate an utterance. We have a bank of words and grammatical rules at our disposal, and we have to select the appropriate items from it to express the message that we want.
So language is a lot like science, and the objections typically raised against grammar teaching are systemic problems that also affect science teaching. It's a problem that the late, great Richard Feynman recounted in his memoir Surely You're Joking, Mr Feynman?, when he talks of his experience on sabbatical placement in Brazil. It's a problem that affects all education systems to a greater or lesser extent.
But the problem comes when reformers attempt to throw the baby out with the bathwater: "rules teaching has failed," they tell us, "so we need to do away with rules."
That, to me, is a ridiculous philosophy. How can you choose which rule to apply if you don't know what rules exist? How can you search for it if you don't know what it is?
Let's be clear: I do not have to be able to recite the present tense endings of regular -ARE verbs in Latin in order to usefully "know" the rule, but that doesn't mean I shouldn't be taught them. I initially learned Spanish, for example, through the explicit teaching of the endings, and the explicit teaching of rules like 2s = 3s + "s" and 3p = 3s + "n" (NB: this is my notation, not the way I was taught the rule!), and not by memorising the list of conjugations or a table. But that was still explicit teaching. I did not learn by osmosis, I did not learn by exposure, I did not learn by magic. I was told what my range of choices was, then given sufficient opportunities to make those decisions that eventually I could make them subconsciously.
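That shorthand notation can be sketched as code. The two helper functions below are purely my own illustration of the 2s = 3s + "s" and 3p = 3s + "n" rules, covering only regular present-tense Spanish forms:

```python
# Deriving Spanish present-tense forms from the third-person singular,
# in the spirit of the "2s = 3s + 's'" notation above. Function names
# are illustrative, not from any real library.

def second_singular(third_singular: str) -> str:
    """2s = 3s + "s": derive the tu form from the el/ella form."""
    return third_singular + "s"

def third_plural(third_singular: str) -> str:
    """3p = 3s + "n": derive the ellos/ellas form from the el/ella form."""
    return third_singular + "n"

# "habla" (he/she speaks) yields both derived forms:
print(second_singular("habla"))  # hablas (you speak)
print(third_plural("habla"))     # hablan (they speak)
```

The point of teaching the rule this way is exactly what the paragraph describes: the learner memorises one base form plus two tiny derivations, not a whole table.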
13 January 2014
Learners and learning: "I wish I'd known then what I know now..."
These are the favourite words of many internet polyglots, and, continuing on the theme of what we can learn from learners, they are one of the most important reasons why you should take any advice from someone who has "been there" with a pinch of salt.
The problem with this philosophy is simple: there's no guarantee that you would have been ready to learn that then. I'll use my own learning techniques as an example.
What I wish I'd known
I took French and Italian as high school subjects, and didn't learn all that much in either, considering the time cost. When I started learning Spanish and Gaelic, I was simultaneously studying English language at degree level with the Open University. When I moved on to degree-level Spanish, I was annoyed that none of the linguistic concepts I'd been taught in the (mandatory!) English module were used to speed up the learning process. Too many things were left undescribed and confusing to my fellow students, when a ready explanation was available.

The English course didn't just occupy itself with the traditional grammatical concepts of parts-of-speech and syntax, but addressed language in terms of multiple frameworks: Halliday's systemic functional linguistics and metafunctions, the idiom principle vs the open-choice principle (language as a series of set phrases vs a series of free grammatical choices), lexicogrammar, corpus linguistics and collocation, directness and indirectness, agency and affectedness... the list goes on.
Knowing each of these concepts allowed me to disambiguate or disentangle my own confusion about new word or phrase forms, and I know that I would not be as successful a language learner today without having undergone that course of study. (This of course is in direct contradiction to a significant number of language learners who flatly refuse to accept that conscious study has any value whatsoever.) Now I have often wished I had learnt all that earlier... but would I have been ready? If I had not already learnt (a little of) two foreign languages, would I have been as open to the instruction I was given? Or if I had not been actively engaged in learning two languages simultaneously with that study?
I cannot know, therefore I cannot say for sure, but I believe that it would have helped. I temper that with the knowledge that the course I took would be overkill for most learners, but that there are certain concepts that help immensely. (e.g. indirectness as a form of politeness; compare "shut the door", "can you shut the door" and "could you shut the door" -- the imperative is direct and rude, can is somewhat indirect, and could is very indirect, making it polite. This same rule about indirectness holds for most European languages, but not e.g. Japanese. It is very easy to draw a learner's attention to the presence or absence of this pattern, but many courses fail to even attempt this.)
So I would advise learners to learn this stuff (if I only knew of a good book!) but I would also caution them about trusting my advice blindly.
What I might have wished I'd done
When I bought myself a DVD player, it was because of language. I had taken a French level test with the OU before starting my study, and while my reading and writing were at a pretty good level, my listening level was disproportionately low (incidentally, this worked in my favour in the end, as otherwise I might have started with French rather than English), so I needed to improve my ear.

Fast forward 9 years, and I now find myself watching Gaelic TV or French films and noticing unusual turns-of-phrase on the fly, by comparing the soundtrack and the subtitles. This, as it turns out, is a highly effective way of learning the sort of stuff that doesn't make it into the textbooks.
So would I recommend it? Not really. 9 years is a long time, and it has taken me a lot of practice to get to the point where I can do this -- when I started, the subtitles would soak up my attention, and I wouldn't just "not understand" the audio track, I wouldn't hear it -- my mind blocked it out. It took me several years to stop blocking it out, and years after that to be able to consciously follow both languages, and even now I'll generally slip into either reading or listening -- it takes active concentration to do otherwise.
Now I could have forgotten all the time and effort and told myself that I obviously hadn't worked hard enough soon enough, and that if I had, I would have learned even better and quicker than I did, but I simply don't believe that's true. I don't think this technique, which is now one of the most useful in my arsenal, is actually worth the effort of learning to most people.
This serves as a useful reminder to me that everything I do now is a refinement and improvement of things I've been doing for years. These techniques can't just be "done", they have to be learned; which means that I can't just "tell" people to do them, I have to teach them... or keep my mouth shut.
So there we have another problem in learning from successful learners: all too often they will simply tell you what they do now, with no real conscious understanding of how hard it was for them to reach the point where they could.
09 January 2014
A plug or a polemic?
Whenever a book on language is released, it is traditional for the author to try to get a column in a newspaper raising some pertinent language issue, then launching into what is little more than a sales pitch for his book, describing how it addresses the problem.
It was refreshing then, to see an article in the Guardian where an author decided that the best way to use his 15 minutes of fame was not to plug his wares, but instead to pull out a polemic on the state of linguistics in general.
Harry Ritchie started off with a pertinent language issue, as is tradition. His issue was the often overlooked problem that our culture of "talking properly" in schools is actively disengaging children from their learning. (Most language experts agree on this fact, by the way; it's just the political and school systems that reject this.) His book, as far as I can gather, is simply an attempt to explain real language patterns in a clear and engaging way (I have not read it, so cannot comment on the success or otherwise of this) and he could have continued in the typical manner by throwing up multiple examples drawn straight from his book, slowly pushing the reader into wanting to know more.
But he didn't. Instead he went tangential to the content of the book, delivering a finely-tuned polemic about the state of the linguistics world, and pinning the blame on the door of Noam Chomsky.
And he is right to do so.
I was twice introduced to the world of linguistics: once through the Introduction to Machine Language unit in the University of Edinburgh's artificial intelligence department, and once through the Open University module The English Language: Past, Present and Future.
Both courses were very good, which should be no surprise given the academic stature of Edinburgh's AI department and the OU's English department. However, both started by looking at structural grammar through the view of Noam Chomsky's generative grammars.
Given the research and refinement in the half-century since Chomsky, it should be no surprise to find that his model is primitive and of little use, and yet it is still taught as the sine qua non of modern linguistics. I've written my own little piece about this before, so I'll not repeat myself too much, but I'll say this:
Chomsky wrote a model of grammar that was based almost entirely on reading language one-dimensionally and sequentially, a model that so badly fits real language that it allows the generation of meaninglessness such as colourless green ideas sleep furiously, and then extrapolated from that nonsense that meaning and grammaticality were entirely separate things.
He wrote a demonstrably broken model, then used the brokenness to try to draw conclusions about the real world. And people treat him like some kind of genius for it...!
07 January 2014
What can learners tell us about learning?
Last month, David Mansaray appeared on the How-to-learn-any-language forums discussing a new series of podcasts about language learning. His previous interviews focused exclusively on polyglots, but now he wants to expand that scope:
"I plan to interview polyglots, linguists, teachers, expats, successful students, interpreters, lexicographers, etc."

It's definitely a good idea. There is a lot to be learned from polyglots, but sadly, there seems to be very little that can be learned from the internet polyglot community.
The problem is simple: most are happy to tell you what they do, but a great many are not happy to let you find out what they do. You can listen to what they say, but you can rarely probe or challenge it, which is a nuisance, because very few of us are ever fully aware of what we do.
Stepping away from the world of languages, consider this story I was told in my university days. There was a cheese factory near Edinburgh, and one of their employees specialised in determining when the cheese was ripe enough for packaging and selling. He was nearing retirement, and the company hoped to make a machine to do his job rather than training a replacement, so they called in experts on AI and machine learning to try to create a robot for the purpose. They asked the man how he determined when the cheese was ready, and he told them he prodded it with his thumb, and he knew if it was ready based on how springy it was. They set up a machine that bounced a little thumb-sized probe on the cheeses and measured their springiness. The problem was, there was absolutely no correlation between the measurable springiness of the cheese and the expert's judgement on whether it was ready or not. Eventually they discovered that in prodding the cheese, the man had been breaking the surface, which released a scent that he subconsciously detected. As computers can't smell (yet), the project was abandoned, and an apprentice hired.
Going back to language, for years I derided those who talked about "shadowing" other people (i.e. repeating audio books or films verbatim), thinking it was a passive process that wasn't truly "linguistic". I didn't shadow, anyway. But then I remembered that I used to shadow DVDs in Spanish. Not religiously, not obsessively, but from time to time. I did it, and I can't say it didn't help me -- it might have, it might not have. I can say I believe that it is of minor utility at best, but I have to be careful not to say what it "is" or "isn't".
If I can enter calmly into a discussion with other people, I can often be made to realise that I have fixated on one thing and completely ignored something else, and I can in turn make them realise that they have done the same thing. The end goal of such a discussion shouldn't be to walk away with a better understanding of each other's techniques but for each to walk away with a better understanding of their own. When we enter such a conversation with the view that we are correct and the goal of convincing everyone else that we are, we add no value.
Interviews can be a starting point, but only if the interviewee is engaged in discussion rather than merely lecturing those he sees as less informed.
03 January 2014
Cross-language interference and failing to learn by exposure
Last month I was talking about the dangers of taking a hard-line view in your learning techniques. The specific example I used was spaced repetition software, and unfortunately my post focused too heavily on the example and not on the general principle I was trying to highlight.
Well, as it turns out, I stumbled across another example on a visit to a very wet and windy Edinburgh two days before Christmas. I waited for the worst of the rain to pass before leaving Haymarket station, then headed towards the centre of town. On my way, I passed a Turkish barber's with the following phrase in the window:
APPOINTMENTS NO-NECESSARY

Now the error here isn't as bad as you think, because this was in Scotland. In central and southern Scottish dialects, there is a distinction between the two functions that the English word not carries: negation of a preceding verb is carried out with nae, and negation of a following adjective with no (as I understand it, in the north both situations use nae). The owner has obviously learned this use of no through spoken usage, and therefore has no explicit, formal knowledge of how it functions.
The use of the hyphen indicates that he views it as a prefix, closer bound to the adjective than it truly is. This is no surprise to me, as I know many Spanish people who pronounce phrases like no bad as though they are a single word. Given that this sort of colloquial speech is never taught, it is clear that the Turkish barber and my Spanish friends learned the structure by exposure.
Now I have always respected the role and importance of exposure in the learning of a language, but there are those who would exaggerate that importance and promote exposure as the single determining factor of language learning, claiming that everything else -- study, teaching, practice -- is just window-dressing, a distraction to keep the learner motivated until such time as they accumulate enough exposure to "acquire" the language. But here we have an example of a feature that is learned in most cases with nothing but exposure, while all the study, teaching and practice is carried out in Standard English, and I have met precious few who have acquired the structure correctly.
Part of the problem is English's stress patterns. Spoken English has at least three stress levels: primary stress, secondary stress, and unstressed -- Spanish has only stressed and unstressed, as do many other languages. Spanish speakers just don't seem to recognise the three levels in English without conscious teaching. Worse, the opposite of necesario in Spanish is no necesario, so they've got a pattern to match it to that misleads them. Yes, innecesario also exists in dictionaries, but it's not the most common form (the presence of a double N marks it out as a pretty antiquated one), there is a tendency for Spanish words to replace an in- prefix with no, and crucially, this no is spoken indistinguishably from a prefix -- it could just as easily be written nonecesario as no necesario. (The reason it isn't is probably just the usual case of orthography being a bit conservative and etymological, rather than a perfect model of the modern language.)
And Turkish, of course, is an agglutinative language, so prefers affixes to particles in almost every situation.
So as a soft-liner, I would say that a lack of conscious awareness or directed practice is to blame for the failure to learn this structure correctly, and that exposure, while a vital part of full language acquisition, couldn't correct for that, regardless of quantity and intensity.
A hard-liner would put it differently, claiming instead that these people simply aren't getting enough exposure. With my Spanish friends, I could accept that, but not with a barber. How many professions offer the same opportunities for exposure as this? Have you ever had a haircut without getting a long conversation thrown into the bargain?
The problem, as I've always said, is that all language has a high level of redundancy -- there's more information encoded in the language than we would need to understand the message. (It evolved this way in order to allow us to understand each other even with background noise. If you count up in English from 1, you'll be saying a different vowel every time up to and including 8 -- the numbers are so different that it would be very difficult to confuse one with another.) So if you don't need everything to understand the message, why would you even notice it? "Good enough is good enough," as they say, and the brain has no motivation to notice more when it has already got the message.