27 December 2012
Ability vs ability
The Telegraph was talking nonsense. The BBC have the real story.
A friend of mine shared a link to an article about a man from England who had a stroke and started speaking Welsh, despite having only spent a short spell in Wales when he was evacuated there during the Second World War.
This is pretty interesting from a neurolinguistic perspective. I did a tiny bit of AI at uni first time round, and one of the few things that stuck with me long-term was the notion that the brain works by a combination of activations and inhibitions.
Here's the concept in a simplified form (possibly oversimplified, but hey-ho)....
Human brain cells become excited or "activated" when they receive appropriate stimulation. You can extend this idea of activation to groups or paths of neurones that map to more specific concepts.
Unlike a computer, our brains can use this structure to evaluate multiple things literally at the same time. If we have a complex problem, our brains will be contemplating multiple solutions. But with so many possible solutions, how does the brain decide which one to choose?
Evolution's solution was the mechanism of "inhibition" -- certain activations inhibit other responses. In the case of a complex problem, the strongest solution "inhibits" the others, effectively switching them off.
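As an aside, that "strongest response inhibits the rest" idea can be caricatured in a few lines of code. This is purely a toy illustration -- the function name and the numbers are invented, and it's not a model of any real neural process:

```python
# A caricature of "winner-take-all" inhibition: every candidate response
# has an activation level, and the most strongly activated one inhibits
# (switches off) all of the others.
def winner_take_all(activations):
    """Return the activations after mutual inhibition.

    activations: dict mapping a candidate response to its activation level.
    """
    winner = max(activations, key=activations.get)
    # All the losing candidates are inhibited down to zero.
    return {name: (level if name == winner else 0.0)
            for name, level in activations.items()}

# English is strongly activated, Welsh only weakly -- so Welsh is
# inhibited even though it was activated.
print(winner_take_all({"English": 0.9, "Welsh": 0.4}))
# -> {'English': 0.9, 'Welsh': 0.0}
```

The point of the caricature is just that "knowing" Welsh isn't enough: the weaker activation never gets to fire while the stronger one is present.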
Alun Morgan, then, had learnt enough Welsh to speak it... in theory. But he hadn't learnt enough to overcome the inhibitions that English had placed in the way of him speaking it.
The Real Challenge
That, then, is the real challenge for any language learner: not just learning enough to be able to speak it, but learning enough to be able to defeat the native language that's competing with it.
This is where the apparent "magic" of immersion comes in: the brain picks up on the fact that the native language isn't any use. On the other hand, this is probably why forced immersion so often fails -- because the learner knows that they don't need the target language to communicate with the other people in the classroom.
So do we give up on immersion?
Not "give up", no, but we have to stop seeing it as something magical. Finding ways to make the language feel genuinely necessary to ourselves or to our students is not easy. I don't personally believe that we can ever overcome the native language as a source of inhibition through pure force of will. In fact, I believe it often has the opposite effect, turning the target language into a barrier to communication rather than a means of communication.
Computer games as language study
Before the Christmas holidays I tried to give my students advice on how to keep practising their English over the break, and the thing I really recommended was playing computer games. I was sure I'd blogged about computer games before, but apparently not...
Anyway, computer games as language study? Nonsense, right? "Blam blam blam, pow pow zap kaboom" is the same in any language, after all.
True. Not all games are of use to a language learner, as most have virtually zero language content. Action games really aren't much use... unless you're playing with native speakers on the internet, but most people aren't going to do that when they can play with people who speak their own language (and on a local server, minimising the lag).
The games that I recommend to my students are point-and-click adventures, the likes of the old LucasArts Monkey Island series (if you're familiar with those). If you're not familiar with the genre, you basically go around talking to people and trying to find objects to solve puzzles. There isn't usually any way to die; you just keep going until you find the solution and win the game. These are particularly useful for several reasons:
- Dialogue is central to the game.
In most action games, very little of the dialogue is needed to complete the game; in a point-and-click adventure, the dialogue is full of clues on how to solve the puzzles. The learner is therefore forced to pay attention and to try to understand.
- Dialogue is partially repetitive and has a restricted vocabulary.
On the simplest level, there are default phrases that are repeated whenever the player tries to perform a task that isn't part of the game (eg "I don't want to cut that", "It's too heavy"). More subtly, though, the vocabulary is very "tight", as the same words appear very frequently to give you clues as to how to solve the puzzles.
When you're watching a film or reading a book, there is repetition of language, but not to the same extent. And yet, because these games are designed for native speakers, the repetition is not so blatant and restrictive that it becomes boring, unlike many dedicated learners' resources.
- Subtitled dialogue.
Most of the games in this genre have voices and text, although the earlier ones are text only. Having the option to read the text as you're listening seems to help train people to recognise the spoken form of words they know how to read. You can't really do that with film and TV, because the subtitles never match exactly what the actors say (subtitles have to be easy and quick to read), but in these games the spoken and written dialogue is almost always exactly the same -- as a rough estimate, I'd say at least 95% of all dialogue matches. This makes this the only class of "authentic materials" that offers the ability to read and listen at the same time.
- Slow pacing.
The dialogue in these games is pretty easy to follow, as it's not subject to the usual sources of interference. In films, people talk over each other and loud sound effects mix with the dialogue. In a point-and-click adventure, everything is usually clear and distinct (and if it's not, you can usually turn down the volume on music and sound effects independently of the dialogue). As a bonus, the pacing of the game gives you plenty of time to look up unknown vocabulary as it occurs -- the game naturally waits for you, so there's no need to pause and unpause.
I've used this approach myself, having bought the first two Runaway games when I arrived in Spain, and having downloaded a couple of games in French before moving there.
It won't teach you a language by itself, but as a form of practice when you're at the intermediate level, it's very, very effective, and I would recommend it to anyone, even if you don't normally play computer games....
21 December 2012
Classroom activity: the ever-expanding story
When I started learning Gaelic, I was learning from a very experienced teacher. She was a retired headmistress (and former music teacher) who had been teaching Gaelic since she gave up her previous job: Dr Margaret MacKinnon, a long-serving judge at the Gaelic music festival, the Mod.
It was an intensive week-long course and towards the end of the week she had us lined up on the steps of the outdoor amphitheatre (it was a gorgeous sunny day) and she got us to tell a story. The rule was simple: repeat everything that had already been said, then add something.
I liked this, and I've frequently gone back to analyse why.
My first attempt at an explanation was this:
It is easy to try to translate received language into your native language. It is easy to remember the story as the meaning only, and forget about the words. With a short sentence, you can usually get away with translating backwards and forwards. But as the sequence grew longer, the complexity of mentally juggling the original sentence, the translation and the meaning became too great. The most efficient way to carry out the task was to stick with the Gaelic.
So that was my first thought: it "maxes out" your brain, forcing you to be more efficient.
Now I recently tried something similar but without the repetition -- just the addition of words. It was only partially successful, leading to two further observations:
- The complexity of the structure of the story and language is supported by the repetition.
- The need to repeat is a great piece of classroom management.
The first one is pretty interesting to me, as I'm very much against rote learning, so of course I had to justify to myself why this repetition isn't rote. ;-)
Well, for one thing, they're not going to be able to recite the story the day after, so it's not really rote "learning", even if it's a somewhat rote process. But that's sophistry, so I couldn't really kid myself on with that for very long.
The second justification is that I found that the longer the sentence got, the more I needed to visualise the story in order to remember it. You can repeat a short phrase parrot-fashion, but it takes a long time to memorise a long passage if you don't understand it. Therefore the student is forced to engage in the material meaningfully. This is just a refinement of my earlier assessment of it as a "maxing out" of the brain, but I believe it's crucial to addressing classroom problems in all activities.
Too many tasks that I have been faced with as a learner have left me with the choice between a rote, mechanical approach to solving the problem and a meaningful approach. I've always chosen the meaningful approach, which is what makes me a successful learner. The least successful learners are the ones who choose the mechanical approach -- but that's not the learner's mistake, it's the teacher's mistake, because the human brain always seeks the most efficient approach to complete the task at hand. If the easiest way to complete a language task is a mechanical one, that's bad task design.
I cannot emphasise this point enough. I have spoken to a great many teachers who simply don't get it. They say my approach is the "correct" one, and what others should be doing too. They blame the weaker students for making the wrong choice. But how can they make that choice if they don't know what it is? I knew how to learn because I was taught to learn: I spent most of my pre-school hours in the care of my mother, a fully-qualified school teacher, playing with educational toys. By the time I reached school, I did not need to be taught how to learn, but the others did. Please don't ask students to make a choice until you've started to teach them how to make that choice...
But I'm diverging from the activity....
So we've got a task that requires attention, discourages distraction, and forces the student to process language efficiently and meaningfully.
The next big concept I picked up on was the idea of "mirror neurones". I had long believed that receiving and producing language were intrinsically linked, and that we understood others by considering what would make us say the things that the other party says.
Then I read about mirror neurone theory, which claims that this is pretty much what happens. So does the activity put words in your mouth? Are the students going through the process of production every time they hear this language that they now understand? I hope so, and even as the teacher doing this task in English, I feel myself "speaking" in my head while the students are trying to recall the whole story.
But today I refined my views further in terms of gamification, which I have been thinking about a lot lately.
My lack of belief in gamification has been previously documented here, and can be summed up as "gamification isn't about the core mechanics of a game, and it's the mechanics of the game that make a game 'fun'." In a gamified classroom, this activity would be rejected as there are no scores and no winners and losers. There is no "competition" or "achievements".
If you tried to add anything like that in, you would reduce the effectiveness of the game. When Margaret did it with us, she encouraged us to correct our own mistakes before continuing. When I do it with my students, I correct their mistakes and work them into the story. If a frequent pairing comes out in the wrong order due to the turn-taking, I stop and I fix it, and the language content improves (eg if one student said "butter..." and the next said "...and bread", I would correct it to the neutral order "bread and butter" to prevent rehearsing an unusual collocation).
But the only way of scoring it would be to penalise mistakes, which would probably result in much shorter and much less effective sentences.
However, the activity has a natural "game mechanic" which is solid and motivates learning: there is a challenge, and the challenge increases, and as you face the challenge you learn to cope with it. That's what a game is: learning to progressively cope with more and more difficult, and more and more varied, challenges. "Gamifying" this activity, like most educational activities, would kill "the game" that's already there... which is why gamification is such a waste of time.
So after all that theory and pontification, here's:
The activity
Arrange the class such that there is a clear order. That can be rows, a single line, or a circle.
Say one, two or three words to start the story.
The first student repeats your words, then adds 1, 2 or 3 of his own.
The second student repeats all of that, and adds 1, 2 or 3 more.
Now it's vital that this happens quickly. Some students will want to stop and think of "what" to say, whereas a quick "so he", "then it" or even just "and" keeps the game moving and leaves it to the next person to finish (and they've got the whole time of the repeat to think of something).
Correct errors. Make sure they're repeating correct language.
It will stutter and slow down. Some people will need prompting with a few words to jog their memory. Keep it going for a while longer -- don't restart at the first forgotten word.
But at some point stop it and start afresh -- a few problems is a challenge, but too many is frustrating, which is never good.
Don't let them write it down -- that just gives them a way to stop paying attention. (In a very mixed group, it might seem necessary for the weakest, but it's a survival strategy and it seems to reduce the educational value.)
So why all that pontificating before?
Why didn't I just explain the activity on its own, before all the theorising?
Because a learning task must serve a purpose and the teacher must know what that purpose is.
Because I'm personally tired of seeing teaching activities described without giving a clear description and justification of what they're supposed to achieve and how.
Because I don't want readers to see the activity and then "adapt" it without fully understanding what it currently does. I don't want people to delete the repetition on grounds of being "boring" or "rote" -- the activity is far more boring without it.
And maybe mostly because I'm a self-important wee so-and-so who loves the sound of his own voice. Aren't we all?
19 December 2012
Unrepresentative representation
I've heard it said that with a local councillor, a directly-elected MSP, several regional list MSPs, a Westminster MP and an MEP in Europe, we Scottish people are better represented today than we have ever been. But is that the case? Commentators have noted that as populations have grown (and as the vote has been extended to commoners, women, and then younger people), the number of people represented by any individual politician has increased. How can one person represent thousands of very different people?
When we consider also that most of these politicians represent a handful of major parties and are in many ways mere figureheads for "party policy", in the end you have 6 or 7 manifestos representing the entire population of the UK. Clearly, they can't serve the public will.
When Thatcher wanted to dismantle union power in the 70s and 80s, she missed a trick: if there's one thing that democracy has taught us, it's that the best way to beat collective bargaining is by granting power to a representative body, rather than by taking it away, because the more diverse a group represented by a body, the less the body is representative of the group.
So you're probably asking yourself what this is doing on a language learning blog....
Well, it's not a language issue per se, but it is an education issue. It's an issue for universities, and for education funding. In my opinion, one of the worst things to happen to post-school education in the UK was when the technical colleges were given incentives to become new universities. The line between vocational and academic education was blurred unnecessarily. Do hairdressers need a four-year degree? Few people would genuinely say they do. And university education aims to build learner independence, while vocational education relies very much on supervised, hands-on training.
The two things are very different, and rather than grant vocational education the respect that it deserved in and of itself, they tried to make out it was something it wasn't.
Who is there today to campaign for a reversal of bad decisions? No-one.
Why? Representation.
University teachers' unions represent university teachers in all types of institution, and students' unions represent students in all types of institution. This means that neither the students' groups nor the teachers' groups are able to stand up and point at one group of universities and say "they shouldn't be universities". It's pretty much impossible for these bodies to argue against any government policy (except across-the-board budget cuts), as any change will be beneficial for some of their members, and it's pretty much impossible to campaign for any new policy, as it would likely be detrimental to some of their members.
The unions have therefore been given more and more representational power, leading to them rendering themselves powerless, and the government is free to do whatever it likes. Even where protests have led to changes in policy, this usually only delays matters by a year or two, and the changes happen anyway.
So you may be wondering why this topic came up all of a sudden.
I recently received an email from a university advertising a couple of new CPD certificates they were offering. For those of you who don't know, CPD stands for "continuing professional development", and essentially means "job-related training courses". It is all right and proper that universities should be seeking to earn additional income from the professional training market, and I have no problem with that. These CPD certificates were built on modules in the university's degree scheme. It is all right and proper that universities should be seeking to reuse existing material in new ways, and I have no problem with that.
What I do have a problem with is the fact that these modules were priced at the standard cost of a Scottish Higher Education module. Presumably, then, the university is offering professional training, but putting it through the system as higher education and claiming government funding for it.
I contacted the student president for the institution to express my concerns about this, and he leapt to their defence. Everyone has a right to an education, he told me. Now I agree with this, but everyone should have the same right as everyone else. Why should certain people get government funding for their CPDs when other people don't? All in all, this seems like fiddling the books to me.
But in the end it doesn't matter what he personally believes, because he is duty-bound to represent all matriculated students at his institution. (I did point out to him that the CPD students aren't students until they actually sign up, but that's not the main point.)
What we have here, then, is a situation where a small group are benefitting from special treatment at the cost of an education budget with a specific goal, but no-one is able to raise an effective protest against the misuse of funds because everyone represents someone who benefits from it, even though it is to the detriment of most of the people they represent.
How can we defend free education when we aren't able to denounce those who harm the system?
11 December 2012
Gamification... I'm not a fan
I'd been thinking for a while about writing something on gamification, but I'd never got round to it. I was kicked into action today, though, by a video that appeared on Slashdot, from a US lecturer by the name of Clifford Lampe.
Gamification, if you don't know the word already, is the use of game mechanics to improve whatever it is you do. It started as an idea for education, and I wasn't a fan of the idea. It moved into business, and I wasn't a fan of the idea. It's now moving back into the classroom... and I'm still not a fan of the idea.
My main criticism is pretty blunt: learning is fun already.
"Now wait a moment," I hear you cry, "not everyone enjoys learning."
Well yes, yes they do. What they don't enjoy is when they're stuck in a classroom and they aren't actually learning anything. In fact, years ago I read an article claiming that by using a brain scanner, scientists had proven that all the fun in a game comes via the learning centres of the brain.
Gamification, in everything I've read or heard on the subject, doesn't take this to heart. Instead, it focuses on the accoutrements of gaming, and tries to manipulate "achievement addiction". In business, you give out little badges to regular contributors to your website to encourage them to contribute, rather than making the actual process of contribution inherently rewarding. That's fine for customer retention, but it's misdirected focus if you're attempting to teach.
Basically, the teacher ends up looking for ways to convince students to complete the task in the hope that in doing so, they will learn, instead of designing a task that is so inherently educational that the student becomes engrossed in the process itself. The latter is what is traditionally known as "good teaching".
Gamification therefore continues the trend that talk of multiple intelligences and affective factors has established: a single small part of the puzzle eclipses the bigger picture and distracts educators from looking critically at their material on its own terms.
Now, Lampe's video is somewhat disingenuous (although that may be the editor's fault, not his). At no point is there any mention of what his course is, although the mention of a mix of computing and sociology students gives us a clue that it's something about online interaction, and if we look at his personal university page, we can see he teaches two things: a first year undergrad Introduction to Information Systems and a higher level course called eCommunities. Without this context, his talk about the use of social media in the classroom is utterly meaningless -- because web 2.0 isn't just the medium of the lesson, it's also the topic.
It's pretty hard to generalise out of this.
Worse, he himself suggests that the content of his teaching appears to be more memorable precisely because his teaching style is unique. Consequently the technique must logically lose effectiveness if used elsewhere. That's true of any mnemonic technique, of course. Give a student 2 or 3 useful acronyms and they'll remember them. Give them 2 dozen and they'll start to clash with each other and become impossible to use. So "teaching style as mnemonic" suggests we should all be doing very different things, not all adopting the same technique (for example: gamification!).
Another element of gamification he suggests is choice of assignments, but many teachers already offer that. The question to be addressed is which teaching points can be fairly tested with a free-choice assignment, and which need a specific task, because every point is different.
Moving on from Lampe specifically, the problem is that gamification comes down to the notion of "achievements". The notion of achievements started with scout badges, as far as I'm aware. Games started to recognise various skills rather than have everyone chase the same goal: the high score. With online high-score tables, that high-score became harder to achieve. But this evolved out of existing behaviour. Games provided sufficient information to start manually comparing metrics -- people started replaying Mario games and finding as many coins as possible. Sonic players took up the idea of the "speed run" in early levels. Games started giving more and more information to allow players to track their metrics: Doom told you how long you'd taken, how many secrets you'd uncovered and how accurate you were at shooting (shots on target:total shots).
So what does that mean? Early achievements were led by the players' existing behaviour -- it was not an imposition of "gamification" rules on gaming.
Of course, later achievements were an imposition of gamification on gaming. A target like "kill 500 orcs with the axe" doesn't reward skill specifically, just persistence. It doesn't promote learning, then. Others involve very specific skills that are not of general use. I tried for a while to get "Terminal Velocity" in the Steam achievements for the game Portal. To do it, you have to perform some very delicate manoeuvres to fall continuously for 30,000 feet. But it's just fiddliness for its own sake; it's not a necessary skill to do anything practical within the game itself. How ironic is that? Gamifying the game actually takes you away from the game?
They really are just using achievements as a drug to keep you coming back as an alternative to giving you a genuine reason to do so.
Copying this strategy into the university is a hiding to nothing: you're encouraging time-on-task, but you're perverting the students' drive. You're rewarding time instead of rewarding learning. That's pretty rough.
And in the end, of course, any game can be gamed. We already have enough problems trying to prevent cheating against established metrics (exams and assignments), but any new assessment metric is going to need to be hardened against cheating....
10 December 2012
Too many cooks spoil the net....
When I first took up English teaching in 2007, the internet was an incredibly useful resource. If I was stuck for a lesson idea, a quick Google search or a glance at one of my favourite sites would give me the inspiration and ideas I needed to build a useful lesson.
But now, the TEFL world and his dog are all posting their ideas on the internet. A search that would have brought up a handful of useful links in 2007 now brings up a load of low quality worksheets and seemingly aimless tasks. The act of searching for material is now arguably more time-consuming than just sitting down and writing your own material from the ground up, leading to the wonderful paradox that publishing more information leads to less reuse of material. And this potentially snowballs, as all those teachers who're making their own material "because they can't find good stuff" start to publish their stuff too, further exacerbating the problem.
Now this is not to say that the authors of these materials aren't good teachers, but it is clear that the worksheets and activities don't fully encapsulate the spirit and methods of their class. The warm-ups, the support activities, even the teacher's personality have a transformative effect on the material presented, and to take a list of a dozen questions and divorce them from that context robs them of their meaning and effectiveness.
People often compare the sharing of documents to the open-source software movement (well, it's getting less and less common now that document sharing is getting more and more common), but I've always considered that a fallacious comparison. Open-source is about exposing the underlying logic to the wider world, and allowing them to improve that logic; but a document is merely the conclusions reached by the author, not the logic he followed to make those decisions.
A very well designed learning exercise or test will follow a very strict process to ensure a sufficiently wide coverage of concepts and minimise the influence of luck on obtaining the correct answer, but as soon as you change one question in a set, you can end up breaking the balance of concepts tested and leave out something important.
If we're honest with ourselves, though, most of us will admit that we don't properly balance our tasks. There will be times when we hand out a worksheet and realise that we've missed an important case, and I know that as soon as I started marking the last grammar test I set, I started spotting gaps where concepts weren't tested while other concepts were tested multiple times.
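That kind of imbalance can be caught mechanically before the test goes out. Here's a minimal sketch, assuming each question has been tagged with the single concept it tests; the questions, tags and the threshold of two are all invented for illustration:

```python
from collections import Counter

# Hypothetical conditionals test: each question is tagged with the
# one concept it is meant to test. All of this data is made up.
questions = [
    ("Q1", "first conditional"),
    ("Q2", "second conditional"),
    ("Q3", "first conditional"),
    ("Q4", "first conditional"),
    ("Q5", "third conditional"),
]
syllabus = {"zero conditional", "first conditional",
            "second conditional", "third conditional"}

counts = Counter(concept for _, concept in questions)
# Concepts the test never touches at all
missing = syllabus - set(counts)
# Concepts tested more often than an (arbitrary) limit of twice
overweighted = {c for c, n in counts.items() if n > 2}
```

Run against this toy data, `missing` flags the zero conditional and `overweighted` flags the first conditional, which gets three questions while two other concepts get one each -- exactly the sort of gap-plus-repetition that otherwise only shows up during marking.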
Which is, of course, why we look to other people to provide us with exercises and tests -- it is far more efficient to have one person spend all the time making a meticulously balanced question set and then have hundreds of teachers reuse those questions.
That's why books are such a great idea in theory -- it's just a shame that in practice a great many language teaching books don't live up to the promise; which is where the internet was supposed to help. Unfortunately, the online material I've found to date isn't of great help. There are two major camps: the let's-dump-our-worksheets-and-move-on crowd and the oh-look-what-I-can-do-with-technology crowd.
The first lot is basically what I've already talked about -- problem sets with little or no guidance on how to build a coherent lesson around them.
The second lot is people who have learned how to use some flashy little piece of software, but more often than not they find themselves being controlled by the software, rather than being in control of it. This leads to a proliferation of pairing exercises (question halves and question-with-answer) and ordering exercises (sentence order or line-by-line conversation) because that's what the author knows how to do with the software. It's a further weakening of pedagogy.
The last example I saw of this was for revision of conditional sentences (if...). There was an introductory page that described the four conditional types, and then a selection of revision exercises, but each exercise focused on one type of sentence only, so the learner never needed to choose which type of conditional to use, just remember how to form the given type. These sorts of structurally-focused exercises are usually only recommended when initially teaching the form, with scaffolding being reduced continually until the students are able to make free, independent choices. But as I said, this was allegedly a revision page... yet they were doing exercises designed for introducing the structure, because that was the type of exercise the author knew how to create. And it looked nice, too.
So, yes, the internet is slowly becoming more of a problem, not a solution.
A way forward...?
This problem isn't really anything new -- it has been the perennial problem of the internet. In the early days, the web was a collection of articles written mostly by academics, so it was high quality, low volume. As more and more people started posting stuff online, the volume went up, and the average quality went down.
"No problem," the academics told us, "the network will self-organise, and the cream will float to the top."
The mechanism by which this self-organisation took place was intelligent linking. A trusted source recommends other trusted sources, and surfers navigated that way. It worked -- that's how I found a lot of information online at the turn of the century.
Then came Google, whose algorithm worked on the same principles -- links acted as recommendations, and the value of a link was related to the linking site's rating. It was very effective.
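The core idea can be sketched in a few lines. To be clear, this is only a toy model of link-based ranking in the spirit of PageRank, not Google's actual algorithm; the example graph, the damping factor and the fixed iteration count are all illustrative assumptions:

```python
def rank(links, damping=0.85, iterations=50):
    """Toy link-based ranking. links maps each page to the pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    n = len(pages)
    scores = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page starts each round with a small baseline score.
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            # Each outgoing link passes on an equal share of the
            # linker's own score -- a link acts as a recommendation.
            for target in targets:
                new[target] += damping * scores[page] / len(targets)
        scores = new
    return scores

# Two blogs recommend a hub site, which links on to one worksheet.
web = {
    "blogA": ["hub"],
    "blogB": ["hub"],
    "hub": ["worksheet"],
    "worksheet": [],  # a "leaf" node: it links to nothing
}
scores = rank(web)
```

In this toy web the hub outranks the blogs because two pages vouch for it, and even the leaf worksheet does well -- but only because a well-linked hub points at it. A worksheet with no incoming links at all would sit at the baseline score, which is exactly the situation described below.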
But this network of links just doesn't exist in the language resource world. Most resources are what we would technically consider "leaf" nodes in the network -- they are end-points that don't lead anywhere else. Even when they do, it's normally only to other materials on the same site -- there are very few teachers' resource sites that aren't dedicated to keeping you on their site and their site alone. Those that do link to other sites are (at least in my experience) pretty unfussy about what they link to, listing far too many resources and in effect simply echoing the results of a Google search in a different format.
This means that Google has very little to go on when trying to rate resources for language teachers, and this leads to a paradox from Google's point of view: Google has become so popular as the way to find resources that people are no longer building up the web of links that Google relies on. This isn't true in all fields: in some, forum posts have started to replace traditional websites as the source of recommendations.
However, most fields have a certain sense of simultaneity -- TV programmes, for example, are broadcast at a fixed time, and their importance fades. Most spheres are subject to such fashions, so people will be talking about the same thing at the same time. But language points don't come into or go out of fashion. Every teacher in the world teaches them... but not at the same time. Although there are forums related to the topic, it's not a topic that is really suited to the medium.
So there's not enough information out there to let Google separate the wheat from the chaff. It's a mess.
What's needed is for teachers who find genuinely useful material to start cataloguing it selectively, publishing a useful collection of links to a small number of resources that cover the major language points that most teachers need. Sites that favour quality over quantity. And we need to start sharing the links to those sites. And we need to start using those sites, rather than Google. If you know of any such sites, feel free to add links in the comments section!
Collaborative materials
More than that, though, if we genuinely want to share our materials, we need to make sure that they can be updated and improved upon. Millions of man-hours are wasted by producing multiple flawed worksheets, when we could make minor modifications to each other's and produce something of lasting value.
I doubt I'm the only teacher who alters the free materials I've downloaded from the internet, but like all the others, I keep my modifications to myself because the author's given me permission to use the material, but not to redistribute it. This is a shame, because some of the best designed materials I've come across have been from non-natives, and fixing one or two little non-native errors would make them into something valuable... but I refuse to use or recommend anything with even one non-native error in it.
But just permission to republish isn't enough, because that wouldn't stop the proliferation of materials -- it would worsen it. A dozen different sites with slightly different versions of the same worksheet... that would be a nightmare. We have to look at the software world again and look at how they control their edits, updates and revision; how they resolve differences of opinion... or not (projects often "fork" into two versions when people can't agree on a single way to progress, and quite often these forks are merged together a few years down the line).
Or...
We could stick to the books and materials we've produced ourselves. Your choice.
06 December 2012
Defining yourself by what you're not....
There is a danger, which many of us fail to avoid, that we start to define ourselves as individuals or as groups by identifying what we aren't, or what we don't do, rather than what we are and what we do.
France has traditionally had a strong identity, but all too often that identity is built by defining differences. They have a phrase: "exception française". But this exception very, very often becomes an excuse. "We're different" ends up being a weakness, a reason not to even try. And that's very, very sad.
Here in Corsica, they talk about the "exception corse", a philosophy which boils down to "the French are different and we're different from the French, so we're really different." And again, it becomes an excuse for weakness, particularly when it comes to English classes. Some of my strongest students aren't from Corsica -- they're from the mainland.
Why should Corsica be any worse off than mainland France? It's a Mediterranean island, so it's a popular tourist destination for people from all over Europe. English is spoken here all the time by tourists from the UK and the rest of Europe. Heck, I've seen reviews by Germans online complaining about the fact that the owner of a dirt-cheap backpacker's campsite can't speak English. A) why complain? B) proof that English is vital to tourism.
But they've managed to define themselves as a community that can't learn English, and they're doing their best to fulfil their own prophecy. Which is very, very sad.
The same philosophy of "we're different" seems to underlie their own language, too. Many are fiercely proud that Corsican is very different from French. But at the same time, many take pains to identify it as different from Italian, its closest relative (the nearest part of mainland Europe is Tuscany, from which most of the features of Modern Standard Italian are taken).
As a learner, though, it's a bit frustrating when you say something and get told "that's Italian". Quite often it is, because I'm just guessing, but very often it really is Corsican. Because, as with any language, there are variations: things I have learned from one source are "corrected" by someone from a different region with the usual "no, that's Italian" -- it's kind of off-putting.
That said, it's not like I haven't experienced this kind of thing before, and it's not like I wasn't guilty of it myself for a fair while. Doesn't UK English define itself in much the same way, as "not American"? How many times do we criticise each other for using "Americanisms", when a great many Americanisms actually originated in regional variation within the British Isles? A great many of these came from Hiberno-English or Scots, or indeed the Gaelic languages of both Ireland and Scotland, and I'm a Scot of Irish ancestry... so when someone from another part of the UK criticised me for using Americanisms, was it true? Had I really picked up "bad habits" from TV? And when I did the same thing to others, was it true? Or was the real crime not "Americanism", but simply "being different from me"?
Because that's one of the biggest problems that any group identity faces: the false assumption of identicality. Even within a group, we must all be allowed variation and individual identity. One of the reasons Scottish nationalism so often slips into Anglophobia is that it reacts against the uniform notion of "Britishness" imposed on us from the south. "British people" hate the French (Scotland's historical ally); "British food" includes many regional English dishes, but no Scottish ones; "British people" wave the flag and love the Queen.
This narrow notion of "Britishness" doesn't account for or allow the full variation of individual identity within the group of people it purports to define, so it is rejected by a great many people.
People say that nationalism is inherently bad because it focuses on differences, and is therefore divisive. I say that is not to be taken as a given: I believe that there is a real need to focus on our differences and to accept them. It is when we try to pretend that those differences don't exist that we become divided.
20 November 2012
Is language subconscious?
Is language subconscious? It's a question I hadn't given much thought to -- I'd always pretty much taken it for granted that it was.
Well anyway, apparently it's been a pretty contentious topic in linguistics and psychology for a while now, but a group of scientists from Hebrew University have found compelling evidence that it is, indeed, subconscious. (And not only "language" language, but even symbolic mathematical language to an extent).
Discussing this elsewhere called to mind a time when I was working on learning Scottish Gaelic with the aid of a piece of software. It flashed the word "eye" up on the screen, and I couldn't translate it. I had a complete block. I knew the word. I knew I knew the word. But it wouldn't come.
It felt to me like my brain didn't believe it was "eye", and that it wasn't sure it wasn't "I", or "aye" (or possibly even "ay").
The way I see it, a conscious language skill should have allowed me to override this confusion through conscious attention to the written form, which is (in theory) unambiguous. The fact that I couldn't force my way through to the correct meaning suggested that language is something you have less conscious control over than you think....
But fair enough, I've always felt that contextless words are very much a first step, and that for the most part you should be practising in context. This experience kind of confirmed it for me.
But the publishing of the findings came at the perfect time for me, having just written the draft of my last blog post about translation, and the way I automatically translated frames of reference (eg using the first person when the software says "you are..." etc), because that's the source of my dilemma:
Translation is effective precisely because the native language is understood subconsciously. You understand and internalise the meaning effortlessly. This means you know exactly what message you want to express in the target language. There is no other means for a teacher, book or software package to indicate so clearly to a learner what to say. Also, there is no other means to ensure that the teacher knows exactly what the student is trying to say (I've heard many teachers "correct" students' errors by leading them to say something that is valid in the target language, but completely different in meaning from what they intended to say).
Translation is ineffective if and whenever the native language is not understood subconsciously. Too many courses present contrived, meaningless examples. Perhaps these aren't quite as nonsensical as the examples the researchers from Hebrew University used, but as long as they are unnatural, they will interfere with subconscious processing, leaving the learner to process them consciously -- it's not "translation" that's the real problem here, it's "word juggling".
And translation's weak spot is the one I mentioned in my last blog post: our subconscious deals with relative references automatically. When you say the word "I" to me, I understand the concept of you, the person speaking. So dealing with first and second person pronouns is a perilous task. In order to do it correctly, the student has to stop using their subconscious processing, and language starts to become a conscious process... which is when translation stops working as a language learning technique.
This leads me to a conclusion that I really didn't expect, and that I'm not yet fully convinced of:
the optimal mixture of native-language instruction and target-language instruction in the classroom isn't a simple function of how advanced the students are; it goes right down to the level of "I" vs "he".
Wow.
16 November 2012
Just how do you prompt a student?
I'm a big fan of the courses recorded by Michel Thomas before his death, and I'm always happy to say so. The biggest complaint I hear about the Thomas courses is that they "teach you to translate". The argument goes that because the students are only ever prompted in their native language (English), they never learn to "think in" the target language. This bold assertion lacks any substantial evidence. I would argue that translation is one of the best ways of prompting a student, and that avoiding it actually delays proficiency.
Last year I wrote a post entitled Translation: an unjustified scapegoat, in which I pointed out that translation is very very often blamed for errors that do not arise from native-language interference, and therefore cannot be translation errors. What I neglected to say is that this is a real demotivator for learners. To be told categorically that they're translating, when they don't really believe they are translating, and to be told to do something else without any instructions on how to do it... well, the teacher is essentially blaming the student. That's not teaching, sorry.
Anyway, that's not the main point of this article, so time to put the train of thought back on the rails.
I am in favour of translation for three main reasons:
1: Translation allows simultaneous focus on meaning and form
If you conduct a language class in the target language only, it is all too common for the answer to be mechanically reproducible from the question, without any real need to understand the meaning of either.
eg Do you have a flargrard? - No, I don't have a flargrard.
I've no idea what a "flargrard" is, so it's a reasonable bet I don't have one. (Note to non-native English-speaking readers: the word "flargrard" doesn't exist -- I made it up for this example.)
It gets worse if you include substitution drills:
eg
House: I have a house - I have a house.
Cat - I have a cat.
Dog - I have a dog.
Flargrard - I have a flargrard.
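To make the point concrete: both drills above can be answered by blind string manipulation. A few lines of Python (using the made-up word from the example) produce a "correct" answer with zero understanding:

```python
import re

# Mechanically derive the "correct" classroom answer from the question
# alone. No knowledge of what the noun means is needed -- it works just
# as well for "flargrard" as for "cat".
def mechanical_answer(question):
    match = re.match(r"Do you have (a|an) (\w+)\?", question)
    article, noun = match.group(1), match.group(2)
    return f"No, I don't have {article} {noun}."

mechanical_answer("Do you have a flargrard?")
# produces a well-formed answer despite the word not existing
```

If a regular expression can "pass" the drill, the drill isn't testing meaning.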
Translation, on the other hand, gives the student a prompt that can be understood unambiguously. The student cannot fail to understand the full meaning of the sentence, a meaning which will therefore be intrinsically linked to the target sentence.
2: The so-called "form focus" of many monolingual tasks is really no such thing.
If the task can be done mechanically, as in the question-and-answer example above or the substitution drill, then you never have to select the appropriate form. You never have to recall it from memory. And if you don't have to recall it from memory, you cannot learn to recall it from memory.
In fact, it is pretty much impossible to devise a monolingual language task that will elicit the required grammar point/structure spontaneously. You either supply them with the structure, or you end up involved in a metalinguistic discussion that leads to one or two of the class recalling the "rule", and if we end up talking about rules, we're not connecting with spontaneous language.
3: Target-language-only normally fails to be "naturalistic".
I've discussed the issues of expository vs naturalistic language before, and in this case, I'll refer you back to the question Do you have a flargrard? The natural response is to say simply "No," or "No, I don't," but we're generally forced to answer in (unnatural) sentences in the monolingual classroom: "No, I don't have a flargrard." I don't know about you, but I don't like "answering in sentences" -- my brain knows it's wrong and unnecessary. I don't like telling students off for not answering in sentences because I see this as evidence that they're actually involved in language, rather than just juggling words.
So target-language-only is potentially devoid of practice of both meaning and form, which I'd say is a pretty big problem for language learning!
Is translation the panacea then?
Well, no, because it certainly has its pitfalls.
For example, if I ask you to translate "a brown bear", am I asking you to say "a bear that is brown in colour" or "a bear of the species brown bear, also known as the grizzly"? And even if both translate to the same thing in the target language, there's a point of asymmetry when we hit "white bear": in English that can only describe colour, while in some languages the literal equivalent is the everyday name for the polar bear. Prompts for translation must be very carefully selected, then.
This limits how far we can learn a language by translation, obviously. We cannot learn every noun and idiom by direct translation, but we don't need to -- the trick is to use translation where it's (1) obvious and easy or (2) where conscious awareness of the difference helps overcome a specific difficulty.
(1) The "obvious and easy" would include my favourite example: conditional sentences in English vs Romance languages, which translate pretty much directly -- eg "If I was/were you, I would...", "If I'd known you were coming I'd have baked a cake" etc. This is "advanced" material in traditional classes, but translation makes it trivially easy (to the point where Michel Thomas would be teaching it on the second or third day of his courses).
(2) An example of a specific difficulty is the difference in idiom between "to be" an age in English and "to have" an age in the Romance languages. Not a difficult rule, but even after loooots of practice, you'll often hear a learner make the mistake one way or the other. So the practice method the teacher uses gets the student to produce the desired answer, but it doesn't build any resistance to native-language interference, so in an uncontrolled setting the original error returns. (And the teacher blames the student for translating, and the student is confused and disheartened, etc.)
One of the biggest visible benefits of translation though, is simple:
Speed, volume and throughput of practice
Because translation starts with a readily-understood prompt, you don't have to waste too much time thinking about what the prompt means or what you're being asked to do. This means you can get through a lot more questions. A translation-based lesson that manages to present no more questions than a target-language-only lesson is a wasted opportunity.
In an attempt to teach myself Corsican, I've written a little program that conjugates, combines and declines words and presents them to me as translation tasks, checks my answers and tells me if I got them right or wrong. I can batter through hundreds of examples in very little time. Kind of exhausting, yes, but pretty effective. A couple of hours using it, spread over a couple of weeks, has hammered in some of the basics pretty solidly.
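For the curious, the core of such a program fits in a few lines. What follows is a hypothetical sketch, not my actual code, and the "Corsican" forms in it are placeholder guesses rather than verified conjugations:

```python
import random

# Sketch of a combine-and-drill generator. The subjects are third person
# only, which sidesteps the I/you reference problem discussed later in
# this post, and the target-language forms are illustrative placeholders,
# not a checked Corsican paradigm.
SUBJECTS = [("he", "ellu"), ("she", "ella")]
VERBS = [("speaks", "parla"), ("sings", "canta")]

def make_item(rng):
    """Combine one subject and one verb into a translation task."""
    subj_en, subj_tl = rng.choice(SUBJECTS)
    verb_en, verb_tl = rng.choice(VERBS)
    return f"{subj_en} {verb_en}", f"{subj_tl} {verb_tl}"

def drill(answer_fn, n=20, seed=0):
    """Present n prompts to answer_fn and return how many were right."""
    rng = random.Random(seed)
    score = 0
    for _ in range(n):
        prompt, expected = make_item(rng)
        if answer_fn(prompt).strip().lower() == expected:
            score += 1
    return score
```

In the real thing, answer_fn would be replaced by typed input and the tables by full conjugation and declension data, but the loop -- generate, prompt, check, score -- is the whole trick, and it's what makes hundreds of examples in a sitting feasible.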
But...
Translation's biggest problem
Once you start whipping through the questions at speed, you really do start to work on autopilot, and you start to see patterns emerging in your errors. And I noticed one specific type of mistake that I made frequently that I hadn't been too aware of before... I kept switching my "I" and my "you".
It makes perfect sense, now that I think about it, and I probably did it a lot with MT, even though I didn't pay it any mind at the time. And heck, I've even heard the same thing from some of my students when I've asked them to translate short sentences.
Because when the computer says to me "you know it", that "you" refers to me. It's "eio", "io", "je", "yo", "ich" or whatever. That's what it means. Literal direct translation is therefore something of a higher-order function, an abstraction.
And yet it seems to be quite effective. So what do we do?
Well, personally I'll be attempting to stay away from first and second-person references as much as possible. I'll be sticking to the minimum required to learn them individually as grammar points, but when the person is included only as part of the context for a sentence testing another grammar point, I'll favour "he/she/it/they" over "I/we/you".
But I'll certainly be paying more attention to what exactly happens when we translate. I still think it's one of the best tools the learner has, but we've just got to work to eliminate the ambiguities....
14 November 2012
By yon bonnie banks....
The Loch Lomond and the Trossachs National Park Authority in Scotland have been busy refreshing their public image, starting with their logo, which is pretty nice, even if it does look a little bit like an island. (I'd say it's most like Barra, but only after a few metres of sea-level rise turns Eoligarry into a separate island.)
They've put up some fancy new stone signs at some of the entrances to the park, too:
It's very nice, I'm sure you'll agree. There are only two problems.
First up, why on Earth is "the Trossachs" so much smaller than "Loch Lomond"? As a child, I spent a heck of a lot of time visiting the Trossachs (Loch Venachar being one of the few lochs in the area suitable for swimming in) and not a lot of time around Loch Lomond. I personally feel a wee bit aggrieved that the much nicer part plays second fiddle to something everyone knows from "that song"....
But secondly, there's no Gaelic on it. Nothing out of the ordinary in that: while the area was traditionally part of Highland Gaeldom, Gaelic is now extinct locally.
But...
The signage to the park had previously been erected bilingually, which means that this new (and expensive) sign actually takes away the language's existing visibility and status.
I've written to the park authority:
All of this is true.
In my classes here, several people have asked me if I speak Gaelic -- I don't get that a lot. When I was living in Edinburgh, newly-arrived Catalans, Basques and Galicians would ask about the language. It is a tourist asset.
I say this, but I am not one of those learners who proudly declare that Gaelic is "my language". It is not. I was brought up in the lowlands, in a village that probably spoke an Anglo-Saxon tongue from its very founding (there are various Celtic-derived placenames in the vicinity, but most of these are very, very old). I didn't start learning Gaelic until my mid-twenties, and even then I learned Spanish to a much better level. Either "my language" is English, or "my languages" are Scots and English. I am a nationalist, but my support for Gaelic isn't about Scottish nationalism (most people do not consider Gaelic a pre-requisite for Scottishness, so Gaelic is more likely to damage nationalism than help it).
Gaelic deserves its support simply as a mark of respect for people who speak the language. (By extension, any learner who claims to "love the language more than a native speaker" has completely missed the point of why you should learn someone else's language.)
But if that's not good enough, then be mercenary: Gaelic is a marketable asset. Scotland has a limited tourist draw thanks to its climate and those ******* midgies. Gaelic can be employed as a commercial tool to sell us as a destination for people from the other small nations of Europe, rather than relying on the New World diaspora revisiting their roots, and the odd European whisky fan on a "distillery pilgrimage"....
They've put up some fancy new stone signs at some of the entrances to the park, too:
It's very nice, I'm sure you'll agree. There's only two problems. First up, why on Earth is "the Trossachs" so much smaller than "Loch Lomond"? As a child, I spent a heck of a lot of time visiting the Trossachs (Loch Venachar being one of the few lochs in the area suitable for swimming in) and not a lot of time around Loch Lomond. I personally feel a wee bit aggrieved that the much nicer part plays second fiddle to something everyone knows from "that song"....
But secondly, there's no Gaelic on it. Nothing out of the ordinary in that: although the area was traditionally part of Highland Gaeldom, Gaelic is now extinct locally.
But...
The signage to the park had been raised bilingually, which means that this new (and expensive) sign is actually taking away the existing visibility and status for the language.
I've written to the park authority:
I am from near to the national park (Gargunnock) and a frequent visitor. At present I am working overseas, but am looking forward to finishing and getting home, and getting my bike back up the Duke's Pass, along the Inversnaid road and round Loch Katrine.
I am aware that the park has taken on a new brand identity, reflected both on the website and on new long-life stone signage erected at certain major entrance points.
I am disappointed, however, to find that the new logo has no space for the Gaelic name of the park, and that this has led to existing bilingual signage being replaced by a monolingual alternative, which is directly contrary to the general trend in Scotland.
Indeed, it is likely that at some point the park authorities will be subject to a Gaelic Language Plan, and that one of the key actions for the park will be the use of bilingual branding and signage. It is therefore easily foreseeable that the present signage will need to be replaced in the not-too-distant future, leading to additional expense for the park that could have been avoided had a little foresight been applied.
I am currently living and working in Corsica, one of the many minority regions in Europe that consider themselves "nations". The current debate over independence has led to increasing exposure for Scotland in the public consciousness here, and in other areas such as the Basque Country and Catalonia, where developments in Scotland are often featured in major news bulletins.
This presents an opportunity for Scotland, whether or not we eventually opt for independence.
While Gaelic does not represent a central feature of Scottish identity in the same way as Welsh in Wales, Catalan in Catalonia and Basque in the Basque Country, it is certainly a feature that is currently garnering much attention, and is therefore useful in attracting tourists. Loch Lomond, the Trossachs and most of the placenames therein are Gaelic in origin, and this is something that should be exploited to its maximum to attract tourists to the area.
Thank you for your time and attention in this matter,
All of this is true.
In my classes here, several people have asked me if I speak Gaelic -- I don't get that a lot. When I was living in Edinburgh, newly-arrived Catalans, Basques and Galicians would ask about the language. It is a tourist asset.
I say this, but I am not one of those learners who proudly declare that Gaelic is "my language". It is not. I was brought up in the lowlands, in a village that probably spoke an Anglo-Saxon tongue from its very founding (there are various Celtic-derived placenames in the vicinity, but most of these are very, very old). I didn't start learning Gaelic until my mid-twenties, and even then I learned Spanish to a much better level. Either "my language" is English, or "my languages" are Scots and English. I am a nationalist, but my support for Gaelic isn't about Scottish nationalism (most people do not consider Gaelic a pre-requisite for Scottishness, so Gaelic is more likely to damage nationalism than help it).
Gaelic deserves its support simply as a mark of respect for people who speak the language. (By extension, any learner who claims to "love the language more than a native speaker" has completely missed the point of why you should learn someone else's language.)
But if that's not good enough, then be mercenary: Gaelic is a marketable asset. Scotland has a limited tourist draw thanks to its climate and those ******* midgies. Gaelic can be employed as a commercial tool to sell us as a destination for people from the other small nations of Europe, rather than relying on the New World diaspora revisiting their roots, and the odd European whisky fan on a "distillery pilgrimage"....
13 November 2012
More than words....
It is a truism that a language isn't just a collection of words. This is interpreted by some teachers and learners as meaning that there's no point in studying a language formally; they instead propose that we should memorise set phrases and just read stuff until we understand.
OK, perhaps I'm overstating the case, and building something of a strawman out of the extreme position. However, even if most practitioners attempt to mix the two approaches, they've still missed the point of the observation.
It's called the "Principle of Compositionality" and it's summed up excellently in O'Reilly's book by Steven Bird, Ewan Klein, and Edward Loper on the Natural Language Toolkit for Python programming:
"the meaning of a complex expression is composed from the meaning of its parts and their mode of combination"
There's a deeper examination of the term at Wikipedia, but Bird et al's summary is pretty clear and correct.
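The principle is easy to demonstrate with a toy model. This is my own sketch (nothing to do with the NLTK book's actual code), with an invented mini-"world" of people and predicates:

```python
# A toy illustration of compositionality: the meaning of a complex
# expression is computed from the meanings of its parts plus their
# mode of combination. The lexicon and "world" here are invented.

# Meanings of the parts: predicates and combinators over a tiny world.
meanings = {
    "happy": lambda x: x in {"anna", "beth"},
    "tall": lambda x: x in {"beth", "carl"},
    "not": lambda p: (lambda x: not p(x)),
    "and": lambda p, q: (lambda x: p(x) and q(x)),
}

def meaning(tree):
    """Compose a meaning bottom-up from a nested-tuple parse tree."""
    if isinstance(tree, str):
        return meanings[tree]          # a part: look up its meaning
    op, *args = tree                   # a combination: apply the mode
    return meanings[op](*(meaning(a) for a in args))

# "tall and not happy", as a parse tree:
tall_not_happy = meaning(("and", "tall", ("not", "happy")))
print(tall_not_happy("carl"))  # True: in the tall set, not in the happy set
print(tall_not_happy("beth"))  # False: tall, but happy
```

The point of the toy is that nothing above memorises whole phrases: any new combination of the same parts gets its meaning for free from the mode of combination.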
So a language isn't just a collection of words. It's a collection of words and a collection of ways of combining words. (Ignoring the fact that a "word" is often a combination of smaller morphemes.)
Teaching individual phrases as fixed units leaves behind much of the subtle, beautiful complexity of how languages build up their meaning.
In English teaching, it is often claimed that so-called "phrasal verbs" are not systematic and must be memorised, but what we do with "verb + particle", the Romance languages do with "prefix + verb root". A fire extinguisher puts out fires, and we shout out our exclamations. Seems pretty systematic to me. (Not to mention German, where a prefix often becomes detached from its verb and becomes a particle -- see? it's all part of a single spectrum....)
And when people talk about the arbitrariness of "to be" vs "to have" in ages (en "I am 33" vs fr, it, es etc "I have 33 years"), well, at least it's consistent within the language. It's a logical consequence of the Romance "to have" structure that phrases like "at 40" (life begins...!) become "with 40" in these languages.
But while most learners are capable of getting a handle on the be/have difference, I still meet a great many people who borrow the "with" structure into English. How easy would it be for the teacher to point out a few of these little things? To encourage the learner to build a meaningful model that (at least in part) mirrors the native speaker's one?
But perhaps that would take too much time. Nevertheless, we have built an environment where we discourage our students from looking for meaning and structure. We expect them to resign themselves to learning everything as an arbitrary single data-point.
That subtle, beautiful complexity I was talking about? We hide it from them. We keep it from them. We make learning a language into an ugly, clumsy drudge. What we are hiding from them isn't just the beauty of the language, because that beauty is intrinsic to the language. To hide the beauty, we must hide the language.
How can we teach someone a language we are unwilling to truly share with them?
09 November 2012
Speaking tasks.
So I've been setting lots of speaking tasks in class recently, and I've had a chance to test out a bit of received wisdom that was passed on to me in my training, and appears in various books etc. It is said that giving students the opportunity to script a speaking task helps build confidence and ability for later spontaneous production.
My experience is slightly different. What I see again and again is students crippled by indecision. I hear them sitting discussing (in French) what to write, and almost invariably deciding that they can't write that. They agree on a great many things that they can't write, and very few that they can. If it takes more than a quarter of an hour to script a minute or two of dialogue, I can't see how that will be of any use to them in later spontaneous production.
I don't think I'll be doing a great many of this type of exercise in the future....
05 November 2012
New languages, and new views on old ones....
I had a bit of a realisation this morning about English, and it's all thanks to Corsican. Corsican tends to weaken certain vowels when they're unstressed. So, for example, "accende" (the infinitive "to light" or "inflame") becomes "accindite" (present, second person plural). And this happens with almost all Es. Almost. Note the unstressed E at the end of both accende and accindite. But these unstressed Es only seem to occur where they have a specific grammatical purpose -- as far as I can tell, any other E becomes I.
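As a very rough sketch (and it is only that -- I'm positing a clean rule where the real language is surely messier, and the stress positions and the underlying form "accendite" are my own assumptions, not attested analysis), the pattern amounts to "every E outside the stressed syllable becomes I, except a grammatical final E":

```python
# A deliberately oversimplified model of the reduction pattern described
# above. Caller supplies the index of the stressed vowel; a word-final
# "e" is treated as a grammatical ending and spared.

def reduce_vowels(word: str, stressed_index: int) -> str:
    """Turn every unstressed, non-final 'e' into 'i'."""
    out = []
    last = len(word) - 1
    for i, ch in enumerate(word):
        if ch == "e" and i != stressed_index and i != last:
            out.append("i")   # unstressed, non-final E reduces to I
        else:
            out.append(ch)    # stressed E or grammatical final E survives
    return "".join(out)

# Infinitive: stress on the stem E, so nothing reduces.
print(reduce_vowels("accende", 3))    # accende
# 2nd person plural: stress moves to the ending, the stem E reduces.
print(reduce_vowels("accendite", 6))  # accindite
```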
Now in certain parts these vowel "mutations" don't occur, but the majority dialects tend to do it. The odd thing, then, is that vowel mutation happens even though speakers of the language are evidently capable of saying the "forbidden" sound. Why not say it if you can?
Well, somewhere along the line, I started thinking about English, and in particular the prefixes pre- and re-.
There's two pronunciations for each: one with schwa and one with /i/. The schwa occurs wherever the syllable has no stress, normally adjacent to the primary stressed syllable -- eg "report", "reply" -- and the /ri/ pronunciation when it has secondary stress. So that's an "ee" sound, like a Corsican "I". It never seems to have an "eh" sound, like a Corsican "E".
And no matter how much I try to, I can't think of a single word in English with the "eh" of "pedal" and "petal" anywhere but in the position of primary stress.
Unless you're American, in which case it occurs everywhere.
And that's what I'd never noticed before -- I always thought of the US "reh"-produce as though it was something specific to the re- prefix, but it's a bit more fundamental than that, isn't it...?
02 November 2012
Why I'm afraid of conlangs
Part of me loves the idea of building new languages for fun, and I've often considered learning one myself. If nothing else, I don't imagine there's a great amount of competition in the teaching space for sci-fi languages like Klingon and Avatar's Na'vi.
But every time I get close to giving it a go, I back off. Why?
We can start with the "big one": Esperanto. I've mostly specialised in Romance languages, a choice which is in part laziness. But only in part. The more I learn of the Romance languages, the more I'm fascinated by the way the languages form a continuum. Corsican, for example, has the unique feature of having two major dialect groups, one with a phonology much like Italian (in the south) and one far more "Iberian" in nature (in the north). In the south, double consonants are "geminated" (lengthened) as in Italian. In the north, they aren't, and a single consonant may be "lenited" (softened, weakened, lightened). So while in the south, the island name "Corsica" is said much like an Italian would expect, the pronunciation in the north softens the second C by voicing it -- making it sound identical to a G. Spanish people call the island "Córcega". Past participle endings in Italian almost all have a /t/ sound: -ato, -ito, -uto; in Spanish they have a /d/: -ado, -ido. And in the south of Corsica it's a T sound and in the north a D (both represented in the written form by the same single letter T). But of course the "D" of -ado in Spanish in some accents is weakened to /ð/ ("th" of English "the")... and if you put a single D between two vowels in Northern Corsican, you get the same sound.
When I first looked at Esperanto, I saw that it took Romance roots and Germanic roots, and it modified them in ways that were not possible within the two language families themselves. As I learn more Romance languages, I find I'm more able to deal with variations, so I can almost understand languages like Portuguese and Occitan, even though I haven't learnt them. My fear with Esperanto, then, was feeding "false data" into the language function -- polluting the natural spectrum that I am acquiring with points that would mislead my brain and reduce my ability to understand one language from another.
It's an unprovable assertion, and no-one's going to be able to prove otherwise with enough certainty to make me risk it.
My latest temptation was the idea of using a simple language such as Toki Pona as a test case for a language learning application/framework that I've been trying to develop. I figured that its minimalist featureset and its completely formalised, regularised grammar would make it a simple and quick language to program and test.
But what scared me off this time was V.S. Ramachandran's notion of "synkinesthesia" -- the idea that language is about shape at some level. When thinking about pronouns and demonstratives, I always remember watching Ramachandran on TV demonstrating the "pointing" that a speaker will do with their lips when using many of these words. I was helping someone learn Gaelic over the summer, and she could never remember her "here, there, yonder" distinction, so I pointed out to her that "seo" feels close (the tongue stays in the back of the mouth), "sin" points forward a bit (palatalised N) and "siud" is far away (dental T, tongue nearly leaves the mouth). It seemed to help her (at the time, at least).
But that's something that no conlang I know of covers. Perhaps there are conlangers incorporating Ramachandran's ideas -- it's an idea I've toyed with myself in the past -- but then again, it's still just theory.
Any conlang can only encode what is known in theory, and will miss many of the subtleties of real languages that have evolved through natural usage and change. It may even miss one of the really core ideas of real language.
It's the same, then, as my original concern about Esperanto, but the system isn't one of sound changes and a few superficial syntactic differences: no, what I'm worried about now is that I train myself out of recognising the underlying principles of natural language by teaching my brain an artificial language that doesn't exhibit them.
31 October 2012
Québécois: a question of prestige
Recently, several of the other blogs I've been reading have mentioned a new book, Le Québécois en 10 Leçons, self-published on Lulu.com by Alexandre Coutu, known on how-to-learn-any-language as Arrekusu. Now I'm pretty sure I'll be buying this book myself very soon, but right now I'm trying to stop myself as I've got enough on my plate what with a busy teaching schedule and trying to pick up Corsican.
Anyway, so I decided to start looking around to see what other people's opinions were, and I came across one mother of a thread on HTLAL itself...
The thread became kind of derailed when another user, s_allard, objected to the title, suggesting that it should be "le québécois populaire" or "le québécois de rue" or somesuch.
s_allard's position can be summarised thus:
By using the term "québécois" to describe low-status speech, Coutu makes the term "québécois" into a low-status term by association. Instead, s_allard wants to use the term "québécois" inclusively to include the "standard" language used in higher register situations. Besides, many of the features Coutu highlights are going out of fashion.
It's a valid viewpoint, but it's one that I don't personally hold. Rather, I would say that s_allard's position, while not in itself malicious, maintains an unfair distinction where one's intelligence is determined by the language one speaks. It has been pointed out that middle-class children generally do best in school the world over, but that this is simply a language bias -- school-teachers are middle-class, therefore the school lect is the middle-class sociolect.
We have a choice of action, therefore: change the lect of the children to match that of the school, or change the lect of the school to match that of the children. Academically, it's mostly agreed that the latter is the option that has proven most effective time and again, but s_allard's standpoint better matches popular opinion, whether you're talking about India, where many children are taught through the foreign medium of English; Scotland, where many say Gaelic shouldn't be taught "because many children can't even speak English properly" (for some undefined notion of "properly"); or s_allard's own Canada.
Personally, it's an attitude I'd want to challenge, not least because many of the notions of "proper" English and "proper" French have been well and truly disproven by corpus studies of the language. (E.g. the statistical norm in French is to drop the "ne" particle in the negative, and the statistical norm in English is to say "can", not "may", when asking permission.)
Moreover, there's the issue of words and phrases going out of fashion. I personally believe the examples s_allard gave are pretty misleading. He picked English slang words with no real history, that were invented by one generation and dropped by the next. But the "québécois" that he objects to is a historically attested form, and one that is being lost not simply due to natural language change, but by the constant imposition of the Standard French norm.
One of the books that I'm using to help me learn Corsican, le corse en 23 lettres, puts it very clearly. The author, Ghjaseppiu Gaggioli, is a descriptivist grammarian, and in the introduction states that he doesn't want anyone to interpret his work as authoritative. Instead, he wants to inform the reader so that the reader can make an educated choice. Because, he says, all languages change, but for a language to stay healthy, that change needs to come from within the boundaries of the language itself. Much of the change in Corsican today is the borrowing of feature after feature from French into the language. Similarly, much of the change in Scottish Gaelic is the borrowing of feature after feature from English.
And of course, much of the change in Quebec French is the borrowing of feature after feature from School French*. s_allard's approach leads to us defining "québécois" as something that every day becomes more like School French. He wants to differentiate the language name, while differentiating the language less and less.
Coutu's approach is more like that of Gaggioli. He wants to bring people's attention to the features that exist in their local tongues, features that they are not themselves aware of. He mentions in the HTLAL thread that he gets people telling him "I don't talk like that," only to use the exact same word or phrase a sentence or two later.
This is completely and utterly normal, and anyone who has studied linguistics in a modern setting will have experienced a lesson where the teacher will tell them that "everybody really says X" and the student doesn't believe it. Over the next couple of days/weeks, the student simply can't stop hearing the phrase.
A couple of years ago, I was telling my manager about how us Scottish people hardly ever say "please" -- we go into a shop and say "I'll have a, thanks." That's "thanks", not "please". He was having none of it. He always said please... Well, that same day he came in from lunch looking shocked. "You're right," he said, "I just asked for a sandwich, and I didn't say please."
No-one can protect their own language until they recognise it for what it is....
* I'm giving up on the term "Standard X" unless it's a statistically-defined norm-reference. A standard isn't a standard just because a minority of people say it should be. "School X" is a far more accurate term.
Anyway, so I decided to start looking around to see what other people's opinions were, and I came across one mother of a thread on HTLAL itself...
The thread became kind of derailed when another user, s_allard, objected to the title, suggesting that it should be "le québécois populaire" or "le québécois de rue" or somesuch.
s_allard's position can be summarised thus:
By using the term "québécois" to describe low-status speech, Coutu makes the term "québécois" into a low-status term by association. Instead, s_allard wants to use the term "québécois" inclusively to include the "standard" language used in higher register situations. Besides, many of the features Coutu highlights are going out of fashion.
It's a valid viewpoint, but it's one that I don't personally hold. Rather, I would say that s_allard's position, while not in itself malicious, maintains an unfair distinction where one's intelligence is determined by the language one speaks. It has been pointed out that middle-class children generally do best in school the world over, but that this is simply a language bias -- school-teachers are middle-class, therefore the school lect is the middle-class sociolect.
We have a choice of action, therefore: change the lect of the children to match that of the school, or change the lect of the school to match that of the children. Academically, it's mostly agreed that the latter is the option that has proven most effective time and again, but s_allard's standpoint better matches popular opinion, whether you're talking about India, where many children are taught through the foreign medium of English; Scotland, where many say Gaelic shouldn't be taught "because many children can't even speak English properly" (for some undefined notion of "properly") or s_allard's own Canada.
Personally, it's an attitude I'd want to challenge, not least because many of the notions of "proper" English and "proper" French have been well and truly disproven by corpus studies of the language. (EG. The statistical norm in French is to drop the "ne" particle in the negative, and the statistical norm in English is to say "can", not "may", when asking permission.)
Moreover, there's the issue of words and phrases going out of fashion. I personally believe the examples s_allard gave are pretty misleading. He picked English slang words with no real history, that were invented by one generation and dropped by the next. But the "québécois" that he objects to is a historically attested form, and one that is being lost not simply due to natural language change, but by the constant imposition of the Standard French norm.
One of the books that I'm using to help me learn Corsican, le corse en 23 lettres, puts it very clearly. The author, Ghjaseppiu Gaggioli, is a descriptivist grammarian, and in the introduction states that he doesn't want anyone to interpret his work as authoritative. Instead, he wants to inform the reader so that the reader can make an educated choice. Because, he says, all languages change, but for a language to stay healthy, that change needs to come from within the boundaries of the language itself. Much of the change in Corsican today is the borrowing of feature after feature from French into the language. Similarly, much of the change in Scottish Gaelic is the borrowing of feature after feature from English.
And of course, much of the change in Quebec French is the borrowing of feature after feature from School French*. s_allard's approach leads to us defining "québécois" as something that every day becomes more like School French. He wants to differentiate the language name, while differentiating the language less and less.
Coutu's approach is more like that of Gaggioli. He wants to bring people's attention to the features that exist in their local tongues, features that they are not themselves aware of. He mentions in the HTLAL thread that he gets people telling him "I don't talk like that," only to use the exact same word or phrase a sentence or two later.
This is completely and utterly normal, and anyone who has studied linguistics in a modern setting will have experienced a lesson where the teacher will tell them that "everybody really says X" and the student doesn't believe it. Over the next couple of days/weeks, the student simply can't stop hearing the phrase.
A couple of years ago, I was telling my manager about how us Scottish people hardly ever say "please" -- we go into a shop and say "I'll have a coffee", with no "please" in sight.
No-one can protect their own language until they recognise it for what it is....
* I'm giving up on the term "Standard X" unless it's a statistically-defined norm-reference. A standard isn't a standard just because a minority of people say it should be. "School X" is a far more accurate term.
29 October 2012
CEFR
Have you heard of the Common European Framework of Reference for Languages? I have, and I don't particularly like it, which is an opinion I'm maybe too quick to express. I think it's worth me giving a more in-depth and complete critique here. This is part I...
What is the CEFR?
For those of you not yet familiar with it, the CEFR was an idea conceived by the Council of Europe in the 90s as a solution to the problem of differing language standards across Europe. Some bodies would use terms like "Beginner", "Intermediate" and "Advanced", or their local equivalents, and these didn't always match up, and some would insert additional intermediate grades. With EFL, for example, the scale is normally "beginner", "post-beginner", "lower intermediate", "intermediate", "upper intermediate", "advanced". Others would have numbered grades, others still lettered grades. When assessing CVs (en-US: "resume"), this made it very difficult to compare candidates' language levels.
The Council of Europe's scheme took the basic "beginner-intermediate-advanced" scheme and relabelled it as "basic-independent-proficient", then transferred it to a lettered form, with A as basic, B as independent and C as proficient. They subdivided each of these into two bands, giving 6 levels in total: A1, A2, B1, B2, C1, C2.
It was adopted by the European Union in 2001 as an official recommendation to all member states. In practical terms this means that it's quite hard to get European funding for a language teaching initiative if you can't align it to the CEFR.
Why I don't like the CEFR
The CEFR is, I feel, just as vague as the grading systems that preceded it. The only benefit to the learner, teacher or employer is that it results in everyone using the same terminology, which makes it easier to discuss language proficiency internationally. But while we can discuss it more easily, we're still not able to discuss it precisely, because the CEFR is still far from precise.
Now, when I say this, the response is normally to point me towards such things as the language portfolio. The initial level descriptions are necessarily quite vague, and while there's a whole host of documents surrounding this, in reality, what we have is a devolution of vagueness to secondary sources. Much of the "detail" added is actually false detail.
Take, for example, the Swiss self-assessment checklist, which has been picked out by the Council of Europe as a good example of the CEFR in practice.
The very first checklist item is a level 1 objective: I can understand when someone speaks very slowly to me and articulates carefully, with long pauses for me to assimilate meaning.
What does it mean to "understand"? We're looking here at someone who has barely started learning a language, and there will be very little content that he will be able to understand. Technically, I could mark this as "no" for all my languages, because I won't be able to understand someone even if they speak slowly, because I won't know all the words they are saying.
Moreover, the ideas of "speaking very slowly", "articulating carefully" and "with long pauses" are all inherently vague. Am I allowed to slow them down as much as I like? Can the pauses be long enough for me to consult a dictionary or a grammar book?
Another problem is a pervasive tendency to tautology:
A1
"I can ask and answer simple questions, initiate and respond to simple statements in areas of immediate need or on very familiar topics. "
"I can understand phrases, words and expressions related to areas of most immediate priority"
Familiar subjects are subjects which the learner has become familiar with. The first sentence therefore is logically equivalent to the statement "I can discuss stuff I'm able to discuss."
The second is potentially just as tautologous. Consider that the teacher will be teaching language to a certain priority. So this is effectively "I can understand phrases, words and expressions that I have been taught".
A characteristic of the CEFR's level descriptions is that this idea of subject-specificity carries through all levels, moving through talking about your area of work, technical documents etc., but always in a way that really is self-defined.
The CEFR dictates methodology
The people behind the CEFR deny it, stating that the CEFR only dictates content, not method, but have a look at this from the Swiss guidelines:
A2
"I can make myself understood using memorised phrases and single expressions. "
The CEFR itself doesn't state this, but the very same subject-specificity I highlighted above forces the issue. If you're measuring people on dealing with topics and situations, you have to train them with topic-specific language.
But that's not what language is -- the core of all language is topic independent. Our basic grammar -- tenses, prepositions, word order etc -- is the same in all fields.
By breaking into topic-specific language right from level A1, the CEFR actually dictates material that the student isn't ready to learn yet, so they have to memorise phrases. This imposes a certain view of teaching on the course writer. It is a view of teaching that is held by the majority of teachers, but it is not universally accepted, and I personally believe it is the wrong way round.
If the grammar is taught first, all that subject-specific stuff will fall into place with ease, but the goals for A1 steal time from the teacher, so it can't be done.
Furthermore...
Not all languages are equal
The CEFR is a single framework for all languages, for all students. Essentially it suggests an order of learning survival -> career related -> general. Even if a fixed order is a good idea, is that the right order?
In the Romance languages, career-related stuff is simplicity itself... depending on your career.
I used to work in IT, while studying language. With practically zero effort, I learned to discuss linguistics in Catalan. I can sort of get by discussing computers, but it's a struggle. I couldn't ask for directions to the railway station.
So my natural order of learning was specialist->career->survival
This order of difficulty is a bit different from what the CEFR would predict. Why?
Well, linguistics is mostly Latin (even the word linguistics!) so translation into Catalan is simply a matter of applying regular transformations to known English words. Computing, on the other hand, uses mostly English words, so the translations into Catalan can be very difficult to predict, and have to be learned individually. Thus the CEFR suggests that people should get stuck at a certain level for longer simply because some people need more words than others. But that's not prerequisite knowledge for the other levels.
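Those "regular transformations" can be sketched mechanically. Here is a minimal, hypothetical illustration of the idea (a handful of suffix rules, nothing like a complete morphology -- real words have plenty of exceptions):

```python
# Hypothetical sketch: a few Latin-derived English suffixes and their
# usual Catalan counterparts. This only illustrates how regular the
# pattern can be; it is not a real morphological rule set.
SUFFIX_MAP = {
    "tion": "ció",   # situation -> situació
    "ty": "tat",     # university -> universitat
    "ism": "isme",   # optimism -> optimisme
}

def guess_catalan(word: str) -> str:
    """Apply the first matching suffix transformation, if any."""
    for eng, cat in SUFFIX_MAP.items():
        if word.endswith(eng):
            return word[: -len(eng)] + cat
    return word  # no rule applies; leave the word unchanged

print(guess_catalan("situation"))   # situació
print(guess_catalan("university"))  # universitat
```

A learner who has internalised a dozen such correspondences gets a large chunk of academic vocabulary almost for free, which is exactly why the "specialist" register came so cheaply.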
But both are still easier than getting to the railway station. Why?
Any technical field is well-defined and logically organised. Translations are very often one-to-one and quite literal. Technical stuff is (contrary to expectations) easy. And more to the point, it is far, far easier to integrate it.
When you teach instructions, the progression of complexity is along the lines of:
left, right, straight on
Turn left, turn right, keep going straight on.
1st, 2nd, 3rd left/right
Next to (other place), opposite (other place) etc.
Turn... at the lights/roundabout etc
But that's a closed set of phrases. They combine in a highly constrained way, and they can't be integrated with other general language that you might be taught the week before or after.
But late introduction of survival language is effortless - once you know the grammar of a language, it is very easy to build some of these survival phrases independently; even when you can't, it's still a lot easier to memorise them once you've learnt all the basic building blocks.
Take "what is your name?"
It's a totally regular question, and the following learning path leads a learner to be able to produce it independently:
"is" statements -> inverted order in simple questions -> question words
And yet it isn't uncommon for a teacher to ask the question on the very first day, before even the verb "to be" has been mentioned.
But the CEFR asks us to do this. It tells us which way we should be teaching. And I think it's wrong.
And finally:
No-one actually teaches to the CEFR anyway
It is very rare that you'll find a course that genuinely teaches you how to discuss your specific profession in your target language. Even if you do take "English for Specific Purposes", it's still going to be at a very high, abstract level, like "English for Business", not "English for Operations Managers in Logistics SMEs"; or "English for IT", not "English for Object-Oriented Database Admins in the Public Sector".
And even if you do take the course, most of the exams that you can take from members of the Association of Language Testers in Europe (ALTE) do not include a professional component. And yet they all offer exams that they class as B2, the level that is explicitly defined as language relating to your area of work or study.
It leaves the whole thing seeming a bit... pointless.
27 October 2012
A new look at homework
While I was living on Skye, I made a few quid by running a few night classes in Spanish to speakers of Gaelic. It was a very interesting experience, as I was juggling three languages in the classroom -- I used both English and Gaelic for instruction, as each had similarities to Spanish that the other didn't, and even when neither was like the Spanish, the difference between the Spanish and English at least gave justification for why the Spanish was completely different.
I only had four people in the class (well, Spanish for bilingual speakers of Gaelic and English is a fairly limited market, isn't it? Particularly in a remote corner of a sparsely populated island) but it went well.
But one thing I was acutely aware of was the problem of forgetting between lessons if you're only in the class once a week. The solution to that was, unsurprisingly, to set them homework. But for the first couple of weeks, we did no writing, so what do you do? I sat down with a Zoom field recorder and a list of prompts and made a series of short MP3 files, each 3-6 minutes long containing prompts and responses that used the language we'd covered in class.
I never assumed I was the first to do it, and with software like Audio Lesson Studio out there, it's clear that it's occurred to others before me. One such is Ravi Purushotma, who complained in his 2006 Masters thesis and elsewhere that homework sheets have stayed the same over the years, even when in-class teaching has changed with the latest teaching fashions (i.e. homework is the exception to the pattern in Decoo's lecture On the mortality of language learning methods). In the thesis itself, Purushotma proposes the usual handwavery of web 2.0 (use Twitter, write a blog, make a YouTube video etc.) without giving clear directions on the how, why or when; but in another article he specifically recommends Pimsleur-like content as a method of setting homework. I certainly can't find fault with that, as long as it's covering the class material properly, or alternatively being used to teach stuff that isn't an effective use of time in class.
Personally, I've got a fair bit of use out of the software Gradint, an audio-only spaced recognition system (Wikipedia definition) created by a partially-sighted learner to make up for the lack of resources for the visually impaired. As it stands, this is only really suited for the independent learner, although with a bit of tweaking, it could be made into a really handy little homework generator. And it just so happens that it's open source, and I'm going to be trying to learn Python programming properly over the next wee while....
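To give an idea of what I mean by a homework generator, here's a rough sketch of the scheduling half of the problem. To be clear, this is my own hypothetical outline, not Gradint's actual design or API: each phrase taught in class gets replayed at growing intervals, and the week's homework audio is whatever is due that day.

```python
import datetime

# Hypothetical sketch of an audio-homework scheduler, loosely inspired
# by spaced-repetition systems like Gradint (not its real API or design).
INTERVALS = [1, 3, 7, 14]  # days between successive repetitions

def due_dates(first_heard: datetime.date) -> list:
    """Dates on which a phrase taught in class should reappear in homework audio."""
    dates, day = [], first_heard
    for gap in INTERVALS:
        day = day + datetime.timedelta(days=gap)
        dates.append(day)
    return dates

def playlist_for(day, phrases):
    """phrases: dict mapping each phrase to the date it was first taught."""
    return [p for p, taught in phrases.items() if day in due_dates(taught)]

lesson = datetime.date(2012, 10, 1)
phrases = {"¿Cómo te llamas?": lesson, "Me llamo...": lesson}
print(playlist_for(datetime.date(2012, 10, 5), phrases))
```

From there, the generator would only need to concatenate the recorded prompt/response clips for each due phrase into a single MP3 -- which is the part a tool like Gradint already does well.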
23 October 2012
Udacity review: Web development (C253)
OK, so this isn't strictly about language, but I've been following Udacity's course on web app development, as I've been working on designing a language learning app for a while now and I'm really not too hot on web technologies at the moment (and where would you put a language learning app other than on the web these days?).
I've written about MOOCs before, and shortly after writing that post I read a very detailed review of Udacity's Introduction to Statistics at the AngryMath blog.
A lot of commenters suggested that the problems identified were unique to that particular course, but it was with those criticisms in the back of my head that I've spent several hours over the last couple of weeks rattling through this course, and I have to say that I have very similar concerns to Delta over at AngryMath.
To summarise, Delta picked out a "top 10" of problems:
- Lack of planning
- Sloppy writing
- Quiz regime
- Population and sample
- Normal curve calculations
- Central Limit Theorem not explained
- Bipolar difficulty
- Final exam certification
- Hucksterism
- Lack of updates?
Everything there matches my own observations of the web development course, except the final exam (which I haven't reached yet -- I'm on unit 6 of 7) and the stats-specific items (4, 5, 6) -- although there are problems with Steve Huffman's course that are analogous to these.
1. Lack of planning
It is not uncommon to hear Huffman change his mind halfway through a unit, or even after giving a quiz. Mostly, this is because he uncovers another quirk in Google App Engine or one of the Python code libraries that affects the outcome. OK, we can forgive the guy for not being an expert on a relatively new technology, but why didn't he take a couple of hours to check all these things before starting filming?
In video 4.38 he even says "One final password thing. I know I promised the last one was the final one but..." Now, he really should have known he was going to say that when he filmed the previous segment, and if he really wanted to change it, he could have gone back and reshot part of the earlier section in order to edit it out (or even just redubbed the section in question).
If he can't plan an hour or two ahead, it throws his whole schedule into doubt.
2. Sloppy writing
Huffman makes several spelling errors during the course on some pretty fundamental computing terms, talking about "algorithims" (ouch) or a database being "comitted" (yuck). After having "protol buffers" on screen for half a minute, he spots it and corrects it to "protocol buffers" (5.16).
His handwriting becomes progressively more crooked, moving across the screen at an angle, and he consistently and clearly writes his quote marks as open and close quotes on the whiteboard, even though most computers make no distinction (and Python, along with most languages, definitely doesn't).
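That distinction matters in practice: typographic open/close quotes aren't valid string delimiters in Python, so code copied faithfully from a whiteboard written that way wouldn't even parse. A quick demonstration (my own, not from the course):

```python
# Straight quotes are valid Python string delimiters...
ok = compile('greeting = "hello"', "<demo>", "exec")

# ...but typographic "curly" quotes are not: Python rejects them
# at parse time with a SyntaxError.
try:
    compile('greeting = “hello”', "<demo>", "exec")
    parsed = True
except SyntaxError:
    parsed = False

print(parsed)  # False
```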
This is core stuff he's dealing with, and he's failing to be precise.
3. Quiz regime
The quizzes seem just as forced as Delta found in the stats course, ranging from annoyingly simple ones, to difficult ones that require you to remember an exact command that you've seen once, to ones that suffer from a rather odd sense of humour. I was not familiar with the "hunter2" meme, and the constant reference to that value forced me to go and look it up. Not particularly interesting. As an inside joke, using it as the default password example was sufficient -- giving it as an incorrect option to several multiple-choice quizzes was unnecessary and distracting.
But the other thing that I really noticed about the quizzes in this course is more serious: they just didn't feel like an integral part of the lesson. Most of them started with a dedicated video, rather than just being asked at the end of a video. This inserted a little pause as the next video loaded. You'd sit there waiting as Huffman unnecessarily read out the answers (I can read, as you may have noticed). That wasn't the worst of it, though. Huffman insisted on constantly telling you you were about to have a quiz. Why? Isn't it enough to ask the question?
Worse, this kills one of the clearest pedagogical rules: don't overwrite useful information in working memory -- take full advantage of the "echo effect". I found myself lost on several occasions, because after giving me new information, Huffman would wipe the "echo" from my working memory by telling me "I think it's time for a quick quiz". There'd then be a pause while the next video loaded, where the only thing repeating in my head was the fact that there was going to be a quiz -- the information I needed to actually complete the quiz was gone. I skipped the quiz and went straight to the answer, because I didn't know it, and there was no scaffolding or structured guidance in the question.
And then, of course, whether I got the answer right or wrong (or didn't even try), Huffman explains why all the answers are right or wrong anyway. No attempt was made to focus on my specific misunderstandings, and when you're giving a course to thousands of people, wouldn't it make sense to take a little extra time and include a few extra video snippets to match the different answer combinations to the quizzes? A couple of hours of your time to save 10-20 minutes each for thousands of people is a good trade-off (and what you might consider being a "good citizen", Huffman, as your own course proposes we all should be).
4. Population and sample / ACID
Delta complains that Thrun's course doesn't present a clear distinction between two fundamental statistical concepts -- I would say that Huffman's course similarly fails when it touches on databases. It's not as serious a problem, as this isn't a database course, but if you're going to teach something, for pity's sake, teach it right. ACID stands for Atomicity, Consistency, Isolation and Durability. Huffman's explanation in unit 3 fails to fully define consistency, leaving it difficult to see the difference between atomicity and consistency. The confusion is compounded by the fact that the whole definition of ACID relies on the idea of a database "transaction", which Huffman readily admits to not having talked about before. (So I could actually add this into "poor planning" above if I wanted to.)
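To make the distinction concrete (this is my own illustrative example, not the course's): atomicity means a transaction's writes apply all-or-nothing, while consistency means every committed state satisfies the database's invariants. In the sketch below, using Python's standard sqlite3 module, a transfer that would overdraw an account fails partway through, and the half-applied credit is rolled back with it:

```python
import sqlite3

# Illustrative example: a money transfer as one transaction.
# Atomicity  = both updates happen or neither does.
# Consistency = every committed state satisfies the schema's invariants
#               (here, the CHECK constraint that no balance goes negative).
con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE accounts (
    name TEXT PRIMARY KEY,
    balance INTEGER CHECK (balance >= 0))""")
con.executemany("INSERT INTO accounts VALUES (?, ?)",
                [("alice", 100), ("bob", 50)])
con.commit()

def transfer(src, dst, amount):
    with con:  # one transaction: commits on success, rolls back on any error
        # credit first, so the failure below happens mid-transaction
        con.execute("UPDATE accounts SET balance = balance + ? WHERE name = ?",
                    (amount, dst))
        con.execute("UPDATE accounts SET balance = balance - ? WHERE name = ?",
                    (amount, src))

try:
    transfer("alice", "bob", 500)  # bob is credited, then alice's debit fails
except sqlite3.IntegrityError:
    pass  # atomicity: bob's credit was rolled back along with everything else

bob = con.execute("SELECT balance FROM accounts WHERE name = 'bob'").fetchone()[0]
print(bob)  # 50 -- the half-applied credit did not survive
```

The rollback is what atomicity buys you; the CHECK constraint holding in every committed state is what consistency means. Without the transaction concept, neither definition makes much sense -- which is exactly the gap in Huffman's explanation.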
5. Normal curve calculations / multiple frameworks and libraries
There's nothing quite as fundamental as the normal curve missing from this course, but the same end result -- something "magical" happening (i.e. powerful, important, and not understood) -- is present. By jumping about from framework to framework and library to library, Huffman keeps introducing stuff that we, as learners, just aren't going to understand. To me, that decreases my confidence: I like to understand (which is why I'm taking the course).
6. Central Limit Theorem not explained
No real equivalent, I suppose.
7. Bipolar difficulty
The difficulty problem in Thrun's stats course is slightly different from the problem here. Thrun asked questions that he didn't expect the student to know the answer to (oddly), but here Huffman expects you to know the answer... except that he has a very odd set of assumed prior knowledge.
For example, he starts with the assumption that you have never encountered HTML before, but HTML is extremely well known now, even among non-geeks. But then he assumes you know Python. Python is a fairly popular programming language at the moment, but really -- not everyone knows it. I'm also willing to wager a fair chunk of cash that most Python scripters are very comfortable indeed with HTML, but that the converse is not true.
Now, I might be doing him a disservice -- his assumption no doubt comes about because Udacity's own Computer Science 101 course teaches Python -- but then again, the course prerequisites don't mention either Udacity CS101 or Python specifically:
What do I need to know?

"A moderate amount of programming and computer science experience is necessary for this course."
See? No mention of Python. Now I've
got a degree in Computer Science, so I've got what I thought was a
“moderate amount” of experience. But as soon as he asked a
code-based question, I was stuck. Not only did I not know the
appropriate syntax, but often I had no idea of the type of structure
required.
You see, Python is a very
sophisticated, very high-level language that does lots of clever
things that a lot of the lower-level languages don't. It has very
useful and flexible tools for manipulating strings and data-sets, and
even allows you to build “dictionaries” of key/value pairs. A
great many of the tasks presented in the course were easy if you were
familiar with the structures. If you weren't, you wouldn't A) know
how to write the code or B) know where to look for the answer, or what it would be called. OK,
so the answer to B is “the course forums,” I suppose, but that's
hardly adequate, surely? Audience participation is great and all,
but shouldn't good teaching prevent these blockages, these obstacles
to the learner?
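For the record, here's the sort of thing I mean – a sketch of my own (not Huffman's code) showing the dictionary and string tools the course leans on. Trivial if you know the structures exist; baffling if you don't even know what they're called.

```python
# The sort of high-level tools the course takes for granted:
# dictionaries (key/value pairs) and string manipulation in a few lines.
text = "the cat sat on the mat"

# Build a word-frequency dictionary.
counts = {}
for word in text.split():
    counts[word] = counts.get(word, 0) + 1

print(counts["the"])              # 2
print("-".join(sorted(counts)))   # cat-mat-on-sat-the
```

If you've never met a dict, `counts.get(word, 0)` is unguessable – and you can't search for a term you don't know.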
8. Final exam certification
As I said, I haven't got that far yet. I suspect retaking will be less of an issue as a lot more of the material will be practical, and you can't expect to pass a coding exam by trial and error.
9. Hucksterism
Huffman doesn't seem to be as
evangelistic as Thrun, but he still does talk a bit too positively
after some of the quizzes (despite not knowing whether I got the
answer right or wrong), and he does say from time to time that now we
“know how to” do something. Are you sure? I've followed a
fairly tightly defined process – take away the scaffolding, and
could I repeat it? That's not guaranteed.
10. Lack of updates?
The grating
positivity does seem to die down during the course, so there's some
evidence of responding to feedback, but the course first went out
months ago, and despite presumably thousands of students having completed it,
there's no evidence of them going back to attempt to fix any problems
in the earlier videos. As I stated in my previous post on MOOCs: any
conscientious teacher reconsiders his material after any class, which
means an update for every 20-30 students – this course has had a
lot more students than that, so where are the updates?
My own evaluation
So the above was
recreating Delta's complaints, with the specific purpose of defending
him/her against those who claim that the AngryMath article was unfair
as it focused on a sample size of one. But I'd also like to set out my
own views on their own terms.
Because
to me, the big problem wasn't one that appeared in Delta's top 10; it
was that the course is not what I would consider a university-level
course. Or at least, not a complete
university-level course. What I have experienced so far feels a
little too blinkered and focused on one project. I don't remember
any course at any of the three universities I've studied at where the
teaching was driving so clearly towards one single end-of-course task. Each of
the end-of-unit programming tasks brings you closer to that final
task, and the result is a lack of breadth. When I was a student
at Edinburgh, our programming tasks dealt with
incrementally increasing code complexity, but on an exponentially
broadening problem base – no more than two homework tasks would be
as closely linked as all the tasks here are. In essence, what we're
doing is more like a “class project” than a full “class”. Most courses in Edinburgh would change the programming tasks substantially from year to year (certain courses excepted – my hardware design and compiler classes were fairly specialised), but Udacity simply cannot do this as the tasks are fundamental to the syllabus structure.
And Huffman, we're told, “teaches from experience” – which, in layman's terms, basically translates to “he is not a teacher”. He does an admirable job for someone who hasn't
been trained in pedagogy, but really, seriously, would it kill them
to get an actual teacher to teach the course? Huffman's
awkwardness and uncertainty about the format is the reason he keeps
killing the echo effect – he hasn't developed the instinct to know
how much time and space we need to process an idea. At times, he
gives a reasonably broad view of the topic, but at others, he just
splurges onto the page what is needed for the task at hand. There's
no progressive differentiation of concepts, and he doesn't use any
advance organisers to help the learner understand new concepts.
Case
in point: introducing ROT13/the Caesar cypher without once
demonstrating or even describing a codewheel – a video demonstration of a codewheel would have been easy, cheap and clear, whereas his demonstration with lines on the virtual whiteboard was not. Even if you don't use a codewheel, you can always use the parallel alphabets method.
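The parallel alphabets idea is simple enough to sketch in a few lines of Python (my own illustration, not the course's exercise): write the alphabet out twice, with the second copy shifted 13 places, and map each letter straight down to its partner.

```python
import string

# Two "parallel alphabets": plain on top, shifted 13 places underneath.
plain   = string.ascii_lowercase      # abcdefghijklmnopqrstuvwxyz
shifted = plain[13:] + plain[:13]     # nopqrstuvwxyzabcdefghijklm

# str.maketrans pairs each letter in 'plain' with the one "below" it in
# 'shifted' (upper case included) -- exactly the codewheel idea, flattened out.
table = str.maketrans(plain + plain.upper(), shifted + shifted.upper())

def rot13(text):
    return text.translate(table)

print(rot13("Hello, World!"))         # Uryyb, Jbeyq!
print(rot13(rot13("Hello, World!")))  # applying it twice undoes it
```

Because 13 is half of 26, the same table both encodes and decodes – the property that makes ROT13 the standard classroom example in the first place.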
So, yeah, I can see that Thrun really genuinely believes that the
educational establishment doesn't “get it” when it comes to new
education, but he's throwing the baby out with the bath water if that
means getting rid of educationalists altogether.
Teaching vs
training
But Udacity isn't completely abandoning academia – oh no; it's
recreating its mistakes. A recent post on the Udacity blog repeats
that hoary old complaint that education simply doesn't adapt fast enough to new technologies.
In Udacity's own words:
Technologies change quickly. While savvy companies are quick to adapt to these changes, universities are sometimes slower to react. This discrepancy can lead to a growing gap between the skills graduates have and the skills employers need. So how do you figure out exactly what skills employers are looking for? Our thinking: work with industry leaders to teach those skills!
It's the old “academic” vs “vocational” debate once again,
and just as many universities are sacrificing their academic
credentials by providing more and more courses that are mere
“training courses” for a given technology, that's what Udacity is
becoming. Forthcoming courses from Udacity are pretty specific:
- Mobile Applications Development with Android
- Applications Development with Windows 8
- Data Visualization with Mathematica
Thrun keeps talking himself up as an alternative to
university, but he's starting to repaint his site as something that's
more an alternative to a Sams/Teach Yourself/for Dummies coursebook.
Because as they say:
We are working with leading academic researchers and collaborating with Google, NVIDIA, Microsoft, Autodesk, Cadence, and Wolfram to teach knowledge and skills that students will be able to put to use immediately, either in personal projects or as an employee at one of the many companies where these skills are sought after.
That's not what university is about. So Thrun doesn't like
university. Fine. But plenty of us do. Stop criticising
universities for being universities. If you want to be a vendor-specific bootcamp, knock yourself out, but please don't criticise universities for teaching us how to think instead of leading us by the nose through writing a Sharepoint site.
The UK used to have a strong distinction between vocational
institutions (known as “colleges of further education”) and
academic institutions (universities, higher education). It's a
useful distinction, and we should have both – it's not an
“either/or” question.
On the other hand, I suppose Thrun's worked out the answer to how to fund MOOCs: sell out to big business. I hope they're paying you well enough.