08 December 2013

Memory schedules and the danger of the hard line

Spaced repetition systems (SRS) have become very popular in the computer-assisted language learning community as a means to learn and revise vocabulary and phrases.

Spaced repetition is essentially just flashcard software, but flashcard software backed by an intelligent algorithm that attempts to find the optimally efficient timing to aid your memory.

This timing all arises from the idea of "memory schedules" put forth by people such as Paul Pimsleur, who gave his name to a popular series of cassette-based language courses in the 1960s (a series which continues to be published to this day with virtually no changes to its basic structure).

The first principle behind these memory schedules is that new information is gradually forgotten, and the learner needs to be reminded of it. (So far, so obvious.)

The second principle is that the better learned something is, the longer it takes before being fully forgotten -- the memory pattern is stronger. (Also obvious.)

Taken together, these principles give the basic form of memory schedules: revise the item to be remembered just late enough so that it's not forgotten, at progressively longer intervals.

Now this of course is obvious, and all teachers schedule their revision along similar principles. The real promise of memory schedules is the ability to put measurable, verified numbers on that timing.

So the hard-liners within memory schedules did put figures in place, completely ignoring the fact that some things are more difficult to remember than others -- for example, memorising the full 10- or 11-digit phone number of someone in a foreign country is going to be harder than memorising the number of your neighbour, whose number will probably only differ from yours by 3 or 4 digits.

The numbers, therefore, can't be right, which is why more recent memory researchers have been looking for formulas that take complexity into account. And as it's hard to measure that complexity objectively beforehand, SRS very sensibly tries to work it out for each item as it goes, based on the learner's performance. That's a good compromise.

But there's another claim from the hard-liners that's really very hard to stomach: the claim that nearly forgetting and then being reminded makes for a stronger memory trace. Than what? Than revising it more frequently.

Less revision = better learning...?

That is a weird claim, and the guys that claim it must presumably have data to back it up, but I just can't see how it can be a generally applicable truth.

What is definitely true is that excessively frequent repetition can keep something in short-term memory without ever forcing it to be stored in, or recalled from, long-term memory; and something that isn't in long-term memory isn't learned. Worse: if you are really frequent in your repetition, you can hold the information in working memory, and never have to recall it at all.

This is, of course, a problem that should be familiar to most language teachers and learners. The first hour of language instruction is often very light. In a one-hour introduction to Finnish, I was taught to say "good morning", "how are you?", "I'm fine", "what is your name?", "my name is...", "what is it?" (or maybe "what is this?" or "that?"), "it's a..." and 6 or 8 nouns (including "car", "key" and "aeroplane"). I forgot everything except "good morning" and "key" within about 24 hours. I now only remember "good morning". Everything was repeated too quickly, so nothing went into long-term memory.

But that is not to say that repeating at the very last minute is the correct answer either -- that claim flies in the face of a lot of the literature on memory, particularly where language is concerned.

There is a simple rule in memory: the more often you are called on to recall something, the quicker and easier it becomes to recall. To use a trivial example, "is" is far quicker to recall than "carbuncle", because we use "is" every single day of our lives. (Or the equivalent in our own language, if we're not English speakers.)

If not reaching that threshold of "nearly forgetting" inhibited better memorisation, then we would be in the paradoxical situation of knowing words we almost never say better than the words we say every day, but I have never said "what's the word... it's on the tip of my tongue... oh yes... is." It just doesn't work that way.

So no, SRS isn't the optimal way to remember individual items, but it's certainly a pretty efficient way to learn a bulk load of items.

It's good, but it doesn't deserve the hard line.

11 November 2013

Is one-parent-one-language misguided?

The Economist recently published an article on infant bilingualism, which challenged one of the great assumptions of the multilingual parenting community: that of "one parent, one language" as the best means of language transfer.

The basic idea is supposed to be that the two parents speak different languages to the child, and the child picks them both up. Many extend this into a three-language pattern by having a third language that the parents speak to each other, so that there's "Mum's language", "Dad's language" and "family language".

The article cites one François Grosjean, a professor emeritus in the field of psycholinguistics and speech processing. As the article puts it, one parent one language is not "a must"; rather, "the child must experience regular monolingual situations in each language". In Grosjean's terms, there must be a "need factor" -- the child must feel that a particular language is necessary in a given context.

The argument is not very well developed in the article, or even on Grosjean's own website. He doesn't really talk directly about the weakness in the one-parent-one-language model, but from the subtext, I would interpret it as follows:

A child in the one-parent-one-language household may well twig that both parents understand one or both of the languages, and therefore could end up picking only one language to speak in.

Grosjean proposes instead a model of home language and outside language (ie the local language used when out shopping etc), but I don't see how this addresses the problem of the child twigging that one language is enough, which in this case would be the outside language (because everyone they know speaks it, whereas in their minds, only two or three people speak the home language).

The other problem that Grosjean fails to address is a matter not directly pertaining to language, but a pragmatic one of education. He talks about the erroneous blaming of language problems on bilingualism, but he seems to have forgotten the lesson of the Germans in the US.

For those German immigrants, home language and outside language was the default strategy, because schools taught almost exclusively in English. The German speakers lived in a state of diglossia where they could talk about everyday matters in what came to be called "kitchen German", but discussing more serious matters required English.

This was already a dangerous situation for the German of the immigrants, as not being able to speak about important matters trivialises a language in the eyes of its speakers, but the language survived... for a while.

The final nail in the coffin was schooling. With initial literacy being taught exclusively in English, all spelling lessons assumed that the child would know a certain base of common household vocabulary, and these words were used in teaching. But if you only speak German at home, your child won't know the word "faucet" (en_gb: "tap"). The establishment of the time misdiagnosed the problem as one of bilingualism interfering with language development, instead of simply identifying a different set of common words to use in initial teaching. Parents were talked into dropping German at home for the sake of their children's development, and the transmission of German within the US stopped.

This problem still hangs over the home-language-and-outside-language family. Is your child going to have problems at school when the teacher holds up a picture of an apple, and the only word your child knows for it is pomme or manzana? How do you prevent your child's confidence from being damaged by sudden exposure to a raft of unknown words?

29 October 2013

Journalists: beware the headline writer

There are some people in life you have to avoid offending; for example, however important being nice to your boss is, it's far more important that you're nice to his PA. Treating the janitor and cleaners with respect always makes your working life much easier and more pleasant, too.

But the journalist has a much more important person to please: the headline writer. Whether this is your editor or there's someone dedicated to the role, this person has the power to undermine your entire article, or, in extreme cases, just outright insult you.

This appears to have happened to a Scotsman journalist by the name of Hugh Reilly. Hugh is a retired teacher who writes a column that is quite informal in style, and often more than a little... abrasive.

Yesterday's column, though, was downright insulting. He opened with the clear implication that Gaelic hasn't moved with the times. Lie. He described it as "terminally-ill". Lie. (And why the sub-editors let him away with that superfluous hyphen I'll never know.) He laid the responsibility for its current resurgence at the feet of the SNP, who in real terms have done less for the language than the Tories (who oversaw the inauguration of Gaelic-medium education) or the Labour and Liberal Democrat parties (whose coalition government passed the Gaelic Language Act 2005). The SNP is more open to accusations of tokenism towards Gaelic than anything approaching Reilly's claimed "life support".

Reilly also treads that weary line of quoting figures that are incomprehensible to readers. Twenty-five million pounds seems like a lot to the average punter who earns a thousandth of that in a year, but in television terms it's utterly piffling. And of course, several prominent figures say that this figure is an exaggeration of the true cost anyway.

I could continue to deconstruct the column, but that would only serve to labour the point: Hugh Reilly's article was ignorant and bigoted, and downright insulting to a great many people.

And one of those people, it would seem, was the man responsible for putting a headline on the piece: A tilt at the windmill of Gaelic.

Ah yes, The Ingenious Gentleman Don Quixote of La Mancha, the great Spanish novel that is often credited as being the first true modern novel, the watershed between the heroic romances of the Middle Ages and the realism and cynicism of Renaissance literature.

The phrase "tilting at windmills" has passed into common speech, and refers to a specific incident in the novel. "Tilting" is a word for charging with a lance, and Don Quixote "tilted" at windmills because he mistook them for giants, believing their whirling sails to be flailing arms.

Don Quixote, as you see, was quite seriously deluded. He was a man declining in years, a retired gentleman, and as a pastime read far too much heroic fiction -- fiction he mistook for fact.

The headline is frankly brilliant. In a mere seven words, it fillets the entire article and gives the author a tremendous slap in the face.

So if you're ever called upon to write a newspaper column, check you're not going to offend the headline writer before you submit your copy.

26 October 2013

Verb valency and its consequences in formal grammar

When I first went to university, I was studying computer science and artificial intelligence (although I switched to pure CS in third year) and I was introduced to formal grammars in elementary natural language processing (NLP -- also known as computational linguistics.) When I later studied languages with the Open University, I again went through an introduction to formal grammars.

Both of these basic introductions were based very heavily on the idea of context-free grammars (Wikipedia link for the curious). First time round, I found it difficult to get my head round these CFGs, but as I hadn't done any serious study of language (French and Italian in high school doesn't count) I couldn't put my finger on it... somehow they just felt wrong.

The only objection that I could form clearly was that language is so intrinsically tied to context that it's meaningless without it. For CFGs to be a valid model of language would imply that grammar has at most only a very minor role in the formation of meaning -- an idea that I personally am opposed to. This was indeed one of the claims of Noam Chomsky (WP), the man credited with first formalising CFGs. Chomsky's claim was that our awareness of grammaticality was not tied into our understanding of language. This idea he demonstrated with nonsense sentences that were superficially grammatically correct, the most famous of which is colorless green ideas sleep furiously.

When I came back to formal grammars it was in the context of a course on English grammar as part of my Open University studies, and comparing the results and consequences of CFGs with the reality of complex structures in English started to make my objections to CFGs much clearer.

My beef was that CFGs typically broke a sentence into a noun phrase and a verb phrase, with the verb phrase containing everything but the subject. This rule was typically formalised as:
S -> NP VP

The first problem I had with this was to do with the passive and active voice, and the role of the grammatical subject of a sentence.

For example, this splits the man ate the snake into NP "the man" and VP "ate the snake", but the man was eaten by the snake also gets given an NP of "the man", even though the relationship between "the man" and the verb "eat" is pretty much diametrically opposite in the two examples.
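
Here's a minimal sketch of that split in code, using the NLTK toolkit with a toy grammar of my own (the rules and word list are purely illustrative). Both sentences parse with "the man" sitting in exactly the same NP slot under S, even though its relationship to the verb is reversed:

import nltk

# Toy CFG built around the S -> NP VP split discussed above.
grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP | Aux V PP
PP -> P NP
Det -> 'the'
N -> 'man' | 'snake'
V -> 'ate' | 'eaten'
Aux -> 'was'
P -> 'by'
""")

parser = nltk.ChartParser(grammar)
active = "the man ate the snake".split()
passive = "the man was eaten by the snake".split()

for sentence in (active, passive):
    for tree in parser.parse(sentence):
        # In both trees the first child of S is the NP "the man",
        # regardless of whether the man is eater or eaten.
        tree.pretty_print()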

With some verbs, this complication is even clearer. Consider the verb close.

The company closed the factory.
The factory was closed by the company.
The factory was closed.
The factory closed.

Notice that in all the different permutations, there is one constant: the thing being closed. We don't need an agent, but we must always have an affected party. Traditional grammatical terminology tells us that close is transitive, because it has to take a direct object in normal use, but there's more to it than that, because we can remove the idea of an agent altogether. There's a small set of verbs that behave this way in English, but that set is growing. These verbs are referred to as ergative, even though there are differences between them and the handling of verbs in an ergative-absolutive language such as Basque. (WP article on English ergative verbs.)

If we split our sentence S -> NP VP without reference to the verb, then we're not considering what the verb itself requires, and without knowing what we've already got, how can we identify what is still needed? This is how Chomsky's model allows us to generate such utter nonsense.

The other thing that bugged me about Chomsky's model was how the trees we were dealing with were so Anglo-centric. Now, we were always given the caveat that the rules were different for different languages, and while that in principle seems fair enough, the differences in structure between CFGs for different languages are pretty huge.

Consider the effect of word order. Languages are often classified by the order of the subject (S), verb (V) and object (O). English is typically SVO: I (S) shot (V) him (O), and in S -> NP VP our subject is the NP while our object becomes part of the verb phrase VP. Most of the Romance languages follow the same SVO pattern, and even though German verbs can be fairly weird from an Anglo-centric viewpoint, most of the time German sentences start with the subject, so Sentence -> NP VP handles most of Western Europe.

In fact, the rule even survives into the extremities of the Indo-European family, as the Indic languages (Hindi/Urdu, Punjabi, Bengali etc) are SOV, so the only difference is that you now have a VP of OV instead of VO, which is a pretty trivial difference. Even Japanese has a tendency to be SOV.

Even a VOS language would be OK, because now you have Sentence -> VP NP, with your VP as verb-object and your NP as the subject. It's a trivial reordering.

But what about the Celtic languages in Europe?

They're mostly VSO (Breton's changed a bit under influence from Latin and then French) and with the subject trapped between the verb and the object, any rule you make now is fundamentally different from the old S->NP VP of English.

First up, if you define it as S->VP NP, your NP now represents the object, not the subject, which is a non-trivial difference. Secondly, it's just plain wrong, because adverbials go after the object (normally; exceptions apply) and you can't make the adverbials part of the object, because they are tied to the verb (hence the name).

Now it should be noted that Noam Chomsky's other big idea is that of universal grammar. Chomsky proposes that all language is just combinations and permutations of an evolved grammatical system that occurs in the brain as a sort of "language acquisition device", hence all languages are subsets of a single genetically-determined "universal grammar". It is quite ironic, then, that it was Chomsky who ended up defining a system that exaggerates the differences between languages rather than highlighting the fundamental similarities between them.

And even if you don't believe in universal grammar, this hiding of similarity is still a problem: conceptually, the Indic languages are very similar to the Celtic languages when word order is ignored (because they are still surprisingly closely related despite thousands of years of independent evolution), but superficially, a CFG for Hindi looks more like one for Japanese than one for Welsh.

Which brings us to valency. This term was proposed by Lucien Tesnière in a 1959 book, as an analogy to valency in chemistry -- the idea that atoms bond to other atoms by a number of connections with certain restrictions.

Looking back at the earlier example of close, the verb has a mandatory bond to an affected entity (the factory, in the example), and an optional bond to an active agent (the company).

Basically, all noun phrases are subordinate to the verb, or as I like to say, the clause "pivots" on its verb.
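
To make that verb-centred view concrete, here's a toy valency lexicon entry of my own devising (a sketch, not Tesnière's actual notation): the verb declares which arguments are mandatory and which are optional, and a clause is judged against that frame rather than against an NP/VP split.

from dataclasses import dataclass, field

@dataclass
class VerbFrame:
    lemma: str
    mandatory: set                       # arguments the verb cannot appear without
    optional: set = field(default_factory=set)

def acceptable(frame, supplied):
    # A clause fits the frame if all mandatory arguments are present
    # and nothing outside the frame has been supplied.
    allowed = frame.mandatory | frame.optional
    return frame.mandatory <= supplied <= allowed

# "close": a mandatory affected entity (patient), an optional agent.
close = VerbFrame("close", mandatory={"patient"}, optional={"agent"})

print(acceptable(close, {"agent", "patient"}))  # The company closed the factory. -> True
print(acceptable(close, {"patient"}))           # The factory (was) closed.       -> True
print(acceptable(close, {"agent"}))             # An agent closing nothing at all -> False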

Tesnière's valency should have changed the way we defined formal grammars, because on paper it's a very small change to the model, and yet 40 years after Tesnière's book was published, my CS/AI study of formal grammars started with a near-identical model to Chomsky's. 47 years after Tesnière, the OU was still using the same models for teaching language students. And almost 50 years after the book, Daniel Jurafsky and James H. Martin published the second edition of Speech and Language Processing, and Chomskyan CFGs are still the sine qua non of formal grammars, rather than a historical footnote. Any course in NLP or computational linguistics is layered on top of CFGs, and piles hack upon hack on top of them to overcome the deep-seated flaws in the paradigm, rather than simply building a paradigm which works.

So why didn't CFGs die?


CFGs, while they were designed to describe human language, are closely coupled to the development of the computer. The parse trees behind CFGs model structures developed early on in the history of computer science, and it is therefore evident that the cognitive scientists involved in them were of the school that believed that the human brain works on the same lines as an electronic computer. (This, by the way, is patent nonsense. If anyone built a computer that worked like a brain, it would be impossible to program, and would be as trustworthy as a single human being in its conclusions -- i.e. completely untrustworthy.)

As it turns out, CFGs are pretty efficient as a model for computers to use to identify parts of speech in a sentence, and it was all the computers had the power to do in the early days. It gave us a good approximation, and something to work with.

But while an approximation based on fundamentally flawed models may get us close, if it gets us close in the wrong way, improving the approximation may be intrinsically too difficult.

The planet Earth, for example, is just over forty thousand kilometres in circumference at the equator. To be able to pinpoint a spot to within 30km would therefore seem to be a pretty damn good approximation. However, the Grand Canyon is 29km wide at its widest, and the amount of work, time and effort it would take to get from the right bank to the left bank would be phenomenal.

So, just as this is a numerically "good" approximation but in practice a really bad one, so are CFGs apparently good, but store up problems for later.

This all goes back to what I was saying last time about meaning existing on multiple levels of abstraction, and as Chomsky had decided that there was some conceptual gap between meaning and grammar, his trees make no attempt to model meaning.

22 October 2013

The word is not the basic unit of language.

I'm sure I've said it before, but the word is not the basic, indivisible unit of language.

From one perspective, in all but completely analytic languages, there are word roots and affixes that words can be divided into. The English presuppose can only really be thought of in English as two elements (pre- and suppose), but the Spanish equivalent presuponer retains the four elements of its Latin origin (pre su(p) pon er). It is not, therefore, indivisible.

So, am I proposing that we spend a lot of time learning the affixes and roots independently? No.

You see, the affixes and roots only gain any real conceptual meaning once they are combined into words that have concrete meaning. An overly explicit focus on the meaning of prefixes becomes an academic exercise, rather than language learning. NB: I say an overly explicit focus, as there must be some focus or you're going to introduce a lot of extra work -- all the pose verbs (suppose, oppose, propose, impose, etc) in Spanish follow the same conjugation as the irregular verb poner that each of them incorporates. If you don't learn that poner is a recurring element, you either memorise all the verbs independently or start making mistakes.

Now, rereading the last paragraph, I notice that I have unconsciously switched from discussing the root pon and the infinitival suffix er to talking about the "element" poner. This demonstrates quite aptly the point I was trying to make in this piece: as language users, we maintain multiple levels of abstraction simultaneously, and at each level, we conceptualise a unit of meaning. Teaching and learning must focus on all units of meaning.

But what does that mean, "all units of meaning"?

The idiom principle and the lexical approach hold that meaning lies at the phrase level, suggesting that the fixed phrase is the indivisible unit of meaning.

The communicative approach and Total Physical Response suggest something similar, stating that meaning is only given to language through use to solve problems.

So here we've already catalogued four levels of abstraction that define units of meaning:
  1. Morphemes (word roots and affixes)
  2. Word
  3. Phrase
  4. Task
These form a clear hierarchy built on selection. A word is a specific selection of morphemes. A phrase is a specific selection of words. Performing a classroom task requires a specific selection of phrases (and words).

The flaw in most approaches is that they place excessive emphasis on individual levels of abstraction. The most infuriating part of it is that they often place excessive emphasis on multiple levels of abstraction. If this seems paradoxical, consider this stereotype of a classroom situation.

The phrase Would you like a banana? is introduced and practised (phrase focus). Now the banana is replaced with an apple, an orange, a lemon and a strawberry. There is the illusion that we're still phrase-focused, but in fact we've switched into word focus and are effectively simply listing fruit vocabulary. There is no need to even think about the meaning of the Would you like part, and you descend into rote repetition; it becomes a mere string of sounds before the one bit that actually needs you to think. In the end, we're memorising a list of phrases and a list of words -- there is no greater model of meaning built.

But there's another practice in teaching that undermines meaning even more, and it manifests itself clearest in verb drills. Consider:

I sleep.
He eats.
She drinks.

How much meaning do these individual utterances have? I argue very little. Everyone sleeps. Everyone eats. Everyone drinks... unless this last one is understatement and you're telling me she's an alcoholic.

But what do we need to make these meaningful? Part of the answer is in the valency or transitivity of the verbs. In English, almost all verbs can be used intransitively (ie with no noun phrase following as direct object or predicate), and in this sense it means "this person carries out this activity with some degree of frequency", but the more common the verb is, the less common this form is. I mean, he skydives is fair enough, because you don't expect it, but he talks is extremely unlikely because, well, doesn't everybody?

So if I'm asked to translate something that doesn't really mean anything to me, it's always going to be a rote exercise. Even if I'm not translating, and I'm just conjugating within the language, changing comer to come produces an utterance just as devoid of meaning as the English he eats, and language without meaning is nothing. The obvious solution is to use transitive patterns of the verb, adding in objects to make a more meaningful utterance -- he eats fish, they don't eat meat etc.

But that doesn't work for I sleep, because sleep is intransitive, so never takes an object.

So this is where the idea of verb valency comes in. The term is taken from Chemistry, where it refers to the number of bonds atoms can make, and here refers to the number of arguments a verb can take. "Arguments" in this context refers to a wider range of items than simply the objects of the transitive verb -- the subject and any adverbials are also classed as arguments.

What's missing in the utterance I sleep? An adverbial. It would become natural as soon as it had an adverbial of manner, time or place, e.g.:
  • I sleep badly
  • I sleep during the day
  • I sleep in a bed
These are still very contrived examples, but they feel more meaningful because they follow the patterns of real use.

The concept of valency is far more appropriate here than the idea of "fixed phrases" with "slots" for individual words, as otherwise we end up with rather minimalist "phrases" like the following:
<person> <conjugation of to do>n't <verb>
This "phrase" would explain the prevalence of the "he doesn't X" over the rare "he Xs"... for certain verbs.
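
Just to make the slot idea concrete, here's a throwaway rendering of that template in code (the word list is hypothetical, and the point is only how little the template itself knows):

# The <person> <do>n't <verb> "phrase with slots", filled in naively.
DO = {"I": "do", "you": "do", "he": "does", "she": "does", "we": "do", "they": "do"}

def negative(person, verb_phrase):
    return f"{person} {DO[person]}n't {verb_phrase}"

print(negative("he", "smoke"))        # he doesn't smoke
print(negative("they", "eat meat"))   # they don't eat meat

It happily generates the common "he doesn't X" pattern, but nothing in it knows which verbs actually favour the negative -- that information belongs to the verb, which is exactly what a valency description captures.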

09 October 2013

A wrod on errors

Since I started learning languages, I've had lots of discussions, both online and face-to-face, about the nature of errors. What always surprised me was how blasé many people were about errors -- including teachers.

There are two particular philosophies that I find quite worrying.

First:

"Errors take care of themselves."

The belief expressed by many is that there's no need for any systematic correction of errors, as the learner will work them out given enough time and contact with the language.

In the extreme case, this means no correction whatsoever, leaving the student to pick it up from input.
In the more moderate case, the teacher is expected only to give minimal correction, relating to the very specific error.

But isn't it self-evident that this is not the case? Hasn't everyone met at least one immigrant who still speaks with errors that are systematic and predictable? It's easy to dismiss this as an immigrant "language bubble", with the immigrant living and socialising within his minority community, but that only works in a major urban area, where there is enough of a concentrated population for such a close-knit community to form. But when you get to a rural community, and there's only one or two minority families in the area, why is it that a pensioner who has lived in the country for half his life still sounds decidedly foreign? And not just in accent, but in language patterns too?

Errors do not take care of themselves.

So on to the second:

"Errors don't matter - native speakers make mistakes too"

This one I've heard in many situations, but the most potentially damaging of these is in the learning of minority languages, because in those instances, it is argued that a learner who has failed to learn correctly is somehow equal to a native speaker.

This is a pretty insidious leap of logic, and confuses two issues.

First up, you have the distinction between two classes of errors: systemic errors where the speaker doesn't know the correct structure or word; and performance errors, or "slips", where the speaker stumbles during speech, despite being perfectly comfortable with the correct form.

Then we've got that hoary old chestnut of "bad grammar". Apparently, when we split our infinitives, or end a sentence with a preposition, or use "who" for a grammatical object, we're natives making errors. Well, no. People "break" these rules all the time, so they are in fact not errors. At worst, they are variant forms and therefore a correct form; at best they are the most common forms, hence the correct form.

In truth, the only type of error a native can make is a slip, a performance error, because a native has a full internal model of their language (in a particular dialect), assuming we are talking about someone without a mental or learning disability.

As many linguists say: there is no such thing as a common native error.

Non-native errors are real, and informative

But common non-native errors do exist, and we do a disservice to ourselves and/or our students by ignoring them, as errors provide a very useful insight into what's wrong with a learner's internal model of the language.

The main inspiration for this post was an error I noticed recently in my own French.

It was something along the lines of *ça ne me rien dit, which should've been ça ne me dit rien. I noticed the mistake immediately, and what I said was the corrected version, but my brain had initially formed the sentence incorrectly.

Why? What was the underlying cause of the error?

For those of you who aren't familiar with French, basic negatives traditionally consist of two parts: the particle ne before the verb and any clitic pronouns, and a second particle (pas=not, rien=nothing, jamais=never etc) after the verb. Or rather, I should say "after the first verb", which is more correct, even if some teachers don't bother to go that far.

You see, in French, there is very often only one verb -- where we add "do" in the negative (I do not know), the French don't (je ne sais pas -- compare with the archaic English I know not).

As high school had drummed this into me in simple (one word) tenses, I initially had great difficulty in correctly forming compound verb structures -- I would erroneously place the "pas" after the final verb:
*je ne peux voir pas instead of je ne peux pas voir
and
*je n'ai vu pas instead of je n'ai pas vu

That was a diagnosable error, and having diagnosed it, I consciously worked to eliminate it, and now I have no problems with ne... pas.

And yet I made this mistake with the placing of rien, even though it's the exact same structure... and when I made it, I recognised that it was something I struggle with frequently. Furthermore, I make that mistake with every negative word except "pas".

So I have a diagnosis for this error: my internal model has incorrectly built two structures where it should have created one, because a native speaker has only one. It is clearly, therefore, a non-native error.

(Actually, there's a longer story about a series of errors and corrections, but let's keep it short, shall we?)

Taking action...

What can we do as teachers?

Well, it's not easy, but we have to monitor our students constantly to identify consistent errors. Moreover, we have to look out for apparently inconsistent errors -- I say "apparently" inconsistent, because there really is no such thing as an inconsistent error. If it appears inconsistent, it means that the learner has done what I did with French negatives: used two rules where one should be used. It is then the teacher's job not to correct the broken rule, but to guide the student to use the correct rule.

The more you spot these errors, the more you'll see them recurring in different students, and you'll find that they're actually pretty common errors. The fixes you implement for your students will feed into your initial teaching as a way to avoid the errors in the first place, and everyone wins in the long-term.

04 October 2013

Spaced Repetition Systems: timing and time off

Every now and then I head back to the How To Learn Any Language forums for a wee lurk. A few days ago I popped in and one of the users there was asking for advice on how to deal with the backlog of cards in an SRS flashcard system after time off.

About SRS

For anyone not familiar with SRS, it's basically flashcard software that incorporates an algorithm to schedule repeats and revision of individual items at increasing intervals. The notion of increasing intervals is nothing groundbreaking, often occurring even in primary school spelling lessons (introduce new words, test later in the day, test in the end-of-week spelling test, test at end of term in the final "big" spelling test), but there have been researchers who have tried to formalise this and identify optimal gap lengths. Paul Pimsleur, for instance, devised a series of strictly scheduled gaps for optimal memorisation, a schedule which was used for the creation of the audio language learning software GradInt.

Problems with "memory schedules"

The problem with this strict scheduling is evident in GradInt: the first few "lessons" you generate will consist mostly of silence, because you have no old material to revise. It's not hard to believe the author of GradInt when he states that the commercially-produced Pimsleur courses do not strictly follow the memory schedule that Paul Pimsleur laid out.
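
To make the strictness concrete, here's a sketch of a fixed Pimsleur-style schedule. The intervals are the ones usually quoted for Pimsleur's graduated-interval recall (treat them as illustrative rather than gospel), and the code is my own, not GradInt's:

# A fixed "memory schedule": every item is replayed at the same
# predetermined gaps, regardless of how hard that item is.
INTERVALS = [5, 25, 2 * 60, 10 * 60, 60 * 60, 5 * 60 * 60,
             24 * 60 * 60, 5 * 24 * 60 * 60]   # seconds

def review_times(introduced_at):
    # Moments (in seconds from course start) at which an item is replayed.
    times, t = [], introduced_at
    for gap in INTERVALS:
        t += gap
        times.append(t)
    return times

# On day one there is no old material due yet, which is why a strictly
# generated lesson is mostly silence between the handful of new items.
print(review_times(0)[:4])   # [5, 30, 150, 750]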

Furthermore, the Pimsleur courses are a great example of why memory schedules cannot work: Pimsleur courses are almost identical for most languages (at least at the initial stages) and yet some languages are still more difficult than others. I borrowed several of the 5 and 6 CD courses from my local library (which is why I can't really talk about anything after the initial stages) and while I found Irish easy and finished it quickly with no repeats, I found Vietnamese so difficult that I couldn't even get the first two lessons down pat, so I gave up and handed it back without finishing. Irish is of course similar to Scottish Gaelic, which I was already passably proficient in at the time, whereas Vietnamese is nothing like anything I had ever studied up to that point. Even if I hadn't learnt any Scottish Gaelic beforehand, Irish would still have been easier than Vietnamese, as it has fewer sounds that are alien to an English speaker, and its grammar is still somewhat related to English.

So yes, that's the common-sense and anecdotal rebuttal, but this sort of thing has been measured and the common-sense view borne out: it's easier to learn something that's at least somewhat familiar to start off with. Memory schedules don't factor in the "difficulty" of the material, and that's a weakness.

Enter the SRS

SRS systems don't try to introduce any direct notion of difficulty or complexity at the level of the item being memorised, but instead try to adapt based on the user's perception of difficulty at each revision. It's a good compromise and it works well for the most part, and certainly better than blindly following a set pattern.

However, the more you adapt the algorithm away from the published research papers, the less you can support your software by quoting research. Isn't that an interesting quandary? It's probably better, and only a handful of extreme determinists would doubt it, but it's difficult to claim it's better, because they've got research behind them and you haven't.

That said, I'm happy to accept that an adaptive algorithm is better than a rigid schedule; it's only once we start attempting to choose between adaptive algorithms that the problems start, because whichever choice you make, there's no proof for it. Which brings us to the problem raised in the HTLAL thread...

Stelle's problem

The original poster, Stelle, is planning a holiday, and is going to be away from all PCs, smartphones, etc. for six weeks, and the algorithm-as-dictator is going to be pretty unhappy the first time Anki is booted up after the holiday, because there will be an awful lot of overdue cards.

Several posters commented that that's to be expected, because time away from Anki is time forgetting words, but Stelle then revealed that the holiday in question is walking the Camino de Santiago de Compostela, through Spain. Some words will be forgotten during that time, but a hell of a lot will be revised and remembered better than Anki alone could ever achieve.

SRS and its false assumption

You see, SRS seems to assume that it's the only source of learning. As a compromise, that's fine, because you can't go asking the user to log absolutely everything they've done in every other package, class and book -- it would be too big a job, and would put people off. But if you assume that no SRS for six weeks means no learning, there will inevitably be a lot of revision after that.

The algorithm Anki uses is derived from the SuperMemo 2 (SM-2) algorithm, which generates each new interval based entirely on the card's previous interval and the user's self-rated difficulty. The algorithm appears to take for granted that each revision has been carried out when scheduled.
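
For the curious, here's a simplified sketch of the published SM-2 update rule (Anki's real scheduler is a modified descendant of it, so treat this as an illustration rather than Anki's actual code). Note that the real elapsed time since the last review appears nowhere in the calculation:

def sm2_update(interval, repetitions, easiness, quality):
    # One review of one card; quality is the user's self-rating from 0 to 5.
    if quality >= 3:                     # successful recall
        if repetitions == 0:
            interval = 1                 # days
        elif repetitions == 1:
            interval = 6
        else:
            interval = round(interval * easiness)
        repetitions += 1
    else:                                # failed recall: start the card over
        repetitions = 0
        interval = 1
    # The easiness factor drifts with the rating, but never drops below 1.3
    easiness = max(1.3, easiness + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return interval, repetitions, easiness

state = (0, 0, 2.5)                      # a fresh card
for quality in (4, 4, 4):
    state = sm2_update(*state, quality)
    print(state)                         # intervals grow: 1 day, 6 days, 15 days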

A possible solution

So I say it's wrong to assume that an SRS user hasn't been learning and revising outside of the SRS software, but obviously it's equally wrong to assume that they have. And by the same token it would be wrong to ask "have you revised?" and apply a single answer to every card in the deck, because we can be sure that Stelle will ask where the toilets are, but there's no guarantee that rainbow trout will ever come up in conversation.

But the algorithm could try to work out whether you've revised or not.

I mean, say you had two cards that were due for revision after a day, and you came back to them six weeks later. You get the first one right, you get the second wrong. Isn't it fairly reasonable to assume that you revised the first and not the second? And even though that isn't a 100% safe assumption (the first one might be an easy word, like "taxi", the second may be abnormally difficult), it doesn't really matter, because one way or another, you know one, you don't know the other.

The scheduling of the next revision really has to take this extra information into account.

Under the SM2 algorithm, the first word, the easy one, will be treated as though it's been asked after one day, and even if I rate it really easy, I'm still going to have it scheduled for a few days later... even though I've proven that I've already learnt it. Isn't that crazy? Why not schedule it for weeks away? This may not immediately clear the backlog, but it will reduce it quicker. (Because cards won't be re-added to the backlog as quickly.) As you work through the backlog, the dispersion of cards will increase, as cards are increasingly late.

By the time this is finished, they'll be thoroughly mixed, and you'll be revising the genuinely least well known cards more. If the algorithm was too optimistic about the words you seemed to know, this will be corrected in a couple of iterations.
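
Here's a minimal sketch of what that might look like, assuming an SM-2-style interval and easiness factor (the field names and the growth rule are my own guess at one way to do it, not a patch for Anki):

from dataclasses import dataclass

@dataclass
class Card:
    interval_days: float    # the interval that was scheduled
    easiness: float         # SM-2-style easiness factor

def reschedule(card, days_since_last_review, correct):
    # Next interval in days, crediting the gap the card actually survived.
    if not correct:
        return 1.0                                   # relearn from scratch
    proven = max(card.interval_days, days_since_last_review)
    return proven * card.easiness                    # grow from the proven gap

card = Card(interval_days=1.0, easiness=2.5)
print(reschedule(card, 42, True))    # 105 days out, not just a few
print(reschedule(card, 42, False))   # back to 1 day: clearly not learned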

02 October 2013

Onward and upward

So yesterday I got out of my sling, and I'm now typing two-handed again, which is soooooooooo much easier: I just couldn't avoid typos one-handed, and spent ages correcting them (which made typing so little fun that I did nothing on this blog for the whole time).

I swore I was going to stop wasting my time, yet I achieved nothing today (apart from doing a few mobilisation exercises to try to get the arm moving again).

Tomorrow I'm going into town (heading to the Jobcentre), so while I'm in I'm going to pick up a few very useful things: a very large pad of paper (a flipchart pad, probably) and some pens for it.

I have two projects that have been hard to get into on a computer left-handed, and the idea of being able to "splurge" my thoughts onto paper in a messy, arbitrary way is quite appealing, cos it's hard to really map out language stuff on a computer screen, given that the structures in language are themselves rather messy and arbitrary, and don't really fit tables that well.

So what exactly are these projects?

Project 1: Face-to-face Gaelic course

I've been thinking about teaching a Gaelic course in my quiet little Central Scotland village. There's a fair few people here who would be interested (in theory, at least) and I've spoken to one guy who is very keen on it. But it's scary doing the "home gig", where everyone in the crowd is someone you know, and I've always chickened out. However, I'm sitting here with no definite work (there's some possible online teaching in the pipeline) and no unemployment benefits, so it would be nice to get some money coming in, even if it is scary.

What I want to do now is get a map of some of the big patterns in the language and try to see how to teach them in a logical, integrated manner, rather than the normal scattergun approach a lot of courses (in any language) take.

Then I can phone up the caretaker of the local community centre and price up a course, and canvass interest before booking anything.

Project 2: Language learning software

About a year ago, I started prototyping a somewhat ambitious piece of language learning software, which I used to pick up a bit of Corsican while I was living in Corsica. The prototype was a bit of a hack, and every time I upgraded it the software got more and more unwieldy, with every new addition and change taking longer and longer to program in, and taking more and more memory and processor time.

So I've restarted, trying to reduce the memory requirements and increase the modularity of the code, so that changing one thing doesn't start a domino-rally of changes. This also means that it's easier to generalise across languages, so I've been trying to build the latest version in several languages simultaneously.

But last year I promised my (now former) students I'd have an alpha/beta version for them (English for French speakers) and I've lost a lot of time while I did very very little left-hand-only work on the software.

So I've got no choice: I made a promise to get this software going, and I really want to have something useful before Christmas.

29 September 2013

An end to laziness

This summer, I got the opportunity to take a once-in-a-lifetime type of trip: I attached a trailer to a racing bike and crossed France from the Mediterranean to the English Channel. As you might imagine, I was more than a little tired when I made it back to Scotland, and I was going to give myself a week to rest.

I had one more thing to do before I could give myself total rest, and that was to visit my brother in the Highlands. Unfortunately, there was some light rain the day I was heading home, and when I crossed the tracks at a level crossing, my bicycle slid sideways and brought me down hard on my right shoulder, breaking my collarbone.

Tomorrow, I'll have been in this sling for seven weeks, and I've not really put that time to good use. No language study time, anyway, and rather than watch any of my foreign language DVD collection, I just watched oodles of rubbish off satellite TV.

OK, so everything's easier with a right hand, but nothing is impossible without one. The pain for the first couple of weeks was very distracting, but after that it was self-pity merging into just straight up bad habits and poor excuses.

Tomorrow, I'm going for a check-up at the hospital, and with luck I'll be out of the sling. Whether I am or not, this has to be a breakpoint: an end to laziness, a new push to make the most of my time. I've loaded up my phone with several CDs of Michel Thomas German (a course I never finished) to listen to in the waiting room.

No more excuses.

14 August 2013

Hiatus extended...

My right arm's in a sling (cycling accident), so blogging's off the menu for a while longer...

27 June 2013

Sustainability - the missing word in language revitalisation?

Since the start of the century, a new watchword has been on the rise in the world of development: sustainability.

The word started its life associated with ecology, with ideas such as "sustainable forestry", meaning only logging as many trees as a forest can grow back, and "sustainable fisheries" – only taking as many fish from the sea, lake or river as can be replaced at a natural rate.

Sustainability moved away from the idea of ecology, to an idea of economics – sustainable exploitation of natural resources isn't just a matter of caring about nature for its own sake, but rather the basic common sense of not undermining future supplies in the quest for short-term profit.

But economic sustainability is about the availability of all resources, not merely natural ones.  An economically sustainable system requires no ongoing subsidy, even if it needs some initial investment.  You will rarely hear the word "sustainable" used to describe renewable energy, as while it may use natural resources in a sustainable way, it is not an economically self-sustaining system.

Charities the world over are taking on this idea of sustainability within economic development in poorer areas, and rather than providing indiscriminate aid, they are starting to target their aid on providing startup funding for economically and ecologically sustainable businesses, serving local needs and generating local employment while leaving the local environment intact and friendly to continued human habitation.

And here's where we get to language revitalisation, because language revitalisation is increasingly being recognised as being inextricably linked to community development, and yet sustainability is something of a missing element in a lot of decisions surrounding public funding of language development.

This is not to say that sustainability has been excluded – a few years ago Bòrd na Gàidhlig, the Scottish Gaelic development agency, granted a large sum to a private company in order to fund the development of a proprietary teaching method, and there was a large bonus written into the contract contingent on reaching a certain number of learners and, crucially, on those learners completing the course. But this was limited funding, and the company was expected to make its own profits directly through student fees and sales of supplementary products. That is sustainable spending, but while it's there at that level, it's missing elsewhere.

The word that Bòrd na Gàidhlig and many other language development agencies worldwide seem to be missing isn't actually "sustainability", but rather "ecosystem", because every project is viewed in isolation, which is a particular problem when you're dealing with projects to produce learner materials.  If we look at many of the materials projects out there, it is very difficult to see how they can be plugged in to wider teaching, and most of this is down to the problems around rights of use arising from copyright.

As an example, I'd point to some of the resources available at learngaelic.net.

First of all, there's a database of excerpts from the daily news programme An Là, from BBC Alba. The site has various news stories in video format with an interactive transcript, but they are expected to be viewed on the website only. You are not expected to download them to watch offline, and (maddeningly) there is nothing there to clarify to teachers what rights they have (if any) to use this as a classroom resource. And heck, even if I was permitted to use it in class, I still couldn't, because my policy as a teacher is never, ever to use streaming in class. It breaks far too often. In this case, the problem is compounded by the fact that the video is encoded at high quality, and the player doesn't maintain a large video cache, meaning that on a slow connection, you just cannot watch it... which is obviously also a problem for the self-teacher at home.

The other video series at learngaelic.net is entitled Look@LearnGaelic, and consists of specially commissioned videos including interviews and short documentary style videos spoken slowly and clearly, with accompanying transcripts and/or subtitling.  These are good quality resources, but they're hemmed into the site, and kept away from teachers.

I mean, the video player doesn't even have a full-screen button, so even if you put it up on a projector, you'd still have a horribly small image – not what you want in a classroom.  (From a technical perspective, it gets worse.  The BBC Alba videos are in standard TV definition, as you'd expect, as BBC Alba isn't an HD channel.  The media player on the website has SD dimensions.  The same player is used for the Look videos, but the ones I examined were in high def, thus meaning I was streaming almost 4x as much data as could be shown on the screen anyway.)

What a waste!  These are materials paid for almost entirely with public money, and they're of value to the public, but the public won't use them, because without a course or a teacher behind them, they're stumbling in the dark looking for something that's appropriate to them.

As I said, it's not only in Scottish Gaelic that this problem arises. A month or two ago I learned of an Irish "phrase of the week" series free on the net, funded by the Irish language agency (whether fully or in part, I can't be sure). But it was copyrighted to the people that made it, all rights reserved, so it was only of use to the independent learner. The thing that really irked me was that this was basic beginners' phrases – the sort of thing that everyone would be learning in their lessons or from their books anyway. It added nothing to the "ecosystem". When I started writing this post, I went searching for it... and I can't find it, because there are dozens of "beginners' phrases" video series on YouTube (not subsidised), so this series adds no value whatsoever.

So what is the sustainable solution?  How do we get the material into the hands of the teachers without disadvantaging the creators?

The way I see it, there are two sides to this, and a distinction has to be made between "individually sustainable" activity and "ecosystem" activity, with distinct models of ownership of intellectual property for each.

Individually sustainable activity

A full language course could and should be individually sustainable, in that it makes enough money to fund itself. In such a system, the ownership should remain with the producer of the material, because it is only through control of the material that the course can generate revenue.

Ecosystem activity

When a production house creates an educational resource, they retain full rights to the material.  A great amount of this material is free of charge to the end user, and generates no profits, so is in no way sustainable.  So we should think of it as analogous to an environmental issue.

When an environmental group gets a grant to repair human damage to a local stream, this does not result in them gaining ownership of the stream.  The stream is not a business and cannot be seen as "individually sustainable" divorced from the wider ecosystem of natural and human use.  The end-goal of the environmental work is to the wider public benefit.  Crucially, though, you wouldn't pay for a river cleanup if you knew that the next day, some factory upstream was just going to pollute it all again.

So any activity that cannot be individually financially self-sustaining can only be sustainable if it feeds into and nourishes a wider productive ecosystem, and individual resources should not be under individual control.

Instead, development agencies should be commissioning the materials and taking ownership of them.  These materials can then be made available for exploitation within commercial activity.

In this model, the resources are an indirect subsidy, and crucially a shared subsidy – the agency pays once for something that is used in a dozen courses.  Not only is this cheaper for the development agency, but it would allow much more experimentation and innovation.

I could recut the videos, reorder the videos, recombine the videos; I could optimise their applicability to my students.  This is stuff I cannot afford to do as an individual teacher if I'm starting from zero.  As the material is freely available anyway, I can't pretend it's mine and "sell" it – the only "product" I would be selling is the teaching, and my teaching would be improved.  I would work just as hard, I would charge the same as I would otherwise, and in the end it would be the students that benefited most from it.

A final thought...

People who want to make their materials free often add a condition of "non-commercial" to their license.  But the major publishing houses are not the only commercial entities in language teaching.  Evening class teachers and private tutors are usually self-employed and therefore commercial actors, and in some parts of the world even schools and universities are private sector institutions.  Non-commercial licenses are rarely appropriate to education materials....

25 June 2013

Language, the independence referendum and the Scottish identity

It was always going to be controversial, but Scottish Gaelic didn't get the approval it needed to be included on the referendum paper for Scottish independence.

A bunch of people mobilised rather late in the day to campaign for it. Right from the start, though, I argued that a campaign for Gaelic on the referendum paper was really on a hiding to nothing. The question of legitimacy always hangs over a referendum, and the best way to prevent any questions of legitimacy was to stick with the Westminster/Whitehall rules on elections. That is why I always felt that the appropriate course of action was to campaign for those rules to be changed, which would affect not only the current referendum, but all future elections, whichever way the vote goes.

As far as I'm concerned, all the noise about the referendum question is just wasted energy, for two reasons:
  1. Even if they succeeded in changing the policy, it would be a one-off with no automatic effect; while for the same effort we could get the referendum and everything else.
  2. There was no way in hell the Yes campaign were going to go for it, because it doesn't help them achieve their goals, and gives the No campaign another stick to bash them unfairly with.
This morning, thanks to a Facebook post by a man born in the Basque Country (the part north of the Spanish/French border) to Galician and Asturian parents, and via a blog that would appear to be Irish, I was directed to a blog post from a Scotsman decrying the lack of Gaelic on the paper, entitled "Yes Scotland. No Gaelic. Feart Horses."

The post is (and I'm sorry if this seems harsh) just an exercise in bigotry.  Let's look at the author's opening line:
"It's the language that dare not speak its name. Partly because its name - Scots/ Scottish - has been hijacked by another - Inglis/ Anglo-Saxon."
Since when did the term "Scots/Scottish" belong to Gaelic? It never did.

Yes, there were times in history when some outside commentators called it "the Scottis tongue" or similar, but this was always an exonym -- a name chosen from outside the speech community to describe it. If you look at the endonym (the name from inside the community -- i.e. the name the Gaels themselves use) it has always been Gàidhlig or some variant thereof (and the same goes for Irish, incidentally).

His insistence that the Anglo-Saxon-derived Lowland Scots language is properly called Inglis or Anglo-Saxon isn't a standard to which any other language would be held. Consider that speakers of the ancestors of Modern French, Italian, Spanish etc all called their language either "Latin" or "Roman" for many centuries after those varieties ceased to be mutually comprehensible; at some point the speakers started self-identifying differently, and changed the way they referred to their language.

Consider the Franks. They were a Germanic tribe who took over Roman Gaul and picked up Latin (badly), but they never ceased identifying themselves as "Franks", even though they spoke "Roman". Eventually they decided this was stupid, so now they're French people who speak French -- even those who aren't descended from the Franks (most of modern France isn't).

Which brings us neatly to "Scotland" and "Scottish".

"Scot" was never a term that "belonged to" the Scottish Gael. It originated as a Latin term for an Irishman, but there is no historical evidence of either Irish or Scottish Gaels self-identifying with the term.  It was an "exonym" (a name imposed from outside the speech community) rather than an "endonym" (one used from inside). It was the Scottish Lowlanders, not the Highlanders that continued the term when Gaelic died out, and it is entirely reasonable that a Scottish person would want to use the same term for his nationality and his language -- not to do so would be to effectively declare himself "less Scottish" than a Highlander.

Now, like France, modern Scotland is composed of the territory of multiple tribal/ethnic groups. In the northwest, the Highland Gaels; in the northeast, the Picts; in the southwest, the Galwegians (a Gaelic-speaking people) and the Strathclyde Britons; and in the southeast, the Anglo-Saxon/Danish people of northern Northumbria.

Five peoples, one country.  It would be crazy to try to unwind a millennium of history and declare that only the one of those five cultures and races that the name originally applied to counts as truly Scottish.

And yet that is what Mac an t-Srònnaich wants us to do.

He is particularly vocal about the fact that the Gaels tend to allow their self-identity to be subjugated by English:
"The fact that some Gaels think this way is neither here nor there - every indigenous and once-repressed people has it's own doubters. Centuries of repression and decades of having the language beat out of you in school will leave some people's self-respect at a low ebb."
And yet... notice the word "feart" in the title -- the original idiom is "frighten the horses", so the Scots word was the author's own choice.  He also says "aye".
 
I do not know who Mac an t-Srònnaich is, but it seems to me most likely that he's a Lowlander, and a learner of Gaelic.  I strongly suspect that his childhood lect was somewhere on the spectrum between Scots and English.  I wouldn't be scared to suggest that at primary school he, like me, was told not to use words like "feart", "aye" and "cannae", but to say "afraid", "yes" and "cannot" instead.
 
I would suggest that Mac an t-Srònnaich is suffering from the same affliction that he accuses the Gaels of: he has been so deeply shamed into devaluing his own language that he denigrates it himself, and like many of his ilk, he has found it easier to pick a new language to support than to challenge the negative attitudes towards his own.  If that's the case, he has no right to criticise anyone else for it.

20 June 2013

Pattern identification and language learning

A few weeks ago I read an article reporting on a scientific study which had found that students who are good at abstract pattern matching tasks perform better in Hebrew language lessons. (Unfortunately I don't have access to the original journal paper.)

Now, we have two ways of interpreting this outcome: the fatalistic and the optimistic.

The fatalistic interpretation

A fatalist will say that it is proof positive of the existence of talent, and that those who do not have this talent are doomed to failure at learning languages.

The optimistic interpretation

The optimistic interpretation is to say that the successful learners are succeeding despite the teaching, and that the study, by shedding new light on what those learners actually do, shows what teachers should be doing if they want their students to succeed.

"Rules" are a bit out of fashion in language teaching.  I've always said that this is because people have been teaching incorrect rules, rather than that rules are inherently unsuited to language teaching.  A linguistic "rule" is (or should be, at least) nothing more than an observed pattern emerging from real usage.  If successful learners are those who can identify patterns, then we must assume that after identifying these patterns, they learn them.

So I would suggest that the logical conclusion is that we should be teaching these patterns to students, rather than relying on them to identify the patterns for themselves.

An example pattern

When I was learning French (my first foreign language) at high school, I noticed the distinction between the "long conjugations" and the "short conjugations" of the present tense (the long conjugations are the nous and vous forms, and the short conjugations are the rest), and I noticed that in irregular verbs, the long forms were almost always regular.  (Is there any verb other than être that has irregular long conjugations?)

So while my classmates were attempting to memorise the irregular verb tables by rote, I was saving myself time and effort by only memorising the conjugations that didn't arise out of bog standard, regular conjugation.

Now it could be that the advocates of the discovery method are right, and that part of my success was down to the fact that I worked this out for myself, but I doubt it.  And even if that's true, is it fair to trade off the success of the majority against the success of a lucky few?  It is obvious that being told this would have reduced the effort required by my classmates to learn their irregular verbs.

Teaching patterns vs teaching rules

The problem with most of the "rules" traditionally presented in grammar books is that they are more strictly ranked and regimented than the language is in real life.  Real patterns in real language can't be so neatly packaged tense by tense.

Take one of the patterns in Spanish taught by Michel Thomas:
The third person plural conjugation of a verb is the third person singular conjugation plus N, except in the preterite.

Now that's not how he taught it, but that's the concise description.  When he taught it, he simply used it in the present, and then got the students to apply it in the other tenses.

One of the reasons some people don't like this pattern is the "except" bit, but this really isn't a problem, because the pattern also holds for the second person singular: third singular +S, except (again) in the preterite.

It's regular, it's predictable.  Even the "exception" is regular in that it's an exception for both 2S and 3P.
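For the programmers among you, here's a minimal sketch of that pattern in Python. It's an illustration of the rule as stated above, not of how Thomas actually presented it, and certainly not a full conjugator; the function name and the example forms of hablar are my own choices.

# A sketch of the stated pattern: from a Spanish third-person-singular form,
# the second person singular adds -s and the third person plural adds -n,
# except in the preterite, which the pattern explicitly excludes.
def from_third_singular(form, tense):
    if tense == "preterite":
        raise ValueError("the pattern explicitly excludes the preterite")
    return {"3sg": form, "2sg": form + "s", "3pl": form + "n"}

for tense, form in [("present", "habla"), ("imperfect", "hablaba"),
                    ("future", "hablará"), ("conditional", "hablaría")]:
    print(tense, from_third_singular(form, tense))
# present {'3sg': 'habla', '2sg': 'hablas', '3pl': 'hablan'}
# imperfect {'3sg': 'hablaba', '2sg': 'hablabas', '3pl': 'hablaban'}
# future {'3sg': 'hablará', '2sg': 'hablarás', '3pl': 'hablarán'}
# conditional {'3sg': 'hablaría', '2sg': 'hablarías', '3pl': 'hablarían'}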

This is the sort of pattern that I suspect all those successful learners are finding, and this is the sort of pattern that made Thomas such an effective teacher.

So let's find those patterns, and let's teach them.

14 June 2013

The obligatory car analogy...


When you want to explain something complex to a non-expert, there's no tool more useful than a good analogy. Sadly, there are few tools more open to abuse than the humble analogy, and in a great many cases, the subject of this abuse is the humble automobile. There is a rule of thumb on the internet that says you should never trust a car analogy.

The danger of analogy is that it leads to what appears to be a logical conclusion even when the analogy itself is false, but thankfully we've had the conceptual tools to analyse such arguments since at least Ancient Greece. Heck, even the words “logic” and “analogy” come from Ancient Greek.
Regular readers will know that I'm not a fan of the “learning/acquisition” distinction, or the school of thought that says that rules don't matter, and that the only way to “acquire” is through exposure.  Well, recently I was reminded of that particular school of thought's own pet car analogy, and I would like to dismantle it here.
Grammar, they tell us, is unimportant. Do we need to know how a language works in order to speak it? Well, they say, consider a car: do you need to know how the engine works in order to drive it?

The reasoning seems persuasive to those who are predisposed to listen, but as with all analogies, the problem lies in whether the analogised items are truly equivalent.
Is “how a language works” analogous to “how the engine works”? Certainly not – it is analogous to “how the car works”. Some commentators would suggest that the engine is how the car works – I would like to argue against this.
To a driver, a car is not the engine. From the very beginning, the goal of the engineer has been to abstract away features that the driver shouldn't have to think about and turn the engine into something of a “black box” – you read the instrumentation, manipulate the controls, and then the car responds in a consistent and predictable way based on what you tell it to do. The driver does not need to know what “RPM” means to recognise when they're over-revving the engine – revolutions-per-minute, cylinder cycles... irrelevant – but the driver does have to be told that over-revving is a bad thing, and has to learn the “rules” of reading the needle and listening to engine noise to avoid doing it.
The acquisition crowd are not, I hope, suggesting that you could put someone in a car with no knowledge of the steering system, gearbox, speed controls and indicators and just let them get on with it. The end result of that would at best be nothing, at worst a seriously damaged car. OK, so you're not going to destroy someone's brain by throwing them into a language at the deep end, but if they can't even start the language's “car”, they're never going to get any useful feedback at all.
So we have three elements in the target of our analogy:
  • The car as a whole
  • The car's control system
  • The car's engine
The question is: is the grammar the “control system” or the “engine”? Quite simply... urm... possibly maybe both....

Grammar as Control System


Most of the grammar of a language is unambiguously “control system”, as the speaker must directly manipulate it in order to make himself understood.
Consider the spark-plugs in a diesel engine. Wait... a diesel engine doesn't have any spark-plugs. But this doesn't matter – this makes practically no difference to the driver. The accelerator works the same as the accelerator in a petrol engine with its spark plugs, and pressing it down harder makes the wheels spin faster. “The car”, as an entity, is operated identically – as far as the driver is concerned, it “works the same way”.
But let's look at a grammatical distinction, and for the sake of the argument I'll take the use of articles. English has them, Polish doesn't. If articles were like spark-plugs, that would mean that the article is entirely irrelevant to the manipulation of the language, but this is patently false. If you don't correctly manipulate the article, your sentence is wrong.
So a great many grammar rules are undeniably part of the control system.

Grammar as engine


Grammar as a whole has been a very expansive and extensive field of study – in fact, I'm led to believe that “grammar” originally meant the description of a whole language. Grammar today usually means “everything except vocabulary, pronunciation and spelling”, so a lot of stuff gets caught up in it which may be considered “engine”: historical changes, derivational morphology (the etymology of the word presuppose is of very little use to the average learner of English) and distinctions like that between reflexive and impersonal/pronominal pronouns in the Romance languages.
But to use these few examples as a reason to throw out all conscious description of grammar is hugely short-sighted.

My car analogy

So that's their car analogy disproven, and I'd like to replace it with one of my own.

To ask someone to learn a language without grammar is like putting someone in the driving seat of a car without drumming the words “mirror, signal, manoeuvre” into their heads, and without telling them never to cross their hands at the wheel.
On the other hand, a lot of grammar-heavy teaching is like teaching someone to drive by carrying out the exact same manoeuvre 20 times in superficially different (but functionally identical) locations, then moving on and doing the same thing with a different manoeuvre.
This is not what any good driving teacher does.

A driving teacher takes the beginner to a safe, simple environment (eg an empty car park) and teaches the basic rules of operating the vehicle. The learner won't even be allowed to start the engine until they've started building the habit of checking all mirrors. Then they will learn to start and stop. A bit of controlled speed, then a bit of steering. The complexity increases steadily, and the instructor chooses increasingly complex environments so that the learner has to apply and combine the rules in ever more sophisticated ways. Rules are introduced gradually, as required, and then applied and manipulated as the situation demands. The teacher initially picks routes that don't require turning across traffic, then picks a route with a safe across-traffic turn, then adds in crossroads, traffic lights, roundabouts, filter lanes etc one by one. But these features are never treated as discrete items to learn individually – they are elements of one continuous whole that must be practised in the context of that whole.
This is what a good driving teacher does, and this is what a good language teacher does. Listen to one of Michel Thomas's courses [1] and you'll see that's exactly what he does: an increasingly complicated linguistic environment, and no language point ever treated in isolation beyond its basic introduction.  That's proper teaching, and it's all built on grammar and rules.

[1] I mean a course that he himself planned and delivered, not one of the courses released after his death.

11 June 2013

Link drop: to myself!

A couple of weeks ago, I got invited to write an article for a multi-author blog entitled MOOC News and Reviews, all about these newfangled free online course thingummijigs.  Well, it seemed like a good opportunity to continue writing about these things without constantly boring my language-orientated readers here with it.

My first article has just gone online; in it I discuss the benefits to the learner of having a good old-fashioned whinge once in a while, and the barriers that online discussion puts in the way of the student who feels confused or dissatisfied.

10 June 2013

Ivan MacDonald

I've said plenty of times in plenty of places that my interest in languages is about people, about the personal bond that comes from making the honest effort to speak to them in their own language, even if just sticking to English would be easier for both parties. It's something I've experienced in several languages, but none more so than Scottish Gaelic.

Scottish Gaelic is a language that comes with a real sense of community – although half the speakers now live in Scotland's major cities, they're all still only a step or two away from their extended families in one of the island or isolated mainland communities. Spend even a small amount of time in one of these communities and you can't fail to be impressed by the welcome and generosity you are shown, even if you don't speak Gaelic.
 
I've spent less than a month in Uist in total, yet there are people there who recognise me on sight and always greet me with a warm smile. A little over a year ago I walked into the local pub near where I was living on Skye, and a guy came up and talked to me, addressing me by name. I didn't recognise him at first – we'd only met twice in our lives, two summers earlier and the one before that – but he was so happy to see me and catch up. I was blown away by it, I really was. I swore to myself that I'd never let myself forget his face, and he genuinely became someone I would recognise absolutely anywhere. When I caught up with him last summer, he was right at the heart of the community, sitting on the board of Ceòlas as they discussed some major changes to the organisation.
 
Mid-morning today, I switched on my computer to check my messages, and saw the following Facebook status a friend had posted last night:
    Well this is the end of a tragic day in uist! Many people in uist and beyond have been affected by today's news. We've lost a true gentleman character. Ivan you'll be sorely missed by everyone whose has the pleasure of being in your company. God rest your soul!! Xx

Ivan.

Even before I hovered over the name, I knew it was him, and not only because Ivan isn't a particularly common name there. In the islands, it's relatively rare for anyone to be commonly referred to by a single name, but Ivan was Ivan.
 
I met Ivan at a party during Ceòlas week where he turned up in a grubby boiler suit carrying a pipe case. Now this wasn't just any old session – we had more than a few of the “hot” young names on the Scottish traditional music scene, and they would be turning to Ivan for forgotten tunes and suggestions on what tunes to put together in a set.

I'd never heard of this guy, and I still know nothing about his past musical career, but that's beside the point – Ivan didn't need a CV to get respect, he naturally commanded it. Calm and unassuming, his confidence and ability on his instruments were something rarely equalled, even among full-time professionals.

He was 33 and his loss will be sorely felt far beyond his island.  My grief seems disproportionate given how little I actually knew him, but that says more about Ivan than anything else could.
 
 

07 June 2013

The filter of perception

I've been a bit preoccupied with exam season and have been putting off many things, including blogging.  But I'm going to try to get back into it and to get this blog back on the language track; I'll mostly be posting my thoughts on MOOCs elsewhere.

So here for your perusal are some recent thoughts on the filter of perception, and how it affects notions of learning by absorption.

During a recent break, while I was holed up at my parents' house due to a foot injury (a pratfall in a doorway), I started studying German with one of the free interactive web courses.  I'm relatively happy with the course itself, although there are (as with any course) some rather stupid things in it.

There are various types of questions in the course, including translation, multiple choice and dictation.

So there I was, and I heard the computer say to me:
"Liest du Büchen?"
which I started typing.  But then I stopped, because I'd made a mistake with this before.  The form I should have been typing was "Bücher".  I knew this.  But I'm telling you, I heard Büchen.  Yes, the computer said Bücher, but I heard Büchen, clear as day.

The brain is a remarkable thing, and what we perceive is not always what hits our eyes or ears.

There are rules to the universe, and once our brain knows the rules, it filters what we receive to produce a perception that matches our expectation of the universe.  If we see a man standing half-hidden behind a wall, we don't perceive "half a man", we perceive "a man that we can see half of".  We make a rough guess at the hidden bits based on proportions relative to what we can see and the many hundreds of humans we've seen in our lives.  If you see a man with a hand over one (presumed) eye, you assume there's an eye underneath, and you would only be surprised when he moved his hand if there wasn't an eye there.

In language, this is particularly useful as it lets us understand dialectal variation without too much effort, and crucially without ever having to truly "learn" the dialect we're trying to listen to. 
If an Irishman said to me "ten times tree is tirty", I might well perceive "ten times three is thirty", and if the conversation was quick enough, I might not even be consciously aware that he had said T sounds instead of TH.  And if I said to him that "ten times three is thirty", he wouldn't have any problems understanding me, just because I used the "extra" TH sound that isn't in his inventory.

But while that's good for the fluent speaker, it's a potential pitfall for the language learner.  In the case of my German lesson, I had an unconscious rule in my head that said "-en is the German plural suffix" and that filtered the received "Bücher" and gave me the perceived "Büchen".  Now before anyone blames "rules" for my error, let me make it clear that this was an internalised, procedural rule rather than a conscious, declarative one.

Had I never been punished for perceiving it wrong, my ear would probably never have learned to perceive the difference, because there would have been no impetus to do so.  (Say, for example, if I had only ever been asked to translate from German to English.)

And so it is for anyone living through a foreign language.  I recall one interesting experience when doing a listening lesson with two private students (I thought I'd mentioned this here before, but I can't find it in my posting archive).  There was a gist-listening exercise with comprehension questions, and then a series of close-listening tasks consisting of a sentence or two of audio and a fill-in-the-gaps version of the sentence on the worksheet.  As they whittled away the gaps word by word, they were left with two gaps, but two wasn't enough, because every time they listened to the recording, they heard three words: "prices of houses".  I replayed the file several times, watching them in fascination: "prices of houses", "prices of houses", "prices of houses".  How was it that even when they were listening very, very closely, they couldn't perceive the simple phrase "house prices"?

As far as I can see, it comes down to this:
that structure wasn't part of their language model, and continued exposure to the language only trained their ability to filter the input to adapt it to their structure, rather than adapting their structure to match the input.

If we comprehend input by mangling it to match our internal model, then accurate acquisition by comprehensible input alone must be an impossible dream.

06 June 2013

Mailmerge fail...

I received the following in an email this morning from the Open University:


June’s edition of OpenNews


You may have noticed that in your June edition of OpenNews the usual salutation was omitted. We’re sorry for this technical error but will ensure it is rectified before next month’s newsletter.

We hope this does not detract from your enjoyment of the newsletter.
Detract from my enjoyment?  Perish the thought!  There are few things better for alleviating newsletter fatigue than a simple little silly error.  The previous email had greeted me with the wonderful line Dear {salutation}.

I liked it that way....
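For the curious, here's a hypothetical sketch in Python of how that kind of thing tends to happen. I have no idea what system the OU actually uses; the template, field name and fallback behaviour below are all made up for illustration.

# A hypothetical mail-merge slip, not the OU's actual system: if the merge data
# is missing a field and the code falls back to sending the template untouched,
# the literal placeholder goes straight out in the email.
template = "Dear {salutation},\n\nWelcome to this month's edition of OpenNews."

def merge(template, fields):
    try:
        return template.format(**fields)
    except KeyError:
        # A "helpful" fallback: send the template as-is rather than failing loudly.
        return template

print(merge(template, {"salutation": "Mr Smith"}))  # Dear Mr Smith, ...
print(merge(template, {}))                          # Dear {salutation}, ...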

31 May 2013

Backflipping the classroom – nothing new under the sun.


Before its sudden closure, the Fundamentals of Online Education course reintroduced me to two terms I'd previously encountered in passing, but never really thought too much about (I probably wasn't actively teaching at the time, so didn't really have much of a frame of reference to evaluate them against): the flipped classroom and backwards design.

The flipped classroom is a fairly simple idea, and its theoretical merits should be immediately obvious. I believe it arose in higher education, so let's look at it in that context. Every year, a lecturer delivers the same lectures (more or less) to a room of students. Lectures are not generally highly participatory, particularly early on in degree schemes (during my 1st and 2nd years at Edinburgh University, even my smallest lecture group was well into three figures). But teaching time is a precious resource, and very limited. Why are we wasting the time of some of the most intelligent people in the world by having them say the same thing year in, year out, rather than freeing them up to get extra time with the students, dealing with problems? And why do we, as students, end up doing most of our practice exercises at home, where there's no-one to help us when we go wrong or get confused?
So the goal of the flipped classroom is to overturn the orthodoxy. Let's make the lectures available as video for study beforehand, and then when students come into class, the teacher's time is dedicated to what they individually need.
There are several reasons that this might not be such a good thing in practice for many subjects, but that's not what I'm interested in today. No, today, I just want to show that this is not a new idea.
What I've read about the flipped classroom seems to be coming more out of science faculties than arts, which is not surprising to anyone who had friends who studied literature at uni. We science students used to mock the arts students for their light workload, because they had fewer classes on their timetable than us, but then we saw the backbreaking pile of books they were carrying and thought “there but for the grace of God go I.” A literature student may have to read a long, heavy novel every week, and they have to read it before class. Their timetable is as empty as it is because they have very few lectures, and instead have more seminars where they discuss what they've read.

The same is true for a lot of arts degrees. You may be expected to read a major treatise by one of the great thinkers before going to a Philosophy class, and if you're studying classics, you might even be expected to read it in the original!  So it is wrong to call the flipped classroom a new idea simply because we are now attempting to apply it to science classes.

Does it matter that it's not a new idea? In and of itself, no. However, in practical terms, if you don't acknowledge that someone is already doing it, you deny yourself the opportunity to go and ask the experts how it should be done!
Anyone who wants to “flip” their classroom should instead be asking how science can be made more like the arts. They should be asking arts lecturers what works and what doesn't; what can be passed to the student to do beforehand, and what has to be kept for the classroom. They should be auditing arts courses and experiencing for themselves the phenomenon they wish to replicate.


The other idea was backwards design.
Backwards design is the idea of starting by setting out what you want the students to know by the end of the course; then deciding how you will verify that they have learned it; and only then working out how to teach it.
For this to be presented as new or in any way unusual is pretty hard to swallow, because people do this all the time; it's the absolute norm in schooling. A national committee writes a national curriculum. The exam board plans an exam format to test the criteria set out in the curriculum. Finally, the teachers and textbook writers are given the curriculum and sample exams and write their lessons.
Now, the traditional line is that teachers should be teaching to the syllabus, not the exam, but in reality, most teachers know that the exam is the primary goal for the students and they do indeed “teach to test”.
I said presenting it as something new was hard to swallow, but in fact I actually found it more frightening than anything. Were there people who weren't actually doing this?!?
...and then I realised: there are, and as a language teacher, I'm one of them.
It's been a source of frustration to me ever since I got into languages almost a decade ago that language teaching seems to have institutionally rejected the notion of a “syllabus”. There is no list of what a student should know at any level. We're asked to “learn/teach the language” rather than “learning/teaching the test”.
It's a laudable goal, but it leaves the learner or teacher, and particularly the self-teacher, in a rather bewildering forest of choices. Where to start? What next? Can I afford the time to cover this language point properly, or do I need to make do with an incomplete understanding and move on to something else?
For a long time, though, I was convinced that Cambridge (for example) had to have some kind of internal syllabus: a list of words, expressions and language points that examiners are allowed to include at each level. Now I'm beginning to wonder. Do they give their examiners the same advice they give us, the mere teachers who only have to prepare students to sit and pass an exam based on unpredictable language: to use their “judgement” to pick something “appropriate to their level”?
Because to be blunt, institutions like Cambridge are completely failing in their goals. A responsible teacher will always “teach the test”, and if you don't give us the language we need to teach, then we have no choice but to devote more of our time to exam techniques, and we end up spending less time teaching language.
So I'm very much in favour of the goals of backwards design, but I'm worried that by naming it and treating it as something new and different, it will come up against resistance to “change”, even though it is not, in fact, real “change” – it's a defence of the longest standing traditions in education against a combination of flawed teaching ideologies and sloppy practice.

So these philosophies have created two obstacles for themselves by pretending to be new: they discount all the existing evidence, and they turn off people who might otherwise be convinced by the past experience of their colleagues.

04 May 2013

Coursera offering free teacher training!

I've just been nosing around on Coursera looking for interesting courses to take.  I'd read recently that they'd signed up several new course providers, including the first of their providers that aren't accredited universities.

My first reaction was to doubt the value of non-university courses, but one of these suppliers has brought with it something that was lacking in the previous material: course progression.  Some of the universities have been joining Coursera just because it's the in thing, and others have been using it as an advert for their distance education programmes.  But it's never in a university's interest to offer an entire programme for free.

Enter the Commonwealth Education Trust, a charity whose mission is to provide teacher training at primary and secondary level to improve children's education in developing countries in the Commonwealth.

Their whole goal is to provide complete teacher training for free, so teaming up with Coursera reduces their costs and extends their reach, and their 8-module teacher training programme is a win for everyone involved.

Their main target is practising teachers who haven't had any formal training, but I'm intending to follow it as a supplement to my CELTA certificate, which I always felt was slightly insufficient as teacher training.  The CET programme is estimated at between 180 and 280 hours in total, covering 46 weeks of activities spread over about 16 months (the first sitting of module 1 starts this August, and the first sitting of the final module starts next November).  In total, that's actually comparable to the amount of time you're expected to spend on a 4-week intensive CELTA course, so I suppose I'm hoping there's a difference due to the quality of the content, and the fact that this is general teaching with no specific language focus (I've always felt that language teaching suffers due to a belief that "language is different", so the lessons from general teaching are sometimes ignored).  Also, the pacing of the course should theoretically help long-term retention: my CELTA felt heavily "crammed", with no proper consolidation of learning.
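As a rough back-of-the-envelope check on that "comparable" claim, here is the arithmetic; the CELTA weekly-hours range is my own assumption of a roughly full-time four weeks including assignments, not an official figure.

# Back-of-the-envelope comparison. The CELTA load is an assumed range, not an
# official figure; the CET numbers are those quoted above.
cet_total_hours = (180, 280)
cet_weeks = 46
print("CET per week:", [round(h / cet_weeks, 1) for h in cet_total_hours])  # roughly 4-6 hours

celta_weeks = 4
celta_hours_per_week = (40, 60)  # assumed full-time load incl. assignments
print("CELTA total:", [w * celta_weeks for w in celta_hours_per_week])      # roughly 160-240 hours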

On top of this, the Trust are also offering some kind of certification for people who complete all 8 modules:
On the satisfactory completion of each course you will receive a statement of accomplishment related to the course.  On the completion of all the courses you may contact the Commonwealth Education Trust to request a statement of accomplishment related to the overall program.
I don't know whether the Trust is part of any recognised accreditation scheme, but it's certainly likely to be looked on favourably if you're applying for voluntary teaching work in a Commonwealth country.

I'll be taking it this year (or at the very least "starting it" -- I've got a poor record with free online courses, not having completed a single one yet), so I'll let you know how I get on.  There's a second sitting starting next January.