30 December 2011

Who am I? Who am I? WHO AM I?


So as I said a while ago, I've recently started trying to work on my Welsh again.  I did a beginners' course last year, but I never felt I'd gained any real competence in the language (despite getting a pass), so I figured it was time to do it properly.

Now when I dug out the books (as I said), I just found myself really frustrated (and I've tried two more sets of course materials since the previous post).

So I got myself onto iTunes U to see if they had any useful materials and found a podcast Dialogues for Welsh Learners from the University of Glamorgan.

Well I've just fired up the playlist.  I listened to the introduction; fine.  I listened to the first "episode": Pwy dych chi? (= Who are you?)  The podcast was 5:39 long (including the usual timewastery) and was literally devoted to the question "Who are you?" and its response "I am ...".  Please note that this is not aimed at teaching the question, only at practising what you should already have learnt during your course.

Surely, surely, there is something wrong if these 5 words are so difficult that it takes this long.  And what is wrong?  It's my favourite phrase of 2011: disordered state.  The question and answer are trivially easy in terms of the language itself -- it's one of the most basic structures imaginable.  And yet people find it, as a phrase, difficult enough to merit 5 minutes of dedicated practice as well as untold teaching time in the class itself.

Doesn't this show just how inefficient phrase-based learning really is?

29 December 2011

Counterintuitive, perhaps, but sometimes it's easier to start with the harder material...


In general, whenever we teach or learn something new, we start with the easy stuff then build on to the more difficult stuff.  But this isn't always a good idea, because sometimes the easy stuff causes us to be stuck in a "good enough" situation.

When I started learning the harmonica, I learned to play with a "pucker technique", ie I covered the holes with my lips.  The alternative technique of "tongue blocking" (self descriptive, really), was just "too" difficult for me as a learner.  So for a long, long time, the pucker was "good enough" and tongue blocking was too difficult for not enough reward.  It limited my technique for a good number of years, and now that I can do it, I wish I'd learnt it years ago.

The same effort-versus-reward blockage happens in all spheres of learning.  If you learn something easy, but of limited utility, it's far too easy to just continue along doing the same old thing, and it's far too difficult to learn something new, so you stagnate.  Harmonicas, singing, swimming, skiing, mathematics, computer programming; there's always the temptation to just hack about with what you've got rather than learn a new and appropriate technique.

This problem, unsurprisingly, rears its ugly head all too often in language learning, but with language it has an altogether insidious form: the "like your native language" form.  If you've got a choice of forms, one is going to be more like your native language than the other, and this is therefore easier to learn.  Obviously, this form is going to be "good enough", and the immediate reward to the learner for learning the more difficult form (ie different from the native language) isn't enough to justify the effort.  However, in the long term, the learner who seeks mastery is going to need that form in order to understand language encountered in the real world.

The problem gets worse, though, when you're talking about dialectal forms.

Here's an example.  Continuous tenses in the Celtic languages traditionally use a noun as the head verbal element (known as the verbal noun or verb-noun).  I am at creation [of] blog post, as it were.  Because it's a noun, the concept of a "direct object" is quite alien, and instead genitives are used to tie the "object" to the verbal noun.  In the case of object pronouns, they use possessives.  I am at its creation instead of *I am at creation [of] it.  Note that the object therefore switches sides from after to before the verbal noun.

Now in Welsh, the verbal noun has become identical to the verb root, and is losing its identity as a noun.  This has led to a duplication of the object pronoun, once as a possessive, once as a plain pronoun -- effectively I am in its creation [of] it.  This really isn't a stable state, as very few languages would tolerate this sort of redundancy, and the likely end-state is that the possessive gets lost, and the more English-like form (I am in creation [of] it) will win out.  In fact, there are many speakers who already talk this way.

But for the learner, learning this newer form at the beginning is a false efficiency.  There are plenty of places where the old form is still current, so unless the learner knows for certain that they'll be spending their time in an area with the newer form, they're going to need the conservative form anyway.  To a learner who knows the conservative form, adapting to the newer form is trivially easy, but for someone who knows only the newer form, the conservative form is really quite difficult to grasp.

Teaching simple forms early risks restricting the learner's long-term potential.  So while you want to make life simple for yourself or your students, make sure you're not doing them or yourself a disservice.

26 December 2011

Creoles - the same story once again


So I was directed this morning to a news story on the BBC about the translation of the Bible into Jamaican Patois.  It's a move that's long overdue -- whatever you think about religion, you have to accept that the place of worship is vitally important in the survival of a language wherever a large percentage of the population is religious.  The lack of a Bible translation and the use of the dominant language in religious services have been cited in the decline of many languages, including Scottish Gaelic.

It has been welcomed by some:
Several women rise to testify, in patois, to what it means to hear the Bible in their mother tongue.
"It's almost as if you are seeing it," says a woman, referring to the moment when Jesus is tempted by the Devil.
"In the blink of an eye, you get the whole notion. It's as though you are watching a movie… it brings excitement to the word of God."
Unfortunately, not everyone is so happy.
But some traditionalist Christians say the patois Bible dilutes the word of God, and insist that creole is no substitute for English.
You know what?  There was a time when people would insist that English was no substitute for Latin.  And even that was bigoted, because the Latin Bible was just another translation of the Greek, and it wasn't even that accurate!
What we have here is proof, if proof were needed, that a great many objections to minority languages are a simple case of resistance to change.

Creole in primary education

The article doesn't restrict itself to the Bible, but follows on to a topic that is a matter of active debate in most creole-speaking countries: the place of Creole in the primary sector.

The story is always the same: the "big" language is of major economic importance, and therefore should be the focus of education.  As a political statement, it's appealing, and it doesn't take much thought to agree with it.  Which is just as well, because it doesn't really hold up to much scrutiny.

One thing that has been fairly well proven across the world is that kids do better in school if they are given "initial literacy" (their first experience of reading and writing) in their own language.  On the other hand, gaining their initial literacy in a new language actually hampers their ability to pick up the language accurately.

Worse, in some creole-speaking countries, the teachers are really only creole-speakers themselves.  Education in Haiti, for example, is very heavily orientated towards French, but the teachers really don't speak the language properly.  What you end up with is kids who aren't competent in either their own language or the "important" language.

All the figures show that the best thing to do is to start school in the kids' own language (and the teachers'!), and that the new language is best introduced in a spoken form, and by a native speaker.

Which isn't quite the same as what we do in Scotland with Gaelic-medium education, sadly....

15 December 2011

I love learning languages... but I hate language learning

Ok, so yesterday I had my last exam of the semester, so I decided to take a break from Gaelic and start working on my Welsh.  I never really did much study before, but trying to catch as much as I could by watching the Welsh-language soap opera Pobol Y Cwm regularly has helped some of it stick (but not all, by a long shot).

So I went to the college library, and started reading Asterix ym myddin Cesar, the Welsh translation of Asterix the Legionary.  Oooooh... it's tough going.

So rather than attempt to struggle through it in the library with a dictionary, I decided to check it out and take it back to my room to go over it seriously with a grammar book.  I was the first person ever to do so -- which isn't surprising given that there isn't even a Welsh course here...

So I took my copy of Teach Yourself Welsh Grammar off my bookshelf, and started reading... then stopped.  You see, while I love learning languages, the vast majority of language learning material is excruciatingly bad.  I know that this book isn't a language course, but it is aimed at learners.  So when the first chapter after the pronunciation guide starts by individually listing 31 different circumstances in which the soft mutation occurs, it immediately loses its audience.  There's no structure -- just a list.  In several of these circumstances, LL and RH are immune to mutation.  Did they group these together?  They're numbers 1, 5, 6, 18 and 28.  There's no implication that these are in any way related, meaning the learner risks trying to learn 5 exceptions instead of one group.

I'm trying to extract enough information to teach myself, but I'm overwhelmed by information -- I have to try to read and understand it all in order to identify the patterns and salient points.  It's tiring, frustrating, and to a great extent insulting.

Yes, insulting.  Because at one level, the mere existence of the book is a claim by the author that this is good enough for the learner.  And if the book is good enough for the learner, then it must be me that is the problem.

I'm lucky -- I feel insulted.  Many, many people genuinely believe that they're at fault -- that they're "stupid" or "not good at languages".  And they think that I'm good at languages.  Well believe me, I'm not.  Despite spending countless hours in this sort of book, I still can't make head nor tail of some of them.  If anything I'm worse at languages than the average, and I've only got where I am today because I refuse to believe I'm incapable.

The hardest part for me in learning any new language is getting started, because in general there's just too much information thrown at you in an unstructured and poorly thought out way.

So for those of you starting out and discouraged by your materials, remember: you're not the only one.

24 November 2011

The Myth of Groupwork


Today's blog post was inspired by me walking out of a class for what may be the first time in my life.  (I probably ran out of a few classes as part of childhood tantrums, but that doesn't count.)

Now I've always felt a lot of groupwork is a waste of time, because you could complete the task much quicker on your own.  But then I would say that, wouldn't I, because I always did well at school.  Theory has it that groupwork is an opportunity for the weaker students to learn off the stronger ones.

OK, so in this particular class, I've found myself being "the one who knows stuff" in pairs a few times, so I've sat as scribe and asked the other person for all the answers, and only offered anything myself when the other person wasn't sure or when I disagreed with them.  But today we were working in threes, not pairs, and for once in my life I was no longer the brainbox/swot/smart-alec because it was something I've never learned properly.  But the group scribe (not me) was writing away, filling in the "easy" ones, including quite a few I wasn't sure about.  She and the other guy were discussing answers, and I wasn't really able to chip in, as I didn't really know how to explain what I was trying to say, or how to word a question if I had any doubts.  So I muttered a few swear words, put down my pen, and left the room.

Why wasn't I learning off the stronger students?  Quite simply because there is a difference between a good student and a good teacher: it is a teacher's job to ask questions that they already know the answer to.  Students, on the other hand, ask questions that they don't know the answer to.

What exactly was going through my classmate's head is hard to say for sure, but there's two likely explanations.
  1. She was acting in a goal-orientated way.  She had a quiz in front of her and the goal was to get all the answers, like in a pub quiz.
  2. She categorised the questions as "hard" and "easy" based on her own perception of difficulty, and only asked our opinion on the "hard" ones, assuming that we weren't interested in the "easy" ones. 
As I say, I can't say which of these (if either) was her motivation.  However, I can say that these two situations are quite possible, and indeed likely, in any classroom.

Both of these approaches introduce problems. 
  1. In a pub quiz, everyone answers questions on topics they're confident about.  People who aren't into sports might pop outside for a fag during the sports round, for example.  Unfortunately only answering questions on what you already know doesn't lead to learning.
  2. The "easy" questions are the ones we expect the weakest members of the group to answer, and we hope that by listening to the strong students answer the "hard" ones, they'll learn from them.  However, if the scribe is a strong student (and they're the ones most likely to volunteer), then the easy questions may never be asked, so the weak students never get any opportunity to do anything.  And as weak students are usually shy about their weaknesses, they're not going to butt in.
Now of course neither of these two situations is inevitable, but there are very few students who are genuinely aware of what is expected of them in groupwork -- I am only aware of it because of my own situation as a teacher.

Although I don't have any statistics to say how often these two situations arise, I can state categorically that current groupwork practices leave open the possibility that these situations arise, and it's a possibility that the teacher has little control over or visibility of.

Perhaps the teacher is also blinded by a task-orientated mindset.  When we see that the task is completed and the students have the correct answers, how often do we ask ourselves how they reached those answers?   Can we ever truly know?  I think not.

And that is why I called this post "the myth of groupwork".  I am not saying there's no such thing as groupwork, but that groupwork is something we take on faith, uncritical of the facts or evidence.

As teachers we cannot directly control our students' thoughts, but we must take steps to reduce the possibilities for them to complete tasks in pedagogically pointless ways.  Current groupwork practice opens up too many "wrong paths", and that needs to change.

22 November 2011

Link drop: how technology is changing language

A very well-written article with a lot of material from David Crystal about the effects technology is having on language and literacy at: http://www.silicon.com/technology/software/2011/11/21/from-lolcat-to-textspeak-how-technology-is-shaping-our-language-39747927/print/

17 November 2011

A dull echo of bad practice in teaching...

There are many things in language teaching theory that are hotly debated, but there are some things that are universally accepted.  In theory.  In practice, they can be forgotten about.  I'm currently working through the Michel Thomas Polish Foundation course and one of these springs to mind:

The echo effect

The echo effect is quite simple: the last thing you hear stays in your mind longest.  The theory around this varies as our understanding of the human brain improves, and some people talk about "echoic memory", others about "feedback loops", others still "working memory".  But whatever the theoretical models people come up with, they all seek to model the same universally agreed observation: the last thing you hear stays in your mind longest.
The echo effect in practice

So that's the theory, but how does this work in practice?  The canonical example would be the listening exam.  A sentence or passage presented in a listening paper will be followed by silence -- all instructions come before the passage so that the internal echo is the actual material, not the instructions.  After all, if the instructions are clear, the student should understand and internalise them easily.

Failure to follow through to the classroom

However, you will find that some teachers don't consciously consider the echo effect in their day-to-day teaching.  Instead, they try to follow a natural order for language.  The reason this example is based on an MT course is that it's the nearest most lay people get to being able to observe a language class.

Let's look at a couple of quotes from Jolanta Cecula's MT course:

"I'm sorry, but I don't quite understand what you are saying"... talking to a man? (CD3 Track 2)
Notice here that the background information, "talking to a man", comes after the sentence to be translated.  This means that "talking to a man" is in echoic memory, rather than "I'm sorry, but I don't quite understand what you are saying".  This makes the task harder in a way that is of no benefit to the learner.

Two tracks later, we get this:
"Can you help him", meaning to him, asking a woman? (CD3 Track 5)

Here we have two pieces of background information coming after the material that should really be in echoic memory.  The learner then has to expend effort on recalling the prompt, distracting from the task of producing the desired target language.

Hi-ho, hi-ho, it's off to work(ing memory) we go...

But the problem of prompt wording goes beyond the simple echo effect, and into bigger questions of language processing.  On a few occasions, the course has prompts of the form:

So what would "" be? / So "" would be...?

Here we make life harder for working memory by interrupting the simple prompt with the phrase for translation.  Processing the interrupted clause further distracts our working memory from the target translation, and makes the task unnecessarily difficult.

What I strive to do in class is to make sure the students know what is expected of them with the minimum of prompting.  In the case of teaching-by-translation, MT-style, I would start a session with some explicit prompting, but quickly move to just giving them the target phrase with no other prompting.

Let them concentrate on the language, not on the classroom

14 November 2011

An example of language change: Genealogy.


Genealogy has always been moderately popular as a hobby, but in recent years it has become all the rage, thanks to TV programmes like the BBC's Who Do You Think You Are? which shows celebrities and public figures tracing their family trees (and often crossing continents in the process).

Now, I had always thought the word was geneology, but the BBC and various websites disabused me of this notion.  But just the other day, one of the other students here mentioned that her dad was working on the family tree... and she said "geneology".

Let's have a look at the etymology of the word.

According to Etymonline, genealogy comes from the Greek "genea" (generation, descent) + "logia" (to speak about).  So originally -logy was about lecturers, and over time was generalised to experts, and hence knowledge.

Unfortunately, the English-speaking brain doesn't understand declension of nouns, so it sees the first morpheme as "gene", not "genea", and expects the "alogy" bit to be a single morpheme.  As most "-logy" words are "ologies" (biology, radiology, geology etc), we have generalised all -logies to -ologies.  (Even though Etymonline has the suffix entry as "-logy".)
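Just how lopsided the pattern is can be shown with a quick sketch (the word list is mine, purely for illustration, and far from exhaustive):

```python
import re

# Illustrative (not exhaustive) sample of English "-logy" words.
words = ["biology", "radiology", "geology", "psychology",
         "sociology", "genealogy", "mineralogy", "analogy"]

# Split the sample by whether each word fits the reanalysed "-ology" shape.
ologies = [w for w in words if re.search(r"ology$", w)]
others = [w for w in words if not re.search(r"ology$", w)]

print(ologies)  # the overwhelming pattern: biology, radiology, ...
print(others)   # the odd ones out: genealogy, mineralogy, analogy
```

With almost every common -logy word fitting the "-ology" shape, it's little wonder the brain regularises the rest.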

Don't believe me?  Consider this famous advert from the 1980s:

If the English-speaking brain recognised the original morpheme boundary, would they have scripted it as "ology"?  And would we have understood as easily?  The popularity of the advert (it was a widely-used pop-culture reference for years after it stopped showing) suggests it's natural English.

Given all that, I can only conclude that the word is, to all intents and purposes "geneology", and that attempts to preserve the A are misguided.

Let English be English and let Greek be Greek.

09 November 2011

Overgeneralising and undergeneralising in general...

In English, we have two ways to talk about nouns in a general sense.  In normal speech, we say things like cats are vicious little creatures -- i.e. we use an indefinite plural.  In some very formal prose, you'll see instead the cat is a vicious little creature --i.e. a definite singular.

The existence of the second is probably just a case of "translationese" -- it arises in lots of translations of Latin works, and I believe it is used that way in most of the modern Romance languages (French, Italian etc).  Unfortunately this isn't easy for me to verify, as I have no idea whatsoever what to look for in the index of my grammar books.

Bizarrely, this fundamental (and straightforward) element of language seems to have been overlooked in the classical grammar models, so there is no common label for it (hence me not being able to look it up!).  This means it is often overlooked in teaching, too.  Many beginners' courses pass it by, and even when it comes up, you're not likely to get more than a little box-out mentioning it.  It's not really "taught" in the same way as other grammar points.  I suppose the reason for this goes back to the very basics of the structuralist view of grammar, which values form over meaning, and too often simply gives a few short sentences explaining usage after drilling form.

But we've been moving away from structuralism for quite some time now.  The general/universal has been marooned by the incoming tide, as functional and communicative approaches have picked up on the link between form and meaning in the noun and article for specific and truly indefinite cases, but they've not integrated the general/universal with it.

This underemphasis of the general/universal is particularly noticeable in Gaelic.  It's not a subject I've seen come up often at all.  I read it in one book and one book only, and I don't believe I've ever heard it discussed in classes.  According to the book (well, my memory of it -- the book's 100 miles away), the general/universal in Gaelic is the definite singular. (The cat is a vicious little creature, the lion is a noble beast etc.)  And yet....

When you study the genitive in Gaelic, it may be pointed out to you that while "describer nouns" in English always stay singular even when representing a plural concept (for example "biscuit" in "biscuit tin", "tooth" in "toothbrush"), this isn't the case in Gaelic genitives, which have both singular and plural forms.

So I was giving a talk in a classroom debate, and I mentioned "teenage pregnancy" which I rendered as "leatromach nan deugairean" -- "pregnancy [of] the teenagers".  Genitive, plural.  After the class, I started asking myself if that was right, thinking of the general/universal rule.  Now I'm too confused and I'll just have to ask one of my teachers to try to clarify....

04 November 2011

Choose your terminology carefully

People often get confused with all these fancy words in language learning and teaching.  I discussed my basic views on terminology before, but I want to look at one of the big problems that poor thought over terminology can cause: false dichotomies caused by false opposites.

For example, in the use of reading or listening, we often talk about intensive and extensive input.
  • Extensive input means reading or listening to a lot.
  • Intensive input is reading or listening closely and carefully, perhaps going back over material to get a lot of detail.
These things are not opposites, and on a conscious, intellectual level, most people recognise this.  Of course, they're not completely compatible either, because it's hard to read a lot if you're reading it slowly and carefully.

But once you stop thinking very carefully, most people start talking about the two things as though they were truly opposite.  Why?  Because whoever chose the terminology wanted two names that looked like a set, but inadvertently made them look like opposites.

(Of course, this is just an extreme example of the problem of counterintuitive grammatical terminology, such as "regular" in the earlier article).

31 October 2011

When teaching grammar, stick with the uncontroversial stuff...

I was in a grammar class for my Gaelic course today, and we were looking at noun declensions.  For one set of questions, we were using the word fàinne (ring).  Unfortunately, this word is masculine in some dialects and feminine in others.  The question stated that the word was masculine, but my partner for the exercise (a native speaker) has used it all her life as a feminine word.  She declined it perfectly correctly in each case -- as a feminine word.  The explicit instruction to decline it as masculine was ignored because she already has 100% intuitive command of the word.

Correct completion of the task therefore required that she stop dealing with the words as "language" and start thinking of them as some kind of mechanical logic puzzle.

The problem with the task is that it became counter-intuitive.  When teaching grammar, we need to employ as much pre-existing knowledge as possible.  Grammar teaching for natives has to start with forms they know, because you are not actually teaching "grammar", you are teaching "grammar awareness", and that simply means making them consciously aware of things they already know intuitively.

So you have to pick the most uncontroversial examples, the most universal and unchanging.

Work with your students, not against them.

30 October 2011

Think before you teach

Every culture has its own set of persistent myths that are passed down the generations.  I'm not talking about gods and monsters, though, I'm talking about myths about language.  Myths are like optical illusions -- once you know what you're looking at, it's obvious to you, and you can't imagine how you missed it.

As teachers, we should be wary about passing on linguistic folklore uncritically.  We should look at everything we've been told about our languages in detail to see whether it holds up to scrutiny, and if it doesn't, we shouldn't teach it.

I'd like to give you an example from the teaching of Scottish Gaelic.

If a course teaches noun cases explicitly, it will state that the "second noun" is always in the genitive.  It will then go on to add the exception that if another noun follows it, it isn't in the genitive, so that in a long and complex noun-phrase, only the final noun is in the genitive.

So it is self-evident that "the second noun takes the genitive" is an incorrect rule.  The actual rule is "the last noun in a noun phrase takes the genitive".  Once you see this, it is obvious, but the "second noun" rule is now so all-pervasive that I'm currently hearing it in grammar classes aimed at fluent speakers in their second year of university study.
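The difference between the two rules is easy to sketch.  Here's a toy model (function names and example nouns are mine; it only tags which noun in a chain would take the genitive, and makes no attempt at real Gaelic morphology):

```python
# Toy model: which noun in a noun chain gets the genitive?
# We only tag positions; actual Gaelic genitive forms aren't modelled.

def myth_rule(nouns):
    # "the second noun takes the genitive" -- the incorrect folk rule
    return [(noun, i == 1) for i, noun in enumerate(nouns)]

def actual_rule(nouns):
    # "the last noun in a noun phrase takes the genitive"
    last = len(nouns) - 1
    return [(noun, i == last) for i, noun in enumerate(nouns)]

# For a two-noun phrase the rules agree...
print(myth_rule(["door", "house"]) == actual_rule(["door", "house"]))
# ...but for a longer chain they diverge.
print(myth_rule(["handle", "door", "house"])
      == actual_rule(["handle", "door", "house"]))
```

For two-noun phrases (the examples beginners see most) the rules agree, which is exactly how the myth survives scrutiny.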

The myth is being passed on to a new generation.

27 October 2011

Learning a related language without trying...?

So, you know French and Spanish and want to learn Catalan?  English and German and want to learn Dutch?  Polish and Russian, and want to learn Ukrainian?  It should be dead easy, right?  All you need to do is start reading and you'll get it.

Possibly, but it's still worth picking up a book and doing a bit of study.

Eliminate the negatives:
If you just start reading (and/or listening), you will develop a reasonable passive understanding, but there's a couple of ways this limits you.  Passive understanding does not require you to process all the language in front of you (you can gain complete comprehension without complete perception) so you are never forced to develop an accurate internal model of the language.

This can be very limiting, because in the future, if you decide to learn to speak the language then you're faced with a massive "frustration barrier" -- a lot of people find that being able to understand lots but not answer is a very unpleasant situation.  It seems more discouraging to me at times than simply having low skills all round.

Accentuate the positives:
Besides, the most important stage of learning a related language is the very basics.  How so?  Well this is about the nature of regular and irregular language forms (as I've been talking a bit about recently).

Irregular forms are almost always the most common forms.  So in English "child, children" vs "adolescent, adolescents"; "give, gave" vs "donate, donated".

Interestingly enough, it's those most common words that are least stable.  The English word "will" (I will go, etc) developed from a word meaning "to want" (compare Modern German will and the Modern English noun will -- eg strength of will, willpower), whereas "want" originally meant "to lack" (they found him wanting, ie inadequate).

The same effects can be seen in other language families -- while the Italian and French words for "to have" come from the Latin "habere", the Spanish word for "to have" (I have a car etc) is derived from the Latin word for "to hold" -- "tenere".  But when we get to a rarer word like "cultivate", we have an almost identical word: cultiver (FR), coltivare (IT), cultivar (ES).

So when a Spanish person says they can "understand" Italian or Catalan without ever having studied it, they genuinely believe that they can, because they can understand what they think are the "difficult" words -- which are in reality the easy words.

The mistake most people in this situation make is to skip the beginners' material and jump straight to the advanced.  But it's the beginners' material that teaches most of the things you really need to learn. 

A little bit of time dedicated to the basics (conjugations, pronouns, declensions) at the start will accelerate you through to 90% understanding very quickly.

21 October 2011

How irregular!
I often say that the problem most people have with grammar isn't the grammar itself, but how it's described (I even wrote a post about this a couple of months ago).  The Romans came up with quite a sophisticated way to describe grammar, and it was so successful that we still use it to this day.  However, what a lot of grammarians still haven't twigged is that what meant a lot to Romans means absolutely nothing to your average inhabitant of 21st century Earth.
One of the words that would be completely straightforward to a Roman is regular, and of course the converse irregular.  People familiar with grammatical terminology tend to think it should be easy for an English speaker too, because the root of the word is so common in English: reign, rule, regulations.  A "regular" form is one that follows the rules.  Simple.
Except it's not, because we don't use the word regular to mean a rule-follower in any other situation.  Outside of language circles, it means to do something with a predictable frequency or schedule.  So what - it's a different thing, so there's no need for confusion, right?  Wrong, and this is a subtlety that's easy to miss, even though it goes to the very heart of irregular forms.
The majority of words in any language are regular -- the vast majority follow the rules.  The irregular ones, the ones that break the rules, are in a minority.  Which words are irregular?  Well, it's always the common ones: to be, to have, to go; child/children etc.  This is uncontroversial - it's a well-known statistic.  That there are a few uncommon irregular forms (eg ox/oxen) doesn't break the rule, because these are forms that were common relatively recently (oxen were still in use 100 years ago, because not everyone could afford a new-fangled "tractor") and are being lost anyway (when did you last talk about an "ox"?).
So here we have a rather nasty piece of cognitive dissonance - the forms that are most regular in terms of frequency of occurrence are the ones we call irregular, and the ones that are least regular by frequency of occurrence are called regular.
A word that is supposed to help us understand actually ends up confusing us further, and we're not even sure why we're confused.  Not good.
This problem isn't limited to English, though, as the equivalent word in the Romance languages (Italian, French, Spanish etc) tends to have a similar meaning to the English.
The failure to understand the concept of regular has profound consequences in the teaching of forms, particularly when it comes to verbs.
How irregular?

The first thing that people forget is that regularity is not a yes/no question.  While the majority of verbs are completely regular, some "irregular" verbs are only slightly irregular.

For example, the conditional and future simple of a Spanish verb are formed by adding a suffix to the infinitive.  There are no verbs in the language that are irregular in terms of the suffixes.  There are a handful of verbs that don't use the infinitive, instead forming a "future stem" by dropping the vowel from the infinitive ending.  But then, is this even an irregularity?  The process we're looking at here has a name -- syncope -- and it occurs in other languages.  You can argue, then, that these future forms aren't irregular, because they do indeed follow a rule.  After all, the 3 major verb groups in Spanish (-ar, -ir, -er) all follow different rules, yet we still refer to "regular" verbs in each "conjugation".  So if we make a category of "vowel-dropping verbs", suddenly we find that we've defined the Spanish future as having no irregular verbs whatsoever.  At the very least, I would argue that the verbs that undergo syncope are only slightly irregular.
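The "vowel-dropping verbs are just another conjugation class" idea is easy to sketch in code: treat the suffixes as fixed for every verb, and make the stem a simple lookup with the bare infinitive as the default.  The verbs and stems below are real Spanish, but the code is my own illustration (and it deliberately ignores verbs like "tener" that also insert a -d- into the stem):

```python
# Sketch: Spanish future tense = stem + person suffix.
# The suffixes never vary; the only "irregularity" is a small set of
# verbs whose stem drops the vowel of the infinitive ending (syncope).

FUTURE_SUFFIXES = ["é", "ás", "á", "emos", "éis", "án"]

# Regular verbs use the infinitive itself as the stem;
# syncopated verbs lose the vowel: poder -> podr-, saber -> sabr-.
SYNCOPATED_STEMS = {"poder": "podr", "saber": "sabr", "haber": "habr"}

def future(infinitive: str) -> list[str]:
    """All six future forms of a verb, built by one uniform rule."""
    stem = SYNCOPATED_STEMS.get(infinitive, infinitive)
    return [stem + suffix for suffix in FUTURE_SUFFIXES]

print(future("hablar")[0])  # hablaré
print(future("poder")[0])   # podré
```

Seen this way, the "irregular" futures are just a second stem-formation rule, exactly as the post argues.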

But maybe it's unfair of me to talk about the future stem, because as a rule Spanish stems are far more stable than their counterparts in Italian and French.  In all three languages, the conjugation of "to go" is built on several different Latin roots -- ire and vadere among them -- making it the single most irregular verb in the language (whereas the most irregular verb in English is be/is/was).  But with other irregular verbs, Spanish picks a stem for a tense and runs with it.  So while the present tense of "to have" in French and Italian is bisyllabic in the 1st and 2nd plural forms but monosyllabic in the singular forms, both words in Spanish ("tener", lexical verb; "haber", auxiliary verb) stay consistent across all persons, even though these are irregular verbs.

But all in all, we can see that some verbs are more irregular than others.

Introducing...

But if there is a scale of irregularity, what is the extreme of this scale?  I'd like to introduce a new term to describe it.  I call it...

...Contraregular

There are words out there that completely ignore the rules (eg go -> went), but there are others that go completely against the rules.

For example, nouns ending in -o in Spanish and Italian are masculine, as a rule (rule -- Latin regula -- regular).  Yet "mano" (hand) is feminine.  It goes completely contrary to the rule, hence "contraregular".

Why is this important?

This goes back to the fundamental nature of irregular forms that I mentioned earlier: they only occur in frequent words or structures.  And in fact, the most common items are generally the most irregular.

And what is it that we tend to teach first?  That's right, the most frequent words and structures.  Hence the most irregular -- including the contraregular.

So in Spanish, you might learn the following within the first hour:
Good morning - Buenos días
Good afternoon/evening - Buenas tardes
Good night - Buenas noches

These are all contraregular.  Día ends in -a, which normally marks a feminine noun, yet it is masculine, so the adjective takes the masculine plural: buenos.  Tarde and noche lack the typical feminine -a, yet they are feminine, so the adjectives take the feminine plural: buenas.
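The underlying mechanism is worth making explicit: agreement follows the noun's actual gender, not the gender its ending suggests.  A minimal sketch (the genders are real Spanish; the tiny lexicon and function names are mine):

```python
# Sketch: why the greetings are "contraregular".
# Agreement is driven by the noun's lexical gender, which for these
# words contradicts what the ending would lead a learner to expect.

GENDER = {"día": "m", "tarde": "f", "noche": "f", "libro": "m", "casa": "f"}

def bueno_plural(noun: str) -> str:
    """Plural greeting-style phrase: buenos/buenas + plural noun."""
    adjective = {"m": "buenos", "f": "buenas"}[GENDER[noun]]
    plural = noun + ("s" if noun[-1] in "aeiou" else "es")
    return f"{adjective} {plural}"

print(bueno_plural("día"))    # buenos días
print(bueno_plural("tarde"))  # buenas tardes
```

The lookup table is doing all the work -- which is exactly the problem for a learner who has only seen the exceptions and has no table yet.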

This isn't that big a deal if you just tell someone about it, but in many classes, you don't -- the student is expected to infer the grammar from examples.  If your students are first exposed to counter-examples, to exceptions, how can they generalise?  The irregular forms become a blockage, and the students are forced to learn each example as if there were no rules whatsoever.

But even if we do explain, should we be teaching this before we've covered the basics of regular adjectives?  I see no reason why we should.  There is no proven pedagogical advantage to being able to parrot a few fixed phrases before learning to use the grammar productively.  As a practical matter, such greetings may seem immediately useful, but there is no genuine value in being able to say "good morning" when you are otherwise incapable of saying anything in the language.  Besides, within a few hours in the classroom, you should be able to get your students to the point where they can construct such phrases themselves with a little bit of guidance.

18 October 2011

An unfunny joke


An Englishman, a German, an American and a guy from Barra walk into a bar.  "Tha Gàidhlig cho cudromach," says the Englishman [Gaelic is so important].  "Tha Gàidhlig cho sònraichte," says the German [special]. "Tha Gàidhlig cho breagha," says the American.  "I'm going for a slash," says the Barrach.

Not funny at all, I'm sure you'll agree, but you might not fully appreciate just how unfunny it truly is.  In order to understand it, though, you need to know that the Barrach is a native-speaking Gael.  So why did he speak in English?

It's something linguists like to call "divergence".  We use language to indicate social distance from, and proximity to, others.  When we speak like someone, we show variously agreement, respect or even affection.  I find my accent when speaking any foreign language varies depending on who I'm talking to, as I try to match them (particularly if it's someone I fancy).

The Barrach in the "joke" isn't rejecting Gaelic, then, but is indicating that he doesn't associate himself with the three foreigners.

What we have here is the core paradox of the current Gaelic revival.  While everyone says that the goal is for Gaelic to be considered normal in all contexts, the act of attempting to achieve this is actually making Gaelic into a far more self-conscious choice.  Gaelic is at risk of developing a sort of "personality" based on the feelings of the loudest advocates of the language, and people who do not identify with this personality will therefore find themselves subconsciously pushing away from the language.

Well, I say "at risk", but I actually think that this is already the case in many parts of Scotland.  While not a statistically significant portion of the population, there is a reasonable number of native Gaels in Edinburgh.  Yet when there is a Gaelic-related event put on, it's often mostly the learners that turn up.  The natives will happily sit and talk to each other in their own language, but Gaelic in a public setting seems to be overly politicised for most to identify with.  (The association of Gaelic with nationalism has no real basis in fact - Gaelic is a language and is spoken by people of every political allegiance.)

The problem is that the domain of the well-meaning learner is stretching further and encroaching into the few remaining Gaelic heartlands.  Adult learners are gaining ever-increasing air-time on television and radio, as well as positions at all levels of Gaelic education.  Even several prominent members of the Scottish Government's Gaelic language agency are adult learners.  People are even being encouraged to learn Gaelic in order to teach in Gaelic medium schools, despite it being self-evident that the education available is insufficient to bring anyone close to a near-native model.

It is now often said that Gaelic's future is in the hands of the learners.  This is true, but it does not mean what it is supposed to mean.  We as learners cannot save Gaelic, but we do have the power to kill it within a generation.

If we want Gaelic to continue, then we must be humble.  We must accept that:
  1. we are not "Gaelic speakers", and we never will be;
  2. the books we study do not, in fact, contain "correct" Gaelic, but someone else's guess about what Gaelic is - the natives are the only real model worth following;
  3. Gaelic is not "ours" or "our heritage" - it belongs to the Gaels;
  4. and the most difficult of all: we shouldn't put ourselves forward as representatives of the language, either in a professional or amateur capacity.
In fact, I think it would be far more healthy if no-one even defined themselves as a "Gaelic learner", but instead as a "language learner".  Gaelic is a language, just like any other.  Learning another language or two will not only help you see this, but it will also actually improve your Gaelic.

13 October 2011

The effects of Michel Thomas in the wider teaching world

It seems like every other post I mention the excellent lecture by Wilfried Decoo On the mortality of language learning methods.  So I suppose it's not a surprise to see me bring it up again.

One of Decoo's central points was:
A new method draws its originality and its force from a concept that is stressed above all others. Usually it is an easy to understand concept that speaks to the imagination.
As more and more people bring out products inspired to some degree by Michel Thomas's work and the mist starts to clear, we're starting to see what concepts have been taken from MT to drive the next batch of teaching styles.

There's quite a few floating about now, but as I'm now a professional teacher, I don't feel comfortable discussing them by name.

The general notion that we're getting from all of them suggests that the soundbite for the next generation is something along the lines of:
Learn to form sentences, instead of parroting phrases.
This is a good start.  I agree with it 100%.  However, once we reduce the whole teaching philosophy to an eight-word phrase, we're in danger of slipping further away from Thomas again.

If you think about it, it's a very broad and vague phrase.  It's very easy indeed for anyone to rebrand their materials to demonstrate how they fulfil this criterion without actually changing anything.

By definition, any tables-and-rules grammar course can claim straight off that it's all about sentence building.  But we know that the strict table-based methods are pretty ineffective.

And the phrase-based courses will reassert that they only use the phrases to show you how to form sentences.  Changing je voudrais acheter un croissant to je voudrais acheter un stylo is, at least superficially, a form of sentence building.

What I predict happening is that there will be a few more of these "upstart" entries into the market, but that within a few years, all the major publishers will be looking to knock the wind out of their sales by taking the rhetoric of this new movement and applying it to the latest iteration of their material.  What we'll be left with won't be much different from what we've had over the last 100 years, but with luck, it will be slightly better.

07 October 2011

I'm back!

Yesterday I finally put an end to my study with the Open University, so I've got a bit more time to think and write about stuff.

I've also just started a full-time university course in Gaelic, and will be supporting myself by teaching various bits and bobs of other languages while I'm here, so it's a good opportunity to look at the classroom from both ends simultaneously.

I've got a lot of stuff sitting in the drafts folder, and a few of them tie neatly into some of my thoughts about the course here, so I should be able to keep the blog regular for a while.

24 August 2011

Mother tongue is mother's milk

I was reading an article on the role of Haitian Creole in the Haitian education system on the BBC news website, and it saddened me a little to see the same old debate that I've seen a thousand times before, and to see from the comments that people still don't understand it.

The article proposes nothing radical.  The proposal is to teach Haitian children to read and write in their own language.  In academic terminology, this is "mother tongue initial literacy", and it has been proven time and again to be one of the most effective strategies.

In many, many countries, the establishment has imposed the dominant official language on the education system.  Generally the speakers of local or minority languages do badly at school.  Traditionally, this was dismissed by claiming that the group in question was simply inferior -- remember that not that long ago, many serious scientists tried to define taxonomies of human "races" showing the "deficiencies" of anyone who wasn't in their own demographic.  Heck, people in my part of the world used to think having dark skin made someone an animal, and a commodity to be traded for a handful of shiny metal discs!

Thankfully, most intelligent people now accept that intelligence is universal.  The apparent differences in intelligence between dark-skinned Africans and light-skinned Europeans are down to the level of development of the education system.

Yet people are still willing to believe that people are in a particular social class because of their intelligence, and are willing to put down failures in education to being "working class".  This is inconsistent, because if hereditary differences in intelligence are such a big factor in academic success, surely we have to believe that race is a factor...?

Please listen to the experts!

It's widely established in academic circles that the language of the classroom is a critical factor in success.  Being criticised for being "wrong" inhibits children's expressiveness and willingness to contribute.  If children don't engage with the class, they don't learn.  It's as simple as that.

But people just aren't willing to accept the expert opinion.  One of the main flaws of democracy is that experts are outnumbered by ill-informed individuals.  As soon as you suggest accepting "how people speak" as a classroom model of language, you're greeted with howls of protest.

If you're talking about regional varieties of a language, you're accused of "dumbing down", and the other person will rarely see the snobbery inherent in calling someone else's way of speech "dumb".  They won't accept that the suggestion comes from rigorous studies, but tell you you're just being a wishy-washy liberal.

The argument is slightly different when you're talking about teaching in a completely different language, although again you're accused of being a wishy-washy liberal.  We're asked to believe that teaching someone in their own language is robbing them of the opportunity to learn another, more useful language.  By that token, all schools in the world should be teaching English.

But it's not a question of either/or!  You can teach both!

Mother tongue as gateway language

I mentioned "initial literacy" earlier.  When you learn to read in your native language, you use all your knowledge of the spoken language to help you decode the symbols on the page.  Children can often "self-correct" when reading, thanks to their knowledge of sentence structure.  The principles of reading can be generalised across languages, so learning to read another language later is actually fairly easy.

But imagine that your first encounter with writing is in a language you don't speak yet.  You have no concept of how the words tie together and you're trying to sound out stuff off the page.  In the case of Haiti, this is a right pain -- in Haitian, like most creoles, verbs don't conjugate for person (consider a Jamaican Creole speaker saying "me go", "he go", "they go"), whereas in French they do.  Worse! - in French several conjugations are written differently but pronounced the same, and the ending -ent for verbs is silent while the ending -ent for adjectives (eg different) is pronounced.
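The French side of this is stark when you line the forms up.  As a rough illustration (the spelling-to-sound collapse is standard French; the simplified broad transcriptions and the code are my own sketch):

```python
# Sketch: four distinct written forms of French "parler" collapse to a
# single pronunciation, while a creole typically has one invariant form.

french_present = {
    "je parle": "paʁl",
    "tu parles": "paʁl",
    "il parle": "paʁl",
    "ils parlent": "paʁl",   # the verb ending -ent is silent
    "nous parlons": "paʁlɔ̃",
}

# Count the spellings that all sound identical:
homophones = [form for form, sound in french_present.items() if sound == "paʁl"]
print(len(homophones))  # 4
```

A child sounding out "parlent" from the page gets no help from the spoken language they know -- the ending simply isn't there in speech.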

Initial literacy in a foreign language is very, very hard, and a student will probably never master it.  People do better at the foreign language if the task complexity is reduced and they're not trying to learn two distinct skills at the same time.  The answer is...

The bilingual school

As I said earlier, it's not a question of either/or.  The best model of education, according to the experts, is a truly bilingual school.  Give initial literacy in the mother tongue, while teaching the new language in the spoken mode.  After three years of literacy schooling in one language, you can very quickly teach children literacy in any and all other languages that they speak.  It's an established pattern, and as far as I can see, that's pretty much what's being proposed in Haiti.

But I went to an international school, and I'm fluent in [insert language here]

One or two of the comments on the BBC site were of the form above.  But the International Schools are far from the norm.  On the whole they are expensive elitist schools that pay a lot of money to get well-qualified and very capable native-speakers to travel half-way around the world to teach in them.

This is very different from the situation in Haiti where the teachers themselves are likely to be underpaid native creole-speakers teaching in non-native French.  Believe me, this rarely results in fluency.  4 years ago, I was teaching English to teenagers who had been learning English all their school lives from Spanish speakers.  Their English was, well, extremely foreign.  In fact, you could even describe it as a Spanish-English creole...

Language revitalisation in primary schooling

Of course, Haiti is a place where the local language is strong.  What happens where the language is weaker?

This is where I shake my head in disgust.  There is a growing demand in minority language communities for immersive education, even where the minority language is not spoken in the home.  Scottish Gaelic is offered in primary schools only as full immersion, with the first three years taught exclusively through the medium of Gaelic, which means that for most children, initial literacy is in a non-native language.  Ask what's wrong with the internationally-recognised bilingual model, and you'll be told it's not suitable for an endangered language (everyone likes to feel different, after all).

People are also quick to point out that the Scottish model is based on the most popular option in the Basque Country in Spain.  But I'd like to point out that the model is popular with the parents, and parents are not experts.

In particular, it's impossible to have a debate with most parents about the effectiveness of the teaching their own children received, because they're already personally invested in the idea that they've given their children the best education possible, and they are averse to even considering that they may have been wrong.  You can't use examples of their own children's faults, or they're going to take personal offence, and if you take examples from elsewhere (eg a TV documentary on a Gaelic school) you'll just be told that bad Gaelic's better than no Gaelic at all.

Except that the bilingual model offers the opportunity to learn better Gaelic -- many of the mistakes that kids make in Gaelic-medium classes are caused because the effort of initial literacy distracts them from grammatical accuracy.

But sadly, in education, decisions are made by parents, and most parents really have no idea what education is all about....

19 August 2011

Use of linguistic terminology

(I've been busy this week and didn't have time to finish the promised article on phonology, so here's something that's been sitting in my drafts folder for a while.  It's quite relevant now as I've been using a fair bit of jargon of late.)
I've taken a bit of flak on a number of forums for my use of linguistics jargon (particularly when I get it wrong!), so I want to clarify something here: I use jargon to describe what concepts are to be taught (and sometimes how to teach them), but I do not advocate use of jargon itself with beginners, unless they are students of linguistics anyway.

The international standardisation on Latin terminology is quite useful in that I'm now able to discuss linguistics in several different languages.  It really impresses people that I can teach them grammar in their own language, but it's little more than a parlour trick.  A few regular sound changes and the appropriate suffix and your subjunctive is subjonctif or subjuntivo.  It doesn't generally get any harder than the Italians and Germans who call it a congiuntivo and Konjunktiv respectively.

Having studied a lot of grammar, I'm not only comfortable with plain conjunctions, but also with coordinating vs subordinating conjunctions so the terminology is useful to me.  (Subordinating conjunctions, see?)

The labels we give language aren't always meaningful to the new learner, so don't really help.  But in the original Greek and Latin, they were designed specifically to help.  Take, for example, Latin's dative case.  "Dative" is a Latin adjective (oh look, adjective, another meaningless term!) derived from the word for "giving", and describes one of the fundamental uses of the case: indirect object as recipient or beneficiary.

"Accusative", on the other hand, comes from Greek, where it originally could mean either "for something caused" or "for the accused" (at least according to wikipedia).  The Romans picked one translation, and really chose the less useful one, "for the accused", which misleads people even to this day.  The accusitive is most commonly used for the direct object, and most "speaking" words use an indirect object for the person you're speaking to, yet the word "accusative" seems to suggest it's to do with speaking.

But not all languages use the Latin system.  Basque is a highly inflected language, and their cases are simply named by inflecting the word "who?"  Basque can do this very neatly due to its nature, and while it isn't as neat in English, you could still name cases similarly.  Have a look at these and tell me they aren't descriptive, mnemonic labels:

who-did | who-done-to | where-to | where-from | who-to | who-from
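Pairing those mnemonic labels with the traditional Latin-style names makes the contrast concrete.  The mapping below is my own rough guess at the correspondences (case systems differ by language, and "who-from" has no single tidy classical equivalent), so treat it as an illustration rather than a reference:

```python
# Sketch: question-based mnemonic labels vs traditional case names.
# The pairings are approximate and language-dependent -- the point is
# that the left-hand labels are self-explanatory and the right-hand
# ones have to be memorised.

MNEMONIC_TO_LATIN = {
    "who-did": "nominative (or ergative)",
    "who-done-to": "accusative",
    "where-to": "allative",
    "where-from": "ablative",
    "who-to": "dative",
}

print(MNEMONIC_TO_LATIN["who-to"])  # dative
```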

True, such a description could become quite long or complex for certain languages, but it's still better than trying to remember things by such meaningless terms as allative and ablative (two words which are very easily confused -- they fit all three of my categories of confusion: similar form, similar usage, and frequent co-occurrence).

The same goes for sounds.  If you want to talk about a "bilabial unvoiced aspirated plosive", just say P.  If you're trying to get a student to pronounce a "bilabial unvoiced unaspirated plosive", you just need to get the student to pronounce an "unaspirated P".  However, you don't need the word "unaspirated", but you do need the concept of aspiration.  Call it the "puffiness" of a sound, call it the "breathiness"... what you call it isn't important as long as you teach the difference.  If you want, you can even call it "aspiration", but there's no point introducing the term until after the student is relatively comfortable with the concept.  In fact, it is probably counter-productive to introduce the term "aspiration" too early, because it means something completely unrelated in colloquial English (related to goals and ambition).  Hearing a word automatically evokes its meaning, so the old meaning will interfere with learning the new concept.

So if I use terminology in this blog, it isn't my personal seal of approval on its use -- it's a concession to its current use in expert circles.  I don't think it's of practical use for beginners.

A concession to reality

On the other hand, if your students are going to be going out into the big wide world without you and are going to be relying on reference books to continue, then yes, they're going to need the terminology.  So teach it.  But look again at what I wrote about "aspiration", because again you really need to teach the concept before the word.  A word is a label for a meaningful "thing", whether a physical item, a phenomenon or just an abstract concept.  How are we supposed to learn words if we don't yet know what that "thing" is?  A word learned without meaning goes against the whole idea of language.  It's a disordered state, and once the student is in a disordered state, the teacher has lost control.

A massive change of opinion

Isn't it interesting how quickly you can change your own opinion by reasoning something through?  In the course of writing this post my own view of linguistic terminology has gone from "vehemently against" to "neutral-to-slightly-for".  I was always against it as I felt it was meaningless to the learner, but in talking about teaching the term after the concept, I realised that taught that way, it isn't meaningless at all.  I'd still prefer a more intuitive terminology, but maybe the old stuff isn't as big a problem as I thought....

14 August 2011

Phonology -- whats and hows part II

Last time, I wrote about phonology and the necessity of physically training the tongue to produce new sounds.  However, as I pointed out, not all new phonemes require new physical skills.  Can we pick these up just by listening?  I think not, and I'd be happy to tell you how.

Meaningful sounds

The problem that I'm always trying to stress is that the brain is only interested in meaningful input -- if something has no meaning, the brain isn't interested.

This leads to some striking (and often unexpected) results. The BBC documentary Horizon showed this with colours in the programme Do You See What I See? (UK only). In the programme, you see several Himba tribespeople trying to pick out different colours on a computer screen. They show two tests -- one with a very slightly different green, which is difficult for the viewer and fairly easy for the Himba, and one with an obviously different colour... well, obvious to us, but not to the Himba.

The distinctions that the Himba find easy are ones that they have names for, and the distinctions we find easy are the ones we have names for. It would appear that the act of naming something focuses the consciousness on it, so if you tell me that a French P has a puffy sound, I'm more likely to notice it, because I know what I'm looking for.

Consider the old face/vase optical illusion: the first time you look at it, you see either the faces or the vase, and your brain fixates on that single image. If someone else tells you about the other picture, you struggle to see it at first, because your brain already sees something meaningful in the image. But once your brain finally sees the second image, you can change your mental focus between the two meaningful images at will.

But that example doesn't say much about subjectivity and objectivity, because the two objects are fairly arbitrary. A better example would be one where you can predict what the viewer will see based on simple demographic information. Maybe adults vs children, like this painting, where adults immediately see a particular image and children see a different one. (View the picture, and then read the explanation on the page.  I saw the second picture without reading the explanation, but only because I could understand the French label on the bottle....)

So what is meaningful to us is normally a matter of past experience and expectation. When it comes to meaningful sounds, past experience and expectation all comes from the languages we already speak.  So it would follow that we need to consciously draw the student's attention to the differences, or they're just not likely to notice them.

What do we need to draw their attention to?

The phoneme is not the minimal unit of sound

The phoneme is often mistakenly considered the atomic unit of pronunciation in a language, but most languages build their phonemes out of a series of distinctions, in a fairly systematic manner.

In English, for example, we have voicing of consonants as a distinction, and it occurs pretty much wherever it can.  Voicing is the difference between P & B (at the front of the mouth), T & D (in the middle) and C/K & G (at the back).  We also have nasalisation, which takes those three pairs and gives us the sounds M, N and NG.  It's a stable and systematic structure.
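That structure can be laid out as a small feature grid.  The feature values below are standard phonetics, but the table itself is my own sketch of the point being made:

```python
# Sketch: the English stop system as place-of-articulation crossed with
# a three-way manner distinction (voiceless / voiced / nasal).
# Every cell is filled -- the system is stable and has no gaps.

STOPS = {
    ("labial",   "voiceless"): "p", ("labial",   "voiced"): "b", ("labial",   "nasal"): "m",
    ("alveolar", "voiceless"): "t", ("alveolar", "voiced"): "d", ("alveolar", "nasal"): "n",
    ("velar",    "voiceless"): "k", ("velar",    "voiced"): "g", ("velar",    "nasal"): "ŋ",
}

# Check the grid is complete: 3 places x 3 manners, no missing cells.
places = {place for place, _ in STOPS}
assert all((p, m) in STOPS for p in places for m in ("voiceless", "voiced", "nasal"))

print(sorted(STOPS[(p, "nasal")] for p in places))  # ['m', 'n', 'ŋ']
```

Nine sounds, but only two distinctions beyond place -- which is exactly why teaching the distinctions is cheaper than teaching the sounds one by one.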

There are other languages (eg Gaelic) where the distinction between P & B is not one of voicing, but of aspiration.  The same distinction carries through for T & D and K & G.  In fact, it's hard to find any language that distinguishes one of those pairs by voicing but another by aspiration -- in general, the same distinction carries through the whole set.

Polish gives a great example of how regular these consonant distinctions can be.
In the diagram above, you can see a clear structure uniting 12 sounds in 3 distinctions (two 2-way distinctions, one 3-way distinction).  It's almost entirely systematic -- this cannot happen by accident, so we must assume that the native speaker's internal model of language acts on the level of these distinctions.

For this reason, I believe that it is not enough to draw the learner's attention to an individual phoneme, but that we must teach them the individual distinctions.

This doesn't have to be done in a dry "linguistics" way, though.

Teach once, then repeat

When teaching a phonemic distinction like voicing or aspiration, you don't need to start with the idea in the abstract.  Instead, you can start by teaching the pronunciation of one letter, then its contrast (eg P first, then B).  In teaching the contrast, you pick a word that describes it ("puffiness" or "breathiness" is more meaningful than "aspiration") or you just describe it.  Then when you move onto the next pair (T,D), you can refer back to the first pair, because it's the same difference.  And once you get to the final pair (K,G), it'll be very easy to do.

Of course, this means that you have to restrict the number of phonemes to start off with, but there are many people who are theoretically in favour of gradually introducing phonemes -- it's just the order of material that messes them up.

Teaching one thing at a time

Most teachers like to start with seemingly useful words and phrases.  Hello, how are you, goodbye -- that sort of thing.  This takes away the teacher's control over the phonemes -- teachers don't choose them, they just use whichever ones pop up.

Worse, quite a lot of teachers will introduce numbers early on, and in many languages you'll have encountered half of the phonemes of the language by the time you reach ten.  (This probably isn't an accident -- ambiguity in numbers would be a problem, so they naturally evolve to be fairly different.)

One commercial course points out this problem, and suggests that the way round it is to teach numbers one at a time, in a way which supports a progressive increase in the number of phonemes.  The example they used was 10 and 100 in Spanish: diez and cien.  These two words share all but one phoneme (C before I or E is pronounced the same as Z in Spanish), so if you teach one then the other, you're only introducing one phoneme the second time round. 
(I think I remember which course this was, but the blurb on the website no longer mentions this, so I'm not going to link to it.)
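The idea of picking each new word so that it introduces as few new phonemes as possible can be sketched as a tiny greedy algorithm. The word list and phoneme transcriptions below are rough and purely illustrative (Castilian Z/soft C rendered as /θ/):

```python
# A sketch of the "one new phoneme at a time" idea: greedily order a
# word list so that each new word adds as few unseen phonemes as possible.
WORDS = {
    "diez": {"d", "i", "e", "θ"},   # 10
    "cien": {"θ", "i", "e", "n"},   # 100: only /n/ is new after diez
    "seis": {"s", "e", "i"},        # 6
    "dos":  {"d", "o", "s"},        # 2
}

def phoneme_friendly_order(words):
    seen, order = set(), []
    remaining = dict(words)
    while remaining:
        # Pick the word introducing the fewest unseen phonemes.
        word = min(remaining, key=lambda w: len(remaining[w] - seen))
        seen |= remaining.pop(word)
        order.append(word)
    return order
```

On this toy list, diez comes out before cien, because once diez has been taught, cien costs only one new phoneme.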

And after all, why should we teach numbers in numerical order in a second language?  When teaching children numbers in their first language, we're teaching both the concepts and the words, but in a second language you're only teaching the words, because they've already got the appropriate concepts to peg them to.  We can now selectively use any of those pegs we want to, in any order we want to.

Putting it together

So if we teach a couple of consonants well, and then we introduce new consonants one by one, we can use the earlier consonants as an anchor to show repeated distinctions.  It doesn't matter whether the student can consciously remember what those distinctions were -- a native speaker normally wouldn't have a clue.  What matters is that the model the student uses automatically for pronunciation implicitly respects the consistent rules of the language.

This will not happen if the student is left to listen, because one misheard phoneme can threaten the integrity of the entire structure -- pull any one of the sounds out of my neat little Polish diagram and dump it somewhere else and the whole thing will collapse.

Next time

Previously I spoke about sounds as new muscle movements; today I spoke about the meaning of sounds.  Next time, I'd like to demonstrate how almost all new sounds really are new physical movements anyway.

10 August 2011

Phonology -- whats and hows

A couple of weeks ago, I was discussing the importance of phonology, trying to demonstrate why it should be consciously dealt with in the teaching/learning process, but I took the decision not to include any comments on how to teach it in that article.  Basically, I didn't want to give anyone any grounds to reject my argument out-of-hand.  In this post, I'd like to cover how I believe it should be taught, but remember that this, the how, doesn't affect my argument on the importance, the why.  Reject my methods if you want, but please don't reject phonology as an area of study.

So, what did I establish in the previous post?
  • Incorrect pronunciation of an individual phoneme leads to problems in pronouncing clusters containing that phoneme.
  • Problems in pronouncing certain sequences of phonemes lead to grammatical errors.
  • Vocabulary is harder to learn when you're not familiar with the rules of pronunciation in a language.
  • Not understanding target-language phoneme boundaries makes it hard to understand native speakers.
  • Sounds that the learner drops in speech are often matched by a dropping of the corresponding letters in writing.
These are things that I have observed and do not see as particularly controversial.  And yet, my conclusion that pronunciation requires active instruction is rejected by many teachers.  Accent, they say, will take care of itself.  And accent, they say, is a personal thing.  But we're not talking about accent.  Accent is something that is layered on top of phonology.  Phonology is like the basic letter forms in writing, accent is more like individual differences in handwriting.  At school we are taught initially to get the basic forms right, and over the years we develop our own personal "hand".

Can we learn pronunciation from listening?

Some even argue that we learn pronunciation from hearing (and they sometimes add "just like children").  However, as I tried to demonstrate in my recent post receptive skills as a reflective act, there is good reason to believe that we understand language by comparison to our own internal model of the language.  In the follow-up post, I gave a concrete example of mishearing a word on Italian radio, and how my flawed internal model was good enough to understand the message without perceiving every sound.

OK, so that's anecdotal and doesn't prove a general case, but ask yourself this: how many different accents can you understand in your own language?  And how many of those accents can you speak in?

So you can see that simple exposure hasn't given you extra accents.  As I said above, accent is not phonology.  But our brains have learned to ignore accentual differences (to an extent) to enable us to understand the widest possible range of people around us.  So if our brain assumes a different phonology is just a different accent, it throws away all the information you're supposed to be learning from.

So I really don't believe it's possible to learn from "just listening", no matter how much you do.

Motherese and exaggeration

Here's the outcome of an interesting study (YouTube video).  It turns out that when we teach kids to speak, we don't expect them to learn from natural speech, but we exaggerate our phonemes, effectively making them "more real than real" or "whiter than white".  And if you think about it, isn't this what we do when speaking to foreigners or people with a very different accent from ours?

The point is that we have to make the differences clear and noticeable, so that one phoneme doesn't blend into another.

I would suggest that this points towards the right answer in language teaching to adults: if even children (who have no preconceptions of what a phoneme is) need extra emphasis to understand the difference between similar phonemes, then we adults (who are biased towards our native language's phonology) really could do with a bit of help.  The brain has to be told that this new information is useful, or it will throw it all away.

Exaggeration of pronunciation appears to help the listener notice the differences.

Learning pronunciation through pronouncing

However, we learn to dance by dancing, and we learn to drive by driving.  In both cases we can pick up a few hints and tips from watching, but we need a heck of a lot of practice.  Why shouldn't this be the case with language?

People are very quick to tell me that language is different from every other skill.  That is a valid opinion, but it is still only an opinion - no-one has ever presented anything to me that demonstrates it to be true, or even likely.  Right now, it's just a theory... and it's one I do not believe.

To me, pronunciation is a muscle skill.  Let's consider some of the extreme sounds that don't occur in English.

Take retroflex consonants.  Retro - backwards; flex - bend.  In retroflex consonants, your tongue bends backwards, and the tip goes behind the alveolar ridge.  This type of sound doesn't occur in English, so a monolingual English-speaker will probably never produce this sound in his life.  If you ask such a person to put their tongue into that position, they won't be able to -- their tongue just can't bend that way.

But then your average person couldn't do yoga postures on a first attempt either -- the yoga teacher will lead them through some simple postures and exercises to encourage the muscles to stretch and strengthen appropriately until they are capable of performing the required movements.

The brain doesn't prepare the muscles just because you've seen the movements; the body prepares the muscles once you've started doing the movements.  The same goes for the tongue -- it's just another muscle, after all, and only doing can train it.

So clearly, there are certain sounds that must be taught consciously, or the learner won't physically be able to say them.  But obviously there are also sounds that the learner is physically capable of saying, but isn't in the habit of saying.

This post is starting to get a bit on the long side, so I'll come back to the question of this second category of sounds next time.

How I learned to pronounce retroflex consonants

I had a notion to learn a few words in various Indian languages a few years ago when I was working in IT support.  Our front-line helpdesk was in India and I wanted to try to build a better rapport with my coworkers.

One of the sources I used stated quite plainly that while languages like French and Spanish let you get away with "close enough" pronunciation (not entirely true...), with Hindi you would simply not be understood if you spoke with an English-speaker's accent.  It described the retroflex articulation, so I started a regime of "tongue stretches" -- as I walked to and from work, I would tap my tongue continually off the roof of my mouth and move it slowly backwards and forwards, creating a sort of silent T-t-t-t-t-t-t-t-t-t-t or D-d-d-d-d-d-d-d-d-d.  Every day I could reach slightly further back, and in about a week and a half I was able to produce a convincingly Hindi-like retroflex for all of the various consonants (except R, cos that's really quite complicated). I was curious about how far I could go, and within another few days I'd got to the point where I could touch the tip of my tongue to my soft palate.


So certain sounds need to be learned physically, and it's something that can be done.  Next time, I'll start looking at sounds that are more a matter of habit, and showing that the boundary between "habit" and "ability" isn't always that clear.

04 August 2011

The wolf in the forest

"Be careful in the forest -- there are wolves in there."

"Nonsense.  I go through the forest every day and I have never been attacked."

As fallacies go, this one's pretty clear.  Something does not have to occur to everyone every time in order to be a danger.  Knowing that there's a wolf in the forest, the second traveller should alter his behaviour to minimise the risk.  Carry a weapon, sleep next to a fire, the usual stuff.

When it comes to language learning, though, people do tend to take the attitude of the second traveller. Point out any of the potential pitfalls in a language learning strategy, and the other person will usually accuse you of talking out of your hat, and point out that he learned OK that way, or that some of his students did.

But just as one man emerging safe and sound from the forest doesn't disprove the presence of wolves, the success of one or two language learners doesn't demonstrate a lack of potential pitfalls in the methodology they employed.

People tend to have a hard time accepting this, though.  I point out a pitfall, and they declare it isn't a problem.  "And I'm living proof."  When I try to point out the fallacy, I'm often accused of disrespecting their experience. No no no.  I respect and acknowledge their experience, but my point is that there is more in the world than one man can experience.  Our capacity for reason allows us to go beyond our experience, and we should take full advantage of that.  We should not limit ourselves to our own experience, and we certainly shouldn't limit others to it either.  We need to reconcile our experiences with the knowledge of others, and thereby remove the pitfalls and reduce the risks before giving advice to others.

Please.

31 July 2011

Common Errors: further evidence for the prosecution

I argued in two previous posts (here and here) that the so-called "common error" of should of, could of etc is actually a change in the grammar of English, and today I came upon some very good supporting evidence.
And yet companies are constantly being sued over patents which are so broad or trivial they should've never been granted in the first place. [ Slashdot ]
There we have verb verb adverb (time) past-participle.  The adverb comes after two verbs... that can't be right.  I mean, I will never do, I would never do, etc.

And if you have a look at Jane Austen's Emma on Project Gutenberg, you'll find that "have never" only occurs in the present perfect, and that when using other compound tenses, "never" goes between the first auxiliary and have.

If we can say I would've never known, then surely would've is now a single word in the internal model of a great many native speakers...?

29 July 2011

The importance of phonology

OK, so I promised this a while ago, and I've let myself get distracted by a few other points in the interim, but I'll try to draw them in and show how they are related to the teaching of phonology in general.

In my posts 4 skills safe and 3 skills safe, I argued that the division of language teaching into the traditional 4 skills of reading, writing, speaking and listening was trivial, superficial and of very little pedagogic value.  Instead, I suggested that we should look at individual skills of syntax, morphology and phonology, and that we could add orthography as an additional, more abstract skill (Lev Vygotsky described reading and writing as "second-order abstractions").
Phonology often gets very little attention in the classroom, as it is seen as a sub-skill of speaking, and speaking's "difficult".  But phonology is fundamental to many languages.

If you haven't already, you might want to take a look at my posts In language, there's no such thing as a common error, and Common errors: My mistake!  In the first post I described a particular common error in written English (might of instead of might have, could of instead of could have etc) and in the second I expanded on the mechanisms that cause this "error", with the aim of showing that this wasn't an "error", but in fact a change in grammar, analogous to changes that have occurred in other languages.  What I didn't focus on there, but which is extremely relevant here, is that this change in grammar is pronunciation-led -- ie the phonology of English has caused this change in grammar.  The prosody of English has led to 've being always weak, and it has lost the link to the related strong form have.

And of course the change in the Romance languages that I mentioned in the second post is also led by phonological patterns.  If you look at any language whatsoever, many grammatical rules have arisen from mere matters of pronunciation.

The archetypal example is the English indefinite article -- a/an.  You may well be aware that like most other Indo-European languages in western Europe, this evolved out of the same root as the number one.  But the modern number one is a strong form and has a diphthong.  A/an is a clitic and always weak, so it split off (completely analogous to 've and have).  This weak word /ən/ then lost its [n] before consonants and retained it before vowels, simply because each is easier to say that way.  (And if you'll indulge a slight digression, that brings us back to would've etc, because you'll often hear woulda before a consonant and would've before a vowel.)

If you look at the Celtic languages, one of the trickiest parts of the grammar is the idea of initial consonant mutations.  Lenition in Modern Irish is a bit inconsistent (probably due to the relatively large number of school-taught speakers against native speakers), but the three mutations in Welsh are fairly systematic, with mutated forms usually only differing from the radical in one "dimension" of pronunciation.
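Just how systematic the Welsh soft mutation is can be seen by writing it as a small lookup table.  This is only a sketch of the sound change itself -- which grammatical contexts trigger the mutation is a separate question:

```python
# Welsh soft mutation (treiglad meddal) as an initial-consonant lookup.
# Each mutated form differs from the radical in one "dimension" of
# pronunciation (voicing, or stop -> fricative).
SOFT_MUTATION = {
    "p": "b", "t": "d", "c": "g",
    "b": "f", "d": "dd", "g": "",   # initial G is simply dropped
    "m": "f", "ll": "l", "rh": "r",
}

def soft_mutate(word):
    # Check the two-letter onsets (ll, rh) before single letters.
    for onset in ("ll", "rh", word[0]):
        if word.startswith(onset) and onset in SOFT_MUTATION:
            return SOFT_MUTATION[onset] + word[len(onset):]
    return word  # other onsets are immune to soft mutation
```

So cath (cat) mutates to gath, pen (head) to ben, and gardd (garden) to ardd -- one table covers the lot.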

These sorts of rules become very arbitrary and complex when described purely in terms of grammar, whereas when considered physically, they make a lot more sense.

Let's go back to a/an and take a closer look.  We all know the rule: a before a consonant, an before a vowel, right? Wrong! It's actually: a before a consonant phoneme, an before a vowel phoneme.  To see the difference between the two, fill in the following blanks with a or an:
I want __ biscuit.
I need __ explanation.
He is __ honest man.
I have __ university degree.
Now it's not a difficult task for a native speaker, because you wouldn't normally have to think about it: honest may start with the letter H, but you know intuitively that you don't pronounce it, so you write an without thinking.  Similarly, university may start with the letter U, but you know intuitively that it starts with a y-glide sound (like "yoo", not "oo"), so you write a.
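The point that the rule runs on phonemes rather than letters can be made concrete in a few lines of code.  The mini-lexicon below is hypothetical and hand-built -- a real implementation would consult a pronouncing dictionary rather than spelling:

```python
# The real a/an rule depends on the first PHONEME, not the first letter.
# Hand-coded phonemic facts for the four example words above:
STARTS_WITH_VOWEL_PHONEME = {
    "biscuit": False,      # /b/: consonant phoneme
    "explanation": True,   # /e/: vowel phoneme
    "honest": True,        # silent H, so /ɒ/: vowel phoneme
    "university": False,   # y-glide /j/: consonant phoneme
}

def indefinite_article(word):
    return "an" if STARTS_WITH_VOWEL_PHONEME[word] else "a"
```

Note that no amount of looking at the spelling of honest or university gets you the right answer; the lookup has to encode how the word sounds.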

I have seen quite a few English learners write "an university" or "a honest man" because they are either trying to work from a grammatical rule in isolation from pronunciation, or because they simply pronounce these words wrong.  In the case of honest, the problem is compounded if the student can't pronounce H, because if he follows the rule correctly on paper, he undermines the phonological basis for the true rule.

It follows, then, that we cannot teach grammar without considering phonology.  (And anyone who has succeeded in understanding the French liaison rules can tell you categorically that this is true.)

But how does phonology affect us in other ways?

Phonology and the ease of vocabulary learning

It may seem trivial, but for his PhD thesis, an Australian teacher of Russian demonstrated that it is easier to learn foreign words that are possible in your native language than ones that aren't.  Eg the word brobling with first-syllable stress is easy, brobling with second-syllable stress is a bit harder, and grtarstlbing, with lots of consonant clusters that can't occur in English, is very difficult.  He then took a massive leap of logic that I'll examine later in greater depth.

This corresponds with what a lot of teachers believe, but few teachers have the time or patience to implement: that it's easier to teach phonemes one at a time and reuse them in different words.  Again I'll come back to that when I start discussing techniques.

For now, though, I'll simply suggest that it's easier to learn words that are made out of familiar "blocks" than ones that aren't.  It follows from this that good teaching of phonetics (whatever that means) is a prerequisite to vocabulary learning.

Phonotactics: the "crisps" problem

My high school had an exchange programme running with a school in France.  Teenagers are naturally curious beasts, and when my big brother and sister first went on one of these exchanges, the class discovered how funny it was to get the French people to say crisps (UK English for what the French and Americans call chips).  Very few of the French kids could actually pronounce it, because they were using French phonemes with a northern accent (the school was near Lille).  The French P is unaspirated (unlike English) and the French S is quite slender and hissy.  As a combination of sounds, French SPS is difficult, nearly impossible -- the P either gets lost in the hiss or one of the Ses gets cut short.  The English combination is physically much easier.

Similar problems occur in other places.  Spanish people find wants quite difficult to say, because Spanish T is not compatible with Spanish N or S due to the place of articulation.  NTS in Spanish needs the tip of the tongue to be in two different places at once -- the alveolar ridge for N and S and the back of the teeth for T.

The problem is that many books will tell us that T, D, B, P etc are sufficiently similar in English and Spanish, French or whatever that we can use them equivalently, but this is only true for each phoneme in isolation.  Once we start trying to combine them, the differences start to accumulate.

Which brings us back to:

Grammar again - and how writing suffers for it

If you cannot pronounce the inflectional affixes in a language, your grammar suffers.  Many, many Spanish learners of English drop their -s and -ed suffixes because of the problems of incompatible sounds.  They replace it's with is.  These mistakes filter through from their pronunciation into their internal model of grammar and eventually into their writing.  But it's easy to ignore this, because most of the time they correct their own writing mistakes with their declarative knowledge, and on the few occasions where they don't, the teacher simply tells them the rule again, but never attacks the root cause of the problem.  If they learned to pronounce the English [t] and [d] phonemes, most of the difficult sound combinations would become much, much easier, their internal model of the grammar would be built up to incorporate these non-syllabic morphemes (and there are no non-syllabic morphemes in Spanish as far as I know, so it's a totally new concept to them edit (2-feb-2014): Spanish has at least one non-syllabic morpheme: plural S after a vowel) and they would write naturally based on their procedural knowledge of the grammar.

And finally...
Allophones and comprehension

Apparently there are certain accents that are considered "hard" in some languages. Now I'm not implying that there is no such thing as a hard accent, but I do believe that most of the difficulties stem from the teaching, not from the language.

In Spain, the accent of Madrid is considered quite difficult to understand.  The reason for this is that the madrileño accent tends to lenite (weaken or soften) its non-intervocalic consonants.  The classic is the weakening of D to /ð/ (roughly equivalent to TH of then).  There is little physical similarity between the English D and ð, as is clear from their technical descriptions: /d/ - voiced alveolar plosive; /ð/ - voiced dental fricative.  But the Spanish /d/ is a voiced dental plosive, which the description shows is quite similar to /ð/.  Basically, the soft D of Madrid is an incomplete hard D -- the tongue doesn't quite go far enough to touch the teeth and stop the sound; instead it hisses slightly.

Now, if understanding language is a reflective act (as I claim here and here) then we understand sounds by considering what shape our mouths would be in if we were to make the sound we hear (something suggested by the concept of mirror neurons).  The soft and hard Ds in Spanish are not "soundalike" allophones at all, but they have a similar shape, which is different from the English D.  To me it seems clear that physically learning the Spanish hard D shape would result in better comprehension of the similarly shaped soft D in a way that simply hearing it won't accomplish.


Conclusion

All in all, it seems to me that phonology is an intrinsic component of language, and that the system of a language falls apart when phonology is not given the proper support throughout the learning process.

As for how to teach phonology, I have my own views, but I'm currently reading up on some alternative opinions so as to give a more balanced write-up of the options available.

26 July 2011

Common Errors: my mistake!

Hmmm... I should maybe reread my posts more before publishing, because in another article I said "we should be paying close attention to the thought processes behind this change and trying to make the way we write English match the way we speak it" and then forgot to describe the process in any detail.

Because even though the solution I proposed was to legitimise the writing of contractions, this is not the internal process causing the change. If the average speaker's internal model saw 've as a contraction of have, then no-one would make the 'mistake' of using of instead.

Put it this way: the "error" only occurs when have is used as a second auxiliary -- no-one would say *I of done it in place of I've done it/I have done it, for example.

So it would appear that here we have ceased to think of 've as a verb at all, let alone an infinitive.  At best it is a clitic that modifies the first auxiliary to make it part of the perfect construction, but in fact it would appear to me to be in reality a new suffix, because I cannot see any situation where you could syntactically separate 've from the first auxiliary.

What we see here is English gaining a new fusional feature, and while English has displayed a tendency to become more isolating over the centuries, it isn't unknown for a language to pick up new fusional elements even when the general tendency is towards isolation.

Consider Latin vs the Western Romance language family.

Latin was a highly fusional language, and relied on very few periphrastic constructions.  However, the future form that survived into Romance was a periphrastic one from later spoken Latin, consisting of the verb in the infinitive followed by the present indicative of to have -- so I will do was literally formed as to-do I-have.

Most members of the Western Romance family have lost a lot of the fusional features of Latin, but at the same time, the future has mutated into an inflected tense, with suffixes derived from (and in some cases identical to) the present tense of to have added to a future root that is almost identical to the infinitive.

Note that the creation of these new suffixes didn't alter the present tense of to have in any other constructions, even though in the early stages of this change, grammarians would most likely have declared that it was "obvious" that they were the same thing, and lamented the "common error" of people saying nous le ferons instead of the previous nous le faire avons.  But today, the latter looks so unnatural that it would not be understood except by a scholar.

This is analogous to what I believe is happening in English.  One particular usage of the verb to have is becoming replaced with a suffix derived from, but not identical to, a form of the verb.  However much the status quo appears more logical, the frequency of occurrence of the "of" error (Google "would of", "could of" etc, and you get millions upon millions of hits) tells us that people's brains just don't work that way.

You cannot rewrite how people pick up their native language.  People seem to pick up 've as a suffix, not an infinitive, so it's time to stop resisting.  While it would be natural for a suffix to be incorporated into the word without the apostrophe, that would be a step too far for most pedants, and besides, it would be a fairly radical change that would take a bit of getting used to.

So I advocate using the contraction notation for now, but recognising that it has now ceased to be a contraction in the mind of the native speaker.