25 September 2016

Don't get any big ideas: the dominance of names in social sciences

It has bugged me for quite some time that in language, in education and in the social sciences in general, there are certain names whose big ideas get repeated ad nauseam even once others have moved the state-of-the-art onwards.

For example, when textbooks and courses discuss formal grammars, they typically focus on Noam Chomsky's generative grammars. Chomsky's model of grammar divorced structure from meaning, which he demonstrated with the nonsense sentence "Colorless green ideas sleep furiously." Since then, however, many commentators have observed that a command of grammar independent of semantics is only really found in people with training in grammar, and is not culturally universal.

Take for example the early nineties computer game "First Samurai". The game was developed in the UK, and the developers asked a Japanese translator what its title would be in the Japanese script, as they wanted this for the cover. The response was that it was impossible, because you can't rank samurai. Reportedly, they then asked how you would say the "first samurai" you see in the morning -- still impossible. In the end, they had to ask for another phrase to be translated, take the symbol for "first" from that, and place it next to the symbol for "samurai".

There are dozens of similar anecdotes attested worldwide. In her 1978 book Children's Minds, Margaret Donaldson cites a report of an adventurer who asked a Native American to translate "the white man shot six bears today."
"'How can I do that?' said the Indian. 'No white man could shoot six bears in one day.'"
Donaldson roundly rejects the idea that this grammaticality sense is anything more than a result of our education.

The other major blow to Chomsky's model was Lucien Tesnière's valency/dependency grammars. Whereas Chomsky built his trees based on part-of-speech only, Tesnière identified that certain words had to be accompanied by certain other elements, and could optionally be qualified by additional ones.

Chomsky split his basic sentence trees into subject and predicate, as was the norm at the time. Tesnière instead argued for "verb centrality", putting the verb at the top of the tree. This meant that the verb in Tesnière's model had a direct link to the grammatical subject, where in Chomsky's it did not. It is trivially obvious that this is a superior model: with no direct link between verb and subject, Chomsky's model essentially claims that "*He say I does" would pass a grammaticality test. I'm sure Chomsky at some point will have presented a round-about argument for why that's not acceptable, but quite simply, Tesnière's model was better. It is now widely accepted, and it's key to a lot of computer-based language techniques.
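Tesnière's point is easy to make concrete. Here is a minimal sketch (my own illustration, not any standard parser's API) of a dependency tree with the verb at the root: because the subject hangs directly off the verb, subject-verb agreement can be checked along a single edge.

```python
# Verb centrality in miniature: the verb is the root of the tree and the
# subject is a direct dependent, so agreement is a one-edge check.

class Word:
    def __init__(self, form, person, number):
        self.form = form
        self.person = person   # 1, 2 or 3
        self.number = number   # "sg" or "pl"

class DependencyNode:
    def __init__(self, word, deps=None):
        self.word = word
        self.deps = deps or {}  # relation name -> DependencyNode

def agrees(verb_node):
    """Check subject-verb agreement over the direct 'subj' edge."""
    subj = verb_node.deps.get("subj")
    if subj is None:
        return True
    v, s = verb_node.word, subj.word
    return (v.person, v.number) == (s.person, s.number)

# A crude lexicon: "say" here stands in for the non-3sg verb form.
he = DependencyNode(Word("he", 3, "sg"))
says = DependencyNode(Word("says", 3, "sg"), {"subj": he})
say = DependencyNode(Word("say", 3, "pl"), {"subj": he})

print(agrees(says))  # True  -> "He says" passes
print(agrees(say))   # False -> "*He say" is rejected
```

In a subject/predicate split, by contrast, the agreement features of the verb and the subject live in two different branches, and checking them requires walking up and across the tree.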

And yet when I studied language, lots of space was devoted to Chomsky, and I have no recollection of seeing the name Tesnière or talk of valency or dependency grammars. When I briefly studied formal grammar in computing, lots of time was given to Chomsky, and dependency grammars were mentioned only in passing. To me, verb centrality was an obvious notion, and every time Chomskyan grammar was presented to me, I wanted to put the verb at the top. It wasn't until about four years ago that I picked up a book on Computational Linguistics/Natural Language Processing and was introduced to Tesnière's theories.

This is a very dangerous state of affairs -- students are being taught outdated, disproven theories instead of the current state of the art. In education, one of the best examples would be Piaget, whose theories have been proven wrong time and again but remain a main focus of most introductions to childhood development.

Why? The typical answer is that to understand current systems, we have to understand the underlying theories they're built on. This, I'm afraid, is not true. Or rather, it is true, but the underlying theories of modern grammar are derived from Chomsky, and the underlying theories of modern childhood development are derived from Piaget. By teaching the original theories, we end up holding back development in the field: most courses spend so long talking about the outdated theories that they don't leave time to fully discuss the current ones, and the students leave the courses with a working model of the wrong theories. We therefore spend a lot of time debating the same things as the generation before.

Certainly, we don't do this in the physical sciences. No-one would suggest that in order to learn about the big bang theory we first have to learn about the theories that predated it. Such theories are clearly of some interest, but are best restricted to specific modules on the history of science.

The problem in the social sciences seems to be a reluctance to rewrite part of a major theory based on subsequent observations and refinement. It appears that to be genuinely influential in social sciences, you cannot simply do incremental improvement, and instead must write a new theory practically from the ground up. In doing so, you are guaranteed posterity, because your grand theory will continue to be published, cited, repeated and taught as is long after all the elements it is built of are individually discredited -- the field will not allow anyone else to revise it for you.

Take Bloom's Taxonomy for instance. Even when it was first devised it was a bit of a kludge. The most common form seen today is still the simplest triangle form, and all versions and derivatives still hold the same ordering of "remembering" before "understanding" -- i.e. it preaches rote learning in line with the behaviorist thinking of Bloom's day, even though no modern school of thought actively professes a belief in rote learning as a useful mechanism.

The book Second Language Learning Theories by Mitchell, Myles and Marsden discusses the proliferation of theories in the introduction, and says "We believe that our understanding advances best when theories are freely debated and challenged among a community of scholars." I would certainly not dispute that, but I think we waste an awful lot of time when we discuss disproven theories and treat them as though they have equal merit to theories not yet disproven. We also, as I said earlier, do a lot of damage to the next generation of academics by preparing them to discuss the disproven theories rather than the current ones.

Worse, though, is the effect on non-academics. Teachers overexposed to outdated theories and not familiar with current ones are unable to take advantage of advances in the field and translate them into classroom practice.

24 September 2016

The master approaches...

So I've just embarked on a new phase in my career, beginning a master's degree programme in TESOL. After Christmas I get the opportunity to specialise, and the plan at the moment is to specialise in computer-assisted language learning, which really is my kind of thing.

I figure it's time to dust off this old blog and start using it as a scratchpad to reflect on the sort of issues that I'm dealing with on the course, and to comment on the materials I come across in my reading.

It's also an opportunity for me to break the habit of a lifetime and start using proper citations and referencing on the blog, something which I hope to stick to in the future so that things I post here are better informed and therefore more useful to others.

27 February 2016

Edinburgh's trouble with multilingual education

The Scottish Government has long had an aspiration to wider availability of multilingual education, and recently formalised this along the lines of the European 1+2 model. 1+2 is the idea that a child will be educated in their first language, and that during their primary schooling, they will be taught at least two additional languages; the first being introduced from the first year of schooling, the second no later than the 5th year of primary school. (Earlier draft versions of the regulations said the first additional language should be introduced no later than P3, but this has since changed.)

There are several steering principles underpinning the 1+2 approach. With regards to the first additional language ("L2") the government themselves state:
"The Working Group expects young people to continue with some form of language study in the L2 language up to the end of the broad general education, i.e. to the end of S3." [Language Learning in Scotland: A 1+2 Approach]
Children are therefore expected to be given the opportunity for continuity of access to their L2 until around the age of 14 or 15, and it is assumed that there will be the option to continue beyond that age, subject to the usual logistical constraints around class sizes and the viability of running exam-level classes for a small number of pupils.

Another of the principles is that language should not merely be taught as a subject, but should be embedded into classroom routine. There is even the hope that in the future it would be possible to offer subjects (or units within subjects) delivered through foreign languages. What could be more natural than listening to accounts from French or German WWII soldiers and civilians in their own language as part of the history curriculum, for example? It's a laudable goal, and even if we're not likely to achieve it in the foreseeable future, it's certainly something to aspire to.

The government's deadline for the implementation of this policy is 2020, and several local authorities are pushing to get themselves ready ahead of this date. Last year, Edinburgh City Council announced their intention to have the scheme implemented by 2017.

This too is laudable, but recent news has thrown the city council's commitment to this into doubt.

Gaelic-medium education (GME) has been available since 1988, when a Gaelic-medium unit was opened within a mainstream school, Tollcross Primary. Since then, uptake of the option for GME in the city has increased year on year. Tollcross Primary is a feeder school for the city-centre high school James Gillespie's, so secondary Gaelic-medium was implemented there. In 2013, primary GME education was moved to a dedicated school on Bonnington Road in the north of the city, outside James Gillespie's normal catchment area, but JG's retained its place as the city's secondary GME facility and the new school was given official status as a "feeder primary" to the school.

This year, however, James Gillespie's have found themselves with more applications for new admissions than they have capacity to accept, and the council have announced that the standard rules for oversubscription apply: priority to children within the geographical catchment area and those with older siblings already attending the school. As the intake for the Gaelic primary is drawn from the entire city (and beyond), it is most likely that the pupils who lose out will be those currently going through GME. There are 24 pupils in this year's primary 7 class, and current projections see 9 of them being refused a place at JG's.

The council's current proposed solution to this is to offer these children the option of attending Tynecastle High School, or the school for their local catchment area, but neither of these options fulfils the aspirations set out for 1+2, as local schools will offer these children no continuity in their L2 (Gaelic), and Tynecastle is little better. Tynecastle currently only offers Gaelic for learners, something which is not appropriate to children from a GME background. Indeed, children who have undergone three or more years in GME are not allowed to sit the Gaelic learners' exams at National 5 or above at all.

Going by the council's current projections, then, we're likely to see 15 GME kids in JG's first-year intake and at most 9 in Tynecastle's. With class sizes pegged at 30, that means that we've taken one class and turned it into two, which certainly does nothing to reduce problems of capacity at either school. When we look at what that means for course choice at 3rd and 4th year, when some of the pupils may be dropping Gaelic, what are the chances that either school will see a continuing Gaelic class for the GME pupils as viable?

This then leads on to a wider issue with GME provision at JG's. Aside from Gaelic itself, the school currently only teaches Art, RE, PE and Modern Studies through Gaelic, and currently none at a certificate level, although National 5 Modern Studies will be offered next year (see section 3.78). It seems likely that these classes will not operate in Gaelic for next year's first year, as that would mean having half-empty classrooms in a school that had already turned children away for capacity constraints.

Part of Edinburgh Council's justification for this decision is that:
"The level of current Gaelic provision at James Gillespie’s High School is not significant and could be relatively easily replicated, at least in part. There continue to be significant issues nationally with the recruitment of Gaelic speaking staff which limit what could actually be delivered at a secondary level, regardless of where it was provided." (section 3.75, same document as above)

Both of these statements are true, but this is something of a question of cause and effect.

First of all, the low level of Gaelic provision is due to a lack of critical mass, and dispersing the GME primary cohort across two or more high schools will certainly not resolve this. Secondly, part of the problem nationally with the availability of Gaelic-speaking staff is that they typically spend the majority of their time teaching in English, and again this stems from a lack of critical mass within the pupil cohort. If the council's actions lead to Gaelic-speaking teachers spending even less time teaching in Gaelic, then the council's justification is little more than a self-fulfilling prophecy that leads them to further squander what is already a limited resource, rendering their argument somewhat self-defeating.

The lack of availability of trained GME teachers is something that is being addressed at the national level, but there's something of a chicken-and-egg situation: with the low number of classes being taught in GME at present, it is very difficult for a teaching student to gain placement experience in a Gaelic-medium setting. Depending on the subject you are training to teach and the school you are placed in, Gaelic-medium classes may be limited to BGE (the first three years) or even only the first year or two. Some subjects may not be available at all. This makes it very difficult for a new teacher to build up the confidence required in delivering through Gaelic a subject that they themselves will have learned through English. Any action at a local level that risks decreasing the availability of GME has knock-on effects at a national level that hamper our ability to address the issue.

Not just a problem for Gaelic

Many people will shrug their shoulders and say "it's only Gaelic", but they're missing the point, because at the moment it's only Gaelic that offers us a current model for language learning throughout schooling, and much of the Scottish Government's policy on language learning leans on the experience of GME.

Four years away from the government's deadline on 1+2 and one year from its own self-imposed deadline, the council is already making decisions that take it further away from its goal. This does not bode well for children going through primary education in other languages, and Edinburgh council's schools will be offering a fairly broad selection (among them French, Spanish, Mandarin, Polish, Farsi and Gaelic). What happens if a child who has learned Spanish since primary 1 finds themselves allocated to Forrester High School (French and German)?

This is a logistical matter that will become a serious issue for parents across the city in the next few years, and this is an opportunity for the council to pilot a solution on a small scale and work out a strategy before it's too late.

If the council can't handle the transition between primary and secondary correctly, it will turn children off languages: kids placed in a language class that is too easy for them will lose interest in languages, and kids placed in classes above their level will lose confidence in their own ability to learn.

The goal of 1+2 isn't just to give kids "the right language", but to give them the right attitude to language, so that they can go on to be successful language learners and pick up the particular language they need later in life when the need arises. Getting the primary-secondary transition right is absolutely vital in developing this attitude, and if the council can't get this right for 24 pupils next year, how can it hope to do so for the hundreds of pupils moving into its high schools in 2017 and every year after?

15 February 2016

Implicit and explicit, meaningful and meaningless

Last week I was across in Edinburgh catching up with friends. I arrived early, so went into a bookshop to kill time... and came out with two chunky academic texts. I probably would have escaped without buying anything if one particular book title hadn't caught my eye: Implicit and Explicit Knowledge in Second Language Learning, Testing and Teaching.

The main thrust of the book was looking into the ongoing debate as to whether implicit teaching styles lead exclusively to implicit knowledge and explicit styles to explicit knowledge only, or whether explicit teaching could lead to better implicit knowledge. It's an important area of discussion because at present the mainstream theory of language teaching holds that only implicit learning can ever lead to implicit understanding and production, and that explicit teaching only ever makes people consciously aware of rules and able to apply them mechanically and consciously.

And yet there are very few teachers who don't include some explicit instruction in their lessons, whether that's word-lists or conjugation tables. (Even Assimil, who sell themselves on the principle of natural assimilation, dedicate more text to grammatical explanations than they do to the dialogues and their transcriptions and translations.)

I haven't read the whole book yet, and (unsurprisingly) what I've seen so far is pretty inconclusive. However, it does lean towards the opinion that explicit teaching does indeed help in language mastery. It also discounts a lot of the past counter-evidence to this theory on the grounds that their models of explicit teaching are simply bad examples, using overly mechanical, rote methods, and not being equivalently "meaningful" to the implicit method under examination.

It's this word "meaningful" that I think is the crux of the problems faced in language learning – language is nothing without meaning.

In the language classroom, items that seem to be inherently rich in meaning can paradoxically be rendered devoid of meaning by context.

My cousins buy trousers.

In an objective sense, it carries a lot of meaning, and there is no truly redundant information in the sentence – every word, every morpheme, brings something not explicitly present elsewhere. (But even then, it has no real personal meaning to me, as I can't imagine myself ever saying it. This is a side issue for the moment, though.)

But what happens when we put that sentence into a classroom exercise?

For an extreme example, let's take the behaviorist idea of substitution drills. In "New Key" style teaching, a substitution drill would be target language only, and one element of the sentence would be substituted with something else in the target language. So our theoretical exercise might go:
Teacher: My aunt buys hats
Learner: My aunt buys hats
Teacher: My mother
Learner: My mother buys hats
Teacher: Trousers
Learner: My mother buys trousers
Teacher: My parents
Learner: My parents buy trousers
Teacher: My cousins
Learner: My cousins buy trousers

At the point of utterance, the learner does not have to pay any attention whatsoever to the meaning of anything in the sentence beyond the plural marking of my cousins and the s-free verb form of buy, which is made even easier by the fact that this plural example follows an earlier plural example. Thus the student has no immediate motivation to attend to meaning, and it is a struggle to do so.
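To make the point concrete, the drill above can be modelled as pure slot-substitution -- a toy sketch of my own, not a real teaching tool. Each prompt changes exactly one slot, so the only computation the learner ever has to perform is agreement in the changed slot; the rest of the sentence is copied through unexamined.

```python
# A "New Key" substitution drill as slot-filling: each prompt replaces one
# slot, and the only "thinking" needed is verb agreement with the subject.

def conjugate(subject, verb):
    """Crude English present-simple agreement, enough for this drill."""
    third_sg = subject in {"My aunt", "My mother"}
    return verb + "s" if third_sg else verb

def run_drill(start_subject, start_object, prompts, verb="buy"):
    subject, obj = start_subject, start_object
    lines = [f"{subject} {conjugate(subject, verb)} {obj}"]
    for slot, value in prompts:       # each prompt changes exactly one slot
        if slot == "subject":
            subject = value
        else:
            obj = value
        lines.append(f"{subject} {conjugate(subject, verb)} {obj}")
    return lines

drill = run_drill("My aunt", "hats", [
    ("subject", "My mother"),
    ("object", "trousers"),
    ("subject", "My parents"),
    ("subject", "My cousins"),
])
for line in drill:
    print(line)
# My aunt buys hats
# My mother buys hats
# My mother buys trousers
# My parents buy trousers
# My cousins buy trousers
```

Notice that meaning never enters the program at all: the learner's task here is exactly what the code does, a copy plus one mechanical agreement rule.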

Substitution drilling is, as I said, an extreme example, but I do feel it is useful in establishing a principle that affects a great deal of learning, even where the effects are not so obvious.

Consider, for example, the fairly established and mainstream idea of focusing on a particular grammar point in some particular lesson, or section thereof. If I am set a dozen questions all of which involve conjugating regular Spanish -er verbs into the present simple third person singular (or whatever), then I do not need to attend to the meaning of the present simple third person singular, just the form -e.
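The point is easy to see in code: a whole worksheet of such items can be answered by blind string manipulation, with no reference to meaning anywhere. A toy sketch (my own illustration, with an invented verb list):

```python
# Answering "regular -er verb, present simple, third person singular" items
# is pure string surgery: strip the -er, append -e. Meaning never features.

def third_person_singular(infinitive):
    assert infinitive.endswith("er"), "regular -er verbs only"
    return infinitive[:-2] + "e"

worksheet = ["comer", "beber", "aprender", "correr", "vender"]
answers = [third_person_singular(v) for v in worksheet]
print(answers)  # ['come', 'bebe', 'aprende', 'corre', 'vende']
```

A student can score full marks on the exercise while attending to nothing but the ending, which is precisely the problem.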

To me, attending to meaning is the single most important matter when it comes to language learning, and yet it is rarely explicitly discussed. Instead, it is typically wrapped into a specific embodiment of the principle. Krashen's comprehensible input hypothesis suggests we learn language by understanding input; the communicative approach says we learn when we use language to solve problems; Total Physical Response says language has to be tied to physical action. All of these are attempts to address the meaningfulness of language, but each is a narrow, specialised form of attention to meaning. CI and TPR deny us access to the colourfulness of abstract language with its subtle, personal meanings, and the CA doesn't do much better in that regard: while modal language may be taught in a communicative classroom, the nature of the task implies a rather pragmatic, utilitarian meaning, so there isn't really any meaningful difference between blunt orders like give me it; plain requests (can I have...?) or those indirected with a conditional mood (could I have...?); and statements of desire, whether declarative (I want...) or indirected further in the conditional (I would like...).

Other teachers take the idea of "personalisation" and raise it above all other forms of meaningfulness, insisting that students only learn by inventing model sentences that are true for them. But isn't (e.g.) I want it, but I don't have it true for everyone? Does the brain not immediately personalise a sentence such as that? (When I came across a similar sentence in the Michel Thomas Spanish course, I was cast back to throwing coins in a wishing well as a child.)

Perhaps the reason few writers wish to discuss attention to meaning is that it throws up a lot of questions that often fundamentally challenge their methodologies. For example, comprehensible input (and any similar learn-by-absorption philosophy) is confounded by redundancy in language – there is no need to attend to the meaning of every morpheme when the same information is encoded twice in the sentence. Perhaps the clearest example of this is the present tense -s suffix for English verbs (third person singular, i.e. he/she/it). It is readily apparent that a lot of learners do not pick this up, and there are a great many foreigners who spend years in English-speaking countries, hearing thousands of hours of input with the correct form, but who never pick it up. There is no need for the learner to attend to the meaning of the -s, because they already know from the subject of the sentence that it's the third person singular being discussed. When they speak, they are understood, because even though it sounds incorrect to a native speaker, there's practically no risk of being misunderstood. In the communicative approach, such an error is not a barrier to completing the intended task (particularly seeing as there's a good chance your conversation partner will make the same mistake, given that everyone in your class is a learner), so there is no requirement to attend to it.

Language has a natural tendency to redundancy, in order to make our utterances easier to understand; we are naturally disinclined to attend to every element of the sentence. Therefore any attention to the meaning of all the individual components of an utterance will be a higher-order process, a conscious or semi-conscious overriding of our lower instincts. Surely that makes it an explicit process? And if it is an explicit process, surely it is better for it to be directed by an expert (the teacher) than carried out on an ad hoc basis by a non-expert (the learner)?

02 January 2016

Why I learn languages, and why I'm not learning any languages

I've probably said before, but the reason I like learning languages is because of the look on people's faces when I speak their language to them.

What I didn't realise was how far back this went.

My dad was a teacher at my high school, and I knew the main janitor before I knew most of the teachers. Lucien, the janny, was from France, and by a couple of years into high school I would say "Bonjour" to him - I couldn't say much else, but it was enough.

When my dad started talking about him this Christmas, the main memory in my head was just a smile - the delight of being spoken to in his own language, even for a moment. It was the same sensation that I've seen so many times since, and I started to wonder why it had taken me so long to start learning languages properly, and then I realised that I've stopped learning again.

I remember that all through my late teens and early twenties, I was keen on the idea of learning languages, and I picked up a couple of books here and there but never got anywhere. I only started to get the proper motivation back when I started speaking broken high-school Italian to a young woman serving in a local sandwich shop. Again I tried picking up the old books, and again I put them down.

As it turns out, restarting a half-forgotten language was really hard - if you attempt to read notes on things you already sort of know, you switch off, so that's when I switched to new languages: Spanish and then Scottish Gaelic. After that, returning to tidy up my French and Italian was a lot easier.

But right now, I'm not really learning, or relearning, or even consolidating anything. Why not? Maybe it's because I can already give lots of people the satisfaction of hearing their own language. More likely, though, it's just because I'm not meeting enough non-English-speaking people. That would be understandable, I suppose. Last summer I moved to an island off the north-west coast of Scotland, where I'm studying full-time, and there aren't many foreigners in the area at all.

What there is, though, is a lot of native speakers of Gaelic, and I just keep falling back to English.

Am I just being lazy? Or is my brain overworked? Or am I just being antisocial?

Probably a bit of each. I really want to get back on track this year, and start learning something new. To that end, I'm starting to plan my summer holidays now -- an epic cycle journey across part of Europe. I need to take in at least one area which requires a new language, and at the moment, I think Germany fits the bill. I've already got a solid basis to build on, so I just need to build fluency and vocab and see where I can get to.

Even just thinking about it, I can start to feel some of the anticipation building.

19 August 2015

The folly of trying to pronounce a place "like the natives do"

Well-intentioned people often insist on trying to pronounce placenames in the "authentic" native form, even when there's a well-known variation in their own language.

At times, that change is remarkably successful, such as the change of "Peking" to "Beijing". There is an argument that this is futile, however, as the Chinese phonemes are rarely that close to English ones, and the tones of the Chinese are completely absent.

About a fortnight ago, the TV was marking the 70th anniversary of the brutal slaughter of the people of Hiroshima.

Now, most of us say "hiROshima", but a few people say "HEEroSHEEma". I thought about it a bit, and I figured that the first one is probably right, as the second one sounds like two words. I then looked it up on the internet, and felt a bit sheepish when I read that the name means "wide island" in Japanese. Two words? Oh. But then I brought up Forvo and nope -- it's pronounced as one word.

So why do we end up with two forms in English?

It's all about perception. There are multiple things that you might detect. First up, there's word stress. In Japanese, Hiroshima is stressed on the second syllable, which is how I pronounce it in English. However, a knock-on effect of English stress is that adjacent syllables are weakened, so both "i"s become schwa-like in English. In Japanese, by contrast, vowels are generally clear, and vowel reduction is a matter of length, not vowel quality.

When the English speaker's ear hears Hiroshima, it either notes the correct stress, and fails to perceive the "EE" sounds, or it hears the EE sounds and fails to perceive the correct stress.

Which of these is further from the original? From an English speaker's perspective, it's impossible to say -- you need to make reference to the original language. I do not know for sure, but as Japanese has far fewer vowels than English, I would imagine hiROshima is readily recognised for the intended meaning, and that HEEroSHEEma would be pretty hard to process.

So it's a bit of a fool's errand trying to be "authentic", in my book.

30 July 2015

Language following

Last week, I was at a party in Edinburgh to mark Peruvian independence day. As I was leaving, I heard someone refusing a drink because "tengo que manejar" -- "I have to drive".

Funnily enough, I've had a couple of discussions recently about that very word "drive". It all started with a discussion on a Welsh-language Facebook group. The traditional word presented there was gyrru, whereas people often tend to use the term dreifio, presented there as an Anglicism. Strangely enough, the very next day, I ran into an old university classmate of mine, Carwyn, who was up from Wales to visit a conference in Edinburgh. When I asked him which word he would use to say "drive", his answer was "probably the wrong one", which I immediately took to mean dreifio.

I explained to him why I felt that dreifio was less of an Anglicism than gyrru.

How so?

This is a phenomenon that I call "dictionary following", for want of a better term. (If there's a widely-accepted alternative name, please do let me know in the comments.) It's a peculiar form of language change that minority languages seem particularly prone to undergoing, where a word-form in one language gets locked to the changing meaning of a single equivalent in another language.
Edit: An Cionnfhaolach over at IrishLanguageForum.com tells me that this transference of all meanings for a word in one language to a similar word in another is called a "semantic loan".

In this case, the dictionary word gyrru is a word that means to spur animals onwards -- it's "drive" as in "driving cattle": what drovers do. The modern sense of "drive" comes via the idea of forcing carthorses forward, and thus the English word has broadened.

Across Europe, the equivalent word often evolved analogously. The French and Italian equivalent term is actually to "conduct" a car, and in Spanish, you either "conduct" or "handle" your car -- which is where manejar comes into the equation (manejar = manage = handle; mano = hand).

It's too easy to focus on the grammatical and lexical items as being the characteristics of a language, but if that is not underpinned by idiomatic usage and unique linguistic metaphors, then it doesn't feel like a complete language; and for me at least, much of the joy of learning and speaking that language is lost.

So for me, I'm happier to adopt the English "drive" morpheme into languages like Gaelic and Welsh than to adopt the English metaphor wrapped in a native word and claim that this is somehow "purer".