Lingua Frankly<br />
I'm a language grad and a student language teacher, and in my spare time I learn languages. I have a special interest in minority languages, and as a former IT professional I am particularly interested in where human and computer meet. — Titch<br />
<h2>A cognitive failing that leads us to value the least valuable learning activities</h2>
2020-06-02<br />
<br />
So I realise I haven't written anything for a very long time indeed, and there's probably no-one reading at this point, but there's something that I feel like writing about, so I'm going to write about it, even if no-one ever reads it. Sounds fair, right? Well, as I don't hear any disagreement, I'll continue. ;-)<br />
<br />
OK, so someone in a Facebook group recently asked for pointers on creating mnemonics to help learn vocabulary in Scottish Gaelic, and I commented that it wasn't really worth doing. For one thing, mnemonics based on other languages risk encouraging bad pronunciation habits -- I've seen this in a lot of learners, who struggle to move away from their L1 pronunciation of a mnemonic to the natural L2 pronunciation. I've even met people who struggle with close cognates, and even having achieved relatively good general pronunciation in the L2 still can't manage to say (for example) <i>adopción</i> in Spanish or <i>adoption</i> in French without the SH sound of <i>adoption</i> in English.<br />
<br />
But the use of mnemonics is still actively recommended to new beginners by a lot of learners and teachers. Why so?<br /><br />My belief is that this is a misperception driven by the <i>salience</i> (noticeability, the property of "sticking out") of words learned this way. If you've ever learned any L2 vocabulary via a mnemonic, you're sure to remember at least one of them. I can tell you now that a French swimming pool is the place where you'll find (adopt French accent now) "all ze baby peess een." These words therefore stick out in your mind, and you give them much more importance than they really deserve.<br />
<br />
This doesn't only happen with mnemonics. The same salience is given to items you learn by making a mistake, like the old classic of trying to tell someone you're tired in Spanish (<i>cansado</i>) and accidentally saying you're married (<i>casado</i>). This is one of a series of classic mistakes that you make once and then never again, such as trying to say you're embarrassed in French, Spanish or Italian and accidentally saying you're pregnant; or French, Spanish or Italian people trying to say they've got a bunged-up nose and instead saying they're constipated.<br />
<br />
This sort of mistake is instantly memorable -- a great many people can tell you when and where they made that exact mistake. We learned effectively from the situation. This can lead us to assume that it's a good way to learn -- make mistakes, get corrected.<br />
<br />
I can also tell you why I know the Spanish for oats (<i>avena</i>) and for hazelnuts (<i>avellanas</i>). I tend to eat quite a lot of oats, and I can still picture to this day one small section of the small supermarket down the road from the school where I taught in Donostia, where the oats were on one shelf and the nuts were two shelves up or two down. It took me several weeks, if not months, to remember which was which. This is the sort of anecdote that superficially supports the idea that learning is most effective when it is immediately personal and relevant, but in reality it doesn't. I remember the process, but it was neither particularly efficient nor useful.<br /><br />I've heard all these claims numerous times, both from teachers and from experienced learners, and they're always backed up with little anecdotes about the time they got given condoms when they asked for jam, or similar. However, the reason that most people can list the words they learnt by mnemonic, embarrassing mistake or specific personal experience is that these are rare occurrences -- the way we learned the words gains salience precisely because these were one-off memorable events.<br /><br />This way, we manage to convince ourselves that a handful of words, possibly as many as a dozen, provide us with a model for how to learn language, when in reality they are vanishingly insignificant compared to the hundreds or thousands of words that we have picked up through more mundane processes during the course of our learning journey. As a result, we end up advising others to pursue inefficient learning techniques.<br /><br />But it's worse than that, because you can't artificially replicate the accidental embarrassing error, and you can't engineer the sort of situation that leads to truly personalised memorable experiences. 
As for word mnemonics, the fact that so much advice on mnemonics starts from the same clichéd examples shows that there really is only a very limited set of circumstances in which good mnemonics exist. This means that the advice is not simply inefficient, it's actually pretty difficult to put into practice, and there is nothing more guaranteed to mess up learners' heads than giving them advice that they literally can't follow.
<h2>Testing terminology: Formative and Summative assessment</h2>
2017-01-22<br />
<br />
I've talked about specialist terminology a few times in the past.<br />
<br />
My view (in short) is that terminology should be meaningful to be useful, but far too often teachers forget to explain why a word means what it means.<br />
<br />
For example, the terms "formative assessment" and "summative assessment" are very common. I first encountered them when studying with the OU. There was a brief explanation of how they affected us as students and they moved on. When I later started studying teaching, I was given more formal explanations, and it struck me as interesting that there was actually <i>less</i> clarity about the meaning of the terms on the teacher training courses than in general student populations.<br />
<br />
But I don't recall any of my teacher trainers ever pointing us towards the words themselves, because actually in this case I believe the simplest explanation for the concept comes from explaining the words. (If the explanations below aren't clear, please feel free to mention it in the comments.)<br />
<h4>
Formative Assessment</h4>
The term "formative" comes from the verb "to form" – i.e. to shape or to develop. Our "formative years" are the years of our youth that shape us, and make us who we are.<br />
Formative assessment is any kind of test or assessment that is designed to give the student feedback or assistance in developing their skills.<br />
<h4>
Summative Assessment</h4>
The first syllable here – "sum" – says it all. It's about adding up and calculating scores.<br />
Any assessed piece of work which is used as part (or all!) of a final course grade is a summative assessment.<br />
<h4>
A false dichotomy</h4>
As well as a lack of clarity in the description of both terms, people are often confused because formative and summative assessment are often presented as though they are mutually exclusive categories. However, if we look at the actual definitions, we find that while some assessments fall into one category or the other, many fall into both.<br />
<br />
For example, when I took my first degree, the typical structure of assessment was made up of three elements:<br />
<ol>
<li>Weekly homework that would be marked in class. Our results would be a reference for ourselves as to what to work on during our study out of class, and the tutors would specifically address as many of our problems as they could in the class. No marks would be recorded.</li>
<li>Monthly assignments that would be marked and returned with relatively detailed feedback as to our strengths and weaknesses. Some of the same material might be covered in future assignments or in the final exam, so it was a good idea to read the feedback very carefully. Marks would be kept and would make up 50% of our final mark.</li>
<li>An exam. No feedback would be given, just a numerical score. This was 50% of our final mark.</li>
</ol>
Looking at them, number 1 is clearly formative (feedback, no grade), and number 3 is clearly summative (grade, no feedback), but number 2 has elements of both. Indeed, number 2 <i>is</i> both formative and summative, and most continuous assessment fulfils both roles. <br />
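As an ex-IT person, I find it helps to think of this as two independent properties rather than two categories. Here's a minimal sketch of the three assessment types above in Python -- the class and field names are my own illustrative inventions, not anything from an assessment framework:

```python
# Modelling assessments by their *functions* rather than as two
# mutually exclusive categories. Names here are purely illustrative.
from dataclasses import dataclass


@dataclass
class Assessment:
    name: str
    gives_feedback: bool        # the formative function
    counts_toward_grade: bool   # the summative function

    def functions(self) -> set:
        """Return the set of functions this assessment fulfils."""
        fns = set()
        if self.gives_feedback:
            fns.add("formative")
        if self.counts_toward_grade:
            fns.add("summative")
        return fns


course = [
    Assessment("weekly homework", gives_feedback=True, counts_toward_grade=False),
    Assessment("monthly assignment", gives_feedback=True, counts_toward_grade=True),
    Assessment("final exam", gives_feedback=False, counts_toward_grade=True),
]

for a in course:
    print(a.name, "->", sorted(a.functions()))
# weekly homework -> ['formative']
# monthly assignment -> ['formative', 'summative']
# final exam -> ['summative']
```

The point of the sketch is simply that the monthly assignment sits in both sets at once, which a single either/or label can't express.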
<h4>
Conventional use of the terms </h4>
In my experience as a student in higher education, the terms are typically used in a much narrower sense.<br />
<br />
First of all, the terms are only ever used with students to describe assignments done in our own time for submission. Exams are not described as "summative assessment", even though they technically are. The term "formative assessment" is typically used to describe a sort of "practice assignment" handed in early on in the course to ensure the students understand what's expected of them.<br />
<br />
That's all well and good for labelling things in the student handbook, but the teacher needs to be working at a deeper level than that, and not thinking of an assessment as "a formative assessment" or "a summative assessment", but as an assessment that has formative and/or summative <i>functions</i>.<br />
<br />
It might seem quite a subtle difference, but I personally think it's an important one.
<h2>Don't get any big ideas: the dominance of names in social sciences</h2>
2016-09-25<br />
<br />
It has bugged me for quite some time that in language, in education and in the social sciences in general, there are certain names whose big ideas get repeated ad nauseam even once others have moved the state of the art onwards.<br />
<br />
For example, when textbooks and courses discuss formal grammars, they typically focus on Noam Chomsky's generative grammars. Chomsky's model of grammar divorced structure from meaning, which he demonstrated with the nonsense sentence "Colorless green ideas sleep furiously." Since then, however, many commentators have observed that a sense of grammaticality independent of semantics is only really found in people with training in grammar, and is not culturally universal.<br />
<br />
Take for example the early-nineties computer game <i>First Samurai</i>. Its UK developers asked a Japanese translator how the game's title would be written in Japanese script, as they wanted it for the cover. The response was that it was impossible, because you can't rank samurai. Reportedly, they then asked how you would say the "first samurai" you see in the morning -- still impossible. In the end, they had to ask for another phrase to be translated, then take the character for "first" from that and place it next to the character for "samurai".<br />
<br />
There are dozens of similar anecdotes attested worldwide. In her 1978 book <i>Children's Minds</i>, Margaret Donaldson cites a report of an adventurer who asked a Native American to translate "the white man shot six bears today."<br />
<blockquote class="tr_bq">
"'How can I do that?' said the Indian. 'No white man could shoot six bears in one day.'"</blockquote>
Donaldson roundly rejects the idea that this grammaticality sense is anything more than a result of our education.<br />
<br />
The other major blow to Chomsky's model was Lucien Tesnière's valency/dependency grammars. Whereas Chomsky built his trees based on part of speech only, Tesnière identified that certain words demand the presence of particular accompanying elements, and can optionally be qualified by additional ones.<br />
<br />
Chomsky split his basic sentence trees into subject and predicate, as was the norm at the time. Tesnière instead argued for "verb centrality", putting the verb at the top of the tree. This meant that in Tesnière's model the verb had a direct link to the grammatical subject, whereas in Chomsky's it did not. It is trivially obvious that this is a superior model: with no direct link between verb and subject, Chomsky's basic model essentially claims that "*He say I does" would pass a grammaticality test. Now, I'm sure Chomsky at some point presented a roundabout argument for why that's not acceptable, but quite simply Tesnière's model was better. Tesnière's model is widely accepted, and it's key to a lot of computer-based language techniques.<br />
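To make the "direct link" point concrete, here's a toy dependency tree in Python -- a sketch of my own, not any real NLP library's API. With the verb at the root, subject-verb agreement becomes a purely local check on one edge of the tree:

```python
# A toy Tesnière-style dependency tree: the verb is the root and links
# directly to its dependants, so agreement is checked on one local edge.
# All names and the "person" feature values are illustrative inventions.
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class Word:
    text: str
    person: Optional[str] = None                      # e.g. "3sg", "non-3sg"
    dependants: Dict[str, "Word"] = field(default_factory=dict)


def agrees(verb: Word) -> bool:
    """Subject-verb agreement via the verb's direct 'subj' link."""
    subj = verb.dependants.get("subj")
    if subj is None or subj.person is None or verb.person is None:
        return True  # nothing to check
    return subj.person == verb.person


# "He sees them": 3sg subject, 3sg verb form -- passes.
sees = Word("sees", person="3sg")
sees.dependants["subj"] = Word("he", person="3sg")
sees.dependants["obj"] = Word("them")

# "*He say ...": 3sg subject with a non-3sg verb form -- fails.
say = Word("say", person="non-3sg")
say.dependants["subj"] = Word("he", person="3sg")

print(agrees(sees))  # True
print(agrees(say))   # False
```

In a subject/predicate tree the subject and verb sit in different branches, so the same check would have to climb up through the predicate node and back down -- which is exactly the indirection the verb-central model does away with.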
<br />
And yet when I studied language, lots of space was given to Chomsky, and I have no recollection of seeing the name Tesnière, or any talk of valency or dependency grammars. When I briefly studied formal grammar in computing, lots of time was given to Chomsky, and dependency grammars were mentioned only in passing. To me, verb centrality was an obvious notion, and every time Chomskyan grammar was presented to me, I wanted to put the verb at the top. It wasn't until about four years ago that I picked up a book on Computational Linguistics/Natural Language Processing and was introduced to Tesnière's theories.<br />
<br />
This is a very dangerous state of affairs -- students are being taught outdated, disproven theories instead of the current state of the art. In education, one of the best examples would be Piaget, whose theories have been proven wrong time and again, but are still one of the main focuses in most introductions to childhood development.<br />
<br />
Why? The typical answer is that to understand the current system, we have to understand the underlying theories it's built on. This, I'm afraid, is not true. Or rather, it <i>is</i> true, but the underlying theories of modern grammar are <i>derived from</i> Chomsky, and the underlying theories of modern childhood development are <i>derived from</i> Piaget -- and it is those derived theories, not the originals, that today's systems actually rest on. By teaching the original theories, we end up holding back development in the field: most courses spend so long talking about the outdated theories that they don't leave time to fully discuss the current ones, and the students leave the courses with a working model of the wrong theories. We therefore spend a lot of time debating the same things as the generation before.<br />
<br />
Certainly, we don't do this in the physical sciences. No-one would suggest that in order to learn about the big bang theory we first have to learn about the theories that predated it. Such theories are clearly of some interest, but are best restricted to specific modules on the history of science.<br />
<br />
The problem in the social sciences seems to be a reluctance to rewrite part of a major theory based on subsequent observations and refinement. It appears that to be genuinely influential in social sciences, you cannot simply do incremental improvement, and instead must write a new theory practically from the ground up. In doing so, you are guaranteed posterity, because your grand theory will continue to be published, cited, repeated and taught as is long after all the elements it is built of are individually discredited -- the field will not allow anyone else to revise it for you.<br />
<br />
Take Bloom's Taxonomy, for instance. Even when it was first devised it was a bit of a kludge. The most common form seen today is still the simplest triangle form, and all versions and derivatives still hold the same ordering of "remembering" before "understanding" -- i.e. it preaches rote learning in line with the behaviourist thinking of Bloom's day, even though no modern school of thought actively professes a belief in rote learning as a useful mechanism.<br />
<br />
The book <i>Second Language Learning Theories</i> by Mitchell, Myles and Marsden discusses the proliferation of theories in the introduction, and says "We believe that our understanding advances best when theories are freely debated and challenged among a community of scholars." I would certainly not dispute that, but I think we waste an awful lot of time when we discuss disproven theories and treat them as though they have equal merit to theories not yet disproven. We also, as I said earlier, do a lot of damage to the next generation of academics by preparing them to discuss the disproven theories rather than the current ones.<br />
<br />
Worse, though, is the effect on non-academics. Teachers overexposed to outdated theories and not familiar with current ones are unable to take advantage of advances in the field and translate them into classroom practice.
<h2>Does cross-language transfer exist?</h2>
2016-09-25<br />
<br />
<i>[I actually wrote this about 6 months ago and I'm not sure why I didn't publish it. I've read it over and it looks finished to me...]</i><br />
<br />
It's always seemed weird to me that some people claim that L2 errors are not related to the form of the learner's L1. Surely it's common sense? For instance, a speaker of Chinese, or Polish, or any other language with no articles is going to have a problem learning when to use "a", when to use "the" and when not to use an article at all in English, right?<br />
<br />
But of course, common sense is only common sense until it's proven wrong. It was once common sense that the sun went round the Earth and that there was such a thing as "races" of people -- scientific study has since shown us otherwise.<br />
<br />
So language must be open to scientific study, and teachers must be open to the results of study. And there are plenty of studies that purport to disprove the idea of cross-linguistic interference.<br />
<br />
For example, I'm currently skimming through bits of the book Understanding Second Language Acquisition by Lourdes Ortega. There's a chapter on "cross-linguistic influences" (they reject the term "interference", because they reckon it puts too much blame on the L1) and they open by looking at a couple of studies that show that the similarities and differences between languages have less influence than might be expected. The first of these is to do with the placement of negatives in Swedish, where they find that learners on the whole tend to incorrectly place negatives before the verb, even if their own language also places negatives after the verb. [Hyltenstam, K., 1977. Implicational patterns in interlanguage syntax variation. <i>Language learning</i>, <i>27</i>(2), pp.383-410.]<br />
<br />
What questions does this raise? First I would say that negation is an undeniable universal of language, so here we're talking about a linguistic concept that is already known to the learner, and the only variable under examination is word placement -- syntax. There is no examination of usage -- for example, what happens with complex clauses? In English we most commonly say "I don't think so", but some languages are more likely to say "I think not", which now sounds quite old-fashioned. It's far more likely that a native English speaker would say "I don't think he's coming" than "I think he's not coming", but "creo que no viene" is perfectly normal in Spanish. It's readily apparent that this sort of transference occurs at the phrasal level, and this is the sort of thing that is often never taught and simply left to the student to work out. It is also readily observable that learners make errors handling polysemy (multiple meanings of a single word) -- again, looking at Spanish "esperar" is to wait, to hope and to expect, and it doesn't take long in the company of Spanish learners of English to hear someone pick the wrong one of the three.<br />
<br />
<br />
I haven't had a chance to read the original paper yet (I'm not familiar with my new university's online search and I'm feeling too impatient today to start hunting) so for the moment I'm just thinking about what I'll be looking for when I get round to reading it (which might not be for a while!). First up, is there anything that goes deeper than simple word placement? Secondly, as a tangent to the point about cross-language transfer, and continuing the thread on slots, does something change when it's a multi-part structure, with the two sections separated by other language content?<br />
<br />
Besides, is negation even a fair example? My gut reaction is that in every language where I've come across a sort of "Tarzan speak", the stereotypical form is pre-verbal negation ("me no hurt pretty pretty" and the like). <br />
<br />
The second paper cited discusses object pronoun placement in French-speaking learners of English and English-speaking learners of French. [Zobl, H., 1980. The formal and developmental selectivity of L1 influence on L2 acquisition. <i>Language learning</i>, <i>30</i>(1), pp.43-57.]<br />
<br />
Again, I've not read the paper yet, but the summary in the book baffles me a bit, because it seems somewhat obvious. We have the example of the pair <i>je les vois</i>/"I see them", and it is observed that English speakers often erroneously say "*je vois les", but French speakers don't tend to say "*I them see". This is not a symmetrical error, but then, this is not a symmetrical pattern. As I see it, the difficulty for the English speaker isn't simply one of the absolute position of the pronoun, but the fact that pronouns appear in a different place from explicit noun phrases as objects -- <i>je les vois</i> vs <i>je vois des hommes</i>. The English speaker learning French sees pronouns and explicit noun phrases as subcategories of a single thing that share a position, and in order to learn French accurately must learn a fundamentally new distinction between weak pronouns and other noun phrases. The French learner of English, conversely, doesn't need to learn the English speaker's categorisation: he can maintain the French distinction and its notion of two slots, and then simply focus on the syntax, learning two different positions that happen to be the same. It feels to me that focusing on the superficial differences here fails to account for the real underlying difference.<br />
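The "slots" idea above can be sketched in a few lines of Python. This is my own illustrative model, not anything from the paper: French has two object slots selected by pronoun-hood, English has one slot used for everything.

```python
# Illustrative sketch of the "slots" analysis: French places weak object
# pronouns in a preverbal slot and full noun phrases in a postverbal one,
# while English uses a single postverbal slot for both. Hypothetical code.

def french_order(subject, verb, obj, obj_is_weak_pronoun):
    """Two slots: clitic pronouns precede the verb, full NPs follow it."""
    if obj_is_weak_pronoun:
        return [subject, obj, verb]
    return [subject, verb, obj]


def english_order(subject, verb, obj, obj_is_weak_pronoun=False):
    """One slot: every object follows the verb, pronoun or not."""
    return [subject, verb, obj]


print(" ".join(french_order("je", "vois", "les", True)))         # je les vois
print(" ".join(french_order("je", "vois", "des hommes", False)))  # je vois des hommes
print(" ".join(english_order("I", "see", "them", True)))          # I see them
```

Seen this way, the asymmetry falls out naturally: the French learner of English can keep the two-slot rule and merely learn that both slots coincide, while the English learner of French must first acquire the pronoun/NP distinction that selects between the slots at all.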
<br />
Another question this leaves in my head is about my own observations from dealing with Romance speakers. I recall hearing a lot of them dropping weak pronouns altogether, and my first reaction is that the article focuses on the lack of examples of "*I them see" as proof that there's not a problem, but doesn't comment on the presence or absence of examples of pronounless renderings (i.e. "I see", with no them). My observation from my own pupils was that several of them had learned not to place the pronoun before the verb, but if the verb in Spanish would be the last word of a sentence, they'd just stop without including the pronoun at all. I'm now left second-guessing my own recollection here, though, as recently I only recall hearing this error with Spanish people saying "I like" for "me gusta", and of course there's no explicit "it" in the Spanish, so that's a different issue (but still cross-linguistic).<br />
<br />
This is an issue that intrigues me, and I hope to revisit it during the year when I'm working on cross-linguistic issues as part of my masters. For now, though, I just wanted to get my thoughts jotted down for future reference.
<h2>The master approaches...</h2>
2016-09-24<br />
<br />
So I've just embarked on a new phase in my career, beginning a masters degree programme in TESOL. After Christmas I get the opportunity to specialise, and the plan at the moment is to specialise in computer-assisted language learning, which really is my kind of thing.<br />
<br />
I figure it's time to dust off this old blog and start using it as a scratchpad to reflect on the sort of issues that I'm dealing with on the course, and to comment on the materials I come across in my reading.<br />
<br />
It's also an opportunity for me to break the habit of a lifetime and start using proper citations and referencing on the blog, something which I hope to stick to in the future so that things I post here are better informed and therefore more useful to others.
<h2>Edinburgh's trouble with multilingual education</h2>
2016-02-27<br />
<br />
The Scottish Government has long had an aspiration to wider availability of multilingual education, and recently formalised it in the European model of 1+2. 1+2 is the idea that a child will be educated in their first language, and that during their primary schooling, they will be taught at least two additional languages; the first being introduced from the first year of schooling, the second no later than the 5th year of primary school. (Earlier draft versions of the regulations said the first additional language should be introduced no later than P3, but this has since changed.)<br />
<br />
There are several steering principles underpinning the 1+2 approach. With regard to the first additional language ("L2"), the government themselves state:<br />
<blockquote class="tr_bq">"The Working Group expects young people to continue with some form of language study in the L2 language up to the end of the broad general education, i.e. to the end of S3." [<a href="http://www.gov.scot/Publications/2012/05/3670/6">Language Learning in Scotland: A 1+2 Approach</a>]</blockquote>
Children are therefore expected to be given the opportunity for continuity of access to their L2 until around the age of 14 or 15, and it is assumed that there will be the option to continue beyond that age, subject to the usual logistical constraints around class sizes and the viability of running exam-level classes for a small number of pupils.<br />
<br />
Another of the principles is that language should not merely be taught as a subject, but should be embedded into classroom routine. There is even the hope that in the future it would be possible to offer subjects (or units within subjects) delivered through foreign languages. What could be more natural than listening to accounts from French or German WWII soldiers and civilians in their own language as part of the history curriculum, for example? It's a laudable goal, and even if we're not likely to achieve it in the foreseeable future, it's certainly something to aspire to.<br />
<br />
The government's deadline for the implementation of this policy is 2020, and several local authorities are pushing to get themselves ready ahead of this date. Last year, Edinburgh City Council <a href="http://www.bbc.co.uk/news/uk-scotland-edinburgh-east-fife-34515908">announced their intention to have the scheme implemented by 2017</a>.<br />
<br />
This too is laudable, but recent news has thrown the city council's commitment to this into doubt.<br />
<br />
Gaelic-medium education (GME) has been available since 1988, when a Gaelic-medium unit was opened within a mainstream school, Tollcross Primary. Since then, uptake of the option for GME in the city has increased year on year. Tollcross Primary is a feeder school for the city-centre high school James Gillespie's, so secondary Gaelic-medium education was implemented there. In 2013, primary GME was moved to a dedicated school on Bonnington Road in the north of the city, outside James Gillespie's normal catchment area, but JG's retained its place as the city's secondary GME facility and the new school was given official status as a "feeder primary" to the school.<br />
<br />
This year, however, James Gillespie's have found themselves with more applications for new admissions than they have capacity to accept, and <a href="http://www.edinburghnews.scotsman.com/news/education/james-gillespie-high-school-may-turn-away-catchment-students-1-4038362">the council have announced that the standard rules for oversubscription apply</a>: priority to children within the geographical catchment area and those with older siblings already attending the school. As the intake for the Gaelic primary is drawn from the entire city (and beyond), it is most likely that the pupils who lose out will be those currently going through GME. There are 24 pupils in this year's primary 7 class, and current projections see 9 of them being refused a place at JG's.<br />
<br />
The council's current proposed solution to this is to offer these children the option of attending Tynecastle High School, or the school for their local catchment area, but neither of these options fulfils the aspirations set out for 1+2, as local schools will offer these children no continuity in their L2 (Gaelic), and Tynecastle is little better. Tynecastle currently only offers Gaelic for learners, something which is not appropriate to children from a GME background. Indeed, children who have undergone three or more years in GME are not allowed to sit the Gaelic learners' exams at National 5 or above at all.<br />
<br />
Going by the council's current projections, then, we're likely to see 15 GME kids in JG's first-year intake and at most 9 in Tynecastle's. With class sizes pegged at 30, that means that we've taken one class and turned it into two, which certainly does nothing to reduce problems of capacity at either school. When we look at what that means for course choice in 3rd and 4th year, when some of the pupils may be dropping Gaelic, what are the chances that <i>either</i> school will see a continuing Gaelic class for the GME pupils as viable?<br />
<br />
This then leads on to a wider issue with GME provision at JG's. Aside from Gaelic itself, the school currently only teaches Art, RE, PE and Modern Studies through Gaelic, and currently none at a certificate level, although National 5 Modern Studies will be offered next year (<a href="http://l.facebook.com/l.php?u=http%3A%2F%2Fwww.edinburgh.gov.uk%2Fdownload%2Fmeetings%2Fid%2F49933%2Ffull_meeting_papers_-_ec_and_f_committee_-_010316&h=TAQEauNzl">see section 3.78</a>). It seems likely that these classes will not operate in Gaelic for next year's first year, as that would mean having half-empty classrooms in a school that had already turned children away for capacity constraints.<br />
<br />
Part of Edinburgh Council's justification for this decision is that:<br />
<blockquote class="tr_bq">"The level of current Gaelic provision at James Gillespie’s High School is not significant and could be relatively easily replicated, at least in part. There continue to be significant issues nationally with the recruitment of Gaelic speaking staff which limit what could actually be delivered at a secondary level, regardless of where it was provided." (section 3.75, same document as above)</blockquote>
<br />Both of these statements are true, but this is something of a question of cause and effect.<br />
<br />
First of all, the low level of Gaelic provision is down to a lack of critical mass, and dispersing the GME primary cohort across two or more high schools will certainly not resolve this. Secondly, part of the problem nationally with the availability of Gaelic-speaking staff is that they typically spend the majority of their time teaching in English, and again this stems from a lack of critical mass within the pupil cohort. If the council's actions lead to Gaelic-speaking teachers spending even less time teaching in Gaelic, then the council's justification is little more than a self-fulfilling prophecy that leads them to further squander what is already a limited resource, rendering their argument somewhat self-defeating.<br />
<br />
The lack of availability of trained GME teachers is something that is being addressed at the national level, but there's something of a chicken-and-egg situation: with the low number of classes being taught in GME at present, it is very difficult for a teaching student to gain placement experience in a Gaelic-medium setting. Depending on the subject you are training to teach and the school you are placed in, Gaelic-medium classes may be limited to BGE (the first three years) or even only the first year or two. Some subjects may not be available at all. This makes it very difficult for a new teacher to build up the confidence required in delivering through Gaelic a subject that they themselves will have learned through English. Any action at a local level that risks decreasing the availability of GME has knock-on effects at a national level that hamper our ability to address the issue.<br />
<br />
<h3>
Not just a problem for Gaelic</h3>
Many people will shrug their shoulders and say "it's only Gaelic", but they're missing the point, because at the moment it's only Gaelic that offers us a current model for language learning throughout schooling, and much of the Scottish Government's policy on language learning leans on the experience of GME.<br />
<br />
Four years away from the government's deadline on 1+2 and one year from its own self-imposed deadline, the council is already making decisions that take it further away from its goal. This does not bode well for children going through primary education in other languages, and Edinburgh council's schools will be offering a fairly broad selection (among them French, Spanish, Mandarin, Polish, Farsi and Gaelic). What happens if a child who has learned Spanish since primary 1 finds themselves allocated to Forrester High School (French and German)?<br />
<br />
This is a logistical matter that will become a serious issue for parents across the city in the next few years, but it is also an opportunity for the council to pilot a solution on a small scale and work out a strategy before it's too late.<br />
<br />
If the council can't handle the transition between primary and secondary correctly, it will turn children off languages: kids placed in a language class that is too easy for them will lose interest in languages, and kids placed in classes above their level will lose confidence in their own ability to learn.<br />
<br />
The goal of 1+2 isn't just to give kids "the right language", but to give them the right <i>attitude</i> to language, so that they can go on to be successful language learners and pick up the particular language they need later in life when the need arises. Getting the primary-secondary transition right is absolutely vital in developing this attitude, and if the council can't get this right for 24 pupils next year, how can it hope to do so for the hundreds of pupils moving into its high schools in 2017 and every year after?Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-57189787009523277752016-02-15T19:20:00.000+00:002016-09-24T12:55:18.601+01:00Implicit and explicit, meaningful and meaninglessLast week I was across in Edinburgh catching up with friends. I arrived early, so went into a bookshop to kill time... and came out with two chunky academic texts. I probably would have escaped without buying anything if one particular book title hadn't caught my eye: <i>Implicit and Explicit Knowledge in Second Language Learning, Testing and Teaching</i>.<br />
<br />
The main thrust of the book was looking into the ongoing debate as to whether implicit teaching styles lead exclusively to implicit knowledge and explicit styles to explicit knowledge only, or whether explicit teaching could lead to better implicit knowledge. It's an important area of discussion because at present the mainstream theory of language teaching holds that only implicit learning can ever lead to implicit understanding and production, and that explicit teaching only ever makes people consciously aware of rules and able to apply them mechanically and consciously.<br />
<br />
And yet there are very few teachers who <i>don't</i> include some explicit instruction in their lessons, whether that's word-lists or conjugation tables. (Even Assimil, who sell themselves on the principle of natural assimilation, dedicate more space to grammatical explanations than they do to the dialogues and their transcriptions and translations.)<br />
<br />
I haven't read the whole book yet, and (unsurprisingly) what I've seen so far is pretty inconclusive. However, it does lean towards the opinion that explicit teaching does indeed help in language mastery. It also discounts a lot of the past counter-evidence on the grounds that the models of explicit teaching used were simply bad examples: overly mechanical, rote methods that were not as "meaningful" as the implicit methods under examination.<br />
<br />
It's this word "meaningful" that I think is the crux of the problems faced in language learning – language is nothing without meaning.<br />
<br />
In the language classroom, items that seem to be inherently rich in meaning can paradoxically be rendered devoid of meaning by context.<br />
<br />
Consider:<br />
<i>My cousins buy trousers.</i><br />
<i><br /></i>
In an objective sense, it carries a lot of meaning, and there is no truly redundant information in the sentence – every word, every morpheme, brings something not explicitly present elsewhere. (But even then, it has no real personal meaning to me, as I can't imagine myself ever saying it. This is a side issue for the moment, though.)<br />
<br />
But what happens when we put that sentence into a classroom exercise?<br />
<br />
For an extreme example, let's take the behaviourist idea of substitution drills. In "New Key" style teaching, a substitution drill would be target language only, and one element of the sentence would be substituted with something else in the target language. So our theoretical exercise might go:<br />
<i>Teacher: My aunt buys hats</i><br />
<i>Learner: My aunt buys hats</i><br />
<i>Teacher: My mother</i><br />
<i>Learner: My mother buys hats</i><br />
<i>Teacher: Trousers</i><br />
<i>Learner: My mother buys trousers</i><br />
<i>Teacher: My parents</i><br />
<i>Learner: My parents buy trousers</i><br />
<i>Teacher: My cousins</i><br />
<i>Learner: My cousins buy trousers</i><br />
<i><br /></i>
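The mechanics of such a drill are trivial enough to automate, which is rather the point. Here is a minimal sketch (the function and names are my own invention, not taken from any real courseware): the only "grammar" the routine ever consults is whether the subject slot is plural.

```python
def drill(subject, obj, cues):
    """Run a New Key style substitution drill, yielding each learner response.

    A toy model: each cue replaces exactly one slot, and the verb form is
    chosen purely by checking the subject against a plural list -- no other
    meaning needs to be attended to at all.
    """
    plural_subjects = {"my parents", "my cousins"}

    def render(s, o):
        verb = "buy" if s in plural_subjects else "buys"
        return f"{s.capitalize()} {verb} {o}"

    yield render(subject, obj)  # the learner first repeats the model sentence
    for cue in cues:
        if cue.startswith("my "):
            subject = cue       # a "my ..." cue fills the subject slot
        else:
            obj = cue           # anything else fills the object slot
        yield render(subject, obj)

responses = list(drill("my aunt", "hats",
                       ["my mother", "trousers", "my parents", "my cousins"]))
```

A dozen lines of mechanical slot-filling reproduce the whole exchange above, which is exactly why a learner can complete the drill without ever attending to meaning.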
At the point of utterance, the learner does not have to pay any attention whatsoever to the meaning of anything in the sentence beyond the plural marking of <i>my cousins</i> and the s-free verb form of <i>buy</i>, which is made even easier by the fact that this plural example follows an earlier plural example. Thus the student has no immediate motivation to attend to meaning, and it is a struggle to do so.<br />
<br />
Substitution drilling is, as I said, an extreme example, but I do feel it is useful in establishing a principle that affects a great deal of learning, even where the effects are not so obvious.<br />
<br />
Consider, for example, the fairly established and mainstream idea of focusing on a particular grammar point in some particular lesson, or section thereof. If I am set a dozen questions all of which involve conjugating regular Spanish -er verbs into the present simple third person singular (or whatever), then I do not need to attend to the meaning of the present simple third person singular, just the form <i>-e</i>.<br />
<br />
To me, attending to meaning is the single most important matter when it comes to language learning, and yet it is rarely explicitly discussed. Instead, it is typically wrapped into a specific embodiment of the principle. Krashen's comprehensible input hypothesis suggests we learn language by understanding input; the communicative approach says we learn when we use language to solve problems; Total Physical Response says language has to be tied to physical action. All of these are attempts to address the meaningfulness of language, but they are a narrow, specialised form of attention to meaning. CI and TPR deny us access to the colourfulness of abstract language with its subtle, personal meanings, and the CA doesn't do much better in that regard – while modal language may be taught in a communicative classroom, the nature of the task implies a rather pragmatic, utilitarian meaning, so there isn't really any meaningful difference between blunt orders like <i>give me it</i>; plain requests (<i>can I have...?</i>) or those indirected with a conditional mood (<i>could I have...?</i>); and statements of desire, whether declarative (<i>I want...</i>) or indirected further in the conditional (<i>I would like...</i>).<br />
<br />
Other teachers take the idea of "personalisation" and raise it above all other forms of meaningfulness, insisting that students only learn by inventing model sentences that are true for them. But isn't (eg) <i>I want it, but I don't have it</i> true for everyone? Does the brain not immediately personalise a sentence such as that? (When I came across a similar sentence in the Michel Thomas Spanish course, I was cast back to throwing coins in a wishing well as a child.)<br />
<br />
Perhaps the reason few writers wish to discuss attention to meaning is that it throws up a lot of questions that often fundamentally challenge their methodologies. For example, comprehensible input (and any similar learn-by-absorption philosophy) is confounded by redundancy in language – there is no need to attend to the meaning of every morpheme when the same information is encoded twice in the sentence. Perhaps the clearest example of this is the present tense <i>-s</i> suffix for English verbs (third person singular, i.e. he/she/it). It is readily apparent that a lot of learners do not pick this up: there are a great many foreigners who spend years in English-speaking countries, hearing thousands of hours of input with the correct form, but who never acquire it. There is no need for the learner to attend to the meaning of the -s, because they already know from the subject of the sentence that it's the third person singular being discussed. When they speak, they are understood, because even though it sounds incorrect to a native speaker, there's practically no risk of being misunderstood. In the communicative approach, such an error is not a barrier to completing the intended task (particularly seeing as there's a good chance your conversation partner will make the same mistake, given that everyone in your class is a learner), so there is no requirement to attend to it.<br />
<br />
<div>
Language has a natural tendency to redundancy, in order to make our utterances easier to understand; we are naturally disinclined to attend to every element of the sentence. Therefore any attention to the meaning of all the individual components of an utterance will be a higher-order process, a conscious or semi-conscious overriding of our lower instincts. Surely that makes it an explicit process? And if it is an explicit process, surely it is better for it to be directed by an expert (the teacher) than carried out on an ad hoc basis by a non-expert (the learner)?</div>
Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com1tag:blogger.com,1999:blog-30114385.post-59651624240924844912016-01-02T20:04:00.001+00:002016-01-02T20:04:26.409+00:00Why I learn languages, and why I'm not learning any languagesI've probably said this before, but the reason I like learning languages is because of the look on people's faces when I speak their language to them.<br />
<br />
What I didn't realise was how far back this went.<br />
<br />
My dad was a teacher at my high school, and I knew the main janitor before I knew most of the teachers. Lucien, the janny, was from France, and by a couple of years into high school I would say "Bonjour" to him - I couldn't say much else, but it was enough.<br />
<br />
When my dad started talking about him this Christmas, the main memory in my head was just a smile - the delight of being spoken to in his own language, even for a moment. It was the same sensation that I've seen so many times since, and I started to wonder why it had taken me so long to start learning languages properly, and then I realised that I've stopped learning again.<br />
<br />
I remember that all through my late teens and early twenties, I was keen on the idea of learning languages, and I picked up a couple of books here and there but never got anywhere. I only started to get the proper motivation back when I started speaking broken high-school Italian to a young woman serving in a local sandwich shop. Again I tried picking up the old books, and again I put them down.<br />
<br />
As it turns out, restarting a half-forgotten language was really hard - if you attempt to read notes on things you already sort of know, you switch off, so that's when I switched to new languages: Spanish and then Scottish Gaelic. After that, returning to tidy up my French and Italian was a lot easier.<br />
<br />
But right now, I'm not really learning, or relearning, or even consolidating <i>anything</i>. Why not? Maybe it's because I can already give lots of people the satisfaction of hearing their own language. More likely, though, it's just because I'm not meeting enough non-English-speaking people. That would be understandable, I suppose. Last summer I moved to an island off the north-west coast of Scotland, where I'm studying full-time, and there aren't many foreigners in the area at all.<br />
<br />
What there is, though, is a lot of native speakers of Gaelic, and I just keep falling back to English.<br />
<br />
Am I just being lazy? Or is my brain overworked? Or am I just being antisocial?<br />
<br />
Probably a bit of each. I really want to get back on track this year, and start learning something new. To that end, I'm starting to plan my summer holidays now -- an epic cycle journey across part of Europe. I need to take in at least one area which requires a new language, and at the moment, I think Germany fits the bill. I've already got a solid basis to build on, so I just need to build fluency and vocab and see where I can get to.<br />
<br />
Even just thinking about it, I can start to feel some of the anticipation building.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com3tag:blogger.com,1999:blog-30114385.post-8635542887928101612015-08-19T19:54:00.000+01:002015-08-19T19:54:06.512+01:00The folly of trying to pronounce a place "like the natives do"Well intentioned people often insist on trying to pronounce placenames in the "authentic" native form, even when there's a well-known variation in their own language.<br />
<br />
At times, that change is remarkably successful, such as the change of "Peking" to "Beijing". There is an argument that this is futile, however, as the Chinese phonemes are rarely that close to English ones, and the tones of the Chinese are completely absent.<br />
<br />
About a fortnight ago, the TV was marking the 70th anniversary of the brutal slaughter of the people of Hiroshima.<br />
<br />
Now, <i>most</i> of us say "hiROshima", but a few people say "HEEroSHEEma". I thought about it a bit, and I figured that the first one is probably right, as the second one sounds like two words. I then looked it up on the internet, and felt a bit sheepish when I read that the name means "wide island" in Japanese. Two words? Oh. But then I brought up Forvo and nope -- <a href="http://forvo.com/word/hiroshima/#ja">it's pronounced as one word</a>.<br />
<br />
So why do we end up with two forms in English?<br />
<br />
It's all about perception. There are multiple things that you might detect. First up, there's word stress. In Japanese, Hiroshima is stressed on the second syllable, which is how I pronounce it in English. A knock-on effect of English stress, however, is that adjacent syllables are weakened, so both "i" vowels are reduced towards schwa in English. In Japanese, by contrast, vowels generally stay clear, and vowel reduction is a matter of length, not vowel quality.<br />
<br />
When the English speaker's ear hears Hiroshima, it either notes the correct stress, and fails to perceive the "EE" sounds, or it hears the EE sounds and fails to perceive the correct stress.<br />
<br />
Which of these is further from the original? From an English speaker's perspective, it's impossible to say -- you need to make reference to the original language. I do not know for sure, but as Japanese has far fewer vowels than English, I would imagine hiROshima is readily recognised for the intended meaning, and that HEEroSHEEma would be pretty hard to process.<br />
<br />
So it's a bit of a fool's errand trying to be "authentic", in my book.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com1tag:blogger.com,1999:blog-30114385.post-85276214652404619812015-07-30T16:31:00.002+01:002015-08-19T19:30:40.003+01:00Language followingLast week, I was at a party in Edinburgh to mark Peruvian independence day. As I was leaving, I heard someone refusing a drink because "tengo que manejar" -- "I have to drive".<br />
<br />
Funnily enough, I've had a couple of discussions recently about that very word "drive". It all started with a discussion on a Welsh-language Facebook group. The traditional word presented there was <em>gyrru</em>, whereas people often tend to use the term <em>dreifio</em>, presented there as an Anglicism. Strangely enough, the very next day, I ran into an old university classmate of mine, Carwyn, who was up from Wales to visit a conference in Edinburgh. When I asked him which word he would use to say "drive", his answer was "probably the wrong one", which I immediately took to mean <em>dreifio</em>.<br />
<br />
I explained to him why I felt that <em>dreifio </em>was less of an Anglicism than <em>gyrru</em>.<br />
<br />
How so?<br />
<br />
This is a phenomenon that I call "dictionary following", for want of a better term. (If there's a widely-accepted alternative name, please do let me know in the comments.) It's a peculiar form of language change that minority languages seem particularly prone to undergoing, where a word-form in one language gets locked to the changing meaning of a single equivalent in another language.<br />
<i><b>Edit: </b>An Cionnfhaolach over at IrishLanguageForum.com tells me that this transference of all the meanings of a word in one language to a similar word in another is called a "semantic loan".</i><br />
<br />
In this case, the dictionary word <em>gyrru</em> is a word that means to spur animals onwards -- it's "drive" as in "driving cattle": what <em>drovers</em> do. The modern sense of "drive" comes via the idea of forcing carthorses forward, and thus the English word has broadened.<br />
<br />
Across Europe, the equivalent word often evolved analogously. The French and Italian equivalent term is actually to "conduct" a car, and in Spanish, you either "conduct" or "handle" your car -- which is where <em>manejar</em> comes into the equation (manejar = manage = handle; mano = hand).<br />
<br />
It's too easy to focus on the grammatical and lexical items as being the characteristics of a language, but if that is not underpinned by idiomatic usage and unique linguistic metaphors, then it doesn't feel like a complete language; and for me at least, much of the joy of learning and speaking that language is lost.<br />
<br />
So I'm happier to adopt the English "drive" morpheme into languages like Gaelic and Welsh than to adopt the English metaphor with a native root and claim that this is somehow "purer".Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-49410369298261716032015-07-20T16:23:00.000+01:002015-07-20T16:23:09.257+01:00Undefined article errorNo, Blogger isn't on the blink, that's the intended title of the article.<br />
<br />
The error in question is the continued use of classical terminology for grammatical articles: specifically the terms <i>definite article</i> and <i>indefinite article</i>. For over a decade, I tried to reconcile the grammatical feature with the common sense of the words "definite" and "indefinite" -- i.e. certain and uncertain -- but it made no sense at all.<br />
<br />
It wasn't until I started discussing grammar in foreign languages that I clicked what I'd been missing all along -- the terms we use are basically a <i>mistranslation</i> of classical terminology.<br />
<br />
The English word <i>definite</i> has diverged drastically from its etymological roots, but this is not true in the Romance languages on mainland Europe. When the French say <i>défini </i>or the Spanish say <i>definido</i>, what they are actually saying is <i>defined</i>.<br />
<br />
That's right, the definite article is really the <i>defined article</i>, which means the indefinite article must be the <i>undefined article</i>. From that perspective, everything seems to make much more sense.<br />
<br />
Plenty of languages survive quite well without any articles -- they are essentially redundant as even in English, in a lot of circumstances you can drop them without losing any information in the sentence.<br />
<br />
What I'd never got my head round was that the articles don't add any information to the sentence -- they simply act as a sort of "signpost" to information that already exists elsewhere. But most importantly, they refer to the listener's frame of reference, not the speaker's.<br />
<br />
What the definite article flags up is essentially "you know which one I mean", and the indefinite article says "you don't know which one I mean". If I say "You should go home -- the wife'll be waiting," context says I'm talking about your wife, but if I say "<i>I</i> should go home -- the wife'll be waiting," then you know that I'm talking about my wife. And if I say "a friend of mine is coming to visit," I'm telling you that I don't expect you to know which one I'm talking about. But in either case, if you deleted the articles, the same assumptions would still be made: yours or mine for the wife, and, for the friend, that you can't be expected to know which one.<br />
<br />
Now I know that isn't very clear, but to be honest, I still haven't got this that clear in my own head.<br />
<br />
This "signposting" idea is pretty abstract, so describing it is pretty difficult. But to be fair, it's no more abstract than the phenomenon it's describing, and the more I think about articles, the more weird and abstract they look to me. For something that at first glance seems so basic, they are incredibly complex.<br />
<br />
I suppose I'll be working for years trying to work out the best way to teach, discuss and describe them, but for now I'll satisfy myself with using the terms defined and undefined in place of definite and indefinite, because at the very least we'll be one step closer to a meaningful definition.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com5tag:blogger.com,1999:blog-30114385.post-15940237624941723872015-04-12T11:57:00.001+01:002015-04-12T11:57:22.192+01:00I would of written this headline properly, but...I wanted to revisit an old theme today. A lot of people still complain about people writing <i>would of </i>instead of <i>would have</i>. There's a saying in linguistics: <b>there's no such thing as a common error</b> (for native speakers) because standard language is (or should be) a statistical norm of what people actually say or write, and a legitimate standard is one that accepts all common variations (hence modern English spellcheckers accepting both "all right" and "alright" -- and as if just to cause maximum embarrassment, the Firefox spellchecker doesn't like "alright"... or "spellchecker").<br />
<br />
If people write "would of", it's because in their internal model of the English language, they do not see the verb "to have" in there at all. I was looking back at <a href="http://linguafrankly.blogspot.co.uk/2011/07/common-errors-my-mistake-hmmm.html">an earlier blog post on this topic</a>, and I saw that I used the phrase "the "error" only occurs when <i>have</i> is used as a second auxiliary". Spot the mistake.<br />
<br />
Standard Modern English clauses can only ever have one auxiliary -- there is no "I will can..." or "I would can...", you either have to switch to a copular construction ("I will be able to...") or inflect, eg <i>can</i> to <i>could: I could tell him (if you want)</i>.<br />
<br />
The have of the perfect aspect in English has traditionally been slightly ambiguous as to whether it's an auxiliary or not. Placement of adverbs gives us an indication of what's going on: "I always have time for it" is fine where "*<strike>I have always time for it</strike>" feels quite odd and stilted, whereas perfect have is perfectly OK with having such adverbs after it, which makes it look like an auxiliary: "I have always been lucky".<br />
<br />
Negatives (and questions) take us further: "I don't have a car" is far more natural to many English speakers than "I haven't a car", but "*<strike>I don't have been to Russia</strike>" is clearly wrong, and "I haven't been to Russia" is the only possible correction.<br />
<br />
So, let's say that the history of the perfect-aspect-have has been one of becoming more and more like the auxiliary verbs. English has, over time, lost the ability to have more than one auxiliary verb in a clause. Those two changes, taken in parallel, mean that the construction "would have" is in the process of becoming impossible in English.<br />
<br />
What do we have instead? Well, like I said before, I see it as the formation of a new suffix, one that is applied to auxiliary verbs to indicate perfect aspect.<br />
<br />
I would argue that we already have one established, recognised auxiliary suffix in English: -ould. This first appeared as "would" (or rather "wolde"), the past form (both indicative and subjunctive) of "willan" (will). Notice that there are two changes here -- firstly the grammatical vowel change i->o (->ou), and secondly the suffixing of past D. The same two changes could in principle describe <i>shall</i> giving us <i>should</i>, even though the exact vowel change is different, but cannot account for <i>can</i> giving us <i>could</i>, as the N->L change isn't typical in English. Furthermore, it is not a commonly observed pattern for people to spell could, would and should differently. Therefore -ould must be a single morpheme common to all three words.<br />
<br />
If this is the case, then adding another suffix to that seems perfectly sensible, and we've got coulda, woulda, shoulda; or could've, would've, should've; or coodov, woodov, shoodov or however you want to write it.<br /><br />Of course, this same perfective suffix can be applied to certain auxiliaries without the -ould suffix:<br />
<ul>
<li><i>must:</i> that must've been him etc.</li>
<li><i>will: </i>he'll've been told by now</li>
</ul>
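The analysis above -- a shared -ould modal suffix plus a reduced perfective 've attaching directly to auxiliaries -- can be stated mechanically. A toy sketch (my own illustration, not a serious morphological model):

```python
def ould(stem):
    """Attach the shared modal suffix: the stems w-, sh- and c- all take -ould."""
    return stem + "ould"

def perfective(aux):
    """Attach the reduced perfect marker directly to an auxiliary."""
    return aux + "'ve"

# the three -ould modals, built from one shared morpheme...
modals = [ould(s) for s in ("w", "sh", "c")]

# ...with the perfective suffix layered on top, plus the two
# non-ould auxiliaries from the list above ("must", contracted "will")
perfects = [perfective(m) for m in modals]
perfects += [perfective(a) for a in ("must", "he'll")]
```

Two small, regular suffixation rules generate every form in question, which is rather simpler than a model in which each of these spellings hides a separate verb "have".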
And yet "<i>must</i>" is already practically dead in normal usage (we all use <i>have to/have got to</i>), leaving "will" rather isolated as the only non-ould auxiliary to take [ha]ve, so even that might slip out of usage fairly quickly.<br /><br />The case for writing "have" is purely etymological: it doesn't fit the evidence from "mistakes", and it presents a rather more complex model of the language than the alternative I present. It's a complexity that is possible, but I believe only insofar as it is a transitional form between two stable conditions. I think we should let the language take that final evolutionary step to find a stable state.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-18934819994421685452015-04-02T13:26:00.001+01:002015-04-02T13:26:29.167+01:00The UK's privatisation agenda hits immigrants and language studentsI normally try not to put too much politics into a language blog, but this time it definitely deserves it. I have never been a fan of privatising public infrastructure, as it typically shifts the burden of cost to those who can least afford it. This case is no exception.<br />
<br />
I discovered through <a href="https://commonspace.scot/articles/897/exclusive-outrage-at-uk-home-office-plans-to-devalue-scottish-qualifications">a news story shared on Facebook this morning</a> that the UK's Home Office is changing the English language prerequisites for visas. Previously, the SQA (the public sector exam board for Scotland) had an ESOL qualification that was recognised by the Home Office, but this will be struck off the list, which will now consist only of two exams -- the big ones, the expensive ones: Cambridge and Trinity.<br />
<br />
The site reporting it, having nationalist inclinations, chose to focus on the angle that Westminster was trying to undermine Scotland's education sector. As a left-leaning site, though, they failed to spot the bigger picture: this is about privatisation.<br />
<br />
The current UK government is determined to dismantle whatever public infrastructure remains to us, and leave the populace at the mercy of marketplace economics. (Which does make this a Westminster vs Holyrood issue, to an extent, as the Scottish Government is far less keen on privatisation.)<br /><br />But anyone involved in the language teaching sector will know roughly how expensive the private sector exams are, and anyone teaching English in the UK will have seen firsthand how little their students can afford these tests.<br /><br />Forcing more immigrants into expensive exams (which many criticise for not being a good measure of language ability anyway) is just making life harder for some of the most vulnerable members of our society, because make no mistake -- an immigrant <i>is</i> a member of our society, regardless of what the majority of politicians and newspapers tell us.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com1tag:blogger.com,1999:blog-30114385.post-44035874881761289762015-02-23T14:45:00.000+00:002015-02-23T14:45:34.541+00:00Well-rounded learning...?I just stumbled across <a href="http://www.theguardian.com/education/2014/apr/12/i-need-verb-tables-to-learn-spanish">a post</a> from last year on the Guardian's language learning blog. It's by a guy trying to learn Spanish using only Duolingo, and I thought his feedback was quite interesting.<br />
<br />
The author, Alan Haburchak, found himself struggling to internalise the grammar of Spanish, as there is no conscious presentation of verb tables. I found similar problems using the course for German -- not with the verbs (I'd done most of the verb system via the Michel Thomas course already), but with the articles. The declension of the articles in German is more complex than verb conjugation, because there are so many sets of overlaps. There is no marking for gender in the plural, but gender is marked in the singular. The feminine declension matches the plural declension in 3 out of 4 cases. Masculine and neuter declensions match in the genitive and dative, but not the nominative and accusative... and the neuter nominative and accusative are the same as each other, but not the masculine ones. I'll stop now because I've probably lost you, and that's only half of it. And when you've finished with the definite articles, there are the indefinite ones too, and the adjective endings, which are more complicated again. Some patterns are shared, others are distinct.<br /><br />Trying to learn such a complex and arbitrary pattern from examples is, I think, pretty much impossible. So much information is obscured (not least the gender of the noun) that you cannot generalise. The end result is an ability to read fairly comfortably, but not to reproduce.<br />
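For reference, here is the definite article table in question, with the overlaps just described checked mechanically. The forms themselves are standard German; the layout and variable names are simply my own sketch:

```python
# German definite articles by case; columns are masculine, feminine,
# neuter, plural
DEFINITE_ARTICLES = {
    "nominative": ("der", "die", "das", "die"),
    "accusative": ("den", "die", "das", "die"),
    "dative":     ("dem", "der", "dem", "den"),
    "genitive":   ("des", "der", "des", "der"),
}

MASC, FEM, NEUT, PL = range(4)

# feminine matches plural in 3 of 4 cases (all but the dative)...
fem_pl_matches = [case for case, forms in DEFINITE_ARTICLES.items()
                  if forms[FEM] == forms[PL]]

# ...while masculine and neuter match only in the dative and genitive
masc_neut_matches = [case for case, forms in DEFINITE_ARTICLES.items()
                     if forms[MASC] == forms[NEUT]]
```

Even laid out in full like this, the overlaps criss-cross in a way that no single tidy generalisation captures, which is why inducing the system purely from scattered examples is such a struggle.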
<br />
Since I started looking at tables myself, I've found the German course much easier, but still not trivial. Alan's doubt was about whether this was a general problem with "naturalistic" learning, or just his own habits formed through school language classes. I could (and maybe should) have the same doubt, but I just cannot fathom how anyone would intuit these patterns without some conscious knowledge to help them sort through the tangle.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com2tag:blogger.com,1999:blog-30114385.post-16628863592134130592015-02-19T11:38:00.002+00:002015-02-19T11:38:59.352+00:00Possessives and terminologyA couple of years ago, I wrote a post describing my objections to the traditional definitions of <a href="http://linguafrankly.blogspot.co.uk/2012/08/terminological-illogicality-possessive.html">possessive adjectives and possessive pronouns</a>. At the time, I still favoured calling what is called the possessive pronoun a "possessive adjective" (e.g. "it is mine"), and calling the possessive adjective a "possessive pronoun" (e.g. "my car") for English... in theory.<br />
<br />
However, practicality is another thing entirely: there is nothing more confusing than using the same words as someone else to mean the exact opposite, so I have never used the terms that way with students. In fact, I actively avoid using either term whenever I can, as neither seems helpful.<br />
<br />
So recently I've been working on trying to find ways to better categorise grammar, and I've settled on what actually seems like a reasonable compromise.<br />
<br />
For the possessive "adjective" of traditional grammar, I'm going with the alternative from the previous post -- the <i>possessive <b>determiner</b></i>. It aligns with this, that, a and the, so it naturally falls into the class of determiners. This doesn't mean it isn't a pronominal form -- it actually means that forms like "John's" have to be considered determiners themselves... which is entirely logical, as "John's car" is "the car that belongs to John", but "John's" has replaced the definite article; hence "John's" must be a determiner anyway.<br />
<br />
For the possessive "pronoun", I'm going with the <i>possessive <b>predicate</b></i>. I decided on this when I was thinking about it as a predicative adjective in sentences like "It is mine," or "that's yours." Of course, that's not the only situation it occurs in, which is something I overlooked a little when blogging from the unbearable heat of Corsica in the summer. There is no predicate in "I'll give you mine," or "She didn't have hers, so she took yours." But at least it gets away from the counter-intuitive implication that the other form is not a pronoun.<br />
<br />
I will continue to think about this, but if I do ever come up with a better term, I can just do a search-and-replace on what I've already got without any problems....Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-91869262742505637632015-01-30T18:21:00.001+00:002015-01-30T18:21:39.747+00:00The slow death of the MOOC continuesThis morning, I checked my email like I always do, and Coursera were plugging their latest "specialization" -- one for so-called <i>cloud computing</i>.<br />
<br />
Coursera specialisations were originally launched as a single certificate awarded for a series of "signature track" (ie paid-for) courses, but there was always a free option alongside.<br />
<br />
So I was very surprised when I clicked on the link for more information about the specialisation, then clicked through to the course, and it was only offering the $49 paid-for version. Now I did go back later and track down the free version of the course by searching the course catalogue, but the notable thing was that you can't get to the free version by navigating from the information about the specialisation.<br />
<br />
It's there -- it is -- but by making click-through impossible, they're actively trying to push people into the paid versions. This suggests that the business model isn't working, and it's not really much of a surprise -- there's no such thing as a free lunch, and the only free cheese is in the mousetrap.<br />
<br />
Some of the universities seemed to be using the free courses as an advert for their accredited courses, but it's a very large and expensive way to advertise -- teaching thousands in order to get half-a-dozen extra seats filled on your masters programme -- and so really the only way to get money is to get more of the students to pay.<br />
<br />
Is it worth it for the student?<br />
<br />Cloud Computing costs £150, and going by their time estimates, that's between 120 and 190 hours of work. The academic credit system here in Scotland says that ten hours of work is one "credit point", and there are 120 credits in a year. Timewise, the Cloud Computing specialisation is then roughly equivalent to a 15-point or 20-point course -- ie. a single "module" in a degree course. A 15-point module costs £227.50, and a 20-point module costs just over £300, so £150 for this seems like a pretty good deal. Of course, this is only the cost to students resident in Scotland to begin with, and it is controlled by law to stay artificially low -- in England, the basic rate would be £750 for a 15-point course or £1000 for a 20-point one, but many universities "top-up" their fees by half again: £1125 and £1500 respectively. And English universities are <i>still </i>cheaper than many of their American counterparts.<br />
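The comparison above can be worked through explicitly. All figures are the ones quoted in the post (the "just over £300" Scottish 20-point fee is rounded to £300 here, and the English "topped-up" rates are the basic rates plus half again):

```python
# Rough cost comparison using the figures quoted in the post.
hours_low, hours_high = 120, 190
HOURS_PER_CREDIT = 10          # Scottish credit framework: 10 hours = 1 point
credits_low = hours_low / HOURS_PER_CREDIT
credits_high = hours_high / HOURS_PER_CREDIT
# -> 12 to 19 credit points, i.e. roughly a 15- or 20-point module.

specialisation_cost = 150      # £, the Cloud Computing specialisation

module_costs = {
    "Scotland, 15-point": 227.50,
    "Scotland, 20-point": 300,          # "just over £300" in the post
    "England basic, 15-point": 750,
    "England basic, 20-point": 1000,
    "England topped-up, 15-point": 750 * 1.5,    # fees "topped-up" by half
    "England topped-up, 20-point": 1000 * 1.5,
}

# How many times more expensive each university module is than the MOOC.
ratios = {name: cost / specialisation_cost for name, cost in module_costs.items()}
```

On these numbers the specialisation runs from about one and a half times cheaper than a Scottish module to ten times cheaper than a topped-up English one -- which is the range claimed in the next paragraph.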
<br />
So the Coursera specialization could be half the price of a university equivalent, or a tenth, or even less, depending on where you live. Sounds like a good deal, right?<br />
<br />
Sadly, though, the certificates are worthless -- almost all the institutions offering courses through Coursera (and EdX, and FutureLearn) are allowed to accredit their own courses for university credit, but they choose not to. If they accredited a £30 course as university-level study, they'd be competing against themselves, and they'd kill the market for their established distance courses, and perhaps even their on-campus courses.<br />
If they can run a course for £150, is there any justification for their usual high prices? Well... yes. Coursera is on a freemium model (free for basic use, pay for "premium" services), but in reality everything on Coursera is still the "free" part of the freemium. The online-only courses are not viable for universities for a number of reasons, so it's the fully-accredited courses run by the universities themselves that make it possible for the universities to offer the cheap courses "on the side", using repurposed versions of their existing course materials.<br />
<br />
Technology and knowledge sharing can and should be used to reduce the cost of education. When I studied languages with the Open University, I looked at the cost of the course I was taking, vs equivalent unaccredited alternatives -- I could have bought equivalent books and spent more time with a one-on-one teacher than I did in group tutorials, and still only spent half of the money I did with the OU. If I hadn't wanted to get the degree, it would have made no sense at all to continue with them, but I want to teach in schools, so I need the degree.<br />
<br />
So yes, there is undoubtedly unnecessary expense in education and there's a lot of "fat" that could be trimmed away, but the Coursera model won't do it, and for now it remains something of a distraction -- a shiny object that draws our attention away from the real problems and solutions.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-13478644310813701752015-01-23T16:14:00.002+00:002015-01-23T16:14:56.428+00:00The militant wing of immersion and the examiner's dilemma.So, last time I was talking about <a href="http://linguafrankly.blogspot.co.uk/2015/01/shooting-my-mouth-off.html">a discussion I had</a> with a communicative approach hardliner. A couple of days later, I had a new student ask for exam prep classes, so I got out my exam prep materials and had a quick look over them to remind myself of the specifics of the Cambridge Advanced exam, and I very quickly remembered something else from Sunday's conversation.<br />
<br />
One of his big bugbears about the Scottish education system was that the foreign language exams all have their instructions in English. This, of course, is a natural consequence of the belief in immersion above all else -- if language must be immersive, then native-language task instructions clearly break the immersion, and therefore burst the language "bubble".<br />
<br />But here's the thing: when I prepare students for an exam, I explain the language task to them and then practise it over and over. By the time my students reach the exam, they don't need to read the instructions. Now the exams I prepare people for are international exams, so economies of scale dictate that the exam questions stay in English. My students go into the exam, don't need to spend time reading and understanding the question and can instead focus on carrying out the actual task that is set for them.<br />
<br />
But there are people who don't do a lot of preparation for exams, and will go in and need to read the task. Sometimes they misunderstand the task, which means they lose marks. A hardliner would say this is fair enough, because if they don't understand English, they shouldn't pass an English exam. That would be all well and good if every candidate faced the same test, but prepared students are not being tested on understanding the nature of the task at all, so the exam is inherently asymmetrical.<br />
<br />
Indeed, most adherents to a target-language-only method are also likely to believe in the "four skills" model of language (which <a href="http://linguafrankly.blogspot.com/2011/07/4-skills-safe.html">I don't agree with</a>, but that's not the point here), which is fundamentally incompatible with target-language-only exam instructions.<br /><br />How so? Well, if you believe that language is composed of reading, writing, speaking and listening, then it follows that you should test the four components individually. However, if you put task instructions in the target language, then <em>every</em> exercise includes a reading component, and you cannot objectively measure the students' levels in the other three skills.<br /><br />It's a dilemma I have heard discussed even at university level, and it's very much a living debate, so nobody really should be putting forward their views as though they are objectively correct, because as with everything, we can all agree that a line has to be drawn somewhere, but we all have different views on where.<br />
<br />
I personally feel that with a student cohort with a shared native language, native-language task instructions are the fairest way to ensure that students are being tested on the skills that we claim to be testing.<br />
<br />
But what about listening tasks? Should we be asking the comprehension questions in the native language too, in order to ensure that we are genuinely assessing their listening comprehension? I kind of think we should, but at the same time, it doesn't feel right. But I have personally done exam past papers with students where they have clearly understood the meaning of the recording, but didn't understand the synonym used in the answer. How can you lose a mark in a listening comprehension test for failing to understand a piece of written language?<br />
<br />
But of course, that argument does start to extend to the reading comprehension test too, because you can understand the set passage perfectly, but again have problems with the question. Here it <em>is</em> a reading comprehension problem leading to a lost reading mark, but there is still a fundamental question to answer about whether you should be setting an exam where you cannot determine the cause of the students' errors.<br />
<br />
When you think about it, though, the problem in both previous paragraphs (although only one example of the various types of errors that students might make) is not really one of listening or reading anyway -- it's a <em>vocabulary</em> problem; vocabulary, which we do not consider worthy of the title "skill".<br />
<br />
Some examiners have tacitly recognised this and stopped trying to score the "four skills" individually. The Open University, for example, has degree-level exams with only spoken and written assessment, the written part incorporating listening and reading as source material for an essay-writing task. It's a holistic approach that accepts that pinpointing why a student isn't good enough isn't really the issue -- either they are or they aren't. I was perfectly happy with the approach as a student, and I would be happy with it as a teacher.<br />
<br />
Language is certainly too complicated for us to ever hope to devise a truly fair and balanced way to assess student attainment, but the current orthodoxy has tied itself in something of a knot trying to reconcile two competing goals. So are we offering immersion, or are we assessing the skills?Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-64415411485273385292015-01-18T22:39:00.000+00:002015-01-18T22:39:24.268+00:00Shooting my mouth off...A student of mine invited me round to his flat for Sunday lunch -- a big traditional paella, which was absolutely delicious. Now this isn't twitpic, so I'm not going to bore you with a photograph hashtagged #nomnomnom -- no, I'm more interested in a discussion I had.<br />
<br />
Regular readers know I can be more than a little opinionated at times, and I'm not afraid to disagree with people, so when I met another teacher shortly after I arrived, the conversation quickly turned heated.<br />
<br />
First, he asked where I taught. I said I was teaching privately because I don't like the way things are done in schools. He asked what I meant, and I explained that I don't like mixed native-language groups, because the problems a Spanish person has with English are completely different from those a Polish person has (a stereotypical TEFL class in Edinburgh is composed of one Polish person, one Italian, and then a whole pile of Spanish people). Targeting lessons at resolving student problems is then really difficult.<br />
<br />
He had a bone to pick with that: "mixed-ability is real life; there are no homogeneous groups in the real world."<br />
<br />
Mixed ability is real life, true, and you will never have a truly homogeneous class group, true. However, this argument doesn't hold up to logical analysis -- a simple <i>reductio ad absurdum</i> (note: this is not a strawman) is enough to cut it down: we do not mix absolute beginners and advanced students, therefore we all draw a line somewhere on the scale between completely homogeneous and completely mixed; every teacher sees that line as falling somewhere different, and "no such thing as a homogeneous group" is no more a justification for his chosen line than it is for mine.<br />
<br />
The next thing he said was quite interesting, and certainly bears reflecting on. He suggested that my desire to teach students in more uniform groups was not respecting their individual needs. It's an interesting viewpoint. He felt that teachers propose homogeneous groups in order to reduce individual differences, so that they can give a single lesson without worrying about addressing individual needs. This may be true of some teachers, but it is not true of me. Personally, I find that in a homogeneous group I can predict individual needs better, because most Spanish speakers I've met have exactly the same problems... which means they are not "individual problems" at all. If I eliminate all the group-wide problems early, then I can really deal with the genuinely individual problems as they come up.<br />
<br />
He wasn't convinced... far from it. Now he objected that I was talking about "accuracy" when... (wait for it!)... "communication is the important thing." Oh dear -- my least favourite meaningless statement. I had used an example of a particular error that a lot of Spanish people make: even if they normally remember to put the adjective before the noun (eg "a pretty girl"), when they qualify the adjective, it tends to migrate to after the noun (eg "*<strike>a girl very pretty</strike>"). He (quite correctly) responded by saying that this type of error does not interfere with communication. However, just as with mixed-ability groups, there must be a line somewhere, and inductive logic allows us to generate the incomprehensible:<br />
<ol>
<li>the US colloquial term "purdy" is readily understood to mean "pretty"</li>
<li>foreigners who have difficulty pronouncing /<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">ʌ/ will usually be understood if they instead use /</span><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">æ/</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> It therefore follows that saying "pardy" instead of "pretty" will be understood.</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Except, of course, it doesn't. Language has evolved to contain a lot of redundancy, and one or two steps of difference are acceptable, but the effect of errors on communication is cumulative, and there is a critical mass at which the information finally gets lost (as well explained by Claude Shannon's information theory, in particular the noisy-channel coding theorem).</span><br />
<br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Let's continue the induction.</span><br />
<ol start="4">
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">"a girl very pretty" is understood to mean "a very pretty girl"</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> "a girl very pardy" is understood to mean "a very pretty girl"</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Oh, and swapping T and D isn't normally a problem either, eg "breat and budder" instead of "bread and butter"</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> "a very party girl"</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Three errors, and the meaning is gone. But is this because it's just a phrase and not a sentence? Let's add in another "insignificant" error that's common in the English of Spanish speakers and get ourselves a sentence to look at:</span><br />
<ol start="8">
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">dropping a subject pronoun that can be inferred from the context, though incorrect in English, does not hinder comprehension. eg "Last night, met a very pretty girl" instead of "Last night, I met a very pretty girl."</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> "Last night, met a girl very party"</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Four errors, and not a lot of meaning left. You might just get it, but it's going to be an effort to understand.</span><br />
<ol start="10">
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Superfluous "the" added to "last night" doesn't interfere with comprehension. (In many sentences, this is true.)</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> "The last night met a girl very party"</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">And of course Polish and Chinese people have a tendency that we can add in here</span><br />
<ol start="12">
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Dropping of "a" or "the" doesn't usually interfere with comprehension -- eg "give me apple" can easily be understood as either "give me an apple", "give me the apple" or even "give me apple" from the context</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">=> "The last night met girl very party"</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Oh yes, and I've still got some Ts I could make into Ds</span><br />
<ol start="14">
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">The lazd nighd med girl very party</span></li>
</ol>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Notice how the first two Ds do nothing to interfere with understanding, as you would still recognise "last night", but it doesn't make sense to accept it in this situation as that sets up a habit that will also affect words and phrases that aren't ambiguous.</span><br />
<br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">The third D now makes things more difficult. Is that "met" or "made"? Or maybe we're talking about a "med girl", ie. a student doctor.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Can we agree that <i>the lazd nighd med girl very party</i> is not comprehensible? I hope so.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">But let's rewind and look at all the individual sentences we can make with one single error:</span><br />
<ul>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Last night, I met a very purdy girl</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Last night, I met a girl very pretty</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Last night, met a very pretty girl</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Last night, I met a very preddy girl</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">Lazd nighd, I med a very pretty girl</span></li>
<li><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">The last night, I met a very pretty girl</span></li>
</ul>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">None of these are going to cause a native speaker too much trouble, although the last one may be ambiguous depending on the context, but when we add all these errors together, the result is incomprehensible.</span><br />
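The cumulative effect can be caricatured with a toy model. This is my own illustration, not anything from the original argument or from information theory proper, and the numbers are invented: suppose each "insignificant" error independently leaves the message recoverable with probability 0.9. Comprehensibility then decays multiplicatively with the error count:

```python
# Toy model: each independent error leaves the message recoverable with
# probability p, so a sentence carrying n such errors survives with
# probability p ** n. The figure p = 0.9 is purely illustrative.
def comprehensibility(n_errors: int, p_per_error: float = 0.9) -> float:
    return p_per_error ** n_errors

# One error barely matters; by four or five the odds have dropped sharply,
# even though no single error was "important" on its own.
for n in range(6):
    print(n, round(comprehensibility(n), 2))
```

Real comprehension obviously isn't a product of independent per-error probabilities -- errors interact, as "party girl" shows -- but the multiplicative decay is the point: each step looks harmless, and the loss only shows up in the aggregate.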
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">The boundaries of comprehensibility are difficult to judge, especially for a teacher, whose own ability to understand non-native language is much better developed than that of an average native speaker, even if only due to the difference in the amount of contact time they have with non-natives.<br /><br />But</span><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"> if</span> comprehensibility can be measured, it can only be done in terms of the number and severity of errors... ie. comprehensibility can only be gauged by measuring accuracy.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">And if comprehensibility and accuracy are so tightly connected, you cannot declare that one is more or less important than the other.</span><br />
<br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">But I digress.</span><br />
<br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">The conversation moved on, and the other teacher lamented Scotland's lack of attention to and alignment with the CEFR. He was there when it started, he told me, back in 2001. Now it's been a while since I've mentioned the CEFR on this blog, but when I did, I gave <a href="http://linguafrankly.blogspot.com/2012/10/cefr.html">a fairly strong opinion</a> -- an opinion that I readily repeated to my new acquaintance... who then clarified that when he said "he was there" he didn't just mean he was teaching -- he was involved in setting it up. Ah, right.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">As it turns out, the guy isn't actually an active teacher any more either... he's now employed in his country's diplomatic service and is in Scotland to liaise locally on the teaching of foreign languages in Scotland's schools and universities.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">They say you should pick your battles carefully, and in this case I didn't -- this was a debate that could not be won. To argue against Communicative Language Teaching with someone whose entire career and professional identity was built on the championing of CLT is not going to get you anywhere.<br /><br />It's not just that I couldn't change <i>his</i> mind, though, but also that <i>he</i> couldn't change <i>mine</i>. It is very rare that I come out of a debate on language teaching without at the very least questioning myself, but that typically occurs when a question of personal style or perspective comes in. When you reconcile the personal views with the formal views, you can start to see why people believe what they believe, even if you don't share that belief. Both parties get to reanalyse themselves against someone else's frame-of-reference, and try to analyse the other in terms of their own. It offers new perspectives.</span><br />
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)"><br /></span>
<span class="IPA" title="Representation in the International Phonetic Alphabet (IPA)">But if someone is truly, deeply invested in the orthodoxy, all you will hear is the dogma. There will be nothing new -- you will have heard it all before from teacher trainers, colleagues and bosses. You will have read it in several books and magazine/web articles.<br /><br />Ah well, never mind.</span>Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com3tag:blogger.com,1999:blog-30114385.post-33168118075348816362015-01-16T15:52:00.000+00:002015-01-16T15:52:35.177+00:00Duolingo: Web 2.0, free labour and the power of ignoranceLast time I wrote anything here, I had decided I was going to get some German under my belt. So I've tried out a couple of things on the net, and I've spent a lot of time on Duolingo, which in many ways is a very good resource, but is frustrating in the way it keeps generating nonsensical phrases and fragments.<br />
<br />
Well, it turns out they recently added an interesting clause to their user agreement:<blockquote class="tr_bq">
<b>Temporary Restrictions on Users from the European Union</b><br />
Users within the European Union are not presently allowed to submit materials for translation or translated materials to Duolingo. While these users can continue to use the educational services offered through the Website, they will not be involved in the translation of any documents. If you submit a request for translation or translated materials to Duolingo, you thereby warrant and represent that you are not currently within the European Union, did not translate the document within the European Union, and will not be within the European Union when your translation request has been finalized. </blockquote>
So what's going on here then?<br />
<br />
I have always felt that most dot-com organisations run on a model that breaches workers' rights laws. In most countries, a for-profit organisation is not allowed to solicit or accept free labour, and yet a great many commercial internet sites rely on free labour for their profit.<br /><br />When YouTube first launched, all advertising revenue was kept by the site -- uploaders made no money. YouTube argued that the uploaders weren't working for the site, so didn't need paid... and yet, without the uploaders, there would be no site. YouTube changed their business model later on to grant uploaders a share of the advertising take. The reason they did this was so that they could get on board professional media (including music videos) and then also to stop the higher quality amateurs from migrating to sites that were willing to split the profits. Market forces worked in the interests of the little guy... this time.<br /><br />But what about Facebook's big translation push at the time of the public share offering? The public sale brought in enough money to translate the site into all the world's major languages several times over, and yet they did not pay a single translator, instead "crowdsourcing" the translation. It would be one thing if they had opened translation to any and all languages, but they chose the languages and were only interested in the "big" languages that would draw plenty of users and make Facebook more money.<br />
<br />
If it was small languages, I could understand: you don't want to pay for a translation to eg Irish when all the users will happily use the English version -- it doesn't make you any money. But when you're translating into Spanish, one of the world's most widely spoken native languages, you'll make your money back many times over even if you pay for one of the world's best translators.<br />
<br />
Facebook clearly thought that if so many other websites had got away with free labour, they would too, but they inadvertently brought the issue to far more public attention than they expected.<br />
<br />
You see, translators have real power in Europe. With such a linguistically diverse base, the institutions of the European Union are full of translators, which makes them one of the most powerful lobby groups you can imagine. Seriously, no-one "has the ear" of a Brussels bureaucrat quite like the person talking into that ear throughout the meetings.<br />
<br />
Now I don't recall ever hearing of any sanctions being made against Facebook for this, but the groundwork was set and Duolingo walked right into the problem, because more than any other site, their business model is built on unpaid labour... and crucially unpaid <i>translation</i>. Duolingo seeks to generate income by having learners translate documents for paying clients as part of their "immersion" in the language. Already, Duolingo is translating articles for<a href="http://blog.duolingo.com/post/64024962586/duolingo-now-translating-buzzfeed-and-cnn"> Buzzfeed and CNN</a>. Their justification is that the translators are getting something in return -- the teaching. I can see where they're coming from, to a point, but that's the same justification people try to make for internships as a source of unpaid labour.<br />
<br />
So somewhere along the line, Duolingo has been warned off and put up these "temporary restrictions"... but didn't tell anyone about it. It's there, right at the end of the Ts&Cs, but they didn't actively notify users, and there is no notice on the translation page to warn you that you might be about to do something potentially illegal.<br />
<br />
But it gets worse: not only do they leave you access to the legally questionable section, they actively encourage you to use it. I've been using the site a lot recently, and after most exercises it tells me to try the translations.<br />
<br />
Now, if you're sitting at a computer in the UK and try to access BBC Worldwide clips on YouTube, you won't get anything. Why? BBC Worldwide content is licensed for use <i>outside</i> the UK, and YouTube knows where you are. The same thing happens on plenty of sites.<br /><br />Duolingo makes no attempt to block based on location, but there is no technical reason that they shouldn't. I cannot imagine that a company their size would not be tracking user locations anyway, in order to optimise their marketing strategies and their technology. They must know. Furthermore, there is even a section in the profile (optional, admittedly) for you to tell them where you live.<br />
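For what it's worth, the gate itself is trivial to write once the location has been resolved. Here's a minimal sketch of the kind of check involved (my own illustration: the GeoIP resolution is assumed to have already happened upstream, e.g. via a GeoIP database lookup on the client's IP, and the function name is hypothetical):

```python
# Minimal sketch of an EU geo-gate, assuming the user's ISO 3166-1 alpha-2
# country code has already been resolved from their IP address upstream.
# The membership list is the EU-28 as of early 2015 (so it includes GB).
EU_COUNTRY_CODES = {
    "AT", "BE", "BG", "HR", "CY", "CZ", "DK", "EE", "FI", "FR",
    "DE", "GR", "HU", "IE", "IT", "LV", "LT", "LU", "MT", "NL",
    "PL", "PT", "RO", "SK", "SI", "ES", "SE", "GB",
}

def may_submit_translations(country_code: str) -> bool:
    """Gate the translation feature on the user's resolved location."""
    return country_code.upper() not in EU_COUNTRY_CODES
```

Geolocation by IP is fallible at the margins (VPNs, mobile networks), but "we couldn't tell where users were" is a hard claim to sustain when the check is a set lookup.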
<br />
It's a pretty stupid course of action, if you ask me. With geolocation being such a simple and standard admin task (although admittedly not 100% accurate), failure to attempt to identify and block EU-based users could be argued to be negligent. That negligence is surely made worse by the fact that they are leading their users not only to break the law (arguably -- it hasn't been tested in court), but also, indisputably, to break their own licence agreement. And all the while their negligence allows them to continue selling translations to commercial clients.<br />
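To show how little effort the kind of blocking I'm talking about would take, here is a minimal sketch in Python. Everything in it is my own invention for illustration -- the country list is abridged, the IP addresses and function names are made up, and a real service would replace the stub table with a proper GeoIP database lookup -- but the shape of the check is the whole trick:

```python
# Illustrative geo-blocking sketch. The IP-to-country table below is a
# stand-in for a real GeoIP database; nothing here is real Duolingo code.
EU_COUNTRIES = {"AT", "BE", "DE", "ES", "FR", "GB", "IE", "IT", "NL", "PL"}  # abridged

_FAKE_GEOIP = {
    "81.2.69.142": "GB",   # hypothetical UK visitor
    "8.8.8.8": "US",       # hypothetical US visitor
}

def country_for_ip(ip):
    """Return an ISO country code for an IP address, or None if unknown."""
    return _FAKE_GEOIP.get(ip)

def may_access_translations(ip):
    """Hide the crowdsourced-translation feature from users who appear to
    be in the EU. Unknown locations are let through here; a more cautious
    service would default the other way."""
    return country_for_ip(ip) not in EU_COUNTRIES

print(may_access_translations("8.8.8.8"))      # US visitor: True
print(may_access_translations("81.2.69.142"))  # UK visitor: False
```

A handful of lines like these, sitting in front of the page that serves the immersion section, is roughly all that "blocking EU-based users" amounts to -- which is why the failure to do it looks so much like negligence.<br />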
<br />
It's a dangerous path, and it could lead to a very messy end....Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-1483419388902656592014-11-27T15:43:00.001+00:002014-11-27T15:43:58.861+00:00The feeling of "I should be able to speak this..."Right now, I'm sitting in a beautiful part of the world: Franconia, a culturally and historically distinct region of Germany, mostly within the boundaries of modern Bavaria.<br />
<br />
Or to cut a long story short, I'm on holiday in Germany, and people around me are speaking lots of German.<br />
<br />
I've studied a little bit of German, having done most of the Michel Thomas course and 8 or 9 levels of the Duolingo course. When I'm in the shops, the bars and the streets, I keep hearing German, and although I don't understand it, I feel completely like I <i>could</i>. The sounds, the rhythms and even some of the words just feel natural to my ears.<br />
<br />And so I will learn German properly now. I'm not going to wait another month just for the sake of turning it into a New Year's resolution. Previously, I'd been using German as a mostly-unknown language so that I could get a feel for what Duolingo was doing (and I have lots of year-old notes and screenshots on a hard drive somewhere, prep work for a review that I never bothered to write up), so it wasn't a serious push. Time to change that.<br />
<br />
This is a feeling I've had before, and it's always been followed by an intense period of focus, because my brain is just ready to soak up all that it needs.<br />
<br />
I suppose it's a bit like cycling up a hill, and struggling up a steep bit, then hitting a milder uphill that feels almost like a downhill. You're still pedalling, still pushing yourself up, but somehow it feels effortless by comparison with the previous slog.<br />
<br />
Ich will Deutsch sprechen können.<br />(I hope that's the right word order...)Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com7tag:blogger.com,1999:blog-30114385.post-68947679414088597752014-11-12T16:15:00.000+00:002014-11-12T16:15:14.784+00:00Language tends to deteriate dramasticly.Johnson, the guy who compiled that famous early dictionary of English, once said<br />
<blockquote class="tr_bq">
Language is only the instrument of science, and words are but the signs
of ideas: I wish, however, that the instrument might be less apt to
decay,</blockquote>
There seems to be a value judgement there against the natural processes of language change, but it strikes me as far less clear-cut than others make out.<br />
<br />
Is he talking about language change in general? Was he lamenting the loss of conservative features such as "thee"/"thou" and subjunctive conjugations?<br />
<br />
Or was he talking about loss of <i>precision</i> in language, such as the change in meaning of "decimation" from "killing one tenth" via "massacre" to more general "destruction"?<br />
<br />
Either way, language does most definitely change, and the two words in the title that your spellchecker probably would say aren't words at all are good examples that I hear with reasonable frequency in my own life.<br />
<br />
I don't know how many of you will have worked out their meanings and/or origins from reading them, but please leave a comment to let me know.<br />
<h4>
Deteriate</h4>
Of the two, this is the one I hear most often: from my own mouth, from the mouths of friends, and often even on TV. You probably hear it too. It's simply a contraction of "deteriorate". It occurs in the derived noun too: "deteriation".<br />
<br />
The loss of a vowel in the middle of a word is what we call "<i><b>syncope</b></i>". Syncope is particularly common where a vowel is sandwiched between two instances of the same consonant. Here, it's the repeated R that triggers the lost syllable.<br />
<br />
A more topical example of the same mechanism is the word "quantitative", as in "quantitative easing". Listen to the news, and most reporters will pronounce it fully. In an unscripted interview, though, you may just hear "quantative". Discuss the economy in a bar, and after a couple of glasses, you'll all be saying it that way. Even that form still has two Ts, so one day it might just shrink to "quantive".<br />
<br />
What's interesting about "deteriate", though, is the /i/ sound. We haven't simply lost the "io" from deteriorate and had the Rs collide; we've lost the "or". Hmmm...<br />
<br />
<h4>
Dramasticly</h4>
(Or possibly "dramastically".) This is something I'm not aware of hearing that much. I associate it particularly with my little sister (although I'm aware that several of us in the family have said it), and a few months ago I heard it in the pub in my parents' village, so maybe it's a local thing. I'll keep my ears open.<br />
<br />
This word is a confusion between "dramatically" and "drastically", and I was always conscious of that fact. But that doesn't mean it's not a legitimate word. We have a lot of evidence of words "<i><b>falling together</b></i>" in multiple languages.<br />
<br />
For example, the conjugations of the verb "to go" in Spanish, French and Italian are a mixture of three Latin verbs: ire, vadere and (most likely) ambulare. But now they're just one word.<br />
<br />
A far more recent example of falling together is the term "nailed it".<br />
<br />
Most of us would associate that with getting something right/doing it perfectly. In that sense, it derives from the phrase "to hit the nail on the head" and evolved from saying the perfect answer to doing something really well, like "nailing" a jump at a skate park.<br />
<br />
On the other hand, we have the management version, where "nailing" something is just getting it finished. It probably derives from the phrase "to nail ((something)) to the wall". That's pretty much the opposite meaning, because that phrase is all about <i>not</i> doing things perfectly. The metaphor is a kitchen cabinet -- you don't care if the door is slightly squint, you just want it on the wall so that the job's finished and everyone can go home.<br />
<br />
Both of these long proverbial forms have reduced to the same verb, which can cause misunderstandings.<br />
<br />
This sort of change isn't uncommon, though, so you should always be careful about discounting any theory about the origin of a term because of some other theory. It could turn out that both are right....Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-77083525742827842192014-11-01T12:41:00.002+00:002014-11-01T12:41:36.293+00:00Newsflash: unquestioning praise considered harmful!I have found myself for years defending the teaching profession against what I saw as unwarranted attacks. People accused the culture of praise in modern schooling of being namby-pamby liberal nonsense. I tried to explain there was a good body of evidence behind it.<br />
<br />
But news came this week of a <a href="http://www.scotsman.com/news/education/teachers-told-praise-can-hamper-pupils-learning-1-3589631">report rubbishing the practice</a>.<br />
<br />
Perhaps my mistake was in trying to educate the lay people rather than the teachers. You see, most teachers didn't get it, not because individually they aren't bright, motivated professionals, but because the news came to them filtered through layers of management, focus groups and in-service training teams, each of which reinterpreted what the last had told them in a great game of academic Chinese whispers.<br />
<br />
The message that reached the teachers wasn't the message that the psychological studies had given, but something completely different: "Always encourage! Never criticise!" I never believed that could work. I have always hated praise or encouragement when I don't understand something. The difference between me and a "bad learner" is that when this happens, I don't lose faith in myself -- I lose faith in the teacher. (And I have lost a lot of faith in a lot of teachers over the many years I've spent in full-time and part-time education.)<br />
<br />
No, the advice was more subtle and nuanced than that.<br />
<br />
As I recall it, the observation came first with misbehaving children. It was noted in classroom observations (carefully managed studies involving timing of teacher-pupil interactions) that teachers spent a lot more time berating misbehaving children than they did praising them when they behaved well. There is a school of thought that sees most children's misbehaviour as a call for attention, and by giving children more attention for misbehaving than behaving, on a certain level you reward the bad behaviour. The theory was that by increasing contact time during periods of good behaviour, you would reinforce the fact that good behaviour leads to adult approval.<br />
<br />
But it went further than that. The observers noted that even when a misbehaving child did behave well, the teachers' response just wasn't positive at all. There was visible relief, and the teachers would actually draw attention to the child's normal poor behaviour. By doing so, the researchers claimed, they were establishing the teachers' low expectations, and undermining the pupils' confidence.<br />
<br />
Now, can anyone really argue against the idea that we should show kids that we appreciate their good behaviour? Does anyone think that showing a kid that us adults have identified them as a "problem child" has any possible benefits? I doubt it.<br />
<br />
So far, so uncontroversial.<br />
<br />
But the follow-up to this was that researchers identified similar patterns with children who didn't necessarily misbehave, but just weren't doing well: criticism for getting it wrong, and implied criticism on the occasions they did get it right.<br />
<br />
I still think we're in pretty uncontroversial territory here, because the advice is still pretty straightforward: when an underperforming child finally answers a question right, don't say "Thank God! At last you've got one right!"<br />
<br />
And, in fact, the most uncontroversial advice from the experts was to <i>smile as you say it</i>, because they saw teachers who <i>never</i> smiled at a correct answer from an underperforming pupil.<br />
<br />
The experts were not calling for uncritical, undeserved praise.<br />
<br />
How does that translate into the classroom?<br /><br />Well, a couple of years ago, I was teaching English in a French university. The French (like the Italians and the Spanish) believe themselves to be incapable of learning languages. I also had the challenge of an <i>extremely</i> mixed-ability first year group in the law faculty, everything from people with no previous experience of English to people I could have sat and talked to for an hour or two without problems. Imagine trying to teach a room of 20+ in that situation. It was not fun.<br />
<br />
Often trying to get answers felt like trying to get blood from a stone, and when I finally got a good answer from some of them, the relief was palpable...<br />
<br />
... and that was it. I had caught myself falling into the trap that the experts warned about. My attitude was more or less "why couldn't you have said that in the first place?!?" and it was all too obvious. The students were reluctant to give answers, and when they <i>did</i> give them, I did nothing to bolster their confidence.<br />
<br />
What I had to do wasn't a simple matter of giving out "well done" stickers, but a complete change in <i>my attitude to the students</i>. I was seeing them as obstacles, as problems, when the problem was the circumstance, <i>and my reaction to it</i>. It was easier to see this as a situation that I could do nothing about than to actually do something.<br /><br />So I set about the task of finding material that was suitable for everyone (and to a great extent succeeded), but more importantly, I changed my attitude to my students. Instead of feeling relieved when I finally got the correct answer, <i>I felt happy</i>. Instead of dropping my shoulders and saying "why didn't you say that earlier?" I smiled and said "of course! I told you you knew it"... until I had built such a rapport with them that I could start dropping my shoulders again and saying "why didn't you say that earlier?"<br />
<br />
Which brings us back to the report, and the suggestion that invariable praise projects low expectations onto the students.<br />
<br />
In the end, whatever I did was projecting my expectations onto the students. I always had higher expectations of the students than they had of themselves. But my projection had to satisfy two criteria for the students to accept it: it had to be realistic based on their ability, and it had to be close to their own expectations.<br />
<br />
If a student's confidence is five steps behind their ability, there's no point in projecting a confidence that matches their ability -- you have to project one that's one step ahead of theirs, and slowly bring up their confidence.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com3tag:blogger.com,1999:blog-30114385.post-21876105740549230062014-10-05T13:58:00.001+01:002014-10-05T13:58:50.692+01:00What's it like to lose language?I recently started an <a href="https://www.futurelearn.com/courses/play/">online course about children's play</a> with FutureLearn. One of the optional readings for week 1 was an <a href="http://www.journalofplay.org/sites/www.journalofplay.org/files/pdf-articles/3-3-interview-ruth-codier-resch.pdf">interview with a practising psychologist who had a stroke</a> [journalofplay.org], triggering aphasia from which she has never fully recovered.<br />
<br />
There's a lot of intriguing ideas in there, although it's a very long article and it wasn't interesting enough for me to get to the very end, but I figure a few of my regulars would enjoy it anyway.Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com1tag:blogger.com,1999:blog-30114385.post-13463322482865009342014-09-30T11:46:00.000+01:002014-09-30T11:46:01.542+01:00Language shaped holes: looking back at my Gaelic learningAfter my first couple of intensive (week-long) courses in Scottish Gaelic, I developed an analogy for how I thought language learning worked. Classes, I decided, don't put the language in your head, but instead drill "language-shaped holes" that you can later pour stuff into and, like jelly in a mould, what you get in the end is language.<br />
<br />
A lot of my school-level language learning had worked that way too. I studied stuff, but really had no conscious command of most of it; at some point later, though, I mastered it through a combination of self-study, practice and exposure. Adherents of the "input hypothesis" would say that only the last one counts, but I don't buy it.<br />
<br />
The process I went through with Gaelic was all about production. The first courses I took favoured production over input (the teachers were kind of old-school) and outside of those courses, I took part in discussions on internet forums and at a conversation circle.<br />
<br />
My strategy at the conversation circle was to read a little of a coursebook every week before going. The book was ordered grammatically, rather than the thematic units of modern TY/Colloquial and the like. This meant that I could pick a feature, read it up, and then practise it as much as possible during the hour-or-so of conversation.<br /><br />Quite often, these were features that I'd studied in class but forgotten, and I don't think the book explanations alone would have done the trick.<br /><br />I kind of went blank on the conditional for a long time, but I started noticing other people use it at the conversation circle. But I could not have noticed if there wasn't a conditional-shaped hole in my head.<br />
<br />
Since then, I've decided that language-shaped holes are not the optimum manner of teaching, but as suboptimal goes, they're pretty good....Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0tag:blogger.com,1999:blog-30114385.post-37839825538320186992014-07-19T10:48:00.000+01:002014-07-19T10:48:05.438+01:00iPad for teachers? No thanks.My laptop is stuffed. There's some kind of fault in the power circuit and it keeps refusing the mains power. As the battery is nearly exhausted and holds about twenty minutes of charge at most, it's basically unusable.<br />
<br />
As the problem worsened, I slowly migrated as many of my daily activities as possible to my iPad. First it was web browsing, then email, and eventually I bought a Python programming environment and the Pages word processor so I could continue developing software and producing material for my students.<br />
<br />
There are lots of articles out there that will tell you how wonderful the iPad is for teaching, but these are often little more than superficial lists of frivolous apps for presentations, flashcards and the like.<br />
<br />
For a language teacher, there are more fundamental features of the iPad that are instantly a problem: audio and video handling, and file access.<br />
<h4>
Sound files? No thanks!</h4>
<div>
I wanted to do an exam simulation using one of the practice papers at <a href="http://www.cityandguildsenglish.com/">www.cityandguildsenglish.com</a>, so I downloaded the paper, the answer scheme and the listening transcript onto my iPad. But all this became a bit futile when I rediscovered that the iPad will not let you download MP3s from websites, preferring to force you to use either the iTunes store or the iTunes app. With the files not being available on iTunes and my PC out of action, I have no way to get any audio or video files I need onto my iPad. Now for listening exercises I am forced to fall back on a rather old Android phone, as it allows me to download anything I want.</div>
<h4>
Why would you want to access your own files?</h4>
<div>
Apple have gone out of their way to prevent the iPad being a computer. In one respect, it was a clever design decision: rather than having the abstract concept of "a file", most file types exist as documents within their respective applications. There's less confusion for the user and less danger of malicious or faulty software interfering with the files from other applications.</div>
<div>
<br /></div>
<div>
However, in my current job, I don't do my own printing and photocopying, so I'm always sending multiple worksheets to the course secretary. Without file browser access, I'm currently restricted to going into individual applications, and using the "share" function on individual files to send them as emails. Where once I had one email with 8 attachments, now I have 8 emails with 1 attachment each. This makes life hard for both me and the secretary, as there is a very good chance that one of us will forget something.</div>
<h4>
Overall</h4>
<div>
Feel free to tell me about the latest app that has made your life so much easier, but I will never be able to advise other teachers to use a device that complicates the very basics of digital technology for teachers. Most of those apps, or close equivalents, will be available for Android anyway, and Android gives you the power to do what you like with your own data.</div>
<div>
<br /></div>
<div>
Not only that, but the iPad is actually massively overpowered for the basic functions we teachers need (have you seen the complexity of some of the games?), so you're paying for more than you need.</div>
<div>
<br /></div>
<div>
Buy a cheap Android tablet instead -- it'll save you money and time.</div>
Titchhttp://www.blogger.com/profile/03003350618976942468noreply@blogger.com0