
31 October 2012

Québécois: a question of prestige

Recently, several of the other blogs I've been reading have mentioned a new book, Le Québécois en 10 Leçons, self-published on Lulu.com by Alexandre Coutu, known on how-to-learn-any-language as Arrekusu.  Now I'm pretty sure I'll be buying this book myself very soon, but right now I'm trying to stop myself as I've got enough on my plate what with a busy teaching schedule and trying to pick up Corsican.

Anyway, so I decided to start looking around to see what other people's opinions were, and I came across one mother of a thread on HTLAL itself...

The thread became kind of derailed when another user, s_allard, objected to the title, suggesting that it should be "le québécois populaire" or "le québécois de rue" or somesuch.

s_allard's position can be summarised thus:
By using the term "québécois" to describe low-status speech, Coutu makes the term "québécois" into a low-status term by association.  Instead, s_allard wants to use the term "québécois" inclusively, covering the "standard" language used in higher-register situations.  Besides, many of the features Coutu highlights are going out of fashion.

It's a valid viewpoint, but it's one that I don't personally hold.  Rather, I would say that s_allard's position, while not in itself malicious, maintains an unfair distinction where one's intelligence is determined by the language one speaks.  It has been pointed out that middle-class children generally do best in school the world over, but that this is simply a language bias -- school-teachers are middle-class, therefore the school lect is the middle-class sociolect.

We have a choice of action, therefore: change the lect of the children to match that of the school, or change the lect of the school to match that of the children.  Academically, it's mostly agreed that the latter is the option that has proven most effective time and again, but s_allard's standpoint better matches popular opinion, whether you're talking about India, where many children are taught through the foreign medium of English; Scotland, where many say Gaelic shouldn't be taught "because many children can't even speak English properly" (for some undefined notion of "properly"); or s_allard's own Canada.

Personally, it's an attitude I'd want to challenge, not least because many of the notions of "proper" English and "proper" French have been well and truly disproven by corpus studies of the language.  (E.g. the statistical norm in French is to drop the "ne" particle in the negative, and the statistical norm in English is to say "can", not "may", when asking permission.)

Moreover, there's the issue of words and phrases going out of fashion.  I personally believe the examples s_allard gave are pretty misleading.  He picked English slang words with no real history, that were invented by one generation and dropped by the next.  But the "québécois" that he objects to is a historically attested form, and one that is being lost not simply due to natural language change, but by the constant imposition of the Standard French norm.

One of the books that I'm using to help me learn Corsican, le corse en 23 lettres, puts it very clearly.  The author, Ghjaseppiu Gaggioli, is a descriptivist grammarian, and in the introduction states that he doesn't want anyone to interpret his work as authoritative.  Instead, he wants to inform the reader so that the reader can make an educated choice.  Because, he says, all languages change, but for a language to stay healthy, that change needs to come from within the boundaries of the language itself.  Much of the change in Corsican today is the borrowing of feature after feature from French into the language.  Similarly, much of the change in Scottish Gaelic is the borrowing of feature after feature from English.

And of course, much of the change in Quebec French is the borrowing of feature after feature from School French*.  s_allard's approach leads to us defining "québécois" as something that every day becomes more like School French.  He wants to differentiate the language name, while differentiating the language less and less.

Coutu's approach is more like that of Gaggioli.  He wants to bring people's attention to the features that exist in their local tongues, features that they are not themselves aware of.  He mentions in the HTLAL thread that he gets people telling him "I don't talk like that," only to use the exact same word or phrase a sentence or two later.

This is completely and utterly normal, and anyone who has studied linguistics in a modern setting will have experienced a lesson where the teacher will tell them that "everybody really says X" and the student doesn't believe it.  Over the next couple of days/weeks, the student simply can't stop hearing the phrase.

A couple of years ago, I was telling my manager about how us Scottish people hardly ever say "please" -- we go into a shop and say "I'll have a ___, thanks."  That's "thanks", not "please".  He was having none of it.  He always said please...  Well, that same day he came in from lunch looking shocked.  "You're right," he said, "I just asked for a sandwich, and I didn't say please."

No-one can protect their own language until they recognise it for what it is....


* I'm giving up on the term "Standard X" unless it's a statistically-defined norm-reference.  A standard isn't a standard just because a minority of people say it should be.  "School X" is a far more accurate term.

02 March 2012

The inconsistency of prescriptivism


I was recently involved in a little "edit war" on Wikipedia (now there's something of a hyperbole -- comparing a few mouse clicks to bullets and napalm) and it provided a great example of the arbitrariness of prescriptivist grammar.

"Prescriptivism" is the view that there is such a thing as objectively "correct" language, and the practice of imposing rules on everyone's language.

It is the prescriptivists that teach us that you use "may I...?" to ask for permission to do something when in natural speech we practically always use "can I...?".  The dangerous thing about prescriptivism is that we believe the myth and spread it, even though we don't actually do it ourselves.  So a native speaker will often "correct" a learner who uses perfectly normal, natural English.  This is a Very Bad Thing, as it makes it much, much harder for the learner.

Anyway, the prescriptivist will tell us that double-negatives are very, very wrong.  Well, there was a time when they weren't, and English operated much like the rest of Western Europe, not doing nothing to nobody.  The prescriptivists got their way, though, and eventually they managed to drive the English double-negative to extinction.

Well, almost.

For some reason, many of the same prescriptivists who insist that "a double negative is a positive" (and there ain't never been no truth to none of that) will also insist that not ... nor is "more correct" than not ... or, and likewise neither ... nor over neither ... or.

Why have people in general stopped using nor?  There's a very simple explanation: we don't think in double-negatives any more -- the prescriptivists taught us not to.  Nor only ever existed as the second of a double negative.

We can split most negative pairs and have two valid modern English sentences:
I haven't never been there -> I haven't ever been there / I have never been there
I haven't done nothing -> I haven't done anything / I have done nothing

But look what happens when we try that with nor.
I had neither one nor the other -> I had neither one or the other / *I had either one nor the other

Keeping only one negative in the sentence produces two options: keeping the first negative (not or neither) gives a clear and comprehensible sentence; keeping the second negative results in a sentence which is unclear and difficult to comprehend.  But we had to lose one, because our model of English rejects the double negative.  Hence we lost nor.

Arguing against "I ain't done nothing" while simultaneously arguing for "neither... nor" is like telling everyone to take up a vegetarian diet, but saying that bacon isn't meat....

23 July 2011

In language, there's no such thing as a common error

This is a statement mired in controversy.  It wasn't me that first said it, but I agree with it... with one caveat: we're talking about native language.

For many, many years, grammarians and school teachers would hound us for saying things wrong.  As a child, I was constantly "corrected" by my mother for asking for permission with Can I...? instead of Please may I...? or for saying if I was you... in place of if I were you....

So when I studied English at university it was very heartening to find that modern linguistics considers everything that is said by a sizable chunk of the population as acceptable language.  And of course this includes both Can I...? and if I was you....

What triggered this post was seeing an article on the Register about a grammatical error in a BBC headline: Phone-hacking: the other news you might of missed.

This is one of those "errors" that's now common enough and consistent enough that we may have to stop calling it an error.

When I suggest this, people often recoil in horror.  "But it's the perfect tense," they cry, "logically it must be have."  (And yes - I know that perfect is an aspect, not a tense, but pointing that out at this point would seem like cheap point-scoring so I generally let it lie.)  But since when was language logical?  You must and You have to are logically equivalent in some usage, but when you negate them you get two very different things: you mustn't and you don't have to.

The thing is, logic aside, we have empirical evidence that shows people's brains don't see it as have -- the errors themselves stand as proof of an emerging norm.  Rather than fretting about the logic of have=perfect, we should be paying close attention to the thought processes behind this change and trying to make the way we write English match the way we speak it.

This does not mean that we have to accept might of, could of etc.  No, because there is an existing mechanism that rids us of this problem: contractions.

Contractions are mostly hated by our schoolroom English teachers, but they are gaining acceptance.  We're allowed can't now, where my primary school teacher insisted on cannot, and even I'm where my teacher insisted on I am.  Yet we're still told off by teachers and editors if we try to use could've or coulda, should've or shoulda, might've or mighta.  But these are what we say.  Our habits of speaking have gradually reduced the auxiliary have to something more of a fusional element, a suffix, than a word.  It is only when a writer is expected to write "in full words" that might've becomes might of, so why not simply accept might've?  It would eliminate both the error and the controversy, and would save several pedants a few more grey hairs....

15 July 2011

False Etymologies and Prescriptivism

Q: When is an error not an error?

A: When it's a fixed phrase.

My mother has a bit of a tendency towards linguistic prescriptivism: in her mind, some things are wrong and some things are right.  Like most of us, she can find sufficient justification.

One of her pet hates is the phrase "moment in time".  To her, this is very wrong, because it's tautologous.  What other type of moment can there be, after all?

Well, I just happened to read The Prisoner of Zenda this year, as I thought it was on the reading list for the Cambridge First Certificate in English exam, which several of my current students were intending to take.  (Special thanks to About.com for having an out-of-date and undated list of books....)

The first chapter ends as follows:
"Colour in a man," said I, "is a matter of no more moment than that!"—and I gave her something of no value.
"God send the kitchen door be shut!" said she.
"Amen!" said I, and left her.
In fact, however, as I now know, colour is sometimes of considerable moment to a man.
Clearly, "moment" here is nothing to do with time.  I figure that this sense of "moment" must be the root of the word "momentous", meaning very important.  Even though this sense of the word is now dead in common speech, "moment in time" has survived as a fixed phrase, so it is difficult to justify it as "wrong".  (See also "moment of inertia" in physics.)

Another thing my mother objects to quite strongly is the word "bloody", and the history of this one is quite fascinating.  Somebody somewhere along the line basically decided that the word was offensive (well, it has to be, doesn't it?  Common people use it!) and then looked for why it was offensive.  From there came the bizarre myth that it was swearing in the name of Mary, Jesus's mother in the high-church Christian traditions, and anyone who subscribes to a high church religion would consider that a very bad thing indeed, because Mary typifies virtue and purity.  The trouble is, there is no attested process by which "by Our Lady" would mutate into "bloody".  And even more damning -- I'm told that other Germanic languages use (or used to, at the very least) cognates exactly like we use "bloody": both as a descriptive adjective (that shirt is very bloody) and as an intensifier (that bloody shirt is bloody awful).

Quite often, these days you'll hear UK English speakers decrying "Americanisms" creeping into the language on this side of the Atlantic, but very often when you look at the historical records and listen to old recordings you'll discover that these so-called Americanisms have been alive and well in the UK for centuries.  Many of them are actually Scotticisms, borrowed into English in the US by Scots-speaking immigrant communities.  Many others are simply dialectal variation within England.  And a surprising number of them are in fact the most common form in use in the UK.

Denouncing another native speaker's language as "wrong" is very dangerous, because if you're the one who is wrong, you leave yourself looking like a prat.  And no-one wants to look like a prat.