C is for Construction

5 02 2012

Here’s a little test. Read this (authentic) text, and identify the grammar. You have one minute, starting now:

A girl was taking her little brother for a walk in the park. ‘Can I go and run along the top of that wall?’ he asked her.

‘No,’ said the sister.

‘Go on,’ insisted the little boy.

‘Well, OK,’ she said, ‘but if you fall off and break both your legs, don’t come running to me.’[1]

Ask most EFL teachers what the grammar is in that text and they will probably home in on the past continuous (was taking), modal auxiliary verbs in an inverted question-form (Can I…?), the past simple (asked, insisted, said) and some kind of conditional construction: ‘if …’.  They might also pick up on the phrasal verbs (go on, fall off), although they might not be sure as to whether these are grammar or vocabulary, strictly speaking.

These are all items that are prominent in any coursebook grammar syllabus.

But if grammar is defined as something like ‘generative multi-morpheme patterns’, and if we understand ‘pattern’ to mean any sequence that recurs with more than chance frequency, a quick Google search, or, more scientifically, a nearly-as-quick corpus search, will throw up many more patterns in this text than your standard grammar syllabus accounts for.

For example:

  • take a/the [noun] for a/the [noun] – there are over 100 instances in the British National Corpus (BNC), according to StringNet, of which around 20 are some form of take the dog for a walk
  • a walk in the [noun] – 44 occurrences in the BNC
  • a [noun] in the [noun] – 10,000 occurrences
  • [verb] and [verb], as in go and run – 82,000 occurrences, of which over 5000 start with some form of go
  • [preposition] the top of [noun phrase]  as in along the top of that wall
    • [prep] the top of the [singular N] = 1665 instances in the BNC
    • [prep] the [sing N] of the [sing N] = 60,000 occurrences
  •  [personal pronoun] + [verb] +  [personal pronoun], as in he asked her –  over 220,000 occurrences, of which 3169 involve the verb ask
  • [verb] + [subject], as in said the sister, insisted the little boy – too difficult to count, but very common, especially in fiction
  • both +  [possessive pronoun] + [plural noun] (as in both your legs): 423 examples
  • come/came etc running – 174 examples
  • don’t come running to me (a Google search returned a figure of approximately 579,000 results for this complete utterance)

This doesn’t exhaust the frequently occurring patterns by any means, but it’s enough to give you an idea of how intensely and intricately patterned that text is. Moreover, many of the patterns in my list are at least as frequent as the relatively narrow range of patterns that form traditional coursebook grammar. The pattern [preposition] the [noun] of the [noun] (as in along the top of the wall), for example, occurs about as many times per million words of running text as the past continuous does.
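
As a rough illustration of how such pattern counts are arrived at, here is a minimal Python sketch of a slot-matching search over a part-of-speech-tagged text. The toy corpus and tagset are invented for the example; this is not how the BNC or StringNet are actually queried, just the principle of matching a mixed sequence of fixed words and open word-class slots:

```python
# A toy, hand-tagged 'corpus' of (word, part-of-speech) pairs.
# Tags (invented for this sketch): P = preposition, D = determiner,
# N = noun, V = verb, PRO = pronoun, C = conjunction.
corpus = [
    ("he", "PRO"), ("ran", "V"), ("along", "P"), ("the", "D"),
    ("top", "N"), ("of", "P"), ("the", "D"), ("wall", "N"),
    ("and", "C"), ("sat", "V"), ("at", "P"), ("the", "D"),
    ("edge", "N"), ("of", "P"), ("the", "D"), ("roof", "N"),
]

def count_pattern(tagged, pattern):
    """Count occurrences of a pattern given as (word, tag) slots.
    A slot matches when the tag matches and, where a word is
    specified, the word matches too (None = any word of that tag)."""
    hits = 0
    n = len(pattern)
    for i in range(len(tagged) - n + 1):
        window = tagged[i:i + n]
        if all(t == pt and (pw is None or w == pw)
               for (w, t), (pw, pt) in zip(window, pattern)):
            hits += 1
    return hits

# [prep] the [noun] of the [noun], as in 'along the top of that wall'
prep_the_N_of_the_N = [
    (None, "P"), ("the", "D"), (None, "N"),
    ("of", "P"), ("the", "D"), (None, "N"),
]
print(count_pattern(corpus, prep_the_N_of_the_N))  # 2
```

Mixing concrete items (the, of) with abstract slots ([noun], [prep]) in one query is, in miniature, what makes these constructions countable at all.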

The range and heterogeneity of these patterns also challenge the traditional division between grammar and vocabulary, so much so that some grammarians have opted for the vaguer, but perhaps more accurate, term constructions. As Nick Ellis (2011, p. 656) puts it:

Adult language knowledge consists of a continuum of linguistic constructions of different levels of complexity and abstraction.  Constructions can comprise concrete and particular items (as in words and idioms), more abstract classes of items (as in word classes and abstract constructions), or complex combinations of concrete and abstract pieces of language (as mixed constructions).  Consequently, no rigid separation is postulated to exist between lexis and grammar.

Note that, according to this view, the pattern go and [verb] is a construction, and so is the idiom don’t come running to me, since both have a semantic and syntactic integrity that has become routinised in the speech community and entrenched in the minds of that community’s speakers. Given the first couple of words of each construction we can make a good guess as to how it will continue.

In this sense, predictive writing tools like Google Scribe, which draw on a vast database to predict the next most likely word in a string, are replicating what speakers do when they speak, and what listeners do when they listen. Rather than mapping individual words on to a pre-specified grammatical ‘architecture’ (as in a Chomskyan, generative grammar view), speakers construct utterances out of these routinised sequences – the operative word being construct. As one linguist put it, “when it comes to sentences, there are no architects, there are only carpenters” (O’Grady, 2005, p. 2).
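
I can’t vouch for how such tools work internally, but the underlying idea can be sketched with the simplest possible statistical model: count which word follows which in some training text, then propose the most frequent continuation. A toy Python version (the training data here is invented, and real systems use vastly more context than a single preceding word):

```python
from collections import Counter, defaultdict

# Tiny invented training sample; a real predictive tool would be
# trained on billions of words, but the principle is the same.
training_text = (
    "don't come running to me "
    "don't come crying to me "
    "don't come running to us "
    "come running home"
).split()

# For each word, count what follows it.
next_word = defaultdict(Counter)
for w1, w2 in zip(training_text, training_text[1:]):
    next_word[w1][w2] += 1

def predict(word):
    """Return the most frequent next word after `word`, or None."""
    counts = next_word.get(word)
    return counts.most_common(1)[0][0] if counts else None

print(predict("come"))     # 'running'
print(predict("running"))  # 'to'
```

Given the first couple of words of a routinised construction, the model – like the listener – can make a good guess at how it will continue.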

And it is out of these constructions that a speaker’s ‘grammar’ is gradually assembled. Nick Ellis again: “The acquisition of grammar is the piecemeal learning of many thousands of constructions and the frequency-biased abstraction of regularities within them” (2003, p. 67).

If this is true, what are the implications for the teaching of a second language, I wonder? Where do learners encounter these ‘many thousands of constructions’?  How do they ‘abstract regularities’ out of them?

References:

Ellis, N. 2003. Constructions, Chunking, and Connectionism.  In Doughty, C J, & Long, M H (eds) The Handbook of Second Language Acquisition Oxford: Blackwell.

Ellis, N. 2011. The emergence of language as a complex adaptive system. In Simpson, J. (ed.) The Routledge Handbook of Applied Linguistics. London: Routledge.

O’Grady, W. 2005. Syntactic Carpentry: An Emergentist Approach to Syntax. Mahwah, NJ: Lawrence Erlbaum.

Illustrations from Goldschmidt, T. 1923. English by Intuition and Pictures. Leipzig: Hirt & Sohn.


[1] Girling, B. 1990. The Great Puffin Joke Directory. London: Puffin Books.





A is for Approach

22 01 2012

A copious amount of blog ink (blink?) has been expended in the last week or so, arguing the toss as to whether – among other things – Dogme is an approach. In Neil McMahon’s blog, for example, he asks the question:

What is Dogme?  No one, even among the Dogme-gicians, seem to be able to agree on whether it’s an approach, a method, a technique, a tool, an attitude, a lesson type or an irrelevance.  And does it matter?  I think it matters if people are passing it off as something it’s not (e.g. an approach), at least to me.

At the risk of inducing another bout of blogorrhea, I thought I might try and rise to Neil’s challenge, and to do this by appealing to the literature on methods and approaches, i.e.:

Approach refers to theories about the nature of language and language learning that serve as the source of practices and principles in language teaching.

(Richards and Rodgers 2001, p. 20).

In this sense, then, it seems to me that Dogme does qualify as a coherent approach, in that it is grounded in theories both of language and of learning – theories, what’s more, that have been widely broadcast and endlessly discussed.

In terms of its theory of language,  it takes the view, very simply, that language is functional, situated, and realised primarily as text, “hence, the capacity to understand and produce isolated sentences is of limited applicability to real-life language use” (Meddings & Thornbury, 2009, p. 9).

Its theory of learning is an experiential and holistic one, viewing language learning as an emergent, jointly-constructed and socially-constituted process, motivated both by communal and communicative imperatives (op. cit., p. 18). Or, as Lantolf and Thorne (2006, p. 17) put it:

… learning an additional language is about enhancing one’s repertoire of fragments and patterns that enables participation in a wider array of communicative activities. It is not about building up a complete and perfect grammar in order to produce well-formed sentences.

Of course, these theories of language and of learning are not original: they are shared by other approaches, notably task-based and whole language learning. So Dogme’s claim to be an approach in its own right is justified only if there are in fact distinguishable (and even distinctive) practices that are derived from these theories (check the Richards & Rodgers definition again). Anyone, after all, can dream up a couple of theories, but if no one actually puts them to work, they are dead in the water.

Putting the theories to work means that (Richards & Rodgers again) “it is necessary to develop a design for an instructional system” (p. 24).

It was the lack of a ‘design’ as such, and even of ‘an instructional system’, that prompted me, a few years ago, to suggest that another self-styled approach, the Lexical Approach, was an approach in name only. In this sense, Neil McMahon’s critique of Dogme (and its ‘evangelists’) echoes my own critique of Lewis (and his acolytes). You can read it here.

My argument went like this: while it is clear that Lewis does have a well elaborated theory about the nature of language (“Language consists of grammaticalised lexis, not lexicalised grammar” [Lewis, 1993, p.vi]) it is less clear that he has a coherent theory of how languages are learned. Nor is it clear how the learning process, in a Lexical Approach, would be actualised, e.g. in terms of a syllabus and materials.

So, while Lewis insists that he is offering “a principled approach, much more than a random collection of ideas that work” (Lewis 1997, p. 205), it’s never been very clear to me how this would work in practice, or how it would not look like any other approach that just happens to have a few collocation activities grafted on.

Is Dogme any less squishy? Is there a Dogme praxis? I don’t know, but I do know that – in the last year or so – there has been a veritable eruption of blogs (too many to list here), workshops, YouTube videos, conference presentations – and even a dedicated conference – that claim allegiance to the founding Dogme principles. There are descriptions of single lessons, sequences of lessons, one-to-one lessons, computer-mediated lessons, and even whole courses. What’s more, these descriptions of Dogme practice emanate from a wide range of geographical contexts – Italy, Germany, France, Russia, Argentina, Costa Rica, Korea, Turkey, the US and the UK, to name but a few.

Of course, if you were to subject these descriptions to close scrutiny, you might find that there are as many differences between them as there are similarities. But that shouldn’t surprise you: the way that any approach is implemented – whether task-based learning or CLIL or whole language learning – is likely to exhibit a similar diversity across different contexts. On the other hand, if there were no common core of praxis, then Dogme’s claim for ‘approach’ status would, I think, be seriously jeopardised.

I believe that there is a common core of Dogme practices, but I also suspect that it is still somewhat in flux. This fuzziness (that many deplore) is both a strength and a weakness. A strength because it invites continuous experimentation; a weakness because it discourages widespread adoption. But the more that Dogme praxis is described, debated, and even debunked, the more likely it is that its soft centre will coalesce, amalgamate, stabilise and – however diverse its outward appearance – solidify into an approach.

References:

Lantolf, J., & Thorne, S. (eds.) (2006). Sociocultural Theory and the Genesis of Second Language Development. Oxford: Oxford University Press.

Lewis, M. (1993). The Lexical Approach. Hove: Language Teaching Publications.

Lewis, M. (1997). Implementing the Lexical Approach. Hove: Language Teaching Publications.

Meddings, L., & Thornbury, S. (2009) Teaching Unplugged: Dogme in English Language Teaching. Peaslake: Delta Publishing.

Richards, J., &  Rodgers, T. (2001). Approaches and Methods in Language Teaching (2nd edition). Cambridge: Cambridge University Press.





S is for Small Words

2 01 2011

In an extract from his recently published (and long overdue!) autobiography, Mark Twain recalls how, as a child, he was once reprimanded by his mother: “It was a simple speech, and made up of small words, but it went home.” And he adds, “She never used large words, but she had a natural gift for making small words do effective work…” (‘The Farm’, in Granta, 111, 2010, p.237).

‘Making small words do effective work’ might in fact be a definition of English grammar. Not being a highly inflected language, English makes use almost entirely of function words (or functors), such as auxiliary verbs, determiners, and prepositions,  in order to convey all manner of grammatical relations, including definiteness, quantity, possession, duration, completion, volition, voice, futurity, habit, frequency and so on.  Small words also serve to make connections across stretches of text (e.g. and, so, but), to connect utterances to their context (here, now, this), and to manage speaker turns (well, oh, yes).

Not surprisingly, therefore, small words are everywhere: the twenty most frequent words in English are all functors, and together comprise a third of all text, while on average around half the words in any single text are likely to be function words. (Thus far, of the 200-odd words in this text, over 80 are functors.)
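
That claim is easy to check in a rough-and-ready way. In the Python sketch below, the functor list is deliberately small and purely illustrative (the exact percentage depends entirely on which words you count as functors), and the sample is the wall-walking joke quoted in C is for Construction:

```python
# Sample text: the joke from 'C is for Construction', punctuation stripped.
text = ("A girl was taking her little brother for a walk in the park "
        "Can I go and run along the top of that wall he asked her")
words = text.lower().split()

# An illustrative (not exhaustive) set of English function words.
functors = {"a", "was", "her", "for", "in", "the", "can", "i", "and",
            "along", "of", "that", "he"}

share = sum(w in functors for w in words) / len(words)
print(f"{share:.0%}")  # prints 59%
```

Even with this stingy functor list, more than half the running words of the joke are small words – about what the corpus figures would lead you to expect.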

What’s more, it’s the small words that have the highest degree of connectivity with other words: Nick Ellis (2008) cites research that shows that “the 10 most connected words of English are and, the, of, in, a, to, ’s, with, by, and is” (p. 235). The most frequent patterns that are formed by these connections are what we know as the grammar of the language. As Michael Hoey puts it:

Grammar is … the sum of the collocations, colligations and semantic associations of words like is, was, the, a and of, syllables like ing, er and ly, and sounds like [t] (at the end of syllables) and [s] and [z] (likewise at the end of syllables) (2004, p. 159).

[Illustration: small words on the march, from Palmer’s New Method Grammar (1938)]

It follows (arguably) that learning about the behaviour of these small words, including their constructional properties, is the key to learning the structure of English. This is an insight that predates even corpus linguistics. In 1864 a certain Thomas Prendergast wrote:

“When a child can employ two hundred words of a foreign language he possesses a practical knowledge of all the syntactical constructions and of all the foreign sounds.”

Not just a child, but any language learner, I’d suggest. In fact, if you take just the top 200 words in English, and for each of these words you display the constructions most frequently associated with it, you cover all the main grammar structures in the language.   Just think of how many structures incorporate the verbs have, be, and do, for example. Or the adverbs ever, more and still. Or the conjunctions if, while and since.

Not only that, if you memorised just one or two common idiomatic expressions whose nucleus was one of these high frequency words, you’d be internalising the typical grammar patterns in which these words are commonly embedded. For learners who are not well disposed to generating sentences from rules, these memorised chunks offer another way into the grammar. What’s more, they provide the building blocks of spoken fluency. Think of the conversational mileage provided by these expressions with way (one of the commonest nouns in English): by the way, either way, to my way of thinking, the wrong way, no way, way to go! etc.

This is the thinking that underpins books like Harold Palmer’s Grammar of English Words (1944), which details the meanings, collocations and phraseology of 1000 common English words. It is also the theory that prompted me to write Natural Grammar, published in 2004 by Oxford University Press (the working title of which, by the way, was The Secret Grammar of Words). In this book I take 100 high frequency words and explore their associated patterns. Predictably, this word-level view of grammar provides coverage of all the main ‘coursebook’ structures, plus a good many more.

One argument for organising a grammar around ‘small words’ is that their very smallness – and the fact that they are typically unstressed and often contracted –  means that they have low ‘perceptual saliency’. That is to say, learners simply don’t notice them. Making them salient, by devoting a double-page spread to each one, would seem to be a helpful thing to do, I figured.

Which leads me to wonder – if this was such a good idea, and so well-grounded in theories of language description and acquisition – why the lack of uptake? In short, why has this book been less than a runaway success? 😉

References:

Ellis, N. 2008. The dynamics of second language emergence: cycles of language use, language change, and language acquisition. Modern Language Journal, 92, 232–249.

Hoey, M. 2004. Lexical Priming: A new theory of words and language. London: Routledge.

Prendergast, T. 1864.  The Mastery of Languages, or, the Art of Speaking Foreign Tongues Idiomatically.





L is for (Michael) Lewis

5 09 2010

(Continuing an occasional series of the type ‘Where are they now?’)

[Photo: Michael Lewis and me, University of Saarbrücken]

A reference in last week’s post (P is for Phrasal Verb) to the fuzziness of the vocabulary-grammar interface naturally led to thoughts of Michael Lewis. It was Michael Lewis who was the first to popularize the view that “language consists of grammaticalized lexis, not lexicalized grammar” (1993, p. 34). This claim is a cornerstone of what rapidly came to be known as the Lexical Approach – rapidly because Lewis himself wrote a book called The Lexical Approach (1993), but also because, at the time, corpus linguistics was fueling a major paradigm shift in applied linguistics (under the visionary custodianship of John Sinclair and his brainchild, the COBUILD project) which, for want of a better term, might best be described as ‘lexical’. Lewis was one of the first to popularize this ‘lexical turn’ in applied linguistics, and he did so energetically, if, at times, contentiously.

So, what happened to the Lexical Approach – and to Lewis, its primum mobile?

Well, for a start (as I argued in an article in 1998), the Lexical Approach never was an approach: it offered little guidance as to how to specify syllabus objectives, and even its methodology was not much more than an eclectic mix of procedures aimed mainly at raising learners’ awareness about the ubiquity of ‘chunks’. Moreover, Lewis seemed to be dismissive – or perhaps unaware – of the argument that premature lexicalization might cause fossilization. To him, perhaps, this was a small price to pay for the fluency and idiomaticity that accrue from having an extensive lexicon. But wasn’t there a risk (I argued) that such an approach to language learning might result in a condition of “all chunks, no pineapple” i.e. lots of retrievable lexis but no generative grammar?

In the end, as Richards and Rodgers (2001) note, the Lexical Approach “is still an idea in search of an approach and a methodology” (p. 138). Nevertheless, as I said in 1998, “by challenging the hegemony of the traditional grammar syllabus, Lewis… deserves our gratitude.”

Michael responded graciously to these criticisms, acknowledging them – although not really addressing them – in a subsequent book, Teaching Collocation (2000). There the matter rested. Until 2004, when I published a ‘lexical grammar’ – that is, a grammar based entirely on the most frequent words in English – and, in the introduction, paid tribute to my ‘lexical’ precursors, specifically Michael Lewis, and Jane and Dave Willis.

Michael was not pleased. When I next ran into him, at an IATEFL Conference a year or two later, he was still fuming. Apparently, to suggest that his version of the Lexical Approach had anything in common with the Willises’, or that my book in any way reflected it, was a gross misrepresentation. The sticking point was what Michael calls ‘the frequency fallacy’, that is, the mistaken belief that word frequency equates with utility. Much more useful than a handful of high-frequency words, he argued, was a rich diet of collocations and other species of formulaic language. I, by contrast, shared with the Willises the view that (as Sinclair so succinctly expressed it) ‘learners would do well to learn the common words of the language very thoroughly, because they carry the main patterns of the language’ (1991, p. 72). To Michael, ‘patterns of the language’ sounded too much like conventional grammar.

When we met again, a year or two later, at a conference at the University of Saarbrücken, we found that we had more in common than at first seemed. For a start, we sort of agreed that the chunks associated with high frequency words were themselves likely to be high frequency, and therefore good candidates for pedagogical treatment. And Michael was working on the idea that there was a highly productive seam of collocationally powerful ‘mid-frequency’ lexis that was ripe for investigation.

A few months later, at a conference in Barcelona, we had even started talking about some kind of collaborative project. I was keen to interest Michael in developments in usage-based theories of acquisition, premised on the view that massive exposure to formulaic language (his ‘chunks’) nourishes processes of grammar emergence – a view that, I felt, vindicated a re-appraisal of the Lexical Approach.

But Michael is enjoying a well-earned retirement, and I suspect that he’s satisfied in the knowledge that the Lexical Approach, his Lexical Approach, whatever exactly it is, is well-established in the EFL canon, and that his name is stamped all over it.

So, then, what’s the Lexical Approach to you?

References:

Lewis, M. 1993. The Lexical Approach. Hove: LTP.
Lewis, M. 2000. Teaching Collocation. Hove: LTP.
Richards, J., and Rodgers, T. 2001. Approaches and Methods in Language Teaching (2nd edition). Cambridge University Press.
Sinclair, J. 1991. Corpus, Concordance, Collocation. Oxford University Press.