P is for Poverty of the stimulus

7 06 2015

The case for humans being innately and uniquely endowed with a ‘language instinct’ rests largely on the ‘poverty of the stimulus’ argument, or what is sometimes called ‘Plato’s problem’: How do we know so much when the evidence available to us is so meagre?

As Harris (1993: 57-8) elaborates:

‘One of the most remarkable facts about human languages – which are highly abstract, very complex, infinite phenomena – is that children acquire them in an astonishingly short period of time, despite haphazard and degenerate data (the “stimulus”). Children hear relatively few examples of most sentence types, they get little or no correction beyond pronunciation (not even that), and they are exposed to a bewildering array of false starts, unlabelled mistakes, half sentences and the like.’

Is this really true? Is the stimulus really so impoverished?

The quantity of the stimulus – i.e. the input available to a child – is certainly not impoverished: it has been estimated (Cameron-Faulkner et al. 2003) that children hear around 7,000 utterances a day, of which 2,000 are questions (cited in Scheffler 2015). This suggests that in their first five years children are exposed to 12.5m meaningful utterances. At an average of, say, ten words an utterance, this is larger than the entire British National Corpus (100m words), from which several hefty grammars and dictionaries have been derived.
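A quick back-of-the-envelope check of those figures (a sketch only; the ten-words-per-utterance average is the assumption stated above, and 365 days a year of exposure is assumed here):

```python
# Sanity check of the input figures cited above
# (Cameron-Faulkner et al. 2003, via Scheffler 2015).
UTTERANCES_PER_DAY = 7_000
YEARS = 5
WORDS_PER_UTTERANCE = 10      # rough average assumed in the text
BNC_SIZE = 100_000_000        # British National Corpus, ~100m words

utterances = UTTERANCES_PER_DAY * 365 * YEARS
words = utterances * WORDS_PER_UTTERANCE

print(f"{utterances:,} utterances")   # 12,775,000 – roughly the 12.5m cited
print(f"{words:,} words")             # 127,750,000
print(words > BNC_SIZE)               # True: larger than the BNC
```

Even on these conservative assumptions, the five-year total comfortably exceeds the corpus from which those ‘hefty grammars and dictionaries’ were derived.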

What about the quality? While it’s true that the speech between adults often includes ‘disfluencies’ of the type mentioned by Harris above, studies suggest that ‘motherese’ (i.e. the variety that caregivers typically use when interacting with their children) ‘is unswervingly well formed’ (Newport et al. 1977, cited in Sampson 2005). In one study ‘only one utterance out of 1500 spoken to the children was a disfluency’ (ibid.).

Chomsky and his followers would argue that, even if this were true, the child will have little or no exposure to certain rare structures that, in a short time, she will nevertheless know are grammatical. Ergo, this knowledge must derive from the deep structures of universal grammar.

One much-cited example is the question form of a sentence with two auxiliaries, e.g. The boy who was crying is sleeping now. How does the child know that the question requires fronting of the second of the two auxiliaries (Is the boy who was crying sleeping now?) and not the first (*Was the boy who crying is sleeping now?), especially if, as Chomsky insists, the number of naturally occurring examples is ‘vanishingly small’? ‘A person might go through much or all of his life without ever having been exposed to relevant evidence’ (Chomsky 1980: 40). The explanation must be that the child is drawing on her inborn knowledge that grammatical transformations are structure-dependent.
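The two competing hypotheses can be sketched as toy procedures (an illustrative sketch only: the hand-supplied constituent bracketing stands in for structural knowledge, and is not the output of any real parser):

```python
# Toy illustration of two hypotheses a child might entertain for
# forming yes/no questions from a declarative with two auxiliaries.

AUXILIARIES = {"is", "was", "are", "were"}

def front_first_auxiliary(words):
    """Structure-independent rule: front the linearly first auxiliary."""
    for i, w in enumerate(words):
        if w in AUXILIARIES:
            return [w] + words[:i] + words[i + 1:]
    return words

def front_main_clause_auxiliary(subject_np, aux, rest):
    """Structure-dependent rule: the subject NP (including its relative
    clause) is one unit, so only the main-clause auxiliary can front."""
    return [aux] + subject_np + rest

declarative = "the boy who was crying is sleeping now".split()

# The linear rule grabs the auxiliary buried inside the relative clause:
print(" ".join(front_first_auxiliary(declarative)))
# -> "was the boy who crying is sleeping now"  (the ungrammatical *Was...)

# The structure-dependent rule operates on constituents:
subject = "the boy who was crying".split()
print(" ".join(front_main_clause_auxiliary(subject, "is", ["sleeping", "now"])))
# -> "is the boy who was crying sleeping now"
```

The point at issue is not whether the second procedure is the right one (it plainly is), but where the child's knowledge that rules operate on constituents, not word positions, comes from.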

A quick scroll through a corpus, however, reveals that the stimulus is not as impoverished as Chomsky claims. Pullum & Scholz (2002, cited in Sampson op. cit.), using a corpus of newspaper texts, found that 12% of the yes/no questions in the corpus were of the type that would refute the ‘invert the first auxiliary’ hypothesis. (It is significant that Chomsky impatiently dismisses the need to consult corpus data, on the grounds that, as a native speaker, he intuitively knows what is grammatical and what is not. Unsurprisingly, therefore, generative linguists are constantly, even obsessively, fiddling around with implausible but supposedly grammatically well-formed sentences such as John is too stubborn to expect anyone to talk to and What did you wonder how to do? [cited in Macaulay 2011]).

But even if it were the case that the (spoken) input might be deficient in certain complex syntactic structures, you do not need to hypothesize ‘deep structure’ to account for the fact that a question of the type *Was the boy who crying is sleeping now? is simply not an option.

Why not? Because language is not, as Chomsky views it, a formal system of abstract symbols whose units (such as its words) are subject to mathematical operations, a perspective that ‘assumes that syntax can be separated from meaning’ (Evans 2014: 172).  Rather, language is acquired, stored and used as meaningful constructions (or ‘syntax-semantics mappings’).  Children do not process sentences from left to right looking for an available auxiliary to move. (They don’t even think of sentences as having a left and a right). They process utterances in terms of the meanings they encode. And meaning ‘isn’t just abstract mental symbols; it’s a creative process, in which people construct virtual experiences – embodied simulations – in their mind’s eye’ (Bergen 2012: 16).

Thus, the child who is exposed to noun phrase constructions of the type the little boy who lives down the lane or the house that Jack built understands (from the way they are used in context) that these are coherent, semantic units that can’t be spliced and re-joined at will.  Is the little boy sleeping? and Is the little boy who lives down the lane sleeping? are composed of analogous chunks and hence obey the same kind of syntactic constraints.

What’s more, experiments on adults using invented syntactic constructions suggest that patterns can be learned on the basis of relatively little input. Boyd et al. (2009: 84) report that ‘even small amounts of exposure were enough (a) to build representations that persisted significantly beyond the exposure event, and (b) to support production.’  A little stimulus goes a long way.

In the end, we may never know if the poverty of the stimulus argument is right or not – not, at least, until computer models of neural networks are demonstrably able to learn a language without being syntactically preprogrammed to do so. As Daniel Everett (2012: 101) writes, ‘No one has proven that the poverty of the stimulus argument, or Plato’s Problem, is wrong. But nor has anyone shown that it is correct either. The task is daunting if anyone ever takes it up. One would have to show that language cannot be learned from available data. No one has done this. But until someone does, talk of a universal grammar or language instinct is no more than speculation.’

References

Bergen, B.K. (2012) Louder than words: The new science of how the mind makes meaning. New York: Basic Books.

Boyd, J.K., Gottschalk, E.A., & Goldberg, A.E. (2009) ‘Linking rule acquisition in novel phrasal constructions.’ In Ellis, N.C. & Larsen-Freeman, D. (eds) Language as a complex adaptive system. Chichester: John Wiley & Sons.

Cameron-Faulkner, T., Lieven, E. & Tomasello, M. (2003) ‘A construction based analysis of child directed speech.’ Cognitive Science 27/6.

Chomsky, N. (1980) various contributions to the Royaumont Symposium, in Piattelli-Palmarini, M. (ed.) Language and Learning: The debate between Jean Piaget and Noam Chomsky. London: Routledge & Kegan Paul.

Evans, V. (2014) The Language Myth: Why language is not an instinct. Cambridge: Cambridge University Press.

Everett, D. (2012) Language: The cultural tool. London: Profile Books.

Harris, R.A. (1993) The Linguistics Wars. New York: Oxford University Press.

Macaulay, K.S. (2011) Seven Ways of Looking at Language. Houndmills: Palgrave Macmillan.

Pullum, G.K. & Scholz, B.C. (2002) ‘Empirical assessment of stimulus poverty arguments.’ Linguistic Review, 19.

Sampson, G. (2005) The Language Instinct Debate (Revised edition). London: Continuum.

Scheffler, P. (2015) ‘Lexical priming and explicit grammar in foreign language instruction.’ ELT Journal, 69/1.

PS: There will be no more new posts until the end of summer, when things calm down again.



61 responses

7 06 2015
eflnotes

hi Scott

just wanted to pick up on the statement “It is significant that Chomsky impatiently dismisses the need to consult corpus data,…”, which I think may not be as adamant or straightforward a statement as it may seem.

one history of linguistics scholar argues that the case of Chomsky being “anti-corpus” is not so well supported – http://htl.linguist.univ-paris-diderot.fr/leon/leon_hs.pdf

ta
mura

7 06 2015
Scott Thornbury

Thanks, Mura, for that clarification. The fact remains, though, that as a means of understanding grammaticality, Chomsky was very dismissive of corpora. He is quoted as saying ‘It is absurd to construct a grammar that describes observed linguistic behavior directly’ (Chomsky 1961: 130, quoted in Sampson 2001, Empirical Linguistics) and here he goes so far as to say that ‘corpus linguistics doesn’t mean anything’!

7 06 2015
eflnotes

yes i think as far as grammar/syntax goes he does not put much reliance on corpora

thanks for that interview pdf

reading it in full it seems he is saying that by +itself+ it does not mean anything, of course i think the use of that phrase is provocative in its usual sense 🙂

i think it is necessary to highlight this strand of arguments against Chomsky’s position since it goes something like (very simplified i admit but seen a lot) “look Chomsky dismisses actual language use as shown by corpora, what a nutter!”

whereas in contrast his position is much more well-thought out as shown in that interview pdf you linked to

ta
mura

22 06 2015
Ahmed Mohammed

I’m a little bit confused about that, Mr. Scott, because most Iraqi colleges’ MA and Ph.D programmes depend on Chomsky’s resources. So, do you mean that we wasted our time studying his resources?

24 06 2015
Scott Thornbury

Hi Ahmed … far be it from me to say if you wasted your time! Studying anything to do with linguistics in any detail can only be a good thing, but we should also learn to be critical, and not accept any theory at face value. I don’t think I’m alone in suggesting that the uncritical acceptance of Chomskyan linguistic theory somewhat distorts and obscures the choice of options available. And, whether or not it is a sound theory, its application to language teaching is – in my opinion (and, to his credit, in Chomsky’s) – tenuous, to say the least.

7 06 2015
Geoff

Hi Scott,

Certain properties of language which are not explicit in the input are learnt by children who end up with a complex grammar that goes far beyond the input, resulting in knowledge of grammaticality, ungrammaticality, ambiguity, paraphrase relations, and other subtle and complex phenomena. One explanation for this knowledge is the theory that universal principles must mediate acquisition and shape knowledge of language.

While you comment on the quantity and quality of input children get, you wisely stop short of claiming (as Sampson does) that there’s no poverty of the stimulus problem in the first place. Nothing much so far, then.

You go on to suggest that an alternative explanation is that “language is not, as Chomsky views it, a formal system of abstract symbols whose units (such as its words) are subject to mathematical operations, a perspective that ‘assumes that syntax can be separated from meaning’ (Evans 2014: 172). Rather, language is acquired, stored and used as meaningful constructions (or ‘syntax-semantics mappings’)”. First, keeping syntax and semantics separate helps study syntax and is no necessary criticism of a theory which does so, but more importantly, the rather bald assertion that language consists of “meaningful constructions” does not provide an alternative explanation because, while it might explain some 1LA, it fails to explain all the knowledge children demonstrably possess of language.

You end by saying we may never know if the poverty of the stimulus argument is right or not. This limp conclusion is supported by a daft quote from Everett who’s obviously been immersed up to his neck in snake-infested rivers for rather too long; giving him an excuse, but not you :-). That no one has so far proved the poverty of the stimulus argument to be false is an enormously strong argument in its favour, while the fact that nobody has shown it to be correct has no force whatsoever, because it’s logically impossible to prove that a theory is true. Thus, the asinine statement that nobody has proved that UG theory is “correct” doesn’t warrant the conclusion that it’s “no more than speculation”, as anybody who knew the first thing about scientific method and theory construction would know.

7 06 2015
Scott Thornbury

Thanks Geoff – I had a feeling this post would rouse you from your post-prandial slumber!

You say: “Certain properties of language which are not explicit in the input are learnt by children who end up with a complex grammar that goes far beyond the input, resulting in knowledge of grammaticality, ungrammaticality, ambiguity, paraphrase relations, and other subtle and complex phenomena”.

Yet the complex grammar you describe is late acquired – in fact it is probably not so much acquired as learned. Pre-school children are exposed to and acquire the syntactic features of spoken grammar – qualitatively less complex than the written grammar that forms the basis of Chomskyan generative linguistics. Research suggests that the sentences beloved by Chomskyan linguists are neither understood nor produced by most children under the age of 9. Claims that children can make grammaticality judgements about sentences like Jimbo seemed to us to like himself or John and Peter like each other’s pictures are false (‘The set of children under 5 who will pay attention to examples like these, far less produce them, is null’, Miller and Weinert (1998: 383)). To use such claims to argue the case for the child having an innate structure-dependent grammar is dodgy, to say the least.

As Miller and Weinert (1998) go on to assert, ‘Children acquire spoken language but learn written language’. How? Through the development of literacy skills, where the stimulus is far from impoverished.

Miller, J. & Weinert, R. (1998) Spontaneous Spoken Language. Syntax and discourse. Oxford University Press.

7 06 2015
geoffjordan

Hi Scott,

You’re right of course to point out that I credited children with learning things about language which in fact are only learned much later. But even the more sophisticated knowledge learned later is part of the poverty of the stimulus argument.

Have a nice lunch.

7 06 2015
Luan

“The quantity of the stimulus – i.e. the input available to a child – is certainly not impoverished: it has been estimated (Cameron-Faulkner et al. 2003) that children hear around 7,000 utterances a day, of which 2,000 are questions (cited in Scheffler 2015). This suggests that in their first five years children are exposed to 12.5m meaningful utterances.”

In many cultures infants get no motherese at all, except for the odd telling off. Once the child is able to speak, then the parents converse with them.

7 06 2015
Grant Hartley

Hi Scott

I really enjoyed reading this post. The suggestion that “in their first five years children are exposed to 12.5m meaningful utterances” got me thinking about my own observations of my children’s language development. I have always been surprised at how quickly they develop quite complex meaning/form units. Items that my adult students don’t seem to ‘get’ through direct instruction in class, my five-year-old son seems to have kind of picked up automatically. I’m guessing the main difference lies with the concept of ‘meaningful’. His exposure to conditional structures, as an example, must be so much more meaningful than my students’, no matter how much I try to make them meaningful, or manufacture contexts in class where they could be more meaningful. Your reference to complexity theory makes sense here: it’s not the amount so much as the ‘moment’ that triggers the adjustment.

Given that ‘class moments’ aren’t usually nearly so laden with meaning, I wonder if phrases (be they what we consider lexical or more ‘grammatical’ as in noun phrases) aren’t a good place to start in terms of ‘granularizing’ language. I say this because ‘phrases’ can be categorized according to Halliday’s three aspects of register (field, tenor & mode) and therefore the most important meanings within any given register can be highlighted, before forms within the phrases are focused on. An added bonus to this is that it can avoid the narrow focus on form which course books usually adopt.

‘Granularizing’ with a broader, but perhaps less intensive focus on form. In other words, more as awareness raising as opposed to teaching. This seems to me to fit more with what complexity theory has to offer language teaching, i.e. non-linearity etc.

Grant

8 06 2015
Scott Thornbury

Hi Grant,

Thanks for referencing Halliday, which gives me the excuse to include this quote that I had left out of the original post on grounds of length:

As Halliday argues, learning a language is learning to mean, and ‘the essential condition of learning is the systematic link between semantic categories and the semiotic properties of the situation’ (1975: 140).

It’s all about the link! ‘Our central task for any theory of grammar is to solve the so-called “linking problem”: the problem of discovering regularities of how participants of an event are expressed in surface grammatical forms’ (Baker 1996: 1, cited in Goldberg 2006). Thus, most languages map the agents of actions as the grammatical subject, and the recipients as grammatical objects – these ‘argument structures’ are a reflection of the way we perceive and categorize events in the world. Goldberg argues ‘that the input children receive provides more than adequate means by which learners can induce the association of meaning with certain argument structure patterns’ (2006: 72) using general learning strategies, and without having to hypothesize an innate grammar.

Moreover, the way that these argument structure patterns are represented in the syntax suggests that you cannot separate syntax from semantics, since the constructions themselves have meaning. ‘Thus, if I say to you “The dax got mibbed by the gazzer,” you know – without knowing the meaning of a single content word – that the gazzer did something (called mibbing) to the dax (and we have entered that event from the perspective of the dax, as patient)’ (Tomasello 2008: 297).

Coming back to your point about item-learning (e.g. in the form of phrasal chunks), Boyd et al. (2009) – the study I cited in my post – conclude that these constructions are probably learned initially as unanalysed chunks, on an item-by-item basis, ‘and only achieve abstract status over time’ (p. 86).

This is certainly the conclusion that Wong-Fillmore (1976) came to in her classic study of L2 acquisition in playgrounds, where observed strategies included:

Get some expressions you understand and start talking.
Look for recurring parts in the formulas you know.
Work on the big things first; save the details for later.

(cited in R. Ellis 2008)

Other refs:

Goldberg, A. (2006) Constructions at Work: The nature of generalization in language. Oxford University Press.

Halliday, M.A.K. (1975) Learning to mean: Explorations in the development of language. Edward Arnold.

Tomasello, M. (2008) Origins of Human Communication. MIT Press.

8 06 2015
Grant Hartley

Hi Scott

Thanks for sharing the quote. While I would agree with the Goldberg quote regarding the input most children receive in their first language, I’m not so sure about the quantity or quality of my students’ second language input outside of class. Anyway, to me the aim of the classroom, from purely a learning perspective, is to try to create shortcuts in whichever way students appear to respond to best. Given that noticing appears to be an initial step, it seems a good place to start. Appropriacy is obviously an important consideration. I may be wrong but I think it was Widdowson who suggested that appropriate language was that which was ‘appropriable’. The phrases I point students towards would be recognizable/understandable to them, but perhaps haven’t yet become part of their productive repertoire. My expectations are not that students memorize and learn the phrases, but rather notice the wordings in the hope that this has some sort of effect on their interlanguage.

I think your sentence “The dax got mibbed by the gazzer” demonstrates that once an awareness of the patterns is in place, students have more to work with when they start guessing/inferring the content words. For example, if a student is aware of the construction “She said that it was here”, then I suspect they would be more likely to infer the meaning of less frequent verbs representing verbal processes, like ‘mentioned’, when they eventually encounter them.

12 06 2015
Anthony Ash

Hi Grant,

I think you’ve hit the nail on the head with regard to the separation of meaning and form. I think that is why in language teaching it’s important to set up the meaning as clearly as possible – which could be lexical meaning or, alluding to Scott’s example of “The dax got mibbed by the gazzer”, situational meaning – before then going on to give the learners what I like to call “the noise”, i.e. the form.

7 06 2015
Cindy Hauert

Chomsky bashing seems to be popular these days. While I’m not a trained linguist, I’m still forever grateful to him for delivering us from the Skinnerian hell that preceded his work. It may turn out that he was wrong, but his analysis still paved the way for a new look at language acquisition, particularly in children.

I’m also reminded of an interesting conversation I had some years ago with a zoologist who had done extensive research into how finches learn to sing. According to him, baby finches who are deprived of singing lessons from their elders CAN sing – but their songs are never as elaborate or “communicative” as those of finches who do get exposure to adult finch songs. Ditto their nest-building performance. The input-deprived finches build half-assed sloppy nests that fail to do the job they’re meant to.

Made me think…

8 06 2015
Scott Thornbury

Hi Cindy, yes even some of Chomsky’s fiercest critics credit him with having shaken up linguistics when it badly needed it.

As for the bird song example, I recall Gillian Brown (University of Cambridge) saying that there are two distinct types of birds: those whose song seems to be innately programmed, and those that need to learn their song from their caregivers. Perhaps your finches are a bit of both!

9 06 2015
Scott Thornbury

A footnote – just came across this:

Birdsong, Speech and Language: Exploring the evolution of mind and brain, Bolhuis et al., 2013, MIT Press. It has an introduction by Chomsky so I suspect it will argue that birds, too, suffer from poverty of the stimulus!

9 06 2015
Cindy Hauert

Thanks Scott. The book is a bit pricey but would certainly be interesting!

9 06 2015
Cindy Hauert

What the Songbird Said, a BBC podcast about birdsong and language which is relevant to the discussion.

https://itunes.apple.com/ch/podcast/discovery/id284012446?l=en&mt=2&i=343094722

10 06 2015
Russ

I’m curious what the Skinnerian hell was?

10 06 2015
Cindy Hauert

Oh, I just meant that pure behaviourism has been pretty much discounted as a complete explanation of language acquisition.

8 06 2015
J.J. Almagro

The poverty of stimulus and children’s power to figure out the big communication picture make me think of CSI agents coming up with a full-fledged 3D version of somebody’s face from just a couple of bone splinters.

8 06 2015
johnpfannkuchen

I love that you referenced Bergen! I actually taught it in my Composition 111 course this semester.

9 06 2015
Svetlana

Hello Scott,
Thank you for a very intriguing post on the poverty of the stimulus. I totally agree with you when you say that this argument is not convincing enough to advocate the existence of some Language Acquisition Device, just because in fact the stimulus is not poor at all. But what could be more persuasive is the following evidence – the stimulus DOESN’T HAVE TO BE STRONG because a child’s mind is hypersensitive to the incoming data through visual, motor and auditory channels of perception. Just look at the following figure: a baby forms 700 new neural connections per second in the first years of life (J. Guerra). “This process of building the architecture of the brain is dramatically influenced by life experiences. It is not genetically hardwired.” (J. Shonkoff) Those neural connections are possible because of the rapid growth of dendritic spines that appear as a result of learning and novel sensory experience (G. Yang, F. Pan, W. Gan, Stably maintained dendritic spines are associated with lifelong memories).
There’s some kind of Universal Grammar, Noam Chomsky says. Who would deny that? Of course it is universal. Just look around – wherever you are, what you see will not differ from a baby’s perception of the world: objects around you and the actions you can do to them, or they to you or to each other, as they move through space and time – the core noun and verb phrase. It is a universal.
And I wonder what percentage of the world population could get Band 9 on a test like IELTS, for example, in their native language. My guess is this number will strongly correlate with the years of schooling those people have done so far.
Have a great summer, Scott! Looking forward to new posts in September.

9 06 2015
Scott Thornbury

Thanks, Svetlana – and, as usual, you are able to bring new evidence to bear to confirm my initial hunch.

It occurs to me that the poverty of the stimulus argument seems to have been enlisted to refute both the notion that language, like other animal faculties, might simply have evolved as humans adapted to their environment in increasingly complex ways, and the fact that children are a lot more sentient than we take them for. In that sense, the PoS argument is a bit like creationism – a colossal failure of the imagination.

12 06 2015
Anthony Ash

Hi Scott,

I saw from the picture above you’ve read Dan Everett’s first book about the Amazonian people he lived with. He has a second book, “Language: the cultural tool” – I know it appears in your reference list at the end of the post, but if you haven’t read it all, I’d recommend it: he talks exactly about what you’ve just mentioned, that our linguistic abilities are a product of our environment. Cracking stuff 🙂

9 06 2015
geoffjordan

What evidence does Svetlana provide to confirm your initial hunch? She says “the stimulus DOESN’T HAVE TO BE STRONG because a child’s mind is hypersensitive to the incoming data through visual, motor and auditory channels of perception”. So what? Without being hard-wired the baby detects or invents exemplars of language that our empirical observations of the input miss? The stimulus, strong or weak, has to consist of input that is sufficient to explain what a child knows about language – and it doesn’t.

That a baby forms 700 new neural connections per second in the first years of life does not, Shonkoff notwithstanding, in any way confirm your hunch that we are not hardwired in the way Chomsky suggests: as stated, it’s pure assertion. Your original post, which fails to appreciate the asymmetry between truth and falsehood, and now your glowing reaction to this obscurantist, pseudo-scientific twaddle, this list of unsupported assertions and non-sequiturs from Svetlana, both fail to respect the most basic rules of logical argument and rational thinking.

Just to be clear, I think that there are serious flaws in Chomsky’s theory of UG, and I think he has little to contribute to theories of SLA. But ill-informed and badly-argued nonsense like this does nothing to promote an understanding of the issues. You say “the PoS argument is a bit like creationism – a colossal failure of the imagination”. In my opinion, your argument is a bit like astrology – a colossal failure of rational argument.

9 06 2015
Scott Thornbury

Thanks Geoff. I was worried that you were losing your touch! 😉

“The stimulus, strong or weak, has to consist of input that is sufficient to explain what a child knows about language – and it doesn’t.”

Actually – and this was the point of my original post – it does. Corpus studies suggest that everything a child needs is in place. The fine-tuning (‘John showed Bill a picture of himself’ etc etc ad nauseam) comes much later, with the onset of basic literacy. To argue that a 5-year old child has an adult grammar already in place is to be woefully ignorant of children.

Svetlana’s point simply echoes the point that Boyd et al. made (quoted above) that the child’s brain is mightily disposed to mine the input, and that a little stimulus goes a long way, especially when the child is so feverishly in need of both communicating and becoming socialized. General learning processes do the rest. ‘Children’s creative linguistic ability – their language system – emerges from their analyses of the utterances in their usage history using general cognitive abilities and from their abstraction of regularities within them’ (The ‘Five Graces Group’, 2009: 9).

It’s hardly very parsimonious to hypothesise a modular language learning device – a kind of linguistic g-spot (and one that no neurobiologists have yet located) – when a perfectly plausible explanation is already available.

Ref:

The ‘Five Graces Group’ (2009) ‘Language is a complex adaptive system: position paper.’ In Ellis, N.C. & Larsen-Freeman, D. (eds) Language as a complex adaptive system. Chichester: John Wiley & Sons.

10 06 2015
Luan

“It’s hardly very parsimonious to hypothesise a modular language learning device – a kind of linguistic g-spot (and one that no neurobiologists have yet located)”

Is anyone really hypothesising that though? It’s a very dated view. Knowledge of brain plasticity suggests that the logic of language is in the wiring and connections, and that a series of neural switches are activated at birth and expand by firing off further connections based on these as a model. As well as that, much of this initial circuitry is subsequently disposed of at puberty – hence the critical period.

This doesn’t negate the idea of language acquisition as an instinct, as these initial clumps of primed neurones and synapses — the acquisition infrastructure — are innate.

Cf. Pinker’s ‘The Language Instinct’

9 06 2015
geoffjordan

Actually, it doesn’t. I promise this is my last contribution to the “Yes-it-does-no-it-doesn’t” song; after all, it’s your blog.

Once again, you’re not reasoning logically. That corpus studies suggest that everything a child needs “is in place” doesn’t justify the inference that children’s knowledge of the language can be accounted for by the input. There is a great deal of evidence that children have knowledge of language that did not come from input. Whether or not the LAD is the correct explanation for this is another matter.

Chomsky doesn’t argue that a 5-year old child “has an adult grammar already in place”; he argues that “fairly early in life” a child’s linguistic competence reaches a “steady state”.

That people learn more about language after that is obviously the case, and it has absolutely no weight as evidence against the PoS argument.

“The child’s brain is mightily disposed to mine the input”; “A little stimulus goes a long way”; and “The child is feverishly in need of both communicating and becoming socialized” are empirically-empty, value-laden assertions.

The claim that “general learning processes” provide a “perfectly plausible explanation” is another empty assertion which gets no support from being preceded by quotes from Boyd et al. and the Five Graces Group, since all the quotes do is make the same unsupported assertion in slightly different ways.

Chomsky’s argument is quite simple:

1. A native speaker of a particular language knows a particular aspect of syntax.

2. This aspect of syntax could not have been acquired from language input. This involves considering all possible sources of evidence in the language the child hears and in the processes of interaction with parents.

3. This aspect of syntax is not learnt from outside. If all the types of evidence considered in Step 2 can be eliminated, the logical inference is that the source of this knowledge is not outside the child’s mind.

4. This aspect of syntax is built-in to the mind (Cook, 1991).

In place of this, you offer nothing more than general platitudes like “The child’s brain is mightily disposed to mine the input” and unsupported assertions that general learning processes can explain how children know things about language that go far beyond anything they’ve been exposed to. I hope readers can see for themselves that this is a very poor attempt indeed to refute the PoS argument.

Cook, V. J. (1991) “The poverty-of-the-stimulus argument and multi-competence”. Second Language Research, 7,2, 103-117.

9 06 2015
Scott Thornbury

“This aspect of syntax could not have been acquired from language input.”

This is the bone of contention, surely. Prove that it couldn’t. Or else isn’t this another ‘empirically-empty’ assertion?

I agree that we cannot rule out the PoS argument until ‘all possible sources of evidence in the language the child hears and in the processes of interaction with parents’ have been considered (as you put it). But this has not been done. Least of all by Chomsky, who is utterly scornful of data.

The only researcher I know of who has attempted to do this is Deb Roy at MIT, who wired up his home and recorded every waking moment of his child’s initial speech exposure and production. He found a strong correlation between frequency in the input and vocabulary development, but (in 2009 at least) hadn’t yet sifted through the data to plot the development of grammar – although he seemed confident that it would emerge in the same way. (More recently he’s been tracking the emergence of future forms in another child.)

Nevertheless, if you generalize the findings beyond the single-word level to constructions, there are grounds to believe that, again, frequency of occurrence in the input predicts emergence in the output. The next step is to generalize from constructions to grammar – a step that Chomsky wouldn’t wish to take, but one that nowadays anyone versed in corpus linguistics, phraseology, construction grammar etc. wouldn’t blink at – and then, hey presto, the grammar emerges on the back of the frequent constructions.

See http://web.media.mit.edu/~dkroy/papers/pdf/Roy_interspeech_keynote.pdf
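To make the frequency claim concrete, here is a toy sketch (in Python, with a handful of invented utterances standing in for real child-directed speech) of the kind of counting involved – a crude proxy, nothing like what Roy’s team actually does:

```python
from collections import Counter

# Invented mini-corpus standing in for transcripts of child-directed speech.
utterances = [
    "where's the ball",
    "where's daddy",
    "do you want more juice",
    "where's the ball gone",
    "do you want the ball",
    "look at the doggy",
]

# Count word bigrams as a crude stand-in for 'constructions'.
bigrams = Counter()
for utt in utterances:
    words = utt.split()
    bigrams.update(zip(words, words[1:]))

# Usage-based accounts predict that the most frequent frames in the input
# are the ones that emerge first in the child's output.
print(bigrams.most_common(3))
```

On this toy data the frame “the ball” tops the ranking; the usage-based bet is that real input, counted this way at scale, predicts the order of emergence.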

10 06 2015
geoffjordan

Hey Presto indeed! This is what your alternative explanation amounts to – it’s magic! Apt, because since your original post on Sunday, you’ve pulled various rabbits out of the hat to dodge criticism.

Step 1: You comment on the quantity and quality of input children get, but you don’t go so far as to say that the input children get is enough to explain all they know about language. As an alternative to Chomsky’s view of syntax, you suggest that “language is acquired, stored and used as meaningful constructions (or ‘syntax-semantics mappings’)” without explaining how this deals with the PoS argument. You end with Everett’s claim that since no one has proved that the poverty of the stimulus argument is correct, “talk of a universal grammar or language instinct is no more than speculation”. I pointed out that this claim is nonsense since it’s logically impossible to prove that a theory is true.

Step 2: Svetlana brings “new evidence to bear” to confirm your “initial hunch”. This new evidence consists of facts about babies forming lots of neural connections per second, about the rapid development of “dendritic spines” containing “lifelong memories”, plus a few unsupported woolly assertions thrown in for good measure.

Step 3: In reply to my complaint that you’re easily hoodwinked by bullshit, you then say that, “actually”, the input children get IS sufficient to explain what they know about language. You say that this (now much stronger) claim is supported by corpus studies which “suggest that everything a child needs is in place”. But how do these corpus studies explain what children know about language? Well, because “the child’s brain is mightily disposed to mine the input” found in the corpora; because “a little stimulus goes a long way, especially when the child is so feverishly in need of both communicating and becoming socialized”; and because “general learning processes do the rest”.

Step 4: In reply to my further complaint that platitudes and unsupported assertions have now completely replaced any attempt at reasoned argument, you reply that anyone who claims that children’s knowledge about an aspect of syntax could not have been acquired from language input has to prove that it couldn’t, or else it’s another ’empirically-empty’ assertion. You make the same mistake you made in Step 1 again. For purely formal reasons, you can’t prove such a thing, and to demand such “proof” demonstrates an ignorance of logic and of how rational argument, science, and theory construction work. Obviously (although it’s clearly not so obvious to you), failing to meet the impossible demand of proof doesn’t make the PoS argument an empirically-empty assertion.

To the extent that a theory survives serious, empirically-based attempts to falsify it, we have reasonable grounds to tentatively accept it. In the last 50 years thousands of studies have been done in hundreds of universities around the world involving tens of thousands of participants to test the hypothesis that children have knowledge of the putative universal principles of language (not, incidentally, of all aspects of grammar) which, to the best of our knowledge (i.e. having considered all possible sources of evidence in the language the child hears and in the processes of interaction with parents), they didn’t get from input. This is open to empirical tests, so that it can be proved false, although (at the risk of repeating myself too often) it can’t be proved true. The PoS argument itself is not a theory; it’s a logical inference from the evidence that children have innate knowledge of what are claimed by UG theory to be universal aspects of natural languages.

To say that Chomsky is “utterly scornful of data” is to further misrepresent his argument. Chomsky is scornful of the uses to which Skinner and others put performance data, which is a very different matter.

Step 5: Here’s the magic bit where we finally fall down the hole and end up in Wonderland. If we “generalize the findings beyond the single word level to constructions” and then “generalize from constructions to grammar”, then “hey presto, the grammar emerges on the back of the frequent constructions”. What “findings beyond the single word level”? How do you generalise these “findings” to “constructions”? How do you generalise from constructions to grammar? What grammar?

Some good work has been done on connectionist models, but none of it by Larsen-Freeman, whose muddled pronouncements on emergentism have already led you to publish a particularly daft article on the subject (see reference below). While fools rush in, the angels working in this area tread warily, recognise the complexities of the issues, and salute the enormous contribution Chomsky has made to the study of linguistics.

Maybe you should stick to the MacNuggets.

Thornbury, S. (2009) “Slow Release Grammar.” English Teaching Professional, 61.

10 06 2015
Scott Thornbury

Let’s get back to the basic argument:

1. Chomsky argues that the stimulus is impoverished, hence the stimulus (alone) cannot explain grammaticality.
2. Ergo there must be some other (magical?) explanation. E.g. UG.
3. His critics argue that the stimulus is not impoverished, and demonstrate, using real data, why not. (And they add the bonus observation that general cognitive capacities can perfectly well account for language acquisition).
4. Hence, no alternative explanation (magical or not) is needed.

Surely the onus of proof is on the nativists (who are the ones hypothesizing a magical solution) to show that the stimulus is impoverished?

10 06 2015
Russ

you may find this quite interesting reading, if you haven’t seen it already. The splendid Dorothy Bishop:
http://deevybee.blogspot.co.uk/2012/09/what-chomsky-didnt-get-about-child.html

10 06 2015
geoffjordan

The solution Chomsky offers isn’t magical: it’s a scientific theory (UG) which is open to empirical testing, and which, just BTW, is judged, not just by his peers but by academics in all fields the world over, to be one of the finest intellectual achievements of the 20th century.

He challenges those who say that there’s no PoS to offer an alternative explanation of how children know things about language which go beyond the input. Just baldly stating that “the stimulus is not impoverished” and that “general learning theories do the rest” doesn’t cut it. You haven’t given any account whatsoever of a reasonable alternative to Chomsky’s nativist view; all you’ve done is make assertions, churn out platitudes, and cherry-pick the points you reply to.

Bates and MacWhinney are having a good shot at an explanation which really does provide an alternative to Chomsky’s, and their efforts highlight the poverty of this attempt.

11 06 2015
Russ

whether or not academics all over the world in many fields think it’s a wonderful achievement is irrelevant – isn’t this an argument from authority? Also, why would I or anyone really care what academics in other fields think? I’m only interested in what linguists have to say about it, and a lot of them disagree with it.

11 06 2015
Russ

“He challenges those who say that there’s no PoS to offer an alternative explanation of how children know things about language which go beyond the input. Just baldly stating that “the the stimulus is not impoverished” and that “general learning theories do the rest, doesn’t cut it. You haven’t given any account whatsoever of a reasonable alternative to Chomsky’s nativist view; all you’ve done is make assertions, churn out platitudes, and cherry pick the points you reply to.”

no one is ‘baldly stating’ that the stimulus is not impoverished. What they’re doing (and what Chomsky didn’t do) is researching child language and finding that the ‘stimulus’ directly affects their language development. They’re also finding that the examples of things “children could never have heard” are actually pretty common.

12 06 2015
Anthony Ash

You do know this is a blog post, Geoff, not an academic paper being presented to a symposium: Scott could literally quote nobody and still post it, as it’s just a blog post. It’s really up to the reader to decide what they think of it all.

12 06 2015
geoffjordan

Shame on you, Anthony! Blog posts start out with a short statement and provoke discussion which can be as academic as those involved choose to make it. It’s not “really up to the reader to decide what they think of it all”, it’s up to the reader to respond if they want to and to say whatever they want.

10 06 2015
patrickamon

Perhaps an example of an aspect of syntax that “could not have been acquired from language input” may help. Speakers of any variety of English with which I am familiar will, I think, find some of the following sentences to be well-formed, and others not.

1. Who’s that?
2. Who’s that at the door?
3. Who’s that for?
4. I wonder who that’s.
5. I wonder who that’s at the door.
6. I wonder who that’s for.

Doubtless, experience suffices to explain how it is that we find 1, 2, 3 and 6 to be well-formed. The thing that requires explanation is how it comes about that we find 4 and 5 to be ill-formed. That we’ve never heard sentences exhibiting the prohibited aspect of syntax does not suffice to explain the phenomenon, since our ability to acquire language at all requires that we be receptive to at least some aspects of syntax that are not already familiar to us.

The thing to be explained is how it is that we find some unfamiliar aspects of syntax to be (perhaps) admissible and others not. Experience alone cannot distinguish between these two types of unfamiliar aspects of syntax, precisely because both equally lie outside the bounds of our experience. Adult language learners require negative feedback to indicate that certain constructions are ill-formed in the target language; the puzzle is that infant learners of their first language become able to judge certain constructions as ill-formed apparently without such negative feedback (certainly, I have no recollection of ever having produced sentences such as 4 or 5 and then being corrected, or frowned at, or slapped, or whatever).

Chomsky’s suggestion, if I’ve understood it, is that we are possessed of innate cognitive structure which constrains the forms that any language acquired in the way that a growing human infant acquires one may take, such that it is psycholinguistically impossible for such a language both to have certain of the characteristics that English has and at the same time to permit sentences such as 4 and 5.

10 06 2015
Scott Thornbury

Thanks, Patrick.

As I suggested before, these are the sorts of conundrums that engage (some) linguists but not children, simply because the former assume language is an abstract system obeying laws of abstruse algebraic complexity, while the latter are making meanings using a kind of bricolage approach based on what they have already heard, whereby certain meanings come packaged in certain constructions.

The results of decades of study of (first) language development seem to show, fairly conclusively, that, as Tomasello (2003: 192) summarizes it, ‘(1) initially children’s constructions are based totally on particular words and phrases (not abstract categories) tied fairly closely to the language they hear; (2) linguistic abstractions (categories and constructions) develop continuously and relatively slowly.’ Tomasello points out that ‘this view, like the generative view, predicts that children will not make so many errors in early language, when they are mostly learning to produce concrete linguistic expressions that they have heard adults use, but that as development proceeds, children find patterns that are not conventional in the language they are learning and so make some errors’.

But, as Evans (2014: 104) points out, the errors that they make are typically errors of overgeneralization (mommy goed…) and not the kind of ‘logical errors’ that generative linguists delight in (I wonder who that’s), which ‘concern grammatical patterns alien to the language being learned’.

It is worth quoting Evans in full:

‘Why assume that children would ever produce, at least in principle, sentence patterns that they’ve never heard? After all, it could be that the absence of particular types of grammatical constructions in what a child hears [such as utterance-final contractions: I wonder who that’s] provides a line of evidence that guides the child to avoid precisely those sorts of grammatical patterns that no English-speaker uses.

In fact, the philosopher of cognitive science Jesse Prinz makes exactly this point. He suggests that children “make predictions about what sentences or words they will hear while listening to adults. A failed prediction could be used as evidence that the rule underlying the prediction was wrong.” The consequence is that infants could actually have a rich source of negative evidence — learning what not to say from what they don’t hear — without ever being corrected by their parents and carers. Just because children don’t receive negative correction from carers does not mean they lack evidence as to what the grammar of their native language looks like. And so, it is plausible that children don’t produce grammatical oddities that fly in the face of the native tongue they are learning, precisely because they have never heard such constructions.

While Chomsky assumes that absence of evidence provides no evidence, it may in fact be the case that children take absence of evidence for evidence of absence: children learn what not to say (and consequently what they should say), from what they don’t hear.’
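Prinz’s suggestion can even be caricatured in code. What follows is a toy of my own devising, not his model: a learner entertains two candidate rules about contraction, predicts with both, and simply weakens any rule whose predictions keep failing – negative evidence without any overt correction.

```python
# Two hypothetical candidate rules, each with an initial confidence weight.
candidate_rules = {
    "contract anywhere": 1.0,         # would license "I wonder who that's"
    "no utterance-final 's": 1.0,     # blocks contraction before a gap
}

# Invented input: the child only ever hears the full form utterance-finally.
heard = [
    "who's that",
    "I wonder who that is",
    "who's that for",
    "I wonder who that is for",
    "that's mine",
]

for utt in heard:
    # "contract anywhere" predicts that final "that's" should sometimes
    # occur; every utterance ending in the full form is a failed prediction.
    if utt.endswith(("that is", "that is for")):
        candidate_rules["contract anywhere"] *= 0.5  # decay on failure

best = max(candidate_rules, key=candidate_rules.get)
print(best)  # the restrictive rule survives, with no correction ever given
```

The point of the caricature is only this: absence of a predicted form in the input can do the work that overt correction was supposed to do.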

Evans, V. (2014) The language myth: Why language is not an instinct. Cambridge University Press.

Tomasello, M. (2003) Constructing a language: A usage-based theory of language acquisition. Continuum.

10 06 2015
Russ

What I always found staggering (or brilliant?) about Chomsky was how he not only developed this new theory but developed a new set of rules for linguistic inquiry which insulated his theories from criticism. I’m sure Geoff will point out if/where I’m wrong, but Chomsky suggested that the stimulus is impoverished based on nothing but logic. A reasonable academic might ask “have you tested this? Where’s your evidence that it is impoverished?”, which might have undone Chomsky, but he simultaneously introduced the notion that linguistics didn’t need, and in fact should spurn, empirical research (I think he called it linguistic stamp collecting). He claimed that, being a NS, he could just sit in his office and come up with endless examples – and that would suffice for evidence. How nice!

When Pullum and Scholz looked at his common examples they found that all of them actually DO occur, and commonly! (http://www.lel.ed.ac.uk/~gpullum/bcscholz/Assessment.pdf) They conclude: “In the absence of any answers, or any research that is even directed at trying to develop answers, we conclude that the APS instance that Chomsky has used to support the APS for over twenty-five years has no empirical support and offers no support for the existence of innately primed learning” (p. 45).

All summed up nicely by Sampson:
“‘Hang on a minute,’ I hear the reader say. ‘You seem to be telling us that this man [Chomsky], who is by common consent the world’s leading living intellectual, according to Cambridge University a second Plato, is basing his radical reassessment of human nature largely on the claim that a certain thing never happens; he tells us that it strains his credulity to think that this might happen, but he has never looked, and people who have looked find that it happens a lot.’ Yes, that’s about the size of it. Funny old world, isn’t it!” (2005: 47)

10 06 2015
geoffjordan

Russ,

Your claim to represent the case for evidence-based rational thinking is severely damaged every time you venture an opinion on Chomsky’s work. You never went back to sort out the mess you made on your own blog talking about Chomsky, and here you are making more silly pronouncements.

To say that Chomsky spurned empirical research is give compelling evidence that you don’t know what you’re talking about. Chomsky’s theory of UG has a long and through history of empirical research as anyone with the slightest understanding of his work knows. As for Sampson’s opinion, how can you possibly judge it until you get some elementary grasp of the issues he’s talking about?

10 06 2015
Russ

Hi Geoff, you write “You never went back to sort out the mess you made on your own blog talking about Chomsky, and here you are making more silly pronouncements” -a comment which is both unnecessarily unpleasant and factually incorrect. I messaged you in April on twitter saying “Hi Geoff – just a note to say I won’t be able to get back to you re Chomsky till the second half of this year…August probably” -so I’m not entirely sure what you’re talking about here, -care to explain?

Secondly I notice you didn’t go back to continue discussing the matter with Mike Samsa, who replied to you on April the 5th.

I would also add that worrying about damage to MY reputation shows very little self-awareness on your part. If you want people to engage with you seriously Geoff I suggest you accord them the proper levels of respect and decency academic discussions require, -or even normal human conversations for that matter. Perhaps you are so taken with Chomsky you think his unpleasant style of discussion/argumentation is something to be emulated? It is not.

10 06 2015
Glenys Hanson

Russ, are you Russ Mayne or another Russ?

10 06 2015
geoffjordan

I’m not worried about anybody’s reputation Russ. Just as soon as you show signs of having a grasp of the matters you so glibly opine on, I’ll show you some respect.

11 06 2015
Russ

when you provide some semblance of an argument (with evidence) rather than continually stating “Chomsky was right” and patronizing anyone who dares disagree with you I might start taking you seriously…

10 06 2015
geoffjordan

Sorry, I meant : “To say that Chomsky spurned empirical research is to give compelling evidence that you don’t know what you’re talking about. Chomsky’s theory of UG has a long and thorough history of empirical research”.

10 06 2015
Scott Thornbury

“Chomsky’s theory of UG has a long and thorough history of empirical research”. What!!? Where? When? Who?

Wouldn’t this ‘long and thorough history’ need to track both what children can do with language (not what armchair linguists think they might know about language) and compare this to what they have been exposed to (not what armchair linguists think they have NOT been exposed to). Citations? References? Academic rigour? Logical argumentation? 😉

10 06 2015
geoffjordan

Dear oh dear, Scott. This is quite a hole you’re digging for yourself. To paraphrase: “Empirical Research? Chomsky? Are you kidding?” Well no, “actually”.

As I said above “In the last 50 years thousands of studies have been done in hundreds of universities around the world involving tens of thousands of participants to test the hypothesis that children have knowledge of the putative universal principles of language.” These studies involve (among other methods of collecting empirical data) grammaticality judgment (GJ) tasks, which are one of the most widespread data-collection methods used in the world of linguistics to test theoretical claims empirically. Participants in studies comprising GJ tasks are presented with a set of linguistic stimuli to which they must react. The responses are usually in the form of assessments, where speakers determine whether and/or the extent to which a particular stimulus is “correct” in a given language.

I share the concerns of many about the validity of some of these GJ tests, but these concerns don’t change the fact that a huge amount of empirical research has been carried out on UG. Still, maybe criticism of GJ tests is another rabbit you’ll now pull out of the hat for our further amusement.

Crain, S. and Thornton, R. (1998) Investigations in UG: A Guide to Experiments on the Acquisition of Syntax and Semantics. MIT Press.

10 06 2015
Scott Thornbury

As I said before, Geoff, grammaticality judgement tests (and I do know what they are!) are typically administered to adults or teens, but not to five year olds, so it’s not valid to say that the results represent pre-literate innate knowledge. Moreover, they tend to focus on a very limited set of grammatical features, e.g. subject NPs with relative clauses, or the ways various types of pronouns connect to their antecedents (John showed Kim a picture of himself etc etc) – items that have been chosen on the (dodgy) assumption that they don’t appear in the ‘stimulus’. And even if these tests were administered to infants, the results are not interpretable until an accurate analysis of the stimulus has been conducted – i.e. a corpus-based study. For the nth time, you cannot argue a case for the poverty of the stimulus if you haven’t ever described and analysed the stimulus!

10 06 2015
geoffjordan

First, I was replying to this: “Chomsky’s theory of UG has a long and thorough history of empirical research”. What!!? Where? When? Who?
So now you do the well-known side-step: “Oh! THAT research!” Oh, please!

1. Grammaticality judgement (GJ) tests have been done with tens of thousands of children.
2. GJ tests on adults test innate knowledge of UG. I’m starting to think you don’t even know what “valid” means. Just because GJ tests are done on people over the age of 5 doesn’t make them invalid as tests of UG.
3. You shouldn’t be too surprised to hear that GJ tests focus on the principles and parameters of UG and thus do not cover all parts of grammar, or even most of it.
4. For the (n+1)th time, the argument is that UG theory explains the knowledge children display of certain underlying principles of language. The explanation of how they acquired this is that, since it goes so far beyond the input that children have been observed to be exposed to, the knowledge must be innate. It isn’t necessary to “describe and analyse the stimulus”, whatever that means, in order to propose the theory of UG; it’s enough to say that those doing the studies have considered all possible sources of evidence in the language the child hears and in the processes of interaction with parents. In order to challenge the theory you need to explain how children know things about language that they’ve never been exposed to. You say the explanation is that they HAVE been exposed to it, but this is simply false, as any reading of the literature will show – start with Crain & Thornton (1998).

Data from corpora and appeals to general learning theories might be a starting point for an explanation that doesn’t rely on innate knowledge, but just endlessly repeating that they are enough to explain L1A isn’t enough. See the hard graft MacWhinney and his colleagues are putting in, and consider how they, not to mention SLA nativists such as Lydia White, Kevin Gregg, Susan Gass, Vivian Cook, and Nick Ellis, to mention just a few, might react to the arguments you present here.

10 06 2015
Scott Thornbury

White, Gregg, Gass, Cook OK. But not Nick Ellis!

In Ellis 2011 (‘The emergence of language as a complex adaptive system’ in Simpson, J. (ed.) The Routledge Handbook of Applied Linguistics), for instance, he totally distances himself from the Chomskian tradition in which ‘grammar became top-down and rule-governed, rather than bottom-up and emergent. It was modularized, encapsulated, and divorced from performance, lexis, social usage, and the rest of cognition. … Language and cognition, however, are mutually inextricable; they determine each other… Language learning involves determining linguistic structures from usage and this, like learning about all other aspects of the world, involves the full scope of cognition … Cognition, consciousness, experience, embodiment, brain, self, and human interaction, society, culture, and history are all inextricably intertwined in rich, complex, and dynamic ways’ (p. 655).

No mention of a language-specific module, or UG, notice.

Speaking of Universal Grammar, hasn’t Chomsky himself moved on? As Tomasello (2008: 312) describes it, ‘Originally the hypothesis was fairly straightforward, as universal grammar contained such purely linguistic things as nouns, verbs and basic rules of European grammar. But as it became clear that these things did not fit many non-European languages, the hypothesis changed to include very abstract linguistic things, supposedly representing the universal computational structure of language – such things as the subjacency constraint, the empty category principle, the theta-criterion, the projection principle, and so on. But as it has become clear that these things are totally theory-dependent and the theory has been abandoned, the proposal is now that there is simply one specifically linguistic computational principle, and that is recursion – and that may not even be specifically linguistic (Hauser, Chomsky and Fitch 2002) [As noted above, birds do it!]. The Chomskian hypothesis of an innate universal grammar thus currently has no coherent formulation.’

So – no PoS, and no UG either?

Well, that’s certainly the view taken by Kees de Bot in the latest issue of Applied Linguistics (2015: 36/2). ‘After 40 years of research, it is still unclear what UG consists of, and the research has been limited to a few syntactic features, such as pro-drop phenomena. Some of its core assumptions, like the so-called Logical Problem of Language Acquisition, the Poverty of Stimulus argument, and the existence of the Language Acquisition Device have been undermined by various types of research, based on large language corpora and cognitive modelling. Its premises have lost their weight for the SLA community…’ (p. 262).

Indeed, he quotes William Grabe (Vice President for Research and Regents’ Professor at Northern Arizona University) to the effect that ‘fundamentally, Chomsky is wrong and we wasted a lot of time’, while Andrea Tyler (Professor of Linguistics, Georgetown University) is quoted as saying, ‘The current gap between linguistics generally and language teaching was partly caused by the fact that UG could not explain many aspects of SLA and many people gave up linguistics due to the UG failure’.

Crikey!

10 06 2015
Paul Gallantry

Thanks for the post and the fascinating arguments in the comments!
Just to add my two pennorth:
Surely all language acquisition is inherently experiential and emergent? That is, through relatively minimal input and repetition, the acquirer gradually builds up syntactical and grammatical complexity, akin to walking (or driving, if you’re lazy) the same route many times, and gradually extending the journey, as it were.
I’ve always suspected that language knowledge is in many ways similar to handedness, i.e. I can do certain things very well with my dominant hand without thinking, but it requires more effort when I use the other hand. For example, I can write the same message with either hand, but the fluency with which I do it may significantly differ.

10 06 2015
geoffjordan

Wow! Peace at last! I read all of that and I can only find one thing to disagree with.

* You’re right about Nick Ellis. You should pay more attention to him and less to D. Larsen-Freeman 🙂
* You’re right about Chomsky having moved on to a very weird place. Kees de Bot is quite right, I think. Good article.
* Don’t agree with Grabe – tho I can easily believe that he wasted a lot of time.
* Agree strongly with Tyler.

So much for Chomsky. Now, what were you saying about a new approach to classroom materials?

10 06 2015
Scott Thornbury

Peace is good! But the ferocity that this argument seems to generate – and has generated – suggests it touches on something more profound than whether there’s a pro-drop parameter or not – something fundamental about the human condition – the nurture vs nature debate being just a part of it.

11 06 2015
geoffjordan

I agree, Scott.

May I just use this space, since there’s no “reply” space under it, to reply to Russ’ peevish little contribution, posted today. It’s not an argument from authority because I was careful to say “and just by the way”. As for the question “Why would I or anyone really care about what academics in other fields think?” I suggest you take your foot out of your mouth and ask your counsellor.

11 06 2015
Scott Thornbury

Not sure if this is relevant, but I came across this intriguing throwaway comment in the introduction of a book that arrived today (Fabbro, 1999, The Neurolinguistics of Bilingualism, Psychology Press):

‘Noam Chomsky is among those linguists who have tried to describe syntax as a set of logical rules … Although Chomsky made valuable contributions to linguistics in general – and nowadays many linguistics departments in North America are run by Chomsky’s pupils – his studies and theories have found a very limited use in the study of individuals with language disorders. Indeed, there is no data suggesting that the human brain organizes syntax according to rules […]. Therefore, the idea that syntax is governed by a set of rules should be considered as a temporary approach to this unsolved problem’ (p. 8).

12 06 2015
patrickamon

The final sentence of this quote appears to grant that the Chomskyan approach is the best we have presently available, observing only that it may be superseded by a better one in the future. I think I’m right in thinking that both Geoff and Noam will readily allow this. Indeed, isn’t that much about the best that we can say about any theory of anything?

12 06 2015
Svetlana

Dear Scott,
In my opinion looking for a language genome with the LAD programmed in it is a kind of dated idea. It is not likely to be found, ever. Nowadays a much more popular word is connectome — the mapping of all neural connections within an organism’s nervous system (Wikipedia, sorry). You are your connectome, neuroscientists say. With all that newest equipment available to lucky neuropsychologists it is possible to trace the language connectome, and this is a real thing. Though one may assume that there are certain areas in every human brain which are responsible for language acquisition, this connectome is unique to every person as it is based on this person’s experience of constant, non-stop interaction with the environment. And this is my reading to do in the summer.
Would you agree that this is a more viable idea than those ‘colorless green ones that sleep furiously’? The latter smell deadly …:-O

21 07 2015
Scott Thornbury

Footnote: this paper by Eva Dabrowska is one of the best refutations of the PoS argument – and its associated innatist paraphernalia – that I’ve yet read (or understood!): http://journal.frontiersin.org/article/10.3389/fpsyg.2015.00852/full
Thanks to Michal Paradowski for bringing it to my attention.

26 04 2017
Patrick

It’s common sense that children have an innate ability to learn their (first) language – any mum or dad or older brother or sister knows this.

But an ability to learn is not the same as an innate grammar.

The PoS has never been demonstrated adequately.

If you abstract language from meaning (as Chomsky did) you end up with meaninglessness, which is what he achieved.

TG/UG has now been around for half a century, but it has little or nothing to show (in linguistic terms), and absolutely zero in SLA terms, for the vast amounts of money spent in the USA on its research.

It damaged the progress of linguistics in the US massively, not just in theoretical terms, but in career terms.
