P is for Predictions (part 1)

3 09 2017

I’m regularly asked to make predictions about the future of ELT: it’s one of the liabilities of being on a conference panel, for starters. I seldom feel comfortable in the role. Perhaps the safest prediction is of the order: ‘Any prediction that I make now about ELT in X years’ time will be laughably wrong when the time comes.’

It came up again when I was guesting on a DELTA course recently: the teachers were understandably anxious about the future of their chosen career. However, I decided to re-frame the question – not in terms of predictions (‘What will ELT be like in 10 years?’), but in terms of identifying some of the on-going tensions in ELT, the outcomes of which (following a dialectic mode of thinking) will surely determine the shape of ELT in 5 or 10 or however many years’ time.


The IH Barcelona DELTA class: Teachers with a future! (courtesy @sanchiadanielle)

These are some of the tensions I discussed:

  1. The tension between the local and the global

Incontestably, English is a global language (although its reputation may have taken some knocks recently – see E is for English). It therefore lends itself to the kind of commodification and marketization that we associate with other items of mass consumption – such as fast foods, trainers, and cell-phones. These processes of ‘McDonaldization’ (Ritzer 1998) are evidenced in the way textbooks are produced, marketed and distributed globally, and in the ‘branding’ of high-stakes exams – such that an English class in Thailand is likely to be using the same materials to prepare for the same exam as is a class in Chile.  Or Armenia. Or anywhere.

Countering this ethos of ‘one-size-fits-all’ is what Kumaravadivelu (2001, p. 538) calls ‘a pedagogy of particularity’, i.e. one that is ‘sensitive to a particular group of teachers teaching a particular group of learners pursuing a particular set of goals within a particular institutional context embedded in a particular sociocultural milieu.’ Because, as Pennycook insistently reminds us, English is a global language used locally: ‘Everything happens locally. However global a practice may be, it still always happens locally’ (2010, p.128).

Attempts to ‘particularize’ English teaching include the production of locally-authored, culturally-specific curricula and materials, implemented by means of an ‘appropriate methodology’ (Holliday 1994). Arguably, digital technologies have made the production of such home-grown materials a lot easier, while resistance to the uncritical importation of ‘Western’ methodologies is voiced frequently (e.g. Burnaby & Sun 1989; Li 1998). But it is not clear how the global vs local tension will play out.

  2. The tension between teaching and testing

Driven by the aforementioned globalizing forces, but also fuelled by the neo-liberal obsession with accountability and standardization, and lubricated by ever more sophisticated data-gathering mechanisms, high-stakes testing now dominates many educational contexts, not least English language teaching. The numbers speak for themselves: the IELTS test, for example, was taken by nearly 3 million candidates in 2016 – at around $200 a go, this is big business. It is also a nice little earner for language schools, with the result that many teachers feel that they are now simply in the business of test preparation.

‘Teaching-for-the-test’ has also seen the rise of standards-based, or competency-based teaching (also known as mastery learning), where the syllabus consists of an inventory of bite-sized ‘competencies’, each one taught and tested in isolation, on the assumption that all these bits will magically coalesce into a whole. (These bite-sized learning chunks also lend themselves to [teacher-less] online delivery).  This has led to a culture of testing that is the despair of many educationalists, Diane Ravitch (2010, p. 16) being one of the more vocal: ‘How did testing and accountability become the main levers of school reform?  …  What was once an effort to improve the quality of education turned into an accounting strategy’. And she adds, ‘Tests should follow the curriculum. They should not replace it or precede it.’

Will the tide turn? Will teaching and learning reassert their rightful place in the curriculum? Don’t hold your breath.

  3. The tension between classroom instruction and self-study

Learners have always learned languages through self-study packages, whether by means of books (of the Teach yourself Swahili type); long-playing records, cassettes or CDs; video, in combination with any of the above, and, of course, more recently, online and via apps such as Duolingo. In July this year, for example, Duolingo was boasting an estimated 200 million ‘active’ users[i], although what constitutes ‘active’ is a moot point. Nevertheless, the availability, ease and low cost of many self-study tools (Duolingo is of course free) means that they constitute a real threat to traditional classroom teaching.

Their single biggest drawback is, of course, the lack of any real face-to-face interaction, including personalized instruction and feedback (where ‘personalized’ means ‘mutually intersubjective,’ not ‘individually customized by an algorithm’). That is what classrooms offer – or should. But, of course, such ‘human’ interactions come at a cost. Will language learning in the future be primarily app-mediated, with classroom teaching relegated to a high-end, niche status? And/or will classroom teachers be compelled to migrate en masse to call-centre-type facilities in order to provide the much-needed human interaction that these apps will offer as a premium add-on? Watch this space.

There are at least three other tensions, the outcome of which I suspect will shape our collective futures. But I’ll deal with those next week.

References

Burnaby, B. and Sun, Y. (1989) ‘Chinese teachers’ views of western language teaching: context informs paradigms’. TESOL Quarterly, 23/2.

Holliday, A. (1994) Appropriate Methodology and Social Context. Cambridge: Cambridge University Press.

Li, D. (1998) ‘”It’s always more difficult than you plan or imagine”: Teachers’ perceived difficulties in introducing the communicative approach in South Korea’. TESOL Quarterly, 32/4.

Kumaravadivelu, B. (2001) ‘Towards a post-method pedagogy.’ TESOL Quarterly, 35/4.

Pennycook, A. (2010) Language as a local practice. London: Routledge.

Ravitch, D. (2010) The death and life of the great American school system: how testing and choice are undermining education. New York: Basic Books.

Ritzer, G. (1998) The McDonaldization Thesis: Explorations and extensions. London: Sage.

[i] http://expandedramblings.com/index.php/business-directory/24390/duolingo-stats-facts/





M is for Machine translation

2 07 2017

(Or: How soon will translation apps make us all redundant?)


An applied linguist collecting data

In a book published on the first day of the new millennium, futurologist Ray Kurzweil (2000) predicted that spoken language translation would be common by the year 2019 and that computers would reach human levels of translation by 2029. It would seem that we are well on track. Maybe even a few years ahead.

Google Translate, for example, was launched in 2006, and now supports over 100 languages, although, since it draws on an enormous corpus of already translated texts, it is more reliable with ‘big’ languages, such as English, Spanish, and French.

A fair amount of scorn has been heaped on Google Translate but, in the languages I mostly deal with, I have always found it fairly accurate. Here for example is the first paragraph of this blog translated into Spanish and then back again:

En un libro publicado el primer día del nuevo milenio, el futurólogo Ray Kurzweil (2000) predijo que la traducción hablada sería común para el año 2019 y que las computadoras llegarían a niveles humanos de traducción para 2029. Parecería que estamos bien en el camino. Tal vez incluso unos años por delante.

In a book published on the first day of the new millennium, futurist Ray Kurzweil (2000) predicted that the spoken translation would be common for 2019 and that computers would reach human translation levels by 2029. It would seem we are well on the road. Maybe even a few years ahead.

Initially text-to-text based, Google Translate has more recently been experimenting with a ‘conversation mode’, i.e. speech-to-speech translation, the ultimate goal of machine translation – and memorably foreshadowed by the ‘Babel fish’ of Douglas Adams (1995): ‘If you stick a Babel fish in your ear you can instantly understand anything said to you in any form of language.’

The boffins at Microsoft and Skype have been beavering away towards the same goal: to produce a reliable speech-to-speech translator in a wide range of languages. For a road test of Skype’s English-Mandarin product, see here: https://qz.com/526019/how-good-is-skypes-instant-translation-we-put-it-to-the-chinese-stress-test/

The verdict (two years ago) was less than impressive, but the reviewers concede that Skype Translator will ‘only get better’ – a view echoed by The Economist last month:

Translation software will go on getting better. Not only will engineers keep tweaking their statistical models and neural networks, but users themselves will make improvements to their own systems.

Mention of statistical models and neural networks reminds us that machine translation has evolved through at least three key stages since its inception in the 1960s. First was the ‘slot-filling stage’, whereby individual words were translated and plugged into syntactic structures selected from a built-in grammar.  This less-than-successful model was eventually supplanted by statistical models, dependent on enormous data-bases of already translated text, which were rapidly scanned using algorithms that sought out the best possible phrase-length fit for a given word. Statistical Machine Translation (SMT) was the model on which Google Translate was initially based. It has been successful up to a point, but – since it handles only short sequences of words at a time – it tends to be less reliable dealing with longer stretches of text.
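
To make the phrase-table idea concrete, here is a deliberately toy sketch in Python. The ‘phrase table’ and the sentences are invented for illustration, and the greedy longest-match strategy is a gross simplification: real SMT systems of the kind Google Translate originally used score millions of competing phrase segmentations and reorderings statistically. But even the toy version shows why such systems cope well with short, frequent sequences and falter over anything that isn’t already in the table.

```python
# Toy phrase-based "translation": greedy longest-match against a phrase table.
# The table and sentences are invented; real SMT systems weigh millions of
# candidate segmentations statistically rather than matching greedily.

PHRASE_TABLE = {
    ("thank", "you", "very", "much"): "muchas gracias",
    ("it", "is", "raining"): "está lloviendo",
    ("the", "book"): "el libro",
    ("is", "on"): "está en",
    ("the", "table"): "la mesa",
    ("book",): "libro",
    ("table",): "mesa",
}

MAX_PHRASE_LEN = 4  # longest phrase we bother looking up

def translate(sentence: str) -> str:
    words = sentence.lower().split()
    out, i = [], 0
    while i < len(words):
        # try the longest phrase first, then progressively shorter ones
        for length in range(min(MAX_PHRASE_LEN, len(words) - i), 0, -1):
            chunk = tuple(words[i:i + length])
            if chunk in PHRASE_TABLE:
                out.append(PHRASE_TABLE[chunk])
                i += length
                break
        else:
            out.append(f"<{words[i]}?>")  # unknown word: passed through, flagged
            i += 1
    return " ".join(out)

print(translate("The book is on the table"))   # el libro está en la mesa
print(translate("Thank you very much"))        # muchas gracias
```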


An early translation app

More recently still, so-called neural machine translation (NMT), modelled on neural networks, attempts to replicate mental processes of text interpretation and production. As Microsoft describes it, NMT works in two stages:

  • A first stage models the word that needs to be translated based on the context of this word (and its possible translations) within the full sentence, whether the sentence is 5 words or 20 words long.
  • A second stage then translates this word model (not the word itself but the model the neural network has built of it), within the context of the sentence, into the other language.
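
Purely by way of illustration (and emphatically not Microsoft’s actual system), here is a toy numpy sketch of that two-stage idea: stage one builds a context-sensitive vector for a source word, stage two maps it into the target language’s space and picks the nearest word. The miniature ‘embeddings’, the mapping matrix and the Spanish candidates are all invented; a real NMT system learns the equivalent parameters from millions of sentence pairs.

```python
# A toy illustration of the two-stage idea: (1) build a context-sensitive
# vector for a source word, (2) map that vector into the target language's
# embedding space and pick the nearest target word. All values are invented.

import numpy as np

# Tiny invented embedding spaces (3-d) for source and target words.
SRC = {"bank": np.array([1.0, 0.0, 0.0]),
       "river": np.array([0.0, 1.0, 0.0]),
       "money": np.array([0.0, 0.0, 1.0])}
TGT = {"orilla": np.array([0.6, 0.8, 0.0]),   # 'bank' in the river sense
       "banco":  np.array([0.6, 0.0, 0.8])}   # 'bank' in the money sense

W = np.eye(3)  # the 'learned' source-to-target mapping; identity for simplicity

def contextual_vector(word, sentence):
    """Stage 1: the word's vector nudged towards the average of its context."""
    context = [SRC[w] for w in sentence if w != word and w in SRC]
    ctx = np.mean(context, axis=0) if context else np.zeros(3)
    return 0.6 * SRC[word] + 0.4 * ctx

def translate_word(word, sentence):
    """Stage 2: map the contextual vector across, take the nearest target word."""
    v = W @ contextual_vector(word, sentence)
    sims = {t: float(v @ e / (np.linalg.norm(v) * np.linalg.norm(e)))
            for t, e in TGT.items()}
    return max(sims, key=sims.get)

print(translate_word("bank", ["river", "bank"]))   # orilla
print(translate_word("bank", ["money", "bank"]))   # banco
```

Even at this toy scale, the context does the disambiguating work: ‘bank’ next to ‘river’ comes out as orilla, next to ‘money’ as banco.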

Because NMT systems learn on the job and have a predictive capability, they are able to make good guesses as to when to start translating and when to wait for more input, and thereby reduce the lag between input and output.  Combined with developments in voice recognition software, NMT provides the nearest thing so far to simultaneous speech-to-speech translation, and has generated a flurry of new apps. See for example:

https://www.youtube.com/watch?v=ZAHfevDUMK4

https://www.cnet.com/news/this-in-ear-translator-will-hit-eardrums-next-month-lingmo/

One caveat to balance against the often rapturous claims made by their promoters is that many of these apps are trialled using fairly routine exchanges of the type Do you know a good sushi restaurant near here? They need to be able to prove their worth in a much wider variety of registers, both formal and informal. Nevertheless, Kurzweil’s prediction that speech-to-speech translation will be commonplace in two years’ time looks closer to being realized. What, I wonder, will it do to the language teaching industry?

As a footnote, is it not significant that developments in machine translation seem to have mirrored developments in language acquisition theory in general, and specifically the shift from a  focus primarily on syntactic processing to one that favours exemplar-based learning? Viewed from this perspective, acquisition – and translation – is less the activation of a pre-specified grammar, and more the cumulative effect of exposure to masses of data and the probabilistic abstraction of the regularities therein. Perhaps the reason that a child – or a good translator – never produces sentences of the order of Is the man who tall is in the room? or John seems to the men to like each other (Chomsky 2007) is not because these sentences violate structure-dependent rules, but because the child/translator has never encountered instances of anything like them.

References

Adams, D. (1995) The hitchhiker’s guide to the galaxy. London: Heinemann.
Chomsky, N. (2007) On Language. New York: The New Press.
Kurzweil, R. (2000) The Age of Spiritual Machines: When Computers Exceed Human Intelligence.  Penguin.

 





I is for Innovation

9 05 2015

This is a dress rehearsal of my opening ‘mini-plenary’ for the hugely successful ELT Innovate conference, held this weekend in Barcelona – on the subject, unsurprisingly, of innovation.

These are the books and articles I refer to:

Postman, N., & Weingartner, C. (1969) Teaching as a subversive activity. Penguin Education.

Selwyn, N. (2011) Education and Technology: Key Issues and Debates, London: Continuum.

Selwyn, N. (2013) Distrusting Educational Technology: Critical Questions for Changing Times. London: Routledge.

Selwyn, N. (2015) ‘Minding our language: Why education and technology is full of bullshit … and what might be done about it’, paper given to the ‘Digital Innovation, creativity and knowledge in education’ conference, Qatar, January 2015.





S is for SLA

1 03 2015

The criteria for evaluating the worth of any aid to language learning (whether print or digital, and, in the case of the latter, whether app, program, game, or the software that supports these) must include some assessment of its fitness for purpose. That is to say, does it facilitate learning?

But how do you measure this? Short of testing the item on a representative cross-section of learners, we need a rubric according to which its learning potential might be predicted. And this rubric should, ideally, be informed by our current understandings of how second languages are best learned, understandings which are in turn derived — in part at least — from the findings of researchers of second language acquisition (SLA).

This is easier said than done, of course, as there is (still) little real consensus on how the burgeoning research into SLA should be interpreted. This is partly because of the invisibility of most cognitive processes, but also because of the huge range of variables that SLA embraces: different languages, different aspects of language, different learners, different learning contexts, different learning needs, different learning outcomes, different instructional materials, and so on. Generalizing from research context A to learning context B is fraught with risks. It is for this reason that, in a recent article, Nina Spada (2015) urges caution in extrapolating classroom applications from the findings of SLA researchers.

Cautiously, then, and following VanPatten and Williams’ (2007) example, I’ve compiled a list of ‘observations’ about SLA that have been culled from the literature (albeit inflected by my own particular preoccupations). On the basis of these, and inspired by Long (2011), I will then attempt to frame some questions that can be asked of any teaching aid (tool, device, program, or whatever) in order to calculate its potential for facilitating learning.

Exposure to input is necessary

Here, then, are 12 observations:

  1. The acquisition of an L2 grammar follows a ‘natural order’ that is roughly the same for all learners, independent of age, L1, instructional approach, etc., although there is considerable variability in terms of the rate of acquisition and of ultimate achievement (Ellis 2008), and, moreover, ‘a good deal of SLA happens incidentally’ (VanPatten and Williams 2007).
  2. ‘The learner’s task is enormous because language is enormously complex’ (Lightbown 2000).
  3. ‘Exposure to input is necessary’ (VanPatten and Williams 2007).
  4. ‘Language learners can benefit from noticing salient features of the input’ (Tomlinson 2011).
  5. Learners benefit when their linguistic resources are stretched to meet their communicative needs (Swain 1995).
  6. Learning is a mediated, jointly-constructed process, enhanced when interventions are sensitive to, and aligned with, the learner’s current stage of development (Lantolf and Thorne 2006).
  7. ‘There is clear evidence that corrective feedback contributes to learning’ (Ellis 2008).
  8. Learners can learn from each other during communicative interaction (Swain et al. 2003).
  9. Automaticity in language processing is a function of ‘massive repetition experiences and consistent practice’ in ‘real operating conditions’ (Segalowitz 2003; Johnson 1996).
  10. A precondition of fluency is having rapid access to a large store of memorized sequences or chunks (Nattinger & DeCarrico 1992; Segalowitz 2010).
  11. Learning, particularly of words, is aided when the learner makes strong associations with the new material (Sökmen 1997).
  12. The more time (and the more intensive the time) spent on learning tasks, the better (Muñoz 2012). Moreover, ‘learners will invest effort in any task if they perceive benefit from it’ (Breen 1987); and task motivation is optimal when challenge and skill are harmonized (Csikszentmihalyi 1990).

On the basis of these observations, and confronted by a novel language learning tool (app, game, device, blah blah), the following questions might be asked:

  1. ADAPTIVITY: Does the tool accommodate the non-linear, often recursive, stochastic, incidental, and idiosyncratic nature of learning, e.g. by allowing the users to negotiate their own learning paths and goals?
  2. COMPLEXITY: Does the tool address the complexity of language, including its multiple interrelated sub-systems (e.g. grammar, lexis, phonology, discourse, pragmatics)?
  3. INPUT: Does it provide access to rich, comprehensible, and engaging reading and/or listening input? Are there means by which the input can be made more comprehensible? And is there a lot of input (so as to optimize the chances of repeated encounters with language items, and of incidental learning)?
  4. NOTICING: Are there mechanisms whereby the user’s attention is directed to features of the input and/or mechanisms that the user can enlist to make features of the input salient?
  5. OUTPUT: Are there opportunities for language production? Are there means whereby the user is pushed to produce language at or even beyond his/her current level of competence?
  6. SCAFFOLDING: Are learning tasks modelled and mediated? Are interventions timely and supportive, and calibrated to take account of the learner’s emerging capacities?
  7. FEEDBACK: Do users get focused and informative feedback on their comprehension and production, including feedback on error?
  8. INTERACTION: Is there provision for the user to collaborate and interact with other users (whether other learners or proficient speakers) in the target language?
  9. AUTOMATICITY: Does the tool provide opportunities for massed practice, and in conditions that replicate conditions of use? Are practice opportunities optimally spaced?
  10. CHUNKS: Does the tool encourage/facilitate the acquisition and use of formulaic language?
  11. PERSONALIZATION: Does the tool encourage the user to form strong personal associations with the material?
  12. FLOW: Is the tool sufficiently engaging and challenging to increase the likelihood of sustained and repeated use? Are its benefits obvious to the user?

Is it better than a teacher?

This list is very provisional: consider it work in progress. But it does replicate a number of the criteria that have been used to evaluate educational materials generally (e.g. Tomlinson 2011) and educational technologies specifically (e.g. Kervin and Derewianka 2011). At the same time, the questions might also provide a framework for comparing and contrasting the learning power of self-access technology with that of more traditional, teacher-mediated classroom instruction. Of course, the bottom line is: does the tool (app, program, learning platform etc) do the job any better than a trained teacher on their own might do?

Any suggestions for amendments and improvements would be very welcome!

References:

Breen, M. P. (1987) ‘Learner contributions to task design’, republished in van den Branden, K., Bygate, M. & Norris, J. (eds) (2009) Task-based Language Teaching: A reader. Amsterdam: John Benjamins.

Csikszentmihalyi, M. (1990) Flow: The psychology of optimal experience. New York: Harper & Row.

Ellis, R. (2008) The Study of Second Language Acquisition (2nd edn). Oxford: Oxford University Press.

Kervin, L. & Derewianka, B. (2011) ‘New technologies to support language learning’, in Tomlinson, B. (ed.) Materials Development in Language Teaching (2nd edn). Cambridge: Cambridge University Press.

Lightbown, P.M. (2000) ‘Classroom SLA research and second language teaching’. Applied Linguistics, 21/4, 431-462.

Long, M.H. (2011) ‘Methodological principles for language teaching’. In Long, M.H. & Doughty, C. (eds) The Handbook of Language Teaching, Oxford: Blackwell.

Muñoz, C. (ed.) (2012). Intensive Exposure Experiences in Second Language Learning. Bristol: Multilingual Matters.

Nattinger, J.R. & DeCarrico, J.S. (1992). Lexical Phrases and Language Teaching. Oxford: Oxford University Press.

Segalowitz, N. (2003) ‘Automaticity and second languages.’ In Doughty, C.J. & Long, M.H, (eds) The Handbook of Second Language Acquisition. Oxford: Blackwell.

Segalowitz, N. (2010) Cognitive Bases of Second Language Fluency. London: Routledge.

Sökmen, A.J. (1997) ‘Current trends in teaching second language vocabulary,’ in Schmitt, N. and McCarthy, M. (Eds.) Vocabulary: Description, Acquisition and Pedagogy. Cambridge: Cambridge University Press.

Spada, N. (2015) ‘SLA research and L2 pedagogy: misapplications and questions of relevance.’ Language Teaching, 48/1.

Swain, M. (1995) ‘Three functions of output in second language learning’, in Cook, G., & Seidlhofer, B. (eds) Principle and Practice in Applied Linguistics: Studies in Honour of H.G.W. Widdowson. Oxford: Oxford University Press.

Swain, M., Brooks, L. & Tocalli-Beller, A. (2003) ‘Peer-peer dialogue as a means of second language learning’. Annual Review of Applied Linguistics, 23: 171-185.

Tomlinson, B. (2011) ‘Introduction: principles and procedures of materials development,’ in Tomlinson, B. (ed.) Materials Development in Language Teaching (2nd edn). Cambridge: Cambridge University Press.

VanPatten, B. & Williams, J. (eds) (2007) Theories in Second Language Acquisition: An Introduction. Mahwah, NJ: Lawrence Erlbaum.

This is a revised version of a post that first appeared on the eltjam site:  http://eltjam.com/how-could-sla-research-inform-edtech/

 





The End

9 06 2013

So this is it, folks: I’m closing down the blog for the summer… and for good. After 3 years, 150 posts, nearly 7000 comments, and innumerable hits, visits, views, however you want to describe and count them, plus one e-book spin-off (but no sign of a second edition of An A-Z!), I think it’s time to call it a day.

But that’s not the end of blogging.  In the autumn (or in the spring, if that’s your orientation) I’ll be resuming with an altogether different theme and format, provisionally titled The (De-)Fossilization Diaries.  Watch this space!

At some point between now and then I’ll lock the comments on this blog, but it will hang around a little longer. If you think you might miss it if it suddenly disappeared, you could always buy the book! 😉

Meanwhile, thanks for following, commenting, subscribing, tweeting… I have so enjoyed hosting this blog, not least because of the active and widely-distributed online community that has grown up around it. Blogging is my favourite medium by far, and, despite claims to the contrary by some curmudgeons, it seems to be very much alive and well.

Now, to give you something to chew on over breakfast, I’ve done a quick cut and paste of some of the one- (or two-) liners that capture many of the core themes of this blog. (You can hunt them down in context by using the Index link above).

1. If there are no languages, only language, what is it that we teach? … The short answer, perhaps, is that we would facilitate a kind of creative DIY approach – semiotic bricolage, perhaps – by means of which learners would become resourceful language users, cutting and pasting from the heteroglossic landscape to meet both their short-term and their long-term goals. (L is for Language)

2. The tension – and challenge – of successful communication is in negotiating the given and the new, of exploiting the predictable while coping with unpredictability. To this end, a phrasebook, a grammar or a dictionary can be of only limited use. They are a bit like the stopped clock, which is correct only two times a day. (M is for Mobility)

3. Creating the sense of ‘feeling at home’, i.e. creating a dynamic whereby students feel unthreatened and at ease with one another and with you, is one of the most important things that a teacher can do. (T is for Teacher Development)

4. A reliance on the coursebook IN the classroom does not really equip learners for self-directed learning OUTSIDE the classroom, since nothing in the outside world really reflects the way that language is packaged, rationed and sanitised in the coursebook. (T is for Teacher Development)

5. The language that teachers need in order to provide and scaffold learning opportunities is possibly of more importance than their overall language proficiency (T is for Teacher Knowledge)

6. A critical mass of connected chunks might be the definition of fluency. (Plus of course, the desire or need to BE fluent). (T is for Turning Point)

7. Education systems are predicated on the belief that learning is both linear and incremental. Syllabuses, coursebooks and tests conspire to perpetuate this view. To suggest otherwise is to undermine the foundations of civilization as we know it. (T is for Turning Point)

8. If I were learning a second language with a teacher, I would tell the teacher what I want to say, not wait to be told what someone who is not there thinks I might want to say. (W is for Wondering)

9. Irrespective of the degree to which we might teach grammar explicitly, or even base our curriculums on it, as teachers I think we need to know something about it ourselves. It’s part of our expertise, surely. Besides which, it’s endlessly fascinating (in a geeky kind of way). (P is for Pedagogic grammar)

10. Every language divides up the world slightly differently, and learning a second language is – to a large extent – learning these new divisions. (P is for Pedagogic grammar)

11. The meaning of the term student-centred has become too diffuse – that is to say, it means whatever you want it to mean, and – whatever it does mean – the concept needs to be problematized because it’s in danger of creating a false dichotomy. (S is for Student-centred)

12. There is a responsibility on the part of teachers to provide feedback on progress, but maybe the problem is in defining progress in terms of pre-selected outcomes, rather than negotiating the outcomes during the progress. (O is for Outcomes)

13. Language learning, whether classroom-based or naturalistic, whether in an EFL or an ESL context, is capricious, opportunistic, idiosyncratic and seldom amenable to external manipulation. (P is for Postmodern method)

14. I have no problem with the idea of classes – in fact for many learners and teachers these can be less threatening than one-to-one situations – but I do have a problem with the way that the group learning context is moulded to fit the somewhat artificial constraints of the absentee coursebook writer. (P is for Postmodern method)

15. The idea that there is a syllabus of items to be ‘covered’ sits uncomfortably with the view that language learning is an emergent process – a process of ‘UNcovering’, in fact. (P is for Postmodern method)

16. This, by the way, is one of [Dogme’s] characteristics that most irritates its detractors – that it seems to be a moving target, constantly slipping and sliding like some kind of methodological ectoplasm. (P is for Postmodern method)

17. The ‘mind is a computer’ metaphor has percolated down (or up?) and underpins many of our methodological practices and materials, including the idea that language learning is systematic, linear, incremental, enclosed, uniform, dependent on input and practice, independent of its social context, de-humanized, disembodied, … and so on. (M is for Mind)

18. Is there no getting away from the fact that classrooms are just not good places to learn languages in? And that, instead of flogging the present perfect continuous to death, it might not be better simply ‘to take a walk around the block’? (A is for Affordance)

19. If automaticity is simply the ability to retrieve memorised chunks, this may result in a repertoire that is fast and accurate, but functional only in situations of the utmost predictability. Fine, if you’re a tourist – just memorise a phrase-book. But for a more sophisticated command of language – one that is adaptable to a whole range of situations – you need to be able to customise your chunks. In short, you need to be creative. Hence, creative automaticity. (A is for Automaticity)

20. Technosceptics, like me, happily embrace technology in our daily lives, but are nevertheless a little suspicious of the claims made, by some enthusiasts, for its educational applications – claims that frequently border on the coercive. (T is for Technology)

21. As edtech proponents tirelessly point out, technology is only a tool. What they fail to acknowledge is that there are good tools and bad tools. (T is for Technology)

22. Another bonus, for me, of the struggle to dominate a second (and third, fourth etc) language has been an almost obsessive interest in SLA theory and research – as if, somewhere, amongst all this burgeoning literature, there lies the answer to the puzzle. (B is for Bad language learner)

23. ‘Fluency is in the ear of the beholder’ – which means that perhaps we need to teach our students tricks whereby they ‘fool’ their interlocutors into thinking they’re fluent. Having a few well rehearsed conversational openers might be a start…. (B is for Bad language learner)

24. I’ve always been a bit chary of the argument that we should use movement in class in order to satisfy the needs of so-called kinaesthetic learners. All learning surely has kinaesthetic elements, especially if we accept the notion of ‘embodied cognition’, and you don’t need a theory of multiple intelligences to argue the case for whole-person engagement in learning. (B is for Body)

25. I agree that learners’ perceptions of the goals of second language learning are often at odds with our own or with the researchers’. However, if we can show [the learners] that the communicative uptake on acquiring a ‘generative phraseology’ is worth the initial investment in memorisation, and, even, in old-fashioned pattern practice, we may be able to win them over. (C is for Construction)

26. How do we align the inherent variability of the learner’s emergent system with the inherent variability of the way that the language is being used by its speakers? (V is for Variability)

27. The problem is that, if there is a norm, it is constantly on the move, like a flock of starlings: a dense dark centre, a less dense margin, and a few lone outliers. (V is for Variability)

28. Think of the blackbird. Every iteration of its song embeds the echo, or trace, of the previous iteration, and of the one before that, and the one before that, and so on. And each iteration changes in subtle, sometimes barely perceptible, ways. But the net effect of these changes may be profound. (R is for Repetition [again])

29. Diversity is only a problem if you are trying to frog-march everyone towards a very narrowly-defined objective, such as “mastering the present perfect continuous.” If your goals are defined in terms of a collaborative task outcome … then everyone brings to the task their particular skills, and it is in the interests of those with many skills to induct those with fewer. (E is for Ecology)

30. Teaching […] is less about navigating the container-ship of the class through the narrow canal of the coursebook/syllabus than about shepherding a motley flotilla of little boats, in all weathers, across the open sea, in whatever direction and at whatever speed they have elected to go. (P is for Postmodern method)






T is for Teacher development

27 05 2012

This is a summary of the keynote talk I gave yesterday at the IATEFL Learning Technologies and Teacher Development Joint SIG Conference, titled With or Without Technology, held at Yeditepe University, Istanbul this weekend.

Why Dogme is good for you.

Because the conference theme focuses on teacher development (TD), in both its ‘plugged’ and ‘unplugged’ manifestations, it’s perhaps timely to review the case for ‘teaching unplugged’, otherwise known as Dogme ELT (hereafter just Dogme), and try to situate it in relation to teacher development generally.

In its relatively long life (12 years and still counting) Dogme has generated a fair amount of heat – more, indeed, than its co-founders bargained for, and indicative, perhaps, of how surprisingly subversive it is. Formerly, this heat was confined mainly to the Dogme discussion list itself, but it has now migrated into the blogosphere at large, where, far from having dissipated, it seems to be burning more fiercely than ever. (I’m not the first to point out that you can increase the traffic to your blog exponentially by cocking a snook at Dogme!)

Among the criticisms that have been levelled at it these are some of the most frequent:

  • it doesn’t work for beginners
  • it doesn’t work with large groups
  • it doesn’t work with young learners
  • it doesn’t work with non-native speaker teachers
  • it’s not new
  • it doesn’t work because there’s no input
  • it doesn’t work because there’s no syllabus
  • it doesn’t work because there’s no attention to form
  • it doesn’t work in [insert name of the country where you work]
  • it doesn’t work with [insert any nationality] learners
  • it just doesn’t work, period.

Yeditepe University

Far from attempting to refute any of these claims, I would argue that they are in fact irrefutable. Method comparison, as a science, is dead in the water. There’s no controlling for all the variables, and sample sizes are usually too small to generalise from. And so on. So, for argument’s sake, I will simply accept that for some teachers these claims are plausible (just as for others the claims made for Dogme are equally plausible), and I will move on. (At the same time, whether or not the above claims are true, I don’t think Dogme has done anyone any harm. It’s not like HIV-denial or the anti-vaccine lobby. I don’t know of many students who have died because their teachers didn’t use coursebooks. But I may be wrong).

There is, however, one thing to be said about Dogme which is incontrovertibly true. And that is that – for a great number of teachers – Dogme has provided a framework for highly productive self-directed teacher development, involving cycles of experimentation and reflection, essential components for any developmental program. It has done this principally because it invites teachers to question some of the received wisdoms about language teaching, such as

  • that language learning is an incremental and linear process
  • that language learning is a purely cognitive process
  • that a grammar syllabus represents the best ‘route’ for language learning
  • that imported materials are better than learner-generated ones
  • that lessons have to be meticulously planned
  • that accuracy is a pre-condition for fluency
  • that teaching is better with technology

Dogme is by no means the first platform from which these claims have been challenged, but for reasons I still don’t entirely fathom, it seems to have been very successful at articulating its critique and broadcasting it to practising teachers. (The concurrent boom in online communication may have had something to do with it – an irony not lost on Dogme’s critics).

A glance through the quantity of postings on the list shows that many teachers have used one or more of the tenets of Dogme, either to initiate change in their own teaching, or to explain changes that they had already initiated – and often with spectacularly positive results, as this early post suggests:

…I’m buzzing at the moment ‘cos I’ve been lucky enough to hit on a couple of new groups who seem to have invented dogme themselves, and the things we’re coming up with together are stunning me into a state of ‘I’ve never loved teaching so much before – but is this really teaching?!’.

Well, it certainly seems to be learning – enthusiastically and really joyfully – for all of us.

And thanks to everyone in the group for helping me better appreciate what’s happening!

Some of the dogme blogs

Like the Dogme critics, the Dogme enthusiasts have also turned to blogging to get their teacher development message across. One notable instance of grassroots, collaborative Dogme-inspired teacher development was the ‘teach off’ that Chia Suan Chong initiated last month. Whatever doubts you might have about its scientific rigour, the buzz that it generated was truly remarkable.

Finally, and in advance of the conference, I did a little exercise in crowdsourcing, by tweeting the following question: ‘How has Dogme helped you develop as a teacher?’ Here is a small selection of the many replies I got:

@michaelegriffin: #Dogme helped me c that I wasn’t crazy to think that books weren’t a curriculum and that the people in the room are the key

@AnthonyGaughan: it encourages confidence in exploring my teaching self #DogmeTD

@dalecoulter: playing with variables in the lesson and reflecting on the results #DogmeTD

@kevchanwow: watching lively exchange within Dogme community makes me more comfortable trying new approaches in my own way & own classes

@kenwilsonlondon: #DogmeELT I couldn’t understand why my best lessons were when the class more/less forced me to abandon the plan. Now I know!

@esolamin; Haven’t followed Dogme as such, but ‘unplugged’/improvised activities produced more ss participation & interest, I found.

@englishraven It marked my progression into actually being a teacher- the whole deal, real thing. Not an instructional attendant #DogmeELT

@sx200i how has Dogme helped me. Pure enjoyment in my lessons. Confidence. Never bored! #DogmeTD





E is for eCoursebook

29 01 2012

Technology then

Apple’s plan, announced last week, to launch electronic publishing of school textbooks set social networks a-twitter, triggering flurries of excitement and apprehension in equal measure.  To expedite this initiative, Apple have launched an app, called iBooks Author, which allows wannabe textbook authors to create interactive ebooks and self-publish them (of course, only on an iPad, and with Apple taking a nice little chunk of the profits).

The enthusiasts have been talking up the way this technology will open up textbook writing to anyone with an iPad, while allowing material to be customized for very specific markets. Moreover, by shortcutting the laborious production processes of print publishing, plus the huge costs incurred, e-textbooks will be cheaper, as well as more eco-friendly, and less a burden on kids’ tender spines.

Detractors point to the ‘walled garden’ mentality of Apple, arguing that this is a cynical attempt to monopolise a ginormous market, further entrenching Apple products into schools, while raising the spectre of Apple as the world’s number one provider – and gatekeeper – of educational content.

Why does all this chattering leave me – if not cold – at least bemused?

Because, dear reader, you don’t actually need textbooks – of any description. Not for language learning, at least. Maths, history, economics – maybe. But ESOL? No way.

What do you need?

You need data, and you need incentives and tools to mine the data in order to make form-meaning connections, and to extract generative patterns and exemplars. You need scaffolded opportunities to put these ‘mappings’, patterns and exemplars to repeated communicative and creative use, and you need feedback on the results. Above all you need a social context (either real or envisioned), and the desire to belong to it, in order to activate and energise the whole process.

You don’t need textbooks to provide any of this, really. In fact, textbooks can’t provide most of it.  So, whether McNuggets Publishing produces textbooks or whether Apple does, it won’t actually impact on the way languages are learned. Not least because, thanks to the internet, all the means and tools are already in place to do the job a lot more effectively – and more cheaply.

Here’s a possible scenario, based on existing technology, or on technology that must surely be just round the corner, and assuming a ‘smart classroom’, i.e. an internet connection and a data projector:

  1. A topic arises naturally out of the initial classroom chat. The teacher searches for a YouTube video on that topic and screens it. The usual checks of understanding ensue, along with further discussion.
  2. A transcript of the video, or part of it, is generated using some kind of voice recognition software; alternatively, the learners work on a transcription together, and this is projected on to the interactive whiteboard, which is simply a whiteboard powered by an eBeam.
  3. A cloze test is automatically generated, which students complete.
  4. A word-list (and possibly a list of frequently occurring clusters) is generated from the text, using text processing tools such as those available at The Compleat Lexical Tutor. A keyword list is generated from the word list. Learners use the keywords to reconstruct the text – using pen and paper, or tablet computers. (A minimal sketch of how steps 3 and 4 might be automated follows this list.)
  5.  On the basis of the preceding task, problematic patterns or phrases are identified and further examples are retrieved using a phrase search tool.
  6.  The target phrases are individually ‘personalised’ by the learners and then shared, by being projected on to the board and anonymised, the task being to guess who said what, leading to further discussion. Alternatively, the phrases are turned into questions to form the basis of a class survey, conducted as a milling activity, then collated and summarised, again on to the board.
  7. In small groups students blog a summary of the lesson.
  8. At the same time, the teacher uses online software to generate a quiz of some of the vocabulary that came up in the lesson, to finish off with.
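
As promised in step 4, here is a minimal, standard-library-only sketch of how steps 3 and 4 might be automated: it gaps every nth content word to make a cloze, and pulls a frequency-ranked word list from the transcript. The stopword list and the sample transcript are invented placeholders, and dedicated tools such as The Compleat Lexical Tutor do all of this far more cleverly (lemmatisation, frequency bands, cluster extraction and so on).

```python
# A minimal sketch of automatic cloze and word-list generation from a transcript.
# Stopwords and the sample transcript are invented for illustration.

import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "are",
             "it", "that", "this", "was", "for", "on", "with", "as", "at"}

def cloze(text: str, every: int = 5) -> tuple[str, list[str]]:
    """Replace every nth content word with a numbered gap; return text plus answer key."""
    answers, gapped, count = [], [], 0
    for w in text.split():
        bare = re.sub(r"\W", "", w).lower()
        if bare and bare not in STOPWORDS:
            count += 1
            if count % every == 0:
                answers.append(bare)
                gapped.append(f"({len(answers)}) ______")
                continue
        gapped.append(w)
    return " ".join(gapped), answers

def word_list(text: str, top: int = 10) -> list[tuple[str, int]]:
    """Frequency-ranked content words: a crude stand-in for a keyword list."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS).most_common(top)

transcript = ("Plastic waste in the ocean is one of the fastest growing "
              "environmental problems, and plastic production keeps rising.")
gapped, key = cloze(transcript)
print(gapped)
print("Answers:", key)
print("Word list:", word_list(transcript))
```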

Remember vinyl?

Similar processes, whereby language study and practice opportunities are generated from self-selected online texts, are within reach of individual learners, working on their own, too. There are now search engines that will select texts on the basis of how easy or difficult they are to read. Hopefully someone is already working on an algorithm that will find a text in seconds according to your choice of level, topic, length, genre, and recency. And there are tools to create a hypertext link from every word in the text to an online dictionary. Programs exist that allow review and recycling of vocabulary items in a randomised order.
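
By way of a sketch of what such level-sensitive searching might involve, the snippet below scores candidate texts with the standard Flesch Reading Ease formula and keeps only those above a chosen threshold. The syllable counter is deliberately crude, the example texts and the cut-off score are invented, and a real search engine would of course combine readability with filters for topic, genre, length and recency.

```python
# Rough sketch of 'find me a text at my level': score candidate texts with the
# Flesch Reading Ease formula and keep the ones above a threshold.
# Higher scores = easier texts; the example texts and threshold are invented.

import re

def count_syllables(word: str) -> int:
    """Very rough: count groups of vowels, with a minimum of one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n = max(1, len(words))
    return 206.835 - 1.015 * (n / sentences) - 84.6 * (syllables / n)

def filter_by_level(texts: list[str], minimum: float = 60.0) -> list[str]:
    """Keep only texts at or above a target ease score."""
    return [t for t in texts if flesch_reading_ease(t) >= minimum]

candidates = [
    "The cat sat on the mat. It was warm. The sun shone.",
    "Notwithstanding considerable methodological heterogeneity, the "
    "meta-analysis demonstrates statistically significant attenuation.",
]
for t in candidates:
    print(round(flesch_reading_ease(t), 1), ":", t[:40])
print(filter_by_level(candidates))
```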

Predictive collocation tools allow students to create their own texts, selecting from high-frequency lexical and grammatical choices. Grammar and spell checks are increasingly more sophisticated. Online dictionaries and thesauri offer ready-made semantic networks. Free online video and audio tools mean that learners can record themselves doing a task and send it to other students or an instructor by email. Skype allows free video and/or audio interaction with other speakers, while the conversations thus generated can be audio-recorded for later transcription.

In short, anything (e)textbooks can do, the internet can do better. (This does not mean, of course, that I am advocating the exclusive use of online tools, or that the internet is the only alternative to coursebooks. But it is a viable one).





S is for Situation

15 05 2011

Don't even know what they're called in English!

In the wood department of a large hardware store in Barcelona this week, I needed to negotiate the purchase of five lengths of moulding whose function is to prevent rainwater from entering under the doors that open on to a terrace.

It was a situation which was partly familiar (routine service encounter script) but also partly unfamiliar, not least because of the vocabulary I needed, as well as certain unforeseen departures from the script, such as the fact that the wood comes in standard lengths so you have to buy more wood than you actually need. And, for anything in excess of five ‘saw cuts’, there is an extra charge. I got the wood, but not without some linguistic awkwardness. Could I have been better prepared for this situation?

Situation room? Dogme Symposium

What got me thinking about this is something Howard Vickers said at the Dogme Symposium at the IATEFL Conference in Brighton last month. Howard suggested that a syllabus of ‘situations’ might make a better fit with learners’ needs than a syllabus of grammar McNuggets. On his website, Howard shows how he applied this approach with respect to a specific student who “wanted to have a clearer sense of what he would be learning when”. Howard’s solution?  “I have developed a kind of syllabus that gives greater structure to the classes and yet is naturally student focused.  This syllabus is based around situations that the student may well find himself in and themes that he is interested in.”

The notion of a situational syllabus is, of course, not new. As far back as 1966, Pit Corder wrote that “one can perfectly well envisage theoretically a course which had as its starting point an inventory of situations in which the learner would have to learn to behave verbally.  These situations would be analysed into categories, some of which would be behavioural, and then, and only then, would the actual linguistic items be specified to make the situations meaningful” (p. 96).

Corder was coming from a well-established tradition in British linguistics that drew on the work of J. R. Firth, central to which was the proposition that the meaning of an utterance is dependent on its “context of situation”.  Obvious as it may seem to us now, it was Firth who was the first to claim that learning to use a language is a process of “learning to say what the other fellow expects us to say under the given circumstances” (1935/1957, p.28).  It was left to others, such as Michael Halliday, to attempt to answer the question: “How do we get from the situation to the text?” (1978, p. 142). That is to say, what is it in the situation that determines the way the text is?  Accordingly, Halliday’s project was to identify “the ecological properties of language, the features which relate it to its environment in the social system” (p. 141). The outcome of this quest is enshrined in his Introduction to Functional Grammar (1985).

While linguists were wrestling with these questions, teachers were already implementing what came to be known as Situational Language Teaching. Lionel Billows’ wonderful Techniques of Language Teaching (1961) outlines the principles that underpin this approach. (Incidentally, the much-loved English in Situations by Robert O’Neil [1970] is not part of the Situational Language Teaching tradition, since the situations are not the starting point of course design, but are devised solely to contextualise pre-specified grammar items).

In order to ‘situate’ language learning, Billows proposes a system of concentric circles, radiating out from the learner’s immediate context (e.g. the classroom) to the world as directly experienced, the world as imagined, and the world as indirectly experienced through texts. Billows argues that we should always seek to engage the outer circles by way of the inner ones.

Nowadays, of course, it is much easier, using existing technologies, to bring the outer world ‘into’ the classroom. Moreover, we are much better equipped to gather ‘thick descriptions’ of the kinds of situations our learners will need to negotiate. And, of course, the students themselves can be recruited to the task, becoming ‘ethnographers’ of their own language use. As Howard Vickers suggests, “students can prepare for a phone call or a shopping trip using a Personal Phrasebook to prepare and look up useful phrases before (or even during) the situation.  Students can then record the experience (using an MP3 player or other mobile device) and bring the recording to a subsequent class”.

Which makes me think, what else could I have done – using the available technology – to prepare myself for – and learn from – my wood-buying experience?

And, I guess the other question is: in a general English class of learners with disparate (or undefined) needs, how could a situational focus be successfully implemented?

Mission accomplished!

References:

Billows, L.F. 1961. The Techniques of Language Teaching. London: Longmans.

Corder, S.P. 1966. The Visual Element in Language Teaching. London: Longman.

Firth, J.R. 1957. Papers in Linguistics 1934-1951. London: Oxford University Press.

Halliday, M.A.K. 1978. Language as social semiotic: The social interpretation of language and meaning. London: Edward Arnold.

Halliday, M.A.K. 1985. Introduction to Functional Grammar. London: Edward Arnold.

O’Neil, R. 1970. English in Situations. Oxford: Oxford University Press.





T is for Technology

1 05 2011

The ELT Journal Debate, IATEFL 2011

There was a good deal of whooping and hollering after the ELTJ debate at the IATEFL Conference in Brighton a couple of weeks ago. And, in the face of Alan Waters’ well-argued, but somewhat lacklustre critique, Nicky Hockly deservedly won a healthy round of applause for her feisty defence of educational technologies.  But many of the comments from the floor seemed to reflect a wilful misunderstanding of the nature of the debate (admittedly, the motion – Twitter is for the birds… – was not helpful). Instead of arguing about the merits of integrating technology into (language) education, it became a free-for-all about technology in general (“I wouldn’t have been here if it hadn’t been for Twitter”, “If you are unable to follow a Twitter-stream you are soft in the head…” etc). Comments like these seemed to be largely irrelevant to the matter in hand, i.e. the uses (or abuses) of technology in language education.

There are good reasons for integrating technology into language education, and there are bad reasons. But the debate never seriously addressed them. Instead, the general view seemed to be that, if technology is good for laundering clothes or photographing Mars, it must, ipso facto, be good for education. QED.

Nicky Hockly and me: poles apart?

Moreover, by framing  the issue as an either/or one (inevitable, unfortunately, for a debate), the event served only to perpetuate the division between so-called technophiles and so-called technophobes, obscuring  the wide range of possible stances in between. One of these stances is that of the technosceptic.  Technosceptics, like me, happily embrace technology in our daily lives, but are nevertheless a little suspicious of the claims made, by some enthusiasts, for its educational applications – claims that frequently border on the coercive: If you don’t use technology in your classes you are unprofessional/ irresponsible/ old-fashioned/ in denial, or even (as one blogger put it) “a tad rude”.  And, as Hal Crowther (2010) wrote recently: “Coercion is not just interpersonal but societal, and pervasive. The word ‘Luddite’ which we used to wear with defiant pride, has become an epithet like ‘Communist’ or ‘reactionary’” (p.109).

Uncritical acceptance of any innovation, whether it be interactive whiteboards or multiple intelligence theory, needs to be subjected to a dose of level-headed scrutiny. And, as far as I am concerned, until the following four problems have been satisfactorily addressed, an ounce or two of scepticism regarding ‘ed tech’ seems well advised.

The delivery model problem: Despite the enormous potential technology has both to facilitate communication and to foster creativity, a lot of educational software still seems to be predicated on a delivery model of education: the more information learners have – and the quicker – the better. As a consequence, many publishers seem to be responding to the demand for language learning apps by simply re-issuing existing reference works in mobile-friendly formats, a well-known grammar self-study book being a case in point. But, to paraphrase (the sainted) Neil Postman, if learners are having problems learning to speak English, it is not through lack of information!

"Let's check out this Murphy app"

The theory vacuum problem: In a review of the film ‘The Social Network’, Zadie Smith (2010) commented to the effect that, “in France philosophy seems to come before technology; here in the Anglo-American world we race ahead with technology and hope the ideas will look after themselves”. As evidence, not a day goes by without someone tweeting to announce a blog or website that offers ’20 things to do with Wordle’, or ‘100 ways of using Twitter in the classroom’ and so on. Rarely if ever do you see ‘7 tools to help students with listening skills’ or ‘100 apps that facilitate vocabulary acquisition’. That is to say, rather than the learning purpose determining the technology, it’s the technological tail that seems to wag the pedagogical dog. What theories of learning underpin the claims being made for educational technology? We deserve to know!

The attention deficit problem: A good while back, Aldous Huxley warned against the dangers of ‘non-stop distraction’. More recently, commentators have noted that a state of ‘continuous partial attention’ characterises the kind of engagement that digital technologies induce. As Nicholas Carr writes (2010), “When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. It is possible to think deeply while surfing the Net, just as it’s possible to think shallowly while reading a book, but that’s not the type of thinking the technology encourages and rewards” (pp. 115-16).

If you accept that a degree of higher order thinking and sustained concentration is a prerequisite for learning, then you have to be worried about these effects. (Do those who deny that multi-tasking is a problem also condone the use of cell phones while driving?)

The added value problem: At a recent presentation on the educational use of mobile technology, the presenters quoted a survey of teachers in which the majority said that they didn’t anticipate using mobile technology in their classrooms. The presenters glossed this as meaning “…because they don’t know how”. Was I the only member of the audience who was thinking that the more likely reason was “….because they don’t see the need”?

As long ago as 1966, Pit Corder warned that “the use of mechanical aids in the classroom is justified only if they can do something which the teacher unaided cannot do, or can do less effectively” (1966, p. 69). This would still seem to be a useful test of the value that technology adds to education, not least when one factors in the costs – not just in terms of the initial outlay, but in terms of training, maintenance, upgrades and eventual disposal. (Crowther, op. cit, notes that “Americans alone discard 100 million computers, cell phones and related devices every year, at a rate of 136,000 per day” and adds that “it takes roughly 1.8 tons of raw material… to manufacture one PC and its monitor” [p. 113]). Confronted by any new tool or application, the discerning teacher should be asking: Is it really worth it?

Coincidentally, while preparing this blog, I discovered that at least two other bloggers were addressing the same theme. Here’s how Luan Hanratty, for example, responds to the added-value problem, a good deal more eloquently than I can:

My own philosophy of teaching barely includes technology because if teachers understand the proper principles of language learning, informed by psychology and other fields, then technology is mostly superfluous. It’s not that I don’t like it, it’s just that I don’t really need it. There is more immediate stuff out there in the collective consciousness and more beneficial techniques to employ than the more-is-more approach of jumping on the latest bandwagon.

Of course, I ought to say what I think technology is good for, but this post has already exceeded the word count, so I’ll reserve that discussion for the comments.

References:

Carr, N. 2010. The Shallows: how the Internet is changing the way we think, read and remember.  London: Atlantic Books.

Crowther, H. 2010. One hundred fears of solitude: The greatest generation gap. In Granta, 111.

Pit Corder, S. 1966. The Visual Element in Language Teaching. London: Longman.

Smith, Z. 2010. Generation Why? Review of The Social Network. New York Review of Books, 25 November 2010.





P is for Profession

30 01 2011

Hard times?

In a leading Spanish daily a couple of weeks ago, there was a feature on an up-and-coming actress, in which she recounted her years of ‘penury’ before achieving stardom. This is how it was reported (loosely translated): “Her career has suffered fits and starts. [She recalls,] ‘I worked as an ice-cream seller, a mime at Ikea, a teacher of English, and a teacher of drawing…'” The newspaper comments: “These are the privations that many of her actor friends have had to put up with, grabbing whatever they can …”

And in the 1995 edition of The Cambridge International Dictionary of English the following citation appeared under the entry for end up: “After working her way around the world, she ended up teaching English as a foreign language”.

This perception of English language teaching as being a slightly disreputable last resort, or, at best, a gap-year option, is one that is endlessly perpetuated, and is a source of both embarrassment and indignation on the part of many dedicated English teachers.

One way of redressing this negative stereotype has been to claim professional status, arguing that language teaching, being highly skilled, requires (or should require) extensive training and rigorous gate-keeping. In this spirit, organisations such as IATEFL and TESOL make it their mission “to develop and maintain professional expertise in English language teaching” (as the TESOL website puts it).

But is TEFL really a profession? Is teaching even a profession? In his seminal book, School teacher: A sociological study, Lortie (1975) suggested that — compared to the prototypical professions like law, medicine or engineering — maybe it is not. Why? Because, unlike doctors, lawyers, architects, etc:

  • teachers continue to be subordinates, employed in organizations where those who govern do not belong to the occupation;
  • there is no consensual base of professional knowledge;
  • membership is not carefully screened by the occupational group itself;
  • entry to teaching is eased by society, as compared to other professions: entry requirements are relatively lacking in rigour and length and the decision to enter can be made at almost any age.

Whether or not this is true for mainstream teaching, it certainly does seem to reflect the reality on the ground for much of TEFL, and accounts for the relatively low levels of professional self-esteem, often exacerbated by poor pay and long hours.

"The technology model"

What is to be done? As I wrote a few years back (Thornbury 2001), “those working in EFL who are concerned by this implied lack of status have responded by attempting to construe EFL in terms of one of two distinct models” (p. 392). These I labelled the academic model (aimed at establishing ‘a consensual base of professional knowledge’, through, for example, research and publication), and the therapeutic model, where, by enlisting certain new-age discursive practices, the somewhat mundane activity of teaching is re-invented  as a form of healing. (I am less convinced, now, that the therapeutic model has as extensive a following as it did in the 1980s and 1990s. If anything it has been eclipsed by the technology model, whereby respect is conferred by donning a lab-coat and swearing allegiance to the doctrine of Vorsprung durch Technik. Meanwhile, the academic model is stronger than ever, judging by the number of MA TESOL programs on offer – on one of which – declaring an interest – I teach).

As an alternative (to the academic and therapeutic models), I argued that teachers might achieve a measure, not just of self-respect, but of personal and professional excitement, by acknowledging the fact “that they occupy a privileged space on the frontier between languages and hence on the frontier between cultures, and that they are uniquely situated to mediate contact through dialogue” (p. 394).

A dialogic model of pedagogy, grounded firmly in an educational tradition, as opposed to an academic or a therapeutic or a technocratic one, still seems to me to offer the best way forward. As Claire Kramsch puts it: “A dialogic pedagogy is unlike traditional pedagogy… it sets new goals for teachers – poetic, psychological, political goals that … do not constitute any easy-to-follow method. … Such a pedagogy should better be described, not as a blueprint for how to teach foreign languages, but as another way of being a language teacher” (Kramsch 1993, p. 31).

I concluded my article by suggesting that:

as a profession we should worry less about what other people think of us and concern ourselves more with what we are good at: being out there, at the front, in the firing line, on the edge. Few jobs can offer as much. The lightness of EFL is dizzying. But we need to guard against respectability. As Auden wrote: “The sense of danger must not disappear” (p.396).

Ten years on: is the craving for respectability still as strong as ever?

References:

Kramsch, C. 1993. Context and culture in language teaching. Oxford: Oxford University Press.

Lortie, D. 1975. School teacher: A sociological study. Chicago: University of Chicago Press.

Thornbury, S. 2001. The unbearable lightness of EFL. English Language Teaching Journal, 55/4, 391-6.