The UK is finally facing its demise. Its disgraceful past is increasingly coming back to haunt it and in the next 20 years it will be stripped of the last vestiges of its former power. Good riddance to all of it.

One of the slender but still powerful pillars propping up UK power is the ELT industry. Worth $2,000 billion in 2017, the industry is dominated by UK players – Pearson, CUP, OUP, the various tentacles of Cambridge Assessment English, the British Council, and the rest of them – and these players exert a decisive influence on ELT worldwide.

And what’s the result? The result is that the global ELT industry is smothered by a particularly British culture that defers to authority, that reeks of anti-intellectualism and conservatism, that abhors criticism, and that fights reform.

There’s nothing that comes close to the ELT culture in the rest of education. No other subject in the educational curriculum is taught with such disregard for the evidence of relevant research. No other subject has such bad results. No other subject has such a profusion of charlatans informing teachers about how to do their jobs.

And the ELT community just doesn’t want to listen to criticism. The deference to authority, the absurd aura of misconceived politeness that hovers like a suffocating old blanket over discussion, the truly disgraceful intellectual sloth, the absence of critical acumen, the reliance on the home-spun wisdom of experience, the stubborn refusal to call out those who talk baloney, all this infuses ELT culture.

It makes me sick and I’ve had enough. I’m taking a break. I urge those who want change to continue fighting.

1. Why do you largely ignore the vital question of how people learn languages?

We learn our first language/s implicitly, unconsciously. You know that. Explicit learning about the language plays a minor role. If we get help with learning a foreign or second language, information about the language helps a lot, but implicit learning through talking in the language still plays the major role. If the goal is communicative competence, it’s much more effective to organise courses around “making meaning” – engaging learners in meaningful communication about relevant topics in the target language – than telling learners about the language. I’m sure you know that too. And you know that we know thanks to the enormous amount of research into language learning that’s been done in the past 60 years and more.

We know that language learners follow an internal trajectory unaffected by teaching; we know that language learning is not a matter of assimilating knowledge bit by bit; we know that the characteristics of working memory constrain rote learning; we know that by varying different factors in tasks we can significantly affect the outcomes. And there’s a great deal more we know about language learning which has important implications for everything in ELT from syllabus design to the use of the whiteboard; from methodological principles to the use of IT, from materials design to assessment.

We know that in the not so distant past, generations of school children learnt foreign languages for 7 or 8 years, and the vast majority of them left school without the ability to maintain an elementary conversational exchange in the L2. Only to the extent that today’s teachers respect what we know about language learning and get their students to engage in communicative activities have things improved; and to the extent that teachers continue to spend most of the time talking to their students about the language, those improvements are minimal.

So how come so many of you ignore all this? However much you might say that you’re in favour of CLT, how many of the training courses you do include a sizeable component where the subject of language learning is properly discussed so as to reveal the methodological principles that inform CLT? How many of these courses include a proper examination of the special characteristics of language learning which any methodology must respect? How many courses explain the complex nature of language learning, which leads one to seriously question the efficacy of basing teaching on the presentation and practice of a succession of bits of language? Surely if more attention were paid to the view of language learning underpinning CLT, more teachers would be persuaded of its value, and do more than just pay lip service to it.

In fact, looking at the literature and activities of Teacher Development SIGs, what one sees is that teacher trainers continue to give almost no attention to how people learn languages, preferring to stay stuck in a methodology that was discredited and out of date forty years ago. The “How to teach” books continue to say almost nothing about language learning, and very few presentations at conferences mention it. Apart from lots of attention to keeping fresh, avoiding burn-out, staying on your toes, being involved in ongoing, continuous development, and all that, when you talk about teaching, you concentrate on better ways of doing things that are simply assumed to efficiently facilitate language learning. New ways to present grammar structures, better ways to check comprehension of what’s just been presented, more imaginative ways to use the whiteboard to summarise it all. After all, what harm’s done by never stopping to question the common-sense assumptions on which all these procedures rest, particularly the basic assumption that students will learn what they’re taught? Well, quite a lot of harm, actually. Thanks to the special nature of language learning, that basic assumption is quite simply false.

2. Why don’t you push harder for changes to the CELTA course?

The CELTA course gives almost no attention to the question of how people learn languages. It aims to enable candidates to “acquire essential subject knowledge and familiarity with the principles of effective teaching” and “a range of practical skills for teaching English to adult learners”. Most of you CELTA trainers are proficient teachers of English grammar, pronunciation, lexis, functions, and spoken discourse, and no doubt would like more time to deal more thoroughly with all of it, plus give more attention to formulaic language and lexical chunks. But when it comes to the actual teaching, most of you seem happy to go along with using a coursebook to show the trainees how to present and practice bits of grammar and lexis, and how to practice the 4 skills. The brevity of the course, its lack of attention to language learning and its failure to engage the trainees in any critical evaluation of coursebook-driven ELT, combine to make this an inadequate and backward-looking course, urgently in need of radical reform.

3. Why don’t you object to the stars of the ELT world flying halfway round the globe to do courses that local teacher trainers could do just as well?

Aren’t you sick of seeing the usual suspects turning up at a venue near you to give advice to the locals on how to teach? What justifies the expense – the taxis and flights and hotels and restaurants on top of the personal fee? What makes a talk by one of these stars about aiming high, or interleaving fully contextualised lexical stems with olive-sized grammar chunks, or being the best teacher you can possibly be, so much better than a workshop on a topic chosen by the teachers themselves and led by a local teacher trainer? Why don’t you complain to those responsible for this absurd practice?

In cognitive psychology, studies in various fields have shown that when studying factual information or practicing a particular skill like a golf swing or reversing a car, spaced practice gets better results than massed practice. It’s a big step from these results to the claim that spaced practice can have a transformative effect on learning a second or foreign language. Here are a few quotes from journals dealing with SLA and ELT.

The results showed that distributed practice led to superior test scores on the long-term tests, indicating that the learning of second language syntax can benefit from distributed practice …..

These improvements admittedly only pertain to subsets of the knowledge and cognitive skills required for tests measuring overall proficiency (e.g., overall communicative ability; Lapkin et al., 1998). As suggested by results of some language-learning studies, there remains the possibility that at the global proficiency level, language learning involves a degree of complexity that eliminates the benefit of distributed learning.

One related limitation of the present study that should be pointed out is that the tests measured the ability to detect and correct verb morphology in a context that enforced conscious attention to the verb forms. Full language proficiency, of course, depends on the ability to produce grammatically well-formed sentences at a discourse level spontaneously and correctly during real communication with little or no conscious attention to verb morphology. The present results do not warrant a conclusion that the grammatical distinctions have been “learned” in this more global sense of language learning.

Most research that has investigated the issue of time distribution in this manner has failed to find an advantage for spaced distribution learning. (see Serrano & Munoz, 2007; for an overview of these studies)……….

…… Donovan and Radosevich’s (1999) meta-analysis of 63 studies on the spacing effect, found that the effect sizes of studies were much smaller when the learning tasks involved a high degree of task complexity. Thus, while spaced distribution practice may have a strong effect on the learning of a relatively simple task such as memorizing a list of words, the effect is less dramatic on more cognitively demanding tasks such as puzzle solving or learning a skill involving the synthesis of a number of behaviors and choices. ……….

….. no conclusions can be drawn about massed vs. spaced distribution methodology employing other forms of grammatical practice such as communicative approaches to grammar instruction (e.g., focus on form) or implicit learning. Finally, while the results of the study suggest that the learning of specific grammatical structures may be benefited by spaced distribution practice, they do not give any reliable data on how spaced distribution may affect general skill development. It is possible that grammar learned through spaced distribution would fail to transfer to general speaking and writing skills.

…. none of the differences regarding the retention rates comparing both groups directly were statistically significant.

The results give support to Schmitt’s (2010) observation that second language learners report having difficulties learning function words. For teaching and learning, it might be helpful to not just follow suggestions made in textbooks or language programs that group words by theme or grammatical category, but to pay attention to the underlying factors of processing a word, such as the division between content and function words.

Serrano, R. & Muñoz, C. (2007). Same hours, different time distribution: Any difference in EFL? System, 35, 305–321.

Although our findings are still preliminary, they seem to suggest that concentrating second language instruction has a positive impact on the students’ acquisition of certain aspects of a particular language, as other studies have previously shown (Collins et al., 1999; Lightbown and Spada, 1994; Peters, 2000; Spada and Lightbown, 1989; White and Turner, 2005). The claim that intensive programs are more effective than extensive programs may indicate that, contrarily to what some cognitive psychology research studies have shown in the laboratory (Dempster, 1987; Glanzer and Duarte, 1971; Hintzman et al., 1973; Melton, 1970), massed practice can be more effective than distributed practice in classroom learning (Carroll, 1994; Rettig and Canady, 2001; Seamon, 2004), especially in the case of second/foreign languages.

Baloney is foolish or deceptive talk; nonsense. Another word for it is bullshit. Regarding deceptive talk, a particular quality of baloney is this: false claims to knowledge of what’s being talked about. In the world of ELT, those who give teachers advice about the best ways to do their jobs are more or less guilty of using this particular form of baloney.

John Fanselow never stoops to baloney. He’s a scholar; he pays attention to SLA research, he knows about it, he respects it, but it’s not something he talks about much. To John, teaching English as an L2 is a matter of helping learners get progressively closer to using English fluently and correctly by giving them lots of input and then, crucially, paying close attention to their output. He’s “old school” in many respects, but he knows that learning an L2 is predominantly a matter of consolidating implicit knowledge, and he recognises that teaching should mostly concern itself with talking in the language, not about it. Fanselow sees teaching as a craft and he invites teachers to examine their practice carefully and non-judgementally in order to make small changes which lead to a transformation in the way they do their jobs. He talks now and then about this research finding or that, he doesn’t downplay research, and he never, ever bullshits. Why would he?

In terms of the baloney I’m talking about, Fanselow is the anti-baloney gold standard: he’s as honest as the day is long. He’s my hero; he’s Connie’s hero; he’s Rose Bard’s hero; he’s an inspiration to everybody who ever did a course with him.

Close on Fanselow’s immaculate heels come Jim Scrivener and Adrian Underhill. I’ve criticised their “Demand High” initiative for its conservative approach, but there’s no sign of baloney in what they say. Underhill is a consummate teacher of pronunciation, widely respected by academics such as Rogerson-Revell, and he wears his knowledge lightly. I’ve never read or heard anything by Underhill that smacks of baloney. Scrivener, apart from writing like a well-heeled angel (is that an oxymoron?), puts his rivals to shame, not just in the clarity of what he says, but in never ever resorting to baloney. I recommend Scrivener’s work to my MA students because of its clarity and coherence and because there’s not a sniff of bullshit anywhere.

Then there’s Scott Thornbury. In my opinion there’s a sniff of baloney in his work. When he talks about SLA, his grasp of Chomsky is, let’s say, fragile, and his enthusiastic endorsement of some of the more ridiculous statements about emergentism sounds to me like he hasn’t read as much as he makes out. Still, I respect Thornbury’s academic honesty. I don’t agree with his view of SLA but that matters not one jot. I sometimes doubt his reliability as a mediator, but never mind. In my opinion, Scott is a much better mediator between scholars and teachers than the rest of the awful people in ELT who pretend to do the job.

Then there’s the rest of them. Ah, them. There’s this awful list of people who get paid to talk bullshit to teachers. I won’t name names but the IATEFL 2018 conference had its full quota of those who pretended to know more than they really knew, those who spewed bullshit and got away with it because, as usual, nobody called them out. The ELT community is not critical enough: it’s gullible, badly informed, too easily conned.

Read John Fanselow’s latest book (you can get a copy for 15 quid), and then look at what the big stars of IATEFL had to say about ELT.

In this post I examine Dellar’s 2018 IATEFL presentation and suggest that it:

mounts a straw man argument against grammar-based coursebooks using a series of false assertions

misrepresents research findings and draws unwarranted conclusions from them

fails to appreciate the implications of SLA research findings for ELT practice.

Dellar accuses the ELT profession of being in thrall to mass practice, which results in “hampering learners”, “limiting potential”, “slowing learning”, and “robbing students of their time.” To remedy this situation Dellar insists that teachers stop using coursebooks which impose the “invalid construct” of massed practice on them, and adopt instead the proven pedagogical procedure of spaced practice. A model of a viable alternative – a coursebook series which exemplifies not just spaced practice, but also interleaving and the complete grammaticalisation of lexis – is the Outcomes series, written by H. Dellar and A. Walkley, which is advertised on a large billboard on the left of the stage.

First I’ll go through the talk commenting on specific points. Then I’ll look at the general argument.

Massed Practice

Focused repetitive practice or massed practice is “practicing something over and over again until we’ve perfected it or nailed it.” Dellar asserts that

massed practice is so deeply rooted in our mental construct of how competence develops that we rarely stop to consider how effective it is, or if there might be more effective ways of developing competence.

Comments:

By definition of the terms, nobody has a mental construct of how competence develops.

Massed practice is not a construct.

As we’ll see, Dellar offers no empirical evidence to support his assertion that using massed practice as a teaching procedure in ELT is deeply rooted in the way teachers approach their work. If teachers rarely stop to consider more effective ways of developing competence than massed practice, it’s probably because they’ve never heard of it, and because it plays a minor, if any, role in their teaching.

Spaced practice

Next, Dellar says that in basketball massed practice of a free throw is less effective than spaced practice, and that when learning how to park a car, massed practice of the parking manoeuvre is less effective than spaced practice. He then says:

As it is with basketball and as it with parking, so it also is for teaching and learning foreign languages.

Dellar says that even though the spacing effect has been verified in vocabulary learning, “there’s not been (sic) as many studies into how we learn grammar and whether this also applies to grammar”. Nevertheless, Dellar goes on to say:

time and time again what’s been shown is that spaced distribution instruction on the development of particular grammatical structures versus massed practice or massed exposure is very conclusive. The immediate post test show very very little difference in terms of the way that learners perform but the delayed post tests show time and time again that spaced distribution and spaced exposure to the new items always helps you outperform people who study under mass practice conditions.

Comments:

The spacing effect has not been verified in vocabulary learning. Pedagogical procedures don’t get verified; hypotheses and theories about certain procedures are supported or challenged by empirical evidence from well-conducted studies.

Dellar cites no studies to support his assertions about the comparative merits of massed practice versus distributed practice, and his assertions are, in fact, false: no study has shown the results that Dellar claims. As I said in my previous post, when Dellar was asked on Twitter to provide the sources for his assertions, he came up with an article from a website called “Ask a Cognitive Scientist”. The article is “Allocating Student Study Time: ‘Massed’ versus ‘Distributed’ Practice”. It was written in 2002 and is not specifically about SLA.

A more reliable source is John Rogers’s bibliography of articles on massed versus distributed practice published in applied linguistics academic journals. I’ve read eight of the articles on the list, enough to confirm that, in fact, very few empirical studies have been done. One of the most recent articles is by Rogers himself (Rogers, 2015), which begins by saying that very little data have so far been collected. It gives a brief report of a not very well-designed study, and while Rogers concludes that the results are promising, he recognises the need for far more evidence before any firm conclusions can be drawn.

There is evidence that spaced learning gets better results in some specific types of studying, but first, the evidence is not conclusive, and second, studies of the comparative effects of two different ways of studying factual information can’t be used to make general statements about developing communicative competence in an L2. Dellar treats the tentative conclusions reached by fledgling studies as convincing proof that massed practice is having a crippling effect on learners and that spaced practice is the best way to teach both vocabulary and grammar. Such claims are unwarranted.
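For readers unfamiliar with the cognitive psychology terms, here is a minimal sketch of the two study schedules the quoted studies compare: massed practice (sessions on consecutive days) versus spaced or distributed practice (expanding gaps between sessions). The five-session count and the doubling intervals are illustrative assumptions of mine, not parameters from any study cited above.

```python
from datetime import date, timedelta

def massed_schedule(start, sessions=5):
    """Massed practice: all review sessions on consecutive days."""
    return [start + timedelta(days=i) for i in range(sessions)]

def spaced_schedule(start, sessions=5, first_gap=1, factor=2):
    """Spaced (distributed) practice: the gaps between sessions
    expand geometrically (1, 2, 4, 8 days with the defaults)."""
    dates, gap, current = [], first_gap, start
    for _ in range(sessions):
        dates.append(current)
        current += timedelta(days=gap)
        gap *= factor
    return dates

# Compare the two schedules for the same item over five sessions,
# shown as day offsets from the first session.
start = date(2018, 4, 9)
print([(d - start).days for d in massed_schedule(start)])  # [0, 1, 2, 3, 4]
print([(d - start).days for d in spaced_schedule(start)])  # [0, 1, 3, 7, 15]
```

The question at issue in the quoted studies is whether the second schedule produces better delayed post-test scores than the first; the sketch only fixes what the two terms denote, not which works better, and says nothing about whether either transfers to communicative competence.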

Coursebooks in thrall to mass practice

Having demonstrated to his own satisfaction that spaced practice leads to “better uptake in both vocabulary and grammar”, Dellar asks why these convincing findings are not guiding practice. His answer is that ELT is “in thrall to mass practice”. Here’s what Dellar says happens when teachers use coursebooks.

In “coursebook after coursebook after coursebook” you get “the presentation of a discrete piece of grammar usually scripted and crafted in such a way so as to not include any other kind of particular grammar structure in the initial presentation because of the fear that it might somehow obscure the view of the structure you’re currently looking at.”

Then you get laughable dialogues designed to maximise exposure to every possible variation on that particular grammar structure.

Then you get exercises which massify the practice of every possible variant: negatives, questions, positives, “irrespective of whether they come to be useful in the kind of communicative tasks that you come to ask your students to perform.”

Then you get freer practice which resembles not a real conversation, but rather “a cunning trap designed by coursebook writers to elicit common errors of the new structure all of which can then be corrected en masse.”

That structure then immediately vanishes from sight for the rest of the book and thus from the rest of the course, unless the teacher realises that despite the massed practice students aren’t able to utilise the structure, in which case they try more massed practice, maybe written practice, maybe a communication game that forces students into massed practice.

Comments

To say that coursebook writers present discrete pieces of grammar “in such a way so as to not include” (sic) “any other kind of particular grammar structure” (sic), “for fear that it might somehow obscure the view of the structure you’re currently looking at” is pure assertion and almost certainly false.

The presentation of a grammar point is not always, or even usually, followed by a dialogue and Dellar’s example of a dialogue using “should” is not “perhaps a slight exaggeration”, it is ridiculous.

Whatever oral or written text follows the presentation of a grammar item, the assertion that the text is designed to maximise exposure to every possible variation on that particular grammar structure is evidently false: no coursebook attempts to present and practice any of the dozens of complicated grammatical structures of English by dealing with every possible variation of them in one unit.

It’s likewise false to say that the exercises that follow massify the practice of every possible variant.

To characterise the freer practice offered in coursebooks as “a cunning trap designed by coursebook writers to elicit common errors of the new structure all of which can then be corrected en masse” is an interpretation that has nothing to do with the facts and has very little to recommend it.

The structural items that are presented in coursebooks don’t immediately vanish from sight for the rest of the book. How could they? How could the present or the past tense, or the simple, progressive and perfect aspects of verbs, or mass and count nouns, or the articles, or pronouns, or modals, for example, be excluded from the coursebook once they’ve been presented and practiced?

The suggestion that when teachers see that their students “aren’t able to utilise the structure” they “try more massed practice, maybe a communication game that forces them into massed practice” is false, and also rather insulting.

In general, coursebooks don’t present grammar the way Dellar describes. Below are the grammar points dealt with in some of the units of Headway Intermediate. Note that more than one discrete grammar point is dealt with in the same unit and that modal verbs are dealt with in Unit 4 and then again in Unit 11. A 30-minute examination of this coursebook shows that it bears little resemblance to Dellar’s description of coursebooks in general.

Before we leave coursebooks, I should mention a tweet from Dellar. He says:

The grand irony of course is that Geoff has spent his time railing about coursebooks but now somehow feels compelled to insist they’re alright at heart to avoid the cognitive dissonance of agreeing with me. Ha ha.

This is an obvious non sequitur. Pointing out the mistakes in Dellar’s description of coursebooks doesn’t compel me to insist that coursebooks are “alright”.

So, Dellar asks, What is to be done?

First, stop using coursebooks which implement a massed practice approach. As an alternative, use spaced practice and interleaving. And here’s some more advice:

Comments

Dellar loses track of the argument here. Rather than show precisely how spaced practice of a new grammar structure should be implemented, he gives a series of general pieces of advice that mix the uncontroversial (ensure vocabulary is recycled; ensure that students have plenty of opportunities to talk; don’t always expect students to use language correctly; etc.) with the highly questionable (present new structures alongside old; ensure that vocabulary is fully grammaticalised) and turns the presentation into a demonstration of the virtues of using Outcomes.

To support his case, Dellar mounts a straw man argument against grammar-based coursebooks and makes sweeping, unwarranted assertions about the damage done by massed practice. He provides no evidence from research findings in SLA to support his exaggerated claims for the effects of spaced practice, and in fact, while there are certainly signs that spaced practice can help learners with specific types of studying, there is no evidence to support the claim that spaced learning provides the best pedagogical procedure for helping learners to achieve communicative competence in an L2.

At one point in his talk, Dellar says

If we’re really going to act like professionals and we’re going to acknowledge the impact of research on our practice, sooner or later we have to acknowledge that coursebooks that are based on massed practice are not theoretically valid.

The first part of the sentence chides teachers who do not take account of SLA research and suggests that Dellar himself is familiar with it. But the second half of the sentence suggests otherwise, since no SLA research has ever stated or implied that coursebooks are based on mass practice, or that mass practice has no theoretical validity. Indeed, Dellar’s use of the terms ‘construct’ and ‘theoretical validity’ suggests that he has yet to fully grasp what they mean.

In more general terms, to suggest that substituting spaced practice for massed practice will lead to a transformation in ELT is, by Dellar’s own lights, to act unprofessionally and to refuse to apply research findings to our teaching practice. Dellar wants to substitute one type of coursebook for another, thus flying in the face of robust SLA findings which show that any coursebook which implements a synthetic syllabus (and that includes Outcomes) imposes an impossible task on learners, whose own interlanguage development makes it impossible for them to synthesise the information they’re presented with in the way they’re expected to.

Furthermore, it should be noted that Dellar’s preoccupation with explicit instruction, his narrow focus on the presentation and practice of bits of language, be they large or small, runs counter to the view held by most SLA researchers. As I said in a reply to Andrew Walkley some time ago, probably the most important issue in the various accounts of language learning concerns the roles of implicit and explicit learning. With regard to this fundamental question, there’s widespread agreement among SLA scholars that, as Long (2017) puts it, “the relevant goal for instruction is implicit learning, resulting in implicit L2 knowledge”. Implicit learning is regarded by SLA scholars as more basic, more important than explicit learning, and superior. Access to implicit knowledge is automatic and fast, and is what underlies listening comprehension, spontaneous speech, and fluency. It is the result of deeper processing and is thus more durable, and it obviates the need for explicit knowledge, freeing up attentional resources for a speaker to focus on message content.

All the research indicates that learning an L2 is not best facilitated by presenting and practicing bits of grammar and “fully grammaticalised lexis”, no matter how big or small the bits are. Rather, ELT should aim to develop learners’ ability to make meaning in the L2, through engagement in relevant tasks involving exposure to comprehensible input, participation in discourse, and implicit or explicit feedback.

I was naturally very pleased to see that the first plenary of the 2018 conference was given by Lourdes Ortega, who addressed the question What is SLA research good for, anyway? Lourdes had technical problems with the animation of her slides, which, I think, badly affected her ability to really connect with the audience, but anyway, she gave everybody a good idea of how a serious academic who sees her work as scientific research looks at things. So we learned that SLA research reflects the uneven progress made in any science; that some findings are more robust and relevant than others; that teachers need to exercise their own judgement when assessing research findings. But we didn’t, in my opinion, get a good overview of how SLA research findings can or should affect ELT, or an answer to the question: “What is SLA research good for?”

The answer is, of course, that it’s good for an understanding of how people learn an L2; at least the best of it is. There are parts of SLA research, especially those with 40 years of continuous work behind them, such as theories of interlanguage development, which provide crucial explanations of the L2 learning process. These explanations are deliberately ignored by far too many people in the ELT industry, many of them standing behind stalls in the exhibition hall or giving the most well-attended talks. At last year’s conference, Scott Thornbury reported on a poll of influential teacher trainers and authors of “How to teach” books which showed that none of them paid much attention to SLA research findings. At this year’s conference, the talks and workshops about the best and newest ways to teach, and particularly those to do with teacher training, continue to largely ignore SLA research findings, preferring to trust hunches, feelings, and most of all, drum roll – “Experience!” – that precious inner wisdom which comes from slogging through the 14 units of a coursebook time and time again.

Sorry. Experience is enormously valuable; indeed, it’s much more valuable than reading lots of books on SLA, especially if you read authors like Larsen-Freeman, Rod Ellis, or Saville-Troike, but it’s no substitute for critical thinking, which in turn depends on a good base of reliable evidence. The evidence is there, and while, yes, it takes a certain effort to get hold of it and to critically evaluate it, it’s increasingly available in increasingly digestible form. These days, there are more and more people acting as mediators, people who can help busy teachers to stay up to date, people who can compensate for the hopeless job being done by the so-called mediators who are presently strutting around the IATEFL conference as if they owned it. Pace the reassuring words of Ur, Thornbury and the rest, if you teach English as an L2 without having a good grasp of how people learn second or foreign languages, you can’t really complain if you’re not generally regarded as a professional.

Back to the conference plenary. Lourdes touched on four areas of SLA research: motivation, error correction, age, and bilingualism. The work on motivation which she mentioned does little more than produce “motherhood statements” – platitudes that don’t get us very far at all. As for error correction, while she’s quite right to say that there’s no clear, easy answer to the question of what error correction “works”, let alone what sort of error correction works best, I think the debate is an important one, and both sides of the argument need to be carefully discussed by teachers. Arguments about error correction reflect the more general ongoing debate about the proportion of classroom time that should be spent on explicit and implicit instruction, and as such they go right to the heart of ELT practice.

With regard to age differences, the message “Younger is NOT better” was, I think, too stridently delivered. The studies Lourdes referred to were on primary school education, and they found that there is no big advantage to starting L2 instruction in schools at age 7 or 8 over starting at age 13 or 14. This is, of course, an important finding, with clear implications for those responsible for policy decisions, but Lourdes’ claim can easily be misconstrued. Nothing in the research she cited challenges the uncontroversial contrast between, as Long (2007, p. 43) puts it,

“children’s near uniform success with both first and second language acquisition and adult’s near uniform partial failure with either.”

Nor does the cited research challenge the findings which show that children attain higher levels of ultimate attainment than adults, or that there are multiple sensitive periods for SLA. In general, there is strong evidence to suggest that there are maturational constraints on language acquisition: the first sensitive period runs from birth to 9 months for categorical perception; the second from birth to between ages 4 and 6 or 7 for supra-segmental and segmental phonology and some lexical/collocational abilities; and the third from birth to the mid-teens for morphology and syntax (Long, 2007, p. 59). These maturational constraints have implications not just for parents thinking about their children’s education, but also for teachers working with both young learners and adults.

Spacing out with the Champ

A couple of hours later, Hugh Dellar made SLA research findings the focus of his talk, which I found a bit surprising. This is, after all, the man who attributes structuralism to Chomsky, who relies on Hoey’s explanations of SLA, and who fails to see any contradiction between Krashen’s monitor theory and Schmidt’s Noticing hypothesis.

In the conference programme, Dellar’s talk, Spacing out! In praise of distributed grammar practice, referred to “a growing body of evidence” which supports the assertion that

grammar is more solidly acquired if structures are encountered regularly over a longer period of time – rather than in one massed meeting.

How one measures the solidity of grammar acquisition the blurb doesn’t say, nor does it explain how you encounter a structure “in one massed meeting”, but, anyway, it promises that Dellar will explain all, and will discuss “the implications for everyday classroom practice and for materials design.”

Here are three screen shots from Dellar’s talk.

These slides suggest to me that

Dellar might be mounting a straw man argument against ELT in general and grammar-based coursebooks in particular.

His use of research findings might not be well-considered.

His argument might consist of unwarranted, sweeping assertions.

His argument might be contradicted by his own previous presentations and his own materials.

If Dellar’s argument is to have any credibility, he needs to explain

the “construct” of mass practice which he claims coursebooks are imposing on us

precisely how the coursebooks impose the “construct”

why the “construct” is invalid

spaced learning and how it works in everyday classroom practice

the research evidence that he’s referring to.

The last bit will involve giving details of “the growing body of evidence” which supports his claims that “mass practice is invalid”, and that spaced learning gets better results than massed practice in second language acquisition. There are, in fact, very few empirical studies of SLA that support the claim. When asked on Twitter to explain massed practice versus spaced learning, Dellar said “It’s widely written about both within ELT research and in many other fields as well”, and, when pushed for references, provided this one:

John Rogers has a good bibliography of articles in this area, and if you look through it you’ll see just how few studies have been done. One of the most recent articles is by Rogers himself (Rogers, 2015) and is a brief report on a study that has its critics, and that recognises the need for far more evidence before any firm conclusions can be drawn.

Even supposing that there were a reliable body of research evidence to support the claim that distributed learning gets better results than massed practice in second language acquisition (which there isn’t), what implications could be drawn from such a claim? Both massed practice and spaced learning refer to ways of studying factual information; they sit at the extreme end of explicit instruction, just the sort of thing that Dellar has always been so keen on, but not self-evidently the best way to design and implement a course of English as an L2.

Anyway, we’ll have to wait till Dellar provides the slides and a reliable summary of his talk before making any definitive evaluation of it.


“Come in”, she says. “Don’t be perturbed by my turban and my long white beard; they’re no more than cheap theatrical props aimed at enhancing my credibility. Take a seat. I’ll just close the curtains, dim the lights, light the joss sticks, get rid of any vestiges of rational thought, and then we can start.”

She’s on the stage of a huge auditorium, rigged to look like a Victorian sitting room. She takes her seat – Midnight – at a round table marked out as a 24 hour clock. In the middle of the table is a dull glass ball awaiting interrogation. A chosen group of 23 ELT professionals file in and take their seats, while thousands of paying “participants” lean forward in awed expectation.

“So”, she says, “Does anyone have a question?”

“What is the future of ELT?” asks the British Council rep., sitting in the 1am seat. He’s halfway through washing down a duck liver pâté canapé with a chilled glass of a young English rosé, so he has to repeat the question.

The glass ball sparkles into life.

“The future is bright”, says the president, gazing intently into the glass ball. “Our business advisors confidently predict that the ELT industry will surpass the $300 billion mark in the next 5 years.”

Spontaneous applause breaks out from the “participants”.

“How will that figure be achieved?” asks the Pearson rep., sitting at 2am. She’s got a curled wire going to a thing in her ear, connecting her to a team of marketing consultants in London.

“Well you should know”, says the president, without so much as a glance towards the crystal ball. “Tell us, why don’t you.”

“There are three planks in the Pearson global strategy to provide the ELT industry with what it needs:

The Global Scale of English (GSE), consisting of a granular, precise scale of proficiency aligned to the Common European Framework of Reference;

Course Materials (both digital and printed materials, aligned to the selection of learning objectives relevant for a course/level);

The Pearson range of assessment tools (placement, formative/ summative assessments and high stakes tests aligned to the GSE).

We see these as the necessary and sufficient conditions for a unified ELT industry which can be rolled out across the globe, bringing an end to unhelpful competition and heralding a new era of well-regulated language learning. A 20% increase in profits is a conservative estimate.”

Muted mumblings from the “participants”. The ball glows red. The president tugs on her beard and attends to the ball. Then:

“The crystal ball says that most of us will be out of a job in 10 years’ time, and that’s why Pearson, faced with falling profits, has diverted its attention to so-called personalised learning, where teachers are dispensed with. So much for global predictions. What, then, about the here and now?”

“Here and now we’re making just the greatest progress ever with eradicating injustice!” chirps Marek Kiczkowiak, sitting at 3am. He snaps the elastic band on his left wrist a few times, rocks back on his fashionable negative-heel trainers, adjusts his special light blue tinted conference spectacles, and explains:

“There’s no difference between native speakers and non-native speakers, no difference between men and women, no difference in the end between chickens and eggs, and there’s a last, very special “conference special” chance to sign up for my “Say it like it sounds to you” pronunciation course. Do it now and get 10% off.”

Before the president can complain, here comes a rare shout from the audience.

“Way to go, Marek! Get your EVE badge on the way out!”

This from Fiona Mauchline, leader of EVE, which has to be the most over-egged, content-free, fuss-for-fuss’-sake group ever. EVE is a good example of the awful tilt these days away from the issues themselves and towards stance, posturing, grandstanding and publicity seeking. If you’re a student of critical discourse analysis and you’re looking for examples of the way that liberal causes get turned back on themselves, morphed into slick bits of self-serving, rarefied trumpeting, then the EVE website is worth a close look. Look at it. Eve indeed. Which Eve? The Old Testament one? What kind of model was she? Who did the design? It screams smug elitism. It sets itself up as an arbiter for a cause that nobody disputes. It wants to be the watchdog of a good cause that’s already a bit past its launch-by date, so it has to shout and strut and crave protagonism. Look at it: it’s all front. It’s glossy, it’s full of itself, confidently sure of an obviously righteous cause. But it’s empty, vacuous; it’s got nothing to say. I’m not surprised that Marek Kiczkowiak is so involved, and I bet that his introductory blog post will say absolutely nothing more than the blindingly obvious. In stark contrast to this crap, there’s Nicola Prentis’ Facebook page for women who want to shoot the breeze among themselves. Nicola does her own thing with admirable judgement; what I don’t understand is why she put her name on the list of “Friends” of the dire EVE website. The list includes Harmer, Dellar, and Tyson, hardly the sharpest knives in the box, and others who, had they looked more closely, might not have signed.

Where were we?

The president adjusts her turban, calls for order and asks for more questions about the current state of ELT.

“Will coursebooks continue to dominate ELT?” asks the OUP rep., sitting in the 4am seat, his copy of Headway Intermediate safely stowed in his long-term memory.

The crystal ball remains inert. The president leans towards it.

To be continued


I went to the IATEFL site yesterday, to see how preparations for the 2018 conference were going, and through a series of links found myself at the KOTESOL website, which both TESOL & IATEFL seem to have a hand in. The theme of this year’s KOTESOL national conference is “Crossing Borders: Korean ELT in the Modern World”.

In our field of ELT education, everyone has knowledge, skills, and insights that can help others to face their challenges, to cross into new professional territory, to make new discoveries, and to grow. All that is needed is the chance to share what we know. That is the purpose of this conference – to allow professionals, novice and veteran, and from any and all contexts, to share and to learn, for the benefit of everyone.

The conference page had a link to an article that Mike Griffin had written for KOTESOL in 2015 about professional development. I couldn’t resist! It started like this:

What do Thomas Farrell, Barbara Sakamoto, Claire Kramsch, Chuck Sandy, Willy Renandya, and Jeremy Harmer have in common? Aside from being huge names in our field they are all people who were scheduled to give talks in South Korea in calendar year 2015. South Korea (hereafter Korea) is home to some big ELT conferences. I believe the KOTESOL International Conference is the biggest and best-attended of these. From my view, big conferences are just one of the reasons Korea can be a great place for professional development for English teachers.

Mike had provided links to the blogs of Thomas Farrell et al., and browsing through the posts I saw nothing there that raised any serious doubts about the way ELT was being carried on in S. Korea. Back with Mike’s article, I read:

Korea’s “English Fever”might not always be seen as a good thing but one benefit from my perspective is how the sheer number of people involved in English education in this country guarantees there is (sic) always a wide range of teachers with various experiences and perspectives.

I clicked on the link “English Fever” and found an abstract of Jin-Kyu Park’s (2009) article: ‘English fever’ in South Korea: its history and symptoms.

‘Education fever’ drives the demand for English in South Korea today. One professor of politics has recently deplored the current pursuit of ‘English education’ (yeongeokyoyuk) in South Korea as a ‘collective neurosis of English fever’ (Y-M. Kim, 2002). What has brought this current English boom to South Korea? It can be traced back to the traditional ‘education fever’ (kyoyukyeol) or ‘preoccupation with education’ (Seth, 2002). The English boom resulting from the Korean education fever has led to a strong antipathy toward Koreans – even in English-speaking countries.

I went to my uni library’s website and downloaded the article. A worrying picture of Korean education emerged, and the picture of the teaching of English was particularly disturbing. My interest aroused, I downloaded some more articles about ELT in S. Korea (see References at the end of this post) and then I watched a documentary about the South Korean university entrance exams. The documentary follows three students as they prepare for the famous SKY exam, and goes on to tell the story of what happens to them after the exam. I urge you to watch the documentary; it’s harrowing, sad, upsetting. To me, it’s the painful portrait of a collectivist culture caught up in the competitive clutches of a neoliberal ideology, a bleak picture of a nation suffering collective alienation on a frightening scale.

In a publicity handout, the directors of the movie quote the United Nations Committee on the Rights of the Child’s 2003 report on S. Korea:

…. the Committee reiterates its concern that the highly competitive nature of the education system risks hampering the development of the child to his or her fullest potential.

and they go on to say this:

Statistically it is clear that the pressure Korean students have to deal with is more than problematic. South-Korea has the highest suicide rate among the OECD countries. Between 2007 and 2009 suicide was the leading cause of death among students aged 15 to 24. Every month 2 students take their own life. 75% of students committing suicide are in high school.

The Ministry of Education is aware of the problems this overemphasis on credentials has created and has undertaken many attempts to reform its system but to little avail. The underlying cause of this is not to be found within a failing educational system, but in society as a whole. As long as a degree from a prestigious university is considered a status symbol by parents and a decisive requirement for employment little will change. A fundamental shift in mentality is needed, but it’s quite clear this will not happen overnight.

‘Reach for the SKY’ is a documentary about a society where education has become a multi-billion industry because of its obsessions with achievement and status; about a culture where education has become as important as the type of car you drive or the size of your apartment; where mothers have become the educational agents of their children, micro-managing every hour that could be spent on studying.

An important part of the SKY exam is the English test. A report 12 years ago by the Samsung Economic Research Institute stated that Koreans spent about $16 billion per year on learning English (Jeon Hyo-chan & Choi Ho-sang, 2006), so we may realistically assume that today the figure is over $20 billion. There are 17,000 English cram schools (known as hagwons) scattered across the nation and an army of 30,000 native English teachers, along with thousands more who teach English illegally (The Diplomat, 2014). Despite the government’s stated policy that “the main goal of English education in Korea is simply to advance the ability to communicate in English” (Ministry of Education, 2010), and despite the sporadic efforts made by a minority of teachers, most of this money is spent on exam preparation.

In 2002, Seth argued that due to the importance of education in Korean culture, Korea had become “the most exam-obsessed culture in the world” (2002: 5). The university entrance exams, Seth said then, represent more than just education:

…the examination system illustrates the importance of education as a determiner of social status, the Korean concern with rank and status, and the universal desire for and belief in the possibility of upward mobility.

Test scores in these exams decide who goes to the best universities, and those who go to the best universities go on to get the best jobs. The exam system is thus a crucial factor in determining the future success and status of young Koreans. As a result, it seems that what is actually being implemented in schools in both the public and private sectors is a traditional “talking about the language” approach, where teachers pay no more than lip-service to CLT and where the washback effect of the university entrance exams is overwhelming (Korean Ministry of Education, Science and Technology, 2010; Shin, 2007; Jeon, 2009; Park, 2009).

Although the government introduced a listening part to the English exam, it’s still widely believed that the CLT method is inappropriate, since there is still no oral component to the exam (Littlewood, 2007). Teachers find themselves (willingly or not) giving in to pressure from parents and students to teach for the exam, and thus to put little emphasis on oral communication.

How can we evaluate the effects that the huge ELT industry in S. Korea is having? Is it contributing to the culture of overachievement in education which takes such a heavy toll on students in terms of their health and happiness? Is it contributing to the commodification of education, where high-stakes exams determine classroom practice, and where the focus on credentials, tests and entrance exams denies students not only a humanistic education but also the skills (e.g. creativity, problem-solving, teamwork) to succeed in higher education or in an increasingly difficult local job market? How typical, I wonder, is this account, from the New York Times in 2014, of what’s going on in hagwons?

Cram schools like the one I taught in — known as hagwons in Korean — are a mainstay of the South Korean education system and a symbol of parental yearning to see their children succeed at all costs. Hagwons are soulless facilities, with room after room divided by thin walls, lit by long fluorescent bulbs, and stuffed with students memorizing English vocabulary, Korean grammar rules and math formulas. Students typically stay after regular school hours until 10 p.m. or later.

Herded to various educational outlets and programs by parents, the average South Korean student works up to 13 hours a day, while the average high school student sleeps only 5.5 hours a night to ensure there is sufficient time for studying. Hagwons consume more than half of spending on private education.

This “investment” in education is what has been used to explain South Koreans’ spectacular scores on the Program for International Student Assessment, increasingly the standard by which students from all over the world are compared to one another. But a system driven by overzealous parents and a leviathan private industry is unsustainable over the long run, especially given the physical and psychological costs that students are forced to bear.

I presume that those who work for KOTESOL, and those who blog about ELT in S. Korea are aware of what’s going on, and I can only suppose that the picture painted by the documentary Reach for the SKY and by the other sources cited here is somehow distorted, an unfair reflection of the real ELT world as depicted by KOTESOL and Mike Griffin, where teachers work together on their professional development, in order to face their challenges, to cross into new professional territory, to make new discoveries, and to grow.


This is a summary of Copley’s (2018) article, just out. Much of it is verbatim and all the ideas are his, although I’ve taken terrible liberties with the original text, for which I apologise to him. Copley’s argument is simple: modern ELT coursebooks are based on neoliberal ideology.

Neoliberal ideology internalises entrepreneurial behaviour: as Holborow (2007, p. 51) puts it, “the ideology of the global market insinuates itself everywhere.” The primacy of the market pervades all areas of our lives, in such a way that we act as atomised individual agents, part of a global society where competitiveness is the overriding goal of our activity. Of course, neoliberal ideology is not unopposed: collectivist values still exert themselves. While social attitude surveys provide evidence to support the view that neoliberalism has damaged class-based identities, there remains deep and continued resistance to the marketisation of all public spheres. This resistance, however, has been comprehensively sidelined in mainstream public discourse, where class-based identities have been replaced with “a collection of individuals…competing with each other for their own interests” (Jones, 2011, p. 48). The acceptance of market relations as the “fulcrum of the organisation of human needs and capacities” (Cox & Nilsen, 2014, p. 137) has meant replacing notions of social connectedness with a subjectivity in which people are judged “by their capacity for consumption” (Bourdieu, 1984, p. 310).

Copley contends that these key features of neoliberal ideology

we are defined by consumption;

the market is the best template for all social relations;

social solidarity is replaced by individuals pursuing their own self-interest

are systematically disseminated within the most well-known and widely used modern ELT coursebooks. ELT coursebooks “render socially constructed relations as natural,” and in doing so, “confer legitimacy on the dominant status of particular social groups” (Sleeter & Grant, 2011, p. 185). Questions of who is included in such materials and whose experiences are valid (Auerbach, 1995) are key ways in which representations of social life, class, and conflict, or the notable absence of it, reinforce neoliberal ideology.

Copley recognises that examining coursebooks poses important questions about the ELT industry, its strategic role in the political economy of neoliberal globalization, and the practices that it promotes. He cites Cox and Nilsen’s (2014) nicely observed remark that the legitimacy of any dominant ideology is largely dependent on it being seen ahistorically as “just the way things are”, a position that conveniently ignores the time when things were different, or the possibility that they will ever radically change.

While statements about the working class need approaching with great caution, we might agree that fifty years ago the working class shared far more common, collective experiences in their working lives. In the neoliberal outlook found in ELT coursebooks today, there is a shift from the portrayal of social life as it was seen then from a working-class perspective – often burdensome, unsatisfying, and structurally rooted in antagonistic relations of exploitation – to one today where individual agency and personal satisfaction are pursued, cut loose from any associations with class solidarity, structural inequality, or class conflict. On the occasions when working-class figures do appear in current coursebooks, they’ve been “neoliberalised”; in other words, they’re no longer connected to one another through any class location, but rather depicted as individual actors within an impersonal free market.

Copley selects these coursebooks from 1975–1982:

Industrial English (Jupp & Hodlin, 1975)

Strategies (Abbs et al., 1975)

Work and Play (Centre for British Teachers, 1977)

Challenges (Abbs & Sexton, 1978)

Streamline English (Hartley & Viney, 1978)

Opening Strategies (Abbs et al., 1982)

And these from 1998–2014:

Cutting Edge Intermediate (Cunningham & Moor, 1998)

New Cutting Edge Intermediate (Cunningham & Moor, 2005, 2013)

New Headway Pre-Intermediate (Soars & Soars, 2000)

New Headway Intermediate (3rd ed.; Soars & Soars, 2003)

New Headway Intermediate (4th ed.; Soars & Soars, 2013)

Open Mind Intermediate (Taylore-Knowles & Taylore-Knowles, 2014)

Copley’s discussion of the earlier coursebooks is too interesting for me to attempt any quick summary, so suffice it to say that he sees them as genuine attempts to provide the expanding ELT market with books that “fully acknowledged the collective and resilient nature of working-class experience”. Copley contrasts these early books with the examples from the modern era, which have moved away from depictions of collective experience to a world where explicit class identity and any acknowledgment that conflict arises out of class location has vanished, replaced by a world where “the individual acts alone in a massified world that has no social group interests and therefore does not prohibit the imposition of the individual will” (Dendrinos, 1992, p. 156).

The modern coursebook focuses on individuals who are essentially unconstrained by material considerations. Cutting Edge Intermediate (Cunningham & Moor, 1998) presents a classic example of this in a description of a group of friends who have recently graduated from university in a unit entitled “Making Plans.” They are portrayed as attractive, carefree individuals, assured of future success, whose only real concern seems to be whether to go traveling before embarking on a high-flying career:

Dan’s parents, who are both lawyers, really want him to become a lawyer too, but he isn’t so sure. He’s about to go on holiday to think things over…This is Eliza. She’s hoping to work in fashion, ideally she’d like to be a fashion editor for a glossy magazine…Ahmand’s just finished a Business Studies course and intends to work in Personnel Management eventually, but first she’s decided to go travelling for a while…This is me, Richard. I have no real plans at the moment. I’m thinking of going abroad for a while. But basically I just seem to enjoy being with all my friends! (Cunningham & Moor, 1998)

No one in the group seems in the least concerned with issues such as student debt, finding work, or affordable housing. No mention is made of the high percentages of graduates who can only find low-skilled employment and can’t repay their student loans. One nod towards reality comes when Heather, who did Drama Studies, mentions that “she’s working at the moment as a waitress,” but we’re quickly reassured that “she’s also doing lots of auditions, and she’s determined to be a star one day.” Copley notes that this particular edition of Cutting Edge was published around the same time as widespread student protests about poverty in the United Kingdom, with the National Union of Students reporting that around ninety per cent of their members were employed at some stage during their courses, the majority working for less than the legal minimum wage.

In the neoliberal coursebook there is virtually no regard paid to even the possibility that working-class occupations might involve economic hardship, physical or emotional stress, unfair treatment, or even mild dissatisfaction. In New Headway Intermediate (Soars & Soars, 2012), students are asked to interview each other about their own jobs and then report back to the class, with reporting prompts that include such sentence stems as “He likes his job because…” There are no prompts to scaffold discussion about why someone doesn’t like their job. In the one text in this edition of New Headway that does touch upon economic and social hardship, we are introduced to the Kamau family from Kenya. Boniface, his wife Pauline, and their two young daughters live in a two-bedroom apartment. Boniface works as a taxi driver in his battered Toyota, while his wife is a dressmaker and currently unemployed. We are told

his salary doesn’t go far. Rent is 30 pounds a month and he gives the same amount to his parents, who don’t work (Soars & Soars, 2012).

Students are not, however, invited to consider why such a situation might exist. Instead, we are predictably reassured with the prospect of a happy ending borne of individual entrepreneurship:

Next year, Sharon (the youngest daughter) is going to prep-school, so Pauline will have more time to start her own business. By then, the family might have a new home…Boniface plans to build a three-bedroom house in the suburbs of Nairobi. (Soars & Soars, 2012)

In New Headway Pre-Intermediate (Soars & Soars, 2000), a unit titled “Living in the USA” focuses on the stories of a number of recently arrived immigrants, including Roberto from Mexico:

At first he missed everything – the sunshine, the food, his girlfriend. But now he has a successful business with his three brothers and his sister…Roberto’s girlfriend is now his wife, and they have two children who go to American schools. (Soars & Soars, 2000)

The possibility that immigrants might experience problems more serious than homesickness, for example discrimination, state harassment, access to decent housing, low-paid work and so on, is simply off the radar. In this neoliberal fantasyland, the very real problems faced by immigrants to a country like the United States, which could well give scope to interesting discussions and access to useful language acquisition, are simply airbrushed out of the frame. And rather than collective struggle for better conditions, if there are problems then the answer, according to New Headway, is consumerism. In the same unit we are also introduced to Endre, from Hungary, who felt similarly homesick at first, but “started to feel happy when I bought a car”; and a young woman from Hong Kong, who works in Madison Avenue as a publisher and loves the department stores and cosmopolitan restaurants. The texts are followed by anodyne discussion prompts in which none of the issues raised are problematized or investigated in any serious fashion.

This emphasis on individualized agency is also the key to understanding the real limitations of the supposed feminization of coursebooks. Although strong, independent women are represented in these books, they tend to be middle class, with relatively high levels of personal control over their lives. In New Headway Intermediate (Soars & Soars, 2003) we read about Judy, who works for a computer company and spends her time jetting around in first class from one executive meeting to the next, before arriving back home in time to “put the baby to bed” (Soars & Soars, 2003). There is also a reading text about Karen Saunders, who has her own travel company in upmarket Mayfair in London that “sends people all over the world on their dream holidays” (Soars & Soars, 2003). We learn that she will soon be traveling to Canada to stay in an ice hotel, then to Dubai to stay at “the spectacular Burj al-Arab, and then she’s off to Tanzania for a seven-day safari” (Soars & Soars, 2003).

One author of a number of successful globally marketed coursebooks (Bednáriková, 2014) explained the rationale for a series entitled Open Mind in the following way:

As we put the course together, we realised that there was a very important area that we felt we had to include: life skills. Our young adult students often lack the key skills they need to use their English effectively in their professional lives, in their social lives and in their academic lives.

The project is thus to

pedagogically refashion ourselves… If we could only make ourselves better, faster, stronger, smarter, etc., in short, get our training and education right, our bright futures would once again be assured (Blacker, 2013, p. 3).

This message is continuously reinforced in the neoliberal coursebook, as in the 2013 edition of Cutting Edge Intermediate (Cunningham & Moor, 2013), which contains a unit on the topic of getting a job, titled “Go For It!” In the introductory text, “business guru” Heinz Landau suggests that job seekers should spend time on “personal improvement”, because

if you work hard on your job, you can make a living. But if you work hard on yourself, you can make a fortune.

Today’s coursebooks thus frame communication skills as formalised, measurable assets to offer employers. Human communication is now seen as essential to the organization of work, which is driven by intensified competition; this view has spread from jobs where communicative interaction is central, such as service sector jobs, to encompass virtually all occupations. In the intermediate-level Open Mind coursebook, Don Dawson works for an advertising company and loses an account with an airline called Jet Stream, which has a very poor safety record. Don says:

In response to this, my team and I decided that Jet Stream needed to build an image of safety. (Taylore-Knowles & Taylore-Knowles, 2014).

The question is not whether a corporation should stop putting profit before safety, but how best, through “effective communication”, to sell a more positive image of the brand. The ethical implications are left unexplored, the text focusing entirely on Don’s faulty “communication skills”, which failed to sell the idea to Jet Stream. Students are then “given the edge” by exploring what Don should have said, under the illusion that they’re already progressing in the atomized, cut-throat job marketplace of the 21st century.

Conclusion

To reiterate the main theme of this study: to fully understand the development of ELT coursebooks one must link it to wider social forces, the development of ELT as an industry, and the nature of commodities. The economic importance of a global ELT industry was acknowledged as far back as 1956, when a UK Ministry of Education report described English, perhaps for the first time, as a “commodity” and a “valuable and coveted export” (as cited in Pennycook, 1994, p. 155). By the 1980s, ELT had indeed become a global commercial concern and has seen continued growth since. Positioned as a “major international service industry” (Chun, 2010, p. 12), it has produced enormous profits for an interlocking teaching, testing and publishing hydra, largely reliant upon the worldwide marketization of education. Coursebooks promote and reinforce the perceived link between English and the notions of individual success and consumerism that underpin neoliberal ideology. They do not merely reflect a neoliberal zeitgeist; in many respects they are strategically positioned within it.

At the end of the 1960s there was a growing recognition among ELT practitioners that rehearsing formulaic exchanges could not meet learners’ needs, and that language should be seen not as a simple set of structure-habits, but rather as “a vehicle for the comprehension and expression of meanings” (Howatt, 1984, p. 280). This new approach was also, in its best instances, grounded in a principled and humanistic rejection of behaviorist pedagogy and informed by a wider democratic vision of what education was for, often emerging from the experience of community education and, in some cases at least, political and social activism (Rixon & Smith, 2012).

Today, in contrast, despite the self-legitimizing discourse of inclusivity found in coursebooks, the commercial interests in charge of ELT are not particularly concerned with the majority of the world’s population, those who find themselves at the bottom of the economic pyramid. Their target is the new urban middle class, with the disposable income to buy into “brand English.” As Bauman (1990) noted concerning the nature of all commodities,

they have a price-tag attached to them. These tags select the pool of potential customers … Behind the ostensible equality of chances the market promotes and advertises hides the practical inequality of consumers (p. 211).

The extent to which students, the consumers of the product, are critical of the content of ELT coursebooks is sadly under-researched, and would certainly be a productive focus for future study. Teachers, as the mediators in this process, are often uncomfortable with the cultural and political messages embedded in the materials they are obliged to use, and do their best to facilitate spaces for more critical interpretation and adaptation of content. This is, of course, to be welcomed and encouraged. Ultimately, however, a more overtly politicized awareness of the questionable role of such materials needs to develop, in order to challenge more effectively both the current hegemony of the neoliberal coursebook and many of the wider practices of the ELT industry, together with the structures of inequality and power that sustain and reinforce them.

References

Auerbach, E. R. (1995). The politics of the ESL classroom: Issues of power in pedagogical choices. In J. W. Tollefson (Ed.), Power and inequality in language education (pp. 9–33). Cambridge, UK: Cambridge University Press.


In a special issue of the L2 Journal (2015), various scholars offer “Critical Perspectives on Neoliberalism in Second/Foreign Language Education”. While the quality of writing in these articles is not always the very highest, I think the arguments they put forward are irrefutable. Below is a brief summary of the introductory paper, consisting of quotes, paraphrasing and my own additions. Six features of today’s second/foreign language education industry are identified, as follows:

1. Language as a technicized skill

Language is seen as a commodified, technicized skill (Duchêne & Heller, 2012; Heller, 2010) and individuals are seen as human capital, developed through the acquisition of skills. Language skills lead to social mobility and economic development, and language becomes essential in order to compete in the global economy. Decisions about which languages to teach and learn, and about when, where, and how to teach them, depend on the market.

2. Culture as a commodity

As language becomes a job skill, akin to knowledge of spreadsheets or word processing, culture is increasingly mythologized (Barthes, 1972) as a product used to market nation-states and to encourage learners to cultivate desires to consume. For example, the Eiffel Tower becomes the symbol of Paris that denotes the romantic atmosphere of the city. Foods such as pasta, tacos, sushi, and kimchi are introduced as representations of authentic, traditional culture. Natural environments including mountains and beaches are not simply to be appreciated but to be viewed as commodities to be developed, advertised, and sold. This conceptualization of culture implements “a tourist gaze” (Kramsch & Vinall, 2015) which is carefully modelled in the layout, graphics and texts used in coursebooks.

3. Language teachers as expendable and replaceable knowledge workers

Teachers are no longer salaried professionals whose job is to help learners psychologically, socially and intellectually to become more mature individuals. Rather, teachers are increasingly zero-hours contract workers paid a minimum hourly rate, with no job security, sick pay or pension rights, by those who control the language skills industry. They have been converted into expendable and replaceable knowledge workers, as demonstrated by the increasing reliance on this type of staff in language schools and in higher education in general.

4. Language learners as entrepreneurs and consumers

Learners are pushed to choose languages that will make them more competitive: what language you speak and what culture you embody demonstrate your market worth. Thus, learning a language becomes an act of investment. Within the classroom students also practice participation in the market. Coursebooks emphasize routinized, truncated dimensions of language used in particular settings (e.g., socialising, shopping, travelling, business interaction) and stereotypified culture. Learners are encouraged to see social phenomena as transactions, to maximize their self-interests, and to contribute to the global economy with their language skills.

5. The creation of a global language teaching industry

While language teachers are treated as expendable and replaceable knowledge workers, paradoxically, language teaching has become highly profitable and increasingly privatized. According to a report by the British Council (2015), the global market for English language learning alone is worth around US $200 billion. The global language teaching industry presents language in prepackaged, standardized forms in response to the needs of the free market. Rosetta Stone, for instance, advertises that they teach more than 30 languages around the world online (or through a CD) and that one can be fluent in a language in three months. In addition to these corporations, nation-states, including the UK (through the British Council), Mainland China (through the Confucius Institute), Germany (through the Goethe Institut), France (through the Alliance Française), and the United States continue to invest large amounts of resources to promote their languages and cultures globally.

Teacher training is part of the huge money-spinning industry. In the USA, a bachelor of arts or science degree is usually a prerequisite for doing a specialised course, such as a Masters in TESOL or in applied linguistics, or a TEFL certificate. In Europe, a university degree is not a prerequisite. The University of Cambridge Local Examinations Syndicate (UCLES) offers the most popular course, the CELTA, while Trinity College, London, offers the rival Cert TESOL. Other options are Masters courses and the DELTA. As the British Council report puts it (2015, p. 9),

although there are some 12 million English teachers active in the world today, this masks a huge global shortage.

The shortage has generated what John Knagg of the British Council referred to as

an almost insatiable demand for qualified English language instructors across the globe.

All these teachers – hundreds of thousands of them – are trained to teach English by using coursebooks. While many of those who design, write, and supervise the training become rich, only a minority of the teachers will find well-paid, secure, satisfying jobs.

Global English

To those currently running the ELT industry, the fact that English is the global lingua franca is a “good thing”. Just about everybody giving presentations during the 2018 Conference Season sees the spread of English as a liberating, empowering, democratizing force in the world; a way of levelling the playing field by providing greater access to knowledge and opportunities to all those it reaches. Those in charge of the ELT industry (the British Council, the publishing companies, the training bodies, the examination bodies, the big private school chains), and their paid spokespeople (the managers, writers, trainers, conference stars) see little wrong with the current state of the industry. For them, the commodification of ELT, with its coursebooks and high stakes exams and CELTA training courses; its huge profits for the few and precarious conditions and pay for so many, might not be perfect, but it’s still, as Penny Ur might say, the most sensible, most practical way to organise things. The conferences will be generally upbeat; the usual suspects will talk about newer, improved ways of doing more or less the same thing, and everybody will somehow convince themselves that, contrary to the evidence, little by little, things are getting better and better.

Meanwhile, those contributing to the special edition of L2 Journal take a more critical perspective on the global spread of English. They see it as a reflection of complex processes of globalization which lead to the privileging of elites, a widening gap between rich and poor, and linguistic as well as cultural homogenization. This, in turn, results in cultural loss and threatens the vitality and survival of local languages. With May (2011, p. 213), they call attention to “the relationship between English and wider inequitable distributions and flows of wealth, resources, culture and knowledge— especially, in an increasingly globalized world”. As evidence, they point to the experience of all those people who learn English in hopes of moving to an English-speaking country, but are then denied access, and of all those people who have been let down by those who sold them English.

The global spread of English is more hegemonic than democratic, it oppresses more than it liberates, it threatens more than it empowers, it serves the interests of a small minority of the world’s population and betrays the interests of the rest. Again and again throughout the special issue of the L2 Journal, studies report on participants who were let down by the false promise of learning English (Kubota, 2011). Most of the subjects of these studies set out to learn English so as to improve their job opportunities, or to gain respect in the workplace, or admission to top universities, or participation in the global marketplace, and most found that these rewards never materialized.

Directly tied to this false promise of English is the notion that gains made in one context are not recognized in another, as particular varieties or repertoires of English are valued differently in different markets or fields. In one study, Gao and Park point out, for example, that

the English learned by young South Koreans living in Singapore is not valued in South Korea, a great disappointment to their mothers who sought standard English for their children only to find that they had come home speaking “Singlish” (Bernstein et al., 2015, p. 12).

Similarly, in another study, Jang points out how the communicative skills sought out overseas by South Korean students do not, in the end, trump the TOEIC exam results.

South Korean students who return from studying in Canada have difficulty documenting their new skills in ways that are meaningful on the job market, although it was the demands of the job market—for workers who show flexibility, collaboration, and global sensitivity—that sent them to Canada in the first place (Bernstein et al., 2015, p. 12).

In a third study, Hsu notes that

although many Filipinos are native speakers of English and are marketed as such by the Philippine government in its bids to attract corporate call centers to their country, when Filipinos arrive in the United States, they are seen as foreigners and English learners, with incomprehensible accents (Bernstein et al., 2015, p. 12).

The issue concludes with two articles dedicated entirely to modeling approaches to resistance. Davis and Phyak’s paper illustrates how researchers in various contexts can work with local populations to make changes in hegemonic language policies and practices. Ramírez and Hyslop-Margison’s manuscript provides specific tools for deconstructing texts that draw their authority from hegemonic discourses—in their case, those of crisis and neoliberal austerity.

Together, the papers in this special issue move beyond “critique”; they take us toward action, toward alternative discourses, and toward other possibilities for imagining language in education (Bernstein et al., 2015, p. 13).


In the last thirty years, Penny Ur has published more than 30 books: coursebooks, workbooks, grammar practice books, skills practice books and “How to Teach” books. In all that time she has never wavered in her support of the same approach to ELT that she was taught when she did a PGCE at Cambridge University all those years ago.

What’s remarkable is that today, Ur continues to recommend, as keenly as she ever did, the same carefully controlled, anodyne routines that the PGCE course recommended way back then. According to this view, teachers, wherever they happen to be in the world, should use a coursebook produced in London to deliver a synthetic, grammar-based syllabus by working their way steadily through a succession of Units where “language items” are “presented, practiced and tested”, until they come to the bit of the book at the back with no writing on it, when they should stop.

Given that coursebooks have been adopted around the world as the preferred way of implementing ELT since the early 1990s, we may say that Ur’s faith in the coursebook-driven approach has been vindicated. Certainly, her tireless, consistent promotion of the same cause has won her a fair amount of success, fame and recognition, which includes being awarded an OBE (Officer of the Order of the British Empire) for services to English Language Teaching in 2013.

I happen to think that the approach championed by Ur is wrong, partly because it’s based on false assumptions about how people learn a foreign language (see this post for more about these false assumptions), and partly because it represents a stifling orthodoxy which has gone hand in hand with the commodification of ELT in particular and education in general. It fits perfectly with the general drive towards the implementation of ‘adaptive learning’ programmes which reduce education to the learning of discrete units of testable ‘knowledge’, delivered with minimum mediation by teachers. The result is the de-skilling of teachers, the reconfiguring of learners as consumers, and, as Scott Thornbury so memorably put it: Comfort. Complacency. Conformity. Professional atrophy. Institutional malaise. Student boredom. Slow death by McNuggets.

Whichever side of the argument you’re on, you’ll surely agree that the powerfully entrenched, coursebook-driven model of ELT should at least be open to criticism, and that it’s a “good thing” for there to be open discussion about how best to help people learn English as an L2. Even if everybody were happy to implement the kind of ELT recommended by Penny Ur, in the name of professionalism teachers should at least know something about on-going research into the English language, how people learn English as an L2, and how various teaching programmes have been evaluated.

And here’s where we hit a problem, because Penny Ur, apart from staunchly defending coursebook-driven ELT, also promotes herself as a mediator between the academic world of applied linguistics and the classroom teacher; able, she claims, to reliably inform teachers about what’s going on in academia, despite the fact that she has no credentials for such a job. Ur has never published an article in an academic journal, she shows few signs of knowledge of the SLA literature, and she consistently dismisses significant research findings when they challenge her own approach to teaching. Ur tells readers of the UK Guardian newspaper, and of the ELT Gazette, and all those who attend her teacher training courses and conference presentations about what’s going on in applied linguistics research, while at the same time admitting that she misses a lot of what’s published, and breezily dismissing the inconvenient mountain of data which point to the fact that students don’t learn what they’re taught if they’re subjected to a synthetic, grammar-based syllabus. When asked, for example, “Why don’t you mention the research findings on interlanguage development?”, Ur replies “We have no conclusive proof” (Ur, 2017a), as if all the evidence that we do have counts for nothing.

Ur’s claim to be able to mediate between the world of academic research on the one hand, and the world of the classroom teacher on the other, is not just unwarranted, it’s also misleading and unfair, especially to novice teachers who assume that Ur knows what she’s talking about when she tells them, for example, that there is no evidence that TBLT works, or that Pienemann’s teachability hypothesis has only very doubtful implications for teaching. In her article in the Guardian (2012) Ur makes her disdain for most of what passes for academic work perfectly clear. First, Ur says, academics concentrate “almost exclusively on language acquisition”; second, the studies reported on “are selected for reasons that have nothing to do with their usefulness to the practitioner”; third, “topics that are difficult to research, though possibly more valuable for the teacher, tend to be neglected”; and finally:

researchers are not practitioners. Many have very limited or nonexistent teaching experience so their ideas on the pedagogical implications of their results may not be very practical and need to be treated with caution.

Notice that while Ur has no doubts about her own ability to speak on academic matters, she cautions against giving any credence to academics’ ideas on teaching.

In her books on how to teach English as a foreign language, Ur spends very little time discussing the question of how people learn an L2, or encouraging teachers to take part in a critical evaluation of theoretical assumptions underpinning her practical teaching tips. The updated edition of her widely recommended A Course in Language Teaching includes a new sub-section where precisely half a page is devoted to describing theories of SLA. For the rest of the 300 pages, Ur expects readers to take her word for it when she says, as if she knew, that the findings of applied linguistics research have very limited relevance to teachers’ jobs. Nowhere in any of her books or articles or presentations does Ur attempt to seriously describe and evaluate arguments and evidence from academics whose work challenges her approach, and nowhere does she encourage teachers to do so.

Ur’s work is evidence of the distinction Richards (2008) makes between two broad streams in teacher education: the first at the certificate level, where trainees receive instruction in classroom skills, and the other, ‘teacher development’, where teachers learn more about second language acquisition. How can we expect teachers to be well-informed, critically acute professionals in the world of education if their training is restricted to instruction in classroom skills, and their on-going professional development gives them no opportunities to consider theories of language, theories of language learning, and theories of teaching and education? Can we really afford to agree with Ur’s view that there’s nothing broken in teacher training in ELT?

Here are a few excerpts from Ur’s books and articles. Note that the first 4 quotes are from the 1991 edition, updated in 2009, of A Course in Language Teaching.

1. Ur, P. (1991, p. 10)

In principle, the teaching processes of presenting, practising and testing correspond to strategies used by many good learners trying to acquire a foreign language on their own. …

In the classroom it is the teacher’s job to promote these three learning practices by the use of appropriate teaching acts.

Comment: Notice the careful hedging of the first claim (“In principle”, “strategies used by many good learners” ) and the sweeping non-sequitur that follows. This is a good example of Ur’s argumentation.

2. Ur, P. (1991, p. 12)

The learners need to take the material into short-term memory; to remember it, that is, until later in the lesson when you and they have an opportunity to do further work to consolidate learning.

Comment: The duration of short-term memory is between 15 and 30 seconds.

3. Ur, P. (1991, p. 14)

Note that some learners remember better if material is seen, others if it is heard, yet others if it is associated with physical movement (visual, audio and kinaesthetic input)…

Comment: There is, of course, no evidence to support the theory of NLP or the notion of learning styles; it’s all been thoroughly debunked.

4. Ur, P. (1991, p. 26) RE a Spelling Activity.

The students remarked afterwards that the activity had helped to fix the spellings in their minds and the teacher noticed that this was borne out by their subsequent performance in free writing.

Comment: Any doubts about the weight of this “evidence” will no doubt come from academics whose opinion can be safely ignored, since they know nothing about real classroom practice.

5. Ur, P. (2012)

Teaching grammar proactively through traditional focus on formS is effective.

Comment: No ifs, no buts, it’s effective. So there.

6. Ur, P. (2017b)

There is no evidence that TBLT works.

Comment: There have been over 60 studies of TBLT published in academic journals in the last 15 years. The vast majority of them report an overall positive and strong effect for TBLT implementation on a variety of learning outcomes. Furthermore, both the quantitative and qualitative data show positive stakeholder perceptions towards TBLT programmes.

7. Ur. P. (quoted by Thornbury, 2017)

It’s certainly possible to write helpful and valid professional guidance for teachers with no research references whatsoever.


How do we help people learn an L2? A major finding of SLA research is that learners of an L2 cannot be taught what they’re not ready to learn, because they’re all at some particular point in the development of interlanguages which are impervious to instruction. This suggested to many that we should help learners along their trajectory by finding out what their needs in the L2 are and then engaging them in relevant communicative tasks. Some brief, carefully-measured attention, now and then, to relevant aspects of the grammar is seen as an important way to speed up the development.

Then, along comes Schmidt and suggests that consciously ‘noticing’ formal features of L2 input is a necessary condition for learning, and this is taken by proponents of synthetic syllabuses which deliver bits of grammar or an endless succession of lexical chunks to mean that lots of explicit grammar and/or vocabulary teaching will help learners to ‘notice’ and to ‘notice the gap’.

Recall that in his original 1990 paper, Schmidt claimed that “intake” was the sub-set of input which is noticed, and that the parts of input that aren’t noticed are lost. Thus, Schmidt’s Noticing Hypothesis, in its 1990 version, claims that noticing is the necessary condition for learning an L2.

‘Noticing’ is said to be the first stage of the process of converting input into implicit knowledge. It takes place in short-term memory (where, according to the original claim, the noticed ‘feature’ is compared to features produced as output) and it is triggered by these factors: instruction, perceptual salience, frequency, skill level, task demands, and comparing.

Criticisms of Schmidt’s hypothesis:

1. It fails to distinguish carefully enough between attention and awareness

In reply to Schmidt’s argument that attention research supports the claim that consciousness is necessary for learning, Truscott (1998) points out that such claims are “difficult to evaluate and interpret”. He cites a number of scholars and studies to support the view that the notion of attention is “very confused”, and that it’s “very difficult to say exactly what attention is and to determine when it is or is not allocated to a given task. Its relation to the notoriously confused notion of consciousness is no less problematic”. He concludes (1998, p. 107) “The essential point is that current research and theory on attention, awareness and learning are not clear enough to support any strong claims about relations among the three.”

2. Empirical support for the Noticing Hypothesis is weak

Truscott (1998) points out that the reviews by Brewer (1974) and Dawson and Schell (1987), cited by Schmidt (1990), dealt with simple conditioning experiments and that, therefore, inferences regarding learning an L2 were not legitimate. Brewer specifically notes that his conclusions do not apply to the acquisition of syntax, which probably occurs “in a relatively unconscious, automatic fashion” (p. 29).

Truscott further points out that while most current research on unconscious learning is plagued by continuing controversy, “one can safely conclude that the evidence does not show that awareness of the information to be acquired is necessary for learning” (p. 108).

Altman (1990) gathered data in a similar way to Schmidt (1986) in studying her learning of Hebrew over a five-year period. Altman found that while half her verbalisation of Hebrew verbs could be traced to diary entries of noticing, it was not possible to identify the source of the other half and they may have become intake subconsciously.

Alanen’s (1992) study of Finnish L2 learning found no significant statistical difference between an enhanced input condition group and the control group.

The studies are not comparable due to variations in focus and in the conditions operationalized.

The level of noticing in the studies may have been affected by uncontrolled variables, which casts doubt on the reliability of the findings.

Cross (2002) notes that “only Schmidt and Frota’s (1986) and Altman’s (1990) research considers how noticing target structures positively relates to their production as verbal output (in a communicative sense), which seems to be the true test of whether noticing has an effect on second language acquisition. A dilemma associated with this is that, as Fotos (1993) states, there is a gap of indeterminate length between what is noticed and when it appears as output, which makes data collection, analysis and correlation problematic.”

Ahn (2014) points to a number of problems that have been identified in eye-tracking studies, especially those using heat map analyses (see Ahn (2014) for the references that follow). First, heat maps are only “exploratory” (p. 239), and they cannot provide temporal information on eye movement, such as regression duration, “the duration of the fixations when the reader returns to the lookzone” (Simard & Foucambert, 2013, p. 213), which might tempt researchers to rush into a conclusion that favors their own predictions. Second, as Godfroid et al. (2013) accurately noted, the heat map analyses in Smith (2012) could not control the confounding effects of “word length, word frequency, and predictability, among other factors” (p. 490). This might have yielded considerable confounding effects as well. As we can infer from the analyses shown in Smith (2012), the utmost need in the field at present is for specific guidelines for using eye-tracking methodology to conduct research focusing on L2 phenomena (Spinner, Gass, & Behney, 2013). Because little guidance is available, the use of eye tracking is often at risk of misleading researchers into making unreliable interpretations of their results.

Schmidt re-formulated his Noticing Hypothesis in 2001. He begins by saying that, to minimise confusion, he will use ‘noticing’ as a technical term equivalent to what Gass (1988) calls “apperception”, what Tomlin and Villa (1994) call “detection within selective attention”, and what Robinson (1995) calls “detection plus rehearsal in short term memory”. What is noticed are now “elements of the surface structure of utterances in the input, instances of language” and not “rules or principles of which such instances may be exemplars”. Noticing does not refer to comparisons across instances or to reflecting on what has been noticed.

In the section “Can there be learning without attention?”, Schmidt admits there can, with the L1 as a source that helps learners of an L2 being an obvious example. Schmidt says that it’s “clear that successful second language learning goes beyond what is present in input”. Schmidt presents evidence which, he admits, “appears to falsify the claim that attention is necessary for any learning whatsoever”, and this prompts him to propose the weaker version of the Noticing Hypothesis, namely “the more noticing, the more learning”.

Apperception

As was mentioned, Schmidt (2001) says that he is using ‘noticing’ as a technical term equivalent to Gass’ apperception. True to dictionary definitions of apperception, Gass defines apperception as “the process of understanding by which newly observed qualities of an object are initially related to past experiences”. The light goes on, the learner realises that something new needs to be learned. It’s “an internal cognitive act in which a linguistic form is related to some bit of existing knowledge (or gap in knowledge)”. It shines a spotlight on the identified form and prepares it for further analysis. To me, this clashes with Schmidt’s insistence that noticing does not refer to comparisons across instances or to reflecting on what has been noticed, and in any case, it is not at all clear to me how the subsequent stages of Gass’ model convert apperceptions into implicit knowledge of the L2 grammar.

Schmidt says that ‘noticing’ is also equivalent to what Tomlin and Villa (1994) call “detection within selective attention.” But it seems to me that ‘noticing’ isn’t at all equivalent to what Tomlin and Villa really wanted to talk about – detection that does not require awareness. According to Tomlin and Villa, the three components of attention are alertness, orientation, and detection, but only detection is essential for further processing and awareness plays no important role in L2 learning.

In the 2010 paper, Schmidt confirms the concessions which amount to saying that ‘noticing’ is not needed for all L2 learning, but that the more you notice the more you learn. He also confirms that noticing does not refer to reflecting on what is noticed.

You can’t notice grammar

Finally, we get a glimpse of an answer to Gregg’s crucial question about how we get from ‘noticing’ to the acquisition of linguistic competence in Schmidt’s 2010 paper, where he deals with Suzanne Carroll’s objection to his hypothesis. Schmidt succinctly summarises Carroll’s view that attention to input plays little role in L2 learning because most of what constitutes linguistic knowledge is not in the input to begin with. She argues that Krashen, Schmidt and Gass all see “input” as observable sensory stimuli in the environment from which forms can be noticed,

whereas in reality the stuff of acquisition (phonemes, syllables, morphemes, nouns, verbs, cases, etc.) consists of mental constructs that exist in the mind and not in the environment at all. If not present in the external environment, there is no possibility of noticing them.

Schmidt’s answer is:

In general, ideas about attention, noticing, and understanding are more compatible with instance-based, construction-based and usage-based theories (Bley-Vroman, 2009; Bybee & Eddington, 2006; Goldberg, 1995) than with generative theories.

Which is not much better than no answer at all. Carroll effectively answers Gregg’s question by saying that all those who start with input, following Krashen, get things backwards. I offered this quote from Carroll (2001, p. 11) at the end of Part 2:

The view that input is comprehended speech is mistaken and has arisen from an uncritical examination of the implications of Krashen’s (1985) claims to this effect. …… Comprehending speech is something which happens as a consequence of a successful parse of the speech signal. Before one can successfully parse the L2, one must learn its grammatical properties. Krashen got it backwards!

Learners do not attend to things in the input as such, they respond to speech-signals by attempting to parse the signals, and failures to do so trigger attention to parts of the signal. Carroll’s Autonomous Induction Theory is too complicated for me to offer a brief summary of, but in my opinion, Carroll’s assertions that it is possible to have speech-signal processing without attention-as-noticing or attention-as-awareness are persuasive. She argues that learners may unconsciously and without awareness detect, encode and respond to linguistic sounds; that learners don’t always notice their own processing of segments and the internal organization of their own conceptual representations; that the processing of forms and meanings are often not noticed; and that attention is the result of processing not a prerequisite for processing.

In brief:

The Noticing Hypothesis even in its amended version does not clearly describe the construct of ‘noticing’.

The empirical support claimed for the Noticing Hypothesis is not as strong as Schmidt (2010) claims.

A theory of SLA based on noticing a succession of forms faces the impassable obstacle that, as Schmidt seemed to finally admit, you can’t notice rules or principles of grammar.

“Noticing the gap” is not sanctioned by Schmidt’s amended Noticing Hypothesis.

The way that so many writers and ELT trainers use “noticing” to justify all kinds of explicit grammar and vocabulary teaching demonstrates that Schmidt’s Noticing Hypothesis is widely misunderstood and misused.

There we are then. My attempt to understand Schmidt’s propositions reminds me of Wittgenstein’s famous conclusion to his Tractatus:

My propositions serve as elucidations in the following way: anyone who understands me eventually recognizes them as nonsensical, when he has used them — as steps — to climb beyond them. (He must, so to speak, throw away the ladder after he has climbed up it.)

Schmidt, R. and Frota, S.N. (1986). Developing basic conversational ability in a second language: a case study of an adult learner of Portuguese. In Day, R.R. (Ed.), Talking to learn: conversation in second language acquisition. Rowley, MA: Newbury House.

These conferences ignore the increasing commodification of the education industry.

Teachers are seen as the deliverers of courses which reduce education to the learning of a pre-determined list of discrete units of testable ‘knowledge’.

This “knowledge” is delivered by coursebooks with minimal mediation from de-skilled teachers, and it turns learners into consumers.

The ‘knowledge’ is tested by a raft of high-stakes exams which reify that ‘knowledge’.

Most learners of English as an L2 fail the exams they take.

Most teachers of English as an L2 have no job security, no proper contracts, no professional development provisions, no pension plans, bad pay, and bad working conditions.

With this backdrop, once again we’re about to witness the sorry spectacle of the usual suspects strutting their stuff, spinning their tired recipes, while ignoring the failure of ELT as an industry to address the problems alluded to above.

The conferences will be sponsored by those who make huge profits from the on-going provision of shoddy goods. These providers – the big publishing companies, the British Council, the Cambridge Clan who control proficiency exams and teaching certification – crucially influence the conference agenda, they make sure that the plenary speakers talk safely within carefully defined limits, and they sponsor speakers to promote their products. It’s very unlikely that anything challenging will be said, and if it is, it will be said in a back room. Just about the only chance of any controversial stuff being widely aired is that one of these back room talks will be picked up by social media, without any support from the organisers.

Judging from questionnaire data, teachers feel that the best part of conference attendance is the chance to share time with their colleagues. Get together, swap ideas, tell stories, support each other, wine, dine, dance and feel good together. But why depend on conferences like IATEFL and TESOL to do these things? Why should we continue to go to these events where we’re lectured by highly paid clowns pushing dodgy goods, and where no serious attempt is made to address the issues that need discussing? I’m labouring the point now, but why should thousands of teachers pay to listen to Jeremy Harmer promoting the Pearson Academic English exam by describing how he learned to play the tuba, rather than pay a tenth of the price to share in a conversation with someone local who can talk knowledgeably about language testing?

Why support these commercial events with their shows and shoddy spectacles that reflect the commodification of education, instead of organising our own events?

Hundreds of anecdotes about “What I got from a conference” should not deflect attention from this simple argument: the aim of the big ELT conferences is to promote coursebook-driven ELT and to encourage the mistaken view that there’s nothing fundamentally wrong with the way things are going. Locally organised events are a credible, better alternative.


If I’d actually drunk a bottle of tequila while trying to understand Schmidt’s Noticing Hypothesis last Tuesday, I would have woken up with a hangover, and these days the hangovers are so bad that I just can’t face them. So when I woke up the morning after, all was well; my surroundings were familiar, my wife was with me, there was nothing to make amends for. Reassuring, of course, but I confess to feeling nostalgia for my younger days. There’s nothing quite like the fun you have drinking; the Devil has all the best songs, they say, and I bet Hades had all the best cocktails. Easy to imagine getting the ferry across the Acheron, sitting around the lounge bar waiting to see where you were going (probably not to the Elysian fields!), banging back dry martinis with funny people like W.C. Fields (“I cook with wine. Sometimes I even add it to the food”) and Tommy Cooper (“I’m on a whisky diet. I’ve lost three days already”), grateful that you’d never been a mere sober mortal.

Downstairs, I made a nice big mug of tea and took it to the study. There on the desk and on the monitor was all this stuff about the Noticing hypothesis. Not just Schmidt versus Truscott, and Gregg versus Krashen, and all the other SLA feuds, but also the famous Locke versus Leibniz debate and the equally famous Aristotle versus Plato debate about more or less the same thing. Aristotle wasn’t quite an empiricist, but certainly got the better of Plato on epistemology, while Leibniz is generally regarded as coming out on top against Locke. The Leibniz-Locke debate especially still seems relevant today in the light of the latest challenge to nativist views on language learning, and I think Leibniz might have had some harsh words to say about the blurred lines between awareness, attention and consciousness in Schmidt’s attempts to develop the Noticing Hypothesis.

Just to reassure those who might be unduly swayed by the likes of Penny Ur (and Scott Thornbury on a bad day) into thinking that they shouldn’t worry their heads with all this theoretical stuff (just trust your instincts and polish your presentation skills), my motivation for sniffing around this particular theoretical stuff is to check on the foundations of our teaching. It’s a terrible job, the pay’s lousy, but somebody’s got to do it, right? Somebody’s got to check, that is, to see whether ‘noticing’ justifies all the explicit teaching done in its name. I suspect that the influential teacher trainers who rely on ‘noticing’ to justify their encouragement of everything from teaching a grammar-based syllabus to teaching as many lexical chunks as you can cram into a 90-minute class are talking baloney, and it should be made clear that their advice gets no support from any good research. On the face of it ‘noticing’ encourages bad teaching practice, and so needs to be carefully examined.

So here we go with Part 2. I left Part 1 face down on the carpet, exhausted by unsuccessful efforts to understand the Noticing Hypothesis. In the comments that followed, one particular problem was highlighted by Kevin Gregg, who said:

You can’t notice what is not in the input; and rules, for instance, or functions, are not in the input.

This prompted Thom to ask:

In what other way can anybody learn grammar if it is not by way of input?

Kevin’s on-going tussle with time (trains to catch, letters to write, shopping to do) prevented him from replying, so I’ll try.

Well, it depends where you’re coming from, as they say. Empiricists, or rather, “‘empiricist’ emergentists” as Gregg calls them, would say that input is the sufficient condition for learning an L2, and they’d probably caution against listening to any talk of mental grammars. Empiricists like Nick Ellis see all knowledge as coming from the information we get through our senses during our interaction with the environment, and with reference to language learning, the emergentists argue that we aren’t born with linguistic knowledge of any sort because we don’t need it. General learning devices (capable of making generalisations based on exemplars found in the input, for example) are all we need. In Nick Ellis’ words:

massively parallel systems of artificial neurons use simple learning processes to statistically abstract information from masses of input data. What evidence is there in the input stream from which simple learning mechanisms might abstract generalizations? The Saussurean linguistic sign as a set of mappings between phonological forms and conceptual meanings or communicative intentions gives a starting point. Learning to understand a language involves parsing the speech stream into chunks which reliably mark meaning.

… in the first instance, important aspects of language learning must concern the learning of phonological forms and the analysis of phonological sequences: the categorical units of speech perception, their particular sequences in particular words and their general sequential probabilities in the language….

In this view, phonology, lexis and syntax develop hierarchically by repeated cycles of differentiation and integration of chunks of sequences.
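Ellis’s picture of chunking by sequential probabilities can be made concrete with a toy statistical-learning sketch. This is my own illustration, not Ellis’s code: the invented syllable “words” (in the style of classic artificial-language experiments) and the 0.9 threshold are assumptions for the example. Within-“word” transition probabilities are high, cross-boundary ones are low, so a learner tracking nothing but adjacent-syllable statistics can segment a continuous stream into chunks:

```python
from collections import defaultdict

def transition_probs(syllables):
    """Forward transition probability P(b | a) for adjacent syllables."""
    pair_counts = defaultdict(int)
    first_counts = defaultdict(int)
    for a, b in zip(syllables, syllables[1:]):
        pair_counts[(a, b)] += 1
        first_counts[a] += 1
    return {(a, b): n / first_counts[a] for (a, b), n in pair_counts.items()}

def segment(syllables, probs, threshold=0.9):
    """Insert a chunk boundary wherever the transition probability dips."""
    chunks, current = [], [syllables[0]]
    for a, b in zip(syllables, syllables[1:]):
        if probs.get((a, b), 0.0) < threshold:
            chunks.append("".join(current))
            current = []
        current.append(b)
    chunks.append("".join(current))
    return chunks

# A continuous stream built from three invented "words" (no pauses, no cues).
words = {"bidaku": ["bi", "da", "ku"],
         "padoti": ["pa", "do", "ti"],
         "golabu": ["go", "la", "bu"]}
order = ["bidaku", "padoti", "bidaku", "golabu", "padoti",
         "golabu", "golabu", "bidaku", "padoti"]
stream = [s for w in order for s in words[w]]
chunks = segment(stream, transition_probs(stream))
print(chunks)  # the "words" re-emerge purely from sequence statistics
```

The point of the sketch is only that differentiation of chunks can fall out of simple distributional learning with no grammar, no rules, and no ‘noticing’ in Schmidt’s technical sense, which is exactly the emergentist wager.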

On the other hand, nativists like Kevin Gregg, especially those who accept Chomsky’s principles and parameters UG theory, point to the knowledge young children have of language to argue that SLA is the result of an innate representational system specific to the language faculty acting on input in such a way that an L2 grammar is created. We are born with knowledge of various linguistic rules, constraints and principles. In interaction with the environment, which exposes us to ‘primary linguistic data’, we acquire a new, expanded body of linguistic knowledge, namely, knowledge of a specific language like English. This final state of the language faculty constitutes our ‘linguistic competence’, essential, but not sufficient for our ability to speak and understand a language. Additional knowledge about actual language use is acquired through other general learning mechanisms.

Whatever view we take of the SLA process, the question of how it starts (input) is obviously critical, but re-visiting Schmidt’s Noticing Hypothesis has led me to appreciate that the question of how it ends up is equally important. What finally gets acquired? To answer this question we need what Gregg calls a “property” theory of SLA – a theory of language, or, more precisely, of linguistic knowledge of the L2. What is the knowledge that is acquired when someone learns a second language? O’Grady (2005) notes that while the UG camp talk about problems sorting out categories and structures, the emergentists talk about sorting out words and their meanings, and this leads him to suggest that the disagreement about how we learn an L2 stems from a deeper disagreement about “the nature of language itself”. O’Grady (2005, p. 164) explains:

On the one hand, there are linguists who see language as a highly complex formal system that is best described by abstract rules that have no counterparts in other areas of cognition. …. Not surprisingly, there is a strong tendency for these researchers to favor the view that the acquisition device is designed specifically for language. On the other hand, there are many linguists who think that language has to be understood in terms of its communicative function. According to these researchers, strategies that facilitate communication – not abstract formal rules – determine how language works. Because communication involves many different types of considerations … this perspective tends to be associated with a bias toward a multipurpose acquisition device.

This excellent comment is echoed by Susanne Carroll (2001, p. 47), who distinguishes between

Classical structural theories of information processing which claim that mental processes are sensitive to structural distinctions encoded in mental representations. Input is a mental representation which has structure.

Classical connectionist approaches to linguistic cognition which deny the relevance of structural representations to linguistic cognition. For them, linguistic knowledge is encoded as activated neural nets and is only linked to acoustic events by association.

Carroll comments:

Anyone who is convinced that the last 100 years of linguistic research demonstrate that linguistic cognition is structure dependent — and not merely patterned — cannot adopt a classical connectionist approach to SLA.

O’Grady’s and Carroll’s remarks remind me that the majority of scholars who are currently looking closely at how input ends up as knowledge don’t articulate a coherent answer to the crucial question: “What is the linguistic knowledge that is acquired?”. Many years ago, I myself made some effort to kick this question into the long grass. Gregg’s repeated insistence on the need for a property theory of SLA which describes what is acquired prompted me to say in a book and in an article for Applied Linguistics that researchers could perfectly well get on with developing a theory of SLA without worrying about the damn property theory. In a short reply (I think he had a bus to catch that time), Gregg effortlessly dealt with my bleatings (the bus and, I like to think, our friendship saved me from the full Gregg treatment) and I’m now fully persuaded that he’s right to demand a property theory.

I think it’s the absence of a well-articulated property theory that makes it so difficult for Schmidt and others to explain how information from the environment ends up as linguistic knowledge of the L2. They accept that the knowledge acquired includes linguistic knowledge of, for example, the structure of an English verb phrase, and they insist that learning this knowledge depends on ‘noticing’ things in the input. But how, we must ask again, does ‘noticing’ audio stimuli from the environment lead to the acquisition of the linguistic knowledge demonstrated by proficient L2 users? Let’s take a quick look at the history of SLA research.

The shift from a behaviouristic to a mentalist view of language learning (sparked by Chomsky’s 1959 rebuttal of Skinner) prompted scholars in the field of psycholinguistics to see language learning as a process which goes on inside the brain and involves the workings of some kind of acquisition device. The as-yet-unobservable “black box” that we can refer to as an acquisition device is almost certainly not located in one particular part of the brain, might or might not be dedicated exclusively to language learning, might or might not make use of innate linguistic knowledge, but certainly does (somehow) enable us to receive, organise, store, retrieve, and manipulate ‘input’ so as to facilitate learning the L2.

And there it is: ‘input’. The Merriam-Webster dictionary says that the term was first used in 1953, in the context of computer design, to refer to data sent to a computer for processing. In the study of SLA, Corder (1967) was the first to suggest that we acquire the rules of language in a predictable way, and that the order is independent of the order in which rules are taught in language classes. This led Corder to suggest that there was a difference between input and intake.

The simple fact of presenting a certain linguistic form to a learner in the classroom does not necessarily qualify it for the status of input, for the reason that input is ‘what goes in’ not what is available for going in, and we may reasonably suppose that it is the learner who controls this input, or more properly his intake. This may well be determined by the characteristics of his language acquisition mechanism. (p. 165).

Here, input is what’s available, and intake is what the learner decides to take in. It’s not clear to me what either ‘input’ or ‘intake’ refer to, and anyway, as Schmidt (1990) points out, Corder contradicts himself by saying in the first sentence that the learner controls intake, and by then saying in the second sentence that his language acquisition mechanism does. More importantly for our hunt, Schmidt goes on to say that it’s not clear whether intake is the subset of input that makes it into short term memory, or whether it’s that part of input that has been sufficiently processed to now form part of the learner’s interlanguage system. The way Schmidt expresses this second point is instructive. Schmidt says that Corder’s treatment of intake does not make any clear distinction between that part of input used to comprehend messages and that part used “for the learning of form” (Schmidt, 1990, p. 139). Schmidt also endorses Slobin’s (1985) distinction between processes involved in converting input into stored data for the construction of language, and processes used to organise stored data into linguistic systems. Schmidt is obviously aware (sorry) of the problem of clearly identifying not just the level of conscious attention /awareness involved in noticing, but also the problems of clearly defining what is noticed and what (if any) processing goes on when learners notice whatever it is they notice.

Moving on to Krashen, his input hypothesis draws on the “natural order” of L2 acquisition that Corder drew attention to, and supposes that learners progress along a pre-determined learning trajectory which is impervious to instruction and controlled by a language acquisition device. Acquisition, Krashen says, is triggered by receiving L2 input that is one step beyond their current stage of linguistic competence. If a learner is at a stage ‘i‘, then acquisition takes place when he/she is exposed to ‘Comprehensible Input’ which belongs to level ‘i + 1‘. In Krashen’s model, learners only need comprehensible input and a low affective filter to acquire the L2, because once the i+1 input is received, Chomsky’s LAD does the rest. Almost needless to say, the trouble with Krashen’s input hypothesis is that he nowhere explains what comprehensible input consists of, or tells us how to recognise it.
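Krashen’s claim can be caricatured in a few lines of code, and the caricature makes the blog’s own objection visible. This sketch is entirely my own illustration (the numeric ‘stages’ are invented): the model is trivially easy to state, yet nothing in Krashen tells us what a stage ‘i’ actually is or how to measure it, which is exactly why the hypothesis resists testing.

```python
def acquire(i, input_level):
    """Krashen's input hypothesis, schematically: acquisition happens only
    when comprehensible input sits exactly one step beyond the learner's
    current stage i. Input at level i (already known) or i + 2 (beyond
    reach) changes nothing. The integer 'stages' are a fiction for
    illustration; Krashen never says how to identify them."""
    return i + 1 if input_level == i + 1 else i

stage = 3
print(acquire(stage, 4))  # i + 1 input: the learner advances to stage 4
print(acquire(stage, 6))  # incomprehensible input: stuck at stage 3
print(acquire(stage, 3))  # already-known input: stuck at stage 3
```

The formal emptiness of the sketch is the point: until ‘i’ and ‘i + 1’ are operationalised, the model can neither be applied nor falsified.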

Unsurprisingly, Schmidt’s not very impressed with Krashen’s badly-defined hypothesis, but it’s not just the lack of definition that Schmidt objects to; crucially, Schmidt insists that SLA is triggered by conscious attention. Krashen’s comprehensible input is, says Schmidt, much better seen as intake, itself defined as that part of the input which is ‘noticed’. What learners actually do, on this account, is consciously attend to – notice – certain parts of the input, and the noticed parts become intake. Furthermore, since the parts of the input which aren’t ‘noticed’ are lost, it follows that noticing is the necessary condition for learning an L2. In his 1990 paper, at least, the claim is not, as so many now want to interpret the Noticing Hypothesis, “More noticing leads to more learning”, but rather the much stronger claim: “Learning can’t take place without noticing”.
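The strong 1990 claim reduces to a filter, and stating it as one shows how strong it really is. This is a deliberate caricature of my own, not an implementation of Schmidt’s psychological construct; the example forms and the ‘noticed’ set are invented:

```python
def intake(input_forms, noticed):
    """Schmidt's strong 1990 claim as a filter: only consciously noticed
    forms become intake; unnoticed forms are lost to the learning system
    entirely. Order of the input is preserved."""
    return [form for form in input_forms if form in noticed]

# Hypothetical forms present in the input, and the subset the learner notices.
input_forms = ["went", "third-person -s", "a/an", "past -ed"]
noticed = {"went", "past -ed"}
print(intake(input_forms, noticed))  # everything unnoticed simply vanishes
```

On this picture no unnoticed form can ever contribute to acquisition, which is precisely the consequence (ALL learning conscious?) that the rest of this post finds either trivial or monstrously false.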

In the next post, I intend to look at processing models and try to pin down Schmidt’s “technical” definition of ‘noticing’, which he says is “equivalent” to Gass’ ‘apperception’. Hmmm. I’ll also look at Suzanne Carroll’s very different view of input. She says:

The view that input is comprehended speech is mistaken and has arisen from an uncritical examination of the implications of Krashen’s (1985) claims to this effect. …… Comprehending speech is something which happens as a consequence of a successful parse of the speech signal. Before one can successfully parse the L2, one must learn its grammatical properties. Krashen got it backwards!


Things started pretty normally yesterday. I looked at Twitter, and, as usual, there were numerous tweets by Dr. Conti urging readers to revisit his web site. Unable to resist, I thought I’d have a quick look at the post on parallel texts, which, I soon learned, are good because they encourage ‘noticing’. Dr. Conti explains:

According to Schmidt’s (1990) ‘Noticing hypothesis’ the learning of a foreign language grammar structure cannot occur unless the learner ‘notices’ the gap between the way that structure is used in the target language and his/her own L1. In my classroom experience I have witnessed many a time that Eureka moment when a student said, almost thinking aloud, “Oh, I get it! ‘I went’ in French is actually ‘I am gone’”. That would be an occurrence of ‘noticing’.

“Well, but would it?”, I wondered. Surely Schmidt’s noticing hypothesis makes no such claim; surely noticing the gap is more a trigger for noticing than noticing itself, and anyway, surely it’s not a question of noticing the gap between the L1 and the L2 but between features in input and output? Isn’t it?

As I sipped my third green tea of the morning, I hunted for Schmidt’s 1990 paper, but came across Truscott’s critique first. On the first page of his paper, Truscott (1998) says that he’s going to ignore the “noticing the gap” claim altogether:

Proponents of noticing also give much attention to noticing the gap – learners’ awareness of a mismatch between the input and their current interlanguage (see especially Schmidt and Frota, 1986). It is important to avoid confusion between this idea, which necessarily involves awareness, and the more general notion of a comparison between input and interlanguage. Theories of unconscious acquisition naturally hypothesize an unconscious comparison process. Thus, arguments that learners must compare input to their interlanguage grammar (e.g., Ellis, 1994b) are not arguments for noticing.

This is where I think things started to get a bit odd. Was the green tea starting to kick in? Surely Schmidt says that the conscious comparison of input and interlanguage triggers noticing, so surely Truscott is wrong to just kick Schmidt’s rejection of Krashen’s “more general notion” under the carpet? Why was ‘noticing the gap’ nothing to do with ‘noticing’? On the other hand, wasn’t Truscott right to challenge the claim that the only way L2 learners make progress in interlanguage development is through consciously attending to new features of the L2 that are present in the input? I got a panicky feeling that I needed to remind myself of what I thought about all this before some student asked me. More tea.

Right, then. What had I said publicly? I found a post on the blog where I lamented the way Schmidt’s Noticing Hypothesis was being used to support all manner of explicit teaching practices. Whether it’s presenting the present perfect on Tuesday at 8pm, making the explicit teaching of lexical chunks the number one priority in a course, or using a red pen to indicate all the missing third person ‘s’s in a composition, it’s all OK because the Noticing Hypothesis says that bringing things to learners’ attention is a good thing. “Schmidt’s construct has been watered down so much that it now means no more than noticing in the everyday meaning of the word”, I’d said. Hmmm. Things were now getting Alice-in-Wonderland weird: suddenly I found myself unable to say what the special, unwatered-down meaning of ‘noticing’ was, or to remember why it was so highly regarded. I searched for Schmidt, 1990, again, found it and found this:

OK, clear enough, but why would anyone believe such a thing? If input can’t get processed without being noticed, then ALL second language learning is conscious. Surely this is either trivially true by adopting some very weak definition of ‘conscious’ or ‘learning’, or obviously, monstrously false?

Dear oh dear, I was falling down the hole fast. And I’d been so absorbed reading Schmidt that I’d made a triple espresso with freshly ground Robusta coffee beans and drunk it without, OMG, noticing!

Trying to get a grip, I carried on reading. Schmidt says that the term ‘unconscious’ is used in three distinct senses:

to describe learning without ‘intention’,

to describe learning without metalinguistic ‘understanding’,

to describe learning without attention and ‘awareness’.

He goes on to assert that although L2 learning without intention or metalinguistic understanding is clearly possible, there can be no learning without attention, accompanied by the subjective experience of being aware of – that is of ‘noticing’ – aspects of the surface structure of the input. Intake is

that part of the input which the learner notices … whether the learner notices a form in linguistic input because he or she was deliberately attending to form, or purely inadvertently. If noticed, it becomes intake (Schmidt, 1990: 139).

“What?”, I thought, “You can notice things purely inadvertently? Without paying attention? But with focal awareness??” I tried some deep breathing but it didn’t help. Anyway I had to focus. Apart from confusing me about what ‘noticing’ meant, where was Schmidt’s argument? How could he just DEFINE intake as noticed input in the way he did? What had I missed? I went back and tried to read it again, but all I could think was that I’d read it a dozen times already. I drew a little diagram:

There it was: Consciousness as awareness, level 2, noticing. But what did it mean? And what was all the rest about? Schmidt claimed that it was all supposed to sort out the “confusion” he saw in the use of the terms conscious and unconscious, but in fact, all I could see was a terrible muddle.

we notice

we pay attention

we are aware

we are focally aware

we deliberately attend to form

we notice purely inadvertently

our focus of attention is on surface structures in the input

we perceive competing stimuli and may pay attention to them (notice them) if we choose

storage without conscious awareness is impossible

the primary evidence for the claim that noticing is a necessary condition for storage comes from studies in which the focus of attention is experimentally controlled

the basic finding, that memory requires attention and awareness, was established at the very beginning of research within the information processing model.

I needed a drink. Time, I thought, for a good shot of tequila.

I decided to concentrate on the question of what exactly ‘noticing’ refers to, and how we can be sure when it is, and is not, being used by L2 learners. I had three sources: Schmidt and Frota (1986), Schmidt (1990), and Schmidt (2001). Schmidt claims that ‘noticing’ can be operationally defined as “the availability for verbal report”, “subject to various conditions”. He adds that these conditions are discussed at length in the verbal report literature, and cites Ericsson and Simon (1980, 1984), and Faerch and Kasper (1987), but he does not discuss the issue of operationalisation further until 2001, and even there he fails to provide any reliable way of knowing if and when ‘noticing’ is being used.

But in the 2001 article Schmidt says that ‘noticing’ is related to attention and argues that attention as a psychological construct refers to a variety of mechanisms or subsystems (including alertness, orientation, detection within selective attention, facilitation, and inhibition) which control information processing and behaviour when existing skills and routines are inadequate. Hence, learning is “largely, perhaps exclusively a side effect of attended processing”. (Schmidt, 2001: 25). Oh no! We’re back! What’s “attended processing”? Is it ‘noticing’? Is attention the same as awareness? Another shot needed.

I was so dizzy by now that Truscott’s words came floating towards me like a lifeline:

current research and theory on attention, awareness and learning are not clear enough to support any strong claims about relations among the three. … they do not offer any basis for strong claims of the sort embodied in the Noticing Hypothesis (Truscott, 1998, p. 106).

Well, at least that sounded right. Schmidt was tossing all these theoretical terms of attention, awareness and learning around like Humpty Dumpty, wasn’t he? Truscott was surely right to question the assertion that attention can be equated with awareness, and obviously right to say that there is no evidence to support the sweeping claim that “learners must consciously notice the particular details to be learnt”. But why does Truscott say “consciously notice”? ‘Noticing’ can’t be unconscious, can it? Agghhh!

Start again. ‘Noticing’ is part of the first stage of the process of converting input into implicit knowledge. Learners notice language features in the input, absorb them into their short-term memories, and compare them to features produced as output. The claim is that noticing takes place inside short-term memory, and Schmidt explains that it is triggered by different influences, namely instruction, perceptual salience, frequency, skill level, task demands, and comparing.

I decided to take ‘noticing’ to mean ‘noticing’, defined by the OED as “becoming aware of something”. It seemed to me preposterous to suggest that second language acquisition could be explained as a process that starts with input going through a necessary stage in short-term memory where “language features” had to be noticed in order to get any further along the way towards knowledge of, or competence in, the target language. What, ALL language features? Seriously? All language features in the L2 shuffle through short-term memory and if unnoticed have to re-present themselves? Was that a serious suggestion for the acquisition of grammatical competence, for example? I recalled what Gregg had said to me:

Noticing is a perceptual act; you can’t perceive what is not in the senses, so far as I know. Connections, relations, categories, meanings, essences, rules, principles, laws, etc. are not in the senses.

So how on earth can one build up grammatical competence by simply noticing things in the input?

And how had the Noticing Hypothesis come to be accepted as an explanation of how input becomes intake, prior to processing and availability for integration into a learner’s developing interlanguage system? I found R. Ellis’ diagram, which is reproduced all over the place:

It appears to suggest that the 3 constructs of ‘noticing’, comparing and integrating are what turn input into output and explain IL development. Can it really be making such a claim? Where’s the noticing supposed to take place according to the figure? And what is short/medium-term memory? Anyway, as Cross (2002) points out, Ellis (1994, 1997), Lewis (1993), Skehan (1998), Gass (1988), Batstone (1994), Lynch (2001), Sharwood-Smith (1981), Rutherford (1987) and McLaughlin (1987) all agree that noticing a feature in the input is an essential first step in language processing. How depressing.

By now I was most of the way through the bottle of tequila, and I became reckless: I decided to confront the main man, M. H. Long. I opened the magnum opus, Long (2015) SLA and TBLT, to page 55:

With Nick Ellis and others, what I claim is that explicit learning (not necessarily as a result of explicit instruction) involves a new form or form–meaning connection being held in short-term memory long enough for it to be processed, rehearsed, and an initial representation stored in long-term memory, thereafter altering the operation of the way additional exemplars of the item in the input are handled by the default implicit learning process. It is analogous to setting a radio dial to a new frequency. The listener has to pay close attention to the initial crackling reception. Once the radio is tuned to the new frequency, he or she can sit back, relax, and listen to the broadcast with minimal effort. Ellis identifies what he calls the general principle of explicit learning in SLA: “Changing the cues that learners focus on in their language processing changes what their implicit learning processes tune” (Ellis 2005, p. 327). The prognosis improves for both simple and complex grammatical features, including fragile features, and for acquisition in general, if adult learners’ attention is drawn to problems, so that they are noticed (Schmidt 1990 and elsewhere). This is the first of four or five main stages in the acquisition process (Chaudron 1985; Gass 1997), in which what is noticed is held and processed in short-term, or working, memory long enough for it to be compared with what is in storage in long-term memory, and, as a result, a sub-set of input becomes intake.

A couple of pages on:

Noticing in Schmidt’s sense, where the targets are the subject of focal attention, facilitates the acquisition of new items, especially non-salient ones, and as Schmidt maintains, and as demonstrated by 20 years of studies, from Schmidt and Frota (1986) to Mackey (2006), “more noticing leads to more learning” (Schmidt 1994, p. 18).

I’d read this before, lots of times, nodding sagely at the wisdom of the maestro, but suddenly I doubted it all. Was Long really signing up to the explanation that Schmidt offered of how input gets processed? Well, it seemed that he was, and as the tequila worked my rational doubts into sentimental despair, I flapped at the pages, turning them back and forth, trying to find the bits that I’d liked so much before. He CAN’T be saying that ALL new ‘items’ MUST be ‘noticed’, can he?

Ah! What was this?

Crucially, however, as claimed by Gass (1997), and as embodied in the tallying hypothesis (N.C. Ellis 2002a,b), once a new form or structure has been noticed and a first representation of it established in long-term memory, Gass’ lower-level automatic apperception, and Tomlin and Villa’s detection, can take over, with incidental and implicit learning as the default process.

Black clouds threatened. I’d forgotten about the lower-level automatic apperception stuff. What, for pity’s sake, was THAT all about? I found some notes I’d made years earlier. “Gass claims that apperception is “the process of understanding by which newly observed qualities of an object are related to past experiences”. It “serves as selective cueing for the very first step of converting input into intake”. It “relates to the potentiality of comprehension of input, but does not guarantee that it will result in intake”. Beats me! It “relates to the potentiality of comprehension of input” indeed! Must ask Kevin.” In fact, much later I did ask Kevin, who told me that he’d actually been there at the plenary of whatever conference it was when Susan Gass introduced the lucky listeners to apperception. Now what exactly had he said about it?

Thankfully, the tequila rescued me from musing on apperception’s mysterious properties, and allowed me to grasp what I was sure was the main message. Hallelujah! The Empire strikes back: incidental and implicit learning as the default process return! Phew! And there was more good news:

whether detection without prior noticing is sufficient for adult learning of new L2 items is still unclear – perhaps one of the single most critically important issues, for both SLA theory and LT, awaiting resolution in the field.

Yes! Now I remembered! I’d read the bit about noticing maybe not being necessary at all somewhere else. I found this in a document I’d done myself, but most of it was directly from Long, 2015:

In this view, once a new form or structure has been noticed and a first representation of it established in long-term memory, lower-level “apperception” (Gass) or Tomlin and Villa’s “detection” take over, with incidental and implicit learning as the default process. So the first representation in long-term memory primes the learner to unconsciously perceive subsequent instances in the input. The big question is of course whether noticing is necessary for any representation to be established in long-term memory: is consciously attending to and detecting a form or form-meaning connection in the input the necessary first stage in the process of acquiring some features and form–meaning connections? Long calls this “perhaps one of the single most critically important issues, for both SLA theory, and language teaching, awaiting resolution in the field”.

In other words, rather than seeing ‘noticing’ as the necessary and sufficient condition of SLA, I could now interpret Long as saying that incidental and implicit learning are still the main ways adults learn an L2. Furthermore, while noticing might facilitate the acquisition of “new items”, it’s still an open question whether it’s a necessary condition for acquisition.

“This is surely a gap worth noticing” was the last thing I remember saying to myself.


This is my final blast in 2017 against those who use their power to defend the lamentable state of English Language Teaching against those who want change.

Who are they?

Those in charge of The British Council, TESOL, IATEFL, Cambridge Assessment English, IELTS, Pearson, Kaplan, International House, OUP, CUP, McGraw Hill, National Geographic, New Oriental, and many others. Between them, they control the coursebook-dominated ELT industry, which has a turnover of $200,000,000,000 ($200 billion).

Their visible face comprises the stars of the ELT world. Many of them are multi-millionaires (Nunan, Richards, Rogers, Murphy, Mr. and Mrs. Soars, Mr. and Mrs. Haycraft, for example) and all of them earn more than $200,000 a year.

What do they do?

They promote the use of coursebooks as the driver of ELT in classrooms, in in-company courses, and in private classes all over the world. They also control assessment of English proficiency and of teacher training.

They produce the coursebooks and the tests that determine a person’s level of proficiency in English, and they supervise the admin and marking of the tests. They also produce the teacher training courses and the tests that determine a teacher’s abilities, and they supervise the admin and marking of the tests.

Their “visible face” is seen at the ELT conferences and in the social media. They are the ELT stars, the shockingly uninformed “experts” who write coursebooks and “How to teach” manuals, who give plenaries and workshops around the world, promoting their own products, peddling the same orthodox line.

In brief, they decide what is taught and by whom.

What’s wrong with what they do?

The fusion of their roles as designers, producers and assessors points to something rotten in the field of education.

They commercialise ELT. They turn education into a product. And as in all commercial endeavours, they get rich, the teachers stay poor, and those who “receive” the product don’t learn as well as they could. By controlling the assessment of English proficiency, of ELT materials and of teacher training, they ensure that ELT is a profit-driven industry where good practice and results are measured by profit margins. Most people who take English-as-an-L2 courses don’t learn as much as they want, or as much as they could.

What should we do?

First, draw on the findings of instructed SLA research and on the insights of people like John Fanselow and other gifted craftspeople. Somehow, we have to bring together our understanding of the learner’s interlanguage development and our collective wisdom of ELT as a craft. However this pans out, it will be a local solution. The biggest fight we have is to make our teaching relevant to our particular context and to slay the control of the global coursebook.

Second, break the hold of those who currently control ELT. Speak out. Criticise the British Council, IH, the Cambridge Examiners, IATEFL, etc. Organise locally by forming cooperatives, in-house workshops, small-scale conferences, etc.

Happy 2018 to all.


In 2017 the British Council’s charity status and its branding as a UK government agency helped to maintain tax-free income at around the £1 billion mark, which is what it made in 2016 according to its own report. Particularly lucrative among its commercial operations were those involving the International English Language Testing System (IELTS), its English teaching centres, and its educational marketing and education-related contracts. These activities have led to accusations that its one-third share in IELTS biases its testing and certification policies; that it competes with an unfair advantage to train teachers for overseas governments; and that its not-for-profit status means that the income it gets from English teaching is exempt from corporation tax in many countries, unlike its competitors. Concerns about the way that the British Council conducts itself as a charity have been compounded by the activities of the British Council Education UK website, which offers advice to international students looking to study in the UK. While British Council services are paraded all over the website, other schools and colleges offering similar services have to pay to be on the lists provided, and the ones with the biggest marketing budgets get the best positioning. There’s also the ongoing struggle of ordinary teachers working for the British Council to be included in the generous pension plan which is presently only provided to the upper echelons of British Council staff.

The British Council is one of the most important pillars in the entire global ELT establishment, and, naturally, it gives active support and encouragement to all those who work to keep things running nice and smoothly just as they are, while frowning on any attempt to rock the boat. A visit to the huge British Council Teaching English web site is a truly depressing experience, like going for tea at some minor country pile in Dorset where the half-crazed, inbred family members, slumped in their lumpy threadbare sofas, talk about how hard it is to get a decent pot of tea these days. There’s no spark, no wit, no edge, no real sign of intelligent or joyful life anywhere to be found. The BC web site is clogged with unchallenging, boring, middle-of-the-road pap about the importance of motivation, ordered use of the whiteboard, conflict avoidance, dynamic classroom seating arrangements and positive feedback. There is quite simply no trace of serious critical evaluation of any of the issues facing us today. You don’t believe me? Take a visit there right now. This is what will greet you:

The BC proudly trumpets that

Armenian English language teachers are given the opportunity to attend a series of talks delivered by a distinguished ELT expert and develop professionally through exposure to the latest thinking and understanding in the field.

What the unlucky Armenian teachers actually got was five hours (FIVE HOURS!) of exposure to Harmer, whose performance suggests that he can’t articulate his thoughts about anything for more than twenty seconds. I find it hard to believe that whoever is responsible for this website actually watched the videos of Harmer stumbling around the room like someone who’s eaten the wrong mushrooms for lunch before making them available to the public, and I find it even harder to believe that when Harmer’s talks mercifully finished, anyone in the audience felt that they were now up to date with “the latest thinking and understanding in the field”.

Still, that’s just the front page. Surely if you delve into the rich store of teacher materials you’ll find some stimulating, challenging, innovative stuff to make you think? Well, no, you won’t. You won’t find any reports and critical evaluations of research findings of the sort offered in the ELT Research Bytes blog, or anything that isn’t written by tried and trusted, controversy-averse, well-heeled teachers and teacher trainers dedicated to defending the status quo, keeping the good ship ELT on a steady course, and not rocking the boat.

To the extent that 2017 was a good year for the British Council, it was a bad year for innovative, democratic, progressive ELT.


In Part 1 I suggested that those who write books and give teacher training courses in ELT have a duty to act as mediators between researchers and teachers, and that most of them make a mess of the job. This opinion was supported by the mini study Thornbury carried out and then reported on at the 2017 IATEFL conference. The study looked at four top-selling “How to Teach English” books which are recommended reading for hundreds of thousands of people studying to get a qualification in ELT, and it found that all four books are based more on the authors’ biases, intuitions, feelings, and what somebody else told them, than on any serious attempt to critically assess what research findings tell us about how people learn languages. In a post on these mediators I suggested that Thornbury took a disappointingly uncritical look at the data that his study had produced.

Staying on the Fence

Unlike the four writers he reviewed, Thornbury himself has discussed research findings that challenge ELT orthodoxy more than once, so if he thinks it’s important for him to keep in touch with research and to use research findings to inform his views on methodology, why doesn’t he expect the same of others? And since he’s been so outspoken in his criticism of coursebooks, why didn’t he mention this when discussing his findings? The answer seems to be that Thornbury has developed the unique knack of not just sitting on the fence, but actually living perfectly perched on it. He’s become so adroit at deftly ducking controversy, so practised at never getting drawn on the political issues raised by the matters he discusses, that he makes the UK Liberal Democrats look radical. He knows perfectly well that the bosses of the British Council, the publishing houses, the exam bodies, the training outfits and so on will simply not allow any serious attacks on current ELT practice to be made – witness his own publishers’ making it clear to him that they’re “not interested” in his McNuggets views or in what he really thinks of the CELTA course. He knows that the ELT educational system is set up in such a way that teachers are unlikely to hear about “inconvenient” research findings which challenge coursebook-driven ELT, or which show that the Pearson Test of English is built on sand, or which describe the Common European Framework of Reference as “a prime example of the way in which political and social agendas can impact on language testing, and how language testing can be made to serve those agendas” (Fulcher, 2005). I suppose Thornbury thinks, like many reformers, that he can be more of a force for change by staying inside the tent than by pissing on it from outside. I think that this argument is demonstrably wrong, but never mind; even if that is Thornbury’s view, it doesn’t explain why he doesn’t adopt a more critical stance.
In the end, maybe it’s just that he’s a really nice guy and he doesn’t like upsetting people. Well, I can certainly relate to that. 🙂

Misrepresenting Chomsky

Still, there’s another bone I have to pick with the loveable Thornbury, and that is his continued misrepresentation of Chomsky’s work. If you look at the “daft things the experts said” at the end of my last post, it was Thornbury who said “The NS-NNS distinction is absolutely central to the Chomskyan project”. It isn’t, of course, and, pace Thornbury, the onus isn’t on Chomsky to perform the logically impossible task of proving that some aspects of the knowledge of language that children demonstrate couldn’t have been acquired from input, and it isn’t the case that there’s no empirical evidence to support Chomsky’s theory of UG. In my post Treatise on Thornbury’s view of SLA I pointed to some mistakes in Thornbury’s account of what Chomsky says about language and language learning, and also the faults in his arguments about UG in general and the poverty of the stimulus argument in particular. It’s important to stress that none of the emergentists who Thornbury now seems to think offer the best explanation of SLA, least of all Larsen-Freeman, has offered an explanation for what young children know about language. As Eubank and Gregg (2002) argue, to suggest that language learning is explained by a general theory of associative learning is to leave unexplained

the fact that children know which form-function pairings are possible in human language grammars and which are not, regardless of exposure.

The countless cases of instantaneous learning.

The knowledge children have in the absence of exposure (i.e., a frequency of zero) including knowledge of what is not possible.

Furthermore, to quote Eubank and Gregg (2002, p. 237)

Ellis aptly points to infants’ ability to do statistical analyses of syllable frequency (Saffran et al., 1996); but of course those infants haven’t learned that ability. What needs to be shown is how infants uniformly manage this task: why they focus on syllable frequency (instead of some other information available in exposure), and how they know what a syllable is in the first place, given crosslinguistic variation. Much the same is true for other areas of linguistic import, e.g. the demonstration by Marcus et al. (1999) that infants can infer rules. And of course work by Crain, Gordon, and others (Crain, 1991; Gordon, 1985) shows early grammatical knowledge, in cases where input frequency could not possibly be appealed to. Landau & Gleitman (1985) even document lexical acquisition in spite of frequent input, where a blind child acquired (her own interpretation of) verbs like “look” despite frequent training under a different interpretation.
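
An aside for the technically minded: the “statistical analyses of syllable frequency” that Ellis credits infants with (following Saffran et al., 1996) amount to tracking transitional probabilities between adjacent syllables. A minimal sketch in Python – the syllable stream and the “words” bidaku and padoti are invented here purely for illustration:

```python
from collections import Counter

def transitional_probabilities(syllables):
    """For each adjacent pair (A, B): P(B|A) = count(A followed by B) / count(A)."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

# A made-up stream built from two "words", bidaku and padoti, in varying order.
stream = "bi da ku pa do ti bi da ku bi da ku pa do ti pa do ti bi da ku".split()
tps = transitional_probabilities(stream)

# Within-word transitions are perfectly predictable...
print(tps[("bi", "da")])   # 1.0
print(tps[("da", "ku")])   # 1.0
# ...while transitions across a word boundary are not.
print(tps[("ku", "pa")])   # < 1.0
```

Within-“word” transitions come out at 1.0 while cross-boundary transitions come out lower – the dip in predictability that Saffran et al. proposed infants exploit to segment the speech stream. Eubank and Gregg’s point, of course, is that having this counting machinery is precisely what the empiricist account leaves unexplained.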

In a comment on the post about Thornbury’s view of SLA, Gregg wrote this:

Hi Geoff,

I think I’d revise one bit of your discussion: Where you say

Thornbury’s unqualified assertion that language learning can be explained as the detection and memorisation of frequently occurring sequences in the sensory data we are exposed to is probably wrong and certainly not the whole story.

I’d change ‘probably’ to ‘definitely’. It’s striking, and depressing, to see how purveyors of ’emergentism’ continue to ignore the mountain of research showing the complexity of language, and the other mountain of research showing the kinds of linguistic (and other) knowledge young children show, knowledge that no one has been able to account for on an empiricist learning theory, and how they continue to blithely assert that it’s all done by generalization across input samples, without showing how. I’m again reminded of the story … of how Rockefeller became rich. One day as a young lad he found himself with a penny in his pocket. He walked down to the farmer’s market and bought an apple, walked to Wall Street and sold it for 2 cents. Then back to the market to buy 2 apples, back to Wall Street, … At the end of a week he’d bought an old wheelbarrow, and after a month he’d earned enough to put down the first month’s rent on a small fruit shop. But then his uncle died and he inherited everything.

Sprawling in the Primeval Slime

While Thornbury’s remarks about emergentism are slightly less preposterous this year than they were in 2016 (he’s moved on from Larsen-Freeman and Cameron’s (2008) nonsense about complex systems to slightly better-argued stuff by the likes of Nick Ellis and Tomasello), he continues to incite a younger generation, who, after a quick perusal of Sampson, Everett, Wolfe and other reliable sources, share their ignorance with others in the comments sections of the A to Z of ELT blog. Thornbury being Thornbury, he doesn’t tell the young ’uns that they’re talking baloney; he actually encourages them. In one of his posts this November, Thornbury cheerfully quips that, given the choice between Chomsky’s self-proclaimed triumph of “human reason” on the one hand, and “beastly grovelling in the primeval slime” on the other, he’ll choose the slime every time. The trouble is, he invites the younger generation to join him in the beastly bog; he encourages them to think that their ignorance of Chomsky’s work should be worn like a badge of cool, and he confidently assures them that SLA is best explained as the complex result of a simple process of “reinforcing contingencies set up by the verbal community”. You couldn’t make it up, so it must be true. Well, for the time being we’ll have to leave them to it, happily frolicking in the slime, unconsciously strengthening the associations between who knows what cues, and trust that before they get too much older, the brighter ones will get tired of it, climb out, and leave their genial hero alone with his dirty bucket and spade, there to finally appreciate the power and utility of non-communicative uses of language, or ‘thinking’ as Chomsky refers to them.


Looking back on the posts during 2017, I notice that I started the year (ELT: Art and Rationality) by happily conceding that ELT is

“a creative, imaginative endeavour where a teacher’s ability to bring language to life; to contextualise it; to create situations where students engage with it; to get students to learn some key parts of it by rote or at least through frequent re-cycling; to create group dynamics and nurture group cohesion; to empathise with the doubts and fears of students, to manage conflicting needs, and also to design, organise and carry out a coherent plan of learning, are all far more important than a critical appreciation of theories of SLA and the research they’re based on”.

Chomsky (1995, cited in Gregg, 2006, p. 403) made a similar point when he remarked that we might well learn more about how people think and feel and act by studying history or reading novels than from empirical research, which, “outside of narrow domains has proven shallow or hopeless”.

Teaching English is still, despite all attempts to commodify education, an “arts and crafts” activity, a job where experience counts a great deal, and where teachers who combine all manner of skills and knowledge and character traits, and who find themselves in the right place at the right time, can work wonders, making the difference between FonF and FonFs pale into insignificance. And yet, as I said in that post, things have changed from the time when Earl Stevick, John Fanselow, Alan Maley and other master craftsmen (I’m afraid they were mostly men) shared their insights with teachers seeking awareness and inspiration. Since the widespread adoption of coursebooks, our freedom as teachers to express our individuality, inventiveness and creativity has shrunk alarmingly, while at the same time, research into the English language and into how people learn languages has greatly expanded. Despite these two decisive changes, we perversely persist in using syllabuses, methodological principles and pedagogic procedures that rob us of the freedom to pursue our craft, and that, at the same time, fly in the face of robust research findings.

My main argument throughout the year has concerned that enormous elephant in the room: ELT coursebooks. Pace the arguments of those who try to defend their use, coursebooks are not just “a symptom”; it’s not just a question of the way you use them, or that they put too much emphasis on grammar teaching, or that they’re tools of imperialism; or even that they’re stultifyingly boring. No, it’s that they have a huge, generally detrimental effect on the practice of ELT, including syllabus design, methodology, and testing. All the discussion of doing things better, of the role of extensive reading, of what work to do in and outside classrooms, of how to use this or that bit of kit, of whether to teach vocabulary this way or that, of the best way to recycle work, of the efficacy of pronunciation teaching, of when to use the L1, of how to respond to written and spoken errors, and on and on, all take place against the backdrop of using a coursebook which imposes a restrictive and deforming framework on everything we do. We know that synthetic syllabuses, a PPP methodology and an incremental step by step view of progress are based on false assumptions about how people learn an L2, and yet, using the excuse of convenience and bowing to commercial pressure, we plod on regardless. To make matters worse, like politicians refusing to take climate change seriously, our stubborn refusal to face facts blights the future. The coursebook imposes its mistaken methodological principles and pedagogic procedures on teacher training, particularly the CELTA and Trinity College training courses, where learning to be a teacher of English to speakers of other languages is intricately bound up with learning how to use a coursebook.

In a number of posts this year, I’ve replied to those who have defended coursebook-driven ELT (see the Coursebook section of the menu on the right) and, in my opinion, no serious answers to the case against coursebooks have been offered. Penny Ur’s airy dismissal of any criticism of them; her recent review of SLA research affecting teaching practice (see the Gagged post), where she made no mention of interlanguage research and ignored questions about the implications of interlanguage research for coursebook-driven ELT; and her continued reliance on the argument that the convenience of coursebooks outweighs all other considerations strike me as typical of too many of today’s so-called ELT experts. Ur’s replies to Thornbury’s questions about the importance of research (“it’s certainly possible to write helpful and valid professional guidance for teachers with no research references whatsoever”), her misrepresentation of the research on TBLT (“there’s no evidence that it works”), and her extensive use of the well-known fallacy that “inconclusive” evidence in support of a hypothesis is reason to believe it’s false, are hallmarks of the unreliable expert.

I suggest that we have a right to expect that those whose job it is to oversee the training and on-going professional development of teachers should take robust research findings about how people learn an L2, particularly those regarding interlanguage development, more seriously and make discussion of them part of their books and training courses. Why does Ur’s book A Course in Language Teaching so confidently promote the coursebook and so completely ignore 40 years of interlanguage research? Why does Harmer’s magnum opus The Practice of ELT (see here for a review) devote more pages to a discussion of classroom seating arrangements than to a discussion of SLA research? Why does nobody in the ELT establishment (except Scott Thornbury) speak out against all the harm being done by the domination of coursebooks today?

The most obvious answer is “Because ELT is a business” and coursebooks are the perfect way to package what could otherwise be a rather messy “product”. But I can’t help feeling that a certain insidious complacency is also to blame, especially when I see Ur, Harmer, Dellar and the rest of them jetting around the planet giving teachers everywhere expert advice on how to teach, without ever initiating a serious discussion of the mounting evidence from SLA research which indicates that current ELT methodology is fundamentally mistaken. Dellar’s*** tweet in September from some exotic corner of the globe illustrates the ease with which doubts about current practice can be shrugged off by those who feel themselves to be really in the know: “You quickly realise how little the heated debates of the euro-centric #EFL blogosphere have to do with most contexts …”, he wrote. Others were quick to “Like”.

*** My sincere apologies to Jim Scrivener, to whom I wrongly attributed the tweet when I published this post.

An examination of conference talks given by the leading lights in ELT in 2017 reveals a general lack of awareness and critical acumen that many of us find shocking; and almost as shocking is that these conference talks go almost entirely unchallenged. The bombast and chutzpah of so many in the ELT establishment gels with the gullibility and docility of their audiences to produce a complacent culture lacking any healthy critical edge. This year, every time the twenty or thirty plenary speakers who presently dominate the global ELT conference circuit finished their presentations, they were met with polite applause. Until this is replaced with a cacophony of affronted catcalls, change won’t come; or at least it won’t come from rank and file action, though it might well come soon enough from technological change which makes both coursebooks and most teachers redundant.

In Part 2 I’ll look at some of the daft things our experts said in 2017, including these:

If you encounter the pattern They man-doubled across the place, you know that man-doubled is some kind of way of moving.

In academia the established use of ‘native speaker’ as a sociolinguistic category comes from particular paradigmatic discourses of science and is not fixed beyond critical scrutiny.

English sometimes seems as if it is everywhere, but in reality, of course, it is not.

The NS-NNS distinction is absolutely central to the Chomskyan project.

English migrated to other countries … such as the USA, Canada, New Zealand, … and many other corners of the globe. And it didn’t stop there. It has morphed and spread to other countries too.

The way I see it Scott is that ‘interlanguage’ is one of the uglier of many unnecessary neologisms invented by academics, presumably to give them a sense that they are forging a profession: there are plenty of plain English alternatives.

Have you read Evans’ The language myth. Why language is not an instinct ? Very good book. Quite an eye-opener.

After the rain came falling,

And the truth was washed away,

I called my brother on the telephone,

Just to see what he would say.

The last one is the first stanza of a song, a lament one could say, in response to Brexit. The song inspired the best comment of 2017 from John Clave:

“I experienced such vergüenza ajena I curled up in a ball and rolled under my bed”.

Reference

Gregg, K.R. (2006) Taking a social turn for the worse: The language socialization paradigm for second language acquisition. Second Language Research, 22(4), 413–442.


A few weeks ago, someone in the ELT world tweeted that Salma Patel’s blog, which deals with management of the UK National Health Service, had a post that gave a good, brief summary of research paradigms. I went to the blog and found the post:

Published in 2015, it’s had 168,622 views so far, and there are dozens of comments at the end thanking Patel for his “clear”, “brilliant”, “superb”, “excellent”, “amazing”, “extremely useful” explanations.

The explanation starts with a summary of the main components of a research paradigm and there is then a video which explains the text. Patel begins by saying that there are two main approaches to research:

Filling knowledge gap: positivist

Problem-solving: interpretive.

He explains:

In the first you read a lot of books …..and you find a gap in the research. ……It is objective. What is the meaning of objective? Reality is external to us – I don’t know the reality. So, I propose a hypothesis. What is the meaning of a hypothesis? There is a relationship between X and Y, or not. That’s it.

In the second, you identify a problem, you ask “Why?”. There is no single reality so we have to look at reality from different perspectives, understand different characters, different people, .. So there’s no reality here. That’s why we have to go ourselves into the organisation and talk to people.

So there you have it: scientific, quantitative research is most suitable for research projects which seek to fill a knowledge gap, while qualitative research (which assumes that there’s no such thing as objective reality) is the best way to go about problem solving.

Scientific research is, of course, nothing like Patel’s description of it. Nor is positivism what Patel says it is, and nor does his chart present a reliable or useful guide to research projects.

The aim of scientific research is, precisely, to solve problems, or, to put it another way, to explain phenomena. The collection of empirical data, the organisation of taxonomies, etc. are carried out not for their own sakes but in the service of an explanatory theory. Hypotheses are the beginning of attempts to solve problems and should lead to theories that explain a certain group of phenomena. The aim is to unify descriptions and low-level theories into a general causal theory.

SLA research carried out under the umbrella of cognitive science adopts these aims and methods, and although far from achieving any general theory, it still has some claim to be part of what Kuhn calls a mature science tradition. In contrast, the sort of work Patel encourages falls, at best, into Kuhn’s “immature science” bag, in the ‘pre-paradigm’ period. It’s clear from the literature that some sociologists and sociolinguists want no part of the scientific enterprise, but Patel’s biased and distorted description of different approaches to research fails to properly explain either the realist or the relativist case. In order to provide newcomers with a clear, balanced, well informed introduction to research methodology, I think Patel needs a better grasp than he shows of the philosophy of science, the history of western philosophy, and how evidence-based research is conceived and conducted.

In response to information given to me by Steve Brown, Carol Goodey and others earlier this year, I wrote a post on Research Paradigms where I commented on the way that various influential sociology departments have developed their own particular post-Kuhnian narrative concerning how research is carried out. I said at the time that I was really surprised to learn how widely these daft notions of ‘positivism’ and ‘research paradigms’ had spread, but I find the fact that Patel’s post has reached over 160,000 grateful postgraduate students quite shocking. Did nobody catch so much as a whiff of baloney? Nobody took the trouble to, ahem, deconstruct the text?

A more respectable version of Patel’s presentation can be found in Scotland (2012), which is cited, but it’s hardly any better. In the end, we can trace most of this “revised”, post-Kuhnian treatment of paradigms back to Lincoln and Guba (1985) who proposed a “Constructivist paradigm” as a replacement for “the conventional, scientific, or positivist paradigm of enquiry”. This view is idealist (“what is real is a construction in the minds of individuals”), pluralist and relativist:

There are multiple, often conflicting, constructions and all (at least potentially) are meaningful. The question of which or whether constructions are true is sociohistorically relative. (Lincoln and Guba, 1985: 85).

Lincoln and Guba assume that the observer can’t and shouldn’t be neatly disentangled from the observed in the activity of inquiring into constructions. Constructions in turn are resident in the minds of individuals:

They do not exist outside of the persons who created and hold them; they are not part of some “objective” world that exists apart from their constructors (Lincoln and Guba, 1985: 143).

Thus constructivism is based on the principle of interaction.

The results of an enquiry are always shaped by the interaction of inquirer and inquired into which renders the distinction between ontology and epistemology obsolete: what can be known and the individual who comes to know it are fused into a coherent whole (Guba: 1990: 19).

Note that Patel has either overlooked or ignored the fact that, according to the leading lights in his “constructivist paradigm”, the distinction between ontology and epistemology is obsolete. In any case, if you want to find the roots of the full-blown idealist, relativist, pluralist, your-experience-of-me-experiencing-you-experiencing-the-teapot, topsy-turvy, now-you-see-it-now-you-don’t world of post-modern sociology, you need look no further than Lincoln and Guba, 1985. And if you want a demonstration of why it’s so much baloney, see Gross & Levitt, 1994; and Sokal & Bricmont, 1998.

Not far behind in terms of culpability for all this mess comes Crotty (1998), whose “seminal work” on research in the social sciences is required reading in thousands of undergraduate and postgraduate courses all over the world. Crotty’s work quite wrongly states that positivism started with the work of Francis Bacon, completely misrepresents the work of the positivists themselves, and misrepresents the work of Popper, Kuhn and Feyerabend too. At one point, Crotty says that the real target of Feyerabend’s criticism was “the positivists”, despite the fact that before Feyerabend’s Against Method was published, positivists – scientists and philosophers alike – had thankfully disappeared. I challenge Crotty to find a scientific department in any university anywhere on the planet run by self-proclaimed positivists.

C.P. Snow, in his 1959 lecture, first described the ‘two cultures’ of science and the humanities (see Snow, 1993), and the gap has widened considerably since then. Eleven years ago, Gregg (2006) noted that in the field of SLA, a look at the ‘applied linguistics’ literature

turns up doubts about the value of controlling for variables (Block, 1996), reduction of empirical claims to metaphors (Schumann, 1983; Lantolf, 1996), mockery of empirical claims in SLA as ‘physics envy’ and denials of the possibility of achieving objective knowledge (Lantolf, 1996), even wholesale rejection of the values and methods of empirical research (Johnson, 2004). Although the standpoints are various, one common thread unites these critiques: a fundamental misunderstanding of what science, and in particular cognitive science, is about (see, e.g. Gregg et al., 1997; Gregg, 2000; 2002).

Today, blogs and twitter exchanges abound with references to white coats, laboratory conditions and the other trappings of so-called positivists (including Chomsky of course) who, it’s claimed, fail to make any connections with the real world, even though, ironically enough, they’re the only ones who believe in such a thing. In my own case, in exchanges with Marek Kiczkowiak of TEFL Advocates about the existence (or not) of native speakers, I refer to the “sociolinguistic twaddle that obfuscates a simple psychological reality”, while he refers to “the fantastic beast the NS has become in theoretical linguistics and SLA labs”. I’d say that in this case it’s Kiczkowiak who shows a typically deprecating and ignorant attitude towards SLA cognitive research, while I limit myself to the claim that regardless of how difficult it might be for sociolinguists to decide who belongs to what social group, there are such things as native speakers, and it is the case (a case worth researching) that most people who learn an L2 fall short of native competence. But then, I would say that, wouldn’t I.

Patel’s post is more evidence of the need to remain critical in our reading and thinking about our profession. There are so many examples of low standards of scholarship, rational criticism and intellectual honesty in the work of those who do research and teacher training that we need to be constantly on our guard. Down with baloney!

References

Crotty, M. (1998) The Foundations of Social Research: Meaning and Perspective in the Research Process. London: Routledge.

Gregg, K.R. (2006) Taking a social turn for the worse: The language socialization paradigm for second language acquisition. Second Language Research, 22(4), 413–442.

Gross, P.R. and Levitt, N. (1994) Higher Superstition: The Academic Left and its Quarrels with Science. Baltimore: Johns Hopkins University Press.

Guba, E.G. (1990) The Paradigm Dialog. Newbury Park, CA: Sage.

Lincoln, Y.S. and Guba, E.G. (1985) Naturalistic Inquiry. Beverly Hills, CA: Sage.

Scotland, J. (2012) Exploring the philosophical underpinnings of research: Relating ontology and epistemology to the methodology and methods of the scientific, interpretive, and critical research paradigms. English Language Teaching, 5(9), 9–16.

Snow, C.P. (1993) The Two Cultures. Cambridge: Cambridge University Press.

Sokal, A. and Bricmont, J. (1998) Intellectual Impostures. London: Profile Books.