
It’s complex, not complicated.

What is complexity theory, and how does it accommodate current beliefs about how languages are acquired and new approaches to teaching such as task-based learning and dogme?
Until recently, theories about language acquisition were dominated by the cognitivists, such as Krashen, Long and Chomsky. The basic premise was that input would be processed and hypotheses made, which would then result in output where those hypotheses could be tested. This process was said to be innate and relatively fixed. Coupled with this was the idea that there existed a natural order of acquisition: whereas ‘ed’ endings would be learnt relatively early, the third person ‘s’ would always be acquired later.

The implications for teaching involved providing students with lots of ‘comprehensible’ input (based on the natural order of acquisition), helping learners process patterns in the language through ‘noticing’ and ‘consciousness raising’, and providing them with plenty of opportunities for practice so that they could receive feedback on their hypotheses. Presentation-practice models of teaching seemed able to accommodate this theory of language acquisition, and publishers could produce grammar-driven coursebooks based on the supposed natural order of acquisition. It also suited teachers, of course. They were provided with a relatively safe set of approaches and techniques where the language remained firmly under their control and where unpredictability could be kept to a minimum.
However, although there is much to be said for cognitivism, there was always a feeling that learning a language was a much more complex affair. It is clearly not so linear: providing sufficient input does not always result in the desired output. Nor is learning a language as incremental as it has been made out to be. It is not a matter of putting building blocks one on top of the other as if you were constructing a wall.
As Scott Thornbury says:
(…) the learner’s grammar restructures itself as it responds to incoming data. There seems to be periods of little change alternating with periods of a great deal of flux and variability, and even some backsliding. In this way, process grammars are not unlike other complex systems which fluctuate between chaotic states and states of relative stability ​(Thornbury, p.49)
As such, there has recently been a paradigm shift in our attempts to explain language acquisition. This is not to say that we have replaced the old theory with a brand new one, but we are beginning to view and apply ‘old’ concepts and constructs differently. The new paradigm is that of a ‘complex emergent system‘, and it is based on complexity theory.
Complexity theory is an attempt to explain complex phenomena, something which more traditional theories were not, or were only partly, able to do. It recognizes that complex behaviour emerges from a few simple rules, and that all complex systems are networks of many interdependent parts which interact according to those rules.
And what could be more complex than learning a language? The language system, like all complex systems, is much more than the sum of its parts. Lower sub-systems, such as morphology, dynamically interact in parallel with other lower sub-systems, such as phonemes, which in turn interact with higher-level systems like syntax and prosody. The system is in constant flux, and a change in any one of these sub-systems can trigger alterations in another. It is through these interactions that a language system emerges, not in an ad hoc or chaotic manner, but in a relatively self-organising way. And what emerges is neither linear nor incremental; it is often unpredictable and very surprising.
The implications for teaching are slightly scary, but also liberating. It means that we should relinquish our control of the language. Yes, we should still provide learners with appropriate input, but we cannot dictate how each individual learner will approach that input, nor the type of output that will result from it. Gone will be the days when we can treat the whole class as a homogeneous group, each learning in lockstep, with all the learners producing the same language at the same time. Teachers will have to make the transition from planning ‘safe’ presentation-practice lessons towards ‘riskier’ task-based and dogme-like lessons. Teachers and students alike will have to step outside their comfort zones and be prepared to deal with unplanned events, such as unpredictable language arising, and with fuzzy logic. Teachers will have to think much more on their feet and be much more flexible in how they use materials, how they deal with individual learners’ needs and how they manage time. It will also have an effect on how we assess students. Rather than testing students on what we have decided to teach them during the semester (the present perfect, quantifiers, travel lexis, etc.) through achievement tests, maybe we will have to start evaluating learners on a more global level through proficiency tests.

Whatever the merits of complex emergent systems, we need to constantly appraise the beliefs about language learning which underpin our teaching. Just as cognitivism is being reappraised, maybe it is now time that we begin to reappraise our own teaching.

Thornbury, S. Uncovering Grammar. London: Macmillan Heinemann, 2011, p. 49.

 

Dominic Walters

I am CELTA and DELTA qualified and have an MA in Educational Psychology. I have been teaching English since 1991, working in Brazil, the Republic of Ireland, Spain, Portugal, Egypt and the UK. I am a DELTA, ICELT, CELTA and FTBE assessor and tutor, as well as a CELTA online course tutor. I am also an examiner for the Cambridge, IELTS and Trinity exams.
