There is a lot of talk about learner needs, needs analysis and learner-centred lesson and course planning.  But do learners really know what they need?  Or do they just tell us what they want?

The difference between “wants” and “needs” is neatly illustrated by the image on the right – a want is something that is desirable but unnecessary.  A need is something you have to have no matter what!  And do we always know the difference?  I know that I often say I need to go and buy something, when the truth is, I can probably do without it!

In education, the reliance on needs analysis worries me, as I fear it might be misplaced.  After all, we go to our doctors and describe our symptoms, but we don’t tell the doctor what to do next – so why should we as language teachers rely on the input our learners give us?  Surely, as a professional, I am capable of spotting the problems a learner is having, communicating those problems to the learner and working out a set of solutions.

But we don’t always notice those problems ourselves, and the doctor analogy is perhaps apt after all – the learners come to us and say “I’m having problems” before we then think about what the causes and solutions might be.  So learner input is valid – but can it be trusted?

To find out, I conducted a small-scale study with a group of FCE learners.  They all sit their exam later this year, and I thought a needs analysis exercise would be useful in helping me prioritise their needs over the remainder of the course.  Without going into the entirety of the questionnaire, one of the things I asked was how the learners would like to prioritise their time – I had them complete a pie chart reflecting the proportion of time they wished to spend on different components of the course.  You can see their collated feedback in the diagram below:

The results show three tiers of prioritisation:

  • Tier 1: Vocabulary, Use of English, Exam Practice
  • Tier 2: Speaking, Writing, Grammar
  • Tier 3: Reading, Listening

The first tier may reflect learners’ beliefs about language learning and exam preparation more than it does their needs.  The learners identify “Exam Practice” as their priority, though the effectiveness of exam practice is debatable.  Several studies examining the impact of test preparation on test performance, such as Hayes (2003), Green (2007) and Perrone (2010), show a negligible positive effect.  This would suggest that while learners may want “exam practice”, it is not, in fact, what they “need”.  It’s also not clear from the feedback what the learners are requesting when they specify “Use of English”, as this is the section of the FCE relating to language ability: the request could be interpreted either as a desire to improve grammatical and vocabulary knowledge or as a request for additional exam practice focusing specifically on this section.  Note to self – improve questionnaire design for next time.

So does this match the data?  How accurate is the learners’ self-assessment?  To find out, the learners sat a practice exam based on a past paper.  In full disclosure: I marked all the papers and conducted the speaking exam myself, and I’m not a trained examiner, so the results are perhaps conservative and may not fully reflect actual exam performance, though I’ve been doing this a while and I’m usually fairly close.  Plus the mark schemes, level descriptors and so on are quite specific for FCE, so I suspect these are not too far off.  The results are rendered below as scores out of 20 for each of the five papers, so the totals add up to a score out of 100 – in effect, a percentage.  The pass mark is 60%.
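
As a rough sketch of that arithmetic (in Python, with invented placeholder scores rather than anything from the class), the totals work out like this:

    # Each of the five FCE papers is marked out of 20, so the five totals
    # sum to a score out of 100 - effectively a percentage.
    # The scores below are illustrative placeholders only.

    PAPERS = ["Reading", "Writing", "Use of English", "Listening", "Speaking"]
    PASS_MARK = 60  # percent

    def percentage(scores_out_of_20):
        """Sum five paper scores (each out of 20) into a percentage."""
        assert len(scores_out_of_20) == len(PAPERS)
        return sum(scores_out_of_20)  # 5 papers x 20 marks = 100, so the sum is already a %

    learner = dict(zip(PAPERS, [15, 12, 10, 16, 13]))  # hypothetical learner
    total = percentage(list(learner.values()))
    print(f"Total: {total}% -> {'pass' if total >= PASS_MARK else 'fail'}")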

What’s interesting here is that (a) it’s a good group of learners who probably won’t have much trouble with the real thing, and (b) while there are individual variations, by and large the learners as a group were more successful on some papers and less successful on others.

If we average out the scores (as in the graph above), we see that the data does actually match the learners’ stated needs – that is, if we assume that learners want to achieve higher scores and therefore need to prioritise development in the areas where they scored lower.  Use of English, as the lowest score, matches their top time prioritisation.  As the second and third lowest scores, writing and speaking match the Tier 2 prioritisation.  Finally, the learners scored highest in the areas they felt they needed to spend least time on: reading and listening.  The scores from the speaking section are perhaps slightly off – not only am I not a trained examiner, but it was the first time the learners had sat through an entire practice speaking exam, and not all of them took it seriously.
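
To make that comparison concrete, here is a minimal sketch of the averaging step.  The figures are invented placeholders that only preserve the ordering described above (Use of English lowest, then writing and speaking, with reading and listening highest); they are not the class’s real scores, and the tier mapping is the one set out in the pie-chart feedback.

    # Average each paper's score across the group, rank the papers from
    # weakest to strongest, and set that against the stated time priorities.
    # All numbers are illustrative placeholders, not the real class data.

    from statistics import mean

    scores = {  # per-learner scores out of 20, keyed by paper (hypothetical)
        "Use of English": [9, 11, 10, 12],
        "Writing":        [11, 12, 12, 13],
        "Speaking":       [12, 13, 12, 14],
        "Reading":        [15, 16, 14, 17],
        "Listening":      [16, 15, 17, 16],
    }

    priority_tier = {  # from the pie-chart feedback (1 = most time requested)
        "Use of English": 1, "Writing": 2, "Speaking": 2,
        "Reading": 3, "Listening": 3,
    }

    for paper in sorted(scores, key=lambda p: mean(scores[p])):
        print(f"{paper:<15} avg {mean(scores[paper]):4.1f}/20   stated tier {priority_tier[paper]}")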

So what does this tell us?  I don’t think it contradicts any of the studies that cast doubt on the efficacy of exam practice, as it didn’t really ask that question.  For the record, the learners had previous exposure to the exam task types through the coursebook and had carried out one prior assessment in most, but not all, of the task types before doing the entire practice paper.  I make no conclusions on this, though!

I think it also tells us that, generally, learners have a fairly good idea of what they’re good at and what they’re not – it would be particularly interesting to see what correlations emerged between individual needs analysis responses and the test data (a rough sketch of that follow-up is below).  But it also tells us that we need to make sure that both we and the learners understand the questions that we’re asking.
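
Here is how that correlation check might look for a single learner, correlating the share of time they asked for in each area against the score they achieved in it.  Every figure is invented for illustration.

    # For one learner, correlate the proportion of time requested per area
    # with the score achieved in that area. A clearly negative r would mean
    # the learner asks for more time exactly where they score lower.
    # Every figure here is invented for illustration.

    from statistics import correlation  # Pearson's r, Python 3.10+

    areas          = ["Use of English", "Writing", "Speaking", "Reading", "Listening"]
    time_requested = [30, 20, 20, 15, 15]   # % of course time, from the questionnaire
    scores         = [10, 12, 13, 16, 17]   # practice exam scores out of 20

    r = correlation(time_requested, scores)
    print(f"Pearson r = {r:.2f}")  # negative: priorities track weaknesses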

Ultimately, I think it tells me what I’ll be focusing on in class over the next few months.  Heads down, eyes front, pencils at the ready… well… for at least some of the time!

References:

Green, A. (2007).  Washback to learning outcomes: A comparative study of IELTS preparation and university pre-sessional language courses.  Assessment in Education: Principles, Policy & Practice, 14(1), 75–97.

Hayes, B. (2003).  IELTS preparation in New Zealand: An investigation into the nature of the courses and evidence of washback.  Unpublished doctoral thesis, Victoria University of Wellington, Wellington.

Perrone, M. J. (2010).  The impact of the First Certificate in English (FCE) examination on the EFL classroom: A washback study.  Unpublished doctoral thesis, Columbia University, New York.
