
“Can you help me with this quick questionnaire?” — ELT Survey Mistakes

Have you noticed how many ELT surveys have been circulating lately? Teachers seem to be firing off questionnaires to other teachers all the time now, on the most varied topics, from tasks used in class to teacher profiles, from student motivation to whatnot. That is probably a consequence of the advent of tools such as Google Forms and SurveyMonkey, which ease the burden of creating and distributing questionnaires and will even generate graphs for you automatically.

And that’s great. I make a point of publicizing questionnaire links as much as I can, for I strongly believe that helping research in our field means helping all of us. Also, in a field that has been criticized for being so research-averse, the turn towards data-driven claims should be celebrated.

Sadly, though, the research baby doesn’t seem to have come with its bath water. As any master’s or PhD candidate will tell you, research design isn’t a walk in the park. Questionnaires are only one of many possible research tools, and books upon books have been written on them. So I feel something might be missing in those questionnaires that have been popping up everywhere. I also know I’m not the best person to put my finger on this particular problem, not being an academic myself. Here, though, from teacher to teacher, from respondent to potential questionnaire writers, I’d like to point out a few issues that have been bothering me about these surveys.

1. No initial explanation as to the research objectives and the uses of the results

The aim, purpose and importance of your research are obvious to you, of course, but as a respondent, I feel I need to know what I’m spending time on. What topic are you going to investigate? Knowing that helps me understand the questions better and keeps me from straying off topic, which in turn saves you time. (If by any chance you think you cannot clarify what you’re angling for, for fear of confounding the results, that’s fine, but do check how to do that ethically.)

Above all, however, what are you doing this for? Is it market research? Is it for a presentation of yours (if so, when and where are you going to present it)? Is it for an undergrad or postgrad program (in which case, where are you doing it)? There is nothing wrong with any of those uses, provided that we respondents know.

Tip: State the objectives and uses of the research on the first page of the questionnaire, even if you explain them in the email or post in which you publicize the survey link. After all, the link might get separated from your text, and then new respondents will not have access to that information.

2. Not specifying your target audience

I once got a survey link and had to read through the questions before I realized it was aimed at Young Learner teachers, something I haven’t been for 7 years. If you have a specific respondent profile in mind (and indeed you should), please say so in the introduction to the questionnaire. Is it teachers or students who should respond? From Brazil or anywhere in the world? Teaching in private or state schools, or maybe only language institutes or one-to-one? Novice teachers, seasoned teachers, or either? Again, you do not want people to waste their time. Plus, your results might end up skewed at best if you want to say “YL teachers think this and that” but actually had teachers of all age groups responding.

3. Leading or biased questions

You know that kind of reporter or interviewer who asks, “Don’t you think that bla bla bla bla bla?”

I half expect the interviewee to go, “Dear, do you want to know my opinion or state yours?”

Before you send off the questionnaire, try and ask a colleague to read through your items. If your views are seeping through, rewrite the items. You know how students can guess what the teacher wants to hear and say exactly that even if they don’t believe in it? Well, that’s a risk with fellow teachers as well… We human beings like to please.

Also, even those disagreeable human beings who won’t want to please you (me!) might lose faith that the results will be analyzed and interpreted properly, instead of just being used to confirm what the researcher already thought. If you’re doing research, you have to be open to both confirming and refuting your hypothesis.

4. Ambiguous or badly written questions or options

I know we all do our best proofreading after we press “send”, but you can always ask a few close people to attempt your questionnaire and give you feedback. Are there any typos? How do they understand the questions? Do they want to add any options (esp. if you don’t have an “Other” option)? Do the options go with the question, or does each option seem to be about a different topic? Also, if it’s multiple choice, do respondents get to choose only one option or more? If it’s only one, are the options mutually exclusive, or is it possible people will feel the need to tick more than one? Are the options too similar?

5. Assuming people understand the terms you are using

E.g.: “What do you think is more important for language learning: intrinsic or extrinsic motivation?”

Erm… who says everybody knows those terms or understands them in the exact same way? Give me concrete examples to judge (e.g. the student likes the language; the student will get a raise if he passes the test, etc.), and then, when you analyze the responses, categorize them as intrinsic or extrinsic motivation, according to whichever theorist you’re using.

6. Questions/Research methods that do not correspond to the research objectives

Sticking with the motivation example from #5, what are you getting at by asking teachers those questions? If you inquire about teachers’ opinions on types of motivation and their impact on learning, you’ll get — surprise, surprise — what teachers think is more important in terms of motivation. It sounds tautological, but it’s a more common mistake than you might guess. You have to be clear (to yourself) about what you’re researching. With questions like that, you can’t go around saying one type of motivation is more important than the other. You haven’t researched the impact of different kinds of motivation on learning; you’ve researched teacher beliefs or attitudes regarding student motivation. That’s interesting in its own right, but a different kettle of fish altogether.

7. Asking the respondents to identify themselves when the questions are sensitive in nature.

Here I was happily filling in a survey form with my name, age, e-mail, context of work, blablabla, and then… *drum roll*… SALARY.

Thanks, but no thanks.

In that case, because I really cared about the institution responsible for the questionnaire, I contacted its designers and conveyed my reservations. Mind you, I fully understood why they were asking for the salary, and I trusted them not to tie that information to names and e-mail addresses, but I was a single person who happened to know who they were and what they were going for. I’d be willing to bet not many people responded. Well, don’t get your hopes up; we rarely get many respondents anyway. However, if you ask for sensitive information and ask participants to identify themselves in the same breath, you’re just setting yourself up for failure.

8. Allowing people to respond many times.

Hi, I’m Dory. And Dory here is over 30. If you’re publicizing your link, then again a few weeks later, and then again (which is good practice, by the way, as you shouldn’t be sending off the questionnaires the night before your presentation), I will probably forget whether I have responded. That can lead me 1) to think I have responded when I haven’t, so you miss out on one participant (times as many ELT Dories as there are out there); or 2) to respond more than once, which means my answers will be duplicated, and I don’t need to say how much that detracts from the validity of your research.

(I do realize sometimes it’s impossible to prevent that from happening, esp. if you are asking for sensitive information, but if you can, by all means, help us Dories in the ELT world.)

9. Looooooong. 

Rich coming from me, when this post is so long already… However, if your questionnaire is extensive or has too many open-ended/essay questions, few people will have the time for it. Also, check with a few friends how long they take to respond and add that information to the introduction, so participants can set aside enough time to do it all in one sitting.

10. Last but not least, remember “There are three kinds of lies: lies, damned lies, and statistics.”

This is, of course, a matter of analysis and how you report your results later on. Nevertheless, make sure you think it through when you’re designing your questionnaire. How will you analyze the results? What stats do you intend to run? How many respondents do you need for those stats? Above all, be careful not to generalize beyond what your questionnaire allows you to. After all, we’re in the business of educating, not (unintentionally) deceiving.
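If you like to see the numbers, here is a minimal sketch in Python of the classic sample-size formula for estimating a proportion (say, “what percentage of teachers agree with X”). The figures used below, a ±5% or ±10% margin of error at 95% confidence, are illustrative assumptions on my part, not recommendations from any particular study; the point is simply that “how many respondents do I need?” has an answer you can work out before you hit send.

```python
import math

def sample_size_for_proportion(margin_of_error=0.05, confidence_z=1.96, p=0.5):
    """Rough sample size for estimating a proportion in a large population.

    Uses the textbook formula n = z^2 * p * (1 - p) / e^2, with p = 0.5 as
    the most conservative (largest) assumption about the true proportion.
    """
    return math.ceil(confidence_z ** 2 * p * (1 - p) / margin_of_error ** 2)

# About 385 respondents for a ±5% margin of error at 95% confidence
print(sample_size_for_proportion(0.05))

# About 97 respondents if you can live with ±10% at 95% confidence
print(sample_size_for_proportion(0.10))
```

Given how few responses most teacher surveys actually attract, a quick calculation like this is a useful reality check on how confidently you can report percentages afterwards.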

***********

Pic sources: Wikimedia Commons & Rotten eCards

Natália Guerreiro

Natália Guerreiro has been a teacher since the year 2000 and currently works in Aviation English assessment and teaching for the Brazilian Air Force. She holds a CELTA, a B.A. in English & Portuguese from UFRJ, and an M.A. in Applied Linguistics from the University of Melbourne. She's been elected BRAZ-TESOL's Second Vice President for the 2019-2020 term.
