
Musings on amusing test answers (for amazing item writing)

No matter how much we love our profession, there will always be that mind-numbing task we have to carry out. For me, that's marking a seemingly endless pile of multiple-choice/short-answer tests, the kind even a computer could grade. However, there is always an outside-the-box answer to cheer me up, and the web has no shortage of such examples.

Me too, Frankie, me too. — Source: Distractify

The answers students give may be creative, funny or just oh-so-very-wrong, but there might be more to them than meets the eye. Take for instance the two answers below.

[Image: a student's test answer to a question about the Declaration of Independence: “at the bottom”]

Come to think of it… Source: justsomething

[Image: a student's test answer: “1895”]

Indeed… Source: justsomething

The test-takers who came up with “at the bottom” and “1895” were probably just trying to be funny (mission accomplished, by the way, as their answers went viral), but they were also exploiting an intrinsic ambiguity in the question. Of course, we have no idea about the context in which those tests were developed and delivered, and the point here is not to find fault with other people's work. Still, unexpected and droll answers can help us reflect on test design and item writing, so we can improve the tests we write for our students.

To what extent do we end up writing ambiguous test items and expecting students to just know what they're supposed to do? And, more importantly, does the right answer show they have achieved whatever learning aim the test item refers to, or is it merely evidence that they have learned to answer questions the way the teacher wants? In fact, what are those test items testing? Again, there is no context here for us to be sure of much, but it does seem the right answers would depend on the memorization of dates and places. Is that deemed relevant enough in the curriculum to warrant a test item (out of possibly only 10, 20 or 30 items)?

If we find we need to change that sort of item in our tests, then there are at least two things to consider. Ambiguity is the easier problem to solve: it can sometimes be avoided or reduced by asking a couple of peers, maybe even teachers of another subject, to sit the test as if they were students. The other questions are more complicated, as they strike at the heart of the item or test: its construct. The construct of a test is the set of skills, abilities or content being assessed. If we are asking for something that wasn't a learning aim, or weighting it in a way that overstates its importance in the curriculum, then we are misrepresenting the curriculum, that is, what we set out to achieve with our students. The consequence is potentially very serious: a high score might not mean the student is doing well as far as the real learning aims are concerned, and a low score might not mean the student is doing poorly either. In other, less tentative words, the scores are potentially meaningless.

In that case, the test item or even the whole test may need to be rewritten from scratch. A useful first step in good testing practice is thinking about what we want from students: “What are the learning aims, and what is their relative importance to one another? What is sufficient evidence of a learning aim having been achieved?” Then we can put all that into writing and refer to the document both while designing the test and afterwards. That will keep us from straying from the testing objectives, which should be aligned with the learning objectives. After all, testing what we haven't taught is no laughing matter.

********************************************************

If you want to check out the sources of these answers:

https://justsomething.co/32-hilarious-kids-test-answers/

https://www.distractify.com/test-answers-that-are-totally-wrong-but-still-genius-1197820269.html

*********************************************************

If you would like to learn more about language testing, I'd recommend McNamara's “Language Testing” (Oxford University Press, Oxford Introductions to Language Study, 2000), a short and sweet introduction to the theory.

Natália Guerreiro

Natália Guerreiro has been a teacher since the year 2000 and currently works in Aviation English assessment and teaching for the Brazilian Air Force. She holds a CELTA, a B.A. in English & Portuguese from UFRJ, and an M.A. in Applied Linguistics from the University of Melbourne. She's been elected BRAZ-TESOL's Second Vice President for the 2019-2020 term.
