Can multiple-choice questions be used to give useful feedback?

I was asked this question recently, and I thought it was worth a blog post. My simple answer to the question in the title, I’m afraid to say, is “no”. Perhaps that’s a bit unfair, but I think that relying on MCQs to provide meaningful feedback is somewhat short-sighted; surely we can do better.

My argument goes thus: a question author provides feedback which they believe to be meaningful on each of the distractors, but that assumes the student used the same logic as the question author in reaching that distractor. In reality, students may guess the answer, work backwards from the distractors, or do something in between, e.g. rule out distractors they know to be wrong and then guess from amongst the rest. Feedback is only effective when the student encounters it in a receptive frame of mind (timing and concepts such as response certitude come into play here); if the student has given a response for one reason and the feedback assumes a different logic, then the feedback is, at best, of dubious value. There is also growing evidence that, when given the option to respond without the ‘hint’ provided by MCQs, students give answers that are not amongst those provided as distractors.

It is no secret that I am not a fan of selected-response questions, though my views have mellowed slightly over the years. My biggest problem with them is their lack of authenticity. However, if that is not an issue for the use being made, and the questions are well written, based on responses that students are known to give (rather than those that ‘experts’ assume students will give), then perhaps MCQs are OK. Even relatively simple multiple-choice questions can create “moments of contingency” (Black & Wiliam, 2009; Dermo & Carpenter, 2011), and Draper’s (2009) concept of catalytic assessment is based on the use of selected-response questions to trigger subsequent deep learning without direct teacher involvement. However, I think the usefulness here is in making students think, not in the direct provision of feedback.

There are other things that can be done to improve the usefulness of multiple-choice questions, e.g. certainty-based marking (Gardner-Medwin, 2006). However, when there are so many better question types, why not use them? For example, there are the free-text questions – with feedback – used by the free language courses at https://www.duolingo.com/. I’m not sure what technology they are using, but I think it is linked to crowd-sourcing, which I definitely see as the way ahead for developing automatic marking and feedback on short-answer constructed-response questions.

Let’s make 2016 the year in which we really look at the evidence and improve the quality of what we do in the name of computer-marked assessment and computer-generated feedback. Please.

References

Black, P. & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5-31.

Dermo, J. & Carpenter, L. (2011). e-Assessment for learning: Can online selected response questions really provide useful formative feedback? In Proceedings of the 2011 International Computer Assisted Assessment (CAA) Conference, Southampton, 5th-6th July 2011.

Draper, S. (2009). Catalytic assessment: Understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2), 285-293.

Gardner-Medwin, A. R. (2006). Confidence-based marking: Towards deeper learning and better exams. In C. Bryan & K. Clegg (Eds.), Innovative Assessment in Higher Education (pp. 141-149). London: Routledge.


2 Responses to Can multiple-choice questions be used to give useful feedback?

  1. John Kleeman says:

    Sally

Of course you are right that feedback from multiple-choice questions won’t always be helpful, but to suggest that it is never, or even usually, unhelpful seems quite a strong comment.

    There is quite a lot of evidence that feedback from multiple choice questions can help. See for example Butler and Roediger 2008 in Memory and Cognition “Feedback enhances the positive effects and reduces the negative effects of multiple-choice testing”, at http://psych.wustl.edu/memory/Roddy%20article%20PDF's/Butler%20&%20Roediger%20(2008)_MemCog.pdf

    See also Will Thalheimer’s paper at http://willthalheimer.typepad.com/files/providing_learners_with_feedback_part2_may2008.pdf which provides other evidence.

    I hope this helps

    John Kleeman

  2. Sally Jordan says:

    Thanks John – you have a fair point!
    best wishes
    Sally
