Writing good interactive computer-marked assessment questions

I run a lot of workshops to help colleagues write good e-assessment questions. There are usually lots of brilliant ideas in the workshop, but somehow we end up slipping back into using lots of multiple-choice questions because people think they are reliable.

I suppose it is true that the answer matching is easier to set up for multiple-choice and multiple-response questions, but beware: just because you (as author) can easily identify one response as ‘correct’ and the others as ‘incorrect’, it doesn’t mean the question is behaving the way you expect. The question might be ambiguous, or there might actually be more than one correct option. Or – the most common problem – while some options are definitely correct and others definitely incorrect, there may also be options that could be either correct or incorrect, depending on how they are interpreted.

We’ve just had a problem with a multiple-response question on one of the modules I am responsible for. The question author didn’t specify how many options (in this case, molecules that are the products of particular chemical reactions) were required. For the reaction in question, the author was expecting just one molecule. Water is a product too, but in the course material you’re told to ignore it. That doesn’t mean it isn’t there, though – a large percentage of students selected the expected molecule AND water, and were marked as wrong!
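The failure mode is easy to sketch in code. The following is a hypothetical illustration only – the molecule names, function names and marking rules are my own, not taken from any real e-assessment system. It contrasts marking by exact match against the author’s expected set (which penalised the students here) with marking that accepts any selection of genuinely correct products:

```python
# Hypothetical sketch of multiple-response marking.
# Names and marking rules are illustrative, not from a real system.

def mark_exact(selected, expected):
    """Correct only if the selection exactly matches the author's
    expected set -- the behaviour that caused the problem."""
    return set(selected) == set(expected)

def mark_any_correct(selected, all_correct, required=1):
    """Correct if every selected option is a genuine product and at
    least `required` options are chosen."""
    chosen = set(selected)
    return chosen <= set(all_correct) and len(chosen) >= required

expected = {"CO2"}            # the one molecule the author had in mind
all_correct = {"CO2", "H2O"}  # but water is genuinely a product too

student = {"CO2", "H2O"}      # a student selects both

print(mark_exact(student, expected))           # False: marked wrong!
print(mark_any_correct(student, all_correct))  # True: accepted
```

The second scheme still rejects genuinely wrong selections, but it doesn’t punish a student for knowing more chemistry than the marking key anticipated.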

So, be careful when writing even supposedly straightforward questions and don’t assume that students will answer in the way you expected. Most importantly, don’t penalise them just because they come up with an answer other than the one you were expecting.

Check, check and check again.

