Archive for the ‘question difficulty’ Category

Multiple choice vs short answer questions

Thursday, January 19th, 2012

I’m indebted to Silvester Draaijer for leading me towards an interesting article:

Funk, S.C. & Dickson, K.L. (2011) Multiple-choice and short-answer exam performance in a college classroom. Teaching of Psychology, 38(4), 273–277.

The authors used exactly the same questions in multiple-choice and short-answer free-text response format – except (obviously) the short-answer questions did not provide answer choices. Fifty students in an ‘introduction to personality’ psychology class attempted both versions of each question, with half the students completing a 10-question short-answer pretest before a 50-question multiple-choice exam and half completing the 10 short-answer questions as a post-test after the multiple-choice exam. The experiment was run twice (‘Exam 2’ and ‘Exam 3’, where students didn’t know what format to expect in Exam 2, but did in Exam 3). (more…)

Does a picture paint a thousand words?

Tuesday, August 2nd, 2011

One of the things that Matt Haigh looked at when considering the impact of item format (see previous post) was whether the presence of a picture made the question easier or harder. He started with a very simple multiple choice item:

Which of the following is a valid argument for using nuclear power stations?

  • for maximum efficiency, they have to be sited on the coast
  • they have high decommissioning costs
  • they use a renewable energy source
  • they do not produce gases that pollute the atmosphere

All the students received this question, but half had a version which showed a photograph of a nuclear power station. Not surprisingly, the students liked the presence of the photograph. The version with the photograph also had a slightly lower item difficulty (whether calculated under the Classical Test Theory or the Item Response Theory paradigm), but not significantly so.
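
As an aside for anyone unfamiliar with the two paradigms mentioned above, here is a minimal sketch (in Python, with invented response data) of how each quantifies the difficulty of a single item: Classical Test Theory uses the facility (proportion correct), while a Rasch/IRT-style difficulty can be roughly approximated as the negative log-odds of a correct response. This is only an illustration of the idea, not the calibration Matt actually performed.

    import numpy as np

    # Invented 0/1 score vectors for the two versions of the question
    # (with and without the photograph) -- purely illustrative data.
    rng = np.random.default_rng(0)
    scores_with_photo = rng.binomial(1, 0.72, size=200)
    scores_without_photo = rng.binomial(1, 0.68, size=200)

    def ctt_facility(scores):
        """Classical Test Theory 'difficulty' (facility): the proportion
        of students answering correctly. Higher values = easier item."""
        return scores.mean()

    def rough_rasch_difficulty(scores):
        """Very rough Rasch (1-parameter IRT) difficulty: the negative
        log-odds of a correct response, assuming person abilities are
        centred on zero. Proper calibration would estimate abilities too."""
        p = scores.mean()
        return np.log((1 - p) / p)

    for label, scores in (("with photo", scores_with_photo),
                          ("without photo", scores_without_photo)):
        print(f"{label}: facility = {ctt_facility(scores):.2f}, "
              f"Rasch difficulty = {rough_rasch_difficulty(scores):.2f} logits")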

In the light of aspects of my own work that I’ve already described briefly (it’s the dreaded sandstone question again – see Helpful and unhelpful feedback: a story of Sandstone), it is perhaps surprising that the presence of the photograph does not confuse people and so make the question more difficult. (more…)

The impact of item format

Tuesday, August 2nd, 2011

One of the things I’ve found time and time again in my investigations into student engagement with e-assessment is that little things can make a difference. So the research done by Matt Haigh of Cambridge Assessment into the impact of question format – which I’ve heard Matt speak about a couple of times, most recently at CAA 2011 – was long overdue. It’s hard to believe that so few people have done work in this area.

Matt compared the difficulty (as measured by performance on the questions) of ten pairs of question types – e.g. with or without a picture, drag and drop vs tick box, drag and drop vs drop-down selection, multiple-choice with only a single selection allowed vs multiple-choice with multiple selections enabled – when administered to 112 students at secondary schools in England. In each case the actual question asked was identical. The quantitative evaluation was followed by focus group discussions.

This work is very relevant to what we do at the OU (since, for example, we use drop-down selection as the replacement for drag and drop questions for students who need to use a screen reader to attempt the questions). Happily, Matt’s main conclusion was that the variations of item format explored had very little impact on difficulty – even where there appeared to be some difference, it was not statistically significant. The focus group discussions led to general insight into what makes a question difficult (not surprisingly, ‘lack of clarity’ came top) and also to some suggested explanations for the observed differences, and lack of differences, in difficulty between the parallel forms of the questions.
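
For what it’s worth, a quick way to check whether a difference in facility between two parallel forms of a question is statistically significant is a two-proportion z-test; the sketch below (with invented counts – these are not Matt’s figures, and not necessarily the test he used) shows the sort of calculation involved.

    from math import sqrt, erfc

    def two_proportion_z_test(correct_a, n_a, correct_b, n_b):
        """Two-sided two-proportion z-test (normal approximation) for
        whether two parallel forms of a question differ in facility."""
        p_a, p_b = correct_a / n_a, correct_b / n_b
        pooled = (correct_a + correct_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_a - p_b) / se
        p_value = erfc(abs(z) / sqrt(2))
        return z, p_value

    # Invented counts: 44 of 56 students correct on the drag-and-drop form
    # versus 40 of 56 on the tick-box form (illustrative numbers only).
    z, p = two_proportion_z_test(44, 56, 40, 56)
    print(f"z = {z:.2f}, two-sided p = {p:.3f}")  # p > 0.05, so not significant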

I’d very much like to do some work in this area myself, looking at the impact of item format on our rather different (and vast) student population. I’d also like to observe people doing questions in parallel formats, to see what clues that might give.

What sorts of e-assessment questions do students do best at?

Tuesday, November 2nd, 2010

What do you think?

Multiple choice? Short-answer free text?

(more…)