
Short answer free-text questions: a change of technology


30th Mar 2010

A surprising outcome of the work of COLMSCT Fellows Sally Jordan and Phil Butcher is now having a direct impact on assignments used by thousands of students each year.

Following the unexpected finding that algorithmically-based answer matching, appropriately used, can be as accurate as a linguistically-based system when marking short-answer free-text eAssessment questions (and that both technologies are at least as accurate as human markers), questions on three Open University modules now use the OU-developed ‘OpenMark’ software for their answer matching.

With COLMSCT Associate Teaching Fellow Barbara Brockbank, Sally developed a bank of eAssessment questions requiring students to type their answers as sentences of up to 20 words. Evaluation showed that the computer’s marking was as accurate as that of human markers, and 26 of the questions are in regular use in formative and summative iCMAs on S104 Exploring Science, S154 Science Starts Here and SXR103 Practising Science, as well as in the diagnostic quiz ‘Are you ready for S104?’

The answer matching for these questions was initially written using linguistically-based software provided by Intelligent Assessment Technologies. However, a trial in summer 2008, reported in a recent paper in Computers & Education (Butcher, P.G. and Jordan, S.E. (in press, due 2010) A comparison of human and computer marking of short free-text student responses), showed that it was possible to obtain equally impressive accuracy of answer matching using OpenMark’s own algorithmically-based answer matching software, ‘PMatch’. By ‘algorithmically-based’ we mean that the answer matching is based on a number of simple rules. The matching does not, however, simply look for keywords: word order often matters (‘The cat sat on the mat’ has a different meaning to ‘The mat sat on the cat’), and the presence or absence of negation (‘not’ etc.) can be important too. Most significantly, our answer matching is based on many thousands of responses from real students, who answer the questions in both expected and unexpected ways. Sometimes their answers have led us to refine the question itself as well as improving the answer matching.
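To make the flavour of such rules concrete, here is a minimal, hypothetical sketch in Python. The function names, rules and sample responses are invented for illustration; this is not PMatch’s actual syntax or implementation.

    import re

    def words_of(response):
        """Lowercase a response and split it into words, dropping punctuation."""
        return re.findall(r"[a-z']+", response.lower())

    def matches(response, required_in_order, forbidden=()):
        """Accept a response if the required words appear in the given order
        and none of the forbidden words (e.g. an unexpected negation) appear."""
        words = words_of(response)
        if any(w in forbidden for w in words):
            return False
        position = 0
        for target in required_in_order:
            try:
                # Each required word must occur after the previous one:
                # this is how word order is enforced.
                position = words.index(target, position) + 1
            except ValueError:
                return False
        return True

    # Word order matters: the same keywords in the wrong order are rejected.
    print(matches("The cat sat on the mat", ["cat", "sat", "mat"]))  # True
    print(matches("The mat sat on the cat", ["cat", "sat", "mat"]))  # False

    # Negation matters: an unexpected 'not' flips the meaning, so reject it.
    print(matches("the current is not constant", ["current", "constant"],
                  forbidden={"not", "never"}))  # False

In the same spirit, the production answer matching combines many such rules, tuned against the thousands of real student responses mentioned above.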

Since summer 2008, PMatch has been refined further and a spell-checker has been added. We have also conducted further trials to confirm that PMatch really does provide accurate answer matching. One S104 question and one S154 question have been using PMatch for some time, and from summer 2010 all the questions will be moved to the new technology. Transferring the answer matching from one system to the other has also enabled us to add more targeted feedback, and so to maximise the learning that students derive from these questions.
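As an illustration of what a spelling-normalisation step might look like (an assumption for this sketch, not the OU’s actual spell-checker), a misspelt word can be snapped to the closest word in a question’s expected vocabulary before the matching rules run. The vocabulary below is invented, and the sketch uses Python’s standard difflib.

    import difflib

    # Hypothetical vocabulary of words expected in correct answers.
    VOCAB = ["current", "constant", "resistance", "voltage"]

    def correct_spelling(word, vocab=VOCAB):
        """Snap a word to its closest vocabulary entry if one is
        sufficiently similar; otherwise leave it unchanged."""
        close = difflib.get_close_matches(word, vocab, n=1, cutoff=0.8)
        return close[0] if close else word

    print(correct_spelling("resistence"))  # 'resistance': misspelling corrected
    print(correct_spelling("banana"))      # 'banana': no close match, unchanged

Run before the matching rules, a step like this stops a one-letter slip from causing an otherwise correct answer to be rejected.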

For more information please see Sally and Phil's project pages: http://www.open.ac.uk/colmsct/projects/sallyjordan and http://www.open.ac.uk/colmsct/projects/philbutcher, or see the links under ‘Related projects’.

To view Sally and Phil's paper, please visit http://dx.doi.org/10.1016/j.compedu.2010.02.012