Using pattern matching software

PMatch is a new Moodle question type, based on the OpenMark pattern-matching question type that is currently in use at the Open University for the short-answer free-text questions I have written. There is more information here.

Follow the links and you'll see that it is pretty simple; despite having no computer programming experience, I found it easy to use. If you wanted to, you could use PMatch simply to look for keywords, but for most of the questions I have written that would not be sufficient. Word order and negation can be very important, especially when students really do give answers that are 'opposite' to a correct one, as in the examples that follow. In one question, you need to be able to mark 'kinetic energy is converted to gravitational potential energy' (and synonyms) as right but 'gravitational potential energy is converted to kinetic energy' (and synonyms) as wrong. In another question, you need to be able to mark 'the forces are balanced' and 'There are no unbalanced forces' as right (note that students very often use double negatives in their answers) whilst marking 'The forces are not balanced' and 'The forces are unbalanced' as wrong.
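PMatch expresses its rules in its own syntax, but the order-sensitive and negation-aware matching described above can be sketched in plain Python. This is only an illustrative approximation; the function names and phrase lists are mine, not PMatch's:

```python
import re

def mark_energy_answer(response: str) -> bool:
    """Accept 'kinetic energy ... gravitational potential energy' in
    that order, and reject the reverse (illustrative sketch only)."""
    text = response.lower()
    ke = text.find("kinetic energy")
    gpe = text.find("gravitational potential energy")
    # Both phrases must be present, with 'kinetic energy' first.
    return ke != -1 and gpe != -1 and ke < gpe

def mark_forces_answer(response: str) -> bool:
    """Accept 'balanced' and the double negative 'no unbalanced forces';
    reject 'not balanced' and 'unbalanced'."""
    text = response.lower()
    if re.search(r"\bno unbalanced forces\b", text):
        return True   # double negative counts as correct
    if re.search(r"\bnot balanced\b|\bunbalanced\b", text):
        return False
    return bool(re.search(r"\bbalanced\b", text))
```

In practice, of course, the real answer matching also has to cover the many synonyms students use, so each phrase above would become a family of alternatives.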

One of the other things people frequently ask me about is how to handle spelling mistakes. We use a dictionary on the front end of PMatch (so words that are not recognised are queried, with no loss of marks) and I also usually allow for words with up to one letter extra or missing, or with two letters in the wrong order. So I accept 'decease' instead of 'decrease', because I've found that this is a mistake students make without realising it. However, if you wanted to (and some of my colleagues do), you could require an exact spelling; that sort of choice is rightly left to the question author.
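The tolerance described above (one letter extra or missing, or two adjacent letters swapped) can be sketched as a small edit-distance check in Python. Again, this is my own approximation of the behaviour, not PMatch's implementation:

```python
def within_one_edit(attempt: str, target: str) -> bool:
    """True if attempt equals target, or differs by one extra letter,
    one missing letter, or one adjacent transposition."""
    if attempt == target:
        return True
    la, lt = len(attempt), len(target)
    if la == lt:
        # Same length: allow exactly one adjacent swapped pair.
        diffs = [i for i in range(la) if attempt[i] != target[i]]
        return (len(diffs) == 2 and diffs[1] == diffs[0] + 1
                and attempt[diffs[0]] == target[diffs[1]]
                and attempt[diffs[1]] == target[diffs[0]])
    if abs(la - lt) != 1:
        return False
    # Lengths differ by one: one letter extra or missing.
    shorter, longer = (attempt, target) if la < lt else (target, attempt)
    return any(longer[:i] + longer[i + 1:] == shorter
               for i in range(len(longer)))
```

For example, `within_one_edit("decease", "decrease")` is true (one missing 'r'), as is `within_one_edit("decraese", "decrease")` (two letters swapped), while `within_one_edit("increase", "decrease")` is not.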

So we write a sequence of rules which mark responses as correct, partially correct, or wrong, and which generate feedback. What I really want to talk about in this post is the way in which these rules are combined. When checking a student response, PMatch essentially checks the rules in order and usually the vast majority of the responses can be matched by a very small number of rules. But if you want the feedback to fire appropriately you need to think carefully and logically about the order in which the rules operate.
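To make the importance of ordering concrete, here is a toy first-match-wins rule list in Python for the 'balanced forces' question. The rules and feedback strings are hypothetical, and this is not PMatch syntax, but it shows why the double-negative rule has to fire before the negation rule, and the negation rule before the plain 'balanced' rule:

```python
import re

# Ordered list of (predicate, mark, feedback); the first match wins.
RULES = [
    (lambda t: bool(re.search(r"\bno unbalanced forces\b", t)),
     1, "Correct: no unbalanced (net) force acts."),
    (lambda t: bool(re.search(r"\bnot balanced\b|\bunbalanced\b", t)),
     0, "Not quite. Think about whether any net force remains."),
    (lambda t: bool(re.search(r"\bbalanced\b", t)),
     1, "Correct: the forces are balanced."),
]

def grade(response: str):
    text = response.lower()
    for predicate, mark, feedback in RULES:
        if predicate(text):
            return mark, feedback
    # Final catch-all 'else' rule for unmatched responses.
    return 0, "Sorry, we could not interpret your answer."
```

If the plain 'balanced' rule came first, 'The forces are not balanced' would be marked right, because it contains the word 'balanced'.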

Once all the other rules have been tried, a final 'else' rule catches any response the system hasn't recognised and marks it as right or (usually) wrong. Another possibility at this stage would be to pass these unmatched responses to a human marker. Given our high marking accuracy (reported in my CAA 2012 paper) this isn't strictly necessary, but it could still be a good idea, enabling you to deal with new synonyms and to develop your answer matching, as well as increasing student confidence. We know that our computer marking is more accurate than that of human markers, but our students don't always appreciate this. This approach could perhaps even provide a step towards the high-stakes summative use of short-answer free-text e-assessment questions.
