Student engagement with assessment and feedback: some lessons from short-answer free-text e-assessment questions

Sorry for my long absence from this blog. Those of you who work in or are close to UK Higher Education will probably realise why – changes in the funding of higher education in England mean that the Open University (and no doubt others) are having to do a lot of work to revise our curriculum to fit. I’m sure that it will be great in the end, but the process we are going through at present is not much fun. I always work quite hard, but this is a whole new level – and I’m afraid my blog and my work in e-assessment are likely to suffer for the next few months.

It’s not all doom and gloom. I’ve had a paper published in Computers & Education (reference below), pulling together findings from our observation of students attempting short-answer free-text questions in a usability lab, and our detailed analysis of student responses to free-text questions – some aspects of which I have reported here. It’s a long paper, reflecting a substantial piece of work, so I am very pleased to have it published.

The reference is:

Jordan, S. (2012) Student engagement with assessment and feedback: some lessons from short-answer free-text e-assessment questions. Computers & Education, 58(2), 818–834.

The abstract is: 

Students were observed directly, in a usability laboratory, and indirectly, by means of an extensive evaluation of responses, as they attempted interactive computer-marked assessment questions that required free-text responses of up to 20 words and as they amended their responses after receiving feedback. This provided more general insight into the way in which students actually engage with assessment and feedback, which is not necessarily the same as their self-reported behaviour. Response length and type varied with whether the question was in summative, purely formative, or diagnostic use, with the question itself, and most significantly with students’ interpretation of what the question author was looking for. Feedback was most effective when it was understood by the student, tailored to the mistakes that they had made and when it prompted students rather than giving the answer. On some occasions, students appeared to respond to the computer as if it was a human marker, supporting the ‘computers as social actors’ hypothesis, whilst on other occasions students seemed very aware that they were being marked by a machine.

Do take a look if you’re interested.


