Archive for April, 2011

Top tips

Wednesday, April 27th, 2011

I’ve recently been asked for my ‘top tips’ for writing interactive computer-marked assignment (iCMA) questions. I thought I might as well nail my colours to the mast and post them here too:

•  Before you start writing iCMA questions, think about what it is appropriate to assess in this way – and what it isn’t.

•  Think about what types of iCMA question are most appropriate for what you want to assess. Don’t assume that multiple-choice and multiple-response questions are more reliable than free-text entry questions – they aren’t!

•  Write multiple variants of your questions – this enables you to use the same basic template in writing questions for multiple purposes, reduces opportunities for plagiarism and gives students extra opportunities for practice. However, in summative use, make the variants of similar difficulty.

•  For multiple-response questions (where students have to select a number of correct options), tell students how many options are required (otherwise students get very frustrated).

•  Check carefully that each question is unambiguous. Does it use language that all students should understand? If you want an answer in its simplest possible form, is this clear?

•  Think carefully about what you will accept as a correct answer. Do you want to accept misspellings (e.g. ‘sulphur’ instead of ‘sulfur’), surplus text, etc.? If in doubt, have surplus text at the end of a response removed before the response is checked. Students are not happy if their response is marked wrong because, for example, they have indicated the precision of an answer by typing ‘to 3 significant figures’ at the end of a perfectly correct numerical answer.

•  Wherever possible, give feedback that is tailored to the error that a student has made. (Students get very annoyed when they are given general feedback that assumes they don’t know where to start, when, to their mind, they have made a ‘small’ error late in the process, e.g. giving incorrect units.)

•  If a response is partially correct, tell the student that this is the case (preferably telling them what is right and what is wrong).

•  Check your questions carefully at each stage, and – especially important – get someone else to check them.

•  Monitor ‘real’ use of your questions and look at student responses to them. Check that your variants are of sufficiently similar difficulty. Be prepared to make improvements at this stage.
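The advice above about removing surplus text before checking a response can be sketched in code. This is purely illustrative (it is not OpenMark’s actual matching code): `clean_response`, `matches_numeric` and the regular expression are my own hypothetical names and patterns, showing one way a trailing precision statement such as ‘to 3 significant figures’ could be stripped so that an otherwise correct numerical answer is not marked wrong.

```python
import re

def clean_response(response: str) -> str:
    """Strip a surplus trailing precision statement, e.g. 'to 3 significant
    figures', from the end of a response before it is checked.
    (Illustrative sketch only -- not OpenMark's real implementation.)"""
    cleaned = re.sub(
        r"\s*\(?to \d+ (significant figures?|s\.?f\.?|decimal places?)\)?\s*$",
        "",
        response,
        flags=re.IGNORECASE,
    )
    return cleaned.strip()

def matches_numeric(response: str, expected: float, tol: float = 0.0) -> bool:
    """Accept the response if, after cleaning, it parses as a number
    within tol of the expected value."""
    try:
        value = float(clean_response(response))
    except ValueError:
        return False
    return abs(value - expected) <= tol

# A response that is numerically correct but carries surplus text:
print(matches_numeric("2.9 to 3 significant figures", 2.9))  # True
```

The same idea extends to accepting spelling variants: a checker could compare the cleaned, lower-cased response against a set of accepted forms such as {‘sulfur’, ‘sulphur’}.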

to two significant figures

Wednesday, April 27th, 2011

I know I keep banging on about the importance of monitoring your questions when they are ‘out there’, being used by students. If what follows appears to be a bit of a trick (and in a sense it is), it’s a trick with firm foundations – the monitoring of many thousands of student responses. (more…)

iCMA statistics

Tuesday, April 19th, 2011

This work was originally reported on the website of COLMSCT (the Centre for Open Learning of Mathematics, Science, Computing and Technology) – and other work was reported on the piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) website. Unfortunately, the whole of the OpenCETL website had to be taken down. The bare bones are back (and I’m very grateful for this) but the detail isn’t, so I have decided to start re-reporting some of my previous findings here. This has the advantage of enabling me to update the reports as I go.

I’ll start by reporting a project on iCMA statistics which was carried out back in 2009, with funding from COLMSCT, by my daughter Helen Jordan (now doing a PhD in the Department of Statistics at the University of Warwick; at the time she did the work she was an undergraduate student of mathematics at the University of Cambridge). Follow the link for Helen’s project report, but I’ll try to report the headline details here – well, as much as I can understand them! (more…)

Feedback after a correct answer

Monday, April 18th, 2011

OpenMark is set up to give students increasing feedback after each incorrect attempt at a question. After they have had [usually] three attempts they are given a ‘full answer’. The system is set up so that a student who gets the question right receives the same ‘full answer’. The screenshot above shows this feedback for a simple question.
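The behaviour described above can be sketched as a simple attempt-based rule. This is a hypothetical sketch of the policy, not OpenMark’s code: the function name, the three feedback levels, and the three-attempt limit are my own labels for the scheme the paragraph describes, including the detail that a correct answer and a final failed attempt both receive the same ‘full answer’.

```python
# Hedged sketch of attempt-based increasing feedback,
# assuming a maximum of three attempts (as is usual in OpenMark).
MAX_ATTEMPTS = 3

def feedback(attempt: int, correct: bool) -> str:
    """Return the feedback level shown after a given attempt (1-based).

    A student who answers correctly, or who runs out of attempts,
    receives the same 'full answer'; otherwise the feedback increases
    with each incorrect attempt.
    """
    if correct or attempt >= MAX_ATTEMPTS:
        return "full answer"
    elif attempt == 1:
        return "brief hint"
    else:
        return "fuller hint"

print(feedback(1, correct=False))  # brief hint
print(feedback(2, correct=False))  # fuller hint
print(feedback(1, correct=True))   # full answer
```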

Our approach raises a number of questions: (more…)