Author Archives: Sally Jordan

to two significant figures

I know I keep banging on about the importance of monitoring your questions when they are ‘out there’, being used by students. If what follows appears to be a bit of a trick (and in a sense it is), it’s a trick with … Continue reading

Posted in Computers as Social Actors, question analysis, significant figures | Leave a comment

iCMA statistics

This work was originally reported on the website of COLMSCT (the Centre for Open Learning of Mathematics, Science, Computing and Technology) – and other work was reported on the piCETL (the Physics Innovations Centre for Excellence in Teaching and … Continue reading

Posted in e-assessment, statistics, student engagement | 1 Comment

Feedback after a correct answer

OpenMark is set up to give students increasing feedback after each incorrect attempt at a question. After they have had [usually] three attempts they are given a ‘full answer’. The system is set up so that a student who gets … Continue reading
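A minimal sketch of that escalating-feedback idea follows. This is not OpenMark's actual code; the attempt limit, messages and function names are purely illustrative, showing only the pattern of more detailed feedback after each incorrect attempt and a full answer after the last one.

```python
MAX_ATTEMPTS = 3  # OpenMark questions usually allow three attempts

# Feedback shown after the 1st and 2nd incorrect attempts (illustrative wording only)
FEEDBACK_LEVELS = [
    "Incorrect. Please try again.",
    "Incorrect. Hint: check the precision and units of your answer.",
]

def respond(is_correct: bool, attempt: int) -> str:
    """Return the feedback a student sees after a given attempt (1-based)."""
    if is_correct:
        return "Correct, well done."
    if attempt >= MAX_ATTEMPTS:
        # After the final allowed attempt the student sees the full worked answer
        return "The full worked answer is: ..."
    return FEEDBACK_LEVELS[attempt - 1]

# Example: a student who is wrong twice and then runs out of attempts
for n in (1, 2, 3):
    print(respond(False, n))
```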

Posted in e-assessment, excellent students, feedback, feedback after correct answer | 4 Comments

Feedback for excellent students

I’ve heard and read several things recently about the fact that excellent students tend to get less feedback than others. This is perhaps because (anecdotally at least) teachers sometimes ignore excellent students – they’ll do OK whatever, so … Continue reading

Posted in adaptive questions, excellent students | 2 Comments

Repeated and blank responses

The figure shown on the left requires a bit of explaining. The three columns represent student responses at 1st, 2nd and 3rd attempt to a short-answer free-text question in formative use. Green represents correct responses; red/orange/yellow represent incorrect responses. The reason I’ve used different … Continue reading
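For anyone curious how the raw numbers behind a figure like that might be tallied, here is a hypothetical sketch; the category labels and example data are invented for illustration and are not the real response data.

```python
from collections import Counter

def tally_by_attempt(responses):
    """responses: iterable of (attempt_number, category) pairs.
    Returns a dict mapping attempt number to a Counter of categories."""
    counts = {}
    for attempt, category in responses:
        counts.setdefault(attempt, Counter())[category] += 1
    return counts

# Invented example data: one (attempt, category) pair per response
example = [(1, "correct"), (1, "incorrect"), (2, "repeated"),
           (2, "blank"), (3, "correct")]
print(tally_by_attempt(example))
```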

Posted in e-assessment, student engagement | 1 Comment

Can formative-only assignments tell you about student misunderstandings?

When I was doing the original analysis of student responses to Maths for Science assessment questions, I concentrated on questions that had been used summatively and also on questions that required students to input an answer rather than selecting a … Continue reading

Posted in mathematical misunderstandings, question analysis, student engagement | Leave a comment

More on feedback

Picking up on Silvester’s comment on my previous post… I think it is really important that we stop and think before saying that a student’s answer to an e-assessment question is wrong because some detail of it is wrong. As with … Continue reading

Posted in e-assessment, feedback, mathematical misunderstandings | Leave a comment

More about units

OpenMark e-assessment questions were used for the first time in a little 10-credit module called Maths for Science that has been running since 2002. I did some analysis years ago into the mistakes that students make, but I’m about to … Continue reading

Posted in mathematical misunderstandings, question analysis, units | 3 Comments

Thank you, Google Translate

As you’ve probably guessed, I’m English, and aside from a pitifully small amount of French, I am ashamed to admit that I don’t speak any other languages. Therefore I think Google Translate is wonderful, especially since it enables me to … Continue reading

Posted in e-assessment, links | 1 Comment

Units: little things that make a difference

If we start from the premise that we want assessment to encourage and support learning, then one measure of the assessment’s effectiveness is better performance on later summative tasks. Mundeep Gill and Martin Greenhow (Gill, M. and Greenhow, M. (2008) … Continue reading

Posted in e-assessment, units | 2 Comments