Category Archives: e-assessment
Repeated and blank responses
The figure shown on the left requires a bit of explaining. The three columns represent student responses at 1st, 2nd and 3rd attempt to a short-answer free-text question in formative use. Green represents correct responses; red/orange/yellow represent incorrect responses. The reason I’ve used different … Continue reading
Posted in e-assessment, student engagement
Tagged blank responses, repeating of responses, student engagement
1 Comment
More on feedback
Picking up on Silvester’s comment on my previous post… I think it is really important that we stop and think before saying that a student’s answer to an e-assessment question is wrong because some detail of it is wrong. As with … Continue reading
Thank you Google Translate
As you’ve probably guessed, I’m English, and aside from a pitifully small amount of French, I am ashamed to admit that I don’t speak any other languages. Therefore I think Google Translate is wonderful, especially since it enables me to … Continue reading
Units: little things that make a difference
If we start from the premise that we want assessment to encourage and support learning, then one measure of the assessment’s effectiveness is better performance on later summative tasks. Mundeep Gill and Martin Greenhow (Gill, M. and Greenhow, M. (2008) … Continue reading
Peerwise
Also at the ‘More effective assessment and feedback’ meeting on Wednesday, Simon Bates spoke about the use of ‘Peerwise’ at the University of Edinburgh. Peerwise (see http://peerwise.cs.auckland.ac.nz/) enables students to write their own assessment questions, and to share and discuss … Continue reading
Posted in e-assessment, peer assessment, Peerwise
1 Comment
Computers as social actors
Some of the findings I’ve been blogging about recently (and some still to come) are contradictory. On the one hand students seem to be very aware that their answers have been marked by a computer not a human-marker, but in … Continue reading
Posted in Computers as Social Actors, e-assessment, feedback
Tagged Computers as Social Actors, e-assessment, feedback
1 Comment
Trying our questions
With apologies, I thought I had put links on this blog to some Open University eAssessment questions, but this was not the case. I’ll add these links to the ‘E-assessment at the Open University’ link (right-hand side of blog) in … Continue reading
Posted in e-assessment
Tagged iCMAs, Open University, OpenMark, short-answer free text
Leave a comment
How students react to feedback from a computer
Returning to Lipnevich and Smith’s interesting work (Lipnevich, A.A. & Smith, J.K. (2009) “I really need feedback to learn:” students’ perspectives of the differential feedback messages, Educational Assessment Evaluation & Accountability, 21, 347-367). And for the benefit of those who attended the session … Continue reading
Writing good interactive computer-marked assessment questions
I run a lot of workshops trying to help colleagues to write good e-assessment questions. There are usually lots of brilliant ideas in the workshop, but somehow we end up slipping back into using lots of multiple choice questions because … Continue reading
Good posters at ALT-C
I was bowled over by two posters at ALT-C: Matt Haigh’s ‘Changing the way we see test-items in a computer-based environment: screen design and question difficulty’ (session 096) and Silvester Draaijer’s ‘Design of a question design support tool’ (session 148). Both … Continue reading
Posted in e-assessment
Tagged ALT-C 2010, e-assessment, Matt Haigh, Silvester Draaijer, student engagement, writing questions
1 Comment