Monthly Archives: February 2011
Peerwise
Also at the ‘More effective assessment and feedback’ meeting on Wednesday, Simon Bates spoke about the use of ‘Peerwise’ at the University of Edinburgh. Peerwise (see http://peerwise.cs.auckland.ac.nz/) enables students to write their own assessment questions, and to share and discuss … Continue reading
Posted in e-assessment, peer assessment, Peerwise
1 Comment
Is it worth the effort?
I’m taking a short break from reporting findings from my analysis of student engagement with short-answer free-text questions to reflect on a couple of things following the HEA UK Physical Sciences Centre workshop on ‘More effective assessment and feedback’ at … Continue reading
Helpful and unhelpful feedback: a story of sandstone
One of the general findings that is coming out of my evaluation of student responses to multi-try e-assessment questions relates to that wonderful thing that I’ll call the ‘Law of unintended consequences’. I used to think that ‘students don’t read … Continue reading
Computers as social actors
Some of the findings I’ve been blogging about recently (and some still to come) are contradictory. On the one hand, students seem to be very aware that their answers have been marked by a computer rather than a human marker, but in … Continue reading
Posted in Computers as Social Actors, e-assessment, feedback
Tagged Computers as Social Actors, e-assessment, feedback
1 Comment
decease or decrease?
Back in 2007, we were observing students attempting our short-answer free-text e-assessment questions in a Usability Laboratory. One student repeatedly typed ‘decease’ instead of ‘decrease’ and he didn’t realise he was doing it. At the time, the answer matching was linguistically … Continue reading
Spelling mistakes in student responses to short-answer free-text e-assessment questions
I get asked a lot about how the answer matching copes with poorly spelt responses to our short-answer free-text questions, and this is certainly something that used to worry me. Fortunately all the evidence is that our answer matching has coped remarkably well with poor … Continue reading
Challenging my own practice
The JISC ‘From challenge to change’ workshop yesterday (see previous post) started with an invitation to record aspects of assessment and feedback provision in our own context that we felt to be strengths or remained a challenge. I was there … Continue reading
Posted in assessment design
Tagged assessment for learning, e-assessment, iCMAs, JISC, short-answer free text, TMAs
Leave a comment
Challenging received wisdom
Our case study ‘Designing interactive assessments to promote independent learning’ from the JISC guide Effective Assessment in a Digital Age featured at the JISC Birmingham Assessment Workshop ‘From challenge to change’ yesterday, so I was speaking at the workshop. These … Continue reading