Four Colly Birds

Day 4. What gift was delivered on 28th December? Thinking about possible answers to this question should help you to identify the theme of today’s post, so perhaps I’ll stop writing now!

Some people say that the first day of Christmas is 25th December, leading to a 4th Day of 28th December and a 12th Day of 5th January. That’s the convention I’ve followed in this blog, but others would argue that the first day of Christmas is 26th December, leading to a 3rd Day of 28th December and a 12th Day of 6th January. I wouldn’t like to say which of these is ‘right’ – it depends on your culture. So the question as asked is ambiguous – are you being asked about the gift delivered on the 3rd or 4th Day of Christmas? Continue reading
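To make the ambiguity concrete, here is a minimal sketch in Python (the function name is mine, purely for illustration) showing how the two conventions map day numbers onto calendar dates:

```python
from datetime import date, timedelta

def day_of_christmas(day_number, first_day):
    """Return the calendar date of the given Day of Christmas,
    counting the supplied first_day as Day 1."""
    return first_day + timedelta(days=day_number - 1)

# Convention A: the first day of Christmas is 25th December
print(day_of_christmas(4, date(2012, 12, 25)))   # 2012-12-28
print(day_of_christmas(12, date(2012, 12, 25)))  # 2013-01-05

# Convention B: the first day of Christmas is 26th December
print(day_of_christmas(3, date(2012, 12, 26)))   # 2012-12-28
print(day_of_christmas(12, date(2012, 12, 26)))  # 2013-01-06
```

Both conventions deliver a gift on 28th December – they just disagree about which Day of Christmas that is.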

Posted in e-assessment, question writing, Twelve Days of Better eAssessment | Tagged , | Leave a comment

Three French Hens

Day 3. eAssessment design. I’ll start thinking about question type soon enough; in the meantime, this post is about how you run the questions. In OpenMark, usual practice is to give students multiple (usually three) attempts at each question, with increasingly directive feedback, as shown in the example below. After a first unsuccessful attempt, students are usually just told that they are wrong; after a second unsuccessful attempt they are given a more detailed hint, wherever possible tailored to the mistake that has been made; after a third unsuccessful attempt they are given a worked solution. Continue reading
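The escalating feedback described above can be sketched as follows – this is an illustrative Python model of the three-attempt pattern, not OpenMark’s actual implementation, and all names here are my own:

```python
def respond(is_correct, attempt, tailored_hint=None):
    """Feedback for an OpenMark-style question allowing three attempts,
    with increasingly directive feedback after each unsuccessful attempt."""
    if is_correct:
        return "Correct - well done."
    if attempt == 1:
        # First unsuccessful attempt: students are usually just told they are wrong.
        return "Your answer is incorrect. Please try again."
    if attempt == 2:
        # Second unsuccessful attempt: a more detailed hint, wherever
        # possible tailored to the mistake that has been made.
        return tailored_hint or "Incorrect. Hint: check your working and try again."
    # Third unsuccessful attempt: a worked solution.
    return "Incorrect. Here is a worked solution: ..."
```

The key design choice is that the feedback becomes more directive only as attempts are used up, so students are always given a chance to recover from a mistake themselves before being shown how to do it.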

Posted in e-assessment, Twelve Days of Better eAssessment | Tagged , | Leave a comment

Two Turtle Doves

Day 2. Integrate your eAssessment.  Integrating your eAssessment is desirable at various levels. The sorts of things I’m thinking about include:

1. Don’t just think about eAssessment as an add-on. We talk about alignment of teaching and assessment – surely the best way to make that happen is to plan the two at the same time.

2. Integrate your eAssessment throughout your module. Continue reading

Posted in e-assessment, Twelve Days of Better eAssessment | Tagged , | Leave a comment

A partridge in a pear tree

Day 1. Before you start… think!  What is the purpose of this assessment?  What are you trying to achieve? Is it assessment of learning (summative) or assessment for learning (formative)? Or perhaps it is diagnostic – providing information for your students and you about how well they are doing. How important are the feedback interventions that you provide for students? What are you trying to assess – knowledge or skills? Recall or problem solving? Continue reading

Posted in e-assessment, Twelve Days of Better eAssessment | Tagged , | Leave a comment

Think before you assess

As well as the reading that has sparked my recent posts on Learning Outcomes and Revolution or Evolution?, I’ve been reading articles about multiple-choice questions and about assessing practical work. I’m fairly sure that I’ll be saying more about both of those topics during 2013, if not sooner. But there’s a common theme. Honest, there is.

When you want to assess practical work, it can be very easy to assess the underlying subject knowledge rather than the practical skills, and there are decisions to be taken about whether you want to assess practical experimental skills or report-writing skills or both. If you choose to use multiple-choice assessment questions, irrespective of whether you like MCQs or loathe them per se, again it is sensible to stop and think about what you are actually assessing. An interesting paper that I read today:

Martinez, M.E. (1999) Cognition and the question of test item format. Educational Psychologist, 34(4), 207-218.

points out that ‘data bearing on the hypothesis of equivalence between constructed response and multiple-choice questions have been equivocal.’ Some people think the two types of questions can be equivalent, others disagree. For simple items, where a constructed response question is assessing recall, a multiple-choice question is assessing recognition. So although there may be correlation between scores obtained on tests comprising multiple-choice questions and tests comprising constructed response questions, the two tests are unlikely to be actually assessing the same thing.

My common theme is that you need to think carefully about WHAT you want to assess, and check that you are actually assessing this in the tasks that you produce for your students. And I think the easiest way to do this is to think in terms of learning outcomes. What is it that you hope your students have learnt: recall or recognition? Practical skills, report-writing skills or knowledge?

Posted in learning outcomes, multiple-choice questions, practical work | Tagged , , | Leave a comment

Revolution or evolution?

I seem to have taken part in a lot of discussions recently in which I, or others, have talked about the need for a real ‘shake-up’ of what we do in assessment. It is indeed depressing that we continue to talk about the problems, and yet we don’t seem to be able to do much better. I am very definitely not an expert, but I have read masses of papers written by those I consider to be experts, so I am somewhat embarrassed to admit that I am no longer really sure what assessment is for. I’ve heard all the arguments, but the more I read the more confused I become. And I suspect that, at least at institutional level, I am not alone in my confusion. I am no longer convinced that I know what we should alter in the ‘big shake-up’, and I fear that some of those who think they know what we should alter may be driven by rhetoric rather than by evidence.

In the absence of a revolution, I think we could make significant improvements by an evolutionary approach, i.e. by making a series of smaller changes to our practice. In my own context small changes might include more ‘little and often’ assessment, more use of oral feedback, and assessment that is designed in a coherent way throughout a student’s programme of study, with more opportunities for reflection and perhaps with tutors being able to see feedback provided by tutors on previous modules. Some of these little changes are quite big! Your ‘little changes’ would be different.

Evolution has to do with the survival of the fittest, and in educational terms this reminds me of the importance of evaluating each of the ‘little changes’ (and, in an ideal world, not making too many changes at once – hmmm) and only persevering with the change if it is proven to be effective. Then, step by step, we can work towards better assessment practice.

Posted in evaluation, quality | Tagged , | Leave a comment

Learning outcomes – love them or hate them?

I went to an excellent meeting yesterday, the next step in bringing ‘joined-up thinking’ to assessment in modules in our Physics and Astronomy and Planetary Sciences pathways. There are issues, not least that some of the modules are also used by other qualifications/pathways – and we don’t own all of them. But, as at the Faculty Assessment Day in October, it was lovely to be able to spend several hours discussing teaching with a room full of colleagues, and indeed the debate continued onto the bus to Milton Keynes Station at the end of the day!

The debate hinged around the use of learning outcomes. Continue reading

Posted in learning outcomes, programme-focused assessment, Uncategorized | Tagged , | 1 Comment

The Learning Thermometer

Last week I took part in another webinar in the Transforming Assessment series. This one was about ‘The Learning Thermometer’ and was given by Helen Stallman of the University of Queensland. There is more information at http://www.learningthermometer.com.au/.

Helen has an interest in students’ mental health and she has done research into the prevalence of ‘psychological distress’ in students, finding, not surprisingly, that students are more likely than the general population to be distressed. It is also not surprising that students’ studies are compromised by their distress – this leads to impaired attention, concentration and memory.

The Learning Thermometer is a web-based tool that students use to

  • Reflect upon their learning;
  • Get tailored feedback about strategies, resources, and support that might help them to do well in their subject;
  • Develop individual learning plans to optimise their success in the course.

Continue reading

Posted in mental health, webinars | Tagged , , , | Leave a comment

An experiment in the essay-type paper

The title of this post is the title of a paper I have just read. It was written in – wait for it – 1938. It’s a delightful little paper, but its findings are shocking. I came across it whilst valiantly trying to find good reasons for using multiple-choice questions (which, you will remember, are not my favourite type of question). However, it turns out that multiple-choice (‘objective’) questions were first used because of the lack of objectivity of human-marked essay-type questions. Continue reading

Posted in human marking | Tagged | Leave a comment

Deadlines

Whether you love or hate deadlines probably depends on whether you are currently struggling to meet one. Continue reading

Posted in deadlines | Tagged | 2 Comments