Archive for January, 2013

Feedforward and dialogue

Saturday, January 26th, 2013

Until this morning, I thought the term ‘feedforward’ was something that had been invented recently – indeed, I thought it had its origins around the time of the FAST (Formative Assessment in Science Teaching) Project and the now famous Gibbs & Simpson (2004-5) literature review, which was (I think! – I am now trying to be careful about what I attribute to whom) done as part of the scene setting for the FAST Project. I was certainly wrong about ‘feedforward’ – it turns out that the term was introduced by Mats Bjorkman in 1972 in the following paper:

Bjorkman, M. (1972) Feedforward and feedback as determiners of knowledge and policy: Notes on a neglected issue. Scandinavian Journal of Psychology, 13, 152-158.

‘Feedforward’ is a useful term in that it makes us remember that we should be looking forwards not backwards. However, as I have argued previously, ‘feedback’ in its use in science and engineering is NOT backward looking. I have shown a picture of our current cold and snowy weather because it is prettier than a picture of my central heating thermostat (please bear with my rather peculiar logic, which has led me to show this photo simply because central heating keeps us warm in cold weather). The thermostat uses information about the current temperature to control the temperature in the future. This is feedback (actually ‘negative feedback’, but let’s not go there!) – a process that uses information.
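
For anyone who prefers to see the idea spelt out, here is a minimal sketch in Python of the sort of negative feedback loop a thermostat implements (all of the numbers are invented for illustration): the temperature measured now decides whether the boiler is on, which in turn determines the temperature measured next.

    # A toy thermostat: the measured temperature is fed back to decide whether
    # the boiler is on, which in turn determines the next measured temperature.
    # All of the numbers below are made up for illustration.
    SETPOINT = 20.0      # target room temperature (degrees C)
    OUTSIDE = -2.0       # cold, snowy weather outside
    HEAT_GAIN = 0.8      # warming per time step while the boiler is on
    LOSS_RATE = 0.1      # fraction of the indoor/outdoor difference lost per step

    temperature = 15.0   # starting room temperature
    for step in range(30):
        boiler_on = temperature < SETPOINT          # feedback: act on what we measure now
        heating = HEAT_GAIN if boiler_on else 0.0
        loss = LOSS_RATE * (temperature - OUTSIDE)  # the room cools towards the outside
        temperature = temperature + heating - loss
        print(f"step {step:2d}: {temperature:5.2f} C, boiler {'on' if boiler_on else 'off'}")

The loop is driven by information about the present in order to shape the future – which is exactly the forward-looking sense in which I want ‘feedback’ to be understood.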

So I don’t really like the term ‘feedforward’. I don’t think we need it. Let’s just remember that assessment feedback needs to be a process, most significantly a process that involves the student as well as the teacher, not just a flow of information from the teacher to the student. Then it will be forward looking. This is entirely in line with a mass of recent literature that argues in favour of assessment as a dialogic process.

Diagnostic quizzes

Friday, January 25th, 2013

Following on from the discussion with Tim on my previous post, it occurs to me that our online quizzes that consistently attract the highest number of users are our diagnostic ‘Are you ready for?’ quizzes. Since it opened in April 2012, more than 6,000 people have completed ‘Are you ready for S104?’ and more than 13,000 others have started but not completed this quiz. It’s open to the world, so take a look – are you ready? Have you got sufficient time for study?

I wouldn’t claim that everyone who clicks on the link is seriously contemplating Open University study, but a lot are, and we have a huge amount of evidence that many students have found previous versions of this quiz to be useful, either informing them that they need to do more preparation before beginning their studies, or – just as important – reassuring them that they are ready.

And yet, I don’t think we make as much use of diagnostic quizzes as we should. It is all too easy to sign up for a qualification without any real understanding of the time required or the level of study. I think that we should be making engagement with a quiz of this type mandatory, as part of the registration process.

Maths for Science assessment – the first 10 years

Sunday, January 20th, 2013

Ten years ago, at the end of January 2003, 186 Open University students submitted the second online interactive end-of-module assignment (EMA) for my little 10-credit baby, S151 Maths for Science (57 brave souls had submitted the first EMA, back in October 2002). Since then, something between 14,000 and 15,000 students have taken Maths for Science. The new edition of the book has now been co-published with Oxford University Press, and it is currently being used by 315 students on the new version of S151 and also by 396 students as part of the new 30-credit module S141 Investigative and mathematical skills in science, compulsory in the latest version of our BSc Hons Natural Sciences. The new EMA for S151 opened for the first time last Monday and the iCMA for S141 (one component of the assessment that assesses the content of Maths for Science) closed last Wednesday. It was Maths for Science that got me into eAssessment in the first place, so this seems like a pretty good moment to stop and reflect on the past 10 years.

In common, I suspect, with many other bloggers, I’m a reflective practitioner. Indeed, I am so reflective that I sometimes find myself paralysed by indecision about the right way forward. I am aware that some see online interactive assessment as a fundamentally bad thing, reducing learning to low-level learning outcomes and giving extrinsic rather than intrinsic feedback. That upsets me, but I can see their point, in particular when coupled with the generally held view that online assessment = multiple-choice quiz. I have argued strongly in this blog that there is a need (and, increasingly, an ability) to do eAssessment better, and I don’t see eAssessment as a panacea – I prefer S141’s assessment strategy to S151’s, because the former combines online assessment with other forms of assessment, with each used when it is most appropriate.

And, when all is said and done, 10 years and more than 14,000 students on, S151 and its online assessment strategy have worked. The module has been highly successful by just about every performance measure, and students like the instantaneous targeted feedback with a chance to have another go. The system has also provided me with extremely useful information about what students get wrong, with an indication of why. Our large student numbers give us the edge here – when the assignment is summative, so students are trying, and equivalent errors appear in each variant of a question, you know you’re onto something. I’ve been able to improve our teaching as a result.
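
The real reports come out of the assessment system itself, but a hypothetical sketch in Python of the kind of tallying involved might look like this (the response records, field names and the error shown are all invented):

    from collections import Counter, defaultdict

    # Hypothetical response records: (question variant, response given, correct?).
    responses = [
        ("variant_a", "6.0 x 10^23", True),
        ("variant_a", "6.0 x 10^-23", False),   # invented example: sign error in the exponent
        ("variant_b", "3.0 x 10^23", True),
        ("variant_b", "3.0 x 10^-23", False),   # the same error, different numbers
        # ... thousands more in practice
    ]

    wrong_by_variant = defaultdict(Counter)
    for variant, response, correct in responses:
        if not correct:
            wrong_by_variant[variant][response] += 1

    # If an equivalent error tops the list for every variant, it points to a
    # genuine misconception rather than a one-off slip.
    for variant, counts in wrong_by_variant.items():
        print(variant, counts.most_common(3))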

So, in the broadest sense of the word, Maths for Science has been a formative experience. I’d like to thank those without whom the book, the module and the online assessment would never have happened. Ten years ago we were only just able to assume that enough of our students would have access to the internet (and some still don’t, in particular our offender learners). We were breaking new ground, all in order to give our students instantaneous and meaningful feedback. Since then we’ve evaluated, improved, disseminated – and learnt from others. Thank you.

Twelve Pipers Piping

Sunday, January 6th, 2013

Day 12. Monitor and improve. One of the advantages of eAssessment (whatever you mean by that term) is the ability to monitor actual student behaviour. This is not the same as what students say they do, nor is it the same as student opinion. Student opinion is important of course, but there are real inconsistencies between what students say they do, what they say they want, and what they actually do. The best documented of these inconsistencies is the fact that students ask for more ‘feedback’ whilst not collecting their marked work. I could write at length on the reasons, and possible reasons, for this – but for now, let’s concentrate on actual student behaviour. (more…)

Eleven Drummers Drumming

Saturday, January 5th, 2013

Day 11. A broader definition of eAssessment. I can clearly remember the day, around 5 years ago, when I felt rather small at a conference because I was talking about my work with short-answer free text questions, and realised that everyone else was using the term ‘eAssessment’ to mean something rather different. The irony is that I’ve always had far broader interests in assessment – I come from a tutoring background and my current ‘day job’, as an Open University staff tutor, means that I spend a lot of time getting our ‘correspondence tuition’ (marking and commenting on tutor-marked assignments) as good as possible. I have also been involved in the assessment of wikis and tutor group forum discussions. But somehow I didn’t see this as eAssessment… (more…)

Ten Lords a-Leaping

Friday, January 4th, 2013

Day 10. Check, check and check again. So you’ve chosen your question type, written your question and feedback and constructed your answer-matching. You may even have put your questions together into a quiz or interactive computer-marked assignment (iCMA) of some sort. So now it’s ready to go to students? WRONG. Checking eAssessment questions and whole quizzes is vitally important and this post considers things such as when to do the checking and who should do it. (more…)

Nine Ladies Dancing

Thursday, January 3rd, 2013

Day 9. Using STACK to write questions to assess maths. There may or may not be nine ladies dancing, but there is certainly one female blogger who would happily do a little dance around her house this evening. I’ve managed to prise out another few hours this afternoon to write another STACK question and I am oh so happy with the result. I feel a bit embarrassed to be writing about STACK and, yes, it did take me several hours to write one question. I am a real beginner, standing on the shoulders of giants (Chris, Tim, Phil – that’s you). In writing STACK questions I am having to learn STACK, learn Maxima, learn Moodle, learn LaTeX, and – as I get onto questions that assess rather more maths than basic differentiation – I will have to dig deep into my memory (and revise from the module materials which the questions are designed to support) from the days, something approaching 40 years ago, when I might have described myself as a mathematician. But I am loving every minute of it. (more…)
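
I won’t reproduce any STACK or Maxima code while I am still such a beginner, but to give a flavour of what a computer algebra system does for you when marking maths, here is a rough sketch in Python with SymPy – this is emphatically not STACK syntax, and the question is invented; it simply illustrates checking a student’s derivative for algebraic equivalence:

    import sympy as sp

    x = sp.symbols('x')
    # Invented example: the question asks for the derivative of x^3 sin(x).
    model_answer = sp.diff(x**3 * sp.sin(x), x)

    def check_response(response_text):
        """Return True if the student's expression is equivalent to the model answer."""
        student = sp.sympify(response_text)
        # simplify(difference) == 0 accepts algebraically equivalent forms,
        # e.g. expanded versus factorised answers.
        return sp.simplify(student - model_answer) == 0

    print(check_response("3*x**2*sin(x) + x**3*cos(x)"))   # True
    print(check_response("x**2*(3*sin(x) + x*cos(x))"))    # True (equivalent form)
    print(check_response("3*x**2*cos(x)"))                 # False

STACK hands this sort of equivalence checking to Maxima, which is precisely what makes the hours of authoring worthwhile.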

Eight Maids a-Milking – part 2

Wednesday, January 2nd, 2013

Day 8b. Using PMatch for short-answer free-text questions. As promised, I’d like to give you an example of the answer matching for a real question, based on real student responses. (more…)
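
As a very rough illustration of the general idea only – this is not PMatch’s actual syntax, and the question and rules below are invented – a keyword-and-pattern matcher for a short free-text answer might look something like this in Python:

    import re

    # Invented example question: "Why does the pressure of a fixed amount of gas
    # in a sealed container increase when it is heated?" The matcher looks for the
    # required ideas, with some simple synonyms, in whatever order the student
    # expresses them. Real PMatch rules are far more sophisticated than this.
    RULES = {
        "molecules/particles":    r"\bmolecul\w*|\bparticle\w*",
        "moving faster":          r"\b(move|moves|moving)\b.*\bfaster\b",
        "hitting the walls more": r"\b(hit|hits|collide|collisions?)\b.*\b(more|harder|often)\b",
    }

    def mark(response: str) -> bool:
        text = response.lower()
        return all(re.search(pattern, text) for pattern in RULES.values())

    print(mark("The molecules move faster and hit the walls more often"))  # True
    print(mark("It gets hotter"))                                          # False

The real answer matching has to cope with spelling, synonyms and word order in genuine student responses, which is where PMatch earns its keep.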

Eight Maids a-Milking – part 1

Tuesday, January 1st, 2013

Day 8a. Free-text marking of short-answer questions. I’ve already talked about the automatic marking of short answers quite a lot on this blog, so today, the first day of 2013, I’d like to do two slightly different things: (1) set our work in this area in context; (2) give an example of the actual PMatch answer matching that we use for one question. Actually, we’ve been out walking and I wanted to get my other blog written up, so I will only do (1) today, letting the ‘Twelve Days’ slip and ending on 6th January (which some people consider to be the 12th Day of Christmas in any case) rather than 5th. A clever solution, and another reminder that assessment questions need to be unambiguous (not like ‘What date is the 12th Day of Christmas?’) and to mark all correct answers as correct and all incorrect answers as incorrect – that is very important for short-answer free-text questions too. (more…)