The aim of this project was to develop the academic and pedagogic basis of the OpenMark eAssessment system and to develop interactive computer marked assignments (iCMAs) for S104: Exploring Science. The OpenMark system, previously used for formative and summative assessment in S151: Maths for Science, enables students to be provided with instantaneous, targeted and relatively detailed feedback on their work. iCMA questions were also developed for S154: Science Starts Here and the diagnostic quiz ‘Are you ready for level 1 science?’.
Development of OpenMark iCMAs for S104
S104: Exploring Science is the Science Faculty’s 60 point level one science course, introducing students to Earth science, physics, chemistry and biology and developing mathematical, communication and practical skills. S104 had its first presentation in February 2008 and since then has run in two presentations per year, with 1500-2000 students per presentation. The content of S104 draws heavily on its predecessor, S103, but its tuition and assessment strategies are very different.
S104’s assessment strategy includes the following components:
- Seven tutor marked assignments (TMAs), marked against learning outcomes and with optional eTMA submission.
- Eight interactive computer marked assignments (iCMAs) (summative but low stakes), each containing 10 questions.
- A synoptic component comprising a written end of course assignment (ECA) and a longer iCMA49 (25 questions), with questions ranging across the course.
S104's iCMAs are credit-bearing because the course team wanted students to engage with them in a meaningful way. However, their purpose is to provide instantaneous feedback and to help students to pace their study. The iCMA questions use the OpenMark e-assessment system, which enables us to provide students with multiple attempts at each question, with an increasing amount of instantaneous feedback after each attempt. The student can learn from the feedback and use it to correct their answer. Wherever possible the feedback is tailored to the student’s misunderstanding. S104's iCMAs make use of the full range of OpenMark question types, including free text entry of numbers, letters and single words as well as hot-spot, drag and drop, multiple-choice and multiple-response. We have also included a few questions requiring free-text answers of up to a sentence in length (these questions are described in detail at the related project page 'eAssessment questions with short free text responses: a natural language processing approach'). Further information about OpenMark is given on the OpenMark Examples website (http://www.open.ac.uk/openmarkexamples/index.shtml).
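The multiple-attempt, tiered-feedback pattern described above can be sketched in a few lines of Python. This is purely illustrative: the class, method names and scoring are invented for this sketch and do not reflect the real OpenMark API.

```python
# Hypothetical sketch of a question offering several attempts, with
# progressively more detailed feedback after each incorrect answer.
# Names and structure are illustrative only, not the OpenMark API.

class TieredQuestion:
    """A question allowing multiple attempts; each wrong answer
    releases the next, more detailed, tier of feedback."""

    def __init__(self, correct_answer, feedback_tiers, max_attempts=3):
        self.correct_answer = correct_answer
        self.feedback_tiers = feedback_tiers  # one hint per failed attempt
        self.max_attempts = max_attempts
        self.attempts_used = 0

    def submit(self, answer):
        """Return (is_correct, feedback). Feedback grows more detailed
        with each wrong attempt, until the answer is finally revealed."""
        self.attempts_used += 1
        if answer == self.correct_answer:
            return True, "Correct!"
        if self.attempts_used < self.max_attempts:
            return False, self.feedback_tiers[self.attempts_used - 1]
        return False, f"The correct answer is {self.correct_answer}."


q = TieredQuestion(
    correct_answer="9.8",
    feedback_tiers=[
        "Not quite. Remember the units are m/s^2.",
        "Think about the acceleration due to gravity at the Earth's surface.",
    ],
)
print(q.submit("9.1"))  # first attempt: brief hint
print(q.submit("9.8"))  # second attempt: correct
```

In practice the feedback would also, wherever possible, be matched to the particular misunderstanding revealed by the student's answer rather than being a fixed sequence of hints.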
Each S104 iCMA opens to students approximately two weeks before they are due to start reading a particular book of the course. Initially the iCMAs closed 1-2 weeks after students were due to move on to the next book, which placed each iCMA cut-off date after the cut-off date for the TMA assessing the same book. As a result, many students did not start the iCMA until after they had completed the TMA and moved on to the next book, and it was felt that the pacing function of e-assessment was not being used to the full. iCMA cut-off dates are now just two days after the students are timetabled to move on to the next book, with the TMA due dates a further two days later.
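The revised scheduling rule is simple date arithmetic: the iCMA closes two days after the book-transition date, and the TMA is due two days after that. A minimal sketch (the function name and the sample date are invented for illustration):

```python
# Hypothetical sketch of the revised iCMA/TMA scheduling described above:
# iCMA cut-off = book-transition date + 2 days; TMA due 2 days later.

from datetime import date, timedelta

def assessment_dates(next_book_start: date):
    """Return (icma_cutoff, tma_due) for a given book-transition date."""
    icma_cutoff = next_book_start + timedelta(days=2)
    tma_due = icma_cutoff + timedelta(days=2)
    return icma_cutoff, tma_due

icma, tma = assessment_dates(date(2009, 4, 6))
print(icma, tma)  # 2009-04-08 2009-04-10
```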
OpenMark sits within Moodle and scores are reported to the student and their tutor via StudentHome and TutorHome respectively. S104 has its own ‘iCMA Guide’ and associate lecturers are given additional advice via the S104 Tutor Forum.
Each presentation of S104 uses 105 questions (10 each for iCMAs 41-48 and 25 for iCMA49), with some re-use of questions between presentations. Wherever possible at least five different variants of each question are provided, to act as an anti-plagiarism device, and sometimes more variants are provided to enable the same basic question to be used, for example, in iCMA43 on one presentation of the course and in iCMA49 of the following presentation. Each OpenMark question was written by an academic member of the Course Team, programmed by a media developer and checked by both the academic author and an experienced consultant.
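One way multiple variants can act as an anti-plagiarism device is to assign each student one variant deterministically, so that neighbouring students are unlikely to see identical numbers or wording, yet a given student always sees the same variant on revisiting the question. The following sketch is hypothetical; the hashing approach and names are invented for illustration and are not how OpenMark itself allocates variants.

```python
# Hypothetical sketch: deterministic assignment of one of several
# question variants to each student (illustrative only, not OpenMark's
# actual allocation mechanism).

import hashlib

def pick_variant(student_id: str, question_id: str, n_variants: int) -> int:
    """Map a (student, question) pair to a variant index in a stable,
    roughly uniform way, so different students are spread across
    the variants but each student always sees the same one."""
    digest = hashlib.sha256(f"{student_id}:{question_id}".encode()).hexdigest()
    return int(digest, 16) % n_variants

print(pick_variant("A1234567", "icma43_q2", 5))
```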
A series of workshops was run to train S104 course team members in the authoring of OpenMark questions. These included advice on writing unambiguous questions linked to appropriate learning outcomes, writing appropriate feedback for students, specifying answer matching, ensuring that as many questions as possible are fully accessible (i.e. have versions that can be read by a screen-reader) and checking questions. The basic ‘Writing OpenMark questions’ workshop has since been repeated for other course teams and a guide ‘Good practice in the academic authoring of OpenMark questions’ has also been produced.
Development of OpenMark iCMAs for S154
A bank of 66 OpenMark questions was written in the spring of 2007 for use in various contexts, including the reinforcement and assessment of basic knowledge and skills, especially mathematical skills, developed in the 10 point course S154 : Science Starts Here (first presentations October 2007 and March 2008, with 500-1000 students per presentation). An additional 15 new questions were written prior to the October 2008 presentation of S154, supplemented by two free-text short-answer questions first written for S103 and one S104 question reused with the permission of the author. The questions in the bank have been written with up to 20 different variants each, to enable their use in different places (including S154’s formative and summative iCMAs, ‘Are you ready for level 1 science?’ and the iCMA that accompanies the Maths Skills ebook) and also to provide multiple variants in each iCMA in which they are used. In formative use, multiple variants provide students with more opportunities for practice.
S154 had not initially intended to make summative use of OpenMark questions. However, mounting evidence that students frequently attempt only the first question or two of formative-only iCMAs led to a course team decision to have two short summative iCMAs, one assessing Chapters 2-4 and one assessing Chapters 6-9. At the same time, the focus of S154 on underpinning mathematical skills means that some students will require a lot of practice, so there is also a ‘Practice iCMA’. Students are encouraged to engage with the Practice iCMA regularly, with reminders given on the course website every week and at the end of each chapter in the course book.
‘Are you ready for level 1 science?’
32 of the questions from the bank of mathematical questions described in the previous section have also been used (with different variants) in the diagnostic quiz ‘Are you ready for level 1 science?’, which has been available to prospective students of level 1 Science Faculty courses (including S104, S154, SDK125 and Science Short Courses) since April 2007. These questions are offered alongside 6 questions on English and study skills and a number of advice screens.
An OpenMark-based diagnostic quiz has been used because it forces people to engage actively with the questions, rather than reading the answers before attempting the questions and then assuming that they could easily have obtained the correct answers for themselves. There is some evidence of this sort of behaviour with the printed and .pdf versions of the other Science Faculty ‘Are you ready for?’ quizzes, and the level 2 courses S207, S205 and S204 have had interactive ‘Are you ready for?’ quizzes for some time.
The other reason why OpenMark is appropriate for the ‘Are you ready for level 1 science?’ quiz is that it enables people to be guided through a complex maze of possible routes, depending on their aspirations and the amount of time they have available, in such a way that the quiz does not appear too complex or too long to prospective students. ‘Are you ready for level 1 science?’ is actually three interlinked iCMAs. After a very short ‘introductory quiz’, students are guided either to a ‘basic quiz’ (to assess their preparedness for S154, SDK125 or entry level Science Short Courses) or to a quiz designed specifically to assess their preparedness for S104. Within the S104 quiz, some of the questions (on arithmetical rules of precedence, negative numbers and fractions, decimals, ratios and percentages) are deemed ‘essential’, because these topics are not re-taught in S104, whereas other questions (on topics which are re-taught, albeit sometimes rather briefly, in S104, for example the use of scientific notation) are classified as ‘desirable’ for S104. Prospective students are advised that they will be able to complete the early books of S104 more quickly if they are already familiar with some of these topics.
Much of the work of this project has been carried out in conjunction with colleagues on the S154 and S104 course teams, in particular Linda Fowler, Ruth Williams and Valda Stevens. Development of the OpenMark questions for S154 and ‘Are you ready for level 1 science?’ would not have been possible without the assistance of Greg Black (Learning and Teaching Solutions). Greg has also provided invaluable guidance on possibilities offered by the rapidly developing technologies, as has Phil Butcher (COLMSCT Teaching Fellow and leader of the VLE eAssessment Project Team).
Examples of the question types we have used for S104, S154 and ‘Are you ready for S104?' can be seen by accessing ‘Are you ready for S104?’ from www.open.ac.uk/science/courses-qualifications/are-you-ready-for-science/interactive-materials/exploring-science.php
More information about the OpenMark eAssessment system and the range of question types available can be viewed on the OpenMark Examples website (http://www.open.ac.uk/openmarkexamples/index.shtml).
Examples of short free-text questions developed in the related COLMSCT project 'eAssessment questions with short free-text responses: a natural language processing approach' can be seen at students.open.ac.uk/openmark/omdemo.pm2009/
Reviews of the literature (e.g. Black and Wiliam, 1998; Gibbs and Simpson, 2004) have identified conditions under which assessment appears to support and encourage learning. Several of these conditions concern feedback, but the provision of feedback does not in itself lead to learning. Sadler (1989) argues that in order for feedback to be effective, action must be taken to close the gap between the student’s current level of understanding and the level expected by the teacher. It follows that, in order for assessment to be effective, feedback must not only be provided, but also understood by the student and acted on in a timely fashion.
These points are incorporated into five of Gibbs and Simpson’s (2004) eleven conditions under which assessment supports learning:
Condition 4: Sufficient feedback is provided, both often enough and in enough detail;
Condition 6: The feedback is timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance;
Condition 8: Feedback is appropriate, in relation to students’ understanding of what they are supposed to be doing;
Condition 9: Feedback is received and attended to;
Condition 11: Feedback is acted upon by the student.
It can be difficult and expensive to provide students with sufficient feedback (Condition 4), especially in a distance-learning environment, where opportunities for informal discussion are limited. Feedback on tutor-marked assignments is valuable but may be received too late to be useful (Condition 6), and it is then difficult for students to understand and act upon it (Conditions 8 and 11), even assuming that they do more than glance at the mark awarded (Condition 9).
One possible solution to these dilemmas is to use e-assessment. Feedback can be tailored to students’ misconceptions and delivered instantaneously and, provided the assessment system is carefully chosen and set-up, students can be given an opportunity to learn from the feedback whilst it is still fresh in their minds, by immediately attempting a similar question or the same question for a second time, thus closing the feedback loop. Distance learners are no longer disadvantaged — indeed the system can emulate a tutor at the student’s elbow (Ross et al., 2006, p.125) — and ‘little and often’ assessments can be incorporated at regular intervals throughout the course, bringing the additional benefits of assisting students to pace their study and to engage actively with the learning process, thus encouraging retention. For high-population courses, e-assessment can also deliver savings of cost and effort. Finally, e-assessment is the natural partner to the growth industry of e-learning.
However, opinions of e-assessment are mixed and evidence for its effectiveness is inconclusive; indeed e-assessment is sometimes perceived as having a negative effect on learning (Gibbs, 2006). Murphy (2008) reports that high stakes multiple-choice tests of writing can lead to actual writing beginning to disappear from the curriculum; she also reports that ‘the curriculum begins to take the form of the test’. There are more widely voiced concerns that e-assessment tasks (predominantly but not exclusively multiple-choice) can encourage memorisation and factual recall and lead to surface learning, far removed from the tasks that will be required of learners in the real world (Nicol, 2007; Scouller and Prosser, 1994). Also, although multiple-choice questions are in some senses very reliable, doubt has been expressed as to whether they always assess what the teacher believes they are assessing, partly because multiple-choice questions require ‘the recognition of the answer rather than the construction of a response’ (Nicol, 2007).
Ashton and her colleagues (2006) point out that the debate about the effectiveness of multiple-choice questions ‘diverts focus away from many of the key benefits that online assessment offers to learning’. Perhaps the question we should be asking is not ‘should we be using e-assessment?’ but rather ‘what are the features of an effective e-assessment system?’ (Mackenzie, 2003).
Ashton, H.S., Beevers, C.E., Milligan, C.D., Schofield, D.K., Thomas, R.C. and Youngson, M.A. (2006). Moving beyond objective testing in online assessment, in S.C. Howell and M. Hricko (eds) Online assessment and measurement: case studies from higher education, K-12 and corporate. Hershey, PA: Information Science Publishing: 116-127.
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5, 1, 7-74.
Gibbs, G. (2006). Why assessment is changing. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge, pp. 11-22.
Mackenzie, D. (2003). Assessment for e-learning: what are the features of an ideal e-assessment system? 7th International CAA Conference, Loughborough, UK. At http://www.caaconference.com/pastConferences/2003/procedings/index.asp
Murphy, S. (2008). Some consequences of writing assessment. In A. Havnes and L. McDowell (eds), Balancing Dilemmas in Assessment and Learning in Contemporary Education. London: Routledge: 33-49.
Nicol, D.J. (2007). E-assessment by design: using multiple choice tests to good effect. Journal of Further and Higher Education, 31, 1, 53–64.
Ross, S.M., Jordan, S.E. & Butcher, P.G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge, pp. 123-131.
Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.
Scouller, K.M. & Prosser, M. (1994). Students’ experiences of studying for multiple choice question examinations. Studies in Higher Education, 19, 3, 267–279.
Evaluation of S104 and S154’s use of iCMA questions and of the ‘Are you ready for level 1 science?’ quiz formed part of the larger related project 'Analysis of the impact of iCMAs on student learning'.
This project considered different models of iCMA use in order to determine:
- The most effective ways in which iCMAs can be used to support students in choosing appropriate courses/programmes of study;
- The most effective ways in which iCMAs can be used to engage students and to support student learning whilst students are studying a course.
The project’s methodology included extensive analysis of the data captured when students attempt iCMAs as well as more qualitative methodologies.
Evaluation of ‘Are you ready for level 1 science?’ showed it to have been very heavily used (with more than 26,000 people accessing the quiz between April 2007 and January 2009) and popular, though unfortunately only around 50-65% of students on the first two presentations of S104 appear to have used the diagnostic quiz before deciding to study this course. A feedback question asked prospective students whether they found ‘Are you ready for level 1 science?’ useful, which course(s) they were considering both before and after attempting the quiz, and whether they had any suggestions for improvement. Several of the students who took the ‘basic quiz’ obviously found it very easy; some found this reassuring (e.g. in answer to ‘Did you find the quiz useful?’: ‘Yes, I enjoyed it and was pleasantly surprised. it's a long time since I did maths at school!!’; ‘Very useful and very reassuring’; ‘It re-ignited a little confidence’); others appeared frustrated (e.g. ‘No. It was far too simplistic’). Analysis of responses to individual questions also indicated that most people answered these very competently. Most students who took the ‘basic quiz’ were initially intending to take S154, SDK125 or a Science Short Course and very few students changed their mind as a result of the quiz.
Responses to the feedback question, and analysis of responses to individual questions, indicated that prospective students taking the S104 quiz found it more difficult. In a sense this quiz appears to have been more useful, with some students deciding to start their study with S154 rather than S104, and comments in response to ‘Did you find the quiz useful?’ such as ‘Yes. It confirmed to me that I need to take the preparatory course S154 prior to S104 in 2008. The maths section is my week point, although I feel I only need to brush up on certain areas of it and 154 will do that (I hope).’. However, again, many students were simply assured of their preparedness for the course they were originally intending to study, with comments such as ‘The quiz was very useful. As a 39 yr old who has had little mathematics exposure since the mid 80s, it was refreshing to realise how much I had remembered.’ Responses to individual questions were far more variable than for the basic quiz, but the questions all appeared to have been behaving well.
Postscript - changes to 'Are you ready for level 1 science?', November 2009
Various changes have been made to 'Are you ready for level 1 science?' in response to feedback from users.
1. Four questions on chemistry were added to the 'valuable for S104' section.
2. Although 'Are you ready for level 1 science?' was heavily used and well received, evaluation showed that many potential students were 'getting lost' between the various quizzes. For this reason, the quizzes were reconfigured, with links provided to a single quiz for S104 ('Are you ready for S104?') and a single quiz for the other level 1 courses ('Are you ready for science study?').
3. Users were irritated by the 'study skills' questions so these were removed. The questions checking that students had sufficient time for study were simplified, and the guidance on the time requirements for the various courses was strengthened.
4. Users also requested more specific guidance about whether or not they were sufficiently prepared. For the S104 quiz, a 'traffic light' system is used to indicate whether students should consider doing further study before registering for S104.
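The logic of such a traffic-light system can be sketched as below. The thresholds and the green/amber/red boundaries are invented for illustration; the real quiz bases its advice on performance on the 'essential' and 'desirable' S104 topics, as described earlier.

```python
# Hypothetical sketch of 'traffic light' preparedness advice. The
# thresholds (0.8, 0.6) are invented for illustration and are not the
# quiz's actual criteria.

def traffic_light(essential_correct: int, essential_total: int,
                  desirable_correct: int, desirable_total: int) -> str:
    """Map quiz performance to green/amber/red advice."""
    essential_frac = essential_correct / essential_total
    desirable_frac = desirable_correct / desirable_total
    if essential_frac >= 0.8 and desirable_frac >= 0.6:
        return "green"   # well prepared for S104
    if essential_frac >= 0.8:
        return "amber"   # secure on essentials; brush up desirable topics
    return "red"         # consider S154 or further preparation first

print(traffic_light(10, 12, 5, 8))  # → green
```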
Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.
I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning in Mathematics, Science, Computing and Technology).
Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
I was a member of the course teams producing S104: Exploring Science (presented for the first time in February 2008) and S154: Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University’s OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151: Maths for Science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the Course Team of SXR103: Practising Science for many years.
My COLMSCT and piCETL projects reflect my interests in eAssessment and science students’ mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students’ misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate ways in which iCMAs can most effectively be used to support student learning. Analysis of students’ responses to eAssessment questions provides an opportunity for teachers to learn more about students’ misunderstandings. I have used this methodology to learn more about science students’ mathematical misconceptions. I have developed a ‘Maths Skills eBook’ which is used as a resource on many Open University science courses, and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE) developed with piCETL funding for use by S104 students who are not able to conduct one of the course’s home experiments for themselves.
In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.
I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.
Sally Jordan, Open University