Development of online interactive assessment for S104: Exploring science and other level 1 courses

The aim of this project was to develop the academic and pedagogic basis of the OpenMark eAssessment system and to develop interactive computer marked assignments (iCMAs) for S104: Exploring Science. The OpenMark system, previously used for formative and summative assessment in S151: Maths for Science, enables students to be provided with instantaneous, targeted and relatively detailed feedback on their work. iCMA questions were also developed for S154: Science Starts Here and for the diagnostic quiz ‘Are you ready for level 1 science?’.

Development of OpenMark iCMAs for S104
S104: Exploring Science is the Science Faculty’s 60 point level one science course, introducing students to Earth science, physics, chemistry and biology and developing mathematical, communication and practical skills. S104 had its first presentation in February 2008 and since then has run in two presentations per year, with 1500-2000 students per presentation. The content of S104 draws heavily on its predecessor, S103, but its tuition and assessment strategies are very different.

S104’s assessment strategy includes the following components:

  • Seven tutor marked assignments (TMAs), marked against learning outcomes and with optional eTMA submission.
  • Eight interactive computer marked assignments (iCMAs) (summative but low stakes), each containing 10 questions.
  • A synoptic component comprising a written end of course assignment (ECA) and a longer iCMA (iCMA49, with 25 questions) whose questions range across the course.

S104's iCMAs are credit-bearing because the course team wanted students to engage with them in a meaningful way. However, their main purpose is to provide instantaneous feedback and to help students to pace their study. The iCMA questions use the OpenMark e-assessment system, which enables us to give students multiple attempts at each question, with an increasing amount of instantaneous feedback after each attempt. The student can learn from the feedback and use it to correct their answer. Wherever possible the feedback is tailored to the student's misunderstanding. S104's iCMAs make use of the full range of OpenMark question types, including free-text entry of numbers, letters and single words as well as hot-spot, drag-and-drop, multiple-choice and multiple-response. We have also included a few questions requiring free-text answers of up to a sentence in length (these questions are described in detail on the related project page ‘eAssessment questions with short free text responses: a natural language processing approach’). Further information about OpenMark is given on the OpenMark Examples website (http://www.open.ac.uk/openmarkexamples/index.shtml).
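The multiple-attempt model described above can be sketched in code. This is an illustrative sketch only, not OpenMark's actual implementation: the three-attempt limit matches the description, but the mark scheme and feedback texts below are invented for the example.

```python
# Illustrative sketch (not OpenMark's implementation) of the multiple-attempt
# model: each wrong answer releases progressively more detailed feedback,
# and the mark available falls with each further attempt.

FEEDBACK = [
    "Your answer is incorrect. Try again.",                     # after attempt 1
    "Hint: remember to convert the distance to metres first.",  # hypothetical hint
    "The correct answer is 42. Speed = distance / time.",       # after final attempt
]
MARKS = [3, 2, 1]  # assumed mark scheme: full marks first time, fewer thereafter


def attempt_question(correct_answer, answers):
    """Score a sequence of answers; return (mark, feedback released)."""
    released = []
    for i, answer in enumerate(answers[:3]):   # at most three attempts
        if answer == correct_answer:
            return MARKS[i], released
        released.append(FEEDBACK[i])           # more detail each time
    return 0, released                         # no attempts left


mark, released = attempt_question(42, [40, 42])
print(mark)           # 2: correct on the second attempt
print(len(released))  # 1: one piece of feedback was released
```

The point of the design, as the text explains, is that a student who answers wrongly can read the feedback and immediately act on it, closing the feedback loop within a single question.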

Each S104 iCMA opens to students approximately two weeks before they are due to start reading a particular book of the course. Initially the iCMAs closed 1-2 weeks after students were due to move on to the next book, which placed each iCMA cut-off date after the cut-off date for the TMA assessing the same book. As a result, many students did not start the iCMA until after they had completed the TMA and moved on to the next book, and it was felt that the pacing function of e-assessment was not being used to the full. iCMA cut-off dates are now just two days after students are timetabled to move on to the next book, with the TMA due dates a further two days later.

OpenMark sits within Moodle and scores are reported to the student and their tutor via StudentHome and TutorHome respectively. S104 has its own ‘iCMA Guide’ and associate lecturers are given additional advice via the S104 Tutor Forum.

Each presentation of S104 uses 105 questions (10 each for iCMAs 41-48 and 25 for iCMA49), with some re-use of questions between presentations. Wherever possible at least five different variants of each question are provided, to act as an anti-plagiarism device, and sometimes more variants are provided to enable the same basic question to be used, for example, in iCMA43 on one presentation of the course and iCMA49 of the following presentation. Each OpenMark question was written by an academic member of the Course Team, programmed by a media developer and checked by both the academic author and an experienced consultant.
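One way variants can serve as an anti-plagiarism device is to assign each student a stable variant of each question, so that students comparing answers usually see different numbers. The sketch below is hypothetical (the source does not describe OpenMark's actual assignment mechanism); the identifiers are invented for illustration.

```python
# Hypothetical sketch of per-student variant assignment: hashing the
# (student, question) pair gives each student a stable, effectively
# random choice from the available variants.

import hashlib


def pick_variant(student_id: str, question_id: str, n_variants: int = 5) -> int:
    """Deterministically map a (student, question) pair to one of n variants."""
    digest = hashlib.sha256(f"{student_id}:{question_id}".encode()).hexdigest()
    return int(digest, 16) % n_variants


# The same student always receives the same variant of a given question,
# so revisiting the iCMA does not reveal a fresh set of numbers:
v1 = pick_variant("X1234567", "icma43_q2")
v2 = pick_variant("X1234567", "icma43_q2")
assert v1 == v2
```

A deterministic scheme like this also supports the formative re-use described for S154 below: offering a different variant on each practice attempt simply means varying one of the hash inputs per attempt.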

A series of workshops was run to train S104 course team members in the authoring of OpenMark questions. These included advice on writing unambiguous questions linked to appropriate learning outcomes, writing appropriate feedback for students, specifying answer matching, ensuring that as many questions as possible are fully accessible (i.e. have versions that can be read by a screen-reader) and checking questions. The basic ‘Writing OpenMark questions’ workshop has since been repeated for other course teams and a guide ‘Good practice in the academic authoring of OpenMark questions’ has also been produced.

Development of OpenMark iCMAs for S154
A bank of 66 OpenMark questions was written in the spring of 2007 for use in various contexts, including the reinforcement and assessment of basic knowledge and skills, especially mathematical skills, developed in the 10 point course S154: Science Starts Here (first presentations October 2007 and March 2008, with 500-1000 students per presentation). An additional 15 new questions were written prior to the October 2008 presentation of S154, supplemented by two free-text short-answer questions first written for S103 and one S104 question reused with the permission of the author. The questions in the bank have been written with up to 20 different variants each, to enable their use in different places (including S154's formative and summative iCMAs, ‘Are you ready for level 1 science?’ and the iCMA that accompanies the Maths Skills ebook) and also to provide multiple variants in each iCMA in which they are used. In formative use, multiple variants provide students with more opportunities for practice.

The S154 course team had not initially intended to make summative use of OpenMark questions. However, mounting evidence that students frequently attempt only the first question or two of formative-only iCMAs led to a decision to include two short summative iCMAs, one assessing Chapters 2-4 and one assessing Chapters 6-9. At the same time, S154's focus on underpinning mathematical skills means that some students require a lot of practice, so there is also a ‘Practice iCMA’. Students are encouraged to engage with the Practice iCMA regularly, with reminders given on the course website every week and at the end of each chapter in the course book.

‘Are you ready for level 1 science?’
Thirty-two of the questions from the bank of mathematical questions described in the previous section have also been used (with different variants) in the diagnostic quiz ‘Are you ready for level 1 science?’, which has been available to prospective students of level 1 Science Faculty courses (including S104, S154, SDK125 and Science Short Courses) since April 2007. These questions are offered alongside six questions on English and study skills and a number of advice screens.

An OpenMark-based diagnostic quiz has been used because it forces people to engage actively with the questions, rather than looking at the answers first and then assuming that they could easily have obtained the correct answers for themselves. There is some evidence of this sort of behaviour with the printed and .pdf versions of the other Science Faculty ‘Are you ready for?’ quizzes, and the level 2 courses S207, S205 and S204 have had interactive ‘Are you ready for?’ quizzes for some time.

The other reason why OpenMark is appropriate for the ‘Are you ready for level 1 science?’ quiz is that it enables people to be guided through a complex maze of possible routes, depending on their aspirations and the amount of time they have available, in such a way that the quiz does not appear too complex or too long to prospective students. ‘Are you ready for level 1 science?’ is actually three interlinked iCMAs. After a very short ‘introductory quiz’ students are guided either to a ‘basic quiz’ (to assess their preparedness for S154, SDK125 or entry level Science Short Courses) or to a quiz designed specifically to assess their preparedness for S104. Within the S104 quiz some of the questions (on arithmetical rules of precedence, negative numbers and fractions, decimals, ratios and percentages) are deemed to be ‘essential’ (these topics are not re-taught in S104) whereas other questions (on topics which are re-taught, albeit sometimes rather briefly, in S104, for example the use of scientific notation) are classified as ‘desirable’ for S104. Prospective students are advised that they will be able to complete the early books of S104 more quickly if they are already familiar with some of these topics.
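The routing and topic classification described above can be summarised in a short sketch. This is an assumed reconstruction of the logic, not the quiz's actual code; the function names and topic strings are invented for illustration.

```python
# Minimal sketch (assumed logic) of the three interlinked iCMAs: a short
# introductory quiz routes prospective students either to the 'basic quiz'
# or to the S104 quiz, whose maths topics are classified as 'essential'
# (not re-taught in S104) or 'desirable' (re-taught, sometimes briefly).

def route(intended_course: str) -> str:
    """Route a prospective student to the appropriate follow-on quiz."""
    if intended_course == "S104":
        return "S104 quiz"
    # S154, SDK125 and entry level Science Short Courses share one quiz:
    return "basic quiz"


def classify_topic(topic: str) -> str:
    """Classify an S104-quiz maths topic as 'essential' or 'desirable'."""
    essential = {"rules of precedence", "negative numbers", "fractions",
                 "decimals", "ratios", "percentages"}
    return "essential" if topic in essential else "desirable"


print(route("S154"))                          # basic quiz
print(classify_topic("fractions"))            # essential
print(classify_topic("scientific notation"))  # desirable
```

Splitting the maze into a router plus per-quiz question sets is what keeps each individual quiz short enough not to deter prospective students.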

Acknowledgements
Much of the work of this project has been carried out in conjunction with colleagues on the S154 and S104 course teams, in particular Linda Fowler, Ruth Williams and Valda Stevens. Development of the OpenMark questions for S154 and ‘Are you ready for level 1 science?’ would not have been possible without the assistance of Greg Black (Learning and Teaching Solutions). Greg has also provided invaluable guidance on the possibilities offered by the rapidly developing technologies, as has Phil Butcher (COLMSCT Teaching Fellow and leader of the VLE eAssessment Project Team).


Examples of the question types we have used for S104, S154 and ‘Are you ready for S104?’ can be seen by accessing ‘Are you ready for S104?’ from www.open.ac.uk/science/courses-qualifications/are-you-ready-for-science/interactive-materials/exploring-science.php

More information about the OpenMark eAssessment system and the range of question types available can be viewed at
http://www.open.ac.uk/openmarkexamples/index.shtml

Examples of short free-text questions developed in the related COLMSCT project ‘eAssessment questions with short free-text responses: a natural language processing approach’ can be seen at students.open.ac.uk/openmark/omdemo.pm2009/


Reviews of the literature (e.g. Black and Wiliam, 1998; Gibbs and Simpson, 2004) have identified conditions under which assessment appears to support and encourage learning. Several of these conditions concern feedback, but the provision of feedback does not in itself lead to learning. Sadler (1989) argues that in order for feedback to be effective, action must be taken to close the gap between the student’s current level of understanding and the level expected by the teacher. It follows that, in order for assessment to be effective, feedback must not only be provided, but also understood by the student and acted on in a timely fashion.

These points are incorporated into five of Gibbs and Simpson’s (2004) eleven conditions under which assessment supports learning:
Condition 4: Sufficient feedback is provided, both often enough and in enough detail;
Condition 6: The feedback is timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance;
Condition 8: Feedback is appropriate, in relation to students’ understanding of what they are supposed to be doing;
Condition 9: Feedback is received and attended to;
Condition 11: Feedback is acted upon by the student.

It can be difficult and expensive to provide students with sufficient feedback (Condition 4), especially in a distance-learning environment, where opportunities for informal discussion are limited. Feedback on tutor-marked assignments is valuable but may be received too late to be useful (Condition 6), and it is then difficult for students to understand and act upon it (Conditions 8 and 11), even assuming that they do more than glance at the mark awarded (Condition 9).

One possible solution to these dilemmas is to use e-assessment. Feedback can be tailored to students' misconceptions and delivered instantaneously and, provided the assessment system is carefully chosen and set up, students can be given an opportunity to learn from the feedback whilst it is still fresh in their minds, by immediately attempting a similar question or the same question for a second time, thus closing the feedback loop. Distance learners are no longer disadvantaged (indeed the system can emulate a tutor at the student's elbow; Ross et al., 2006, p. 125), and ‘little and often’ assessments can be incorporated at regular intervals throughout the course, bringing the additional benefits of assisting students to pace their study and to engage actively with the learning process, thus encouraging retention. For high-population courses, e-assessment can also deliver savings of cost and effort. Finally, e-assessment is the natural partner to the growth industry of e-learning.

However, opinions of e-assessment are mixed and evidence for its effectiveness is inconclusive; indeed, e-assessment is sometimes perceived as having a negative effect on learning (Gibbs, 2006). Murphy (2008) reports that high stakes multiple-choice tests of writing can lead to actual writing beginning to disappear from the curriculum; she also reports that ‘the curriculum begins to take the form of the test’. There are more widely voiced concerns that e-assessment tasks (predominantly but not exclusively multiple-choice) can encourage memorisation and factual recall and lead to surface learning, far removed from the tasks that will be required of learners in the real world (Nicol, 2007; Scouller and Prosser, 1994). Also, although multiple-choice questions are in some senses very reliable, doubts have been expressed that they may not always assess what the teacher believes they are assessing, partly because multiple-choice questions require ‘the recognition of the answer rather than the construction of a response’ (Nicol, 2007).

Ashton and her colleagues (2006) point out that the debate about the effectiveness of multiple-choice questions ‘diverts focus away from many of the key benefits that online assessment offers to learning’. Perhaps the question we should be asking is not ‘should we be using e-assessment?’ but rather ‘what are the features of an effective e-assessment system?’ (Mackenzie, 2003).

References

Ashton, H.S., Beevers, C.E., Milligan, C.D., Schofield, D.K., Thomas, R.C. and Youngson, M.A. (2006). Moving beyond objective testing in online assessment, in S.C. Howell and M. Hricko (eds) Online assessment and measurement: case studies from higher education, K-12 and corporate. Hershey, PA: Information Science Publishing: 116-127.

Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5, 1, 7-74.

Gibbs, G. (2006). Why assessment is changing. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp. 11-22.

Mackenzie, D. (2003). Assessment for e-learning: what are the features of an ideal e-assessment system? 7th International CAA Conference, Loughborough, UK. At http://www.caaconference.com/pastConferences/2003/procedings/index.asp

Murphy, S. (2008) Some consequences of writing assessment, in A. Havnes and L. McDowell (eds) Balancing Dilemmas in Assessment and Learning in Contemporary Education. London: Routledge. pp. 33-49.

Nicol, D.J. (2007). E-assessment by design: using multiple choice tests to good effect. Journal of Further and Higher Education, 31, 1, 53–64.

Ross, S.M., Jordan, S.E. & Butcher, P.G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp. 123-131.

Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.

Scouller, K.M. & Prosser, M. (1994). Students’ experiences of studying for multiple choice question examinations. Studies in Higher Education, 19, 3, 267–279.


Evaluation of S104 and S154’s use of iCMA questions and of the ‘Are you ready for level 1 science?’ quiz formed part of the larger related project 'Analysis of the impact of iCMAs on student learning'.

This project considered different models of iCMA use in order to determine:

  • The most effective ways in which iCMAs can be used to support students in choosing appropriate courses/programmes of study;
  • The most effective ways in which iCMAs can be used to engage students and to support student learning whilst students are studying a course.

The project’s methodology included extensive analysis of the data captured when students attempt iCMAs as well as more qualitative methodologies.

Evaluation of ‘Are you ready for level 1 science?’ showed it to have been very heavily used (with more than 26,000 people accessing the quiz between April 2007 and January 2009) and popular, though unfortunately only around 50-65% of students on the first two presentations of S104 appear to have used the diagnostic quiz before deciding to study this course. A feedback question asked prospective students whether they found ‘Are you ready for level 1 science?’ useful, which course(s) they were considering both before and after attempting the quiz, and whether they had any suggestions for improvement. Several of the students who took the ‘basic quiz’ obviously found it very easy; some found this reassuring (e.g. in answer to ‘Did you find the quiz useful?’: ‘Yes, I enjoyed it and was pleasantly surprised. it's a long time since I did maths at school!!’; ‘Very useful and very reassuring’; ‘It re-ignited a little confidence’) while others appeared frustrated (e.g. ‘No. It was far too simplistic’). Analysis of responses to individual questions also indicated that most people answered these very competently. Most students who took the ‘basic quiz’ were initially intending to take S154, SDK125 or a Science Short Course, and very few changed their mind as a result of the quiz.

Responses to the feedback question and analysis of responses to individual questions indicated that prospective students taking the S104 quiz found it more difficult, and in a sense this quiz appears to have been more useful, with some students deciding to start their study with S154 rather than S104, and comments in response to ‘Did you find the quiz useful?’ such as ‘Yes. It confirmed to me that I need to take the preparatory course S154 prior to S104 in 2008. The maths section is my week point, although I feel I only need to brush up on certain areas of it and 154 will do that (I hope).’ However, again, many students were simply assured of their preparedness for the course they were originally intending to study, with comments such as ‘The quiz was very useful. As a 39 yr old who has had little mathematics exposure since the mid 80s, it was refreshing to realise how much I had remembered.’ Responses to individual questions were far more variable than for the basic quiz, but the questions all appeared to have been behaving well.

Postscript - changes to 'Are you ready for level 1 science?', November 2009

Various changes have been made to 'Are you ready for level 1 science?' in response to feedback from users.

1. Four questions on chemistry were added to the 'valuable for S104' section.

2. Although 'Are you ready for level 1 science?' was heavily used and well received, evaluation showed that many potential students were 'getting lost' between the various quizzes. For this reason, the quizzes were reconfigured, with links provided to a single quiz for S104 ('Are you ready for S104?') and a single quiz for the other level 1 courses ('Are you ready for science study?').

3. Users were irritated by the 'study skills' questions so these were removed. The questions checking that students had sufficient time for study were simplified, and the guidance on the time requirements for the various courses was strengthened.

4. Users also requested more specific guidance about whether or not they were sufficiently prepared. For the S104 quiz, a 'traffic light' system is used to indicate whether students should consider doing further study before registering for S104.
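A 'traffic light' indicator of this kind typically maps the quiz score onto readiness bands. The sketch below is hedged: the thresholds and advice strings are invented for illustration, as the source does not state the quiz's actual boundaries.

```python
# Hypothetical sketch of a 'traffic light' readiness indicator; the
# 80% and 50% thresholds below are assumptions, not the quiz's actual
# boundaries.

def traffic_light(score: float, max_score: float) -> str:
    """Map a quiz score onto a readiness band for S104."""
    fraction = score / max_score
    if fraction >= 0.8:
        return "green"   # ready to register for S104
    if fraction >= 0.5:
        return "amber"   # consider brushing up on weaker topics first
    return "red"         # further preparation (e.g. S154) recommended


print(traffic_light(18, 20))  # green
print(traffic_light(11, 20))  # amber
print(traffic_light(6, 20))   # red
```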


A selection of COLMSCT and piCETL related papers, presentations and workshops, given by Sally Jordan, 2006-2010.
 
Publications and external conference contributions
Jordan, Sally (2007) The mathematical misconceptions of adult distance-learning science students. Proceedings of the CETL-MSOR Conference 2006, edited by David Green. Maths, Stats and OR Network, pp 87-92. ISBN 978-0-9555914-0-2.
 
Butcher, Philip and Jordan, Sally (2007) Interactive assessment in science at the Open University: 1975 – 2007. Invited oral presentation at ‘Computer-based assessment in the broader physical sciences’: a joint event hosted by the OpenCETL and the Physical Sciences Subject Centre, 26th April 2007.
 
Jordan, S., Brockbank, B. and Butcher, P. (2007) Extending the pedagogic role of online interactive assessment: providing feedback on short free-text responses. REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May 2007. Available at http://ewds.strath.ac.uk/REAP07
 
Jordan, Sally (2007) Computer based assessment with short free responses and tailored feedback. Proceedings of the Science Learning and Teaching Conference 2007, edited by Peter Goodhew. Higher Education Academy, pp 158-163. ISBN 978-1-905788-2.
 
Hudson, Ken and Jordan, Sally (2007) Practitioner scholarship can lead to institutional change – implementing interactive computer based assessment. Oral presentation at ISSOTL 2007, the International Society for the Scholarship of Teaching and learning, 4th Annual Conference, Sydney, 2nd-5th July 2007.
 
Jordan, Sally (2007) Assessment for learning; learning from Assessment? Oral presentation at Physics Higher Education Conference, Dublin, 6th-7th September 2007.
 
Stevens, Valda and Jordan, Sally (2008) Interactive online assessment with teaching feedback for open learners. Oral presentation at Assessment and Student Feedback workshop, Higher Education Academy Centre for ICS, London, April 2008.
 
Jordan, Sally (2008) eAssessment for student learning: short free-text questions with tailored feedback. Workshop at the University of Chester Staff Conference, May 2008.
 
Swithenby, Stephen and Jordan, Sally (2008) Supporting open learners by computer based assessment with short free-text responses and tailored feedback. Part of an invited symposium on ‘Matching technologies and pedagogies for supported open learning’ at the 6th International Conference on Education and Information Systems, Technologies and Applications, EISTA, Orlando, 29th June – 2nd July 2008.
 
Jordan, Sally (2008) Assessment for learning: pushing the boundaries of computer based assessment. Assessment in Higher Education Conference, University of Cumbria, July 2008.
 
Jordan, Sally (2008) Supporting distance learners with interactive screen experiments. Contributed oral presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
 
Jordan, Sally (2008) Online interactive assessment: short free text questions with tailored feedback. Contributed poster presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
 
Jordan, Sally (2008) E-assessment for learning? The potential of short free-text questions with tailored feedback. In invited symposium ‘Moving forward with e-assessment’ at the Fourth biennial EARLI (European Association for Research on Learning and Evaluation)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
 
Jordan, Sally, Butcher, Philip and Hunter, Arlene (2008) Online interactive assessment for open learning. Roundtable discussion at the Fourth biennial EARLI (European Association for Research on Learning and Evaluation)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
 
Brockbank, Barbara, Jordan, Sally and Mitchell, Tom (2008) Investigating the use of short answer free-text eAssessment questions with tailored feedback. Poster presentation at the Fourth biennial EARLI (European Association for Research on Learning and Evaluation)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
 
Jordan, Sally, Brockbank, Barbara, Butcher, Philip and Mitchell, Tom (2008) Online assessment with tailored feedback as an aid to effective learning at a distance: including short free-text questions. Poster presentation at 16th Improving Student Learning Symposium, University of Durham, 1st-3rd September 2008.
 
Hatherly, Paul; Macdonald, John; Cayless, Paul and Jordan, Sally (2008) ISEs: a new resource for experimental physics. Workshop at the Physics Higher Education Conference, Edinburgh, 4th-5th September 2008.
 
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. OpenCETL Bulletin, 3, p4.
 
Jordan, Sally (2008) Online interactive assessment with short free-text questions and tailored feedback. New Directions, 4, 17-20.
 
Jordan, Sally (2009) Online interactive assessment in teaching science: a view from the Open University. Education in Science, Number 231, 16-17.
 
Jordan, Sally and Mitchell, Tom (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40, 2, 371-385.
 
Hatherly, Paul, Jordan, Sally and Cayless, Alan (2009) Interactive screen experiments – innovative virtual laboratories for distance learners. European Journal of Physics, 30, 751-762.
 
Butcher, P.G., Swithenby, S.J. and Jordan, S.E. (2009) E-Assessment and the independent learner. 23rd ICDE World Conference on Open Learning and Distance Education, 7-10 June 2009, Maastricht, The Netherlands.
 
Jordan, Sally. (2009) Assessment for learning: pushing the boundaries of computer based assessment. Practitioner Research in Higher Education, 3(1), pp. 11-19.
 
Jordan, Sally. (2009) An investigation into the use of e-assessment to support student learning. Assessment in Higher Education Conference, University of Cumbria, 8th July 2009. Available online at http://www.cumbria.ac.uk/Services/CDLT/C-SHEN/Events/EventsArchive2009.aspx
 
Jordan, Sally and Brockbank, Barbara (2009) Online interactive assessment: short free text questions with tailored feedback. Oral presentation at GIREP-EPEC, August 2009.
 
Jordan, Sally and Butcher, Philip. (2009) Using e-assessment to support distance learners of science. Oral presentation at GIREP-EPEC, August 2009.
 
Hatherly, Paul, Jordan, Sally and Cayless, Alan. (2009) Interactive screen experiments – connecting distance learners to laboratory practice. Oral presentation at GIREP-EPEC, August 2009.
Butcher, P.G & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, 489-499. DOI: 10.1016/j.compedu.2010.02.012
 
Jordan, Sally. (2010). E-assessment for learning and learning from e-assessment: short-answer free text questions with tailored feedback. Presentation and workshop to HEA Physical Sciences Centre ‘The future of technology enhanced assessment’, Royal Society of Chemistry, Burlington House, London, 28th April 2010.
 
Jordan, Sally (2010) Short answer free text e-assessment questions with tailored feedback. Invited seminar to Human Computer Interaction group at the University of Sussex, 21st May 2010.
 
Jordan, Sally (2010) Maths for science for those with no previous qualifications: a view from the Open University. HEA Physical Sciences Centre ‘Maths for Scientists’ meeting, 26th May 2010.
 
Jordan, Sally (2010) Student engagement with e-assessment questions. Poster at the 2010 International Computer Assisted Assessment (CAA) Conference, Southampton, July 2010.
Jordan, S. and Butcher, P. (2010) Using e-assessment to support distance learners of science. In Physics Community and Cooperation: Selected Contributions from the GIREP-EPEC and PHEC 2009 International Conference, ed. D. Raine, C. Hurkett and L. Rogers. Leicester: Lulu/The Centre for Interdisciplinary Science. ISBN 978-1-4461-6219-4, pp. 202-216.
 
Jordan, Sally (2010) Do we know what we mean by ‘quality’ in e-assessment? Roundtable discussion at EARLI (European Association for Research on Learning and Evaluation)/Northumbria Assessment Conference, Northumbria, September 2010.
 
Jordan, Sally, Butcher, Phil, Knight, Sarah and Smith, Ros (2010) ‘Your answer was not quite correct, try again’ : Making online assessment and feedback work for learners. Workshop at ALT-C 2010 ‘Into something rich and strange – making sense of the sea change’, September 2010, Nottingham.
 
Jordan, Sally (2010) Using simple software to generate answer matching rules for short-answer e-assessment questions in physics and astronomy. Oral presentation at the Physics Higher Education Conference, University of Strathclyde, September 2010.
 
Butcher, P.G. & Jordan, S.E. (in press) Featured case study in JISC Effective Practice Guide, Summer 2010
 
Contributions to internal (OU) conferences, meetings and workshops
Jordan, Sally (2006) An analysis of science students’ mathematical misconceptions. Poster presentation at 1st OpenCETL Conference, 8th June 2006.
 
Jordan, Sally (2006) OpenMark – what’s all the fuss about? Lunchtime seminar at Cambridge Regional Centre, 1st November 2006.
 
Jordan, Sally (2007) Using interactive online assessment to support student learning. Faculty lunchtime seminar, 30th January 2007.
 
Jordan, Sally (2007) Issues and examples in online interactive assessment. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
 
Jordan, Sally (2007) Students’ mathematical misconceptions. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
 
Jordan, Sally (2007) Assessment for learning; learning from assessment? Paper presented at the Curriculum, Teaching and Student Support Conference, 1st May 2007.
 
Jordan, Sally and Brockbank, Barbara (2007) Extending the pedagogic role of online interactive assessment: short answer free text questions. Paper presented at the Curriculum, Teaching and Student Support Conference, 2nd May 2007.
 
Jordan, Sally (2007) Investigating the use of short answer free text questions in online interactive assessment. Presentation at the Science Staff Tutor Group residential meeting, 9th May 2007.
 
Jordan, Sally (2007) OpenMark: online interactive workshop. Workshop run at AL Staff Development meeting in Canterbury, 12th May 2007.
 
Brockbank, Barbara, Jordan, Sally and Butcher, Phil (2007) Investigating the use of short answer free text questions for online interactive assessment. Poster presentation at 2nd OpenCETL Conference, 15th October 2007.
 
Jordan, Sally (2007) Students’ mathematical misconceptions: learning from online assessment. Oral presentation at 2nd OpenCETL Conference, 15th October 2007.
 
Jordan, Sally (2007) Using interactive screen experiments in our teaching: the S104 experience and The Maths Skills ebook. Demonstrations at 2nd OpenCETL Conference, 15th October 2007.
 
Jordan, Sally, Ekins, Judy and Hunter, Arlene (2007) eAssessment for learning?: the importance of feedback. Symposium at 2nd OpenCETL Conference, 16th October 2007.
 
Jordan, Sally, Brockbank, Barbara and Butcher, Phil (2007) Authoring short answer free text questions for online interactive assessment: have a go! Workshop at 2nd OpenCETL Conference, 16th October 2007.
 
Jordan, Sally (2008) Investigating the use of short answer free-text questions in online interactive assessment. Oral presentation to EATING (Education and Technology Interest Group), 17th January 2008.
 
Jordan, Sally (2008) Investigating the use of short free-text eAssessment questions. Oral presentation to ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
 
Jordan, Sally (2008) Writing short free-text eAssessment questions: have a go! Workshop at ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
 
Jordan, Sally and Brockbank, Barbara (2008) Writing free text questions for online assessment: have a go! Workshop at the Open University Conference, 29th and 30th April 2008.
 
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. Science Faculty Newsletter, May 2008.
 
Jordan, Sally and Johnson, Paul (2008) E-assessment opportunities: using free text questions and others. Science Faculty lunchtime seminar followed by workshop, 16th July 2008.
 
Jordan, Sally and Datta, Saroj (2008) Presentation on Open University use of Interactive Screen Experiments at ISE Launch event, 19th September 2008.
 
Jordan, Sally, Butler, Diane and Hatherly, Paul (2008) CETL impact: an S104 case study. Series of linked presentations at 3rd OpenCETL Conference, September 2008. [reported in OpenHouse, Feb/March 2009, ‘S104 puts projects into practice’, p4]
 
Jordan, Sally and Johnson, Paul (2008) Using free text e-assessment questions. Science Faculty lunchtime seminar followed by workshop, 26th November 2008.
 
Butcher, Phil, Jordan, Sally and Whitelock, Denise (2009) Learn About Formative e-Assessment. IET EPD Learn About Guide.
 
Butcher, Phil and Jordan, Sally (2009) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 21st January 2009.
 
Jordan, Sally (2009) E-assessment to support student learning: an investigation into different models of use. Paper presented at Making Connections Conference, 2nd-3rd June 2009.
 
Jordan, Sally (2009) (ed) Compilation of interim and final reports on Open University Physics Innovations CETL projects: Assessment. 
 
Butcher, Phil and Jordan, Sally (2009) A comparison of human and computer marking of short-answer free-text student responses. Presentation at 4th OpenCETL Conference, December 2009.
 
Butcher, Phil and Jordan, Sally (2009) Interpreting the iCMA statistics. Presentation at 4th OpenCETL Conference, December 2009.
 
Jordan, Sally, Nix, Ingrid, Waights, Verina, Bolton, John and Butcher, Phil (2009) From CETL to course team: embedding iCMA initiatives. Workshop at 4th OpenCETL Conference, December 2009.
 
Jordan, Sally, Butler, Diane, Hatherly, Paul and Stevens, Valda (2009). From CETL to Course Team: CETL-led initiatives in S104 Exploring Science. Poster presentation at 4th OpenCETL Conference, December 2009.
 
Jordan, Sally (2009) Student engagement for e-assessment. Poster presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2010) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 10th February 2010.
 
Jordan, Sally (2010) Workshops on using OpenMark PMatch to write short-answer free-text questions, 19th January 2010 and 17th February 2010.
 
Jordan, Sally, Nix, Ingrid, Wyllie, Ali and Waights, Verina (2010) Science Faculty lunchtime forum, 25th March 2010.

Jordan, Sally (2010) e-Assessment. In e-Learning in Action: projects and outcomes from the Physics Innovations CETL, pp16-20.

Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.


I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning in Mathematics, Science, Computing and Technology).

Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/

I was a member of the course teams producing S104: Exploring Science (presented for the first time in February 2008) and S154: Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University’s OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151: Maths for Science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the course team of SXR103: Practising Science for many years.

My COLMSCT and piCETL projects reflect my interests in eAssessment and science students’ mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students’ misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate the ways in which iCMAs can most effectively be used to support student learning. Analysis of students’ responses to eAssessment questions also gives teachers an opportunity to learn more about students’ misunderstandings, and I have used this methodology to learn more about science students’ mathematical misconceptions. I have developed a ‘Maths Skills eBook’, which is used as a resource on many Open University science courses, and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE) developed with piCETL funding for use by S104 students who are not able to conduct one of the course’s home experiments for themselves.

In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.

I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.


Contact

Sally Jordan, Open University
S.E.Jordan@open.ac.uk
