The Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT) was funded between 2005 and 2010 by HEFCE as one of the funding Council’s Centres for Excellence in Teaching and Learning.
[[[image-0 small right]]]
The Centre faced long-standing challenges: how to provide home-based students with rich and motivating learning experiences; how to help students work together in spite of geographical separation; and how to support students in using complex tools and techniques. It harnessed academic creativity and new learning technologies to generate new responses to these challenges.
The technologies and approaches advanced by COLMSCT and the other OU CETLs have contributed to an enhanced institution-wide recognition of the possibilities that technology offers for efficient personalised learning. These ideas have been taken up in a major reform of student support.
COLMSCT was part of the Open CETL, a collaboration of the four centres based at the Open University. It was not a formal organisation but a way of describing how we worked together to inspire each other, to achieve greater impact and increase efficiency.
Learning Landscapes is the COLMSCT publication which highlights the work and achievements of its Teaching Fellows. You can read how online decision-making is improving nurses’ treatment of their patients; how disabled students are being enabled to take part in geology field trips; how prisoners are being given the opportunity to turn their lives around through access to higher education; and how students of all types can simply talk to each other and their tutor and enjoy learning together. To view the publication, please see: http://www8.open.ac.uk/opencetl/resources/colmsct-resources
The Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT) was one of the HEFCE funded Centres for Excellence in Teaching and Learning (CETLs).
The Centre set about helping excellent teachers to develop effective and engaging open learning in mathematics, science, computing and technology. This was achieved through an open call to OU staff for project proposals. In choosing our portfolio of fellows and projects, it was necessary to acknowledge strategy, balance diversity and exploit opportunities for synergistic collaboration. We were able to respond to predictable priorities, e.g. online teaching of symbol-rich subjects such as mathematics, online experimentation and computer based assessment. However, we were also able to initiate major projects in unforeseen areas, e.g. offender learners, enhancement of access to field trips and learning in immersive environments. In each case, strong change champions emerged from the COLMSCT fellowship selection process. In several of the predictable arenas, COLMSCT has led major innovation, e.g. in the use of interactive computer marked assessments.
The four OU CETLs were helped extensively in the development of their evaluation strategies by external consultant input. Professor Murray Saunders at Lancaster University contributed advice and analysis throughout the CETL lifetime. He introduced our community to the use of RUFDATA and to ideas such as spheres of effect, the use of narratives and discourse trajectories.
[[[image-0 small]]]
Formative Self Evaluation Report, July 2007
In July 2007, the Open CETL submitted an interim report on its activity to the Higher Education Funding Council for England.
To view COLMSCT’s Interim Evaluation Report please click here - Formative Self Evaluation Report, July 2007
CETL Final Self Evaluation Report to the Higher Education Funding Council for England, March 2010
In March 2010, COLMSCT submitted its final report to HEFCE.
To view COLMSCT’s Final Evaluation Report please click here - Final Self Evaluation Report March 2010
[[[image-0 medium right]]]
For COLMSCT, the main mechanism of reward was the awarding of ‘fellowships’ that provided dedicated time (two years part-time) for teachers of proven excellence to develop further their practice through personally designed projects that involved change in the student experience. Three cohorts of fellows were recruited during the term of COLMSCT, creating 65 fellows in total, who were drawn from central and regional academic and Associate Lecturer staff.
The fellows were organised as a mentored and mutually supporting action research community with the Open CETL centre as their home-base. The Centre and its core staff provided a supportive environment through which the fellows could share practice, receive guidance and provide peer support and mentoring.
[[[image-1 medium left]]]
Many fellows testify to the transformation they have experienced from being intuitive to being reflective and informed teachers. The professional rewards were greatly appreciated and strongly motivating. For example, an Associate Teaching Fellow commented “The experience was enormously empowering and intellectually stimulating”.
Although the Centre was OU based, COLMSCT recognised that open learning methods were of wider relevance across the higher education sector. Partner approaches were therefore welcomed, and on occasion COLMSCT worked closely with other CETLs, the Higher Education Academy Subject Centres and other universities in organising seminars, workshops, colloquia and conferences.
In order to make a real difference we aligned COLMSCT activities with the teaching and learning priorities and activities of the Mathematics, Computing and Technology and Science Faculties. Our work assisted in shaping mainstream teaching and learning.
The main activity was the work of the COLMSCT Fellows; these fellows engaged in a number of projects focusing on four main thematic areas: assessment; e-learning communities and identities; maths online; and online experimentation and investigation.
Each of their projects was innovative, inasmuch as it pushed the boundaries of the fellow’s own practice and explored issues and concerns around changing practice more generally. In addition, COLMSCT supported a number of major initiatives which had wider institutional impact, such as iCMAs, Second Life and Elluminate. Other initiatives focused on building communities of learning and on exploring the issues of online teaching of mathematics, science, computing and technology.
These initiatives aligned pockets of activity and innovation with broader Open University activities such as the implementation of the Virtual Learning Environment (VLE). They provided rigorous evidence of good practice which informed policy development.
Aims
Our initial aims were to actively encourage excellent teachers to:
COLMSCT both rewarded excellent teaching and disseminated good practice within the two faculties, to the wider OU and, through its regional networks, to the Higher Education sector.
Outcomes for…the Centre
COLMSCT has evaluated its achievements and benefits by an analysis based on spheres of effect, i.e. student learning, individual member of staff, department/faculty, university and sector.
Outcomes for…student learning
Access was provided to a large quantity of computer based assessment with feedback that was designed to promote interactive learning and facilitate effective support conversations. Over 50,000 tests are now taken each month. The resources are popular and promote progress.
Students now receive written feedback from tutors operating within guidelines that have been developed within COLMSCT and which promote feedback for learning. They also now receive faster written feedback using the electronic (Tablet PC) methods developed and optimised by COLMSCT and its partners.
Access to tutorials has been improved by the development and implementation of online synchronous and asynchronous alternatives to traditional mechanisms, e.g. Elluminate conferences. COLMSCT and its partners have proposed and refined these new techniques which are now used in 12 modules, affecting ~10,000 students. This number will grow extremely quickly as the technique is mainstreamed. These methods facilitate access to support.
Outcomes for…the individual member of staff
For those who had the most direct engagement with COLMSCT, a transformatory experience was frequently reported. They became better informed, more reflective, and more inclined to complement teaching intuition with rigorous evidence. Many gained in confidence and stature and are now comfortable in learning and teaching leadership roles, e.g. leading development projects, giving scene-setting presentations etc. One fellow reported, “It has given me a research identity. Much of my job has been writing and presenting courses. What I was looking for was something that gave me an identity as a researcher and COLMSCT has done that.” Personal development has been most obvious within the Associate Lecturers.
Outcomes for…the Department/Faculty
Our fellows were our change champions. They were people who were respected by their colleagues for their teaching expertise. They continued to work with their departmental colleagues and, in carrying out their projects, they were innovating within mainstream practice and within mainstream course-teams. Dissemination took place through the ongoing shared teaching tasks. Just over 50% of the academic staff of the two faculties linked most directly to COLMSCT had significant involvement in COLMSCT (e.g. regular attendance at seminars, collaboration in a COLMSCT project, etc).
COLMSCT’s effectiveness at this level was enhanced by having a related CETL centred on Physics (piCETL) and by the internally funded Maths Online project which had the broad aim of promoting e-learning in maths courses. Through our combined efforts there have been major changes in teaching and learning models, e.g. in online experimentation, tutorial support mechanisms, assessment etc.
Outcomes for…the University
Direct university level effects include: adoption of interactive computer marked assessments (iCMAs) as a key component of learning and teaching, a new strategy for offender learning, escalating use of immersive technologies, and adoption of electronic methods of supporting students.
Outcomes for…the Sector
Specific effects at sector level are difficult to identify – there are too many confounding factors. However, COLMSCT has hosted two major conferences, ‘Researching Learning in Virtual Environments’ and ‘Opening Windows on Mathematics and Statistics’, and a number of other smaller but enthusiastically received conferences/workshops, e.g. on assessment and offender learning. Our work on assessment is prominent in national strategic analyses.
Seeking excellence in open learning
COLMSCT aimed to affect the student experience directly through changes in methods of teaching, assessment and support, and indirectly through improvements in the pedagogic understanding and skills of academic staff.
It achieved this by rewarding excellent practitioners in the two Faculties; by building on and developing good practice; by carrying out rigorous pedagogic research; and by engaging with University strategic priorities.
The Centre was co-located with the other three Open University CETLs and was able to explore synergies, maximise impact and align strategic aspirations through cross CETL initiatives, themes and shared dissemination.
Projects and Fellows
COLMSCT Teaching Fellows were at the heart of the COLMSCT community; their innovative vision was key to the success of the Centre. COLMSCT also tried to address the potential isolation of the OU’s Associate Lecturers. These part-time tutors play a hugely important role in delivering courses to the tens of thousands of students who choose to embark on a distance learning course with The Open University. By creating fellowships open to all OU staff and by promoting a common community, COLMSCT created a community of practitioners with complementary skills and experiences and a common goal of developing good practice and improving the student experience. It is possible, though not easily demonstrable, that this inclusivity empowered change champions among both the AL and full-time staff, with major effect on broader receptivity to changes in OU pedagogies. COLMSCT projects have transcended faculty boundaries and have eroded false reservations about the transferability of approaches between different disciplines.
The projects focused on four main themes:
o Assessment
o E-learning Communities and Identities
o Maths Online
o Online Experimentation and Investigation
Supported by the Centre Team, fellows worked predominantly on their own project, but did so in liaison with course teams and other university colleagues who were working in the same area, etc. They were encouraged to disseminate their work through workshops, conferences, journals, meetings, etc. and to rigorously evaluate their work.
See the Activities and Projects section for more details
Special Interest Groups (SIGs)
COLMSCT supported Special Interest Groups (SIGs) on each of the above themes, with the purpose of providing supportive environments through which the Fellows could share practice, provide peer support and mentoring, check project progress, refine goals and encourage stringent evaluation. Each SIG had a convener; the conveners formed part of COLMSCT’s Core Team and in turn reported back to the management team. There were also opportunities for SIGs to coordinate staff development, to disseminate as a group, and to collaborate between CETLs and projects. Further details about the SIGs are provided below.
Assessment
Assessment is at the heart of the learning process, and any effort to achieve excellence in learning and teaching must involve scrutiny and reform of assessment processes. These projects concern assessment for learning: in particular, the enhancement of the formative potential of assignments through appropriate design of both task and feedback; associated staff development; improved exploitation of computer based assessment (CBA); and the role of criterion based assessment and feedback.
E-learning Communities and Identities
A number of COLMSCT projects carried out investigations into the use of mobile technologies, wikis, conferencing systems and similar tools to support online community development. The outputs of the evaluation of tutor/student communication channels are now used in the briefing of tutors.
Research into learning in multi-user virtual environments led to the development of an extensive Second Life presence, which is now in use within targeted modules. The OU hosted the major international conference Researching Learning in Virtual Environments (ReLIVE08) and, jointly with the Serious Games Institute, organised The Virtual World Conference 2010. Held on 15 September over a 24-hour period entirely in Second Life, this unique event explored the use of online virtual worlds for learning, collaborative work and business ventures. The University’s presence in virtual worlds is now managed by a COLMSCT Associate Teaching Fellow.
COLMSCT’s investigation of online learning in prisons led to an OU-wide review led by the relevant COLMSCT Associate Teaching Fellow. The report was adopted, and the work has been extended to the wider UK and international context, with the same fellow taking a national representative role. COLMSCT went on to host several large external meetings on offender learning.
Maths Online
This strand of work was carried out in collaboration with an internally funded faculty project and our sister CETL, piCETL. The use of Tablet PCs for the marking of student assignments has now been adopted as standard practice for tutors in relevant disciplines whilst the use of electronic conferencing for online tutorials was extensively tested and evaluated with the outputs now used as the basis for university wide guidance.
Online Experimentation and Investigation
The Enhancing Remote Access project, which used communicative technologies to allow disadvantaged students to have field experiences, was partly funded by COLMSCT. The project went on to win the GEES Conference Award in 2007, an Open University Teaching Award in 2008 and, subsequently, £250k Research Council funding.
A new Level 1 module on collaborative online experimentation, which uses COLMSCT pedagogic analysis, will be first presented in 2010 with an expected student population of 500 per year.
Initiatives
COLMSCT supported a number of major initiatives which aimed to align pockets of activity and innovation with broader Open University activities.
e-Assessment for Learning: the interactive computer marked assessment (iCMA) initiative
One of COLMSCT’s major successes was the iCMA initiative. Coordinated by a COLMSCT Teaching Fellow, it comprised thirteen projects spanning four Faculties in which new online computer based assessments with targeted feedback were designed, delivered and evaluated. By the end of 2010 over 500,000 iCMAs will have been taken by students. The questions themselves are innovative in their design and have been well received by students, improving student performance. This work has led to a number of publications and numerous colloquia and seminars on the subject. The approach has been adopted as OU strategy and integrated within the open source Moodle environment.
Analysis of written feedback extended to three faculties and had a direct effect on university guidance to tutors.
International
A spin-off of our work was increasing engagement in issues of capacity building and quality in distance learning internationally, in cooperation with OU partners. As we gained expertise in the professional development of academics, our experiences became increasingly relevant to others who wished to enhance quality at scale. We were linked to China, Nigeria, Ghana, Rwanda and Sweden in such conversations, and received a number of visitors throughout our tenure.
COLMSCT was a group of people committed to seeking excellence in Open Learning. Loosely speaking, the Centre was a managed community of experienced practitioners, with a proven commitment to research and innovation in distance teaching.
These practitioners undertook projects that engaged with 'change through action learning' and contributed to the Centre’s themes of assessment, e-learning communities and identities, maths online and online experimentation and investigation.
The Centre activities were coordinated and supported by a Centre Team i.e. a Director, Manager, Administrator and three Educational Researchers.
Advice and comment on the direction and progress of the Centre and its activities were provided by a Core Team, drawn from senior academics from the two Faculties, and by an Advisory Group including external membership.
The main activities were undertaken by the experienced practitioners, called COLMSCT Teaching Fellows.
The progress and performance of the Centre was overseen, as was the case for each of the Open University CETLs, by the CETL Board, chaired by the PVC (Learning, Teaching and Quality), Professor Denise Kirkpatrick.
The Centre Team developed policy, managed and coordinated the Centre activities, and supported the Fellows.
The Centre Director, Manager and Administrator took responsibility for running the Centre and managing its budgets.
The Centre Director was Professor Steve Swithenby. Steve was Dean of the Faculty of Science for six years until November 2004 and has chaired several OU module teams. He has been involved in TLTP and FDTL projects, and contributed to HEFCE and HEA Committees. His academic interests include:
The Centre Manager was Catherine Reuben.
The Centre Administrator was Diane Ford.
COLMSCT’s Educational Researchers have been invaluable. They were tasked with ensuring the fellows’ projects had sound research methodology and were rigorously evaluated. They provided the fellows with advice on conducting their research and guided them through various University ethical procedures for working with students. With their expertise they were able to provide insights into wider activities within the University and identified areas where they would be able to make linkages to other colleagues and groups. Our Educational Researchers were Anne Adams, Laura Hills and Alice Peasgood.
The notion of the informed and involved practitioner was central to COLMSCT. Our Fellows were experienced practitioners, interested in their professional development and prepared to engage in the reflective practice of action research. COLMSCT provided excellent teachers with the time, resources and support they needed to capture, develop and disseminate their own good practice. Fellows developed their own skills through an action learning methodology in pursuing educational projects that were of direct benefit to students. They developed new learning materials, improved student support and enhanced the use of new technology. Pedagogical developments were disseminated through professional networks and learned journals.
Further details about the Fellows and their various projects can be found in the Activities and Projects section.
Assessment Conveners – Steve Swithenby: s.j.swithenby@open.ac.uk or Phil Butcher: p.g.butcher@open.ac.uk
E-Learning Communities and Identities Convener - Anne Adams: a.adams@open.ac.uk
Maths Online Convener – Tim Lowe: t.w.lowe@open.ac.uk
Online Experimentation and Investigation Convener – David Robinson: d.j.robinson@open.ac.uk
Technical queries relating to the COLMSCT site – iet-webmaster@open.ac.uk
Administration or other enquiries relating to COLMSCT work – esteem@open.ac.uk
The Initiative has been set up to bring Faculties together to accelerate the adoption of online computer based assessment in their courses and programmes. The fundamental premise is that computer based assessment with feedback is an underused pedagogic strategy in the OU, and that we should build internal awareness and academic capability in order to improve the learning opportunities we offer to students. Fuller academic engagement will underpin our aspirations to provide a national lead in this form of e-learning.
The E-Assessment for Learning Initiative aims to bridge the gap between academic aspirations and systems development. It supports thirteen sub-projects from across the University. In each, academic staff undertake innovative e-assessment implementation projects within their own course context and work under the CETL and VLE umbrella of activities in order to explore and inform future practice and in particular the VLE.
Investigation into 'Elearning communities and identities' encompasses technological, educational, psychological and sociological issues. While all of these issues frequently interlink to enable or inhibit learning it is important to focus on key elements in our research into teaching and learning. These projects seek to enable support, collaboration, dissemination and networking possibilities both within and outside the OU. The key areas of activity are:
Mathematics poses a number of challenges for learners and teachers alike in an online learning environment. This is mainly because mathematical notation is difficult to produce electronically without a significant investment in training to use specialist typesetting packages (such as LaTeX). There are, however, a number of e-learning technologies that might be used to overcome this problem. Additionally, there are a variety of e-learning products that could be used to enhance a student’s overall learning experience whilst studying mathematics.
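To illustrate the training overhead involved, even a routine piece of course mathematics such as the quadratic formula requires specialist markup when typeset in LaTeX:

```latex
% LaTeX source a student or tutor would need to type in order to
% display the quadratic formula in an assignment or forum post:
\[
  x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}
\]
```

Producing such markup fluently takes practice, which is one reason alternative input and communication technologies were worth exploring.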
These projects explore and pilot the teaching and learning of mathematics in an online environment; build on internal awareness and academic capability in order to improve the learning opportunities we offer to students; and facilitate fuller academic engagement with teaching online in order to underpin our aspirations to provide a national lead in this form of e-learning. Both the CETL and Mathematics-related communities will benefit greatly from this collaboration.
There are some basic principles of participation in experimentation and investigation that could usefully be taught in a generic way. The increasing availability of computer-based communications and mobile phones means that it is possible to consider presenting on-line the sort of experimental and investigative activities that used to be day school and tutorial based.
Providing investigations in an on-line environment is an essential component of courses in practical-based subjects that are designed for electronic delivery. The aim of these projects is to explore issues and approaches to the online curriculum and in particular to experimentation and investigation in an online environment.
Teaching Fellows and Associate Teaching Fellows are part-time roles carried out for a fixed term. A Teaching Fellow is an existing member of the OU full-time staff with part of their time set aside to work in the Centre. Associate Teaching Fellows are recruited from the Associate Lecturer staff.
Fellows are working on projects relating to one of the COLMSCT themes:
COLMSCT is keen to build international as well as national connections with experts in the COLMSCT themes of:
Tony Jones, COLMSCT Associate Teaching Fellow
Despite more than 35 years of collective experience of assessment in open and distance learning, the Open University’s Science Faculty has no systematic way of making that experience available to its staff. Newcomers, in particular, have few resources to help them write assignments and when staff leave or retire their know-how disappears with them. To address this problem, this project has established a wiki – a website that can be written to as well as read – to collect, organise and preserve good practice in the design of assignments.
Phil Butcher, Coordinator of the iCMA initiative and COLMSCT Teaching Fellow
The interactive computer marked assessment (iCMA) initiative was conceived in 2005 on the premise that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The initiative followed closely on, and has been enabled by, the 2005 upgrade to the university’s OpenMark CAA system. This upgrade had been built on modern internet technologies and provided a platform which was able to harness the full potential of a modern multimedia computer.
Sally Jordan, COLMSCT Teaching Fellow
This project, with funding from both COLMSCT and piCETL, has investigated the way in which students engage with interactive computer-marked assignments (iCMAs). This is a huge task and one that is far from complete. However this page gives a flavour of the methodology and reports some early findings.
Verina Waights and Ali Wyllie
Nurses are required to make clinical decisions about patients' health and well-being, responding to changes in each patient's condition, which may occur within very small time-frames.
Keith McGraw, COLMSCT Associate Teaching Fellow
The aim of the project was to identify and share best practice on how to help OU Engineering students achieve their Transferable Skills Intended Learning Outcomes (ILOs) on previous versions (2003-2007) of the Professional Development Planning (PDP) courses: T191 Personal and career development in engineering, T397 Key skills for professional engineers and T398 Towards chartership: professional development for engineers.
Janet Dyke, COLMSCT Associate Teaching Fellow
The aim of this project was to determine whether formative feedback explicitly directed to improvement of study skills could help tutors provide students with the means to become more effective learners. The intended impact was that students would gain from the feedback by having increased knowledge of their strengths and weaknesses. Students should then be able to improve their skills and take them forward through the current and future courses. By setting formative targets related to course learning outcomes the tutor should be able to focus the student towards more self-directed learning.
Hilary Cunningham-Atkins, COLMSCT Associate Teaching Fellow
In October 2005 T171 reached the end of the final presentation after one pilot presentation and nine full presentations. The aim of this project is to collate the vast amount of experience of online teaching and learning amassed by T171 tutors.
Arlene Hunter, COLMSCT Teaching Fellow
The primary aim of this project was to develop and then implement an online interactive formative assessment framework, designed from a constructivist and interventionist perspective that would promote student engagement and understanding of academic progression from an extrinsic as well as intrinsic perspective.
Ali Wyllie and Verina Waights, COLMSCT Teaching Fellows
This project set out to investigate the use of an online decision-making maze tool as a form of eAssessment task that is more motivating and relevant to practice-based students.
Michael Isherwood
A major concern of the M150 Course Team is Block 2, JavaScript programming.
Ian Cooke
To produce e-tutorial teaching and support modules for M150.
Sally Jordan (COLMSCT Teaching Fellow) and Barbara Brockbank (COLMSCT Associate Teaching Fellow)
This project has investigated the use of computer-aided assessment for checking and providing instantaneous feedback on questions requiring short free text answers, typically a sentence in length. The questions were initially written using software provided by Intelligent Assessment Technologies Ltd (IAT). This software uses natural language processing (NLP) techniques of information extraction, but an authoring tool is provided to shield the question author from the complexities of NLP. The IAT software was used with the Open University’s OpenMark e-assessment system, thus enabling students to be offered three attempts at each question with increasing feedback after each attempt. Feedback on incomplete and incorrect responses was provided from within the IAT authoring tool, using a flagging system developed during the project.
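As an illustrative sketch only (the real IAT and OpenMark systems use natural language processing and are far more sophisticated; the question, keywords and function names below are hypothetical), the three-attempt, increasing-feedback model can be expressed as:

```python
# Sketch of a three-attempt question with escalating feedback, standing
# in for the IAT/OpenMark workflow described above. The marker here is
# a crude keyword check, NOT the NLP matching the project actually used.

# Hypothetical marking rule: a correct answer mentions both ideas.
REQUIRED_KEYWORDS = {"evaporation", "condensation"}

# Feedback becomes more explicit after each unsuccessful attempt.
FEEDBACK = [
    "Not quite - think about what happens to the water. Try again.",
    "Hint: your answer should mention both phase changes involved.",
    "The expected answer mentions evaporation and condensation.",
]

def mark(response: str) -> bool:
    """Crude keyword marker standing in for the NLP matching engine."""
    words = set(response.lower().split())
    return REQUIRED_KEYWORDS <= words

def attempt_question(responses: list) -> tuple:
    """Offer up to three attempts; return (passed, feedback shown)."""
    shown = []
    for i, response in enumerate(responses[:3]):
        if mark(response):
            return True, shown
        shown.append(FEEDBACK[i])   # escalate feedback for the next try
    return False, shown
```

A student answering correctly on the second attempt would see only the first feedback message; a student exhausting all three attempts would see the full sequence.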
Phil Butcher, COLMSCT Teaching Fellow
Elsewhere on this website Sally Jordan describes the creation and evaluation of a range of questions which were designed to require students to respond using short-answer free-text responses of up to 20 words. The resulting responses were marked by humans and a computational linguistics algorithm, Intelligent Assessment Technology’s Free Text system, and the accuracy of the outcomes compared. The comparison showed that the computer could perform at a level equivalent to human markers. This project extends this work to encompass two computer marking solutions which are more readily available.
Ken Hudson, COLMSCT Associate Teaching Fellow
The aim of this project is to explore the value of online virtual experiments in enhancing student understanding of course material, retention and employability skills. A 'virtual lab' is defined as an 'e-learning activity based on conventional laboratory procedures, but delivered on-line to distance learners, to give them a more real experience of biological material, procedures and applicability, normally absent from a paper-based course'.
Sarah Mattingly, Faculty of Maths, Computing and Technology
This project is funded by the E-Assessment for Learning Initiative. The work aimed to consolidate and improve students’ ability to work with UML diagrams, a key aspect both of the course M256 'Software development with Java' and of professional software development. It centred on the implementation of two short quizzes with interactive feedback, in which students would be required to interpret and amend or complete UML diagrams.
Maria Fernandez-Toro and Mike Truman
This project has been designed as a sequel to research by COLMSCT Fellow Mirabelle Walker in the Technology Faculty, which set out to investigate, and find ways to improve, the quality of written feedback given on students’ tutor-marked assignments (TMAs).
Mirabelle Walker, COLMSCT Teaching Fellow
This project was designed to investigate, and find ways to improve, the quality of written feedback being given on students’ tutor-marked assignments.
Frances Chetwynd
This project builds on the work of Mirabelle Walker (Fellow 2005-2007) and also on the Formative Assessment in Science Teaching project (www.open.ac.uk/fast).
Sarah North, Faculty of Education and Language Studies
This project is being funded under the E-Assessment for Learning Initiative. It aims to develop a framework for e-assessment that could be used across English language courses in general, to support learning outcomes related to the description, analysis and interpretation of linguistic data.
Christine Leach
A significant number of students studying higher-level chemistry courses have some difficulty with the mathematics. This affects their studies particularly in the more mathematically based areas, such as physical chemistry.
Ingrid Nix (COLMSCT Teaching Fellow) and Ali Wyllie (COLMSCT Teaching Fellow)
This project aims to produce a design for a continuum of topic-based computer-marked questions, from easy to difficult and from formative to summative. Students can choose, depending on their self-assessment, which questions on the continuum to engage with, and can log their reflections as they do so. Our research questions will focus on how students respond to this method of selecting their learning journey, whether students agree with the mapping of the continuum according to the typology of questions we have devised, and what design improvements can be suggested.
Alistair Willis
This project is part of the COLMSCT investigations into using online questions which require students to respond with short answers in free text. In particular, this project is looking into how to set questions that require the students to make two or three separate points, and obtain credit for each subpart. This project is working closely with the e-assessment projects of Sally Jordan and Phil Butcher.
Robert Davis, COLMSCT Associate Teaching Fellow
The focus for the project is to provide an online self-assessment tool for developing an understanding of music analysis (mainly descriptive) and music notation.
Pete Thomas, COLMSCT Teaching Fellow
The primary aim of this project was to design, implement and evaluate a software tool for helping students learn and revise the diagramming skills needed for entity-relationship diagrams (ERDs), a crucial component of any database course. The tool, if successful, would be employed on the Computing Department’s third-level database course, M359, from 2008. At the heart of the tool is an automatic marker for ERDs being developed as a separate research project within the Computing Department. The automatic marker not only awards a grade for a diagram but also provides information useful for feedback purposes.
Judy Ekins, COLMSCT Teaching Fellow
From 2007 the OU has required its students to be online, and it has also adopted the Moodle VLE; this increases the feasibility of implementing electronic assessment. In order to improve retention on Level 1 Open University mathematics courses, this COLMSCT project piloted short interactive internet quizzes. The OU’s OpenMark system was used, enabling students to receive instant feedback, whereas previously they had to wait days or weeks. Students are allowed several attempts at each question, with appropriate teaching feedback after each attempt. At the end of each quiz, alongside the mark, relevant study advice is given to the student, including references to appropriate course material. A hint facility was also introduced for students who were unable to start a question. OpenMark has a variety of question types and is being integrated into the Moodle VLE, and so will be open source.
Despite more than 35 years of collective experience of assessment in open and distance learning, the Open University’s Science Faculty has no systematic way of making that experience available to its staff. Newcomers, in particular, have few resources to help them write assignments and when staff leave or retire their know-how disappears with them.
To address this problem, this project has established a wiki – a website that can be written to as well as read – to collect, organise and preserve good practice in the design of assignments. The original aim of the project was to gather existing good practice in the authoring of TMA (tutor-marked assignment) questions and marking schemes and present it in the form of a practical and accessible guide for anyone concerned with writing TMAs.
Whilst the scope of the project has been restricted to the Science Faculty, much of the material collected would be equally useful within the Faculty of Mathematics, Computing and Technology. The project has, however, been broadened to include all kinds of assessment including computer-marked assignments (CMAs), end-of-course assessments (ECAs) and examinations. Although much useful material has been gathered, the response from potential contributors has been limited.
A one-day experimental workshop attended by associate lecturers and staff tutors confirmed a need for such a facility, identified many recurring problems in the design of assignments and attempted to document solutions in the wiki. Although a wiki appears to be an ideal tool for collecting and organising information from a dispersed group of people, in practice there are obstacles to be overcome. Composing material of value can be time-consuming; the freedom to continually revise not only one’s own published work but also that of others sits uncomfortably with conventions of academic writing; and the wiki interface can appear clumsy to use. Developing a useful wiki requires unusual motivation on the part of its users. Wikis may be best suited to supporting existing projects with groups of people who already share a common purpose.
The wiki and the 'Final Project Report' can be accessed from the Related Resources and Documents listed on this page.
Jones, T. (2007) Use of a wiki to gather good practice in the design of assignments. Presentation at The 2nd Open CETL Conference, Milton Keynes, 15-16 October 2007
Jones, T. (2007) Improving the quality of TMAs in Science courses. Workshop for Science Associate Lecturers, May 2007
Jones, T. (2007) Pooling know-how on assignments. Science Faculty Newsletter, February 2007
Jones, T. (2006) A Practical Guide To The Design Of Assignments In Science. Science Faculty Newsletter, June 2006
Contact
Tony Jones, COLMSCT Associate Teaching Fellow
Awj7@tutor.open.ac.uk
Related resources
Jones, T. (2008) Science Assessment Wiki
Documents
The interactive computer marked assessment (iCMA) initiative was conceived in 2005 on the premise that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The initiative followed closely on, and has been enabled by, the 2005 upgrade to the university’s OpenMark CAA system. This upgrade was built on modern internet technologies and provided a platform able to harness the full potential of a modern multimedia computer.
The leadership of COLMSCT believed that if OU academics could be freed from the constraints of normal course production they could provide the innovation that would lead the university in developing these new models of eAssessment. The projects reported here show what can be achieved in a variety of subject areas. Four years on, COLMSCT takes some pride in reporting that its fellows have been to the fore in directing OU eAssessment developments, and that the faculties that have had COLMSCT fellows are in the vanguard of using eAssessment in OU courses.
COLMSCT offered fellows collaboration with specialists in pedagogy, educational research and educational computing. It also provided the resources to help specify and implement the assessments, and to evaluate both the process of creating the assessments and the outcomes. In all, thirteen projects were supported, in Biology, Chemistry, Computing, Earth Science, General Science, Languages, Mathematics and Nursing.
The iCMA initiative is one of the main themes of work within COLMSCT, which is one of the Open University’s Centres for Excellence in Teaching and Learning (CETLs).
The Centre appointed its first fellows in 2005 and work will continue until 2010. Within the eAssessment strand, fellows were able to ask questions that go beyond the bounds that constrain normal course production cycles. Foremost among these questions was whether or not eAssessment was capable of assessing higher-order learning in any meaningful way. To help think this through, a workshop was convened in late 2005 with invited experts at the forefront of eAssessment from other universities. One conclusion, arising from the combined educational and computing expertise of the discussants and the ‘what if’ enabling approach of the COLMSCT leadership, was that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The challenge to COLMSCT was to establish projects to test this conclusion.
In March of 2006 COLMSCT issued a call for proposals from academic staff to “develop and evaluate innovative e-assessment projects within their own teaching context”. While the remit of COLMSCT was the Mathematics, Science, Computing and Technology areas the call was widened to the whole university with appropriate support from the university’s Learning and Teaching Office. The initiative specifically acknowledged the current gap between academic aspirations and the types of interactions commonly found within standard Computer-Based Assessment systems and encouraged proposals that went beyond those current boundaries. The iCMA initiative offered collaboration and resources to help specify and implement the assessment and to evaluate both the process of creating the assessment and the outcomes.
In the intervening years the iCMA initiative has grown to include projects that have delivered iCMAs for use in Biology, Chemistry, Computing, Earth Science, General Science, Languages, Mathematics, and Nursing. The projects are required to undertake the full project cycle from proposal, through specification and implementation, to evaluation with students and propagation of the outcomes within and outside the university.
More about the iCMA initiative, including a version to print, can be found under the related resources below.
The interactive questions on this site are run by the OpenMark assessment system, which handles your access to a question as if you were taking a real assessment. However, this site gives you considerable flexibility in how you access these demonstration questions, and if you use all of this flexibility you may break some of the rules that are applied to more formal assessments.
One rule you may break triggers an 'Access out of sequence' error. If you see one of these errors, you now know why; just click on the link provided to 'Return to the test'.
The major similarity between projects in the iCMA initiative is reflected in the name; all projects are attempting to engage students in an ‘interactive’ exchange around one or more learning outcomes, with the computer providing instant feedback and multiple attempts for students who answer incorrectly. The overall project is titled eAssessment for Learning and all projects include teaching feedback, often with course references, to persuade students to revisit topics where their answers are incorrect, before attempting the question again. For example see Figure 1 below.

Figure 1 An illustration of immediate targeted feedback
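The pattern in Figure 1 can be sketched in a few lines of Java. This is a hypothetical illustration only; the class, method and hint texts are invented and are not OpenMark's API. An incorrect response earns a progressively more detailed hint on the first two attempts, and the answer itself after the third.

```java
// A minimal sketch (invented names, not OpenMark) of instant feedback
// with up to three attempts and increasing help after each one.
import java.util.List;

public class InteractiveQuestion {
    private static final int MAX_ATTEMPTS = 3;

    // Hints for attempts 1 and 2; after attempt 3 the answer is revealed.
    private final List<String> hints = List.of(
        "Not quite. Check the sign of your answer.",
        "Still not right. Work done against gravity is positive; see Section 2.3.");

    /** Marks one attempt and returns the feedback the student should see. */
    public String attempt(String response, String correctAnswer, int attemptNumber) {
        if (response.trim().equalsIgnoreCase(correctAnswer)) {
            return "Correct, well done.";
        }
        if (attemptNumber < MAX_ATTEMPTS) {
            return hints.get(attemptNumber - 1); // increasingly detailed hint
        }
        return "The correct answer is: " + correctAnswer;
    }
}
```

The teaching feedback and course references mentioned in the text would simply be richer versions of the hint strings shown here.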
While the OU is not unique in using eAssessment in this way, it was perhaps one of the first to realise how automation could be used to support students studying on their own, away from tutors and peers. The university has been host to a variety of projects in this field stretching back to the 1970s. When the OU joined the Moodle community in 2005, the very first thing it added to the Moodle eAssessment tool was the ability to give a much wider variety of feedback. Since Moodle was already the leading open-source VLE before the OU became involved, this was evidence that, among the thousands of institutions using Moodle worldwide, the OU had given unusual thought to how Moodle’s eAssessment could support the distant learner.
The importance of feedback for learning has been highlighted by a number of authors, emphasising its role in fostering meaningful interaction between student and instructional materials (Buchanan, 2000), its contribution to student development and retention (Yorke, 2001), but also its time-consuming nature for many academic staff (Gibbs, 2006). In distance education, where students work remotely from both peers and tutors, the practicalities of providing rapid, detailed and regular feedback on performance are vital issues.
Gibbs and Simpson suggest eleven conditions in which assessment supports student learning (Gibbs and Simpson 2004).
Four of these conditions are particularly apposite to the use of eAssessment within distance education. They are reflected in the design of OpenMark and are amplified in the rationale behind the development of the S151 Maths for Science online assessments (Ross, Jordan and Butcher, 2006).
Readers might like to try this typical OpenMark question with instant feedback https://students.open.ac.uk/openmark/omdemo.text.q4. For non-scientists a response of the form '1s2', which is partially right, should give helpful feedback.
References
Buchanan, T. (2000) 'The efficacy of a World-Wide Web mediated formative assessment', Journal of Computer Assisted Learning, 16, pp. 193-200.
Gibbs, G. and Simpson, C. (2004) 'Conditions under which assessment supports students' learning', Learning and Teaching in Higher Education, 1, pp. 3-31.
Gibbs, G. (2006) 'Why assessment is changing', in C. Bryan and K. Clegg (eds), Innovative Assessment in Higher Education, Routledge.
Ross, S., Jordan, S. and Butcher, P. (2006) 'Online instantaneous and targeted feedback for remote learners', in C. Bryan and K. Clegg (eds), Innovative Assessment in Higher Education, Routledge.
Yorke, M. (2001) 'Formative assessment and its relevance to retention', Higher Education Research and Development, 20(2), pp. 115-126.
All of the iCMA projects wanted students to use and act on the instant feedback there and then, while the problem is still in their minds; conditions 10 and 11 from the Gibbs and Simpson list in the previous section. And so the majority of questions were designed such that, if the student’s first answer is incorrect, they can have an immediate second, or third, attempt. See Figure 2.

Figure 2 An illustration of three attempts at an interactive question
Readers might like to try this non-typical OpenMark question which allows up to 100 attempts (!) https://students.open.ac.uk/openmark/omdemo.twod.marker
Here is how the IMS Question and Test Interoperability (QTI) specification defines adaptive questions (items):
An adaptive item is an item that adapts its appearance, its scoring (Response Processing) or both in response to each of the candidate's attempts. For example, an adaptive item may start by prompting the candidate with a box for free-text entry but, on receiving an unsatisfactory answer, present a simple choice interaction instead and award fewer marks for subsequently identifying the correct response. Adaptivity allows authors to create items for use in formative situations which both help to guide candidates through a given task while also providing an outcome that takes into consideration their path.
Readers will see that by coupling feedback with multiple attempts we have much of what is described as an adaptive item. But, as the excerpts from the following question show, the iCMA initiative embraced all aspects of adaptive questions. We can report that the question features of OpenMark satisfactorily supported all the iCMA projects.

Figure 3a Initially students have to enter their own words into text-entry boxes

Figure 3b The correct response is locked and the student is asked to try again for the remainder

Figure 3c A second attempt

Figure 3d Now two correct responses are locked and the remaining questions become selection lists

Figure 3e On the third attempt the text-entry boxes from attempts 1 and 2 have been replaced by selection lists
Readers might like to try this question for themselves https://students.open.ac.uk/openmark/omdemo.adaptive.q1
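The adaptive behaviour of Figures 3a-3e can be sketched as follows. This is an assumed design with invented names, not OpenMark's implementation, and the reduced-credit weights are purely illustrative: parts answered correctly are locked after each attempt, the interaction degrades from free-text entry to a selection list on the final attempt, and later attempts earn less credit, much as the QTI definition above describes.

```java
// A hypothetical adaptive item (not OpenMark code): locked parts persist
// across attempts, the input mode changes, and the score is weighted by
// attempt number.
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class AdaptiveItem {
    public enum Mode { FREE_TEXT, SELECTION_LIST }

    private final Map<String, String> answers;          // part id -> correct answer
    private final Set<String> locked = new HashSet<>(); // parts already answered correctly
    private int attempt = 0;

    public AdaptiveItem(Map<String, String> answers) {
        this.answers = answers;
    }

    /** The interaction to present next: free text first, a selection list on attempt 3. */
    public Mode nextMode() {
        return attempt < 2 ? Mode.FREE_TEXT : Mode.SELECTION_LIST;
    }

    /** Marks one attempt; correct parts are locked and excluded from re-marking. */
    public double mark(Map<String, String> responses) {
        attempt++;
        for (Map.Entry<String, String> e : responses.entrySet()) {
            String correct = answers.get(e.getKey());
            if (!locked.contains(e.getKey()) && e.getValue().trim().equalsIgnoreCase(correct)) {
                locked.add(e.getKey());
            }
        }
        // Illustrative weights: full marks on attempt 1, reduced credit later.
        double weight = attempt == 1 ? 1.0 : attempt == 2 ? 0.7 : 0.4;
        return weight * locked.size() / answers.size();
    }
}
```

The outcome thus "takes into consideration their path", in the QTI sense, because both the locked parts and the attempt count feed into the final score.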
It follows that, as well as wanting students to work towards the correct answer to the original question, we should perhaps also provide more opportunities for them to practise. This too has been supported by many of the iCMA projects.
What might look like a single question to the user often has several variations behind the scenes, so that a student may revisit the iCMAs for further practice and receive variations on their original questions; in this respect the iCMAs resemble a patient tutor, correcting initial misunderstandings and providing further examples to reinforce the learning. None of the COLMSCT iCMAs is being used for summative purposes, but if they were, the in-built variability would also counteract plagiarism. Here are two such variations.

Figure 4a There are a variety of eruptions available to this question

Figure 4b Of course the response matching behind the scenes has to cope with the variety of eruptions too
Readers might like to try this example https://students.open.ac.uk/openmark/omdemo.maths.q2
Answer the question – you will get feedback to help you if you need it. And once you have completed the question, request 'Next question'. Do you see the same question? In fact this question has five variations so there is a 20% chance that you might see the same question repeated.
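This behind-the-scenes variation can be sketched very simply (the names here are invented for illustration; OpenMark's real mechanism builds variants under computational control within the question itself): a question holds several variants and serves one at random on each visit, which is why a five-variant question repeats with probability one in five.

```java
// A minimal sketch (not OpenMark code) of serving one of several
// question variants at random on each visit.
import java.util.List;
import java.util.Random;

public class VariedQuestion {
    private final List<String> variants;
    private final Random random;

    public VariedQuestion(List<String> variants, long seed) {
        this.variants = variants;
        this.random = new Random(seed); // seeded only to make the sketch reproducible
    }

    /** Picks a variant uniformly; with five variants any one repeats with probability 1/5. */
    public String next() {
        return variants.get(random.nextInt(variants.size()));
    }
}
```

In a real system the variant would parameterise numbers and resources in the question, and the response matching (Figure 4b) must of course match the variant that was served.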
Across the iCMA initiative we have seen the creation of questions that use the capabilities of modern multimedia computers to display the problem and support interactions with the questions.
For example within Science there are several examples of ‘virtual microscopes’ having been put to imaginative use as teaching tools. Now it is possible to reuse the same idea within an iCMA with each of the views below corresponding to different levels of magnification (figures 5a-c).

Figure 5a Low resolution

Figure 5b Medium resolution

Figure 5c High resolution
And, as this figure shows, not only can the resources be varied but students can also be asked to interact with them directly, to show that they have understood what they are looking at (Figure 6 below).

Figure 6 Identifying a parasite on a microscope slide
Incorporation of online resources
The Clinical Decision Making Maze also challenges the student to interpret an array of online resources, from audio interviews through data sheets to online databases, thereby making the experience more akin to a real-life nurse/patient consultation. These are configured to open in different tabs of a tabbed browser, leaving the question in the first tab. Try it here https://students.open.ac.uk/openmark/omdemo.mm.cdm
Compound questions are not unusual (see Figure 7 below). While these are more difficult for the author to analyse and comment on, they do provide students with more substantial tasks.

Figure 7 A compound question with multiple responses to be marked
Several authors are exploring how advances in computing technologies can be utilised in iCMAs. For example we know there is variation in how human markers mark written materials and we can ask how a computer might fare if asked to mark the wide range of student responses that such questions elicit. Jordan, Butcher and Brockbank have been exploring the application of both computational linguistics and computational algorithms to marking free form text (Figure 8).

Figure 8 Automatic marking of free-text responses
We have a demonstration test that contains six questions that require free-text responses. Readers are invited to try it by following this link https://students.open.ac.uk/openmark/omdemo.pm2009.
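A deliberately simplified sketch of the idea behind such marking follows. The names are invented, and this is emphatically not the IAT or OpenMark algorithm, which uses information extraction rather than word matching: here a response is accepted when it expresses each required concept, with simple synonym handling, so that varied phrasings of the same answer still earn the mark.

```java
// A toy free-text marker (illustrative only): a response passes when every
// required concept is present, where each concept may be expressed by any
// of its listed synonyms.
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class FreeTextMarker {
    private final List<Set<String>> requiredConcepts;

    public FreeTextMarker(List<Set<String>> requiredConcepts) {
        this.requiredConcepts = requiredConcepts;
    }

    /** True when every required concept appears somewhere in the response. */
    public boolean accept(String response) {
        Set<String> words = new HashSet<>(Arrays.asList(
            response.toLowerCase().split("\\W+")));
        return requiredConcepts.stream()
            .allMatch(synonyms -> synonyms.stream().anyMatch(words::contains));
    }
}
```

Real systems must go much further, handling negation, word order and spelling, which is exactly why the comparison with human markers reported above matters.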
And Thomas has been exploring the automatic marking of diagrams. Students use a linked applet to draw their diagram, which is then automatically marked, with feedback given in the normal OpenMark style.
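One plausible way to automate diagram marking of this kind, sketched here with invented names rather than Thomas's actual marker, is to reduce both the student's diagram and the model answer to sets of entities and labelled relationships and compare them, so that the feedback can name exactly what is missing or unexpected.

```java
// An illustrative diagram comparator (not the real ERD marker): diagrams
// are flattened to sets of "EntityA-relates-EntityB" strings and diffed.
import java.util.Set;
import java.util.TreeSet;

public class DiagramMarker {
    /** Compares a student's diagram against the model and names the differences. */
    public static String feedback(Set<String> model, Set<String> student) {
        Set<String> missing = new TreeSet<>(model);
        missing.removeAll(student);
        Set<String> extra = new TreeSet<>(student);
        extra.removeAll(model);
        if (missing.isEmpty() && extra.isEmpty()) {
            return "Diagram correct.";
        }
        return "Missing: " + missing + "; unexpected: " + extra;
    }
}
```

A production marker must also cope with students naming entities differently from the model answer, which is where the real research difficulty lies.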
The most striking differences have come from the initiative’s venture beyond the Mathematics, Science and Technology fields that form the backbone of COLMSCT. In both Health and Social Care and Languages we have seen new interactive activities devised and new forms of eAssessment created.
New question components
In its simplest form, our work with Languages has resulted in the cost-effective development of a new OpenMark component that allows the selection of multiple words in a paragraph. In a symbiotic relationship, the new component builds on existing OpenMark functionality and contributes a new question type to the larger pool. The example in Figure 9 raises the question of whether other subjects might devise their own specialised interactions.

Figure 9 The new OpenMark word selection component developed by COLMSCT
Readers may wish to explore their own knowledge of English grammar with this example https://students.open.ac.uk/openmark/omdemo.mcq.q6. Please note that this question is aimed at master’s level students and such students are expected to understand the reasons for their own errors, such that little teaching feedback is included.
Novel use of existing question components
But there are also novel uses of existing question interactions. For example the line interaction was created to enable mathematicians, scientists and technologists to draw tangents or best fit straight lines (Figure 10).

Figure 10 A user is drawing the line shown here
Consequently it was both a surprise and a delight to see how it might be used to help language learning (Figure 11).

Figure 11 Drawing a line to link words
Readers may wish to try a question similar to that shown in Figure 11 by following this link https://students.open.ac.uk/openmark/omdemo.twod.lineconnect
Different forms of eAssessment
While all of our iCMAs have relied on an assessment of knowledge, one, the Clinical Decision Making Maze, has also followed the pathway that the student takes through the activity. With different responses leading to different pathways, this is an example of an adaptive eAssessment. As such it provides a different form of experience from the sequences of unrelated questions found in many applications of eAssessment.
In looking back over the projects we have encountered some difficulties in supporting our fellows’ requests for adaptive testing. While OpenMark’s question features have coped very well with our fellows’ designs, sequencing questions on the basis of student performance has proved much harder; indeed the iCMA initiative has circumvented this problem rather than solved it.
The iCMA initiative offered collaboration and resources to fellows in three key areas: Pedagogic, Technical and Evaluation support.
Pedagogic support
We started by recognising that writing questions that explore students’ understanding of their subject is a skilled activity. Couple this with the wish to include feedback that enables students who respond incorrectly to correct their misunderstanding, and the task grows; but so does the student engagement. While most fellows started with their own idea, the iCMA coordinator and project consultants had many years of experience in creating interactive multimedia materials and could guide the fellows on what might work, steering them away from what would be impossible to achieve. For example, interactive eAssessment must react sensibly to student inputs, and if a question is too ‘open’ this becomes impossible; setting the question well and providing appropriate response matching are therefore key starting points.
We would also include here the application of technology to support pedagogic ends: our authors did not have to concern themselves with implementation issues, so ‘difficult’ areas of implementation did not cloud their view of what they wished to do.
Technical support
All technical issues were undertaken by the project coordinator and two experienced consultants, Spencer Harben and Adam Gawronski, who were familiar with writing interactive computer-based materials. The consultants were able to guide the fellows on what forms of interaction could be supported and how different inputs might lead to a range of responses that would have to be dealt with.
Clearly questions should be functionally correct, and the coordinator and consultants ensured that they were.
Evaluation support
Support was provided to evaluate both the process of creating the assessment and the outcomes of using the assessment with students. For the latter the fellows used a selection of data analysis of student responses collected by the eAssessment systems, online questionnaires, observation in the Institute of Educational Technology’s data-capture suite, online monitoring with Elluminate and one-to-one interviews.
All of the COLMSCT iCMAs were implemented in the OpenMark eAssessment system that was developed at the Open University http://www.open.ac.uk/openmarkexamples. The university has integrated OpenMark with Moodle and readers wishing to know more about this project are referred to our OpenLearn website http://labspace.open.ac.uk/course/view.php?id=3484.
The OpenMark philosophy
“There is already a trend towards a larger proportion of multiple-choice questions in British exams – a tendency often taken to extremes in the United States. This may be less a matter of academic merit than the convenience that these papers can be marked electronically. This is an outcome that should worry all those involved in education. The best test of a test is whether it stretches pupils, not that it is easy to mark” Editorial, The Times, 7th August 2004.
The raison d’être for OpenMark is to enable the Open University to produce interactive assessments that go beyond the bounds of the restricted range of question formats found in most commercial CAA systems.
The OpenMark philosophy is also:
• To exploit modern interactive computing technology for the purpose of enhancing the student learning experience.
• To provide an ‘open’, extensible, framework capable of supporting question types that go beyond multiple-choice.
• To support assessments that are engaging.
• To provide immediate feedback.
OpenMark is somewhat different from many CAA systems in the methods used for building questions. Questions and assessments to be run in the OpenMark system are constructed in an editor such as Eclipse using a combination of OpenMark XML components and Java. This combination enables OpenMark to be used very flexibly, but it comes at the price of requiring authors to be comfortable reading and writing Java.
Figure 12 illustrates how this works in practice, with most output to the student being written in XML and most input from the student being analysed in Java. Because the XML can be controlled by the Java, it is also possible to introduce variations under computational control. The result is that each technology is used for what it is good at: XML for specifying the content and laying out the web pages; Java for analysing and making decisions on student responses.

Figure 12 OpenMark XML components are combined with small Java code segments which analyse responses and select the feedback that is to be given.
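The division of labour in Figure 12 might be sketched as follows. The markup and method names here are invented for illustration and do not reflect OpenMark's real schema or question classes: an XML-like fragment specifies the content, while a small piece of Java performs the response matching and selects the feedback.

```java
// An illustrative XML-plus-Java question (hypothetical markup and names,
// not OpenMark's real schema): XML lays out the question, Java decides
// which feedback a response deserves.
public class XmlPlusJavaQuestion {
    // In OpenMark the presentation lives in XML components; sketched here as a string.
    static final String LAYOUT =
        "<question><text>What is the chemical symbol for sodium?</text>"
        + "<entry id='response'/></question>";

    /** Java handles the response matching: exact symbol, case matters. */
    public static String check(String response) {
        String r = response.trim();
        if (r.equals("Na")) return "Correct.";
        if (r.equalsIgnoreCase("na")) return "Right element, but chemical symbols are case-sensitive: try again.";
        if (r.equalsIgnoreCase("sodium")) return "That is the name; the question asks for the symbol.";
        return "Not right; see Block 2 on the periodic table.";
    }
}
```

Even this toy example shows why the combination is powerful: the discrimination between 'na', 'sodium' and other wrong answers is ordinary Java logic, which no fixed set of declarative question options could express as flexibly.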
The open source site holding the OpenMark system also includes a variety of examples that show how the XML and Java work together. We would acknowledge that there is a learning curve to using OpenMark, but many interactive media developers at the OU have climbed that curve and there are now thousands of OpenMark questions in regular use.
We would stress that COLMSCT set itself the task of pushing the boundaries and the flexibility of the OpenMark system has suited our purposes admirably.
The iCMA initiative started to deliver its first iCMAs in July 2006. The following graph (Figure 13) shows how the University’s use of iCMAs has increased during the lifetime of COLMSCT. We would not wish to claim responsibility for all of the increase over the period, but we can be clear that, through their roles in COLMSCT and in course team work, COLMSCT fellows have driven the upward trend shown.

Figure 13 iCMAs served by the OU to July 2010. The figure shows total usage by year and includes diagnostic, formative and summative iCMAs.
COLMSCT has also provided the pedagogic underpinning behind the university’s development of its eAssessment systems. Sally Jordan and Pete Thomas were the academic advisers to the VLE eAssessment project, Ingrid Nix and Ali Wyllie were contributing members of the eAssessment Faculty Liaison Group, and Phil Butcher was the VLE eAssessment project manager. These fellows were able to steer the development of Moodle towards more OpenMark-like features, with Phil Butcher providing many of the designs. The software developments have been undertaken by technical staff in the Strategic Development section of the Learning and Teaching Solutions department and are described in ‘Online assessment at the Open University using open source software’ in the Resources section of this site.
I have worked in computer-based learning since 1974 and during that time have performed most roles encompassing programmer, author, systems designer, researcher, educational evaluator, manager and acting head of department. I wrote my first interactive CBL program at Leeds University in 1974, my first for the OU in 1975 and moved from Leeds to the OU in 1977. As well as overseeing the COLMSCT iCMA initiative I have also continued to maintain my software skills and on the free-text marking project I have been active both in enhancing the OpenMark response matching algorithm and using it to analyse students’ responses.
During my long, but varied, career at the OU, I have been on numerous course teams that have advanced the use of interactive computers in the OU’s teaching and learning practices, and 30 years on the pace of change doesn’t slacken. That the COLMSCT CETL has chosen to address online assessment as a major theme illustrates that increasingly sophisticated application of computers in education continues to make major in-roads into established practices.
Here are a few highlights that led to my current fellowship in COLMSCT. The Works Metallurgist (with Adrian Demaid for TS251) in 1980 was the first OU CAL program to use interactive graphics and The Teddy Bear Videodisc (with Keith Williams (Technology) and Martin Wright (BBC) for T252) in 1984 was the first to integrate full-screen video and interactive computing. In 1986 I joined the T102 Living with Technology course team, which introduced personal computing into OU first level courses in 1989; in 1996 T102 became the first non-ICT undergraduate course to utilise conferencing and e-mail on a large scale. From 1996 I managed the introduction of multimedia computing into the Science Faculty, with S103 Discovering Science becoming the first course to replace paper-based CMAs with interactive questions delivered through a modern multimedia computer; this was the start of the development of the OpenMark interactive assessment system. In 2002 OpenMark was moved online to support S151 Maths for Science and since 2005 has been integrated ever more closely with the OU’s VLE developments. OpenMark is the system that has enabled the creation of the diverse COLMSCT iCMA initiative projects shown on this site.
I have an MPhil in Computer-Based Education from the University of Leeds. I guess I thought that this was ‘normal’ until I heard Tim O’Shea (also ex. Computer-Based Learning at Leeds) describe the impact of Leeds in this area and how, in our different ways, he and I had brought that approach to the OU. And having made me think about it I can see he has a point. In my formative CBL years it was made clear to me by Roger Hartley (then Director of the Leeds CBL Unit) that it’s not only the student who should work hard at CBL but also the authors and at run-time the computer. In recent years my role as the COLMSCT iCMA initiative coordinator has put me in a prime position to continue this tradition while at the same time helping my COLMSCT colleagues deliver their ideas across the internet.
Phil Butcher, Coordinator of the iCMA initiative and COLMSCT Teaching Fellow
P.G.Butcher@open.ac.uk
This project, with funding from both COLMSCT and piCETL, has investigated the way in which students engage with interactive computer-marked assignments (iCMAs). This is a huge task and one that is far from complete. However this page gives a flavour of the methodology and reports some early findings.
Note: This website will be frozen from August 2010, but I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
The project’s methodology has included an extensive analysis of the quantitative data captured when students attempt iCMAs. Tools have been produced to extract information on things such as the time spent per question, the order in which the questions are attempted, the use made of the iCMA relative to any deadlines and other aspects of the course’s calendar, the percentage of blank responses and the extent to which responses are altered in response to feedback. In addition to compiling an overall picture of the use made of each iCMA, the influence of various factors that might affect use has been investigated. For example, to what extent does the presence of an examination encourage students to use formative-only iCMAs? The quantitative analysis has been supplemented by a qualitative investigation into student perceptions of iCMAs in different situations.
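The kind of per-question summary described above can be sketched as follows. This is illustrative only: the OpenMark database schema is not described in the text, so the log format assumed here (user, question, timestamp, response) and the function name are hypothetical.

```python
# Illustrative sketch: summarise per-question time spent and blank-response
# rate from a transaction log. The (user, question, timestamp, response)
# record format is an assumption, not the project's actual data model.
from collections import defaultdict
from datetime import datetime

def summarise_attempts(log):
    """Per-question summary: mean time spent (first to last action per user)
    and the fraction of responses left blank."""
    times = defaultdict(list)              # question -> time spent per user (s)
    blanks = defaultdict(lambda: [0, 0])   # question -> [blank, total] responses
    by_user_q = defaultdict(list)
    for user, question, ts, response in log:
        by_user_q[(user, question)].append((ts, response))
        blanks[question][1] += 1
        if not response.strip():
            blanks[question][0] += 1
    for (user, q), actions in by_user_q.items():
        actions.sort()  # order each user's actions on a question by timestamp
        times[q].append((actions[-1][0] - actions[0][0]).total_seconds())
    return {
        q: {
            "mean_seconds": sum(ts) / len(ts),
            "blank_fraction": blanks[q][0] / blanks[q][1],
        }
        for q, ts in times.items()
    }

# A tiny made-up log: one student revises an answer, another leaves it blank
log = [
    ("u1", "Q1", datetime(2008, 5, 1, 9, 0), "balanced"),
    ("u1", "Q1", datetime(2008, 5, 1, 9, 2), "the forces are balanced"),
    ("u2", "Q1", datetime(2008, 5, 2, 20, 0), ""),
]
summary = summarise_attempts(log)
```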
A range of courses using iCMAs was chosen for inclusion in the project, including S104 Exploring Science, S154 Science Starts Here, S110 Health sciences in practice, S279 Our Dynamic Planet, S342 Physical Chemistry, SK123 Understanding cancers, SM358 The Quantum World, SXR103 Practising Science, SXR208 Observing the Universe and M150 Data, Computing and Information. Diagnostic quizzes such as ‘Are you ready for Level 1 Science?’ were also included. Courses and components were selected so as to encompass a wide range of different types of iCMA use, though the initial analysis focused on OpenMark iCMAs (since the tools were written to extract information from the OpenMark database) and on Science Faculty courses.
Follow the links on the right-hand side of this page for more information about when students attempt iCMAs, the number of questions attempted, the length of short-answer free-text questions, the use made of feedback and student opinion of iCMAs. The overall conclusions to date are
A related project has considered appropriate statistical tools for the analysis of OpenMark iCMAs, looking at the level of the whole quiz, individual questions and separate variants of a question. Random guess scores for a range of Moodle and OpenMark question types have also been calculated. For more information on this work follow the appropriate links on the right-hand side of this page or see the two documents written by Helen Jordan.
Reviews of the literature (e.g. Black and Wiliam, 1998; Gibbs and Simpson, 2004) have identified conditions under which assessment appears to support and encourage learning. Several of these conditions concern feedback, but the provision of feedback does not in itself lead to learning. Sadler (1989) argues that in order for feedback to be effective, action must be taken to close the gap between the student’s current level of understanding and the level expected by the teacher. It follows that, in order for assessment to be effective, feedback must not only be provided, but also understood by the student and acted on in a timely fashion.
These points are incorporated into five of Gibbs and Simpson’s (2004) eleven conditions under which assessment supports learning:
Condition 4: Sufficient feedback is provided, both often enough and in enough detail;
Condition 6: The feedback is timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance;
Condition 8: Feedback is appropriate, in relation to students’ understanding of what they are supposed to be doing;
Condition 9: Feedback is received and attended to;
Condition 11: Feedback is acted upon by the student.
It can be difficult and expensive to provide students with sufficient feedback (Condition 4), especially in a distance-learning environment, where opportunities for informal discussion are limited. Feedback on tutor-marked assignments is valuable but may be received too late to be useful (Condition 6), and it is then difficult for students to understand and act upon it (Conditions 8 and 11), even assuming that they do more than glance at the mark awarded (Condition 9).
One possible solution to these dilemmas is to use e-assessment. Feedback can be tailored to students’ misconceptions and delivered instantaneously and, provided the assessment system is carefully chosen and set up, students can be given an opportunity to learn from the feedback whilst it is still fresh in their minds, by immediately attempting a similar question or the same question for a second time, thus closing the feedback loop. Distance learners are no longer disadvantaged — indeed the system can emulate a tutor at the student’s elbow (Ross et al., 2006, p.125) — and ‘little and often’ assessments can be incorporated at regular intervals throughout the course, bringing the additional benefits of assisting students to pace their study and to engage actively with the learning process, thus encouraging retention. For high-population courses, e-assessment can also deliver savings of cost and effort. Finally, e-assessment is the natural partner to the growth industry of e-learning.
However, opinions of e-assessment are mixed and evidence for its effectiveness is inconclusive; indeed e-assessment is sometimes perceived as having a negative effect on learning (Gibbs, 2006). Murphy (2008) reports that high-stakes multiple-choice tests of writing can lead to actual writing beginning to disappear from the curriculum; she also reports that ‘the curriculum begins to take the form of the test’. There are more widely voiced concerns that e-assessment tasks (predominantly but not exclusively multiple-choice) can encourage memorisation and factual recall and lead to surface learning, far removed from the tasks that will be required of the learners in the real world (Nicol, 2007; Scouller and Prosser, 1994). Also, although multiple-choice questions are in some senses very reliable, doubt has been expressed as to whether they always assess what the teacher believes they do, partly because multiple-choice questions require ‘the recognition of the answer rather than the construction of a response’ (Nicol, 2007).
Ashton and her colleagues (2006) point out that the debate about the effectiveness of multiple-choice questions ‘diverts focus away from many of the key benefits that online assessment offers to learning’. Perhaps the question we should be asking is not ‘should we be using e-assessment?’ but rather ‘what are the features of an effective e-assessment system?’ (Mackenzie, 2003).
References
Ashton, H.S., Beevers, C.E., Milligan, C.D., Schofield, D.K., Thomas, R.C. and Youngson, M.A. (2006). Moving beyond objective testing in online assessment, in S.C. Howell and M. Hricko (eds) Online assessment and measurement: case studies from higher education, K-12 and corporate. Hershey, PA: Information Science Publishing: 116-127.
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5, 1, 7-74.
Gibbs, G. (2006) Why assessment is changing, in C. Bryan and K.V. Clegg (eds) Innovative assessment in higher education. London: Routledge: 11-22.
Mackenzie, D. (2003) Assessment for e-learning: what are the features of an ideal e-assessment system? 7th International CAA Conference, Loughborough, UK. At http://www.caaconference.com/pastConferences/2003/procedings/index.asp
Murphy, S. (2008) Some consequences of writing assessment, in A. Havnes and L. McDowell (eds) Balancing Dilemmas in Assessment and Learning in Contemporary Education. London: Routledge: 33-49.
Nicol, D.J. (2007). E-assessment by design: using multiple choice tests to good effect. Journal of Further and Higher Education, 31, 1, 53–64.
Ross, S.M., Jordan, S.E. and Butcher, P.G. (2006) Online instantaneous and targeted feedback for remote learners, in C. Bryan and K.V. Clegg (eds) Innovative assessment in higher education. London: Routledge: 123-131.
Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.
Scouller, K.M. & Prosser, M. (1994). Students’ experiences of studying for multiple choice question examinations. Studies in Higher Education, 19, 3, 267–279.
Figure 1 shows the usefulness of apparently very simple data, in this case the overall number of actions (transactions) per day on the diagnostic quiz 'Are you ready for level 1 science?'. A cyclical pattern can be seen in the figure, and careful inspection reveals an unexpected finding: potential students are less likely to use the quiz on a Saturday.
Figure 1 Action levels (the number of transactions for all users) for 'Are you ready for level 1 science' on a day-by-day basis for the whole of 2008.
This finding sparked an investigation into the average daily use made of the Open University's virtual learning environment as a whole, and it was found that the number of registered students using the VLE is greatest on Monday, declines steadily to Friday, drops more noticeably on Saturday and then rises a little on Sunday. For a distance-learning university which holds many tutorials on a Saturday (on the assumption that this is the most convenient day for students), this finding has startling implications. Within each day, the rate of iCMA transactions is remarkably steady between 9am and 9pm, then drops overnight. An increase in the global presentation of Open University courses is likely to result in a change in this pattern.
Figures 2(b) and 2(c) illustrate the influence of cut-off date on the overall activity for summative iCMAs, in this case for S154 Science starts here. Use builds as the cut-off date approaches. The practice iCMA is used throughout the entire 10-week presentation (Figure 2(a)).
Figure 2 Action levels (total number of interactions for all students) for (a) the S154 practice iCMA for the presentation that started on 27th September 2008 and (b) and (c) the two summative iCMAs (iCMA41 and iCMA42) for the same presentation.
SDK125 Health sciences: a case study approach has shorter practice iCMAs associated with each of its summative iCMAs; it also has an end-of-course examination. Whilst the activity on its summative iCMAs (Figures 3(b) and 3(d)) is similar to that for S154, it can be seen (Figures 3(a) and 3(c)) that students attempt the formative iCMAs both as practice for the relevant summative iCMAs and as revision prior to the examination.
Figure 3 Action levels (total number of interactions for all students) for (a) and (c) two of SDK125’s practice iCMAs; (b) and (d) the equivalent summative iCMAs.
Figures 4(a) and 4(b), again drawn using S154 data, illustrate typical behaviours for individual students on all summative iCMAs. Many students behave like Student 1; they open the iCMA and attempt all 10 questions on a single day, frequently close to the cut-off date (Figure 4(a)). The behaviour shown in Figure 4(b) (Student 2) is similarly common for all summative iCMAs: students open the iCMA and look at all the questions, then attempt them as they work through the course material, making use of feedback from unsuccessful attempts at the iCMA questions. The behaviour shown in Figure 4(c) (Student 3) is quite typical for S154 but is not seen for other modules. Students are advised to attempt questions 1-4 after completing Chapter 2, questions 5-6 after Chapter 3 and questions 7-10 after Chapter 4, and many do precisely this.
Figure 4 Three typical student behaviours exhibited on S154 summative iCMAs.
Not surprisingly, when iCMAs are used summatively, most students complete all the questions. In formative-only use, there is typically a reduction in use as the iCMA progresses, as shown in the figure for the Practice iCMA for S154 Science starts here. As the iCMA progresses, there is both a decrease in the number of people who have completed each question (dark blue lines) and a decrease in the extent to which users repeat questions (paler lines).
It has been suggested that this decline in use is caused by having too many questions, but a similar decline is seen for courses with several shorter formative-only iCMAs; use decreases both during each iCMA and from iCMA to iCMA. Furthermore, there are always some users who access iCMAs, but do not complete any questions. The iCMA whose use is shown in the figure (with 640 users completing Question 1 and 668 completing Question 2), was opened by 768 registered S154 students, and it appears that around 100 of these people did not take further action. In interviews, most students were happy to admit to a lower level of engagement with iCMAs when in formative-only use, but none admitted to failing to complete any questions after opening an iCMA, so the reason for this behaviour is not known (though it is speculated to occur when students decide for themselves that the questions are either trivial or too difficult).
The general question-by-question reduction in use is bucked in several places, e.g. at Question 43 and Question 46. These questions are clearly identified in the iCMA as the first questions to assess new chapters of the course; presumably students find these chapters challenging and so seek additional practice and reinforcement. Thus, clear signposting appears to be beneficial.
Student responses to short-answer free-text questions in summative use have generally been found to be more likely to be correct, more likely to be expressed as sentences and longer than responses to the same questions in formative-only use.
Figures 1 and 2 compare the length of responses obtained to Question A: 'A snowflake falls vertically with constant speed. What does this tell you about the forces acting on the snowflake?'.
Figure 1 Distribution of length of 888 responses to Question A in formative-only use.
Figure 2 Distribution of length of 2057 responses to Question A in summative use.
In formative-only use (Figure 1) the peak at one word corresponds to the answer ‘balanced’ and the peak at 3 words corresponds to the answers such as ‘they are balanced’ and ‘equal and opposite’. In summative use (Figure 2) the peak at 4 words corresponds to answers such as ‘the forces are balanced’ and the peak at 8 words corresponds to answers such as ‘the forces acting on the snowflake are balanced’ and ‘there are no unbalanced forces acting on it’.
It is quite common for the distribution of lengths to be bimodal; in other questions there is sometimes a peak for answers that simply answer the question (e.g. ‘the force is reduced by a factor of four’) and another for answers that add an explanation (e.g. ‘the force is reduced by a factor of four since it depends on dividing by the square of the separation’).
Unfortunately, some excessively long responses (up to several hundred words) were received to early summative versions of short-answer free-text questions, and these frequently contained a correct answer within an incorrect one. Responses of this type are recognised as being the most difficult for computerised marking systems of any type to deal with, and for this reason a filter was introduced from February 2009 to limit the length of responses to 20 words. Students who give a longer answer are told that their answer is too long and are given an extra attempt. The filter was initially accompanied by the warning 'You should give your answer as a short phrase or sentence. Answers of more than 20 words will not be accepted'.
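A minimal sketch of such a length filter is given below. The 20-word limit and warning text come from the passage above; the function itself is illustrative, not the OpenMark implementation.

```python
# Illustrative response-length filter: rejects free-text answers over the
# 20-word limit described in the text, returning the warning shown to students.
WORD_LIMIT = 20

def check_length(response):
    """Return (accepted, message) for a short-answer free-text response."""
    if len(response.split()) > WORD_LIMIT:
        return False, ("You should give your answer as a short phrase or "
                       "sentence. Answers of more than 20 words will not "
                       "be accepted.")
    return True, ""

ok, msg = check_length("the forces acting on the snowflake are balanced")
```

A rejected response would, as described above, cost the student nothing: they are simply given an extra attempt.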
The introduction of the filter and explanatory text reduced the number of students who added text to previous answers without thought to the sense of the response so produced. It also dealt with the excessively long responses that were difficult to mark, and increased the number of students giving their responses as sentences. However, for all questions, the addition of the filter and explanatory text resulted in an overall increase in median response length (see the distribution shown in Figure 3).
Figure 3 Distribution of length of 1991 responses to Question A in summative use with filter and additional wording on the question.
A possible explanation of this effect is that more students were heeding the advice to give their answer as a sentence, now that this advice was given in the question. A less desirable explanation is that students were interpreting the advice to use no more than 20 words as indicating that they should be writing exactly or almost twenty words. From July 2009, the advice accompanying the filter has been changed to ‘You should give your answer as a short phrase or sentence.’ Students are still given an extra attempt if their answer is too long, and it is only at this stage that they are informed that their answer must be no more than 20 words long. This second change of wording appears to have had the desired effect: the most recent distribution still has peaks at 4 words and 8 words, but there are considerably fewer responses close to the 20-word limit.
Similar effects were observed for all questions, and the undesirable effect of a change in wording that was intended to be helpful is a useful reminder of the need to monitor student responses to e-assessment questions.
When asked in end-of-iCMA feedback questions whether they found the feedback useful, the proportion of students who respond in the affirmative is consistently around 90%. Similarly, 85% of surveyed S104 students agreed with the statement ‘If I get the answer to an iCMA question wrong, the computer-generated feedback is helpful’ and 85% of surveyed SXR103 students agreed with the statement ‘If I initially get an answer to an iCMA question wrong, the hints enable me to get the correct answer at a later attempt’.
Observations of students in the usability laboratory painted a rather different picture. Whilst students sometimes made good use of the various aspects of the feedback provided in altering their answer for a subsequent attempt (making use of the simple fact of being told they were wrong, the more detailed prompt and the reference to the course material), there were also several instances where students did not pay sufficient attention to the feedback even when they appeared to read it. For example, student Christina* entered ‘absorbing a photon’ in answer to a question for which a correct answer would have referred to the emission not absorption of a photon. Her answer was incorrect, but a software problem (later resolved) meant that she was told her answer was correct. She looked at the final answer provided by the iCMA question and said ‘oh, did that right’ despite the fact that the word ‘emission’ was clearly visible, emboldened, on the screen she was looking at.
In an attempt to investigate factors that might influence the use that students make of feedback, an analysis was performed of the extent to which incorrect responses are left unchanged for a second or third attempt after feedback has been provided and the extent to which the data-entry box is left blank. Both types of behaviour are more common for free-text questions than for selected-response items, and for questions that the student considers more difficult.
Whether the iCMA is summative, formative-only or diagnostic is also an influential factor. For variants of the same question (a question requiring the student to use a provided word equation to calculate density):
in summative use, 21% of the third-attempt responses were identical to those given previously, with 2% of them blank;
in formative-only use, 46% of the third-attempt responses were identical to those given previously, with 7% of them blank;
in diagnostic use, 55% of the third-attempt responses were identical to those given previously, with 19% of them blank.
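The figures above can be derived from attempt histories along the following lines. This is a hypothetical sketch: the record format (a list of each student's response strings, in attempt order) and the exact counting rules are assumptions.

```python
# Hypothetical sketch: for students who reach a third attempt at a question,
# count third attempts identical to an earlier response, and third attempts
# left blank. The attempt-history format is an assumption.
def third_attempt_stats(attempts_by_student):
    repeated = blank = total = 0
    for attempts in attempts_by_student:
        if len(attempts) < 3:
            continue  # this student never reached a third attempt
        total += 1
        third = attempts[2]
        if third in attempts[:2]:
            repeated += 1      # unchanged despite the feedback provided
        if not third.strip():
            blank += 1         # data-entry box left blank
    return repeated / total, blank / total

# Two students reach a third attempt; one repeats, one revises after each hint
repeated_frac, blank_frac = third_attempt_stats([
    ["1.2", "1.2", "1.2"],
    ["12", "1.2", "1.25"],
    ["", ""],
])
```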
Interviews identified, as one factor behind this result, a reluctance to spend time on a question (e.g. in finding a calculator) when the mark did not count. In addition, students who completely lack understanding of the question or feedback feel unable to enter an answer in the first place, or to alter it. As Trevor said:
I found that the hint in the second attempt was absolutely uninformative and I couldn’t see where I was wrong.
*All names have been changed
In general, students regard iCMAs as useful in helping them to learn and in highlighting what they need to study further, with 87% of respondents agreeing with the statement 'Answering iCMA questions helps me to learn', 78% agreeing that 'Answering iCMA questions directly helps me to learn skills or knowledge' and 79% agreeing that 'Answering iCMA questions helps me to understand what I need to study further'. 64-68% agreed that 'answering iCMA questions is fun'. These findings are substantiated by free-text survey comments from students such as
Interviewed student Martin* felt the iCMAs were particularly useful because he was studying at home and without easy access to a tutor. Although most students felt that TMAs (tutor-marked assignments) were more useful in their learning (agreeing with 'I learn more by doing TMA questions than by doing iCMA questions' and 'The feedback that my tutor provides on my TMA answers is more helpful than the feedback provided by the computer on my iCMA answers'), a large percentage were neutral in their response to these statements and some felt that iCMAs were more useful. When this point was followed up in the interviews, most people identified iCMAs and TMAs as useful for different things. Rachel said that she would be happy with courses assessed entirely by iCMA. Deborah (whose course did not have iCMAs) highlighted the importance of the timeliness of iCMA feedback to
The instantaneous receipt of feedback was the most commonly identified useful feature of iCMAs, with one student contrasting iCMAs with computer-marked assignments (CMAs) in earlier modules, which were submitted and returned through the post, and thus
Other features of iCMAs that were identified as particularly useful included the availability of three attempts, the content of the feedback prompts and the references to course materials. Trevor was pleased that the questions were relatively testing:
However, it should not be forgotten that a small number of students do not find iCMAs helpful or enjoyable, perhaps linked to the fact that some (rightly or wrongly) believe that the computer sometimes marks their answers inaccurately (17% agreed with the statement 'The computer sometimes marks my iCMA answers incorrectly') or penalises them for careless mistakes (22% agreed with the statement 'The computer penalises me for careless mistakes').
A decision was taken to interview some students whose survey responses had indicated some disquiet with iCMAs. Patricia had said:
Steven had said:
However, significantly, it transpired during the interviews that both of these students were extremely happy with iCMAs in general, just not with particular questions or with aspects of their use.
Most students felt that their mark for iCMA questions should count towards their overall course score (71% disagreed with the statement 'I don't think that the mark I get in iCMAs should count towards my overall course score'), and those interviewed felt that the 20% weighting in S104 was about right. It proved impossible to interview any students who felt that the mark they got in iCMAs should not count towards their overall score. There was nevertheless a difference in the reported influence of marks on behaviour, with two extremes being Martin, who reported engaging in summative and formative-only iCMAs in exactly the same way, and Trevor (who had not attempted SXR103’s purely formative iCMA), who said:
* all names have been changed
Computer marked assignments (CMAs) have been used throughout the Open University’s 40-year history. The original multiple-choice CMA questions were delivered to students on paper; responses were entered on OMR forms and submitted by post. A range of statistical tools has been in operation for many years, enabling course team members to satisfy themselves of the validity of individual questions and whole CMAs; these tools were also designed to improve the quality of future CMAs by enabling question-setters to observe student performance on their previous efforts.
The introduction of online interactive computer marked assignments (iCMAs) has extended the range of e-assessment tasks available to course teams. Equivalent statistical tools to those used in the original CMA system are now available for both Moodle and OpenMark iCMAs. However, iCMAs are different from CMAs in three ways:
A recent graduate of mathematics, with an interest in statistics and probability, was employed on a consultancy contract in the summer of 2009 to investigate whether the previously used statistical tests were valid for iCMAs. She found no overriding reason to doubt the validity of any of the tests, though the usefulness of some of them was open to question, and the different scoring mechanism for iCMA questions (linked to multiple attempts) meant that the recommended ranges for the test statistics were likely to be different for iCMAs. The tests were run against several iCMAs and the consultant worked with Sally Jordan (as chair of S104 Exploring Science) to recommend new empirically-based ranges. She also recommended some alternative and additional statistical tests that might be of use in alerting course teams to discrepant behaviour of iCMAs, individual questions and particular variants.
The work is described in more detail in a document by Helen Jordan (given on the right hand side of this page). A summary is given in the following sub-pages, along with a reflection on the outcomes of running the tools against several additional ‘unseen’ iCMAs.
The previously calculated statistics were:
Mean, median, standard deviation, skewness, kurtosis, coefficient of internal consistency, error ratio and standard error.
For iCMAs, the recommendations are:
Course teams should be told the mean, median, and standard deviation (with recommended ranges for mean and standard deviation).
Course teams should be told the standard error (calculated according to a simplified definition and with a recommended range) along with a new statistic, the ‘systematic standard deviation’. If the standard error is too high, then students of a similar ability are likely to get significantly different marks; if the systematic standard deviation is too low then the iCMA does not discriminate well between students of different abilities.
One of the common causes of a high standard error is that there are too few questions in the iCMA. However, in many cases, the iCMA under consideration forms only part of the assessment of a given student. In these cases, it does not really matter if the standard error of one particular iCMA is relatively high, provided that the standard error of the combined assessments is low. To counter the effect of the number of questions, it may also be useful to quote the standard error multiplied by the square root of the number of questions (again with a recommended range).
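The two adjustments described above can be sketched as follows. This is illustrative only: the 'simplified definition' of standard error used by the project is not given in the text, and the function names and example figures are made up purely to show the sqrt(n) scaling and the effect of combining assessments.

```python
import math

# Illustrative helpers for the adjustments described in the text; the
# project's actual standard-error definition is not specified, so these
# operate on an already-computed standard error.
def adjusted_se(se, n_questions):
    # Multiplying by sqrt(n) counters the tendency of iCMAs with few
    # questions to have a larger standard error, allowing comparison
    # across iCMAs of different lengths.
    return se * math.sqrt(n_questions)

def combined_se(ses, weights):
    # Standard error of a weighted combination of independent assessments:
    # a single high-SE iCMA matters little if the combined SE stays low.
    return math.sqrt(sum((w * se) ** 2 for se, w in zip(ses, weights)))

# A short iCMA with SE 8 contributes modestly once combined with two
# lower-SE assessments carrying most of the weight:
overall = combined_se([8.0, 3.0, 3.0], [0.2, 0.4, 0.4])
```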
Coefficient of internal consistency and error ratio should not be quoted, since these are related to the standard error and the standard deviation and so do not add any further information.
Course teams should not be told the skewness or kurtosis since these are used to determine whether data seems to be normally distributed, but here we already know that the overall scores are not normally distributed. Instead, course teams are advised to look at histograms showing the overall distribution of student scores. The upper histogram in Figure 1 shows a reasonable distribution of scores whilst the lower histogram illustrates a situation in which a large number of students have very high scores.
Figure 1 Two histograms demonstrating different behaviours
Course teams are also advised to look at the proportion of students scoring 0, 1, 2 or 3 marks per question. If too many students either get questions right at the first attempt or not at all, then the students did not gain much from being allowed multiple attempts. This may suggest that the feedback in this iCMA is not very useful to students.
The previously calculated statistics were: Facility index (mean), standard deviation, intended and effective weight, discrimination index and discrimination efficiency.
For iCMA questions, the recommendations are:
Course teams should be told the facility index, standard deviation, intended and effective weight and discrimination index of each question, with recommended ranges for facility index and discrimination index.
The facility index is simply the mean score achieved on the question. We recommend using coloured highlighting to alert course teams to questions with facility indices in the following ranges:
| 95%-100% | Extremely easy |
| 90%-95% | Very easy |
| 85%-90% | Easy |
| <40% | Difficult |
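The banding in the table above could be sketched as a simple classifier. The thresholds and band names come from the table; the function itself is illustrative.

```python
# Illustrative classifier for the facility-index highlighting bands in the
# table above; thresholds and band names are taken from the text.
def facility_band(facility_percent):
    if facility_percent >= 95:
        return "Extremely easy"
    if facility_percent >= 90:
        return "Very easy"
    if facility_percent >= 85:
        return "Easy"
    if facility_percent < 40:
        return "Difficult"
    return None  # within the acceptable range: no highlight needed
```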
Course teams should consider questions highlighted in this way, in the context of the other questions in the iCMA and the purpose, timing and weighting of the iCMA.
The standard deviation indicates the spread of students’ scores about the mean for the question. A low standard deviation might indicate that the question is not effective in discriminating between students.
The intended weight is the maximum mark it is possible to score on a question; the effective weight shows how much the question actually contributes to the spread of scores.
The discrimination index shows how well correlated the student’s scores on each question are with their scores on the rest of the test. It ranges between –100 and 100, though a question with a negative discrimination index would be rather strange, since it would indicate that students who scored low marks on this question scored high marks on the rest of the test, and vice versa. The larger the value of discrimination index, the better this question is as a predictor of how the students will do on the rest of the test. At first sight, it might seem that the higher the discrimination index, the better. Indeed, questions with a high discrimination index will tend to substantially lower the standard error for the iCMA, or raise the systematic standard deviation, which has to be a good thing. However if we consider the extreme case where all the questions are the same, we would expect all questions to have a discrimination index very close to 100, but we might just as well have had only one question on the test. Likewise, if we had a test where all but one question was the same, then the different question would probably have had a low discrimination index.
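The calculation described above, the correlation of each student's question score with their score on the rest of the test, scaled to the range -100 to 100, can be sketched as follows. Pearson correlation is assumed here, since the text does not specify the correlation measure.

```python
# Illustrative discrimination-index calculation: correlate each student's
# score on one question with their score on the rest of the test, scaled
# to -100..100. Pearson correlation is an assumption.
def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5  # undefined if either variance is zero

def discrimination_index(question_scores, total_scores):
    # compare the question with the *rest* of the test, not the whole total
    rest = [t - q for q, t in zip(question_scores, total_scores)]
    return 100 * pearson(question_scores, rest)

# Students who score well on this question also score well on the rest
# of the test, so the index is strongly positive:
di = discrimination_index([3, 2, 1, 0], [30, 25, 15, 10])
```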
Again, it is very helpful to look at the proportion of responses to each question which have been marked 0, 1, 2 or 3. The behaviour can be quite extreme, even in questions which appear to be behaving well according to other measures, and different questions can behave quite differently even though they have similar facility indices; the questions represented in the two right-hand plots of Figure 2 both have a facility index of around 58%.
Figure 2 The proportion of responses marked 0, 1, 2 or 3 for three iCMA questions.
If a higher proportion of students are scoring 0 than are scoring 1 or 2, this may indicate that students do not understand the feedback provided sufficiently well to enable them to correct their previous attempt at the question. Course teams should be alerted when questions exhibit this behaviour.
A range of tools is now available to determine whether or not all the variants of a question are of equivalent difficulty.
Figure 3 The proportion of students scoring 0, 1, 2 or 3 for each variant of two iCMA questions.
The three tools described above should be used in conjunction with each other. In particular, if you reject the null hypothesis, it is helpful to consider the plots in order to see which variants are causing the problem. The upper plot in Figure 3 is for a question with 7 variants of very similar difficulty (p = 0.97) whereas the lower plot is for a question in which some variants appear to be behaving in differing ways (a view confirmed by the fact that p=0.0045). For this question, variant 3 (facility index 77.78%) appears to be easier than the other variants and variant 2 (facility index 66.30%) is more difficult. Variant 2 of this question is shown in Figure 4.
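One way a p-value of this kind can be obtained is from a Pearson chi-squared test on a contingency table of variant against mark awarded. The sketch below computes only the test statistic and compares it with a tabulated critical value; the counts are invented, and the actual tool may use a different method.

```python
def chi2_statistic(table):
    """Pearson chi-squared statistic for a contingency table given as a
    list of rows, here one row of mark counts (0, 1, 2, 3) per variant."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Invented counts of students scoring 0, 1, 2 or 3 on three variants;
# the third variant behaves differently from the other two.
counts = [
    [10, 15, 25, 50],
    [12, 14, 24, 50],
    [30, 20, 20, 30],
]
stat = chi2_statistic(counts)
# The critical value for (3-1)*(4-1) = 6 degrees of freedom at the 5% level
# is 12.59 (from standard chi-squared tables), so this difference between
# variants would be flagged as significant.
print(round(stat, 1), stat > 12.59)
```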
Figure 4 Variant 2 of the question whose behaviour is illustrated in Figure 3.
Inspection of the actual student responses to all variants quickly alerted the course team to the cause of the problem: the table that students use in the course material refers from mRNA codon to amino acid, but students are frequently looking up the anticodon instead. In the case of variant 3, this results in a stop codon not an amino acid, so presumably students realise they have done something wrong (thus the variant is easier than the others). In the case of variant 2, this results in Gln, which many students misread as Gin – so when they are told that their answer is incorrect they start by correcting Gin to Gln rather than looking for a more fundamental mistake. This causes variant 2 to be more difficult than the other variants.
For the future, variants 2 and 3 will be removed and targeted feedback will be added to all variants for answers that give amino acids found by looking up the anticodon rather than the codon.
The process of investigation followed for this question illustrates the importance of looking at the actual student responses when a problem has been identified by the statistical tools.
Which question type is the most difficult?
Figure 5 shows the performance of students on a variety of different question types across an entire presentation of S104 Exploring science.
Figure 5 The proportions of students scoring 0, 1, 2 or 3 marks per question in various types of question in use in S104.
There are some surprises – note in particular the relative ease of short answer free text questions (described as ‘Free Text’ in Figure 5), despite the fact that these require the respondent to construct a response in natural language, without the benefit of any prompts in the question. This implies a different form of cognitive processing and memory retrieval when compared with selected response items [1] (multiple choice, multiple response and drag and drop). The lowest scoring questions are those that require students to enter their answer as a number with appropriate units. This finding has been reported previously [2] and relates to the fact that many students struggle to work out the appropriate units; it is not in the main caused by a simple omission of units.
What causes variants to be of differing difficulty?
Provided care is taken, it is usually possible to write several equivalent variants of questions that require numerical answers: no significant differences were found between the variants of the questions in use in S151 Maths for Science (though some variants had already been removed following the realisation that they were lower or higher scoring than others). ‘Taking care’ in this context means, for example, that:
In addition, the following have been shown to result in variants of questions being of significantly different difficulty:
It can be extremely difficult to write non-numerical questions of equivalent difficulty, with multiple choice and multiple response questions causing particular problems (some answers are more obviously correct or incorrect than others). The greatest divergence between variants was seen on a question in SK123 Understanding cancers in which students had to say whether each of four statements was true or false. The five variants of the question had been created by combining a selection of statements in different ways, and one of the statements turned out to be more difficult to identify as true or false than the others. However, the situation was compounded by the fact that this statement occurred in one variant of average difficulty as well as in two of significantly greater difficulty than the others. Further investigation showed that the other statements in the ‘average’ variant were all extremely easy to identify as true or false, so if students initially gave the incorrect answer for the rogue statement, they immediately knew which statement to correct when they were told that their answer was incorrect.
The difficulty of writing variants of similar difficulty calls into question the desirability of using different variants of questions in summative use. Yet the use of variants undoubtedly acts to discourage plagiarism.
A puzzle
The question shown in Figure 6 was flagged as having variants of significantly different difficulty, yet all variants appeared similar – all required students to do a similar calculation, and all required an answer to be given to the same precision and with the same units. Eventually the variant that had been identified as more difficult than the others was found on a ‘homework’ website – with an incorrect answer. This incorrect answer was given by students on a depressingly large number of occasions.
Figure 6 Why is this variant of this question more difficult than the others?
As an extension to the iCMA statistics project, random guess scores were calculated for multiple choice, multiple response and drag and drop questions in a number of different situations (e.g. with different numbers of attempts, different scoring algorithms, different numbers of options to select from and different numbers of options being correct, students being told how many options were correct, or not).
The random guess score for a question is essentially the score that you would expect from someone who is completely logical in working through the question but knows absolutely nothing about the subject matter.
Examples
Consider a multiple choice question in which the user has to select one from four options and has a single attempt. They have a one in four chance of getting the right answer, so the random guess score is 25%.
Now consider the same multiple choice question, but allow the user 4 attempts, with full marks available whenever the user gets the correct answer. The logical user can work through the options in order, so will always get the correct answer within the four allowed attempts, i.e. the random guess score is 100%.
Now consider the same multiple choice question, again with 4 attempts, but now the user scores 4 marks (100%) if they are correct at first attempt, 3 marks (75%) if correct at second attempt, 2 marks (50%) if correct at third attempt and 1 mark (25%) if correct at fourth attempt:
At the first attempt, as with example 1 the user has a 1 in 4 chance of getting the question right, and if they get it right at this attempt then they score 4.
In order to get a mark at the second attempt, the user has to get the question wrong at the first attempt. They have a 3 in 4 chance of doing this. Then they have a 1 in 3 chance of getting the question right, and if they get it right at this attempt then they score 3.
In order to get a mark at the third attempt, the user has to get the question wrong at the first attempt (probability 3/4) and the second attempt (probability 2/3). Then they have a 1/2 chance of getting the question right, and if they get it right at this attempt then they score 2.
In order to get a mark at the fourth attempt, the user has to get the question wrong at the first attempt (probability 3/4), the second attempt (probability 2/3) and the third attempt (probability 1/2). However they are then certain to get this question right, i.e. they have a probability of 1 of doing this, and if they get it right at this attempt then they score 1.
This means that the overall random guess score out of 4 is:

(1/4 × 4) + (3/4 × 1/3 × 3) + (3/4 × 2/3 × 1/2 × 2) + (3/4 × 2/3 × 1/2 × 1 × 1) = 1 + 3/4 + 1/2 + 1/4 = 2.5

That is, 2.5 out of 4, or 62.5%.
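The same reasoning can be generalised in code. The sketch below assumes a single-answer multiple choice question, a logical guesser who never repeats an option, and marks that fall proportionally with each attempt, as in the example above.

```python
from fractions import Fraction

def mcq_random_guess_score(n_options, n_attempts):
    """Expected fraction of full marks for a logical guesser, assuming the
    available mark falls proportionally with each attempt and the guesser
    never repeats an option (requires n_attempts <= n_options)."""
    score = Fraction(0)
    p_still_wrong = Fraction(1)  # probability of reaching this attempt
    for attempt in range(1, n_attempts + 1):
        options_left = n_options - attempt + 1
        p_correct_now = Fraction(1, options_left)
        mark_fraction = Fraction(n_attempts - attempt + 1, n_attempts)
        score += p_still_wrong * p_correct_now * mark_fraction
        p_still_wrong *= 1 - p_correct_now
    return score

print(mcq_random_guess_score(4, 1))  # 1/4, i.e. 25%
print(mcq_random_guess_score(4, 4))  # 5/8, i.e. 62.5%
```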
Calculated values of random guess scores for a range of iCMA question types
Spreadsheets are available (under Documents on the right-hand side of the page), giving random guess scores for various types of multiple choice, multiple response and drag and drop questions. However there are many variables that can influence the random guess score, so a random guess score calculator is also provided.
Multiple choice and multiple response questions
‘multiple_response_1’ contains data for multiple choice and multiple response questions where students who get the answer wrong are simply told that their answer is incorrect. ‘multiple_response_2’ contains data for multiple choice and multiple response questions where students are told how many of their initial choices were correct.
In both cases, there are separate tabs for different total numbers of options. Under each tab the rows represent the number of correct options and the columns represent the number of attempts available. The upper table under each tab is for situations where users have been told how many options to select; the lower table is for when they have not been told how many options to select.
Drag and drop questions
‘drag_and_drop_once_only’ contains data for drag and drop questions where each draggable object may be put into at most one box. ‘drag_and_drop_unlimited’ contains data for drag and drop questions where each draggable object may be put into an unlimited number of boxes.
In both cases, there are separate tabs for different numbers of boxes to be filled in. The rows represent the number of choices to put in the boxes and the columns show the number of attempts allowed. These figures assume that if a student does not get the question right then they are told how many of their choices are correct. If this assumption is not valid, the random guess score calculator should be used.
Random guess score calculator
To run the program:
An example of a surprising result
The way in which the random guess score varies with different factors is sometimes surprising and counterintuitive. For example, Figure 7 shows the variation of random guess score for a multiple response question with 6 options, in which the score available at each attempt falls proportionally and in which partial credit is given for answers that are partially correct at the final attempt. When only one option is required, the random guess score increases with number of attempts allowed. However, when more than one option is correct, the random guess score decreases with the number of attempts allowed.
Figure 7 Variation of random guess score with number of correct options and number of attempts available.
This surprising result appears to be a consequence of giving partial credit for partially correct responses. If a completely correct final answer is required, the random guess score is dramatically reduced in some cases.
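When the scoring rules become intricate, one way to sanity-check a calculated random guess score is Monte Carlo simulation. The sketch below reproduces only the simple four-option, four-attempt worked example given earlier (not the multiple response case of Figure 7): a simulated guesser tries options in random order and scores 4, 3, 2 or 1 marks according to the attempt on which the correct option is found.

```python
import random

def simulate_guesser(n_options=4, n_attempts=4, trials=100_000, seed=1):
    """Monte Carlo estimate of the random guess score (as a fraction of
    full marks) for a single-answer multiple choice question with
    proportionally falling marks."""
    rng = random.Random(seed)
    total_marks = 0
    options = list(range(n_options))
    for _ in range(trials):
        order = rng.sample(options, n_attempts)  # logical: no repeats
        correct = rng.randrange(n_options)
        if correct in order:
            attempt = order.index(correct) + 1
            total_marks += n_attempts - attempt + 1
        # a guesser who runs out of attempts scores nothing here
    return total_marks / (trials * n_attempts)

print(round(simulate_guesser(), 3))  # close to the analytic value of 0.625
```

A simulation like this is a useful cross-check precisely because the partial-credit variations described above are easy to mis-handle analytically.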
A selection of COLMSCT and piCETL related papers, presentations and workshops and talks given by Sally Jordan, 2006-2010
Publications and external conference contributions
Jordan, Sally (2007) The mathematical misconceptions of adult distance-learning science students. Proceedings of the CETL-MSOR Conference 2006, edited by David Green. Maths, Stats and OR Network, pp 87-92. ISBN 978-0-9555914-0-2.
Butcher, Philip and Jordan, Sally (2007) Interactive assessment in science at the Open University: 1975 – 2007. Invited oral presentation at ‘Computer-based assessment in the broader physical sciences’: a joint event hosted by the OpenCETL and the Physical Sciences Subject Centre, 26th April 2007.
Jordan, S., Brockbank, B. and Butcher, P. (2007) Extending the pedagogic role of online interactive assessment: providing feedback on short free-text responses. REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May 2007. Available at http://ewds.strath.ac.uk/REAP07
Jordan, Sally (2007) Computer based assessment with short free responses and tailored feedback. Proceedings of the Science Learning and Teaching Conference 2007, edited by Peter Goodhew. Higher Education Academy, pp 158-163. ISBN 978-1-905788-2.
Hudson, Ken and Jordan, Sally (2007) Practitioner scholarship can lead to institutional change – implementing interactive computer based assessment. Oral presentation at ISSOTL 2007, the International Society for the Scholarship of Teaching and Learning, 4th Annual Conference, Sydney, 2nd-5th July 2007.
Jordan, Sally (2007) Assessment for learning; learning from Assessment? Oral presentation at Physics Higher Education Conference, Dublin, 6th-7th September 2007.
Stevens, Valda and Jordan, Sally (2008) Interactive online assessment with teaching feedback for open learners. Oral presentation at Assessment and Student Feedback workshop, Higher Education Academy Centre for ICS, London, April 2008.
Jordan, Sally (2008) eAssessment for student learning: short free-text questions with tailored feedback. Workshop at the University of Chester Staff Conference, May 2008.
Swithenby, Stephen and Jordan, Sally (2008) Supporting open learners by computer based assessment with short free-text responses and tailored feedback. Part of an invited symposium on ‘Matching technologies and pedagogies for supported open learning’ at the 6th International Conference on Education and Information Systems, Technologies and Applications, EISTA, Orlando, 29th June – 2nd July 2008.
Jordan, Sally (2008) Assessment for learning: pushing the boundaries of computer based assessment. Assessment in Higher Education Conference, University of Cumbria, July 2008.
Jordan, Sally (2008) Supporting distance learners with interactive screen experiments. Contributed oral presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) Online interactive assessment: short free text questions with tailored feedback. Contributed poster presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) E-assessment for learning? The potential of short free-text questions with tailored feedback. In invited symposium ‘Moving forward with e-assessment’ at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Butcher, Philip and Hunter, Arlene (2008) Online interactive assessment for open learning. Roundtable discussion at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Brockbank, Barbara, Jordan, Sally and Mitchell, Tom (2008) Investigating the use of short answer free-text eAssessment questions with tailored feedback. Poster presentation at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Brockbank, Barbara, Butcher, Philip and Mitchell, Tom (2008) Online assessment with tailored feedback as an aid to effective learning at a distance: including short free-text questions. Poster presentation at 16th Improving Student Learning Symposium, University of Durham, 1st-3rd September 2008.
Hatherly, Paul; Macdonald, John; Cayless, Paul and Jordan, Sally (2008) ISEs: a new resource for experimental physics. Workshop at the Physics Higher Education Conference, Edinburgh, 4th-5th September 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. OpenCETL Bulletin, 3, p4.
Jordan, Sally (2008) Online interactive assessment with short free-text questions and tailored feedback. New Directions, 4, 17-20.
Jordan, Sally (2009) Online interactive assessment in teaching science: a view from the Open University. Education in Science, Number 231, 16-17.
Jordan, Sally and Mitchell, Tom (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40, 2, 371-385.
Hatherly, Paul, Jordan, Sally and Cayless, Alan (2009) Interactive screen experiments – innovative virtual laboratories for distance learners. European Journal of Physics, 30, 751-762.
Butcher, P.G., Swithenby, S.J. and Jordan, S.E. (2009) E-Assessment and the independent learner. 23rd ICDE World Conference on Open Learning and Distance Education, 7-10 June 2009, Maastricht, The Netherlands.
Jordan, Sally (2009) Assessment for learning: pushing the boundaries of computer based assessment. Practitioner Research in Higher Education, 3(1), pp11-19. Available online at http://194.81.189.19/ojs/index.php/prhe
Jordan, Sally. (2009) An investigation into the use of e-assessment to support student learning. Assessment in Higher Education Conference, University of Cumbria, 8th July 2009. Available online at http://www.cumbria.ac.uk/Services/CDLT/C-SHEN/Events/EventsArchive2009.aspx
Jordan, Sally and Brockbank, Barbara (2009) Online interactive assessment: short free text questions with tailored feedback. Oral presentation at GIREP-EPEC, August 2009.
Jordan, Sally and Butcher, Philip. (2009) Using e-assessment to support distance learners of science. Oral presentation at GIREP-EPEC, August 2009.
Hatherly, Paul, Jordan, Sally and Cayless, Alan. (2009) Interactive screen experiments – connecting distance learners to laboratory practice. Oral presentation at GIREP-EPEC, August 2009.
Butcher, P.G & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, 489-499. DOI: 10.1016/j.compedu.2010.02.012
Jordan, Sally (2010) E-assessment for learning and learning from e-assessment: short-answer free text questions with tailored feedback. Presentation and workshop to HEA Physical Sciences Centre ‘The future of technology enhanced assessment’, Royal Society of Chemistry, Burlington House, London, 28th April 2010.
Jordan, Sally (2010) Short answer free text e-assessment questions with tailored feedback. Invited seminar to Human Computer Interaction group at the University of Sussex, 21st May 2010.
Jordan, Sally (2010) Maths for science for those with no previous qualifications: a view from the Open University. HEA Physical Sciences Centre ‘Maths for Scientists’ meeting, 26th May 2010.
Jordan, Sally (2010) Student engagement with e-assessment questions. Poster at the 2010 International Computer Assisted Assessment (CAA) Conference, Southampton, July 2010.
Jordan, S. and Butcher, P. (2010) Using e-assessment to support distance learners of science. In Physics Community and Cooperation: Selected Contributions from the GIREP-EPEC and PHEC 2009 International Conference, ed. D. Raine, C. Hurkett and L. Rogers. Leicester: Lulu/The Centre for Interdisciplinary Science. ISBN 978-1-4461-6219-4, pp202-216.
Jordan, Sally (2010) Do we know what we mean by ‘quality’ in e-assessment? Roundtable discussion at EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference, Northumbria, September 2010.
Jordan, Sally, Butcher, Phil, Knight, Sarah and Smith, Ros (2010) ‘Your answer was not quite correct, try again’: Making online assessment and feedback work for learners. Workshop at ALT-C 2010 ‘Into something rich and strange – making sense of the sea change’, September 2010, Nottingham.
Jordan, Sally (2010) Using simple software to generate answer matching rules for short-answer e-assessment questions in physics and astronomy. Oral presentation at the Physics Higher Education Conference, University of Strathclyde, September 2010.
Butcher, P.G. & Jordan, S.E. (in press) Featured case study in JISC Effective Practice Guide, Summer 2010.
Contributions to internal (OU) conferences, meetings and workshops
Jordan, Sally (2006) An analysis of science students’ mathematical misconceptions. Poster presentation at 1st OpenCETL Conference, 8th June 2006.
Jordan, Sally (2006) OpenMark – what’s all the fuss about? Lunchtime seminar at Cambridge Regional Centre, 1st November 2006.
Jordan, Sally (2007) Using interactive online assessment to support student learning. Faculty lunchtime seminar, 30th January 2007.
Jordan, Sally (2007) Issues and examples in online interactive assessment. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Assessment for learning; learning from assessment? Paper presented at the Curriculum, Teaching and Student Support Conference, 1st May 2007.
Jordan, Sally and Brockbank, Barbara (2007) Extending the pedagogic role of online interactive assessment: short answer free text questions. Paper presented at the Curriculum, Teaching and Student Support Conference, 2nd May 2007.
Jordan, Sally (2007) Investigating the use of short answer free text questions in online interactive assessment. Presentation at the Science Staff Tutor Group residential meeting, 9th May 2007.
Jordan, Sally (2007) OpenMark: online interactive workshop. Workshop run at AL Staff Development meeting in Canterbury, 12th May 2007.
Brockbank, Barbara, Jordan, Sally and Butcher, Phil (2007) Investigating the use of short answer free text questions for online interactive assessment. Poster presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions: learning from online assessment. Oral presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Using interactive screen experiments in our teaching: the S104 experience and The Maths Skills ebook. Demonstrations at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally, Ekins, Judy and Hunter, Arlene (2007) eAssessment for learning?: the importance of feedback. Symposium at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally, Brockbank, Barbara and Butcher, Phil (2007) Authoring short answer free text questions for online interactive assessment: have a go! Workshop at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally (2008) Investigating the use of short answer free-text questions in online interactive assessment. Oral presentation to EATING (Education and Technology Interest Group), 17th January 2008.
Jordan, Sally (2008) Investigating the use of short free-text eAssessment questions. Oral presentation to ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally (2008) Writing short free-text eAssessment questions: have a go! Workshop at ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally and Brockbank, Barbara (2008) Writing free text questions for online assessment: have a go! Workshop at the Open University Conference, 29th and 30th April 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. Science Faculty Newsletter, May 2008.
Jordan, Sally and Johnson, Paul (2008) E-assessment opportunities: using free text questions and others. Science Faculty lunchtime seminar followed by workshop, 16th July 2008.
Jordan, Sally and Datta, Saroj (2008) Presentation on Open University use of Interactive Screen Experiments at ISE Launch event, 19th September 2008.
Jordan, Sally, Butler, Diane and Hatherly, Paul (2008) CETL impact: an S104 case study. Series of linked presentations at 3rd OpenCETL Conference, September 2008. [reported in OpenHouse, Feb/March 2009, ‘S104 puts projects into practice’, p4]
Jordan, Sally and Johnson, Paul (2008) Using free text e-assessment questions. Science Faculty lunchtime seminar followed by workshop, 26th November 2008.
Butcher, Phil, Jordan, Sally and Whitelock, Denise (2009) Learn About Formative e-Assessment. IET EPD Learn About Guide.
Butcher, Phil and Jordan, Sally (2009) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 21st January 2009.
Jordan, Sally (2009) E-assessment to support student learning: an investigation into different models of use. Paper presented at Making Connections Conference, 2nd-3rd June 2009.
Jordan, Sally (2009) (ed) Compilation of interim and final reports on Open University Physics Innovations CETL projects: Assessment.
Butcher, Phil and Jordan, Sally (2009) A comparison of human and computer marking of short-answer free-text student responses. Presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2009) Interpreting the iCMA statistics. Presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Nix, Ingrid, Waights, Verina, Bolton, John and Butcher, Phil (2009) From CETL to course team: embedding iCMA initiatives. Workshop at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Butler, Diane, Hatherly, Paul and Stevens, Valda (2009). From CETL to Course Team: CETL-led initiatives in S104 Exploring Science. Poster presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally (2009) Student engagement for e-assessment. Poster presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2010) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 10th February 2010.
Jordan, Sally (2010) Workshops on using OpenMark PMatch to write short-answer free-text questions, 19th January 2010 and 17th February 2010.
Jordan, Sally, Nix, Ingrid, Wyllie, Ali, Waights, Verina. (2010) Science Faculty lunchtime forum, March 25th 2010.
Jordan, Sally (2010) e-Assessment. In e-Learning in Action: projects and outcomes from the Physics Innovations CETL, pp16-20.
Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.
I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning in Mathematics, Science, Computing and Technology).
Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
I was a member of the course teams producing S104 : Exploring Science (presented for the first time in Feb 2008) and S154 : Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University’s OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151 : Maths for science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the Course Team of SXR103 : Practising Science for many years.
My COLMSCT and piCETL projects reflect my interests in eAssessment and science students’ mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students’ misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate ways in which iCMAs can most effectively be used to support student learning. Analysis of students’ responses to eAssessment questions provides an opportunity for teachers to learn more about students’ misunderstandings. I have used this methodology to learn more about science students’ mathematical misconceptions. I have developed a ‘Maths Skills eBook’ which is used as a resource on many Open University science courses, and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE) developed with piCETL funding for use by S104 students who are not able to conduct one of the course’s home experiments for themselves.
In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.
I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. In case you hadn't realised the significance of the shared surname, it was my daughter Helen who did the work on iCMA statistics and random guess scores in Summer 2009. Helen completed an M.Phil in Statistical Science at the University of Cambridge in July 2010 and starts a PhD in statistics at Warwick University in autumn 2010.
My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.
Sally Jordan, COLMSCT Teaching Fellow
S.E.Jordan@open.ac.uk
Nurses are required to make clinical decisions about patients' health and well-being, responding to changes in each patient's condition, which may occur within very small time-frames.
The ability to make clinical judgements depends on both a sound theoretical background and good decision-making skills; practice-based learning enables students to develop these skills at the same time as they acquire the necessary underpinning knowledge.
The aim of this project is to develop a web-based tool to assess practice-based learning, building on Laurillard (2002), who suggests that ‘Traditional modes of assessment of knowledge are seen as inadequate because they fail to assess students’ capability in the authentic activities of their discipline’. Such a tool provides an alternative to mentor-led practice assessment, which can bring its own problems due to the tension between nurturing and judgement (Yorke, 2005).
In addition, it complements reflective writing as a method of assessing students’ evolving professional practice, that is to say their integration of theory and practice, which is so vital when preparing students for professional registration. We intend the tool to be reusable in a variety of settings, incorporating a range of resources as appropriate to each setting. The tool will be piloted by invitation only (independent of course assessment) with a cohort of Adult Nursing students, with the aim of incorporating the tool in both Adult and Mental Health Nursing Programmes.
Aims:
Verina Waights’ biography can be found at: http://hsc-people.open.ac.uk/v.waights
Further details about publications and conference presentations by Verina Waights can be found in ORO (http://oro.open.ac.uk): follow the link and then search for Verina Waights in the Author box.
Readers might like to try the Clinical Decision-making tool https://students.open.ac.uk/openmark/cdm.projectworld
Opening screen of the tool, giving details of the patient’s situation and showing the links to the resources, the initial decision options and the text box for students to submit the rationale for their initial decision.
The aim of the project was to identify and share best practice on how to help OU Engineering students achieve their Transferable Skills Intended Learning Outcomes (ILOs) on previous versions (2003-2007) of the Professional Development Planning (PDP) courses: T191 Personal and career development in engineering, T397 Key skills for professional engineers and T398 Towards chartership: professional development for engineers.
This work informed the Course Team’s planning of changes to the curriculum during an extensive rewrite of a new version of T191 ‘Personal and Career Development in Engineering’, for first presentation in October 2008 on the accredited BEng/MEng Engineering Programme Pathways.
The key assessment pedagogy was identified as Constructive Alignment of Teaching and Learning Activities (Biggs and Tang, 2007) for a Portfolio of evidence. The activities are aligned to the ILOs and the assessments. In turn, the ILOs are mapped against QCA Key Skills at Levels 4/5 and the new QAA/UK-SPEC Engineering Benchmark Statement to enable students to apply to become professional engineers (IEng or CEng).
Associate Lecturers (ALs) and students were surveyed to capture best practice after eight years’ experience of the operation of previous versions of the courses. No major concerns were reported with T397, so the rest of the project concentrated on T191. The student survey identified several ‘areas of excellence’, especially the professional help and timely feedback from tutors and the benefits of attending the face-to-face tutorials. Ten ‘areas of concern’ were identified in the previous versions. A major problem was that 74% of the 265 students surveyed were spending more than the recommended study time (4 hours per week) for this 15-CATS-credit course. This was mainly due to the large number of activity sheets (59 in total), which were repetitive and demanded a huge amount of time to complete. ALs also said that students needed more guidance on the role of T191 in the BEng degree and how it relates to Engineering.
A major outcome of this project was that the AL and student findings helped the course team to extensively revise and rewrite a new version of T191 for the October 2008 presentation. The number of activities has been reduced to 26 (a 56% reduction) and the amount of course text has been cut by 40% into one binding. The new activities have been more closely constructively aligned and mapped to the ILOs and assessments. A new Introductory Rationale in the Prologue at the beginning of the course explains the role of T191 in the BEng degree. These measures should help students manage their study time better and improve retention. The new course makes use of links to the University's Virtual Learning Environment (VLE) resources such as My Stuff, e-Portfolio and the OU Careers Advisory Service CV. The project should also inform better engagement of OU Engineering student communities and proposed new courses on Work Based Learning Foundation Degrees in Engineering.
The IET Student End of Course Survey (Sept 09) for the first presentation of the new version, T191 08J, showed that there were fewer causes for concern than with previous versions. Students were managing their study time better and fewer fell behind with their studies. There was a better understanding of what they were expected to achieve.
The T191 Course Team reported (May 2010) that overall pass rates on the new version had increased significantly, from 63% (old version) to 73% (new version). Student retention had also improved, based on TMA submission rates, which had increased greatly from 60% (old version) to 81% (new version).
Also this year, students on the T191 09J cohort have started to engage with their new Engineering Pathway Tutors to seek help with Pathway Progression on their BEng/MEng accredited degree towards the professional qualifications IEng and CEng. This has been linked to a new Constructively Aligned Activity on eTMA 02 and their ECA Portfolios. Students are giving positive feedback on the pilot.
Keith McGraw, COLMSCT Associate Teaching Fellow
klm2@tutor.open.ac.uk
The aim of this project was to determine whether formative feedback explicitly directed to improvement of study skills could help tutors provide students with the means to become more effective learners. The intended impact was that students would gain from the feedback by having increased knowledge of their strengths and weaknesses. Students should then be able to improve their skills and take them forward through the current and future courses. By setting formative targets related to course learning outcomes the tutor should be able to focus the student towards more self-directed learning.
The project follows on from the Formative Assessment in Science Teaching (www.open.ac.uk/fast) findings that feedback on Tutor Marked Assignments (TMAs) is:
The investigation was carried out within the OU course S204 Biology: Uniformity and Diversity. S204 tutors were surveyed using a questionnaire, which showed that, whilst most tutors believe the course learning outcomes are useful, many do not feel it necessary to highlight them to students during tuition or marking. Subsequently a feedback sheet for formative assessment of study skills was designed. Three tutors used these sheets with groups of ~20 students each, over the seven S204 TMAs. Two tutors also set a skill target for the student to meet on the next or a subsequent TMA. At the end of the course the students involved were asked to complete a questionnaire.
Results
Target setting: Although some differences were noted in student beliefs from the questionnaire, setting targets did not appear to have any effect on skills achievement. Both tutors who set targets noted meeting of some of the targets, but agreed it was difficult to say if this was a result of the feedback sheet.
Skills feedback: Interviews of S204 students during FAST showed students do not generally view TMA feedback as relevant to future work (Walker, 2004). However, 90% of students taking part in this project thought the feedback sheets not only improved their marks but also were at least somewhat relevant to future study, in the OU or elsewhere. Therefore the conclusion is that including formative assessment of learning outcomes within summative assignments does enable the tutor to help students improve their skills and take their learning forward in both present and future courses.
Implications: Tutor feedback was that wider use of the sheets would be welcome. However, if the system were to be rolled out, tutors would need some guidance in how best to use the skills feedback sheets. The sheets could be used by students as part of their Personal Development Planning and, if made available electronically, could be integrated into an ePortfolio. The subsets of students receiving or not receiving targets were relatively small, and more work is necessary to determine any impact of target setting on student learning.
For further details about the project the 'Final Project Report' can be accessed from Related Resources.
Dyke, J. (2008) Formative feedback in summative TMAs. Science Faculty Newsletter, May 2008.
Dyke, J. (2007) Incorporating formative, feedforward feedback into summative TMAs and setting targets for learning. Poster presented at the Science Learning and Teaching Conference, 19-20 June 2007.
I’m an AL in Science and Widening Participation, and was just starting study for my MEd with the OU when I saw the advertisement for COLMSCT teaching fellows. I saw this as a great opportunity to do some educational research. The theoretical side of the MEd really helped my COLMSCT research, whilst my COLMSCT experience put the research I was meeting on the MEd in perspective, so doing the two in tandem was certainly beneficial.
This was my first attempt at a partially qualitative study, and my COLMSCT mentor and fellow researchers were invaluable in helping me work out my final strategy, as well as providing much appreciated support. Knowing I was part of a group, with help available from many quarters, was important to me, particularly as I was unable to attend any physical meetings at Walton Hall. It could have been a lonely experience, but because of the supportive network, it wasn’t.
The highlight of the work for me was taking a poster of my COLMSCT research to the HEA Science Learning and Teaching conference at Keele University. Those who know me will know this was not easy, because eight years ago I was suddenly struck with extreme agoraphobia, and recovery has been slow. This was the furthest I’d travelled, and to stay overnight and attend two days of presentations and workshops was a huge achievement. The ability to do so certainly came from my determination to put the results of my COLMSCT research on display!
It doesn’t stop there. My goal since I became ill had been to return to work in a physical university environment. I finally saw the opportunity just as my COLMSCT fellowship was coming to an end. My success in attending Keele gave me the confidence to decide I could return to work at my local university part time. Through being a COLMSCT fellow as well as studying for the MEd I had gained enough knowledge and research experience to be appointed as a pedagogical research associate in the new School of Pharmacy at UCLan, and I’m writing this from my desk at work (in my dinner hour of course!). It’s wonderful to be back amongst old friends, and to be bringing in knowledge and experience which would not have been gained without my COLMSCT fellowship.
I have been an AL since November 1999, starting in the Technology faculty, and am currently tutoring S204 and Y158. I run the national student conference for S204, now in its 4th year, write and check TMA questions for S204, monitor on S204 and am the AL Common Room manager. The OU is currently my sole employer. I am extremely interested in teaching methods and have just completed study of SEH806 Contemporary Issues in Science Education within the OU MEd programme, with the aim of registering for an EdD. My second course towards the MEd is E836. I am committed to dissemination of good practice, and try to encourage this within the AL Common Room. Most of my teaching has been in FE and HE, on a range of courses including biochemistry, plant and soil science, biology and environment.
Janet Dyke, COLMSCT Associate Teaching Fellow
Jed83@tutor.open.ac.uk
In October 2005 T171 reached the end of the final presentation after one pilot presentation and nine full presentations. The aim of this project is to collate the vast amount of experience of online teaching and learning amassed by T171 tutors.
This will cover all areas of e-tutoring including the use of conferences to support both staff and students; running e-tutorials; best practice for e-TMAs and e-staff development. The study will include not only those things that have worked well but also the failures along the way and what has been learned from these.
The intention is to share the experience of e-tutoring the students who have passed through these online groups during the past nine presentations, through academic papers and conference presentations. It is hoped this will be of great benefit to tutors moving into the area of online tutoring, both within the Open University and in other areas of tertiary education.
Having left full-time education in 1974 with 7 ‘O’ levels and no wish to go to university, I decided in 1995 to give up a well paid job and become a full time student at Leeds Metropolitan University. I completed a BSc (Hons) in Computing in 1999 and enjoyed studying so much that I went on to take a PhD, completing in 2004. My research investigated the influence of learning style on text based computer conferencing and was conducted entirely with Open University students, mainly from T171.
I started teaching for Leeds Metropolitan University and Open University in 1999. I joined Open University to teach T171 then added U130, A171 and A172 to my teaching portfolio. I also experienced teaching at one T293 summer school before the course ended. My most recent appointment is for T175.
In addition to working as an AL I have peer monitored for T171 and A172 and will soon be monitoring T175. I have mentored A17* tutors, run staff development sessions for R07 both face-to-face and online, run student induction sessions for R07, co-moderated the R07 ICT help conference, provided ICT mentoring to R07 tutors, and assisted with a joint project run with JIVE to provide support for women on technology courses.
When I’m not glued to my computer doing Open University work I like to get out into the countryside and do some walking. I have recently trained as an Open Access Volunteer for Nidderdale AONB, which gets me out come rain or shine. Having studied for 11½ years without a break I promised myself a rest at the end of my PhD, but I did sneak in an Open University short course last year and a few weekend courses this year. I think I am addicted to study.
Hilary Cunningham-Atkins, COLMSCT Associate Teaching Fellow
Haca2@tutor.open.ac.uk
The primary aim of this project was to develop and then implement an online interactive formative assessment framework, designed from a constructivist and interventionist perspective that would promote student engagement and understanding of academic progression from an extrinsic as well as intrinsic perspective.
Students learn best when they are fully engaged in the learning process, can see some form of personal gain from an activity, are motivated to test their current level of learning against known standards, and are offered targeted and timely support to help address subsequent personal learning needs. In addition, in line with Vygotsky’s work on learning progression, it is clear that intervention by another person and/or appropriate learning support tool allows an individual to develop further than if left on their own. The usual way to do this is through the use of assessment, but this in itself can act as an overbearing influence on what and how students learn, rather than providing an holistic support mechanism that encourages continuous reflective learning. Summative assessment provides a quantitative measure of learning at specific points in time, but may not encourage students to focus on specific strengths and weaknesses; formative assessment can provide specific reflective and feed-forward support, but given the time-poor nature of students, do they engage with it?
The online interactive formative assessment framework was specifically designed to enhance student awareness, understanding and recognition of competency levels from a learning outcomes approach, and allowed testing of ongoing academic progress at predetermined and self-selected points throughout the year. Each assessment was explicitly linked to other course and learning components (including the summative assessment strategy), as a means of providing an integrated approach to learning. By working through the formative assessments it was hoped that students would become more self-directed and confident in their learning skills and abilities, which in turn would aid retention.
A second level Geosciences course - S279 Our Dynamic Planet (due for first presentation in 2007) - was selected for this project, as it was based upon two prior courses, both of which were perceived as conceptually difficult by students and tutors, and so was expected to be academically challenging to its students, who would be required to engage with a broad range of scientific disciplines and generic learning skills throughout the course. The framework, meanwhile, used OpenMark (a web-based system developed by the OU) in which students have up to three attempts to correctly answer each question, and are offered instantaneous and targeted feedback after each incorrect attempt. The system automatically collects information on student interactions, offering valuable insight into how (and which) students are engaging with the assessment/course. This data permits new targeted feedback to be added in response to common errors, or to specific skills or content that are poorly demonstrated.
All feedback within the framework was formative, commenting on how well each of the learning outcomes tested over a period of study had been demonstrated, as well as the overall level of academic competency attained by the user at that point in time. By the end of the project, the framework encompassed seven interactive assessments (linked to the first of the two books within S279), consisting of ten variable format questions (set at two levels of academic complexity), with an eighth group of assessments offering a selection of questions from the preceding assessments, resulting in the formation of an instantaneous revision tool. You can access the assessments from the Related resources on the right hand side "Student Online Formative Assessments (SOFAs) iCMAs".
Preliminary results indicate the majority of students who attempted the assessments rated them as enjoyable, with a significant proportion revisiting specific assessments to enhance previous outcomes and check progress. Examination of the perceived effectiveness of the framework has been carried out using a ‘success case method’ approach, which allowed qualitative and quantitative data to be collected on both success and failure indicators from various stakeholders involved with the framework, which in turn are being used to identify key issues pertinent to future development. (Based on these results, a proposal is about to be submitted by the S279 course team to the Science Faculty for resources to develop the paper-based formative assessments written (during this project) for Book 2, to be converted into OpenMark assessments.)
For further details about the project the 'Final Project Report' and related papers can be accessed from Related Resources.
The reader may like to try the S279 Our Dynamic Planet iCMAs (what the author calls ‘Student Online Formative Assessments’ (SOFAs)).
The format is identical to the learning environment presented to students on the OU course S279 Our Dynamic Planet, in which the students are expected to refer to the course text whilst completing each assessment (rather than attempt each question from memory).
All the iCMAs are associated with Book 1 of S279 Our Dynamic Planet
Chapter 1 https://students.open.ac.uk/openmark/s279.book1chapter1world/
Chapter 2 https://students.open.ac.uk/openmark/s279.book1chapter2world/
Chapter 3 https://students.open.ac.uk/openmark/s279.book1chapter3world/
Chapter 4 https://students.open.ac.uk/openmark/s279.book1chapter4world/
Chapter 5 https://students.open.ac.uk/openmark/s279.book1chapter5world/
Chapter 6 https://students.open.ac.uk/openmark/s279.book1chapter6world/
Chapter 7 https://students.open.ac.uk/openmark/s279.book1chapter7world/
Defining geological terms https://students.open.ac.uk/openmark/s279.book1chapter8geotermsworld/
Mathematical problems 1 https://students.open.ac.uk/openmark/s279.book1chapter8mathmodworld/
Mathematical problems 2 https://students.open.ac.uk/openmark/s279.book1chapter8mathtaxingworld/
Understanding geological processes 1 https://students.open.ac.uk/openmark/s279.book1chapter8geoprocessmodworld/
Understanding geological processes 2 https://students.open.ac.uk/openmark/s279.book1chapter8geoprocesstaxingworld/
Working with diagrams 1 https://students.open.ac.uk/openmark/s279.book1chapter8diagmodworld/
Working with diagrams 2 https://students.open.ac.uk/openmark/s279.book1chapter8diagtaxingworld/
Working with tables https://students.open.ac.uk/openmark/s279-07.book1chapter8tablesworld/
A selection of COLMSCT related dissemination activities to October 2008.
Arlene Hunter, COLMSCT Teaching Fellow, A.G.Hunter@open.ac.uk
This project set out to investigate the use of an online decision-making maze tool as a form of eAssessment task that is more motivating and relevant to practice-based students.
The context was established using a realistic, multi-layered, clinically-based case study. The approach allowed for exploration and the construction of a decision, based on evidence presented as multiple rich media resources for students to explore. Options were limited to three possible decisions at each decision point, with students having to justify their decision in a reflective log which captured their evidence at each stage. The case study evolved over time, allowing students to review the results of their decisions at each stage. Students are offered two attempts, after which they receive feedback depending on the route through the maze they have chosen. The assessment system captures their scores and averages the two attempts in the final feedback. The reflective log acts as a record of their reflections and justifications and has the potential to feed into a qualitative piece of assessment, such as a TMA. The approach has been trialled with volunteer students from the pre-registration nursing programme. The tool has the additional potential to be used as a collaborative activity with groups of students.
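The maze mechanics described above - three options per decision point, a reflective log of justifications, and a final score averaged over two attempts - can be sketched in miniature. This is an illustrative sketch only, not the OpenMark implementation; all names, scores and scenario text are invented for the example.

```python
# Illustrative sketch of a decision maze: each decision point offers three
# options; each option carries a score and leads to the next point (or ends
# the maze). The student's choices and rationales form a reflective log, and
# the final feedback averages the scores of two attempts.
from dataclasses import dataclass, field


@dataclass
class DecisionPoint:
    prompt: str
    options: dict  # option label -> (score, next DecisionPoint or None)


@dataclass
class Attempt:
    log: list = field(default_factory=list)  # (decision, rationale) pairs
    score: int = 0


def take_route(start, choices, rationales):
    """Walk the maze following the given choices, capturing a reflective log."""
    attempt = Attempt()
    node = start
    for choice, why in zip(choices, rationales):
        score, nxt = node.options[choice]
        attempt.log.append((choice, why))
        attempt.score += score
        if nxt is None:
            break
        node = nxt
    return attempt


def final_feedback_score(first, second):
    """Average the two attempts, as the assessment system is described doing."""
    return (first.score + second.score) / 2


# Hypothetical two-stage scenario (scores and wording invented).
stage2 = DecisionPoint("The patient deteriorates...", {
    "call doctor": (2, None), "monitor": (1, None), "discharge": (0, None)})
stage1 = DecisionPoint("Initial presentation...", {
    "assess airway": (2, stage2), "take history": (1, stage2), "wait": (0, stage2)})
```

A real tool would of course present rich media evidence at each stage and scaffold the available pathways; the sketch only shows the route-plus-log data structure.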
Our belief is that this approach allows students to construct their view and apply their knowledge in a safe environment where decisions can be rehearsed and explored before application in practice.
Readers might like to try the Clinical Decision-making Maze https://students.open.ac.uk/openmark/cdm.projectworld
I am a Lecturer in Learning and Teaching Technologies in the Faculty of Health and Social Care and a Teaching Fellow in the OU's Centre for Excellence in Teaching and Learning (CETL). I have 2 projects in the Centre for Open Learning in Mathematics, Science, Computing and Technology (COLMSCT) CETL and a further project in the Practice-based Professional Learning (PBPL) CETL.
My COLMSCT projects reflect my interest in eAssessment and in applying constructivist pedagogies to formative and summative student learning. Taking a broad perspective, the LINA project explored what design features would make interactive assessment more motivating to learners, with the aim of investigating an approach which would facilitate the progress of students through formative to summative assessment. It built on work Ingrid Nix and I had carried out within the faculty, designing online computer marked assessment sequences for social work and nursing students and observing student use and level of engagement amidst competing pressures of work and study.
The Clinical Decision-Making Maze (CDM) project focused very much on the idea of an extended contextualised narrative as the backbone for a number of decision points, evolving over time into a decision maze with a variety of different possible outcomes. Media resources at each stage flesh out the context and provide evidence for students to interrogate and use to justify their decisions, which are captured in a reflective log. Students construct their own interpretation of events and make decisions accordingly. As the clinical case evolves over time, decision pathways are scaffolded to allow for targeted feedback and to limit the complexity of the underlying maze. The approach marries the complexity of practice simulation scenarios with the flexibility of a decision maze and, being embedded within an existing online assessment system (OpenMark), allows results to be captured and scored and targeted feedback to be given.
For a full profile visit Ali's faculty page at http://www.open.ac.uk/hsc/people/profile.php?name=Ali_Wyllie
A major concern of the M150 Course Team is Block 2 - JavaScript programming.
Online learning aids such as quizzes could substantially help many students and possibly provide a further prop for proactive AL intervention.
Desired impacts include: better understanding of block 2 giving a more solid foundation for further computing courses; retaining students who might otherwise drop out (as many now do at this stage).
Aims: primarily, to support students in their understanding of new and difficult parts of M150; secondly, to confirm understanding further prior to TMA feedback; and thirdly, to investigate extension to other Computing courses (and those of other departments) within the OU and beyond.
Readers might like to try the following iCMAs:
Structured English
https://students.open.ac.uk/openmark/m150-09b.structuredenglishworld
Conditions, Truth and Trace Tables
https://students.open.ac.uk/openmark/m150-09b.conditionsandtablesworld
Basic JavaScript
https://students.open.ac.uk/openmark/m150-09b.basicjsworld/
Selection - the 'if' statement
https://students.open.ac.uk/openmark/m150-09b.ifworld
Repetition - the 'while' statement
https://students.open.ac.uk/openmark/m150-09b.whileworld
Repetition - the 'for' statement
https://students.open.ac.uk/openmark/m150-09b.forworld
Michael Isherwood
mci3@tutor.open.ac.uk
The aim of this project was to develop the academic and pedagogic basis of the OpenMark eAssessment system and to develop interactive computer marked assignments (iCMAs) for S104: Exploring Science. The OpenMark system, previously used for formative and summative assessment in S151: Maths for Science, enables students to be provided with instantaneous, targeted and relatively detailed feedback on their work. iCMA questions were also developed for S154: Science Starts Here and the diagnostic quiz ‘Are you ready for level 1 science?’.
Development of OpenMark iCMAs for S104
S104: Exploring Science is the Science Faculty’s 60 point level one science course, introducing students to Earth science, physics, chemistry and biology and developing mathematical, communication and practical skills. S104 had its first presentation in February 2008 and since then has run in two presentations per year, with 1500-2000 students per presentation. The content of S104 draws heavily on its predecessor, S103, but its tuition and assessment strategies are very different.
S104’s assessment strategy includes the following components:
S104's iCMAs are credit-bearing because the course team wanted students to engage with them in a meaningful way. However, their purpose is to provide instantaneous feedback and to help students to pace their study. The iCMA questions use the OpenMark e-assessment system, which enables us to provide students with multiple attempts at each question, with an increasing amount of instantaneous feedback after each attempt. The student can learn from the feedback and use it to correct their answer. Wherever possible the feedback is tailored to the student’s misunderstanding. S104's iCMAs make use of the full range of OpenMark question types, including free text entry of numbers, letters and single words as well as hot-spot, drag and drop, multiple-choice and multiple-response. We have also included a few questions requiring free-text answers of up to a sentence in length (these questions are described in detail at the related project page 'eAssessment questions with short free text responses: a natural language processing approach'). Further information about OpenMark is given on the OpenMark Examples website (http://www.open.ac.uk/openmarkexamples/index.shtml).
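The multiple-attempt pattern described above - up to three tries per question, with progressively fuller feedback after each incorrect attempt - can be sketched as follows. This is a hedged illustration, not OpenMark's actual code; the question, matcher and feedback texts are invented for the example.

```python
# Minimal sketch of the three-attempt marking loop: each wrong answer
# releases the next, more detailed, piece of feedback; a correct answer
# stops the loop.
def mark_with_feedback(question, answers):
    """Return (correct?, feedback shown) for up to three attempts."""
    shown = []
    for attempt, answer in enumerate(answers[:3]):
        if question["matcher"](answer):
            return True, shown
        # Escalate: feedback[0] after the first miss, feedback[1] after the
        # second, and so on (clamped to the last message available).
        shown.append(question["feedback"][min(attempt, len(question["feedback"]) - 1)])
    return False, shown


# Hypothetical free-text numeric question (values and wording invented).
q = {
    "matcher": lambda a: abs(float(a) - 9.8) < 0.1,
    "feedback": [
        "Not quite - check your working.",
        "Hint: take g to be 9.8 m s-2.",
        "The answer is 9.8 m s-2; see the relevant section of the book.",
    ],
}
```

The key design point mirrored here is that feedback is cumulative and increasingly specific, so a student who eventually succeeds has still learned something from each miss.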
Each S104 iCMA opens to students approximately two weeks before they are due to start reading a particular book of the course. Initially the iCMAs closed 1-2 weeks after students were due to move on to the next book, which meant that each iCMA cut-off date fell after the cut-off date of the TMA assessing the same book. As a result, many students did not start the iCMA until after they had completed the TMA and moved on to the next book, and it was felt that the pacing function of e-assessment was not being used to the full. iCMA cut-off dates are now just two days after the students are timetabled to move on to the next book, with the TMA due dates a further two days later.
OpenMark sits within Moodle and scores are reported to the student and their tutor via StudentHome and TutorHome respectively. S104 has its own ‘iCMA Guide’ and associate lecturers are given additional advice via the S104 Tutor Forum.
Each presentation of S104 uses 105 questions (10 each for iCMAs 41-48 and 25 for iCMA 49), with some re-use of questions between presentations. Wherever possible at least five different variants of each question are provided, to act as an anti-plagiarism device, and sometimes more variants are provided to enable the same basic question to be used, for example, in iCMA 43 on one presentation of the course and iCMA 49 of the following presentation. Each OpenMark question was written by an academic member of the Course Team, programmed by a media developer and checked by both the academic author and an experienced consultant.
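One way to serve multiple variants as an anti-plagiarism measure is to assign each student a variant deterministically, so that neighbouring students see different numbers but any one student always gets the same variant back on revisiting a question. The sketch below is an assumed mechanism for illustration only, not the documented OpenMark behaviour; the identifiers are hypothetical.

```python
# Deterministic variant assignment: hash the (question, student) pair and
# reduce it modulo the number of variants. The same pair always yields the
# same variant; different students usually get different ones.
import hashlib


def pick_variant(question_id: str, student_id: str, n_variants: int) -> int:
    """Return a stable variant index in [0, n_variants) for this student."""
    digest = hashlib.sha256(f"{question_id}:{student_id}".encode()).hexdigest()
    return int(digest, 16) % n_variants
```

In formative use the opposite choice can be better: serving a fresh random variant on each visit gives students more practice opportunities, as the S154 question bank description below notes.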
A series of workshops was run to train S104 course team members in the authoring of OpenMark questions. These included advice on writing unambiguous questions linked to appropriate learning outcomes, writing appropriate feedback for students, specifying answer matching, ensuring that as many questions as possible are fully accessible (i.e. have versions that can be read by a screen-reader) and checking questions. The basic ‘Writing OpenMark questions’ workshop has since been repeated for other course teams and a guide ‘Good practice in the academic authoring of OpenMark questions’ has also been produced.
Development of OpenMark iCMAs for S154
A bank of 66 OpenMark questions was written in the spring of 2007 for use in various contexts, including the reinforcement and assessment of basic knowledge and skills, especially mathematical skills, developed in the 10 point course S154: Science Starts Here (first presentations October 2007 and March 2008, with 500-1000 students per presentation). An additional 15 new questions were written prior to the October 2008 presentation of S154, supplemented by two free-text short-answer questions first written for S103 and one S104 question reused with the permission of the author. The questions in the bank have been written with up to 20 different variants each, to enable their use in different places (including S154’s formative and summative iCMAs, ‘Are you ready for level 1 science?’ and the iCMA that accompanies the Maths Skills ebook) and also to provide multiple variants in each iCMA in which they are used. In formative use, multiple variants provide students with more opportunities for practice.
The S154 course team had not initially intended to make summative use of OpenMark questions. However, mounting evidence that students frequently attempt only the first question or two of formative-only iCMAs led to a course team decision to have two short summative iCMAs, one assessing Chapters 2-4 and one assessing Chapters 6-9. The focus of S154 on underpinning mathematical skills means that some students will require a lot of practice, so there is also a ‘Practice iCMA’. Students are encouraged to engage with the Practice iCMA regularly, with reminders given on the course website every week and at the end of each chapter in the course book.
‘Are you ready for level 1 science?’
32 of the questions from the bank of mathematical questions described in the previous section have also been used (with different variants) in the diagnostic quiz ‘Are you ready for level 1 science?’, which has been available to prospective students of level 1 Science Faculty courses (including S104, S154, SDK125 and Science Short Courses) since April 2007. These questions are offered alongside 6 questions on English and study skills and a number of advice screens.
An OpenMark-based diagnostic quiz has been used because it obliges people to engage actively with the questions, rather than reading the answers first and assuming that they could easily have obtained the correct answers for themselves. There is some evidence of this sort of behaviour with the printed and PDF versions of the other Science Faculty ‘Are you ready for?’ quizzes, and the level 2 courses S207, S205 and S204 have had interactive ‘Are you ready for?’ quizzes for some time.
The other reason why OpenMark is appropriate for the ‘Are you ready for level 1 science?’ quiz is that it can guide people through a complex maze of possible routes, depending on their aspirations and the time they have available, without the quiz appearing too complex or too long to prospective students. ‘Are you ready for level 1 science?’ is actually three interlinked iCMAs. After a very short ‘introductory quiz’, students are guided either to a ‘basic quiz’ (to assess their preparedness for S154, SDK125 or entry-level Science Short Courses) or to a quiz designed specifically to assess their preparedness for S104. Within the S104 quiz, some of the questions (on arithmetical rules of precedence, negative numbers and fractions, decimals, ratios and percentages) are deemed ‘essential’, since these topics are not re-taught in S104, whereas other questions (on topics which are re-taught, albeit sometimes rather briefly, in S104, for example the use of scientific notation) are classified as ‘desirable’ for S104. Prospective students are advised that they will be able to complete the early books of S104 more quickly if they are already familiar with some of these topics.
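The routing between the three interlinked iCMAs, and the essential/desirable classification of S104 topics, can be sketched as below. The course codes and topic names come from the text; the function name, data structures and the exact routing rule are illustrative assumptions only.

```python
def route_from_intro_quiz(intended_course):
    """After the short introductory quiz, guide the prospective student
    to either the 'basic quiz' or the S104-specific quiz."""
    basic_quiz_courses = {"S154", "SDK125", "Science Short Course"}
    if intended_course in basic_quiz_courses:
        return "basic quiz"
    return "S104 quiz"

# Within the S104 quiz, topics are classified as 'essential' (not
# re-taught in S104) or 'desirable' (re-taught, sometimes briefly).
S104_TOPICS = {
    "arithmetical rules of precedence": "essential",
    "negative numbers and fractions": "essential",
    "decimals, ratios and percentages": "essential",
    "scientific notation": "desirable",
}
```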
Acknowledgements
Much of the work of this project has been carried out in conjunction with colleagues on the S154 and S104 course teams, in particular Linda Fowler, Ruth Williams and Valda Stevens. Development of the OpenMark questions for S154 and ‘Are you ready for level 1 science?’ would not have been possible without the assistance of Greg Black (Learning and Teaching Solutions). Greg has also provided invaluable guidance on the possibilities offered by the rapidly developing technologies, as has Phil Butcher (COLMSCT Teaching Fellow and leader of the VLE eAssessment Project Team).
Examples of the question types we have used for S104, S154 and ‘Are you ready for S104?’ can be seen by accessing ‘Are you ready for S104?’ from www.open.ac.uk/science/courses-qualifications/are-you-ready-for-science/interactive-materials/exploring-science.php
More information about the OpenMark eAssessment system and the range of question types available can be viewed at
http://www.open.ac.uk/openmarkexamples/index.shtml
Examples of short free-text questions developed in the related COLMSCT project ‘eAssessment questions with short free-text responses: a natural language processing approach’ can be seen at students.open.ac.uk/openmark/omdemo.pm2009/
Reviews of the literature (e.g. Black and Wiliam, 1998; Gibbs and Simpson, 2004) have identified conditions under which assessment appears to support and encourage learning. Several of these conditions concern feedback, but the provision of feedback does not in itself lead to learning. Sadler (1989) argues that in order for feedback to be effective, action must be taken to close the gap between the student’s current level of understanding and the level expected by the teacher. It follows that, in order for assessment to be effective, feedback must not only be provided, but also understood by the student and acted on in a timely fashion.
These points are incorporated into five of Gibbs and Simpson’s (2004) eleven conditions under which assessment supports learning:
Condition 4: Sufficient feedback is provided, both often enough and in enough detail;
Condition 6: The feedback is timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance;
Condition 8: Feedback is appropriate, in relation to students’ understanding of what they are supposed to be doing;
Condition 9: Feedback is received and attended to;
Condition 11: Feedback is acted upon by the student.
It can be difficult and expensive to provide students with sufficient feedback (Condition 4), especially in a distance-learning environment, where opportunities for informal discussion are limited. Feedback on tutor-marked assignments is valuable but may be received too late to be useful (Condition 6), and it is then difficult for students to understand and act upon it (Conditions 8 and 11), even assuming that they do more than glance at the mark awarded (Condition 9).
One possible solution to these dilemmas is to use e-assessment. Feedback can be tailored to students’ misconceptions and delivered instantaneously and, provided the assessment system is carefully chosen and set up, students can be given an opportunity to learn from the feedback whilst it is still fresh in their minds, by immediately attempting a similar question or the same question for a second time, thus closing the feedback loop. Distance learners are no longer disadvantaged — indeed the system can emulate a tutor at the student’s elbow (Ross et al., 2006, p.125) — and ‘little and often’ assessments can be incorporated at regular intervals throughout the course, bringing the additional benefits of helping students to pace their study and to engage actively with the learning process, thus encouraging retention. For high-population courses, e-assessment can also deliver savings in cost and effort. Finally, e-assessment is the natural partner to the growth industry of e-learning.
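The ‘closed feedback loop’ described above — instantaneous feedback tailored to the student’s response, followed by a further attempt — might be sketched as follows. This is a minimal illustration under stated assumptions: the three-attempt pattern, the function name and the feedback strings are invented for the example and do not describe OpenMark’s actual marking engine.

```python
def mark_with_feedback(response, correct_answer, tailored_feedback, max_attempts=3):
    """Mark a sequence of attempts at one question, giving tailored
    feedback after each incorrect response.

    `response` is a caller-supplied function returning the student's
    answer for a given attempt number; `tailored_feedback` maps known
    wrong answers (misconceptions) to specific feedback messages.
    Returns (score, transcript), where transcript is a list of
    (attempt_number, feedback) pairs."""
    transcript = []
    for attempt in range(1, max_attempts + 1):
        answer = response(attempt)
        if answer == correct_answer:
            transcript.append((attempt, "Correct - well done."))
            return 1.0, transcript
        # Feedback tailored to the specific misconception, if recognised.
        hint = tailored_feedback.get(answer, "Not quite right - try again.")
        transcript.append((attempt, hint))
    return 0.0, transcript
```

For example, a student who answers ‘8’ to ‘2 + 2 × 2’ could be reminded about rules of precedence and then succeed on a second attempt, closing the loop while the feedback is still fresh.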
However, opinions of e-assessment are mixed and evidence for its effectiveness is inconclusive; indeed, e-assessment is sometimes perceived as having a negative effect on learning (Gibbs, 2006). Murphy (2008) reports that high-stakes multiple-choice tests of writing can lead to actual writing beginning to disappear from the curriculum; she also reports that ‘the curriculum begins to take the form of the test’. There are more widely voiced concerns that e-assessment tasks (predominantly but not exclusively multiple-choice) can encourage memorisation and factual recall and lead to surface learning, far removed from the tasks that will be required of learners in the real world (Nicol, 2007; Scouller and Prosser, 1994). Also, although multiple-choice questions are in some senses very reliable, doubt has been expressed as to whether they always assess what the teacher believes they do, partly because multiple-choice questions require ‘the recognition of the answer rather than the construction of a response’ (Nicol, 2007).
Ashton and her colleagues (2006) point out that the debate about the effectiveness of multiple-choice questions ‘diverts focus away from many of the key benefits that online assessment offers to learning’. Perhaps the question we should be asking is not ‘should we be using e-assessment?’ but rather ‘what are the features of an effective e-assessment system?’ (Mackenzie, 2003).
References
Ashton, H.S., Beevers, C.E., Milligan, C.D., Schofield, D.K., Thomas, R.C. and Youngson, M.A. (2006). Moving beyond objective testing in online assessment, in S.C. Howell and M. Hricko (eds) Online assessment and measurement: case studies from higher education, K-12 and corporate. Hershey, PA: Information Science Publishing: 116-127.
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5, 1, 7-74.
Gibbs, G. (2006). Why assessment is changing. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp11-22.
Mackenzie, D. (2003). Assessment for e-learning: what are the features of an ideal e-assessment system? 7th International CAA Conference, Loughborough, UK. At http://www.caaconference.com/pastConferences/2003/procedings/index.asp
Murphy, S. (2008) Some consequences of writing assessment, in A. Havnes and L. McDowell (eds) Balancing Dilemmas in Assessment and Learning in Contemporary Education. London: Routledge: 33-49.
Nicol, D.J. (2007). E-assessment by design: using multiple choice tests to good effect. Journal of Further and Higher Education, 31, 1, 53–64.
Ross, S.M., Jordan, S.E. & Butcher, P.G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp123-131.
Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.
Scouller, K.M. & Prosser, M. (1994). Students’ experiences of studying for multiple choice question examinations. Studies in Higher Education, 19, 3, 267–279.
Evaluation of S104 and S154’s use of iCMA questions and of the ‘Are you ready for level 1 science?’ quiz formed part of the larger related project 'Analysis of the impact of iCMAs on student learning'.
This project considered different models of iCMA use in order to determine how iCMAs can most effectively be used to support student learning.
The project’s methodology included extensive analysis of the data captured when students attempt iCMAs as well as more qualitative methodologies.
Evaluation of ‘Are you ready for level 1 science?’ showed it to have been very heavily used (more than 26,000 people accessed the quiz between April 2007 and January 2009) and popular, though unfortunately only around 50-65% of students on the first two presentations of S104 appear to have used the diagnostic quiz before deciding to study this course. A feedback question asked prospective students whether they found ‘Are you ready for level 1 science?’ useful, which course(s) they were considering both before and after attempting the quiz, and whether they had any suggestions for improvement. Several of the students who took the ‘basic quiz’ evidently found it very easy; some found this reassuring (e.g. in answer to ‘Did you find the quiz useful?’: ‘Yes, I enjoyed it and was pleasantly surprised. it's a long time since I did maths at school!!’; ‘Very useful and very reassuring’; ‘It re-ignited a little confidence’) while others appeared frustrated (e.g. ‘No. It was far too simplistic’). Analysis of responses to individual questions also indicated that most people answered these very competently. Most students who took the ‘basic quiz’ were initially intending to take S154, SDK125 or a Science Short Course, and very few changed their mind as a result of the quiz.
Responses to the feedback question, and analysis of responses to individual questions, indicated that prospective students taking the S104 quiz found it more difficult, and in a sense this quiz appears to have been more useful: some students decided to start their study with S154 rather than S104, with comments in response to ‘Did you find the quiz useful?’ such as ‘Yes. It confirmed to me that I need to take the preparatory course S154 prior to S104 in 2008. The maths section is my week point, although I feel I only need to brush up on certain areas of it and 154 will do that (I hope).’ However, again, many students were simply reassured of their preparedness for the course they were originally intending to study, with comments such as ‘The quiz was very useful. As a 39 yr old who has had little mathematics exposure since the mid 80s, it was refreshing to realise how much I had remembered.’ Responses to individual questions were far more variable than for the basic quiz, but all the questions appeared to be behaving well.
Postscript - changes to 'Are you ready for level 1 science?', November 2009
Various changes have been made to 'Are you ready for level 1 science?' in response to feedback from users.
1. Four questions on chemistry were added to the 'valuable for S104' section.
2. Although ‘Are you ready for level 1 science?’ was heavily used and well received, evaluation showed that many potential students were ‘getting lost’ between the various quizzes. For this reason, the quizzes were reconfigured, with links provided to a single quiz for S104 (‘Are you ready for S104?’) and a single quiz for the other level 1 courses (‘Are you ready for science study?’).
3. Users were irritated by the 'study skills' questions so these were removed. The questions checking that students had sufficient time for study were simplified, and the guidance on the time requirements for the various courses was strengthened.
4. Users also requested more specific guidance about whether or not they were sufficiently prepared. For the S104 quiz, a 'traffic light' system is used to indicate whether students should consider doing further study before registering for S104, as illustrated below:
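A ‘traffic light’ mapping of this kind might be sketched as below. Note that the threshold values and the function name here are invented purely for illustration; the text does not state how the real quiz maps scores on the ‘essential’ and ‘desirable’ topics to colours.

```python
def traffic_light(essential_score, desirable_score):
    """Map scores (0.0-1.0) on 'essential' and 'desirable' topics to
    hypothetical readiness advice for S104."""
    if essential_score >= 0.8 and desirable_score >= 0.6:
        return "green"   # well prepared to register for S104
    if essential_score >= 0.8:
        return "amber"   # prepared, but brushing up desirable topics would help
    return "red"         # consider further study (e.g. S154) before S104
```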
A selection of COLMSCT and piCETL related papers, presentations and workshops, given by Sally Jordan, 2006-2010.
Publications and external conference contributions
Jordan, Sally (2007) The mathematical misconceptions of adult distance-learning science students. Proceedings of the CETL-MSOR Conference 2006, edited by David Green. Maths, Stats and OR Network, pp 87-92. ISBN 978-0-9555914-0-2.
Butcher, Philip and Jordan, Sally (2007) Interactive assessment in science at the Open University: 1975 – 2007. Invited oral presentation at ‘Computer-based assessment in the broader physical sciences’: a joint event hosted by the OpenCETL and the Physical Sciences Subject Centre, 26th April 2007.
Jordan, S., Brockbank, B. and Butcher, P. (2007) Extending the pedagogic role of online interactive assessment: providing feedback on short free-text responses. REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May 2007. Available at http://ewds.strath.ac.uk/REAP07
Jordan, Sally (2007) Computer based assessment with short free responses and tailored feedback. Proceedings of the Science Learning and Teaching Conference 2007, edited by Peter Goodhew. Higher Education Academy, pp 158-163. ISBN 978-1-905788-2.
Hudson, Ken and Jordan, Sally (2007) Practitioner scholarship can lead to institutional change – implementing interactive computer based assessment. Oral presentation at ISSOTL 2007, the International Society for the Scholarship of Teaching and Learning, 4th Annual Conference, Sydney, 2nd-5th July 2007.
Jordan, Sally (2007) Assessment for learning; learning from Assessment? Oral presentation at Physics Higher Education Conference, Dublin, 6th-7th September 2007.
Stevens, Valda and Jordan, Sally (2008) Interactive online assessment with teaching feedback for open learners. Oral presentation at Assessment and Student Feedback workshop, Higher Education Academy Centre for ICS, London, April 2008.
Jordan, Sally (2008) eAssessment for student learning: short free-text questions with tailored feedback. Workshop at the University of Chester Staff Conference, May 2008.
Swithenby, Stephen and Jordan, Sally (2008) Supporting open learners by computer based assessment with short free-text responses and tailored feedback. Part of an invited symposium on ‘Matching technologies and pedagogies for supported open learning’ at the 6th International Conference on Education and Information Systems, Technologies and Applications, EISTA, Orlando, 29th June – 2nd July 2008.
Jordan, Sally (2008) Assessment for learning: pushing the boundaries of computer based assessment. Assessment in Higher Education Conference, University of Cumbria, July 2008.
Jordan, Sally (2008) Supporting distance learners with interactive screen experiments. Contributed oral presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) Online interactive assessment: short free text questions with tailored feedback. Contributed poster presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) E-assessment for learning? The potential of short free-text questions with tailored feedback. In invited symposium ‘Moving forward with e-assessment’ at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Butcher, Philip and Hunter, Arlene (2008) Online interactive assessment for open learning. Roundtable discussion at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Brockbank, Barbara, Jordan, Sally and Mitchell, Tom (2008) Investigating the use of short answer free-text eAssessment questions with tailored feedback. Poster presentation at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Brockbank, Barbara, Butcher, Philip and Mitchell, Tom (2008) Online assessment with tailored feedback as an aid to effective learning at a distance: including short free-text questions. Poster presentation at 16th Improving Student Learning Symposium, University of Durham, 1st-3rd September 2008.
Hatherly, Paul, Macdonald, John, Cayless, Alan and Jordan, Sally (2008) ISEs: a new resource for experimental physics. Workshop at the Physics Higher Education Conference, Edinburgh, 4th-5th September 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. OpenCETL Bulletin, 3, p4.
Jordan, Sally (2008) Online interactive assessment with short free-text questions and tailored feedback. New Directions, 4, 17-20.
Jordan, Sally (2009) Online interactive assessment in teaching science: a view from the Open University. Education in Science, Number 231, 16-17.
Jordan, Sally and Mitchell, Tom (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40, 2, 371-385.
Hatherly, Paul, Jordan, Sally and Cayless, Alan (2009) Interactive screen experiments – innovative virtual laboratories for distance learners. European Journal of Physics, 30, 751-762.
Butcher, P.G., Swithenby, S.J. and Jordan, S.E. (2009) E-Assessment and the independent learner. 23rd ICDE World Conference on Open Learning and Distance Education, 7-10 June 2009, Maastricht, The Netherlands.
Jordan, Sally (2009) Assessment for learning: pushing the boundaries of computer based assessment. Practitioner Research in Higher Education, 3(1), pp11-19. Available online at
http://194.81.189.19/ojs/index.php/prhe
Jordan, Sally. (2009) An investigation into the use of e-assessment to support student learning. Assessment in Higher Education Conference, University of Cumbria, 8th July 2009. Available online at http://www.cumbria.ac.uk/Services/CDLT/C-SHEN/Events/EventsArchive2009.aspx
Jordan, Sally and Brockbank, Barbara (2009) Online interactive assessment: short free text questions with tailored feedback. Oral presentation at GIREP-EPEC, August 2009.
Jordan, Sally and Butcher, Philip. (2009) Using e-assessment to support distance learners of science. Oral presentation at GIREP-EPEC, August 2009.
Hatherly, Paul, Jordan, Sally and Cayless, Alan. (2009) Interactive screen experiments – connecting distance learners to laboratory practice. Oral presentation at GIREP-EPEC, August 2009.
Butcher, P.G & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, 489-499. DOI: 10.1016/j.compedu.2010.02.012
Jordan, Sally (2010) E-assessment for learning and learning from e-assessment: short-answer free text questions with tailored feedback. Presentation and workshop to HEA Physical Sciences Centre ‘The future of technology enhanced assessment’, Royal Society of Chemistry, Burlington House, London, 28th April 2010.
Jordan, Sally (2010) Short answer free text e-assessment questions with tailored feedback. Invited seminar to Human Computer Interaction group at the University of Sussex, 21st May 2010.
Jordan, Sally (2010) Maths for science for those with no previous qualifications: a view from the Open University. HEA Physical Sciences Centre ‘Maths for Scientists’ meeting, 26th May 2010.
Jordan, Sally (2010) Student engagement with e-assessment questions. Poster at the 2010 International Computer Assisted Assessment (CAA) Conference, Southampton, July 2010.
Jordan, S. and Butcher, P. (2010) Using e-assessment to support distance learners of science. In Physics Community and Cooperation: Selected Contributions from the GIREP-EPEC and PHEC 2009 International Conference, ed. D. Raine, C. Hurkett and L. Rogers. Leicester: Lulu/The Centre for Interdisciplinary Science. ISBN 978-1-4461-6219-4, pp202-216.
Jordan, Sally (2010) Do we know what we mean by ‘quality’ in e-assessment? Roundtable discussion at EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference, Northumbria, September 2010.
Jordan, Sally, Butcher, Phil, Knight, Sarah and Smith, Ros (2010) ‘Your answer was not quite correct, try again’ : Making online assessment and feedback work for learners. Workshop at ALT-C 2010 ‘Into something rich and strange – making sense of the sea change’, September 2010, Nottingham.
Jordan, Sally (2010) Using simple software to generate answer matching rules for short-answer e-assessment questions in physics and astronomy. Oral presentation at the Physics Higher Education Conference, University of Strathclyde, September 2010.
Butcher, P.G. & Jordan, S.E. (in press) Featured case study in JISC Effective Practice Guide, Summer 2010.
Contributions to internal (OU) conferences, meetings and workshops
Jordan, Sally (2006) An analysis of science students’ mathematical misconceptions. Poster presentation at 1st OpenCETL Conference, 8th June 2006.
Jordan, Sally (2006) OpenMark – what’s all the fuss about? Lunchtime seminar at Cambridge Regional Centre, 1st November 2006.
Jordan, Sally (2007) Using interactive online assessment to support student learning. Faculty lunchtime seminar, 30th January 2007.
Jordan, Sally (2007) Issues and examples in online interactive assessment. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Assessment for learning; learning from assessment? Paper presented at the Curriculum, Teaching and Student Support Conference, 1st May 2007.
Jordan, Sally and Brockbank, Barbara (2007) Extending the pedagogic role of online interactive assessment: short answer free text questions. Paper presented at the Curriculum, Teaching and Student Support Conference, 2nd May 2007.
Jordan, Sally (2007) Investigating the use of short answer free text questions in online interactive assessment. Presentation at the Science Staff Tutor Group residential meeting, 9th May 2007.
Jordan, Sally (2007) OpenMark: online interactive workshop. Workshop run at AL Staff Development meeting in Canterbury, 12th May 2007.
Brockbank, Barbara, Jordan, Sally and Butcher, Phil (2007) Investigating the use of short answer free text questions for online interactive assessment. Poster presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions: learning from online assessment. Oral presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Using interactive screen experiments in our teaching: the S104 experience and The Maths Skills ebook. Demonstrations at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally, Ekins, Judy and Hunter, Arlene (2007) eAssessment for learning?: the importance of feedback. Symposium at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally, Brockbank, Barbara and Butcher, Phil (2007) Authoring short answer free text questions for online interactive assessment: have a go! Workshop at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally (2008) Investigating the use of short answer free-text questions in online interactive assessment. Oral presentation to EATING (Education and Technology Interest Group), 17th January 2008.
Jordan, Sally (2008) Investigating the use of short free-text eAssessment questions. Oral presentation to ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally (2008) Writing short free-text eAssessment questions: have a go! Workshop at ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally and Brockbank, Barbara (2008) Writing free text questions for online assessment: have a go! Workshop at the Open University Conference, 29th and 30th April 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. Science Faculty Newsletter, May 2008.
Jordan, Sally and Johnson, Paul (2008) E-assessment opportunities: using free text questions and others. Science Faculty lunchtime seminar followed by workshop, 16th July 2008.
Jordan, Sally and Datta, Saroj (2008) Presentation on Open University use of Interactive Screen Experiments at ISE Launch event, 19th September 2008.
Jordan, Sally, Butler, Diane and Hatherly, Paul (2008) CETL impact: an S104 case study. Series of linked presentations at the 3rd OpenCETL Conference, September 2008. [reported in OpenHouse, Feb/March 2009, ‘S104 puts projects into practice’, p4]
Jordan, Sally and Johnson, Paul (2008) Using free text e-assessment questions. Science Faculty lunchtime seminar followed by workshop, 26th November 2008.
Butcher, Phil, Jordan, Sally and Whitelock, Denise (2009) Learn About Formative e-Assessment. IET EPD Learn About Guide.
Butcher, Phil and Jordan, Sally (2009) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 21st January 2009.
Jordan, Sally (2009) E-assessment to support student learning: an investigation into different models of use. Paper presented at Making Connections Conference, 2nd-3rd June 2009.
Jordan, Sally (2009) (ed) Compilation of interim and final reports on Open University Physics Innovations CETL projects: Assessment.
Butcher, Phil and Jordan, Sally (2009) A comparison of human and computer marking of short-answer free-text student responses. Presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2009) Interpreting the iCMA statistics. Presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Nix, Ingrid, Waights, Verina, Bolton, John and Butcher, Phil (2009) From CETL to course team : embedding iCMA initiatives. Workshop at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Butler, Diane, Hatherly, Paul and Stevens, Valda (2009). From CETL to Course Team: CETL-led initiatives in S104 Exploring Science. Poster presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally (2009) Student engagement for e-assessment. Poster presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2010) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 10th February 2010.
Jordan, Sally (2010) Workshops on using OpenMark PMatch to write short-answer free-text questions, 19th January 2010 and 17th February 2010.
Jordan, Sally, Nix, Ingrid, Wyllie, Ali and Waights, Verina (2010) Science Faculty lunchtime forum, 25th March 2010.
Jordan, Sally (2010) e-Assessment. In e-Learning in Action: projects and outcomes from the Physics Innovations CETL, pp16-20.
Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.
I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning of Mathematics, Science, Computing and Technology).
Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
I was a member of the course teams producing S104: Exploring Science (presented for the first time in February 2008) and S154: Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University's OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151: Maths for Science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the course team of SXR103: Practising Science for many years.
My COLMSCT and piCETL projects reflect my interests in eAssessment and science students' mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students' misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate ways in which iCMAs can most effectively be used to support student learning. Analysis of students' responses to eAssessment questions provides an opportunity for teachers to learn more about students' misunderstandings, and I have used this methodology to learn more about science students' mathematical misconceptions. I have developed a 'Maths Skills eBook' which is used as a resource on many Open University science courses, and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE), developed with piCETL funding, for use by S104 students who are not able to conduct one of the course's home experiments for themselves.
In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.
I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.
Sally Jordan, Open University
S.E.Jordan@open.ac.uk
To produce e-tutorial teaching and support modules for M150
The aim of this project is to produce tutorial teaching and support modules that will sit alongside e-assessment. The objective is to test how we can support the student who is not scoring 100% on a self-assessment. At the moment the student can get answers and feedback, but this is almost certainly not sufficient for us to be confident that the student is capable in that topic. Most students no longer attend face-to-face tutorials to get support, but they do appear to be motivated to do the self-assessment quizzes on the VLE. So, while we have a captive audience, an objective is to produce support materials on the VLE that teach the student. These will support students in areas where they may be weak, as diagnosed by a quiz, and will enhance the teaching in the units. The teaching modules will use a variety of media: video tutorials, PowerPoint presentations with audio, exercises with solutions, and so on. The Computing Department needs to increase dramatically the retention rate on its level 1 courses. This kind of enhanced support will not only give students a better chance of passing the course but, we hope, also encourage them to undertake further study in computing and information technology.
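The core idea above — using a student's quiz results to point them at tutorial modules on their weak topics — can be sketched as follows. This is an illustrative assumption-laden sketch: the topic names, the pass threshold and the module titles are hypothetical, since the project description does not specify how the diagnosis maps to particular tutorials.

```python
# Hypothetical pass threshold below which a topic counts as 'weak'.
PASS_THRESHOLD = 0.7

# Hypothetical catalogue of VLE tutorial modules, keyed by topic.
TUTORIALS = {
    "variables": "Video: variables and assignment (approx. 5 min)",
    "loops": "PowerPoint with audio: loops in JavaScript",
    "functions": "Video: writing functions (approx. 5 min)",
}

def recommend_tutorials(topic_scores):
    """Return the tutorial modules for every quiz topic scored below
    the pass threshold, i.e. the areas diagnosed as weak."""
    return [TUTORIALS[topic]
            for topic, score in topic_scores.items()
            if topic in TUTORIALS and score < PASS_THRESHOLD]
```

A student scoring well on variables but poorly on loops and functions would thus be directed to the two corresponding modules.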
Ian’s developments are contained within the OU’s student VLE and cannot be made publicly available.
The project has now progressed to the point where 22 tutorials on JavaScript have been uploaded to a Moodle test site. About half of the tutorials are videos of about five minutes’ duration presented by an M150 Associate Lecturer; the other half are PowerPoint-with-audio tutorials. In addition, 11 tutorials have been uploaded to an iPod video device; all but one of these are the same videos as those on the website. It has also been shown that audio with PowerPoint can be uploaded to an iPod.
There have been some interesting technical issues in getting to this point, such as lighting for the video recording: downloading from the camcorder to a computer results in a darker image for the video stored on the computer. There is the usual problem of one piece of software providing some but not all of the facilities needed, and another having different facilities that include those wanted but omit others still needed. The conclusion is the usual one: no single piece of software is satisfactory for the whole task. As a result, all the videos had, by the time of uploading to the web or iPod, suffered some degradation in quality compared with what was on the camcorder tape. The best software for recording PowerPoint with audio sadly lacked the very useful facility of being able to publish the presentation to iPod as well as to the web. At some time in the future I hope to produce a document outlining my experiences of making videos for web and iPod presentation.
The next phase of the project is to evaluate the tutorials with students. First this will be carried out with students who have recently studied M150, as they are readily available and thankfully willing to help the project. It is then envisaged that the tutorials will be piloted with some students who are currently studying M150. The next update will provide news of the evaluation.
I am a Staff Tutor in Mathematics, Computing and Technology based in Newcastle upon Tyne, in the Open University in the North. I am a member of the Computing Department. I joined the university in 1988, having been a tutor since 1975. I have worked on many course teams and have chaired the M45X computing project courses. I have been an external assessor for a number of universities across quite a variety of courses. I am a mathematician by training and at heart, but I have spent much of my time at the university working on database courses. I have an interest in project management and have been a consultant for a number of companies in the North East. My passion is for sport: playing squash, cricket and tennis, and watching football and rugby.
Abstract
Web tutorials and podcasts are key components in e-learning. Academics in Higher Education are being attracted by their potential in teaching and learning, yet the majority of academics are not experts in producing videos or in using media software. This paper explores one person’s perspective, as a non-expert, on producing videos for the web and for podcasting.
Introduction
The great advantage of web-based tutorials and podcasts is their portability. They can be viewed when and where the user desires and can be paused; not something easy to do with a live lecture. They are flexible to use and can respond to fast changes in subject material. An academic can produce quality portable material that can be used again and again.
Web-based quizzes for self-assessment are becoming popular, as can be seen from the work being carried out in the Centres for Excellence in Teaching and Learning (CETLs) in universities. Quizzes have become very sophisticated in the types of question that can be formed and in the feedback that can be provided to the participant. Evidence from the CETLs indicates that evaluations of the quizzes show they are well received by users. Now web tutorials are being linked to these quizzes to provide guidance for those participants who have not scored high marks, so making web tutorials will become yet another tool for the academic in supporting their students. A simple way of producing a web tutorial is to use software such as TechSmith’s Camtasia to record audio for a PowerPoint presentation. The software has the facility to add a narration to a PowerPoint presentation and convert it for use on the web or an iPod, giving you all the facilities of PowerPoint with a voice-over. Tutorials like these have been produced and researched in the CETLs and have received excellent feedback from students.
This paper concentrates on the production of video tutorials, as this has far more complex problems to overcome than audio podcasts or audio PowerPoint tutorials. This is particularly the case if the producer has little experience of media production methods.
Why should there be a need for Web based Tutorials?
It would not be right to make web-based tutorials just because the technology is available to do so. They are time-consuming to make, so there have to be good reasons for using the end product. Neither can the reason be so that students can happily miss lectures and tutorials and catch up via a web-based tutorial, because the tutorial is then being used for convenience, not educational benefit. Web-based tutorials can be used to show a practical application or a short experiment, particularly if the video shows something that could not be staged in a lecture room. They can be used by a lecturer to demonstrate one particular aspect that is very important and should not be missed (the video being on the web means there is no excuse for any student not to have studied it). They can be used to give motivational support such as ‘how to pass my course’, examination techniques, ‘how to enjoy studying at this institution’ and so on. Because the video is stored, it can be viewed again and again, at any time of the day or night, which does not or cannot happen with a tutorial presented face to face. As with any educational support tool, its value depends on using it for the right things and not in a scattergun approach as a solution to all problems.
Why A Non Expert Producer?
In the formative years of the Open University the academics created many innovative teaching materials. One scheme was the use of television. Most people still seem to remember the male academic on TV wearing the flares and the kipper tie. Televising OU programmes created a vast audience, not just of students who had signed up for an OU course. The general public became aware of the OU; it was a significant marketing tool as well as a very innovative new form of student support. The television programmes were very costly to make, which eventually caused their demise. They were costly not just in terms of academic time: they were made in partnership with the BBC, by BBC staff in specially built studios on the OU campus. The production staff included all the usual categories needed to make television programmes, including a director, camera personnel, a floor manager and make-up personnel. The studios had high-quality lighting with qualified lighting personnel, and high-quality sound equipment with sound engineers. So television was an innovative idea, but at high cost for high quality. The web-based tutorial could be said to be the 21st-century replacement for the OU television programmes. There is not, though, the resource available to mimic the BBC for the production of these tutorials, and Higher Education institutions do not have the resources to produce all the videos that would be requested. Like many other aspects of their work, an academic needs to multi-task and to produce teaching materials in many varied forms. So in most instances it will be a non-media-expert trying to produce videos.
The Process
Producing a video with a camcorder is a very simple task, even for the non-expert, and for teaching purposes there is in most cases no need to produce a cinema-quality video. Producing videos that are good, or even average, though, is not quite so simple. YouTube contains lots of videos produced by amateurs who are not media experts, and many of them can be classed as tutorials: they are trying to teach some concept. But how many are good? Many appear quite dark or have poor resolution. It is interesting to see on the teaching videos how many show a continuous play of shadows on the whiteboard, how many have a whiteboard you cannot read, and how many have a presenter who is a dark, shadowy figure. These are all problems associated with using a camcorder in an ordinary lecture room, downloading to a computer and then producing the video for the web or iPod. All of them can be overcome, particularly if they are known about before production rather than afterwards, when it may be too late or too costly to do something about them.
Once a video has been made using a camcorder you can view it on a television. The showing on television will look lighter, better and of greater resolution than the final product, so it is easy to be fooled into believing you have a good product. To get the best download to a computer, a FireWire cable should be used, but even then the video, when viewed on the computer, will appear darker than on the television. The fuzzy effect one sees on a YouTube video may well be due to the producer using software on the computer to attempt to lighten the video: the video can be made lighter, but this affects its sharpness and gives it a fuzzy appearance. The computer software used to produce the video in web format will darken the video as well, although this appears less marked when producing the video for podcasting. Most of these problems can be overcome, some by the producer becoming more experienced, but mainly by working with a person trained in media production or by studying a media course such as video editing.
A View of Some of the Problems
One problem with a video of a lecture is the difficulty of showing clearly what is on the whiteboard. If it is necessary to see the whiteboard, you must either pan back and forth between the whiteboard and the lecturer or be good enough at editing to edit in the whiteboard elements. A simple solution is to ensure the whiteboard content uses a very large font, showing, say, only three lines of text, and that any diagram or picture fills the whiteboard. Lighting is another problem. It is quite difficult to get lighting that is sufficiently bright for videoing in a classroom or lecture hall, and bringing in professional lighting is costly and may be very difficult to arrange. Even with good lighting, it helps if the lecturer wears light-coloured, but not white (to stop a dazzling reflection), attire. It also helps if they use a pointer to indicate items on the whiteboard rather than create distracting shadows by pointing with their hand and arm. Some lecturers like to walk about when teaching. Does this give a natural effect? For some it is distracting, particularly on a video; it may be best to find a spot for them to stand and ask them not to move. Many lecturers can give a lecture without notes, which is ideal for a video, but a presenter holding a set of notes does not make good viewing. Television presenters use an autocue; this is not going to be available to most academic producers. A solution is to have the lecture scripted and to show the script on a monitor using a word processor. This requires the monitor to be out of the camcorder’s shot but within easy view of the lecturer, the script font to be large, and someone to scroll the document manually at the pace of the lecturer’s speech.
Many teaching videos on YouTube are too long. Focusing on a computer monitor or the screen of an iPod for more than ten minutes of a presentation is very difficult. Attention span in learning is, as is well known, short, and this is particularly true when viewing a tutorial on the web or an iPod. Many YouTube videos become tedious to watch because they are too long, no matter how relevant or good the content. So a major constraint in producing good video tutorials is time limitation: can you get the important message you want to deliver presented in a short time? This means a video of at most 5–8 minutes, which focuses the mind on the content of the video.
It is good to report that the process of transferring the video to the web or a podcast is quite straightforward using software such as Articulate, TechSmith Camtasia or Adobe Captivate.
Quizzes Revisited
With a web-based quiz that allows students to assess their ability on a topic, what support is there for a student who does not do well? Generally the student can attempt each question a number of times and repeat the quiz as many times as desired, so if the student, after all those attempts, is still not getting a high mark, they are not proficient in that topic. They may be able to see a tutor and discuss the problems; but if not, what can they do? It is now possible to link a web tutorial to a quiz. Conditions can be set so that if the overall grade is higher than a certain percentage, the student is given feedback on performance and a motivating ‘well done’ message; but if the overall grade is less than the set percentage, an icon appears on the student’s web page, and the icon is the shortcut that runs a web tutorial. Hopefully the student will view the tutorial, then re-do the quiz and do better. All of this is done at a time and place convenient to the student.
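The grade-conditional behaviour described above can be sketched in a few lines. This is an illustration only: the 70% threshold, the function name and the returned fields are assumptions made for the sketch, not the VLE's actual settings or API.

```python
# Illustrative sketch of grade-conditional tutorial release.
# The 70% threshold and all names here are assumptions for illustration,
# not the actual VLE (Moodle) interface.

PASS_THRESHOLD = 70  # percent; in practice set by the quiz author


def quiz_followup(overall_grade):
    """Return what the student sees after submitting the quiz."""
    if overall_grade >= PASS_THRESHOLD:
        # Above threshold: feedback plus a motivating message.
        return {"message": "Well done!", "show_tutorial_link": False}
    # Below threshold: expose the icon linking to the web tutorial.
    return {"message": "Please view the linked tutorial, then try the quiz again.",
            "show_tutorial_link": True}


print(quiz_followup(85))  # no tutorial link needed
print(quiz_followup(40))  # tutorial link shown
```

The same check could equally be expressed as a conditional-release rule in the VLE itself; the point is simply that one grade comparison decides which support the student is offered.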
Conclusion
As someone who did not have photography or film-making as a hobby, working on a project that involves producing videos has involved a great deal of learning and problem-solving. It has been very interesting to note the problems that were never envisaged at the pre-production stage. There is no doubt that web and podcast tutorials are going to be used, and very useful, in teaching because of their portability and flexibility. It does not seem possible for educational institutions to provide the professional resources to make all the teaching videos required, but perhaps there is a need for institutions and their staff to start a dialogue to find ways for the institution to provide more help, expert advice, and better but simpler facilities, to enable the production of the large number of videos that will be wanted.
Abstract
Self-assessment, tutorials on a Virtual Learning Environment (VLE) and podcasts are key components in e-learning. This paper reports on one aspect of the use of these student support tools for a course offered by the Open University in the United Kingdom.
Introduction
Web-based quizzes for self-assessment are becoming popular, as can be seen from the work being carried out in the Centres for Excellence in Teaching and Learning (CETLs) in universities. Quizzes have become very sophisticated in the types of question that can be formed and in the feedback that can be provided to the participant. Evidence from the CETLs indicates that evaluations of the quizzes show they are well received by users. Now web tutorials are being linked to a quiz to provide guidance for those participants who have not scored a high mark on it. The quiz is a major support mechanism for students to assess how they are progressing in their studies. Since quizzes and tutorials can be delivered on a VLE via the web, they form key components in any e-learning course.
VLE quizzes
What is the general form of a VLE quiz? It has about eight questions or fewer, so it is short in overall length, to encourage the student to do all the questions without facing a daunting task. A quiz covers a small area of learning, allowing the student to check their progress in bite-sized amounts; it does not cover a topic area so large that the student finds they did not know some aspect that was covered many weeks ago. Through the quiz, the student knows their status with a concept and learns a topic in the knowledge that each stage in the learning process has been successful. Hence they learn a topic by building knowledge in stages, and at each stage they know they have learned all the necessary concepts.
The author of a quiz determines, for each question, how many attempts the student can make in that visit to the quiz; three attempts seem to be the norm. The author also determines the number of times a student can take each quiz, the norm being as many times as they wish. If the student gets a question wrong, a hint is given as to where they have gone wrong, with help towards getting it correct on the next attempt.
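A minimal sketch of this attempt-limited behaviour is shown below. The class, the hint text and the three-attempt default are illustrative assumptions in the spirit of the description above, not an actual VLE implementation.

```python
# Minimal sketch of an attempt-limited quiz question with per-attempt
# hints, as described above. Names and messages are illustrative only.

class QuizQuestion:
    def __init__(self, prompt, answer, hints, max_attempts=3):
        self.prompt = prompt
        self.answer = answer
        self.hints = hints          # hints shown after each failed attempt
        self.max_attempts = max_attempts
        self.attempts = 0

    def submit(self, response):
        """Check a response; return (correct, feedback)."""
        if self.attempts >= self.max_attempts:
            return False, "No attempts remaining; the answer was " + self.answer
        self.attempts += 1
        if response == self.answer:
            return True, "Correct, well done!"
        if self.attempts < self.max_attempts:
            # Wrong, but attempts remain: give a hint towards the answer.
            hint = self.hints[min(self.attempts - 1, len(self.hints) - 1)]
            return False, "Not quite: " + hint
        # Final attempt used up: reveal the answer.
        return False, "The correct answer was " + self.answer


q = QuizQuestion("What tag introduces a JavaScript block in HTML?",
                 "<script>", ["Think of an HTML tag, not a keyword."])
print(q.submit("script"))    # wrong: a hint is given
print(q.submit("<script>"))  # correct on the second attempt
```

Because the whole quiz can be retaken as often as the student wishes, a fresh set of `QuizQuestion` objects would simply be created for each new visit.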
There is great flexibility in the format of a quiz question. A question could ask the student to choose an answer, to complete a diagram or graph, or to drag and drop boxes to form an answer. A student can also be asked to provide an answer in words; this is much harder to check for correctness, but many quizzes now use this free-text facility.
At the end of a quiz the student can be given an overall score and feedback on their performance. Student feedback in evaluations of the use of quizzes in learning is consistently very positive. More information on the VLE and the many projects in teaching and learning can be found on the Open University CETL website: www.open.ac.uk/opencetl/index.php
Tutorials via a VLE
The great advantage of web-based tutorials and podcasts is their portability. They can be viewed when and where the user desires and can be paused; not something easy to do with a live lecture. They are flexible to use and can respond to fast changes in subject material. An academic can produce quality portable material that can be used again and again.
A simple way of producing a web tutorial is to use software such as TechSmith’s Camtasia to record audio for a PowerPoint presentation. The software has the facility to add a narration to a PowerPoint presentation and convert it for use on the web or an iPod, giving you all the facilities of PowerPoint with a voice-over. Tutorials like these have been produced and researched in the CETLs and have received excellent feedback from students.
Producing a video with a camcorder is a very simple task, even for the non-expert, and for teaching purposes there is in most cases no need to produce a cinema-quality video. Producing videos that are good, or even average, though, is not quite so simple. YouTube contains lots of videos produced by amateurs who are not media experts, and many of them can be classed as tutorials: they are trying to teach some concept. Many appear quite dark or have poor resolution. It is interesting to see on the teaching videos how many show a continuous play of shadows on the whiteboard, how many have a whiteboard you cannot read, and how many have a presenter who is a dark, shadowy figure. These are all problems associated with using a camcorder in an ordinary lecture room, downloading to a computer and then producing the video for the web or iPod. All of them can be overcome, particularly if they are known about before production rather than afterwards, when it may be too late or too costly to do something about them. The process of transferring the video to the web or a podcast is quite straightforward using software such as Articulate, TechSmith Camtasia or Adobe Captivate.
Many teaching videos on YouTube are too long. Focusing on a computer monitor or the screen of an iPod for more than ten minutes of a presentation is very difficult. Many YouTube videos become tedious to watch because they are too long, no matter how relevant or good the content. So a major constraint in producing good video tutorials is time limitation: can you get the important message you want to deliver presented in a short time? This means a video of at most 5–8 minutes, which certainly focuses the mind on the content of the video.
It would not be right to make web-based tutorials just because the technology is available to do so. They are time-consuming to make, so there have to be good reasons for using the end product. Neither can the reason be so that students can happily miss lectures and tutorials and catch up via a web-based tutorial, because the tutorial is then being used for convenience, not educational benefit. Web-based tutorials can be used to show a practical application or a short experiment, particularly if the video shows something that could not be staged in a lecture room. They can be used by a lecturer to demonstrate one particular aspect that is very important and should not be missed (the video being on the web means there is no excuse for any student not to have studied it). They can be used to give motivational support such as ‘how to pass my course’, examination techniques, ‘how to enjoy studying at this institution’ and so on. Because the video is stored, it can be viewed again and again, at any time of the day or night, which does not or cannot happen with a tutorial presented face to face. As with any educational support tool, its value depends on using it for the right things and not in a scattergun approach as a solution to all problems.
Tutorials via Portable Devices
The tutorials mentioned in the previous section can, as well as being put on the VLE for viewing, be stored as files that can be downloaded onto an MP3 player or iPod. When converting the tutorial using the software, you can choose a format for use on the web or on an iPod. Providing the tutorial for an iPod video or iPod Touch gives greater portability: the student can download the tutorial and view it whenever and wherever they desire.
VLE Quiz and Linked Tutorial
Web tutorials are being linked to web quizzes to provide guidance for those participants who have not scored high marks. With a web-based quiz that allows students to assess their ability on a topic, what support is there for a student who does not do well? Generally the student can attempt each question a number of times and repeat the quiz as many times as desired, so if the student, after all those attempts, is still not getting a high mark, they are not proficient in that topic. They may be able to see a tutor and discuss the problems; but if not, what can they do? It is now possible to link a web tutorial to a quiz. Conditions can be set so that if the overall grade is higher than a certain percentage, the student is given feedback on performance and a motivating ‘well done’ message; but if the overall grade is less than the set percentage, an icon appears on the student’s web page, and the icon is the shortcut that runs a web tutorial. Hopefully the student will view the tutorial, then re-do the quiz and do better. All of this is done at a time and place convenient to the student, and the student can view the tutorial over and over again, pausing it to make notes if necessary.
Future
There is no doubt that e-learning courses will play an important part in education in the coming years. E-book readers are only just coming onto the market and are expensive, but like all technology gadgets their price will fall once they come into mass usage. The impact of e-books on e-learning cannot yet be forecast, but it will almost certainly have a major effect on the production of materials in education. One can envisage the demise of printed texts and printed course units: the e-book, together with a device that accesses the web and stores audio and video files, will be the course. We will see the production of a new e-book reader that can access the web and store audio and video files. Then one device will contain not just one course but many, and will also allow the student to access all the support mechanisms available on the VLE. Students in the future will desire and request such portability.
Conclusion
There is no doubt in my mind that web and podcast tutorials are going to be used, and very useful, in teaching because of their portability and flexibility. Self-assessment quizzes on a VLE will be standard on all courses. Developments of VLEs will provide greater functionality, so that quizzes will become easier to format and will have even more flexibility in style, for both questions and answers. The student experience will improve with the ability to check their own progress as they study and the opportunity to be more flexible in when and where they study. Quizzes and tutorials on the VLE are a worthy and significant advance in the support of student learning and teaching.
Web Site
http://www.open.ac.uk/opencetl/index.php
The aim of this project is to explore the value of online virtual experiments in enhancing student understanding of course material, retention and employability skills. A 'virtual lab' is defined as an 'e-learning activity based on conventional laboratory procedures, but delivered on-line to distance learners, to give them a more real experience of biological material, procedures and applicability, normally absent from a paper-based course'.
The aim of this project is to determine whether 'virtual labs' with periodic, fast, on-line formative assessment and feedback/feed-forward, would:
This addresses the weakness identified by the Science Faculty of the 'difficulties in teaching science at a distance imposed by the need to include practical work'.
Development of the iCMAs
This project has designed formative interactive computer marked assessments (iCMAs) based on some of the microscope images that are available in the virtual laboratory used by the course S320 Infectious disease.
The project will be evaluating the value of these activities in relation to students' learning development.
Readers may like to view some of the formative iCMAs:
Recognizing parasites
https://students.open.ac.uk/openmark/s320.book2world/
Microbiology
https://students.open.ac.uk/openmark/microbiology-08.expt/
Ken Hudson, COLMSCT Associate Teaching Fellow
kmh24@tutor.open.ac.uk
This project builds on the work of Mirabelle Walker (Fellow 2005 - 2007) and also the Formative Assessment in Science Teaching project (www.open.ac.uk/fast)
This proposal was particularly prompted by the level of change that Mirabelle’s project has brought about in TMA marking for T175, Networked living: Exploring information and communication technologies, and also by discussion within the T175 Course team on new ideas (technologies) that might be included in T175.
This project aims:
Frances Chetwynd
f.d.chetwynd@open.ac.uk
This project has been designed as a sequel to research by COLMSCT Fellow, Mirabelle Walker, in the Technology Faculty which set out to investigate, and find ways to improve, the quality of written feedback being given on students’ tutor-marked assignments (TMAs).
This project investigates the types of feedback provided by language tutors on two Spanish courses offered by the Department of Languages (Faculty of Education and Language Studies) and examines how students respond to it. It replicates a previous COLMSCT-funded study, conducted in the Technology Faculty by Mirabelle Walker, whose project set out to investigate, and find ways to improve, the quality of written feedback being given by Technology tutors on students’ tutor-marked assignments (TMAs).
This sequel project tests the extent to which the findings of the previous study are applicable to other academic subjects. It presents an analysis of over 4000 written comments made by Language tutors on 72 assignments in two Spanish modules, and of follow-up telephone interviews with 20 of the students whose tutors’ comments were analysed. Preliminary findings suggest that, overall, Language tutors make more comments and annotations than Technology tutors. They also comment more on skills than on content (the opposite tendency to that observed in Technology). In terms of depth, a lower proportion of the comments by Language tutors are corrections; they make more comments that simply indicate errors than their Technology colleagues, and also more comments that provide explanations.
These findings will be triangulated with an analysis of the follow-up interviews, which is currently in progress. The final results are expected to provide a better understanding of current practice across subject areas and to inform further recommendations on effective feedback in different subjects. They will be widely shared with researchers and practitioners, both within and beyond the Open University.
This project was designed to investigate, and find ways to improve, the quality of written feedback being given on students’ tutor-marked assignments.
Drawing on three courses in the Technology Faculty [1], over 3000 comments on 106 scripts and associated cover sheets were analysed using a coding scheme devised by Brown and Glover (2006). Forty-three of the students whose commented work had been analysed were subsequently interviewed. Matching students’ responses to the feedback they had received with the nature of that feedback led to three key findings:
These results were shared with the tutors on the three courses where the feedback had been analysed, and guidelines were developed to enhance the feedback being given. An analysis of the feedback on assignments after this intervention showed that the intervention had mixed results. There were improvements in the use of skills development comments on all three courses, but on two of the three courses there was a decrease in gap-bridging comments. An investigation of this latter finding revealed no single cause, but indicated the need for further staff development work and the undesirability of attempting to make two changes to feedback practice simultaneously.
The results relating to effective feedback have subsequently been shared more widely, both within and beyond the Open University, and related staff development work has been, or is being, undertaken.
The 'Final Project Report' and related deliverables can be accessed from Related resources below.
[1] The Technology Faculty has subsequently been merged into the new Faculty of Mathematics, Computing and Technology.
A sequel to this project
Colleagues in the Spanish Department of the Faculty of Education and Language Studies, Maria Fernandez-Toro and Mike Truman, have carried out a research project that follows up the work described here with an investigation into feedback given to students on their Spanish courses. As well as discovering what feedback is like on a language course, and how students respond to it, this work has enabled comparisons to be made across the two Faculties.
The final report of this follow-up project is available under Related projects.
A selection of COLMSCT related dissemination activities to January 2009.
I was pleased that the Open University’s bid to host several CETLs was successful, as I had long felt that excellence in research was promoted above excellence in teaching here at the Open University, and saw the CETLs as a chance to redress the balance. Given my own long-standing interest in teaching and learning, I hoped I would be able to participate in the work of one of the CETLs. So when I heard that I could put in a bid to do a project with COLMSCT I immediately began to plan my response. I had recently become interested in the impact of feedback on assessment in the learning process, so my project proposal centred round investigating the effectiveness of TMA feedback in my own faculty - the Technology Faculty, as it was then.
The acceptance of my proposal, in the form of a two-year appointment as a part-time COLMSCT Teaching Fellow, led to two of my most enjoyable years at the Open University as I first delved into what sort of feedback was being given on TMAs, and how students reacted to it, and then disseminated my findings. The dissemination phase has led to my visiting several regions and running workshops about feedback for ALs. It has also led to my giving papers at educational conferences and getting articles published in educational journals. All of these were relatively unfamiliar activities for me, so my COLMSCT project has widened my horizons as well as increased my understanding of what works in TMA feedback, and why.
My only concern now, at the end of my project, is that there is still a great deal to be done regarding TMA feedback. I have only scratched the surface of what could - and should - be done to promote better feedback within the Open University. We may have done better than most universities in this area in the recent student satisfaction survey, but this is no reason to rest on our laurels! We can do even better, and our students would benefit if we did.
I have worked with what was the Faculty of Technology and is now the Faculty of Mathematics, Computing and Technology of the Open University since 1973, and am currently a Lecturer in the Department of Communication and Systems. I have helped to design, and prepared learning materials for, over fifteen courses. I have been Director of the Information, Computer and Communication Technologies Programme Board and Chair of the Degree Board for the popular BSc (Hons) Information Technology and Computing degree. Over the years I have developed a keen interest in the inter-relationship between assessment and student learning and have been a member of several course teams that have used innovative types of assessment.
From 2005 to 2007 I had a Teaching Fellowship with COLMSCT to investigate written feedback on students' assignments: what it's like, what students make of it and how to improve it.
Publications related to my COLMSCT project include:
This project is being funded under the E-Assessment for Learning Initiative. Its aim is to develop a framework for e-assessment that could be used across English language courses in general, to support learning outcomes related to the description, analysis and interpretation of linguistic data.
Students on applied linguistics and language studies courses need a basic competence in linguistic analysis, but vary considerably in the relevant knowledge and skills on course entry. This project investigated the feasibility of using optional web-based self-assessment activities to help students develop skills of linguistic analysis. A set of interactive activities was developed to support students on a masters level language studies course, E854 Investigating language in action. Developmental testing of the activities was carried out with 28 students from related courses, and a further 5 testers took part in an observational study in the IET research laboratory. Based on these studies, the activities were then revised to improve clarity, accuracy and usability.
Feedback from the testers was positive. The activities seem to cater well for different learning styles by allowing users to work at their own pace and in their own way. Providing well-defined questions with immediate feedback appears to be an effective way of engaging students. The design was generally intuitive, although the absence of any way of returning to previous pages may have reduced the value of the feedback to students. Amending this feature would likely be a worthwhile improvement to the system.
As E854 Investigating language in action only begins its first presentation in October 2009, we cannot yet evaluate the way the activities are used by their target audience. Given that the activities are optional, a crucial issue will be the extent to which students are willing to spend time on them. Further research is therefore needed to monitor their use and to evaluate the extent to which they meet the aims of the project.
Readers might like to try the following iCMA. (Please note that this iCMA can only be made available to OU staff and students as it uses copyrighted texts):
Describing English
https://students.open.ac.uk/openmark/e841-08.describing_englishstaff
Sarah North, Faculty of Education and Language Studies
S.P.North@open.ac.uk
A significant number of students studying higher level chemistry courses have some difficulty with the mathematics. This particularly affects their studies in areas involving physical chemistry, which are more mathematically based.
The aim is to provide an interactive assessment that allows students to practise their mathematics in the context of examples from chemistry, and then to use the feedback to improve their understanding of the required mathematics and, where necessary, to find out where additional help is available. With interactive assessment, the feedback can be instantaneous and targeted.
Readers might like to try the following iCMAs:
First order kinetics
https://students.open.ac.uk/openmark/s342-08.mathfchem-foworld/
Second order kinetics
https://students.open.ac.uk/openmark/s342-08.mathfchem-soworld/
The Arrhenius equation
https://students.open.ac.uk/openmark/s342-08.mathfchem-arworld/
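The three iCMAs above rest on standard physical-chemistry relationships: the integrated rate laws for first- and second-order kinetics and the Arrhenius equation. As an illustrative sketch (the function names and example values are ours, not taken from the iCMAs themselves), the underlying calculations look like this:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def first_order_conc(c0, k, t):
    """First-order kinetics: [A] = [A]0 * exp(-kt)."""
    return c0 * math.exp(-k * t)

def first_order_half_life(k):
    """For first-order kinetics, t_1/2 = ln 2 / k (independent of [A]0)."""
    return math.log(2) / k

def second_order_conc(c0, k, t):
    """Second-order kinetics: 1/[A] = 1/[A]0 + kt."""
    return 1.0 / (1.0 / c0 + k * t)

def arrhenius_k(A, Ea, T):
    """Arrhenius equation: k = A * exp(-Ea / (R T)),
    with Ea in J mol^-1 and T in kelvin."""
    return A * math.exp(-Ea / (R * T))
```

Questions of this kind ask students to substitute values into these equations with correct units, which is exactly the skill the feedback targets.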
Christine Leach
cal239@tutor.open.ac.uk
This project aims to produce a design for a continuum of topic-based computer-marked questions, from easy to difficult and from formative to summative. Students can choose, depending on their self-assessment, which questions on the continuum to engage with, and log their reflections as they do so. Our research questions will focus on how students respond to this method of selecting their learning journey, whether students agree with the mapping of the continuum according to the typology of questions which we have devised, and what design improvements can be suggested.
Students come onto courses with different skills levels and learning needs. By developing a continuum which communicates to students which questions may be appropriate for them to engage with based on their confidence and competence levels, we hope students will be able to make effective use of their study time and to undertake the summative assessment at a point when they are ready.
We have identified a typology of question types to map onto the continuum. The aim is to find a series of designs which are motivating for students to engage with and which they recognise as being easier or more difficult, and hence would expect to find at certain positions in the continuum.
The continuum sequence of six formative questions starts with a task/problem. The sequence then moves through checking comprehension, knowledge, and applying skills. According to their choice students will experience progressively less ‘expert’ input and support, and more independence, leading to applying the skill in generative situations.
When they are ready they can select the (final) summative sequence of questions. Students are encouraged to answer these in a linear sequence because of the way the scenario unfolds in each question. This sequence closely reflects the learning objectives in the formative questions. On completion of the summative test the student may revisit any of the formative questions.
Some questions will incorporate audio, animations or stills. Each question will also include a Confidence indicator tool. This requires students to indicate on seeing the possible question answers how confident they are that their answer will be correct. Marks are deducted if a student is over-confident and achieves a wrong answer. The project aims to explore how students feel about using this approach and whether it is an appropriate tool to help students develop self-awareness of their confidence and competence levels, especially for students in disciplines which advocate evidence-based practice.
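The Confidence indicator described above amounts to a scoring rule in which a wrong answer given with high confidence attracts a deduction. The project does not publish its actual weights, so the following is a minimal sketch with hypothetical values, intended only to show the shape of such a rule:

```python
def confidence_score(correct, confidence, base_mark=1.0, penalty_factor=1.0):
    """Illustrative confidence-weighted marking rule.

    confidence: 0.0 (not at all confident) .. 1.0 (certain).
    A correct answer earns the base mark; a wrong answer loses marks
    in proportion to the confidence claimed, so over-confidence is
    penalised. (Weights here are hypothetical, not the project's scheme.)
    """
    if correct:
        return base_mark
    # the surer the student was, the larger the deduction for a wrong answer
    return -penalty_factor * confidence
```

Under this rule a cautious wrong answer costs nothing, while a confidently wrong one costs up to a full mark, which is what gives students an incentive to calibrate their self-assessment honestly.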
Learning logs within each question enable students to reflect on and analyse their learning experiences and help them make optimum use of the system. The same logs will serve to inform the project team about the reasons for choices made.
The LINA project is currently being trialled by students on K216 Applied Social Work Practice at: http://learn.open.ac.uk/file.php/1333/LINAhome.htm
Readers might like to try the following interactive Computer Marked Assessments:
A selection of COLMSCT-related dissemination activities to January 2009.
I am a Lecturer in Learning and Teaching Technologies in the Faculty of Health and Social Care and a Teaching Fellow in the OU's Centre for Excellence in Teaching and Learning (CETL). I have 2 projects in the Centre for Open Learning in Mathematics, Science, Computing and Technology (COLMSCT) CETL and a further project in the Practice-based Professional Learning (PBPL) CETL.
My COLMSCT projects reflect my interest in eAssessment and in applying constructivist pedagogies to formative and summative student learning. Taking a broad perspective, the LINA project explored what design features would make interactive assessment more motivating to learners, with the aim of investigating an approach that would facilitate students’ progress from formative to summative assessment. It built on work Ingrid Nix and I had carried out within the faculty, designing online computer-marked assessment sequences for social work and nursing students and observing student use and levels of engagement amidst competing pressures of work and study.
The Clinical Decision-Making Maze (CDM) project focused very much on the idea of an extended contextualised narrative as the backbone for a number of decision points, evolving over time into a decision maze with a variety of different possible outcomes. Media resources at each stage flesh out the context and provide evidence for students to interrogate and use to justify their decisions, which are captured in a reflective log. Students construct their own interpretation of events and make decisions accordingly. As the clinical case evolves over time, decision pathways are scaffolded to allow for targeted feedback and to limit the complexity of the underlying maze. The approach marries the complexity of practice simulation scenarios with the flexibility of a decision maze and, because it is embedded within an existing online assessment system (OpenMark), results can be captured and scored, and targeted feedback given.
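The decision-maze structure just described (decision points, branching pathways, targeted feedback and a reflective log) can be sketched as a small graph of nodes. This is a hypothetical illustration only; the names and structure are ours, not the project's actual OpenMark implementation:

```python
from dataclasses import dataclass, field

@dataclass
class DecisionNode:
    """One decision point in a scaffolded maze."""
    situation: str
    choices: dict                              # choice label -> next node key (None = outcome reached)
    feedback: dict = field(default_factory=dict)  # choice label -> targeted feedback text

def walk_maze(maze, start, decisions):
    """Follow a sequence of decisions through the maze,
    logging each step as (node, choice, feedback) for later reflection."""
    log, key = [], start
    for choice in decisions:
        node = maze[key]
        log.append((key, choice, node.feedback.get(choice, "")))
        key = node.choices[choice]
        if key is None:  # a terminal outcome of the scenario
            break
    return log
```

Scaffolding the pathways, as the project describes, corresponds to keeping the `choices` dictionaries small at each node so that targeted feedback can be written for every branch without the maze growing combinatorially.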
For a full profile visit Ali's faculty page at http://www.open.ac.uk/hsc/people/profile.php?name=Ali_Wyllie
This project is part of the COLMSCT investigations into using online questions which require students to respond with short answers in free text. In particular, this project is looking into how to set questions that require the students to make two or three separate points, and obtain credit for each subpart. This project is working closely with the e-assessment projects of Sally Jordan and Phil Butcher.
Previous COLMSCT projects have demonstrated that students find online assessment valuable, but for short-answer questions it is difficult for someone without specialist training to write good mark schemes. Even with the necessary experience, the process is time-consuming. When the questions are extended to cover more than one teaching point, the combinatorial increase in possible solutions makes the development of the mark scheme commensurately more difficult. The purpose of this project is to see whether techniques can be developed which allow the question-setter to pose (and automatically mark) more complex questions without a heavy increase in workload on the part of the assessor.
A set of multiple-solution questions is being developed for the Computing Masters-level course in security engineering. This will enable us to investigate whether hand-written mark schemes (in Java or Prolog) work better than off-the-shelf solutions.
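To make the idea of crediting each subpart independently concrete, here is a minimal sketch of a mark scheme for a multi-point free-text question. Everything here is hypothetical (the teaching points, the patterns, the function names); real short-answer marking engines such as those investigated in the related COLMSCT projects use far richer matching than simple regular expressions:

```python
import re

# Hypothetical mark scheme: each teaching point maps to a list of patterns,
# any one of which earns that point's credit independently of the others.
MARK_SCHEME = {
    "confidentiality": [r"\bencrypt", r"\bconfidential"],
    "integrity": [r"\bintegrity\b", r"\btamper"],
}

def mark_answer(answer, scheme):
    """Return the set of teaching points credited in a free-text answer.
    Each subpart is matched and credited separately."""
    text = answer.lower()
    credited = set()
    for point, patterns in scheme.items():
        if any(re.search(p, text) for p in patterns):
            credited.add(point)
    return credited
```

The combinatorial difficulty the project describes shows up even in this toy version: as the number of teaching points grows, the setter must anticipate valid phrasings for every subpart, which is exactly the authoring workload the project seeks to contain.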
As yet there are no interactive examples to include.
Alistair Willis
a.g.willis@open.ac.uk
The focus for the project is to provide an online self-assessment tool for developing an understanding of music analysis (mainly descriptive) and music notation.
TA212 The Technology of Music is a 60 credit point course.
Robert Davis, COLMSCT Associate Teaching Fellow
rd2267@tutor.open.ac.uk
An analysis of student answers to interactive online assessment questions for the Open University course S151 : Maths for Science has provided insight into adult distance-learning science students’ mathematical misconceptions. Some of the findings have been unexpected, and errors are frequently caused by more basic misunderstandings than lecturers might imagine. The analysis has revealed specific misconceptions relating to units, powers notation, arithmetic fractions and the rules of precedence.
Background to the project
The ‘Maths Problem’ in physics teaching is well documented and many imaginative solutions have been proposed. These solutions frequently make innovative use of new technology, but they commonly start from the premise that we know why physics students struggle with mathematics. This project has stepped back to consider the nature of the misconceptions which lead adult students to have difficulty with the mathematics needed for the study of physics.
Open University undergraduate courses are completely open entry. One implication of this is that students studying Science Faculty courses have a very wide range of mathematical backgrounds, varying from those who already have a degree in a numerate discipline to those with no previous mathematical qualifications at all. Many OU students have not studied mathematics since they were at school (which, for adult students, might have been many years ago) and they frequently lack confidence in their mathematical abilities. Elementary mathematical skills are embedded in the Science Faculty’s interdisciplinary level 1 course S104 Exploring Science (which replaced S103 Discovering Science), but lack of mathematical ability and confidence remains a problem for many students as they progress to level 2 courses in physics, astronomy, chemistry, Earth science and biology. The 10-point level 1 course S151 Maths for Science was written to meet this need. The course has now been studied by more than 8000 students since it was first presented in 2002. It has been both well received by students and instrumental in increasing retention rates on higher level Science Faculty courses.
One of the issues confronting all providers of distance education is the need to provide students with meaningful feedback without necessarily ever meeting them, and the S151 Course Team took the decision to pilot an interactive web-based system for both summative and formative assessment. The assessment system, known as ‘OpenMark’, is now used by several Open University courses.
Methodology
In addition to its many benefits for student learning, the OpenMark system has provided a rich source of information about the mistakes made by students. Data from a large number of S151 assessment questions have been analysed, typically for around 200 students at a time, and this is leading to increased insight into students’ mathematical misconceptions.
Most of the analysis has been done on questions from the ‘End of Course Assessment’, which has a summative as well as a formative function. An implication of this is that students are trying very hard to get the questions ‘right’, so the errors revealed cannot, in the main, be attributed to students guessing the answer. In addition, since so few of the questions are multiple choice, the analysis has been able to go beyond a consideration of commonly selected distractors to look at the actual responses entered by students. Finally, most of the assessment questions exist in several variants, with different questions being presented to different students. This feature, which exists to limit opportunities for plagiarism, has also added to the author’s confidence in the general applicability of some of the findings. For example, for one variant of a question, around 50% of incorrect responses gave the answer 243; for a different variant of the same question a similar percentage of incorrect responses gave the answer 11809.8, and so on. These errors can be explained by an identical misunderstanding, in this case a misunderstanding of the rules of precedence.
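The point about question variants is that superficially different wrong answers can share a single underlying error. As an illustration only (the numbers below are hypothetical, not the actual S151 question values), a misunderstanding of the rules of precedence produces a characteristic wrong answer:

```python
# Correct precedence: exponentiation binds more tightly than multiplication,
# so 2 x 3^2 means 2 x (3^2).
correct = 2 * 3 ** 2          # 2 x 9 = 18

# The misconception: applying the operations left to right,
# effectively computing (2 x 3)^2 instead.
misconception = (2 * 3) ** 2  # 6^2 = 36
```

The same misreading applied to a different variant's numbers yields a different, but equally diagnostic, wrong answer, which is why consistent error patterns across variants point so strongly to a single misconception.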
Findings
There have been some surprises; for example, the most badly answered question on every S151 assessment (even more badly answered than questions on logs, exponentials or differential calculus) requires students to substitute values into a simple equation (no algebraic rearrangement is required) and to give their answer to an appropriate number of significant figures and with correct SI units. The most frequent error in this question is caused by students giving incorrect units, and more detailed analysis of the actual units given has led to the conclusion that many students are struggling because of lack of understanding of arithmetic with fractions.
This is just one example of the fact that errors are frequently caused by more basic misunderstandings than lecturers might imagine, and misconceptions in one area can fuel difficulties at a later stage. For example, students who have difficulty in finding and interpreting the gradient of straight-line graphs are likely to struggle with differential calculus.
Project outcomes
The insight gained from this analysis has led to changes in the assessment tasks and teaching in S151 itself, and to increased mathematical underpinning in the new courses S154 Science starts here! and S104 Exploring science.
Links
For more information about S151: Maths for Science go to http://www3.open.ac.uk/courses/bin/p12.dll?C01S151
For more information about the OpenMark eAssessment system go to http://www.open.ac.uk/openmarkexamples/index.shtml
A selection of COLMSCT and piCETL related papers, presentations and workshops, given by Sally Jordan, 2006-2010.
Publications and external conference contributions
Jordan, Sally (2007) The mathematical misconceptions of adult distance-learning science students. Proceedings of the CETL-MSOR Conference 2006, edited by David Green. Maths, Stats and OR Network, pp 87-92. ISBN 978-0-9555914-0-2.
Butcher, Philip and Jordan, Sally (2007) Interactive assessment in science at the Open University: 1975 – 2007. Invited oral presentation at ‘Computer-based assessment in the broader physical sciences’: a joint event hosted by the OpenCETL and the Physical Sciences Subject Centre, 26th April 2007.
Jordan, S., Brockbank, B. and Butcher, P. (2007) Extending the pedagogic role of online interactive assessment: providing feedback on short free-text responses. REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May 2007. Available at http://ewds.strath.ac.uk/REAP07
Jordan, Sally (2007) Computer based assessment with short free responses and tailored feedback. Proceedings of the Science Learning and Teaching Conference 2007, edited by Peter Goodhew. Higher Education Academy, pp 158-163. ISBN 978-1-905788-2.
Hudson, Ken and Jordan, Sally (2007) Practitioner scholarship can lead to institutional change – implementing interactive computer based assessment. Oral presentation at ISSOTL 2007, the International Society for the Scholarship of Teaching and learning, 4th Annual Conference, Sydney, 2nd-5th July 2007.
Jordan, Sally (2007) Assessment for learning; learning from Assessment? Oral presentation at Physics Higher Education Conference, Dublin, 6th-7th September 2007.
Stevens, Valda and Jordan, Sally (2008) Interactive online assessment with teaching feedback for open learners. Oral presentation at Assessment and Student Feedback workshop, Higher Education Academy Centre for ICS, London, April 2008.
Jordan, Sally (2008) eAssessment for student learning: short free-text questions with tailored feedback. Workshop at the University of Chester Staff Conference, May 2008.
Swithenby, Stephen and Jordan, Sally (2008) Supporting open learners by computer based assessment with short free-text responses and tailored feedback. Part of an invited symposium on ‘Matching technologies and pedagogies for supported open learning’ at the 6th International Conference on Education and Information Systems, Technologies and Applications, EISTA, Orlando, 29th June – 2nd July 2008.
Jordan, Sally (2008) Assessment for learning: pushing the boundaries of computer based assessment. Assessment in Higher Education Conference, University of Cumbria, July 2008.
Jordan, Sally (2008) Supporting distance learners with interactive screen experiments. Contributed oral presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) Online interactive assessment: short free text questions with tailored feedback. Contributed poster presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) E-assessment for learning? The potential of short free-text questions with tailored feedback. In invited symposium ‘Moving forward with e-assessment’ at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Butcher, Philip and Hunter, Arlene (2008) Online interactive assessment for open learning. Roundtable discussion at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Brockbank, Barbara, Jordan, Sally and Mitchell, Tom (2008) Investigating the use of short answer free-text eAssessment questions with tailored feedback. Poster presentation at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Brockbank, Barbara, Butcher, Philip and Mitchell, Tom (2008) Online assessment with tailored feedback as an aid to effective learning at a distance: including short free-text questions. Poster presentation at 16th Improving Student Learning Symposium, University of Durham, 1st-3rd September 2008.
Hatherly, Paul; Macdonald, John; Cayless, Paul and Jordan, Sally (2008) ISEs: a new resource for experimental physics. Workshop at the Physics Higher Education Conference, Edinburgh, 4th-5th September 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. OpenCETL Bulletin, 3, p4.
Jordan, Sally (2008) Online interactive assessment with short free-text questions and tailored feedback. New Directions, 4, 17-20.
Jordan, Sally (2009) Online interactive assessment in teaching science: a view from the Open University. Education in Science, Number 231, 16-17.
Jordan, Sally and Mitchell, Tom (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40, 2, 371-385.
Hatherly, Paul, Jordan, Sally and Cayless, Alan (2009) Interactive screen experiments – innovative virtual laboratories for distance learners. European Journal of Physics, 30, 751-762.
Butcher, P.G., Swithenby, S.J. and Jordan, S.E. (2009) E-Assessment and the independent learner. 23rd ICDE World Conference on Open Learning and Distance Education, 7-10 June 2009, Maastricht, The Netherlands.
Jordan, Sally. (2009) Assessment for learning: pushing the boundaries of computer based assessment. Practitioner Research in Higher Education, 3(1), pp 11-19. Available online at
http://194.81.189.19/ojs/index.php/prhe
Jordan, Sally. (2009) An investigation into the use of e-assessment to support student learning. Assessment in Higher Education Conference, University of Cumbria, 8th July 2009. Available online at http://www.cumbria.ac.uk/Services/CDLT/C-SHEN/Events/EventsArchive2009.aspx
Jordan, Sally and Brockbank, Barbara (2009) Online interactive assessment: short free text questions with tailored feedback. Oral presentation at GIREP-EPEC, August 2009.
Jordan, Sally and Butcher, Philip. (2009) Using e-assessment to support distance learners of science. Oral presentation at GIREP-EPEC, August 2009.
Hatherly, Paul, Jordan, Sally and Cayless, Alan. (2009) Interactive screen experiments – connecting distance learners to laboratory practice. Oral presentation at GIREP-EPEC, August 2009.
Butcher, P.G & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, 489-499. DOI: 10.1016/j.compedu.2010.02.012
Jordan, Sally. (2010). E-assessment for learning and learning from e-assessment: short-answer free text questions with tailored feedback. Presentation and workshop to HEA Physical Sciences Centre ‘The future of technology enhanced assessment’, Royal Society of Chemistry, Burlington House, London, 28th April 2010.
Jordan, Sally (2010) Short answer free text e-assessment questions with tailored feedback. Invited seminar to Human Computer Interaction group at the University of Sussex, 21st May 2010.
Jordan, Sally (2010) Maths for science for those with no previous qualifications: a view from the Open University. HEA Physical Sciences Centre ‘Maths for Scientists’ meeting, 26th May 2010.
Jordan, Sally (2010) Student engagement with e-assessment questions. Poster at the 2010 International Computer Assisted Assessment (CAA) Conference, Southampton, July 2010.
Jordan, S. and Butcher, P. (2010) Using e-assessment to support distance learners of science. In Physics Community and Cooperation: Selected Contributions from the GIREP-EPEC and PHEC 2009 International Conference, ed. D. Raine, C. Hurkett and L. Rogers. Leicester: Lulu/The Centre for Interdisciplinary Science. ISBN 978-1-4461-6219-4, pp 202-216.
Jordan, Sally (2010) Do we know what we mean by ‘quality’ in e-assessment? Roundtable discussion at EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference, Northumbria, September 2010.
Jordan, Sally, Butcher, Phil, Knight, Sarah and Smith, Ros (2010) ‘Your answer was not quite correct, try again’: Making online assessment and feedback work for learners. Workshop at ALT-C 2010 ‘Into something rich and strange – making sense of the sea change’, September 2010, Nottingham.
Jordan, Sally (2010) Using simple software to generate answer matching rules for short-answer e-assessment questions in physics and astronomy. Oral presentation at the Physics Higher Education Conference, University of Strathclyde, September 2010.
Butcher, P.G. & Jordan, S.E. (in press) Featured case study in JISC Effective Practice Guide, Summer 2010
Contributions to internal (OU) conferences, meetings and workshops
Jordan, Sally (2006) An analysis of science students’ mathematical misconceptions. Poster presentation at 1st OpenCETL Conference, 8th June 2006.
Jordan, Sally (2006) OpenMark – what’s all the fuss about? Lunchtime seminar at Cambridge Regional Centre, 1st November 2006.
Jordan, Sally (2007) Using interactive online assessment to support student learning. Faculty lunchtime seminar, 30th January 2007.
Jordan, Sally (2007) Issues and examples in online interactive assessment. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Assessment for learning; learning from assessment? Paper presented at the Curriculum, Teaching and Student Support Conference, 1st May 2007.
Jordan, Sally and Brockbank, Barbara (2007) Extending the pedagogic role of online interactive assessment: short answer free text questions. Paper presented at the Curriculum, Teaching and Student Support Conference, 2nd May 2007.
Jordan, Sally (2007) Investigating the use of short answer free text questions in online interactive assessment. Presentation at the Science Staff Tutor Group residential meeting, 9th May 2007.
Jordan, Sally (2007) OpenMark: online interactive workshop. Workshop run at AL Staff Development meeting in Canterbury, 12th May 2007.
Brockbank, Barbara, Jordan, Sally and Butcher, Phil (2007) Investigating the use of short answer free text questions for online interactive assessment. Poster presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions: learning from online assessment. Oral presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Using interactive screen experiments in our teaching: the S104 experience and The Maths Skills ebook. Demonstrations at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally, Ekins, Judy and Hunter, Arlene (2007) eAssessment for learning?: the importance of feedback. Symposium at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally, Brockbank, Barbara and Butcher, Phil (2007) Authoring short answer free text questions for online interactive assessment: have a go! Workshop at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally (2008) Investigating the use of short answer free-text questions in online interactive assessment. Oral presentation to EATING (Education and Technology Interest Group), 17th January 2008.
Jordan, Sally (2008) Investigating the use of short free-text eAssessment questions. Oral presentation to ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally (2008) Writing short free-text eAssessment questions: have a go! Workshop at ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally and Brockbank, Barbara (2008) Writing free text questions for online assessment: have a go! Workshop at the Open University Conference, 29th and 30th April 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. Science Faculty Newsletter, May 2008.
Jordan, Sally and Johnson, Paul (2008) E-assessment opportunities: using free text questions and others. Science Faculty lunchtime seminar followed by workshop, 16th July 2008.
Jordan, Sally and Datta, Saroj (2008) Presentation on Open University use of Interactive Screen Experiments at ISE Launch event, 19th September 2008.
Jordan, Sally, Butler, Diane and Hatherly, Paul (2008) CETL impact: an S104 case study. Series of linked presentations at 3rd OpenCETL Conference, September 2008. [reported in OpenHouse, Feb/March 2009, ‘S104 puts projects into practice’, p4]
Jordan, Sally and Johnson, Paul (2008) Using free text e-assessment questions. Science Faculty lunchtime seminar followed by workshop, 26th November 2008.
Butcher, Phil, Jordan, Sally and Whitelock, Denise (2009) Learn About Formative e-Assessment. IET EPD Learn About Guide.
Butcher, Phil and Jordan, Sally (2009) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 21st January 2009.
Jordan, Sally (2009) E-assessment to support student learning: an investigation into different models of use. Paper presented at Making Connections Conference, 2nd-3rd June 2009.
Jordan, Sally (2009) (ed) Compilation of interim and final reports on Open University Physics Innovations CETL projects: Assessment.
Butcher, Phil and Jordan, Sally (2009) A comparison of human and computer marking of short-answer free-text student responses. Presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2009) Interpreting the iCMA statistics. Presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Nix, Ingrid, Waights, Verina, Bolton, John and Butcher, Phil (2009) From CETL to course team : embedding iCMA initiatives. Workshop at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Butler, Diane, Hatherly, Paul and Stevens, Valda (2009). From CETL to Course Team: CETL-led initiatives in S104 Exploring Science. Poster presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally (2009) Student engagement for e-assessment. Poster presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2010) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 10th February 2010.
Jordan, Sally (2010) Workshops on using OpenMark PMatch to write short-answer free-text questions, 19th January 2010 and 17th February 2010.
Jordan, Sally, Nix, Ingrid, Wyllie, Ali, Waights, Verina. (2010) Science Faculty lunchtime forum, March 25th 2010.
Jordan, Sally (2010) e-Assessment. In e-Learning in Action: projects and outcomes from the Physics Innovations CETL, pp16-20.
Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.
I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning in Mathematics, Science, Computing and Technology).
Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
I was a member of the course teams producing S104: Exploring Science (presented for the first time in February 2008) and S154: Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University’s OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151: Maths for Science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the Course Team of SXR103: Practising Science for many years.
My COLMSCT and piCETL projects reflect my interests in eAssessment and science students’ mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students’ misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate ways in which iCMAs can most effectively be used to support student learning. Analysis of students’ responses to eAssessment questions provides an opportunity for teachers to learn more about students’ misunderstandings. I have used this methodology to learn more about science students’ mathematical misconceptions. I have developed a ‘Maths Skills eBook’ which is used as a resource on many Open University science courses and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE) developed with piCETL funding for use by S104 students who are not able to conduct one of the course’s home experiments for themselves.
In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.
I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.
Sally Jordan, Open University
S.E.Jordan@open.ac.uk
The primary aim of this project was to design, implement and evaluate a software tool for helping students learn and revise diagramming skills for entity-relationship diagrams (ERDs), a crucial component of any database course. The tool, if successful, would be employed on the Computing Department’s third level database course, M359, from 2008. At the heart of the tool is an automatic marker for ERDs being developed as a separate research project within the Computing Department. The automatic marker not only awards a grade for a diagram but also provides information useful for feedback purposes.
During 2006, a revision tool known as ‘The Exerciser’ was designed and implemented. It has embedded within it a number of typical TMA/Exam questions which students can use to practise their data modelling skills. It was evaluated by 30 volunteer M358 students (M358 is an earlier database course that was replaced by M359 in 2007). The feedback from students was extremely positive, and many suggestions for improvement were made. The tool was revised and re-tested with the same students at about the time they were revising for their final examination and was found to be an improvement. All the students felt that the tool would be a useful addition to the course.
Following some minor modifications and bug-fixing, the tool was offered, for testing, in 2007 to the first cohort of M359 students. While there was a lot of interest, only about 15 students provided feedback. Nevertheless, the feedback was extremely useful because these students clearly had higher expectations of the tool’s capabilities than the M358 students. Once again, the tool was revised extensively, mainly by the addition of functionality, and re-evaluated. The student feedback was again very positive. After some minor corrections, the tool was felt to be in a good enough state to be offered to students in 2008. As a precursor, the tool is currently with LTS for testing and will be deployed for use in conjunction with TMA03, where ERDs are assessed.
A secondary aim of the project was to investigate the extent to which the diagramming tool could be applied to a wider range of diagrams than just ERDs. ERDs are an example of more general graph-based diagrams, and it seemed reasonable to suppose that the general approach adopted in the tool would be applicable in other domains where graph-based diagrams are utilised to show associations between objects. In the event, only one other type of diagram, the biological flow diagram, was investigated. The revision tool was amended to deal with the slightly different notations used in biological flow diagrams and an example TMA question was coded. The experiment showed that the revision tool can be successfully applied in another domain provided that the notation can be adapted and that diagrams in that domain are fundamentally used to show associations between objects. Since the end of the project, the approach has been applied to sequence diagrams (another modelling technique in Computing).
Several papers have been published on the tool and the automatic marker and some interest external to the OU has been received. We are currently in discussion with a UKFE institution for use of the tool (one of the student testers recommended the use of the tool to her academic department).
Overall, this has been a very successful project. A useful tool has been developed and will be employed on an OU course. The feasibility of adapting the tool to other domains has also been established.
The 'Final Project Report' and other deliverables can be accessed from Related Resources on the right hand side of this page.
A selection of COLMSCT related dissemination activities to October 2008.
Thomas, P., Smith, N., Waugh, K. (2008) Automatically assessing graph-based diagrams. Learning, Media and Technology, Volume 33 Issue 3 pp 249-267
Thomas, P. G., K. Waugh, N. Smith (2007) Learning and automatically assessing graph-based diagrams. Paper presented at ALT-C: Beyond Control, Nottingham, UK, Sep 2007
Thomas, P. G., K. Waugh and N. Smith (2007) Computer Assisted Assessment of Diagrams. Paper presented at ITiCSE, Dundee, Scotland, June 2007
Thomas, P., K. Waugh, and N. Smith (2007) Tools for supporting the teaching and learning of data modelling. Paper presented at the ED-MEDIA International Conference, Vancouver, Canada, June, 2007
I am a Senior Lecturer in the Faculty of Maths and Computing, and was formerly Dean of the Faculty of Maths and Computing.
I led the MZX Group that researched, in the latter half of the 1990s, teaching and learning via the Internet and supporting students and tutors in Internet hosted learning environments. This group performed the original research and development of the eTMA system now used by the OU. Until recently I was chair of the University’s Electronic Assessment Project Board that oversaw the implementation and development of the eTMA system.
In the last ten years I led the AESOP (An Electronic Student Observatory) project which examined how students learn to program at a distance, and the EAP (Electronic Assessment) project which followed on from the MZX work looking at electronic remote examinations, their production, presentation and marking.
Most recently I have been researching the automatic marking of free-form answers to examination questions in Computing. This has led to investigations into the automatic marking of diagrams (in examinations and assessments). I supervise a number of research students in this and related fields.
In terms of teaching, I am currently Chair of a postgraduate course in Requirements Engineering (M883); prior to that I chaired M301, Software Systems and their Development. My main teaching interests are in programming, software engineering and operating systems.
I currently hold the Mawby Visiting Fellowship at Kellogg College, Oxford where, with colleagues in the Department for Continuing Education, I am researching aspects of e-Learning in Computing.
Pete Thomas, COLMSCT Teaching Fellow
P.G.Thomas@open.ac.uk
From 2007, the OU has required its students to be online, and it has also adopted the Moodle VLE. This increases the feasibility of implementing electronic assessment. In order to improve retention on Level 1 Open University mathematics courses, this COLMSCT project involved piloting short interactive internet quizzes. The OU package OpenMark was used, enabling students to receive instant feedback, whereas previously they had to wait days or weeks. Students are allowed several attempts at each question, with appropriate teaching feedback after each attempt. At the end of each quiz, alongside the mark, relevant study advice is given to the student, including references to appropriate course material. A hint facility was also introduced for students who were unable to start a question. OpenMark has a variety of question types and is being integrated into the Moodle VLE, and so will be open source.
Administrators can see all student attempts, and overall question mean scores. The statistics can also help both in modifying questions and their feedback and in informing future initiatives. The quizzes have been evaluated using a user feedback question at the end of every quiz, the OpenMark administrators’ reports and video of current students “thinking aloud” whilst attempting the quizzes.
User feedback suggests that the quizzes are enjoyable as well as helpful to student learning. It is hoped that this increased motivation and aid to learning will improve student retention.
Authoring and programming of quiz questions is time-consuming. However, there is built-in variation, so that questions may appear in different guises (e.g. parameters take different values) for subsequent users and repeat attempts. Hence, once authored, the quizzes need little subsequent attention.
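The built-in variation described above can be sketched in a few lines. The generator and marker below are hypothetical illustrations of the idea (a seeded question template whose parameters vary between attempts), not the actual OpenMark implementation:

```python
import random

def make_question(seed=None):
    """Generate one variant of a simple quiz question.

    Each attempt draws fresh parameter values, so repeat users
    see the same question in a different guise.
    """
    rng = random.Random(seed)
    a, b = rng.randint(2, 9), rng.randint(2, 9)
    text = f"What is {a} x {b}?"
    answer = a * b
    return text, answer

def check(response, answer):
    """Return True if the student's response matches the expected answer."""
    try:
        return int(response.strip()) == answer
    except ValueError:
        return False

text, answer = make_question(seed=1)
print(text)  # a reproducible variant for seed 1
```

Because the variation is driven by the seed, the same template can serve both practice (a new variant every time) and summative use (a fixed variant per student), which is why the quizzes need little attention once authored.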
There is much interest in the Faculty in this project and several other course teams are starting to run pilots of their own.
The quizzes associated with preparatory work for the level 1 course MU120 will soon be available on OpenLearn, the OU’s open content website.
The 'Final Project Report' and other deliverables can be accessed from Related Resources on the right hand side of this page.
Readers might like to try the following iCMAs:
A selection of COLMSCT related dissemination activities to October 2008.
This project has investigated the use of computer-aided assessment for checking and providing instantaneous feedback on questions requiring short free text answers, typically a sentence in length. The questions were initially written using software provided by Intelligent Assessment Technologies Ltd (IAT). This software uses natural language processing (NLP) techniques of information extraction, but an authoring tool is provided to shield the question author from the complexities of NLP. The IAT software was used with the Open University’s OpenMark e-assessment system, thus enabling students to be offered three attempts at each question with increasing feedback after each attempt. Feedback on incomplete and incorrect responses was provided from within the IAT authoring tool, using a flagging system developed during the project.
A bank of 82 questions was written, suitable for use on level 1 Science Faculty courses. Most of the questions were originally offered in a series of purely formative, optional interactive computer marked assignments (iCMAs) to students on S103 Discovering Science (final presentation started October 2007), and the answer matching was improved in the light of these students’ responses to the questions. Fifteen of the questions are now in use in regular lightly-weighted summative iCMAs on S104 Exploring Science (first presentation started February 2008), with a further 11 questions in use on the practice iCMAs of S154 Science Starts Here and SXR103 Practising Science.
An example question is shown below. To try short-answer free-text questions for yourself click on 'useful links'.

After the initial training phase, we were able to write short-answer free-text questions and appropriate answer matching with relative ease. The time spent on the initial writing of a short-answer question and its answer matching has varied between a few minutes and several hours, depending on the complexity of the question. The time spent in amending the question and the answer matching in the light of student responses has been even more dependent on the complexity of the question, taking several days for some questions. The impressive features of the IAT system’s answer matching include its ability to recognise negatives and double negatives and to cope with responses where the word order is significant.
A comparison of human and computer marking of short free-text student responses in summer 2008 (see 'Evaluation' and the related project 'eAssessment questions with short free text responses: using computational algorithms') showed that it was possible to obtain equally impressive figures for accuracy of answer-matching using OpenMark’s own algorithmically-based answer matching software, ‘PMatch’. By ‘algorithmically-based’ we mean that the answer-matching is based on a number of simple rules. However the answer matching is not simply looking for keywords – word-order often matters (‘The cat sat on the mat’ has a different meaning to ‘The mat sat on the cat’) and the presence or absence of negation (‘not’ etc.) can be important too. Most significantly, our answer matching is based on many thousands of responses from real students, who answer the questions in both expected and unexpected ways. Sometimes their answers have led us to refine the question itself in addition to improving the answer matching.
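The kind of rule-based matching described above, where word order and negation matter rather than mere keyword presence, can be illustrated with a toy matcher. This is emphatically not PMatch, just a minimal sketch of the general idea:

```python
import re

def toy_match(response, keywords, ordered=True, forbid_negation=False):
    """Toy rule-based answer matching: require that all keywords appear
    (optionally in the given order) and, if asked, that the response
    is not negated. Illustrative only -- not the PMatch algorithm."""
    words = re.findall(r"[a-z]+", response.lower())
    if forbid_negation and any(w in ("not", "no", "never") for w in words):
        return False
    positions = []
    for kw in keywords:
        if kw not in words:
            return False
        positions.append(words.index(kw))
    # With ordered=True, 'cat ... mat' is accepted but 'mat ... cat' is not.
    return positions == sorted(positions) if ordered else True

print(toy_match("The cat sat on the mat", ["cat", "mat"]))  # True
print(toy_match("The mat sat on the cat", ["cat", "mat"]))  # False
```

Even this crude version distinguishes ‘The cat sat on the mat’ from ‘The mat sat on the cat’; production systems such as PMatch add spelling tolerance, synonym sets and proximity rules on top of rules like these.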
Since summer 2008, PMatch has been refined further and an additional spell-checker has been added. We have also conducted additional trials to ensure that PMatch really is providing accurate answer matching, and from summer 2010 all short-answer free-text questions will use PMatch. Transferring the answer matching from one system to the other has also enabled us to add additional targeted feedback and so to maximise the learning that students derive from these questions.
This project is one of several projects funded by the Open University’s Centres for Excellence in Teaching and Learning (CETLs) that have had direct impact on S104’s teaching and assessment strategies, illustrating the ability of the CETLs to enable effective practitioner-led research.
E-assessment enables feedback to be delivered instantaneously. This provides an opportunity for students to take immediate action to ‘close the gap’ between their current level and a reference point, and thus for the feedback to be effective (Ramaprasad, 1983; Sadler, 1989). However concern has been expressed that conventional e-assessment tasks can encourage a surface approach to learning (Scouller & Prosser, 1994; Gibbs, 2006).
Assessment items can be broadly classified as selected response (for example multiple-choice) or constructed response (for example short-answer). Short-answer constructed response items require the respondent to construct a response in natural language and to do so without the benefit of any prompts in the question. This implies a different form of cognitive processing and memory retrieval when compared with selected response items (Nicol, 2007). Short-answer constructed response items are highly valued in traditional paper-based assessment and learning, but have been almost completely absent from computer-based assessment due to limitations and perceived limitations in computerised marking technology.
Software for the marking of free-text answers
Perhaps the most well-known system for the e-assessment of free text is e-rater (Attali & Burstein, 2006), an automatic essay scoring system employing a holistic scoring approach. The system is able to correlate human reader scores with automatically extracted linguistic features, and provide an agreement rate of over 97% for domains where grading is concerned more with writing style than with content. A different technique which shows high promise is that of Latent Semantic Analysis (LSA) (Landauer, Foltz & Laham, 2003). LSA has been applied to essay grading, and high agreement levels obtained. These techniques are more suited to marking essays than short-answer questions, since they focus on metrics which broadly correlate with writing style, augmented with aggregate measures of vocabulary usage. Computerised marking of short-answer questions on the other hand, is concerned with marking for content above all else.
C-rater is a short-answer marking engine developed by Educational Testing Service (ETS) (Leacock & Chodorow, 2003). The system represents correct (i.e. model) answers using ‘canonical representations’, which attempt to represent the knowledge contained within an answer, normalised for syntactic variations, pronoun references, morphological variations, and the use of synonyms. Reported agreement with human markers is of the order of 85%.
In the UK, Pulman and Sukkarieh (2005) used information extraction techniques for marking short-answer questions. Their system can be configured in a ‘knowledge engineering’ mode, where the information extraction patterns are discovered by a human expert, and a ‘machine learning’ mode, where the patterns are learned by the software. The ‘knowledge engineering’ approach is more accurate and requires less training data, but it demands considerable skill and time from a knowledge engineer.
The software developed by Intelligent Assessment Technologies and used by the Open University is most closely related to the system developed by Pulman and Sukkarieh, in that it borrows from information extraction techniques. The main strength of the IAT system is that it provides an authoring tool which enables a question author with no knowledge of natural language processing (NLP) to use the software.
E-assessment at the Open University : Use of IAT questions within OpenMark and Moodle
The Open University was set up in 1969 to provide degree programmes by distance learning. There are no formal entrance requirements for undergraduate courses. Students are usually adults and most are in employment and/or have other responsibilities, so study part-time. Most students are allocated to a personal tutor, but opportunities for face-to-face contact are limited, and although support by telephone is encouraged and increasing use is made of email and online forums, there remain a few students for whom communication with tutors is particularly difficult. Extensive use is made of tutor-marked assignments and tutors are encouraged to return these quickly, but in the lonely world of the distance learner, instantaneous feedback on online assessment tasks provides a way of simulating for the student ‘a tutor at their elbow’ (Ross, Jordan & Butcher, 2006, p.125). ‘Little and often’ assignments can be incorporated at regular intervals throughout the course, assisting students to allocate appropriate amounts of time and effort to the most important aspects of the course (Gibbs & Simpson, 2004). Finally, the Open University is making increasing use of e-learning, so e-assessment is a natural partner (Mackenzie, 2003), providing alignment of teaching and assessment modes (Gipps, 2005).
In order to provide Open University students with useful instantaneous feedback, the IAT short-answer questions are embedded within the OpenMark assessment system, which was developed by the Open University but is now open source. OpenMark provides a range of other question types allowing for the free-text entry of numbers, scientific units, simple algebraic expressions and single words as well as drag-and-drop, hotspot, multiple-choice and multiple-response questions. However the significant feature for the current project is OpenMark’s ability to provide students with multiple attempts at each question, with the amount of feedback increasing at each attempt. If the questions are used summatively, the mark awarded decreases after each attempt, but the presence of multiple attempts with increasing feedback remains a feature. Thus, even in summative use, the focus is on assessment for learning. At the first attempt an incorrect response will result in very brief feedback, designed to give the student the opportunity to correct their answer with the minimum of assistance. If the student’s response is still incorrect or incomplete at the second attempt, they will receive a more detailed hint, wherever possible tailored to the misunderstanding which has led to the error and with a reference to the course material. After a third unsuccessful attempt, or whenever a correct answer has been given, the student will receive a model answer.
The OpenMark e-assessment system sits within the Moodle virtual learning environment (Butcher, 2008). The Moodle Gradebook enables students to monitor their own progress, encouraging sustainable self-assessment practices (Boud, 2000). The tutor’s view of the Gradebook encourages dialogue between student and tutor (Nicol & Milligan, 2006).
More information about Intelligent Assessment Technologies Ltd. and their products can be viewed at http://www.intelligentassessment.com/
More information about the OpenMark eAssessment system and the range of question types available can be viewed at http://www.open.ac.uk/openmarkexamples/index.shtml
The evaluation of the project has included an investigation into student perceptions of questions of this type and their use of the feedback provided; a comparison of the computer's marking with that of human markers and a comparison with the marking of two algorithmically based computerised systems.
The questions have been well received by students, with around 90% of S103 students who completed a feedback question reporting that they enjoyed them and around 90% also reporting that they found the feedback useful. Six S103 students were observed attempting the questions in the Institute for Educational Technology (IET)’s usability laboratory. These students were observed to enter their answers in very different ways and whilst some made good use of the feedback provided, others appeared not really to engage with it. Responses to the questions in summative use have been found to be more likely to be correct, more likely to be expressed in a sentence and longer, with some excessively long responses in multiple sentences resulting in the imposition of a filter to disallow responses of more than 20 words. Students seem to engage with the feedback provided in a more meaningful way when the questions carry some weight; unfortunately many also become preoccupied with the minutiae of the marking.
The computer’s marking of responses to seven questions was compared with the marking of six S103 tutors. Chi-squared tests showed that for four of the seven questions, the marking of all the markers (including the computer system) was indistinguishable at the 1% level. For the other three questions, the markers were marking in a way that was significantly different. However in all cases, the mean mark allocated by the computer system was within the range of means allocated by the human markers. Where there were discrepancies in marking, the majority were as a result of the variation in the human marking. On some occasions one tutor consistently marked in a way that was different from the others; on other occasions an individual marked inconsistently. Divergence of human marking could frequently be attributed to insufficient detail in the marking guidelines or to uncertainty over whether to give credit for a partially correct solution. However, there were also some errors caused by slips and by poor subject knowledge or understanding. The marking of the computer system was in agreement with that of the question author for between 89.9% and 99.5% of the responses to the seven questions. Further improvements have been made to the answer matching since the human-computer marking comparison took place in June 2007, and in July 2008, the marking of a new batch of responses was found to be in agreement with the question author for between 97.5% and 99.6% of the responses. The most difficult responses for the IAT system (and indeed for any computer-based system) to match accurately are those that include both a correct and an incorrect answer.
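A chi-squared test of the kind used above treats each marker’s tallies of correct and incorrect judgements over the same batch of responses as rows of a contingency table. The sketch below uses invented counts purely for illustration (the actual study compared six tutors and the computer over seven questions):

```python
def chi_squared(table):
    """Pearson chi-squared statistic for an r x c contingency table."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = rows[i] * cols[j] / total
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical counts of (marked correct, marked incorrect) per marker.
table = [
    [80, 20],  # tutor A
    [78, 22],  # tutor B
    [83, 17],  # computer
]
stat = chi_squared(table)
# df = (3 - 1) * (2 - 1) = 2; the 1% critical value for 2 df is about 9.21.
print(stat < 9.21)  # True: these markers are indistinguishable at 1%
```

A statistic below the critical value, as here, is what ‘indistinguishable at the 1% level’ means in the paragraph above; a larger statistic would indicate that at least one marker behaves differently from the rest.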
A comparison with the marking of two algorithmically-based systems (OpenMark’s own response matching routine and Regular Expressions) led to the surprising result that OpenMark’s response matching routine, even in the hands of a relatively inexperienced undergraduate and for a relatively short period of time, appeared to be able to provide answer matching on a par with that developed with the assistance of the IAT authoring tool. OpenMark’s response matching routine is not a simple ‘bag of words’ system; it can cope with inaccuracies in spelling and with negation, and the order and proximity of words can be specified. It is important to note that, whether an NLP-based system or an algorithmically-based system is used, the fact that responses from real students are used in developing the answer matching appears to be a significant feature in developing answer matching that is, generally, more accurate and reliable than human markers.
A selection of COLMSCT and piCETL related papers, presentations and workshops given by Sally Jordan, 2006-2010
Publications and external conference contributions
Contributions to internal (OU) conferences, meetings and workshops
A selection of COLMSCT related papers, presentations, workshops and talks given by Barbara Brockbank to April 2009.
Elsewhere on this website Sally Jordan describes the creation and evaluation of a range of questions which were designed to require students to respond using short-answer free-text responses of up to 20 words. The resulting responses were marked by humans and by a computational linguistics system, Intelligent Assessment Technologies’ free-text marking software, and the accuracy of the outcomes compared. The comparison showed that the computer could perform at a level equivalent to human markers. This project extends this work to encompass two computer marking solutions which are more readily available.
Most purpose-built Computer-Assisted Assessment (CAA) systems and general-purpose Virtual Learning Environments provide some form of question type for matching short-answer free-text responses. Typically these are quite limited in their aims and are best suited to matching a word or two, taking little account of misspellings or synonyms. Gradually, however, a range of more powerful response matching algorithms is becoming more widely available. These can be classified into two distinct groups: those that use some form of computational linguistics and those that are based on algorithmic manipulation of keywords.
Sally Jordan’s work with Intelligent Assessment Technology’s Free Text system was implemented through the Open University’s OpenMark CAA system, and OpenMark enabled this project to undertake comparison studies with two alternative systems that use algorithmic manipulation of keywords: OpenMark’s internal response-matching algorithm, PMatch, and Java’s Regular Expressions.
OpenMark’s response-matching algorithm originated in the Computer-Based Learning Unit at Leeds University in the 1970s and has been further developed in recent years as part of this COLMSCT project. A description of the pattern-matching routine, PMatch, can be found under Documents on the right. Regular expressions are found in computer languages such as Java and PHP, and recently Joseph Rezeau has provided the Moodle VLE with a question type that uses Regular Expressions. Both of these response-matching systems rely on computational algorithms for their efficacy, but the algorithms contain no knowledge of grammar or syntax.
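As an illustrative sketch of the keyword-based approach (this is not PMatch itself; the question and pattern are invented), a single regular expression can accept synonyms and spelling variants of an expected answer while containing no knowledge of grammar or syntax:

```java
import java.util.regex.Pattern;

// Illustrative sketch of keyword-based response matching (not PMatch).
// A hypothetical question asks: "In what state is water at 110 degrees
// Celsius?" The pattern accepts "gas"/"gaseous", "steam", and both
// spellings "vapour"/"vapor" via alternation and an optional letter.
public class KeywordMatcher {

    // Case-insensitive match anywhere in the response; \b guards against
    // accidental matches inside longer words (e.g. "gash").
    private static final Pattern ACCEPT =
            Pattern.compile("\\b(gas(eous)?|vapou?r|steam)\\b",
                            Pattern.CASE_INSENSITIVE);

    public static boolean isCorrect(String response) {
        return ACCEPT.matcher(response).find();
    }

    public static void main(String[] args) {
        System.out.println(isCorrect("It would be a gas"));      // true
        System.out.println(isCorrect("water vapour, I think"));  // true
        System.out.println(isCorrect("it stays liquid"));        // false
    }
}
```

The same limitation noted above applies: the pattern would also accept “it is definitely not a gas”, which is why setting questions that avoid over-open responses matters as much as the matching itself.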
This project used the data sets of student responses collected by Sally Jordan and employed a student to undertake comparative studies using OpenMark and Regular Expressions response matching. Early results on data sets of around 100-200 responses were surprisingly good, with both the OpenMark and Regular Expressions algorithms achieving success rates on a par with the IAT system and with human markers.
The results have now been published. Please see Philip G Butcher and Sally E Jordan (2010), A comparison of human and computer marking of short free-text student responses, Computers & Education, Volume 55, Issue 2, September 2010, Pages 489-499.
An extended study has also been performed in parallel with Sally Jordan’s project using larger (>1,000) datasets of student responses. The results are in line with those reported in the Computers & Education paper.
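The comparisons reported here can be thought of as an agreement rate between two markers over the same set of responses. The sketch below uses invented data purely to illustrate the calculation; the published study compared human markers with the IAT, PMatch and Regular Expressions systems over real student responses:

```java
// Minimal sketch of a marker-agreement calculation over a shared set of
// responses. The mark lists here are invented for illustration only.
public class MarkerAgreement {

    // Fraction of responses on which two markers give the same verdict.
    public static double agreement(boolean[] markerA, boolean[] markerB) {
        if (markerA.length != markerB.length) {
            throw new IllegalArgumentException("mark lists differ in length");
        }
        int same = 0;
        for (int i = 0; i < markerA.length; i++) {
            if (markerA[i] == markerB[i]) same++;
        }
        return (double) same / markerA.length;
    }

    public static void main(String[] args) {
        boolean[] human    = {true, true, false, true,  false};
        boolean[] computer = {true, true, false, false, false};
        System.out.println(agreement(human, computer)); // 0.8
    }
}
```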
A demonstration iCMA with responses to seven questions marked by OpenMark's response matching algorithm is available on the right.
I have worked in computer-based learning since 1974 and during that time have performed most roles encompassing programmer, author, systems designer, researcher, educational evaluator, manager and acting head of department. I wrote my first interactive CBL program at Leeds University in 1974, my first for the OU in 1975 and moved from Leeds to the OU in 1977. As well as overseeing the COLMSCT iCMA initiative I have also continued to maintain my software skills and on the free-text marking project I have been active both in enhancing the OpenMark response matching algorithm and using it to analyse students’ responses.
During my long, but varied, career at the OU, I have been on numerous course teams that have advanced the use of interactive computers in the OU’s teaching and learning practices, and 30 years on the pace of change doesn’t slacken. That the COLMSCT CETL has chosen to address online assessment as a major theme illustrates that increasingly sophisticated application of computers in education continues to make major in-roads into established practices.
Here are a few highlights that led to my current fellowship in COLMSCT. The Works Metallurgist (with Adrian Demaid for TS251) in 1980 was the first OU CAL program to use interactive graphics and The Teddy Bear Videodisc (with Keith Williams (Technology) and Martin Wright (BBC) for T252) in 1984 was the first to integrate full-screen video and interactive computing. In 1986 I joined the T102 Living with Technology course team that introduced personal computing into OU first level courses in 1989; in 1996 T102 became the first non-ICT undergraduate course to utilise conferencing and e-mail on a large scale. From 1996 I managed the introduction of multimedia computing into the Science Faculty with S103 Discovering Science becoming the first course to replace paper-based CMAs with interactive questions delivered through a modern multimedia computer; this was the start of the development of the OpenMark interactive assessment system. In 2002 OpenMark was moved online to support S151 Maths for Science and since 2005 has been integrated ever more closely with the OU’s VLE developments. OpenMark is the system that has enabled the creation of the diverse COLMSCT iCMA initiative projects shown on this site.
I have an MPhil in Computer-Based Education from the University of Leeds. I guess I thought that this was ‘normal’ until I heard Tim O’Shea (also ex. Computer-Based Learning at Leeds) describe the impact of Leeds in this area and how, in our different ways, he and I had brought that approach to the OU. And having made me think about it I can see he has a point. In my formative CBL years it was made clear to me by Roger Hartley (then Director of the Leeds CBL Unit) that it’s not only the student who should work hard at CBL but also the authors and at run-time the computer. In recent years my role as the COLMSCT iCMA initiative coordinator has put me in a prime position to continue this tradition while at the same time helping my COLMSCT colleagues deliver their ideas across the internet.
A demonstration iCMA, showing seven of our questions as offered to students and marked by OpenMark's response matching algorithm is available for viewing at https://students.open.ac.uk/openmark/omdemo.pm2009/
For more information about OpenMark's pattern matching algorithm, PMatch, please see the entry on the right.
These links are only accessible to Open University staff. To view the full version of this project's final report please see: http://learn.open.ac.uk/file.php/2448/PMatch_report/P_G_Butcher_PMatch_final_report_for_use_internal_to_the_OU.doc
To view the related published paper please see: http://learn.open.ac.uk/file.php/2448/PMatch_report/CAE1579_published_paper.pdf
This project is funded by the E-Assessment for Learning Initiative. This work aimed to consolidate and improve students’ ability to work with UML diagrams, which forms a key aspect of the course M256 'Software development with Java' and of professional software development. It centred around the implementation of two short quizzes with interactive feedback, in which students would be required to interpret and amend/complete UML diagrams.
Implementation of the quizzes was not straightforward since they involve students ‘drawing’ on screen, and this was not explicitly supported by the technology (OpenMark). This meant that the quiz questions had to be less complex than originally envisaged. Nevertheless, the findings so far indicate that such quizzes play a very useful role as revision tools, and in consolidating understanding and measuring progress.
Readers might like to try the following iCMAs:
UML diagrams quiz 1
https://students.open.ac.uk/openmark/m256-09.quiz1world/
UML diagrams quiz 2
https://students.open.ac.uk/openmark/m256-09.quiz2world/
Sarah Mattingly, Faculty of Maths, Computing and Technology
S.L.Mattingly@open.ac.uk
The Initiative has been set up to bring Faculties together to accelerate the adoption of online computer-based assessment in their courses and programmes. The fundamental premise is that computer-based assessment with feedback is an underused pedagogic strategy at the OU and that we should build internal awareness and academic capability in order to improve the learning opportunities we offer to students. Fuller academic engagement will underpin our aspirations to provide a national lead in this form of e-learning.
The E-Assessment for Learning Initiative aims to bridge the gap between academic aspirations and systems development. It supports thirteen sub-projects from across the University. In each, academic staff undertake innovative e-assessment implementation projects within their own course context and work under the CETL and VLE umbrella of activities in order to explore and inform future practice and in particular the VLE.
An overview and review of the iCMA initiative
Phil Butcher, Coordinator of the iCMA initiative and COLMSCT Teaching Fellow
The interactive computer marked assessment (iCMA) initiative was conceived in 2005 on the premise that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The initiative followed closely on, and has been enabled by, the 2005 upgrade to the university’s OpenMark CAA system. This upgrade had been built on modern internet technologies and provided a platform which was able to harness the full potential of a modern multimedia computer.
Sally Jordan, COLMSCT Teaching Fellow
This project, with funding from both COLMSCT and piCETL, has investigated the way in which students engage with interactive computer-marked assignments (iCMAs). This is a huge task and one that is far from complete. However this page gives a flavour of the methodology and reports some early findings.
Verina Waights and Ali Wyllie
Nurses are required to make clinical decisions about patients' health and well-being, responding to changes in each patient's condition, which may occur within very small time-frames.
Arlene Hunter, COLMSCT Teaching Fellow
The primary aim of this project was to develop and then implement an online interactive formative assessment framework, designed from a constructivist and interventionist perspective that would promote student engagement and understanding of academic progression from an extrinsic as well as intrinsic perspective.
Ali Wyllie and Verina Waights, COLMSCT Teaching Fellows
This project set out to investigate the use of an online decision-making maze tool as a form of eAssessment task that is more motivating and relevant to practice-based students.
Michael Isherwood
A major concern of the M150 Course Team is Block 2 - JavaScript programming.
Ian Cooke
To produce e-tutorial teaching and support modules for M150
Sally Jordan (COLMSCT Teaching Fellow) and Barbara Brockbank (COLMSCT Associate Teaching Fellow)
This project has investigated the use of computer-aided assessment for checking and providing instantaneous feedback on questions requiring short free text answers, typically a sentence in length. The questions were initially written using software provided by Intelligent Assessment Technologies Ltd (IAT). This software uses natural language processing (NLP) techniques of information extraction, but an authoring tool is provided to shield the question author from the complexities of NLP. The IAT software was used with the Open University’s OpenMark e-assessment system, thus enabling students to be offered three attempts at each question with increasing feedback after each attempt. Feedback on incomplete and incorrect responses was provided from within the IAT authoring tool, using a flagging system developed during the project.
Phil Butcher, COLMSCT Teaching Fellow
Elsewhere on this website Sally Jordan describes the creation and evaluation of a range of questions which were designed to require students to respond using short-answer free-text responses of up to 20 words. The resulting responses were marked by humans and a computational linguistics algorithm, Intelligent Assessment Technology’s Free Text system, and the accuracy of the outcomes compared. The comparison showed that the computer could perform at a level equivalent to human markers. This project extends this work to encompass two computer marking solutions which are more readily available.
Ken Hudson, COLMSCT Associate Teaching Fellow
The aim of this project is to explore the value of online virtual experiments in enhancing student understanding of course material, retention and employability skills. A 'virtual lab' is defined as an 'e-learning activity based on conventional laboratory procedures, but delivered on-line to distance learners, to give them a more real experience of biological material, procedures and applicability, normally absent from a paper-based course'.
Sarah Mattingly, Faculty of Maths, Computing and Technology
This project is funded by the E-Assessment for Learning Initiative. This work aimed to consolidate and improve students’ ability to work with UML diagrams, which forms a key aspect of the course M256 'Software development with Java' and of professional software development. It centred around the implementation of two short quizzes with interactive feedback, in which students would be required to interpret and amend/complete UML diagrams.
Sarah North, Faculty of Education and Language Studies
This project is being funded under the E-Assessment for Learning Initiative. The aim of this project is to develop a framework for e-assessment that could be used across English language courses in general, to support learning outcomes related to the description, analysis, and interpretation of linguistic data.
Christine Leach
A significant number of students studying higher level chemistry courses have some difficulty with the mathematics. This impacts on their studies, particularly in areas involving physical chemistry, which are more mathematically based.
Ingrid Nix (COLMSCT Teaching Fellow) and Ali Wyllie (COLMSCT Teaching Fellow)
This project aims to produce a design for a continuum of topic-based computer-marked questions, from easy to difficult and from formative to summative. Students can choose, depending on their self-assessment, which questions on the continuum to engage with and to log their reflections as they do so. Our research questions will focus on how students respond to this method of selecting their learning journey, whether students agree with the mapping of the continuum according to the typology of questions which we have devised, and what design improvements can be suggested.
Alistair Willis
This project is part of the COLMSCT investigations into using online questions which require students to respond with short answers in free text. In particular, this project is looking into how to set questions that require the students to make two or three separate points, and obtain credit for each subpart. This project is working closely with the e-assessment projects of Sally Jordan and Phil Butcher.
Judy Ekins, COLMSCT Teaching Fellow
From 2007, the OU requires its students to be online, and it is also adopting the Moodle VLE. This increases the feasibility of implementing electronic assessment. In order to improve retention on Level 1 Open University mathematics courses, this COLMSCT project involved piloting short interactive internet quizzes. The OU package OpenMark was used, enabling students to receive instant feedback, whereas previously they had to wait days or weeks. Students are allowed several attempts at each question, with appropriate teaching feedback after each attempt. At the end of each quiz, alongside the mark, relevant study advice is given to the student, including references to appropriate course material. A hint facility was also introduced for students who were unable to start a question. OpenMark has a variety of question types and is being integrated into the Moodle VLE, and so will be open source.
The interactive computer marked assessment (iCMA) initiative was conceived in 2005 on the premise that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The initiative followed closely on, and has been enabled by, the 2005 upgrade to the university’s OpenMark CAA system. This upgrade had been built on modern internet technologies and provided a platform which was able to harness the full potential of a modern multimedia computer.
The leadership of COLMSCT believed that if OU academics could be freed from the constraints of normal course production they could provide the innovation that would lead the university in helping to develop these new models of eAssessment. The projects that are reported on here have shown what can be achieved in a variety of subject areas. Four years on COLMSCT takes some pride in reporting that COLMSCT fellows have been to the fore in directing OU eAssessment developments and that it is the faculties that have had COLMSCT fellows that are in the vanguard of using eAssessment in OU courses.
COLMSCT offered fellows collaboration with specialists in pedagogy, educational research, and educational computing. COLMSCT also provided the resources to help specify and implement the assessments, and to evaluate both the process of creating the assessments and the outcomes. In all, thirteen projects were supported, in Biology, Chemistry, Computing, Earth Science, General Science, Languages, Mathematics, and Nursing.
Please follow the section headings on the right of this screen to read more about the iCMA initiative. A printable version is available under Resources.
The iCMA initiative is one of the main themes of work within the COLMSCT which is one of the Open University’s Centres for Excellence in Learning and Teaching (CETLs).
The Centre appointed its first fellows in 2005 and work will continue until 2010. Within the eAssessment strand, fellows were able to ask questions that went beyond the bounds that constrain normal course production cycles. Foremost among these was the general question of whether eAssessment was capable of assessing higher-order learning in any meaningful way. To help think this through, a workshop was convened in late 2005 with invited experts at the forefront of eAssessment from other universities. One conclusion, arising from the combined educational and computing expertise of the discussants and the ‘what if’ enabling approach of the COLMSCT leadership, was that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The challenge to COLMSCT was to establish projects to test this conclusion.
In March of 2006 COLMSCT issued a call for proposals from academic staff to “develop and evaluate innovative e-assessment projects within their own teaching context”. While the remit of COLMSCT was the Mathematics, Science, Computing and Technology areas the call was widened to the whole university with appropriate support from the university’s Learning and Teaching Office. The initiative specifically acknowledged the current gap between academic aspirations and the types of interactions commonly found within standard Computer-Based Assessment systems and encouraged proposals that went beyond those current boundaries. The iCMA initiative offered collaboration and resources to help specify and implement the assessment and to evaluate both the process of creating the assessment and the outcomes.
In the intervening years the iCMA initiative has grown to include projects that have delivered iCMAs for use in Biology, Chemistry, Computing, Earth Science, General Science, Languages, Mathematics, and Nursing. The projects are required to undertake the full project cycle from proposal, through specification and implementation, to evaluation with students and propagation of the outcomes within and outside the university.
The interactive questions on this site are run by the OpenMark assessment system, which handles your access to a question as if you were taking a real assessment. However, this site gives you considerable flexibility in how you access these demonstration questions, and if you use all of that flexibility you may break some of the rules that apply to more formal assessments.
One consequence of breaking these rules is an 'Access out of sequence' error. If you see this error, you now know why: just click on the link provided to 'Return to the test'.
The major similarity between projects in the iCMA initiative is reflected in the name; all projects are attempting to engage students in an ‘interactive’ exchange around one or more learning outcomes, with the computer providing instant feedback and multiple attempts for students who answer incorrectly. The overall project is titled eAssessment for Learning and all projects include teaching feedback, often with course references, to persuade students to revisit topics where their answers are incorrect, before attempting the question again. For example see Figure 1 below.

Figure 1 An illustration of immediate targeted feedback
While the OU is not unique in using eAssessment in this way, it was perhaps one of the first to realise how automation could be used to support students studying on their own, away from tutors and peers; the university has hosted a variety of projects in this field stretching back to the 1970s. When the OU joined the Moodle community in 2005, the very first thing it added to the Moodle eAssessment tool was the ability to give a much wider variety of feedback. Since Moodle was already the leading open-source VLE, in use at some 10,000 institutions worldwide, this is some evidence of the particular thought the OU had given to how Moodle’s eAssessment could support the distant learner.
The importance of feedback for learning has been highlighted by a number of authors, emphasising its role in fostering meaningful interaction between student and instructional materials (Buchanan, 2000), its contribution to student development and retention (Yorke, 2001), but also its time-consuming nature for many academic staff (Gibbs, 2006). In distance education, where students work remotely from both peers and tutors, the practicalities of providing rapid, detailed and regular feedback on performance are vital issues.
Gibbs and Simpson suggest eleven conditions in which assessment supports student learning (Gibbs and Simpson 2004).
Four of these conditions are particularly apposite to the use of eAssessment within distance education. They are reflected in the design of OpenMark and are amplified in the rationale behind the development of the S151 Maths for Science online assessments (Ross, Jordan and Butcher, 2006).
Readers might like to try this typical OpenMark question with instant feedback https://students.open.ac.uk/openmark/omdemo.text.q4. For non-scientists a response of the form '1s2', which is partially right, should give helpful feedback.
References
Buchanan, T. (2000) The efficacy of a World-Wide Web mediated formative assessment, Journal of Computer Assisted Learning, 16, 193-200
Gibbs, G. and Simpson, C. (2004), Conditions under which assessment supports students' learning. Learning and Teaching in Higher Education, 1, pp 3-31
Gibbs, G, (2006) Why assessment is changing, in C. Bryan and K. Clegg (eds), Innovative assessment in Higher Education, Routledge
Ross, S, Jordan, S and Butcher, P (2006), Online instantaneous and targeted feedback for remote learners, in C. Bryan and K. Clegg (eds), Innovative assessment in Higher Education, Routledge
Yorke, M. (2001) Formative assessment and its relevance to retention, Higher Education Research and Development, 20(2), 115-126
All of the iCMA projects wanted students to use and act on the instant feedback there and then, while the problem is still in their mind (points 10 and 11 of the Gibbs and Simpson conditions in the previous section). The majority of questions were therefore designed so that if the student’s first answer is incorrect, they can have an immediate second, or third, attempt. See Figure 2.

Figure 2 An illustration of three attempts at an interactive question
Readers might like to try this non-typical OpenMark question which allows up to 100 attempts (!) https://students.open.ac.uk/openmark/omdemo.twod.marker
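The interaction pattern of Figure 2 — instant feedback, with up to three attempts and progressively fuller hints — can be sketched as follows. This is an illustration only, not OpenMark’s actual implementation; the question, answer matching and feedback strings are invented:

```java
// Sketch of the three-attempt pattern used across the iCMA projects:
// an incorrect response earns progressively fuller feedback, and the
// full answer is revealed only after the final attempt.
public class ThreeAttemptQuestion {

    // One hint per attempt; the last entry gives the answer away.
    private static final String[] FEEDBACK = {
        "Not quite. Think about the units of force.",
        "Remember that force = mass x acceleration; check Section 2.3.",
        "The answer is the newton (N), i.e. kg m s^-2."
    };

    private int attempt = 0;

    // Returns a confirmation for an accepted response, or the next hint.
    public String submit(String response) {
        if (response.trim().equalsIgnoreCase("newton")) {
            return "Correct!";
        }
        String hint = FEEDBACK[Math.min(attempt, FEEDBACK.length - 1)];
        attempt++;
        return hint;
    }

    public static void main(String[] args) {
        ThreeAttemptQuestion q = new ThreeAttemptQuestion();
        System.out.println(q.submit("joule"));   // brief hint
        System.out.println(q.submit("pascal"));  // fuller hint, with a course reference
        System.out.println(q.submit("newton"));  // Correct!
    }
}
```

Note how the second hint carries a course reference, mirroring the teaching feedback described above that persuades students to revisit the topic before attempting the question again.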
Here is how the IMS Question and Test Interoperability (QTI) specification defines adaptive questions (items):
An adaptive item is an item that adapts its appearance, its scoring (Response Processing) or both in response to each of the candidate's attempts. For example, an adaptive item may start by prompting the candidate with a box for free-text entry but, on receiving an unsatisfactory answer, present a simple choice interaction instead and award fewer marks for subsequently identifying the correct response. Adaptivity allows authors to create items for use in formative situations which both help to guide candidates through a given task while also providing an outcome that takes into consideration their path.
Readers will see that by coupling feedback with multiple attempts we have much of what is described as an adaptive item. But, as the excerpts from the following question show, the iCMA projects embraced all aspects of adaptive questions. We can report that the question features of OpenMark satisfactorily supported all the iCMA projects.

Figure 3a Initially students have to enter their own words into text-entry boxes

Figure 3b The correct response is locked and the student is asked to try again for the remainder

Figure 3c A second attempt

Figure 3d Now two correct responses are locked and the remaining questions become selection lists

Figure 3e On the third attempt the text-entry boxes from attempts 1 and 2 have been replaced by selection lists
Readers might like to try this question for themselves https://students.open.ac.uk/openmark/omdemo.adaptive.q1
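The behaviour shown in Figures 3a-3e — a free-text interaction giving way to selection lists after unsatisfactory answers, with fewer marks awarded for subsequently identifying the correct response — can be sketched as a simple state change. The content is invented, and this is not OpenMark’s implementation:

```java
// Sketch of an adaptive item in the QTI sense: the first attempt offers
// free-text entry; after an unsatisfactory answer the item falls back to
// a multiple-choice interaction and awards fewer marks. Content invented.
public class AdaptiveItem {

    public enum Mode { FREE_TEXT, MULTIPLE_CHOICE }

    private Mode mode = Mode.FREE_TEXT;
    private final String answer = "igneous";

    public Mode currentMode() { return mode; }

    // Returns the marks awarded: full marks for a correct free-text
    // response, reduced marks once the item has degraded to a choice.
    public int submit(String response) {
        boolean correct = response.trim().equalsIgnoreCase(answer);
        if (correct) {
            return mode == Mode.FREE_TEXT ? 3 : 1;
        }
        mode = Mode.MULTIPLE_CHOICE; // degrade the interaction
        return 0;
    }

    public static void main(String[] args) {
        AdaptiveItem item = new AdaptiveItem();
        System.out.println(item.submit("sedimentary")); // 0; item degrades
        System.out.println(item.currentMode());         // MULTIPLE_CHOICE
        System.out.println(item.submit("igneous"));     // 1 (reduced marks)
    }
}
```

The outcome thus takes the candidate’s path into consideration, exactly as the QTI definition above requires.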
It follows that, as well as wishing students to work towards the correct answer to the original question, we should perhaps also provide more opportunities for students to practise. This too has been supported by many of the iCMA projects.
What might look like a single question to the user often has several variations behind the scenes such that a student may revisit the iCMAs for further practice and receive variations on their original questions; in this respect the iCMAs resemble a patient tutor, correcting initial misunderstandings and providing further examples to reinforce the learning. None of the COLMSCT iCMAs is being used for summative purposes but if they were the in-built variability also counteracts plagiarism. Here are two such variations.

Figure 4a There are a variety of eruptions available to this question

Figure 4b Of course the response matching behind the scenes has to cope with the variety of eruptions too
Readers might like to try this example https://students.open.ac.uk/openmark/omdemo.maths.q2
Answer the question – you will get feedback to help you if you need it. And once you have completed the question, request 'Next question'. Do you see the same question? In fact this question has five variations so there is a 20% chance that you might see the same question repeated.
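Behind the scenes, variation of this kind amounts to instantiating a question template with randomised values, with the marking key varying to match. The sketch below uses an invented template with five variations, in the spirit of the demonstration question above:

```java
import java.util.Random;

// Sketch of a question with hidden variations: a template instantiated
// with a randomised value, so a returning student is likely to see a
// different variant. The template and values are invented.
public class QuestionVariant {

    private static final int[] MASSES = {2, 4, 5, 8, 10}; // five variations

    public static String instantiate(Random rng) {
        int mass = MASSES[rng.nextInt(MASSES.length)];
        return "A " + mass + " kg object accelerates at 3 m/s^2. "
             + "What force, in newtons, acts on it?";
    }

    // The marking key must vary with the question, too.
    public static int expectedAnswer(int mass) {
        return mass * 3; // F = ma
    }

    public static void main(String[] args) {
        Random rng = new Random();
        System.out.println(instantiate(rng));
        System.out.println(instantiate(rng)); // usually a different variant
    }
}
```

With five equally likely variants there is, as above, a 1 in 5 chance of the same variant appearing twice in a row; the response matching (and any feedback) must of course cope with every variant.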
Across the iCMA initiative we have seen the creation of questions that use the capabilities of modern multimedia computers to display the problem and support interactions with the questions.
For example within Science there are several examples of ‘virtual microscopes’ having been put to imaginative use as teaching tools. Now it is possible to reuse the same idea within an iCMA with each of the views below corresponding to different levels of magnification (figures 5a-c).

Figure 5a Low resolution

Figure 5b Medium resolution

Figure 5c High resolution
And as this figure shows not only can the resources be varied but students can be asked to interact with them directly to show that they have understood what they are looking at (Figure 6 below).

Figure 6 Identifying a parasite on a microscope slide
Incorporation of online resources
The Clinical Decision Making Maze also challenges the student to interpret an array of online resources, from audio interviews through data sheets to online databases, thereby making the experience more akin to a real-life nurse/patient consultation. These are configured to open in different tabs of a tabbed browser, leaving the question in the first tab. Try it here https://students.open.ac.uk/openmark/omdemo.mm.cdm
Compound questions are not unusual (see Figure 7 below). While these are more difficult for the author to analyse and comment on, they do provide students with more substantial tasks.

Figure 7 A compound question with multiple responses to be marked
Several authors are exploring how advances in computing technologies can be utilised in iCMAs. For example we know there is variation in how human markers mark written materials and we can ask how a computer might fare if asked to mark the wide range of student responses that such questions elicit. Jordan, Butcher and Brockbank have been exploring the application of both computational linguistics and computational algorithms to marking free form text (Figure 8).

Figure 8 Automatic marking of free-text responses
We have a demonstration test that contains six questions that require free-text responses. Readers are invited to try it by following this link https://students.open.ac.uk/openmark/omdemo.pm2009.
And Thomas has been exploring the automatic marking of diagrams. Students use a linked applet to draw their diagram, which is then automatically marked, with feedback given in the normal OpenMark style.
The most striking differences have come about from the initiative’s venture beyond the Mathematics, Science and Technology fields that form the backbone of COLMSCT. In both Health and Social Care and Languages we have seen both different interactive activities devised and different forms of eAssessment created.
New question components
In its simplest form, our work with Languages has resulted in the cost-effective development of a new OpenMark component that allows the selection of multiple words in a paragraph. In a symbiotic relationship, the new component builds on existing OpenMark functionality and contributes a new question type to the larger pool. The example in Figure 9 raises the question of whether other subjects might devise their own specialised interactions.

Figure 9 The new OpenMark word selection component developed by COLMSCT
Readers may wish to explore their own knowledge of English grammar with this example https://students.open.ac.uk/openmark/omdemo.mcq.q6. Please note that this question is aimed at master’s level students and such students are expected to understand the reasons for their own errors, such that little teaching feedback is included.
Novel use of existing question components
But there are also novel uses of existing question interactions. For example the line interaction was created to enable mathematicians, scientists and technologists to draw tangents or best fit straight lines (Figure 10).

Figure 10 A user is drawing the line shown here
Consequently it was both a surprise and a delight to see how it might be used to help language learning (Figure 11).

Figure 11 Drawing a line to link words
Readers may wish to try a question similar to that shown in Figure 11 by following this link https://students.open.ac.uk/openmark/omdemo.twod.lineconnect
Different forms of eAssessment
While all of our iCMAs have relied on an assessment of knowledge, one, the Clinical Decision Making Maze, has also followed the pathway that the student takes through the activity. With different responses leading to different pathways, this is an example of an adaptive eAssessment. As such it provides a different form of experience from the sequences of unrelated questions found in many applications of eAssessment.
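A pathway-following assessment of this kind can be modelled as a small state machine in which each response selects the next node and the route taken is recorded alongside the outcome. The sketch below uses an invented scenario, not the actual Clinical Decision Making Maze:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Minimal sketch of a decision-maze assessment: each response selects
// the next node, and the assessment records the pathway taken as well
// as the end point. Scenario invented for illustration.
public class DecisionMaze {

    // node -> (response -> next node); "end" terminates the maze
    private static final Map<String, Map<String, String>> MAZE = Map.of(
        "triage", Map.of("observe", "ward", "escalate", "doctor"),
        "ward",   Map.of("monitor", "end",  "escalate", "doctor"),
        "doctor", Map.of("treat",   "end")
    );

    private String node = "triage";
    private final List<String> pathway = new ArrayList<>(List.of("triage"));

    // Returns false if the response is not available at the current node.
    public boolean choose(String response) {
        String next = MAZE.getOrDefault(node, Map.of()).get(response);
        if (next == null) return false;
        node = next;
        pathway.add(next);
        return true;
    }

    public List<String> pathway() { return pathway; }

    public static void main(String[] args) {
        DecisionMaze maze = new DecisionMaze();
        maze.choose("escalate");
        maze.choose("treat");
        System.out.println(maze.pathway()); // [triage, doctor, end]
    }
}
```

Because the recorded pathway differs from student to student, the assessment judgement can weigh the route taken, not just the final answer.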
In looking back over the projects we have encountered some difficulties in supporting our fellows’ requests for adaptive testing. While OpenMark’s question features have coped very well with our fellows’ designs, the difficulty in sequencing questions as a result of student performance has been much harder to solve; indeed the iCMA initiative has circumvented this problem and not solved it.
The iCMA initiative offered collaboration and resources to fellows in three key areas: Pedagogic, Technical and Evaluation support.
Pedagogic support
We started by recognising that writing questions that explore students’ understanding of their subject is a skilled activity. Couple this with the wish to include feedback that enables students who respond incorrectly to correct their misunderstanding, and the task grows, but so does the student engagement. While most fellows started with their own idea, the iCMA coordinator and project consultants had many years of experience in creating interactive multimedia materials and could guide the fellows as to what might work, steering them away from what might be impossible to achieve. For example, interactive eAssessment must react sensibly to student inputs, and if a question is too ‘open’ this becomes impossible; setting the question and providing appropriate response matching are therefore key starting points.
We would also include here the application of technology to support pedagogic ends. Our authors did not have to concern themselves with implementation issues, so ‘difficult’ areas of implementation did not cloud their view of what they wished to do.
Technical support
All technical issues were undertaken by the project coordinator and two experienced consultants, Spencer Harben and Adam Gawronski, who were familiar with writing interactive computer-based materials. The consultants were able to guide the fellows on what forms of interaction could be supported and how different inputs might lead to a range of responses that would have to be dealt with.
Clearly questions should be functionally correct and the coordinator and consultants ensured that:
Evaluation support
Support was provided to evaluate both the process of creating the assessment and the outcomes of using the assessment with students. For the latter the fellows used a selection of data analysis of student responses collected by the eAssessment systems, online questionnaires, observation in the Institute of Educational Technology’s data-capture suite, online monitoring with Elluminate and one-to-one interviews.
All of the COLMSCT iCMAs were implemented in the OpenMark eAssessment system, developed at the Open University (http://www.open.ac.uk/openmarkexamples). The university has integrated OpenMark with Moodle, and readers wishing to know more about this project are referred to our OpenLearn website (http://labspace.open.ac.uk/course/view.php?id=3484).
The OpenMark philosophy
“There is already a trend towards a larger proportion of multiple-choice questions in British exams – a tendency often taken to extremes in the United States. This may be less a matter of academic merit than the convenience that these papers can be marked electronically. This is an outcome that should worry all those involved in education. The best test of a test is whether it stretches pupils, not that it is easy to mark” Editorial, The Times, 7th August 2004.
The raison d’être for OpenMark is to enable the Open University to produce interactive assessments that go beyond the bounds of the restricted range of question formats found in most commercial CAA systems.
The OpenMark philosophy is also:
• To exploit modern interactive computing technology for the purpose of enhancing the student learning experience.
• To provide an ‘open’, extensible, framework capable of supporting question types that go beyond multiple-choice.
• To support assessments that are engaging.
• To provide immediate feedback.
OpenMark is somewhat different from many CAA systems in the methods used for building questions. Questions and assessments that are to be run in the OpenMark system are constructed in an editor such as Eclipse using a combination of OpenMark XML components and Java. This combination enables OpenMark to be used very flexibly, but it comes at the price of requiring authors to be comfortable reading and writing Java.
Figure 12 illustrates how this works in practice, with most output to the student being written in XML and most input from the student being analysed in Java. Because the XML can be controlled by the Java, it is also possible to introduce variations under computational control. The balance is that each technology is used for what it is good at: XML for specifying the content and laying out the web pages; Java for analysing and making decisions on student responses.

Figure 12 OpenMark XML components are combined with small Java code segments which analyse responses and select the feedback that is to be given.
The open source site holding the OpenMark system also includes a variety of examples that show how the XML and Java work together. We would acknowledge that there is a learning curve to using OpenMark, but many interactive media developers at the OU have climbed that curve and there are now thousands of OpenMark questions in regular use.
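To give a flavour of the division of labour, the sketch below is purely illustrative and uses hypothetical names rather than the real OpenMark API: a string stands in for the XML component that lays out the page, while a small Java method does the response matching, in this case for a free-text question about the forces on an object moving at constant speed.

```java
// Hypothetical sketch only: these names are NOT the real OpenMark API.
// In OpenMark the page layout comes from an XML component file and the
// response matching lives in the question's Java class; here a string
// stands in for the XML so the example is self-contained.
public class SnowflakeQuestion {

    // Stand-in for the XML half: declarative content and layout.
    static final String PROMPT_XML =
        "<question><layout><editfield id=\"response\"/></layout></question>";

    // Stand-in for the Java half: decide whether a free-text response
    // about the forces on a snowflake falling at constant speed is correct.
    public static boolean isCorrect(String response) {
        String r = response.toLowerCase();
        boolean balanced = r.contains("balanced") && !r.contains("unbalanced");
        boolean noNetForce = r.contains("no unbalanced") || r.contains("no net")
                || r.contains("no resultant");
        boolean equalOpposite = r.contains("equal") && r.contains("opposite");
        return balanced || noNetForce || equalOpposite;
    }

    public static void main(String[] args) {
        System.out.println(isCorrect("The forces are balanced"));   // true
        System.out.println(isCorrect("Gravity acts downwards"));    // false
    }
}
```

Real OpenMark response matching is considerably more sophisticated than this keyword test, but the split is the same: declarative markup for what the student sees, imperative Java for judging what the student types.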
We would stress that COLMSCT set itself the task of pushing the boundaries and the flexibility of the OpenMark system has suited our purposes admirably.
The iCMA initiative started to deliver its first iCMAs in July 2006. The following graph (Figure 13) shows how the University’s use of iCMAs has increased during the lifetime of COLMSCT. We would not wish to claim responsibility for all of the increase over the period, but we can be clear that COLMSCT fellows, through their roles in COLMSCT and in course team work, have led the drive behind the upward trend shown.
Figure 13 iCMAs served by the OU to July 2010
The figure shows total usage by year and includes diagnostic, formative and summative iCMAs.
COLMSCT has also provided the pedagogic underpinning behind the university’s development of its eAssessment systems. Sally Jordan and Pete Thomas were the academic advisers to the VLE eAssessment project, Ingrid Nix and Ali Wyllie were contributing members of the eAssessment Faculty Liaison Group and Phil Butcher was the VLE eAssessment project manager. These fellows were able to steer the development of Moodle towards more OpenMark-like features with Phil Butcher providing many of the designs. The software developments have been undertaken by technical staff in the Strategic Development section of the Learning and Teaching Solutions department and are described in ‘Online assessment at the Open University using open source software’ in the Resources section of this site.
I have worked in computer-based learning since 1974 and during that time have performed most roles encompassing programmer, author, systems designer, researcher, educational evaluator, manager and acting head of department. I wrote my first interactive CBL program at Leeds University in 1974, my first for the OU in 1975 and moved from Leeds to the OU in 1977. As well as overseeing the COLMSCT iCMA initiative I have also continued to maintain my software skills and on the free-text marking project I have been active both in enhancing the OpenMark response matching algorithm and using it to analyse students’ responses.
During my long, but varied, career at the OU, I have been on numerous course teams that have advanced the use of interactive computers in the OU’s teaching and learning practices, and 30 years on the pace of change doesn’t slacken. That the COLMSCT CETL has chosen to address online assessment as a major theme illustrates that increasingly sophisticated application of computers in education continues to make major in-roads into established practices.
Here are a few highlights that led to my current fellowship in COLMSCT. The Works Metallurgist (with Adrian Demaid for TS251) in 1980 was the first OU CAL program to use interactive graphics and The Teddy Bear Videodisc (with Keith Williams (Technology) and Martin Wright (BBC) for T252) in 1984 was the first to integrate full-screen video and interactive computing. In 1986 I joined the T102 Living with Technology course team that introduced personal computing into OU first level courses in 1989; in 1996 T102 became the first non-ICT undergraduate course to utilise conferencing and e-mail on a large scale. From 1996 I managed the introduction of multimedia computing into the Science Faculty with S103 Discovering Science becoming the first course to replace paper-based CMAs with interactive questions delivered through a modern multimedia computer; this was the start of the development of the OpenMark interactive assessment system. In 2002 OpenMark was moved online to support S151 Maths for Science and since 2005 has been integrated ever more closely with the OU’s VLE developments. OpenMark is the system that has enabled the creation of the diverse COLMSCT iCMA initiative projects shown on this site.
I have an MPhil in Computer-Based Education from the University of Leeds. I guess I thought that this was ‘normal’ until I heard Tim O’Shea (also ex. Computer-Based Learning at Leeds) describe the impact of Leeds in this area and how, in our different ways, he and I had brought that approach to the OU. And having made me think about it I can see he has a point. In my formative CBL years it was made clear to me by Roger Hartley (then Director of the Leeds CBL Unit) that it’s not only the student who should work hard at CBL but also the authors and at run-time the computer. In recent years my role as the COLMSCT iCMA initiative coordinator has put me in a prime position to continue this tradition while at the same time helping my COLMSCT colleagues deliver their ideas across the internet.
Phil Butcher, Coordinator of the iCMA initiative and COLMSCT Teaching Fellow
P.G.Butcher@open.ac.uk
This project, with funding from both COLMSCT and piCETL, has investigated the way in which students engage with interactive computer-marked assignments (iCMAs). This is a huge task and one that is far from complete. However this page gives a flavour of the methodology and reports some early findings.
Note: This website will be frozen from August 2010, but I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
The project’s methodology has included an extensive analysis of the quantitative data captured when students attempt iCMAs. Tools have been produced to extract information on matters such as the time spent per question, the order in which the questions are attempted, the use made of the iCMA relative to any deadlines and other aspects of the course’s calendar, the percentage of blank responses and the extent to which responses are altered in response to feedback. In addition to compiling an overall picture of the use made of each iCMA, the influence of various factors that might affect use has been investigated. For example, to what extent does the presence of an examination encourage students to use formative-only iCMAs? The quantitative analysis has been supplemented by a qualitative investigation into student perceptions of iCMAs in different situations.
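As an illustration of one of these measures, the sketch below (hypothetical code, not the project's actual tooling) computes time spent per question from a timestamped transaction log, attributing the gap between consecutive actions to the question on screen during that gap.

```java
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

// Illustrative sketch, not the project's actual tools: compute the time
// spent per question from a timestamped transaction log of the kind an
// e-assessment system records for each student action.
public class TimePerQuestion {

    public record Action(int question, long timestampSeconds) {}

    // Attribute the gap between consecutive actions to the question that
    // was on screen at the start of the gap, and sum per question.
    public static Map<Integer, Long> secondsPerQuestion(List<Action> log) {
        Map<Integer, Long> totals = new TreeMap<>();
        for (int i = 0; i + 1 < log.size(); i++) {
            long gap = log.get(i + 1).timestampSeconds() - log.get(i).timestampSeconds();
            totals.merge(log.get(i).question(), gap, Long::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        List<Action> log = List.of(
                new Action(1, 0), new Action(1, 90),     // two actions on question 1
                new Action(2, 150), new Action(2, 200)); // then two on question 2
        System.out.println(secondsPerQuestion(log));     // {1=150, 2=50}
    }
}
```

In practice such a measure needs care with overnight gaps and abandoned sessions, which is one reason the real analysis was far from trivial.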
A range of courses using iCMAs were chosen for inclusion in the project, including S104 Exploring Science, S154 Science Starts Here, S110 Health sciences in practice, S279 Our Dynamic Planet, S342 Physical Chemistry, SK123 Understanding cancers, SM358 The Quantum World, SXR103 Practising Science, SXR208 Observing the Universe and M150 Data, Computing and Information. Diagnostic quizzes such as ‘Are you ready for Level 1 Science?’ were also included. Courses and components were selected so as to encompass a wide range of different types of iCMA use, though the initial analysis focused on OpenMark iCMAs (since the tools were written to extract information from the OpenMark database) and on Science Faculty courses.
Follow the links on the right-hand side of this page for more information about when students attempt iCMAs, the number of questions attempted, the length of short-answer free-text questions, the use made of feedback and student opinion of iCMAs. The overall conclusions to date are
A related project has considered appropriate statistical tools for the analysis of OpenMark iCMAs, looking at the level of the whole quiz, individual questions and separate variants of a question. Random guess scores for a range of Moodle and OpenMark question types have also been calculated. For more information on this work follow the appropriate links on the right-hand side of this page or see the two documents written by Helen Jordan.
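To give a flavour of what a random guess score calculation involves, the sketch below computes the expected score for blind guessing on an n-option multiple-choice question over several attempts, never repeating an option. The 100%/67%/33% credit-per-attempt scheme is an assumption for the sake of the example, not necessarily the weighting used by any particular question type; the real calculations are in the documents by Helen Jordan referred to above.

```java
// Illustrative calculation only. The credit-per-attempt weighting passed in
// below is an assumption for the example, not a documented OU scheme.
public class RandomGuessScore {

    // Expected score for guessing uniformly at random among the options not
    // yet tried, on an nOptions-option multiple-choice question, where
    // creditPerAttempt[k] is the credit awarded for success at attempt k+1.
    public static double expectedScore(int nOptions, double[] creditPerAttempt) {
        double score = 0.0;
        double pReachAttempt = 1.0;  // probability of still guessing at this attempt
        int remaining = nOptions;    // options not yet tried
        for (double credit : creditPerAttempt) {
            if (remaining == 0) break;
            score += (pReachAttempt / remaining) * credit;  // P(correct now) * credit
            pReachAttempt *= (double) (remaining - 1) / remaining;
            remaining--;
        }
        return score;
    }

    public static void main(String[] args) {
        double[] credit = {1.0, 2.0 / 3.0, 1.0 / 3.0};  // assumed 100%/67%/33%
        System.out.printf("4 options, 3 attempts: %.3f%n", expectedScore(4, credit));
        // Guessing without repetition is correct with probability 1/4 at each
        // attempt, so the expectation is (1 + 2/3 + 1/3) / 4 = 1/2.
    }
}
```

Even this toy version shows why random guess scores matter: with three credited attempts at a four-option question, blind guessing already earns half marks on average.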
Reviews of the literature (e.g. Black and Wiliam, 1998; Gibbs and Simpson, 2004) have identified conditions under which assessment appears to support and encourage learning. Several of these conditions concern feedback, but the provision of feedback does not in itself lead to learning. Sadler (1989) argues that in order for feedback to be effective, action must be taken to close the gap between the student’s current level of understanding and the level expected by the teacher. It follows that, in order for assessment to be effective, feedback must not only be provided, but also understood by the student and acted on in a timely fashion.
These points are incorporated into five of Gibbs and Simpson’s (2004) eleven conditions under which assessment supports learning:
Condition 4: Sufficient feedback is provided, both often enough and in enough detail;
Condition 6: The feedback is timely in that it is received by students while it still matters to them and in time for them to pay attention to further learning or receive further assistance;
Condition 8: Feedback is appropriate, in relation to students’ understanding of what they are supposed to be doing;
Condition 9: Feedback is received and attended to;
Condition 11: Feedback is acted upon by the student.
It can be difficult and expensive to provide students with sufficient feedback (Condition 4), especially in a distance-learning environment, where opportunities for informal discussion are limited. Feedback on tutor-marked assignments is valuable but may be received too late to be useful (Condition 6), and it is then difficult for students to understand and act upon it (Conditions 8 and 11), even assuming that they do more than glance at the mark awarded (Condition 9).
One possible solution to these dilemmas is to use e-assessment. Feedback can be tailored to students’ misconceptions and delivered instantaneously and, provided the assessment system is carefully chosen and set up, students can be given an opportunity to learn from the feedback whilst it is still fresh in their minds, by immediately attempting a similar question or the same question for a second time, thus closing the feedback loop. Distance learners are no longer disadvantaged — indeed the system can emulate a tutor at the student’s elbow (Ross et al., 2006, p.125) — and ‘little and often’ assessments can be incorporated at regular intervals throughout the course, bringing the additional benefits of assisting students to pace their study and to engage actively with the learning process, thus encouraging retention. For high-population courses, e-assessment can also deliver savings of cost and effort. Finally, e-assessment is the natural partner to the growth industry of e-learning.
However, opinions of e-assessment are mixed and evidence for its effectiveness is inconclusive; indeed e-assessment is sometimes perceived as having a negative effect on learning (Gibbs, 2006). Murphy (2008) reports that high-stakes multiple-choice tests of writing can lead to actual writing beginning to disappear from the curriculum; she also reports that ‘the curriculum begins to take the form of the test’. There are more widely voiced concerns that e-assessment tasks (predominantly but not exclusively multiple-choice) can encourage memorisation and factual recall and lead to surface learning, far removed from the tasks that will be required of the learners in the real world (Nicol, 2007; Scouller and Prosser, 1994). Also, although multiple-choice questions are in some senses very reliable, doubt has been expressed as to whether they always assess what the teacher believes they do, partly because multiple-choice questions require ‘the recognition of the answer rather than the construction of a response’ (Nicol, 2007).
Ashton and her colleagues (2006) point out that the debate about the effectiveness of multiple-choice questions ‘diverts focus away from many of the key benefits that online assessment offers to learning’. Perhaps the question we should be asking is not ‘should we be using e-assessment?’ but rather ‘what are the features of an effective e-assessment system?’ (Mackenzie, 2003).
References
Ashton, H.S., Beevers, C.E., Milligan, C.D., Schofield, D.K., Thomas, R.C. and Youngson, M.A. (2006). Moving beyond objective testing in online assessment, in S.C. Howell and M. Hricko (eds) Online assessment and measurement: case studies from higher education, K-12 and corporate. Hershey, PA: Information Science Publishing: 116-127.
Black, P. and Wiliam, D. (1998) Assessment and classroom learning, Assessment in Education, 5, 1, 7-74.
Gibbs, G. (2006). Why assessment is changing. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp11-22.
Mackenzie, D. (2003). Assessment for e-learning: what are the features of an ideal e-assessment system? 7th International CAA Conference, Loughborough, UK. At http://www.caaconference.com/pastConferences/2003/procedings/index.asp
Murphy, S. (2008) Some consequences of writing assessment, in A. Havnes and L. McDowell (eds) Balancing Dilemmas in Assessment and Learning in Contemporary Education. London: Routledge:33-49.
Nicol, D.J. (2007). E-assessment by design: using multiple choice tests to good effect. Journal of Further and Higher Education, 31, 1, 53–64.
Ross, S.M., Jordan, S.E. & Butcher, P.G. (2006). Online instantaneous and targeted feedback for remote learners. In C. Bryan & K.V. Clegg (Eds), Innovative assessment in higher education. London: Routledge. pp123-131.
Sadler, D.R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119–144.
Scouller, K.M. & Prosser, M. (1994). Students’ experiences of studying for multiple choice question examinations. Studies in Higher Education, 19, 3, 267–279.
Figure 1 shows the usefulness of apparently very simple data, in this case the overall number of actions (transactions) per day on the diagnostic quiz 'Are you ready for level 1 science?'. A cyclical pattern can be seen in the figure, and careful inspection reveals an unexpected finding, namely that potential students are less likely to use the quiz on a Saturday.
Figure 1 Action levels (the number of transactions for all users) for 'Are you ready for level 1 science' on a day-by-day basis for the whole of 2008.
This finding sparked an investigation into the average daily use made of the Open University's virtual learning environment as a whole, and it was found that the number of registered students using the VLE is greatest on Monday, declines steadily to Friday, and drops more noticeably on Saturday before rising a little on Sunday. For a distance-learning university which holds many tutorials on a Saturday (on the basis that this is likely to be the most convenient day for students), this finding has startling implications. Within each day, the rate of iCMA transactions is remarkably steady between 9am and 9pm, then drops overnight. An increase in the global presentation of Open University courses is likely to result in a change in this pattern.
Figures 2(b) and 2(c) illustrate the influence of cut-off date on the overall activity for summative iCMAs, in this case for S154 Science starts here. Use builds as the cut-off date approaches. The practice iCMA is used throughout the entire 10-week presentation (Figure 2(a)).
Figure 2 Action levels (total number of interactions for all students) for (a) the S154 practice iCMA for the presentation that started on 27th September 2008 and (b) and (c) the two summative iCMAs (iCMA41 and iCMA42) for the same presentation.
SDK125 Health sciences: a case study approach has shorter practice iCMAs associated with each of its summative iCMAs; it also has an end-of-course examination. Whilst the activity on its summative iCMAs (Figures 3(b) and 3(d)) is similar to that for S154, it can be seen (Figures 3(a) and 3(c)) that students attempt the formative iCMAs both as practice for the relevant summative iCMAs and as revision prior to the examination.
Figure 3 Action levels (total number of interactions for all students) for (a) and (c) two of SDK125’s practice iCMAs; (b) and (d) the equivalent summative iCMAs.
Figures 4(a) and 4(b), again drawn using S154 data, illustrate typical behaviours for individual students on all summative iCMAs. Many students behave like Student 1; they open the iCMA and attempt all 10 questions on a single day, frequently close to the cut-off date (Figure 4(a)). The behaviour shown in Figure 4(b) (Student 2) is similarly common for all summative iCMAs – students open the iCMA and look at all the questions, then they attempt them as and when they are able to as they work through the course material, using feedback from unsuccessful attempts at the iCMA questions. The behaviour shown in Figure 4(c) (Student 3) is quite typical for S154 but is not seen for other modules. Students are advised to attempt questions 1-4 after completing Chapter 2, questions 5-6 after Chapter 3 and questions 7-10 after Chapter 4, and many do precisely this.
Figure 4 Three typical student behaviours exhibited on S154 summative iCMAs.
Not surprisingly, when iCMAs are used summatively, most students complete all the questions. In formative-only use, there is typically a reduction in use as the iCMA progresses, as shown in the figure for the Practice iCMA for S154 Science starts here. As the iCMA progresses, there is both a decrease in the number of people who have completed each question (dark blue lines) and a decrease in the extent to which users repeat questions (paler lines).
It has been suggested that this decline in use is caused by having too many questions, but a similar decline is seen for courses with several shorter formative-only iCMAs; use decreases both during each iCMA and from iCMA to iCMA. Furthermore, there are always some users who access iCMAs, but do not complete any questions. The iCMA whose use is shown in the figure (with 640 users completing Question 1 and 668 completing Question 2), was opened by 768 registered S154 students, and it appears that around 100 of these people did not take further action. In interviews, most students were happy to admit to a lower level of engagement with iCMAs when in formative-only use, but none admitted to failing to complete any questions after opening an iCMA, so the reason for this behaviour is not known (though it is speculated to occur when students decide for themselves that the questions are either trivial or too difficult).
The general question-by-question reduction in use is bucked in several places, e.g. at Question 43 and Question 46. These questions are clearly identified in the iCMA as the first questions to assess new chapters of the course; presumably students find these chapters challenging and so seek additional practice and reinforcement. Thus, clear signposting appears to be beneficial.
Student responses to short-answer free-text questions in summative use have generally been found to be more likely to be correct, more likely to be expressed as sentences and longer than responses to the same questions in formative-only use.
Figures 1 and 2 compare the length of responses obtained to Question A: 'A snowflake falls vertically with constant speed. What does this tell you about the forces acting on the snowflake?'.
Figure 1 Distribution of length of 888 responses to Question A in formative-only use.
Figure 2 Distribution of length of 2057 responses to Question A in summative use.
In formative-only use (Figure 1) the peak at one word corresponds to the answer ‘balanced’ and the peak at 3 words corresponds to the answers such as ‘they are balanced’ and ‘equal and opposite’. In summative use (Figure 2) the peak at 4 words corresponds to answers such as ‘the forces are balanced’ and the peak at 8 words corresponds to answers such as ‘the forces acting on the snowflake are balanced’ and ‘there are no unbalanced forces acting on it’.
It is quite common for the distribution of lengths to be bimodal; in other questions there is sometimes a peak for answers that simply answer the question (e.g. ‘the force is reduced by a factor of four’) and another for answers that add an explanation (e.g. ‘the force is reduced by a factor of four since it depends on dividing by the square of the separation’).
Unfortunately some excessively long responses (up to several hundred words) were received to early summative versions of short-answer free-text questions, and these frequently contained a correct answer within an incorrect one. Responses of this type are recognised as being the most difficult for computerised marking systems of any type to deal with, and for this reason a filter was introduced from February 2009 to limit the length of responses to 20 words. Students who give a longer answer are told that their answer is too long and are given an extra attempt. The filter was initially accompanied by the warning 'You should give your answer as a short phrase or sentence. Answers of more than 20 words will not be accepted'.
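A word-count filter of this kind is straightforward to sketch; the class and method names below are illustrative, not the actual OpenMark implementation.

```java
// Illustrative sketch of a 20-word response filter; hypothetical names,
// not the actual OpenMark code.
public class ResponseLengthFilter {

    static final int WORD_LIMIT = 20;

    // Count whitespace-separated words in a free-text response.
    public static int wordCount(String response) {
        String trimmed = response.trim();
        return trimmed.isEmpty() ? 0 : trimmed.split("\\s+").length;
    }

    // Accept the response for marking, or reject it so the student can be
    // told it is too long and offered another attempt.
    public static boolean acceptForMarking(String response) {
        return wordCount(response) <= WORD_LIMIT;
    }

    public static void main(String[] args) {
        System.out.println(acceptForMarking("The forces are balanced"));  // true
    }
}
```

The interesting design question, as the following paragraphs show, is not the filter itself but the wording that accompanies it.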
The introduction of the filter and explanatory text reduced the number of students who added text to previous answers without thought to the sense of the response so produced. It also dealt with the excessively long responses that were difficult to mark, and increased the number of students giving their responses as sentences. However, for all questions, the addition of the filter and explanatory text resulted in an overall increase in median response length (see the distribution shown in Figure 3).
Figure 3 Distribution of length of 1991 responses to Question A in summative use with filter and additional wording on the question.
A possible explanation of this effect is that more students were heeding the advice to give their answer as a sentence, now that this advice was given in the question. A less desirable explanation is that students were interpreting the advice to use no more than 20 words as indicating that they should be writing exactly or almost twenty words. From July 2009, the advice accompanying the filter has been changed to ‘You should give your answer as a short phrase or sentence.’ Students are still given an extra attempt if their answer is too long, and it is only at this stage that they are informed that their answer must be no more than 20 words long. This second change of wording appears to have had the desired effect: the most recent distribution still has peaks at 4 words and 8 words, but there are considerably fewer responses close to the 20-word limit.
Similar effects were observed for all questions, and the undesirable effect of a change in wording that was intended to be helpful is a useful reminder of the need to monitor student responses to e-assessment questions.
When asked in end-of-iCMA feedback questions whether they found the feedback useful, the proportion of students who respond in the affirmative is consistently around 90%. Similarly, 85% of surveyed S104 students agreed with the statement ‘If I get the answer to an iCMA question wrong, the computer-generated feedback is helpful’ and 85% of surveyed SXR103 students agreed with the statement ‘If I initially get an answer to an iCMA question wrong, the hints enable me to get the correct answer at a later attempt’.
Observations of students in the usability laboratory painted a rather different picture. Whilst students sometimes made good use of the various aspects of the feedback provided in altering their answer for a subsequent attempt (making use of the simple fact of being told they were wrong, the more detailed prompt and the reference to the course material), there were also several instances where students did not pay sufficient attention to the feedback even when they appeared to read it. For example, student Christina* entered ‘absorbing a photon’ in answer to a question for which a correct answer would have referred to the emission not absorption of a photon. Her answer was incorrect, but a software problem (later resolved) meant that she was told her answer was correct. She looked at the final answer provided by the iCMA question and said ‘oh, did that right’ despite the fact that the word ‘emission’ was clearly visible, emboldened, on the screen she was looking at.
In an attempt to investigate factors that might influence the use that students make of feedback, an analysis was performed into the extent to which incorrect responses are left unchanged for a second or third attempt after feedback has been provided and the extent to which the data-entry box is left blank. Both of these types of behaviour are more common for free-text questions than for selected response items and for questions that the student considers more difficult.
Whether the iCMA is summative, formative-only or diagnostic is also an influential factor. For variants of the same question (a question requiring the student to use a provided word equation to calculate density): in summative use, 21% of the third-attempt responses were identical to those given previously with 2% of them blank; in formative-only use, 46% of the third-attempt responses were identical to those given previously with 7% of them blank; in diagnostic use, 55% of the third attempt responses were identical to those given previously with 19% of them blank.
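The comparison underlying these figures can be sketched as a simple classification of each later attempt against the one before it. The code below is illustrative only, not the project's actual analysis tools.

```java
// Illustrative sketch: classify each later attempt relative to the previous
// one as blank, unchanged (identical ignoring case and surrounding spaces)
// or revised. Hypothetical names, not the project's real tooling.
public class AttemptChangeAnalysis {

    public enum Kind { BLANK, UNCHANGED, REVISED }

    public static Kind classify(String previousAttempt, String currentAttempt) {
        String current = currentAttempt.trim();
        if (current.isEmpty()) return Kind.BLANK;
        if (current.equalsIgnoreCase(previousAttempt.trim())) return Kind.UNCHANGED;
        return Kind.REVISED;
    }

    public static void main(String[] args) {
        System.out.println(classify("density = 2.5", "density = 2.5")); // UNCHANGED
        System.out.println(classify("density = 2.5", ""));              // BLANK
        System.out.println(classify("density = 2.5", "2.5 g per cm3")); // REVISED
    }
}
```

Counting the UNCHANGED and BLANK outcomes across all third attempts gives percentages of the kind quoted above.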
Interviews identified, as a factor behind this result, a reluctance to spend time on a question (e.g. finding a calculator) when the mark did not count. In addition, students who have a complete lack of understanding of the question or feedback feel unable to enter an answer in the first place, or to alter it. As Trevor said:
I found that the hint in the second attempt was absolutely uninformative and I couldn’t see where I was wrong.
*All names have been changed
In general, students regard iCMAs as useful in helping them to learn and in highlighting what they need to study further, with 87% of respondents agreeing with the statement 'Answering iCMA questions helps me to learn', 78% agreeing that 'Answering iCMA questions directly helps me to learn skills or knowledge' and 79% agreeing that 'Answering iCMA questions helps me to understand what I need to study further'. 64-68% agreed that 'answering iCMA questions is fun'. These findings are substantiated by free-text survey comments from students such as
Interviewed student Martin* felt the iCMAs were particularly useful because he was studying at home and without easy access to a tutor. Although most students felt that TMAs (tutor-marked assignments) were more useful in their learning (agreeing with 'I learn more by doing TMA questions than by doing iCMA questions' and 'The feedback that my tutor provides on my TMA answers is more helpful than the feedback provided by the computer on my iCMA answers'), a large percentage were neutral in their response to these statements and some felt that iCMAs were more useful. When this point was followed up in the interviews, most people identified iCMAs and TMAs as useful for different things. Rachel said that she would be happy with courses assessed entirely by iCMA. Deborah (whose course did not have iCMAs) highlighted the importance of the timeliness of iCMA feedback to
The instantaneous receipt of feedback was the most commonly identified useful feature of iCMAs, with one student contrasting iCMAs with computer-marked assignments (CMAs) in earlier modules, which were submitted and returned through the post, and thus
Other features of iCMAs that were identified as particularly useful included the availability of three attempts, the content of the feedback prompts and the references to course materials. Trevor was pleased that the questions were relatively testing:
However it should not be forgotten that a small number of students do not find iCMAs helpful or enjoyable, perhaps linked to the fact that some (rightly or wrongly) believe that the computer sometimes marks their answers inaccurately (17% agreed with the statement 'The computer sometimes marks my iCMA answers incorrectly') or penalises them for careless mistakes (22% agreed with the statement 'The computer penalises me for careless mistakes').
A decision was taken to interview some students whose survey responses had indicated some disquiet with iCMAs. Patricia had said:
Steven had said:
However, significantly, it transpired during the interviews that both of these students were extremely happy with iCMAs in general, just not with particular questions or with aspects of their use.
Most students felt that their mark for iCMA questions should count towards their overall course score (71% disagreed with the statement 'I don't think that the mark I get in iCMAs should count towards my overall course score'), and those interviewed felt that the 20% weighting in S104 was about right. It proved impossible to interview any students who felt that the mark they got in iCMAs should not count towards their overall score. There was nevertheless a difference in the reported influence of marks on behaviour, the two extremes being Martin, who reported engaging with summative and formative-only iCMAs in exactly the same way, and Trevor (who had not attempted SXR103's purely formative iCMA), who said:
* all names have been changed
Computer marked assignments (CMAs) have been used throughout the Open University’s 40-year history. The original multiple-choice CMA questions were delivered to students on paper; responses were entered on OMR forms and submitted by post. A range of statistical tools has been in operation for many years, enabling course team members to satisfy themselves of the validity of individual questions and whole CMAs; these tools were also designed to improve the quality of future CMAs by enabling question-setters to observe student performance on their previous efforts.
The introduction of online interactive computer marked assignments (iCMAs) has extended the range of e-assessment tasks available to course teams. Equivalent statistical tools to those used in the original CMA system are now available for both Moodle and OpenMark iCMAs. However, iCMAs differ from CMAs in three ways:
A recent graduate of mathematics, with an interest in statistics and probability, was employed on a consultancy contract in the summer of 2009 to investigate whether the previously used statistical tests were valid for iCMAs. She found no overriding reason to doubt the validity of any of the tests, though the usefulness of some of them was open to question, and the different scoring mechanism for iCMA questions (linked to multiple attempts) meant that the recommended ranges for the test statistics were likely to be different for iCMAs. The tests were run against several iCMAs and the consultant worked with Sally Jordan (as chair of S104 Exploring Science) to recommend new empirically-based ranges. She also recommended some alternative and additional statistical tests that might be of use in alerting course teams to discrepant behaviour of iCMAs, individual questions and particular variants.
The work is described in more detail in a document by Helen Jordan (given on the right hand side of this page). A summary is given in the following sub-pages, along with a reflection on the outcomes of running the tools against several additional ‘unseen’ iCMAs.
The previously calculated statistics were:
Mean, median, standard deviation, skewness, kurtosis, coefficient of internal consistency, error ratio and standard error.
For iCMAs, the recommendations are:
Course teams should be told the mean, median, and standard deviation (with recommended ranges for mean and standard deviation).
Course teams should be told the standard error (calculated according to a simplified definition and with a recommended range) along with a new statistic, the ‘systematic standard deviation’. If the standard error is too high, then students of a similar ability are likely to get significantly different marks; if the systematic standard deviation is too low then the iCMA does not discriminate well between students of different abilities.
One of the common causes of a high standard error is that there are too few questions in the iCMA. However, in many cases, the iCMA under consideration forms only part of the assessment of a given student. In these cases, it does not really matter if the standard error of one particular iCMA is relatively high, provided that the standard error of the combined assessments is low. To counter the effect of the number of questions, it may also be useful to quote the standard error multiplied by the square root of the number of questions (again with a recommended range).
Coefficient of internal consistency and error ratio should not be quoted, since these are related to the standard error and the standard deviation and so do not add any further information.
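The redundancy noted here can be made concrete. The sketch below (in Python, with hypothetical function names, not the OU tools) assumes the classical-test-theory reading in which the simplified standard error is SD·√(1−α) and the systematic standard deviation is SD·√α, where α is the coefficient of internal consistency (Cronbach's alpha); this is an illustrative interpretation consistent with the descriptions above, not the consultant's exact definitions:

```python
import statistics

def cronbach_alpha(scores):
    """Coefficient of internal consistency.
    scores: one list of question marks per student (all the same length)."""
    n = len(scores[0])  # number of questions
    item_vars = [statistics.pvariance([s[i] for s in scores]) for i in range(n)]
    total_var = statistics.pvariance([sum(s) for s in scores])
    return (n / (n - 1)) * (1 - sum(item_vars) / total_var)

def error_and_systematic_sd(scores):
    """Split the spread of total scores into an error component (students of
    similar ability getting different marks) and a systematic component
    (genuine discrimination between abilities)."""
    alpha = cronbach_alpha(scores)
    sd = statistics.pstdev([sum(s) for s in scores])
    se = sd * (1 - alpha) ** 0.5   # simplified standard error
    sys_sd = sd * alpha ** 0.5     # 'systematic standard deviation'
    return se, sys_sd
```

Under this reading, SD² = SE² + (systematic SD)², which is why quoting the coefficient of internal consistency and error ratio as well would add no further information.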
Course teams should not be told the skewness or kurtosis since these are used to determine whether data seems to be normally distributed, but here we already know that the overall scores are not normally distributed. Instead, course teams are advised to look at histograms showing the overall distribution of student scores. The upper histogram in Figure 1 shows a reasonable distribution of scores whilst the lower histogram illustrates a situation in which a large number of students have very high scores.
Figure 1 Two histograms demonstrating different behaviours
Course teams are also advised to look at the proportion of students scoring 0, 1, 2 or 3 marks per question. If too many students either get questions right at the first attempt or not at all, then the students did not gain much from being allowed multiple attempts. This may suggest that the feedback in this iCMA is not very useful to students.
The previously calculated statistics were: Facility index (mean), standard deviation, intended and effective weight, discrimination index and discrimination efficiency.
For iCMA questions, the recommendations are:
Course teams should be told the facility index, standard deviation, intended and effective weight and discrimination index of each question, with recommended ranges for facility index and discrimination index.
The facility index is simply the mean score achieved on the question. We recommend using coloured highlighting to alert course teams to questions with facility indices in the following ranges:
| Facility index | Classification |
| --- | --- |
| 95%-100% | Extremely easy |
| 90%-95% | Very easy |
| 85%-90% | Easy |
| <40% | Difficult |
Course teams should consider questions highlighted in this way, in the context of the other questions in the iCMA and the purpose, timing and weighting of the iCMA.
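The highlighting rule amounts to a small band lookup. A minimal sketch (the function name and return labels are ours, purely illustrative of the recommended ranges):

```python
def facility_band(facility):
    """Classify a question's facility index (mean score, as a percentage)
    into the recommended alert bands; None means no alert."""
    if facility >= 95:
        return "extremely easy"
    if facility >= 90:
        return "very easy"
    if facility >= 85:
        return "easy"
    if facility < 40:
        return "difficult"
    return None
```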
The standard deviation indicates the spread of students’ scores about the mean for the question. A low standard deviation might indicate that the question is not effective in discriminating between students.
The intended weight is the maximum mark it is possible to score on a question; the effective weight shows how much the question actually contributes to the spread of scores.
The discrimination index shows how well correlated the student’s scores on each question are with their scores on the rest of the test. It ranges between –100 and 100, though a question with a negative discrimination index would be rather strange, since it would indicate that students who scored low marks on this question scored high marks on the rest of the test, and vice versa. The larger the value of discrimination index, the better this question is as a predictor of how the students will do on the rest of the test. At first sight, it might seem that the higher the discrimination index, the better. Indeed, questions with a high discrimination index will tend to substantially lower the standard error for the iCMA, or raise the systematic standard deviation, which has to be a good thing. However if we consider the extreme case where all the questions are the same, we would expect all questions to have a discrimination index very close to 100, but we might just as well have had only one question on the test. Likewise, if we had a test where all but one question was the same, then the different question would probably have had a low discrimination index.
Again, it is very helpful to look at the proportion of responses to each question which have been marked 0, 1, 2 or 3. The behaviour can be quite extreme, even in questions which appear to be behaving well according to other measures, and different questions can be behaving quite differently even though they have similar facility indices; the questions represented in the two right-hand plots of Figure 2 both have a facility index around 58%.
Figure 2 The proportion of responses marked 0, 1, 2 or 3 for three iCMA questions.
If a higher proportion of students are scoring 0 than are scoring 1 or 2, this may indicate that students do not understand the feedback provided sufficiently well to enable them to correct their previous attempt at the question. Course teams should be alerted when questions exhibit this behaviour.
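The tally and the alert described here might be sketched as follows (function names are illustrative, not the OU implementation; the alert fires when the proportion scoring 0 exceeds both the proportion scoring 1 and the proportion scoring 2):

```python
from collections import Counter

def mark_proportions(marks, max_mark=3):
    """Fraction of responses to a question marked 0..max_mark."""
    counts = Counter(marks)
    return [counts.get(m, 0) / len(marks) for m in range(max_mark + 1)]

def feedback_alert(marks):
    """Flag a question when more responses score 0 than score 1 or 2,
    suggesting students cannot use the feedback to correct themselves."""
    p = mark_proportions(marks)
    return p[0] > p[1] and p[0] > p[2]
```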
A range of tools is now available to determine whether or not all the variants of a question are of equivalent difficulty.
Figure 3 The proportion of students scoring 0, 1, 2 or 3 for each variant of two iCMA questions.
The three tools described above should be used in conjunction with each other. In particular, if you reject the null hypothesis, it is helpful to consider the plots in order to see which variants are causing the problem. The upper plot in Figure 3 is for a question with 7 variants of very similar difficulty (p = 0.97) whereas the lower plot is for a question in which some variants appear to be behaving in differing ways (a view confirmed by the fact that p=0.0045). For this question, variant 3 (facility index 77.78%) appears to be easier than the other variants and variant 2 (facility index 66.30%) is more difficult. Variant 2 of this question is shown in Figure 4.
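The hypothesis test behind the p-values quoted above is not specified in detail on this page. One standard choice matching the description (a test of the null hypothesis that all variants share the same mark distribution) is a chi-square test of homogeneity on the counts of responses scoring 0, 1, 2 or 3 for each variant. A sketch, computing just the statistic and degrees of freedom for comparison against chi-square tables (this is an assumed test, not necessarily the one used):

```python
def variant_chi2(counts):
    """Chi-square statistic for homogeneity of mark distributions across
    variants. counts: dict mapping variant -> [n0, n1, n2, n3], the number
    of responses gaining each mark. Returns (statistic, degrees of freedom);
    look up the p-value in a chi-square table for the returned df."""
    rows = list(counts.values())
    row_tot = [sum(r) for r in rows]
    col_tot = [sum(col) for col in zip(*rows)]
    grand = sum(row_tot)
    stat = 0.0
    for r, rt in zip(rows, row_tot):
        for obs, ct in zip(r, col_tot):
            expected = rt * ct / grand
            if expected:
                stat += (obs - expected) ** 2 / expected
    df = (len(rows) - 1) * (sum(1 for c in col_tot if c) - 1)
    return stat, df
```

A small p-value would then prompt inspection of the per-variant plots to see which variants are discrepant, as described above.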
Figure 4 Variant 2 of the question whose behaviour is illustrated in Figure 3.
Inspection of the actual student responses to all variants quickly alerted the course team to the cause of the problem: the table that students use in the course material refers from mRNA codon to amino acid, but students are frequently looking up the anticodon instead. In the case of variant 3, this results in a stop codon not an amino acid, so presumably students realise they have done something wrong (thus the variant is easier than the others). In the case of variant 2, this results in Gln, which many students misread as Gin – so when they are told that their answer is incorrect they start by correcting Gin to Gln rather than looking for a more fundamental mistake. This causes variant 2 to be more difficult than the other variants.
For the future, variants 2 and 3 will be removed and targeted feedback will be added to all variants for answers that give amino acids found by looking up the anticodon rather than the codon.
The process of investigation followed for this question illustrates the importance of looking at the actual student responses when a problem has been identified by the statistical tools.
Which question type is the most difficult?
Figure 5 shows the performance of students on a variety of different question types across an entire presentation of S104 Exploring science.
Figure 5 The proportions of students scoring 0, 1, 2 or 3 marks per question in various types of question in use in S104.
There are some surprises – note in particular the relative ease of short answer free text questions (described as ‘Free Text’ in Figure 5) despite the fact that these require the respondent to construct a response in natural language and to do so without the benefit of any prompts in the question. This implies a different form of cognitive processing and memory retrieval when compared with selected response items [1] (multiple choice, multiple response and drag and drop). The least well scoring questions are those that require students to enter their answer as a number with appropriate units. This finding has been reported previously [2] and relates to the fact that many students struggle to work out the appropriate units; it is not in the main caused by a simple omission of units.
What causes variants to be of differing difficulty?
Provided care is taken, it is usually possible to write several equivalent variants of questions that require numerical answers. Thus no significant differences were found between the variants of questions in use in S151 Maths for Science (though some variants had already been removed following the realisation that they were lower or higher scoring than others). ‘Taking care’ in this context means, for example, that:
In addition, the following have been shown to result in variants of questions being of significantly different difficulty:
It can be extremely difficult to write non-numerical questions of equivalent difficulty, with multiple choice and multiple response questions causing a lot of difficulty (some answers are more obviously correct or incorrect than others). The most divergence between variants was seen on a question in SK123 Understanding cancers in which students had to say whether each of four statements was true or false. The five variants of the question had been created by combining a selection of statements in different ways, and one of the statements turned out to be more difficult to identify as true or false than the others. However, the situation was compounded by the fact that this statement occurred in one variant of average difficulty as well as in two of significantly greater difficulty than the others. Further investigation showed that the other statements in the ‘average’ variant were all extremely easy to identify as true or false, so if students initially gave the incorrect answer for the rogue statement, they immediately knew which statement to correct when they were told that their answer was incorrect.
The difficulty of writing variants of similar difficulty calls into question the desirability of using different variants of questions in summative use. Yet the use of variants undoubtedly acts to discourage plagiarism.
A puzzle
The question shown in Figure 6 was flagged as having variants of significantly different difficulty. But all variants appeared similar – all required students to do a similar calculation, all required an answer to be rounded up to the same precision and with the same units. Eventually the variant that had been identified as more difficult than the others was found on a ‘homework’ website – with an incorrect answer. This incorrect answer was given by students on a depressingly large number of occasions.
Figure 6 Why is this variant of this question more difficult than the others?
As an extension to the iCMA statistics project, random guess scores were calculated for multiple choice, multiple response and drag and drop questions in a number of different situations (e.g. with different numbers of attempts, different scoring algorithms, different numbers of options to select from and different numbers of options being correct, students being told how many options were correct, or not).
The random guess score for a question is essentially the score that you would expect from someone who is completely logical in working through the question but knows absolutely nothing about the subject matter.
Examples
Consider a multiple choice question in which the user has to select one from four options and has a single attempt. They have a one in four chance of getting the right answer, so the random guess score is 25%.
Now consider the same multiple choice question, but allow the user 4 attempts, with full marks available whenever the user gets the correct answer. The logical user can work through the responses in order, so will always get the correct answer within the four allowed attempts i.e. the random guess score is 100%.
Now consider the same multiple choice question, again with 4 attempts, but now the user scores 4 marks (100%) if they are correct at first attempt, 3 marks (75%) if correct at second attempt, 2 marks (50%) if correct at third attempt and 1 mark (25%) if correct at fourth attempt:
At the first attempt, as with example 1 the user has a 1 in 4 chance of getting the question right, and if they get it right at this attempt then they score 4.
In order to get a mark at the second attempt, the user has to get the question wrong at the first attempt. They have a 3 in 4 chance of doing this. Then they have a 1 in 3 chance of getting the question right, and if they get it right at this attempt then they score 3.
In order to get a mark at the third attempt, the user has to get the question wrong at the first attempt (probability 3/4) and the second attempt (probability 2/3). Then they have a 1/2 chance of getting the question right, and if they get it right at this attempt then they score 2.
In order to get a mark at the fourth attempt, the user has to get the question wrong at the first attempt (probability 3/4), the second attempt (probability 2/3) and the third attempt (probability 1/2). However they are then certain to get this question right, i.e. they have a probability of 1 of doing this, and if they get it right at this attempt then they score 1.
This means that the overall random guess score out of 4 is:
(1/4 × 4) + (3/4 × 1/3 × 3) + (3/4 × 2/3 × 1/2 × 2) + (3/4 × 2/3 × 1/2 × 1 × 1) = 1 + 3/4 + 1/2 + 1/4 = 2.5
This is 2.5 out of 4, i.e. 62.5%.
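The same calculation generalises to any number of options and attempts under proportional scoring. A sketch (hypothetical function name; it assumes a logical guesser who never repeats a wrong option, and that the number of attempts does not exceed the number of options):

```python
from fractions import Fraction

def mcq_random_guess(options, attempts):
    """Expected fraction of full marks for a logical guesser on a
    single-answer multiple choice question, where a correct answer at
    attempt k (1-based) earns (attempts - k + 1)/attempts of full marks."""
    expected = Fraction(0)
    p_reach = Fraction(1)      # probability of still guessing at this attempt
    remaining = options        # options not yet eliminated
    for k in range(1, attempts + 1):
        p_correct = Fraction(1, remaining)
        score = Fraction(attempts - k + 1, attempts)
        expected += p_reach * p_correct * score
        p_reach *= 1 - p_correct
        remaining -= 1
    return expected
```

For the worked example above (four options, four attempts), this gives 5/8, i.e. 62.5%, and for a single attempt it gives the familiar 25%.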
Calculated values of random guess scores for a range of iCMA question types
Spreadsheets are available (under Documents on the right-hand side of the page), giving random guess scores for various types of multiple choice, multiple response and drag and drop questions. However there are many variables that can influence the random guess score, so a random guess score calculator is also provided.
Multiple choice and multiple response questions
‘multiple_response_1’ contains data for multiple choice and multiple response questions where students who get the answer wrong are simply told that their answer is incorrect. ‘multiple_response_2’ contains data for multiple choice and multiple response questions where students are told how many of their initial choices were correct.
In both cases, there are separate tabs for different total numbers of options. Under each tab the rows represent the number of correct options and the columns represent the number of attempts available. The upper table under each tab is for situations where users have been told how many options to select; the lower table is for when they have not been told how many options to select.
Drag and drop questions
‘drag_and_drop_once_only’ contains data for drag and drop questions where each draggable object may be put into at most one box. ‘drag_and_drop_unlimited’ contains data for drag and drop questions where each draggable object may be put into an unlimited number of boxes.
In both cases, there are separate tabs for different numbers of boxes to be filled in. The rows represent the number of choices to put in the boxes and the columns show the number of attempts allowed. These figures assume that if a student does not get the question right then they are told how many of their choices are correct. If this assumption is not valid, the random guess score calculator should be used.
Random guess score calculator
To run the program:
An example of a surprising result
The way in which the random guess score varies with different factors is sometimes surprising and counterintuitive. For example, Figure 7 shows the variation of random guess score for a multiple response question with 6 options, in which the score available at each attempt falls proportionally and in which partial credit is given for answers that are partially correct at the final attempt. When only one option is required, the random guess score increases with number of attempts allowed. However, when more than one option is correct, the random guess score decreases with the number of attempts allowed.
Figure 7 Variation of random guess score with number of correct options and number of attempts available.
This surprising result appears to be a consequence of giving partial credit for partially correct responses. If a completely correct final answer is required, the random guess score is dramatically reduced in some cases.
A selection of COLMSCT and piCETL related papers, presentations and workshops and talks given by Sally Jordan, 2006-2010
Publications and external conference contributions
Jordan, Sally (2007) The mathematical misconceptions of adult distance-learning science students. Proceedings of the CETL-MSOR Conference 2006, edited by David Green. Maths, Stats and OR Network, pp 87-92. ISBN 978-0-9555914-0-2.
Butcher, Philip and Jordan, Sally (2007) Interactive assessment in science at the Open University: 1975 – 2007. Invited oral presentation at ‘Computer-based assessment in the broader physical sciences’: a joint event hosted by the OpenCETL and the Physical Sciences Subject Centre, 26th April 2007.
Jordan, S., Brockbank, B. and Butcher, P. (2007) Extending the pedagogic role of online interactive assessment: providing feedback on short free-text responses. REAP International Online Conference on Assessment Design for Learner Responsibility, 29th-31st May 2007. Available at http://ewds.strath.ac.uk/REAP07
Jordan, Sally (2007) Computer based assessment with short free responses and tailored feedback. Proceedings of the Science Learning and Teaching Conference 2007, edited by Peter Goodhew. Higher Education Academy, pp 158-163. ISBN 978-1-905788-2.
Hudson, Ken and Jordan, Sally (2007) Practitioner scholarship can lead to institutional change – implementing interactive computer based assessment. Oral presentation at ISSOTL 2007, the International Society for the Scholarship of Teaching and learning, 4th Annual Conference, Sydney, 2nd-5th July 2007.
Jordan, Sally (2007) Assessment for learning; learning from Assessment? Oral presentation at Physics Higher Education Conference, Dublin, 6th-7th September 2007.
Stevens, Valda and Jordan, Sally (2008) Interactive online assessment with teaching feedback for open learners. Oral presentation at Assessment and Student Feedback workshop, Higher Education Academy Centre for ICS, London, April 2008.
Jordan, Sally (2008) eAssessment for student learning: short free-text questions with tailored feedback. Workshop at the University of Chester Staff Conference, May 2008.
Swithenby, Stephen and Jordan, Sally (2008) Supporting open learners by computer based assessment with short free-text responses and tailored feedback. Part of an invited symposium on ‘Matching technologies and pedagogies for supported open learning’ at the 6th International Conference on Education and Information Systems, Technologies and Applications, EISTA, Orlando, 29th June – 2nd July 2008.
Jordan, Sally (2008) Assessment for learning: pushing the boundaries of computer based assessment. Assessment in Higher Education Conference, University of Cumbria, July 2008.
Jordan, Sally (2008) Supporting distance learners with interactive screen experiments. Contributed oral presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) Online interactive assessment: short free text questions with tailored feedback. Contributed poster presentation at the American Association of Physics Teachers Summer Meeting, Edmonton, July 2008.
Jordan, Sally (2008) E-assessment for learning? The potential of short free-text questions with tailored feedback. In invited symposium ‘Moving forward with e-assessment’ at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Butcher, Philip and Hunter, Arlene (2008) Online interactive assessment for open learning. Roundtable discussion at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Brockbank, Barbara, Jordan, Sally and Mitchell, Tom (2008) Investigating the use of short answer free-text eAssessment questions with tailored feedback. Poster presentation at the Fourth biennial EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference (ENAC 2008), Potsdam, August 2008.
Jordan, Sally, Brockbank, Barbara, Butcher, Philip and Mitchell, Tom (2008) Online assessment with tailored feedback as an aid to effective learning at a distance: including short free-text questions. Poster presentation at 16th Improving Student Learning Symposium, University of Durham, 1st-3rd September 2008.
Hatherly, Paul; Macdonald, John; Cayless, Paul and Jordan, Sally (2008) ISEs: a new resource for experimental physics. Workshop at the Physics Higher Education Conference, Edinburgh, 4th-5th September 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. OpenCETL Bulletin, 3, p4.
Jordan, Sally (2008) Online interactive assessment with short free-text questions and tailored feedback. New Directions, 4, 17-20.
Jordan, Sally (2009) Online interactive assessment in teaching science: a view from the Open University. Education in Science, Number 231, 16-17.
Jordan, Sally and Mitchell, Tom (2009) E-assessment for learning? The potential of short free-text questions with tailored feedback. British Journal of Educational Technology, 40, 2, 371-385.
Hatherly, Paul, Jordan, Sally and Cayless, Alan (2009) Interactive screen experiments – innovative virtual laboratories for distance learners. European Journal of Physics, 30, 751-762.
Butcher, P.G., Swithenby, S.J. and Jordan, S.E. (2009) E-Assessment and the independent learner. 23rd ICDE World Conference on Open Learning and Distance Education, 7-10 June 2009, Maastricht, The Netherlands.
Jordan, Sally (2009) Assessment for learning: pushing the boundaries of computer based assessment. Practitioner Research in Higher Education, 3(1), pp 11-19. Available online at http://194.81.189.19/ojs/index.php/prhe
Jordan, Sally. (2009) An investigation into the use of e-assessment to support student learning. Assessment in Higher Education Conference, University of Cumbria, 8th July 2009. Available online at http://www.cumbria.ac.uk/Services/CDLT/C-SHEN/Events/EventsArchive2009.aspx
Jordan, Sally and Brockbank, Barbara (2009) Online interactive assessment: short free text questions with tailored feedback. Oral presentation at GIREP-EPEC, August 2009.
Jordan, Sally and Butcher, Philip. (2009) Using e-assessment to support distance learners of science. Oral presentation at GIREP-EPEC, August 2009.
Hatherly, Paul, Jordan, Sally and Cayless, Alan. (2009) Interactive screen experiments – connecting distance learners to laboratory practice. Oral presentation at GIREP-EPEC, August 2009.
Butcher, P.G & Jordan, S.E. (2010) A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, 489-499. DOI: 10.1016/j.compedu.2010.02.012
Jordan, Sally (2010) E-assessment for learning and learning from e-assessment: short-answer free text questions with tailored feedback. Presentation and workshop at the HEA Physical Sciences Centre event ‘The future of technology enhanced assessment’, Royal Society of Chemistry, Burlington House, London, 28th April 2010.
Jordan, Sally (2010) Short answer free text e-assessment questions with tailored feedback. Invited seminar to Human Computer Interaction group at the University of Sussex, 21st May 2010.
Jordan, Sally (2010) Maths for science for those with no previous qualifications: a view from the Open University. HEA Physical Sciences Centre ‘Maths for Scientists’ meeting, 26th May 2010.
Jordan, Sally (2010) Student engagement with e-assessment questions. Poster at the 2010 International Computer Assisted Assessment (CAA) Conference, Southampton, July 2010.
Jordan, S. and Butcher, P. (2010) Using e-assessment to support distance learners of science. In Physics Community and Cooperation: Selected Contributions from the GIREP-EPEC and PHEC 2009 International Conference, ed. D. Raine, C. Hurkett and L. Rogers. Leicester: Lulu/The Centre for Interdisciplinary Science. ISBN 978-1-4461-6219-4, pp 202-216.
Jordan, Sally (2010) Do we know what we mean by ‘quality’ in e-assessment? Roundtable discussion at the EARLI (European Association for Research on Learning and Instruction)/Northumbria Assessment Conference, Northumbria, September 2010.
Jordan, Sally, Butcher, Phil, Knight, Sarah and Smith, Ros (2010) ‘Your answer was not quite correct, try again’: Making online assessment and feedback work for learners. Workshop at ALT-C 2010 ‘Into something rich and strange – making sense of the sea change’, September 2010, Nottingham.
Jordan, Sally (2010) Using simple software to generate answer matching rules for short-answer e-assessment questions in physics and astronomy. Oral presentation at the Physics Higher Education Conference, University of Strathclyde, September 2010.
Butcher, P.G. & Jordan, S.E. (in press) Featured case study in JISC Effective Practice Guide, Summer 2010.
Contributions to internal (OU) conferences, meetings and workshops
Jordan, Sally (2006) An analysis of science students’ mathematical misconceptions. Poster presentation at 1st OpenCETL Conference, 8th June 2006.
Jordan, Sally (2006) OpenMark – what’s all the fuss about? Lunchtime seminar at Cambridge Regional Centre, 1st November 2006.
Jordan, Sally (2007) Using interactive online assessment to support student learning. Faculty lunchtime seminar, 30th January 2007.
Jordan, Sally (2007) Issues and examples in online interactive assessment. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions. Seminar at Physics Education Innovations in Practice 2007 (πCETL AL Conference), 28th April 2007.
Jordan, Sally (2007) Assessment for learning; learning from assessment? Paper presented at the Curriculum, Teaching and Student Support Conference, 1st May 2007.
Jordan, Sally and Brockbank, Barbara (2007) Extending the pedagogic role of online interactive assessment: short answer free text questions. Paper presented at the Curriculum, Teaching and Student Support Conference, 2nd May 2007.
Jordan, Sally (2007) Investigating the use of short answer free text questions in online interactive assessment. Presentation at the Science Staff Tutor Group residential meeting, 9th May 2007.
Jordan, Sally (2007) OpenMark: online interactive workshop. Workshop run at AL Staff Development meeting in Canterbury, 12th May 2007.
Brockbank, Barbara, Jordan, Sally and Butcher, Phil (2007) Investigating the use of short answer free text questions for online interactive assessment. Poster presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Students’ mathematical misconceptions: learning from online assessment. Oral presentation at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally (2007) Using interactive screen experiments in our teaching: the S104 experience and The Maths Skills ebook. Demonstrations at 2nd OpenCETL Conference, 15th October 2007.
Jordan, Sally, Ekins, Judy and Hunter, Arlene (2007) eAssessment for learning?: the importance of feedback. Symposium at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally, Brockbank, Barbara and Butcher, Phil (2007) Authoring short answer free text questions for online interactive assessment: have a go! Workshop at 2nd OpenCETL Conference, 16th October 2007.
Jordan, Sally (2008) Investigating the use of short answer free-text questions in online interactive assessment. Oral presentation to EATING (Education and Technology Interest Group), 17th January 2008.
Jordan, Sally (2008) Investigating the use of short free-text eAssessment questions. Oral presentation to ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally (2008) Writing short free-text eAssessment questions: have a go! Workshop at ‘Assessment for Open Learning’ Symposium, Stroud, March 2008.
Jordan, Sally and Brockbank, Barbara (2008) Writing free text questions for online assessment: have a go! Workshop at the Open University Conference, 29th and 30th April 2008.
Jordan, Sally (2008). Investigating the use of short answer free text questions in online interactive assessment. Science Faculty Newsletter, May 2008.
Jordan, Sally and Johnson, Paul (2008) E-assessment opportunities: using free text questions and others. Science Faculty lunchtime seminar followed by workshop, 16th July 2008.
Jordan, Sally and Datta, Saroj (2008) Presentation on Open University use of Interactive Screen Experiments at ISE Launch event, 19th September 2008.
Jordan, Sally, Butler, Diane and Hatherly, Paul (2008) CETL impact: an S104 case study. Series of linked presentations to 3rd OpenCETL Conference, September 2008. [reported in OpenHouse, Feb/March 2009, ‘S104 puts projects into practice’, p4]
Jordan, Sally and Johnson, Paul (2008) Using free text e-assessment questions. Science Faculty lunchtime seminar followed by workshop, 26th November 2008.
Butcher, Phil, Jordan, Sally and Whitelock, Denise (2009) Learn About Formative e-Assessment. IET EPD Learn About Guide.
Butcher, Phil and Jordan, Sally (2009) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 21st January 2009.
Jordan, Sally (2009) E-assessment to support student learning: an investigation into different models of use. Paper presented at Making Connections Conference, 2nd-3rd June 2009.
Jordan, Sally (2009) (ed) Compilation of interim and final reports on Open University Physics Innovations CETL projects: Assessment.
Butcher, Phil and Jordan, Sally (2009) A comparison of human and computer marking of short-answer free-text student responses. Presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2009) Interpreting the iCMA statistics. Presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Nix, Ingrid, Waights, Verina, Bolton, John and Butcher, Phil (2009) From CETL to course team: embedding iCMA initiatives. Workshop at 4th OpenCETL Conference, December 2009.
Jordan, Sally, Butler, Diane, Hatherly, Paul and Stevens, Valda (2009). From CETL to Course Team: CETL-led initiatives in S104 Exploring Science. Poster presentation at 4th OpenCETL Conference, December 2009.
Jordan, Sally (2009) Student engagement for e-assessment. Poster presentation at 4th OpenCETL Conference, December 2009.
Butcher, Phil and Jordan, Sally (2010) ‘Expert’ presentation on ‘Learn about e-assessment: formative assessment’ at Learn About Fair, 10th February 2010.
Jordan, Sally (2010) Workshops on using OpenMark PMatch to write short-answer free-text questions, 19th January 2010 and 17th February 2010.
Jordan, Sally, Nix, Ingrid, Wyllie, Ali and Waights, Verina (2010) Science Faculty lunchtime forum, March 25th 2010.
Jordan, Sally (2010) e-Assessment. In e-Learning in Action: projects and outcomes from the Physics Innovations CETL, pp16-20.
Jordan, Sally (2010) (ed) Compilation of final reports on Open University Physics Innovations CETL projects: e-Assessment.
I am a Staff Tutor in Science at the Open University's Regional Centre in Cambridge, with responsibility for the running of various science courses in the East of England. I am a member of the OU Department of Physics and Astronomy. I was lucky enough to hold teaching fellowships in two of the OU's Centres for Excellence in Teaching and Learning: piCETL (the Physics Innovations Centre for Excellence in Teaching and Learning) and COLMSCT (the Centre for Open Learning in Mathematics, Science, Computing and Technology).
Following the end of the OU CETLs in July 2010, I will blog about my ongoing work and thoughts, mostly on e-assessment, at http://www.open.ac.uk/blogs/SallyJordan/
I was a member of the course teams producing S104: Exploring Science (presented for the first time in February 2008) and S154: Science Starts Here (presented for the first time in October 2007). I had responsibility for the development of maths skills in these courses and for the integration of interactive computer marked assignments using the Open University’s OpenMark eAssessment system. I am currently course team chair of S104 and from January 2011 I will be course team chair of S154. I have also been (off and on!) course team chair of S151: Maths for Science, which reflects my great interest in investigating ways to help students to cope with the maths they need in order to study physics and other science courses. I was also on the Course Team of SXR103: Practising Science for many years.
My COLMSCT and piCETL projects reflect my interests in eAssessment and science students’ mathematical misconceptions. I have investigated the use of short free-text eAssessment questions (with instantaneous feedback, wherever possible tailored to students’ misconceptions) and have applied similar evaluation methodologies to other types of eAssessment tasks, to investigate ways in which iCMAs can most effectively be used to support student learning. Analysis of students’ responses to eAssessment questions provides an opportunity for teachers to learn more about students’ misunderstandings. I have used this methodology to learn more about science students’ mathematical misconceptions. I have developed a ‘Maths Skills eBook’, which is used as a resource on many Open University science courses, and I have evaluated the effectiveness and student use of an Interactive Screen Experiment (ISE) developed with piCETL funding for use by S104 students who are not able to conduct one of the course’s home experiments for themselves.
In 2006 I was awarded an Open University Teaching Award for my work in developing S151: Maths for Science and its interactive online assessment system. In 2010 I was one of the OU's nominees for a National Teaching Fellowship.
I have tutored various science and maths courses and have taken a few OU courses as a student. I live in West Norfolk with my husband; my children are both post-graduate students at (conventional) universities. In case you hadn't realised the significance of the shared surname, it was my daughter Helen who did the work on iCMA statistics and random guess scores in Summer 2009. Helen completed an M.Phil in Statistical Science at the University of Cambridge in July 2010 and starts a PhD in statistics at Warwick University in autumn 2010.
My hobbies include walking long-distance footpaths and writing about this (see http://sites.google.com/site/jordanwalks/) and singing in a local choir.
Sally Jordan, COLMSCT Teaching Fellow
S.E.Jordan@open.ac.uk
Nurses are required to make clinical decisions about patients' health and well-being, responding to changes in each patient's condition, which may occur within very small time-frames.
The ability to make clinical judgements depends on both a sound theoretical background and good decision-making skills and practice-based learning enables students to develop these skills at the same time as they acquire the necessary underpinning knowledge.
The aim of this project is to develop a web-based tool to assess practice-based learning, building on Laurillard (2002), who suggests that ‘Traditional modes of assessment of knowledge are seen as inadequate because they fail to assess students’ capability in the authentic activities of their discipline.’ Such a tool provides an alternative to mentor-led practice assessment, which can bring its own problems due to the tension between nurturing and judgement (Yorke, 2005).
In addition, it complements reflective writing as a method of assessing students’ evolving professional practice, that is to say their integration of theory and practice, which is so vital when preparing students for professional registration. We intend the tool to be reusable in a variety of settings, incorporating a range of resources as appropriate to each setting. The tool will be piloted by invitation only (independent of course assessment) with a cohort of Adult Nursing students, with the aim of incorporating the tool in both Adult and Mental Health Nursing Programmes.
Verina Waights’ biography can be found at the following link:
http://www.open.ac.uk/hsc/people/profile.php?name=Verina_Waights
Further details about publications and conference presentations by Verina Waights can be found in ORO. Follow the link below and then search for Verina Waights in the Author box.
Readers might like to try the Clinical Decision-making tool https://students.open.ac.uk/openmark/cdm.projectworld
Opening screen of the tool, giving details of the patient’s situation and showing the links to the resources, the initial decision options and the text box for students to submit the rationale for their initial decision.
Verina Waights and Ali Wyllie
V.Waights@open.ac.uk and A.J.Wyllie@open.ac.uk
Investigation into 'Elearning communities and identities' encompasses technological, educational, psychological and sociological issues. While all of these issues frequently interlink to enable or inhibit learning, it is important to focus on key elements in our research into teaching and learning. These projects seek to enable support, collaboration, dissemination and networking possibilities both within and outside the OU. The key areas of activity are:
Wendy Fisher, COLMSCT Teaching Fellow
The use of Tablet personal computers (PCs) to mark paperless assignments was designed to support lecturers in writing quality feedback, positioned at the point of learning, to engage students. The project ran over eighteen months, and a perceptual evaluation assessed lecturers’ and students’ attitudes to the use of technology in the assessment process. Following the evaluation, the findings were disseminated through presentations and papers to mathematics, science, computing and technology lecturers in higher education institutions within and outside the United Kingdom.
Ken Platts, COLMSCT Associate Teaching Fellow
The project was defined in response to feedback and complaints from students and colleagues about the increasingly common use of DVD technology, which requires either prolonged use of desktop or laptop computers or the personal printing of large amounts of course material so that it can be used in such places as trains, buses, aircraft or at work.
Val Hancock
The Open University's Virtual Learning Environment (VLE) makes many learning tools and services available via the internet but not everyone has easy access to the internet (e.g. students in the armed forces, prisons, residential care or hospitals). This project aimed to make more courses available to prisoners and other students who would otherwise be excluded from studying because they did not have internet access.
Stanley Oldfield, COLMSCT Teaching Fellow
In recent years there has been increasing emphasis on the development of communication and collaboration skills within OU undergraduate courses. This has arisen from the general adoption of the Skills agenda, partly as a result of pressure from employers and professional bodies to ensure that by graduation students possess more than knowledge of their subject, or even the ability to apply that knowledge to real-world situations.
Hilary Cunningham-Atkins, COLMSCT Associate Teaching Fellow
In October 2005 T171 reached the end of its final presentation, after one pilot presentation and nine full presentations. The aim of this project is to collate the vast amount of experience of online teaching and learning amassed by T171 tutors.
Giselle Ferreira
This work was carried out between October 2007 and December 2008.
Stuart Slater, COLMSCT Associate Teaching Fellow
The original scope of the project involved developing tools, techniques and applications to enable both staff and students to begin developing client-side applications for their own mobile devices.
Keith Beechener, COLMSCT Associate Teaching Fellow
Using email, electronic communications and other emerging online tools to support an effective interaction between distance learners and tutors. This project is based on the experiences of tutoring students on first level Open University courses.
Kath Clay
The concept of a subject community is one that enables both students and academic staff to have some continuity, and to share general skills and knowledge, outwith the framework/content of any particular course.
Rob Parsons, COLMSCT Associate Teaching Fellow
This project aims to examine the interactions of moderators and students on two Open University, online technology courses in order to determine what kind of activities and approaches succeed in keeping students engaged and moving forward with the course.
Dave Hubble, COLMSCT Associate Teaching Fellow
This project aims to investigate levels of student engagement in e-learning activities, relate this to academic performance and ultimately develop methods for improving the level of engagement where this is currently low.
Diane Butler, COLMSCT Teaching Fellow
The aim of this project is to improve the integration of online activities into Open University courses.
Karen Shipp, COLMSCT Teaching Fellow
Investigating the impact of reduced face-to-face interaction on the underlying dynamics of the OU teaching system and on the relationships that maintain motivation and retention, in order to identify appropriate interventions.
Andy Diament, COLMSCT Associate Teaching Fellow
A study to identify, describe and disseminate good practice for synchronous online tutorials, specifically for teaching in Maths/Science/Computing/Technology Areas.
Anne Pike, COLMSCT Associate Teaching Fellow
This project was designed to investigate, and find ways to improve, the distance learning experience of the OU student in prison.
Alice Peasgood, COLMSCT Teaching Fellow
My research explores students’ everyday experiences of ICT or new technologies, and how they apply those to their learning. The students are adults within a widening participation (WP) programme of distance education courses, the Openings Programme.
Jill Shaw
The aim of this project is to investigate students' increased use of collaborative learning resources via fOUndIt (http://foundit.open.ac.uk). Additionally it aims to address the issue of course resources becoming outdated, by exploring the use of fOUndIt as a gathering point for user-contributed news, ideas and resources.
Breen Sweeney
Research was carried out to investigate how mathematics can be taught in a virtual world.
John Woodthorpe
To evaluate a range of Web 2.0 technologies for the automated and semi-automated generation, delivery and maintenance of electronic resources in courses and the wider OU community.
Karen Kear
An investigation of wikis and audio conferencing for supporting students’ collaborative learning in online course settings.
Anna Peachey, COLMSCT Associate Teaching Fellow
One of the fastest growing areas of online teaching and learning at the moment is the use of virtual worlds or MUVEs (Multi User Virtual Environments). The Open University has been working in Second Life, arguably the most versatile virtual world for education purposes, since 2006.
Shailey Minocha
The aim of the project is to investigate the pedagogical effectiveness of 3D Multi-User Virtual Environments (MUVEs) such as Second Life and their role in enhancing the student's learning experience.
Liz Thackray
Immersive Virtual Worlds (IVWs) have gained a great deal of attention in the education community in recent years.
Stephen Burnley, COLMSCT Teaching Fellow
Students taking “Environmental monitoring, modelling and control” (T308) are required to undertake an environmental assessment project as their end of course assessment. The project integrates the various themes of the course and provides experience of the work carried out by many environmental technologists.
A critical review of the use of enabling technologies for commentary on both on-line assessment and on-line tutorials, with reference to both technical aptitude and motivation to use new technologies in e-assessment and e-learning processes.
The use of Tablet personal computers (PCs) to mark paperless assignments was designed to support lecturers in writing quality feedback, positioned at the point of learning, to engage students. The project ran over eighteen months, and a perceptual evaluation assessed lecturers’ and students’ attitudes to the use of technology in the assessment process. Following the evaluation, the findings were disseminated through presentations and papers to mathematics, science, computing and technology lecturers in higher education institutions within and outside the United Kingdom.
The project was funded by the Centre for Open Learning in Mathematics, Science, Computing and Technology (COLMSCT) Centre for Excellence in Teaching and Learning (CETL) at the Open University (OU). Working in partnership with the Faculty of Technology, and latterly the Technology and Education Research Group, a comparative study was designed to look at the pedagogical challenges lecturers face when marking paperless assessment. The OU is a distance learning organisation, and correspondence tuition, in which a lecturer provides the student with feedback on their assignments, is the main vehicle for teaching the subject.
The Formative Assessment in Science Teaching (www.open.ac.uk/fast) project identified eleven conditions under which assessment supports student learning. Gibbs (2003) identified assessment tactics, including the quantity, timing and quality of feedback, as being important. Lecturers provided students with feedback on paperless assignments using both desktop and tablet personal computers (PCs), the latter being much like an electronic pad of paper, and the impact of these technologies on the quality and quantity of the feedback was studied.
One major outcome of the project was a model for understanding the process and pedagogy of how lecturers mark and annotate paperless assignments; many lecturers reported a symbiotic relationship with their Tablet PCs. A very significant finding, verified using feedback coding, is that although lecturers gave less feedback when using a Tablet PC, their comments were much deeper than those made with a desktop PC.
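The feedback-coding comparison described above can be illustrated with a minimal sketch. The coding scale (1 = surface correction, 3 = deep explanatory comment) and the sample data below are invented for illustration; they are not the project's actual coding scheme or results.

```python
# Hypothetical sketch of a feedback-coding tally: comment counts and mean
# "depth" rating per marking device. All data here is invented.
from statistics import mean

# (device, depth rating) for each feedback comment on a set of scripts
coded_comments = [
    ("desktop", 1), ("desktop", 1), ("desktop", 2), ("desktop", 1),
    ("desktop", 2), ("desktop", 1),
    ("tablet", 3), ("tablet", 2), ("tablet", 3), ("tablet", 3),
]

for device in ("desktop", "tablet"):
    depths = [d for dev, d in coded_comments if dev == device]
    print(f"{device}: {len(depths)} comments, mean depth {mean(depths):.2f}")
```

With this invented sample, the tablet produces fewer comments but a higher mean depth, mirroring the shape of the finding reported above.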
Another outcome has been a series of workshops and expert forums giving lecturers practical, hands-on experience of using a Tablet PC in their own teaching and learning. The workshop has also been extended to include presentations, in collaboration with two other COLMSCT teaching fellows, Keith Beechener and Bill Tait, of our research findings on ‘Making the technology fit the pedagogy’.
References
For further details about the project, the 'Final Project Report' and related papers can be accessed from Related Resources.
Reports
Internal Presentations and Papers
External Presentations and Papers
Tablet PC Manual
Questionnaires
Wendy Fisher, Staff Tutor in Technology, The Open University in Yorkshire and the Humber. As a COLMSCT fellow, my research will help inform those involved with the virtual learning platforms at the Open University of the processes involved in using digital enabling technologies in e-learning and e-assessment.
For an update on my project, see my blog at http://assessmentwithtabletpcs.blogspot.com/
The project was defined in response to feedback and complaints from students and colleagues about the increasingly common use of DVD technology that requires either prolonged use of desktop or laptop computers or the personal printing of large amounts of course material in order that it can used in such places as trains, buses, aircraft or at work.
Content
The aim of the project is to:
The research methods employed are:
a) A literature review regarding technologies and infrastructures used to support open and distance learning, particularly in the United Kingdom.
b) Questionnaires completed by students and tutors on undergraduate and postgraduate courses. The students sampled were undergraduates studying ICT technologies as part of a degree course and postgraduates studying project management. A number of HP iPAQs were loaned to students and their reactions recorded.
c) A number of in depth interviews with students, tutors and subject experts have taken place.
The use of mobile devices in the part time distance learning field for open and distance learning remains low in the UK. The factors seem to be:

Figure 1 Student Usage of 'on the move' devices

Figure 2 Student engagement with different devices
It appears students are not as supportive of mobile learning using PDAs and smartphones; these have had a mixed reception in student feedback.
The results of my research to date have been mixed; although there is some enthusiasm for podcasting, there appear to be issues concerning the value and acceptance of some types of learning devices and techniques, particularly amongst part-time distance learners.
The rate of acceptance of mobile learning by the OU community may well be different to that envisaged in the published research (eg Kukulska-Hulme & Traxler 2005) and reported by the Athabasca University, Canada’s Open University at the M Libraries conference in 2007.
There appears to be a growing number of young, ICT-competent students who are totally at ease with mobile learning, but there are still a large number of more traditional OU students who are less willing to engage with learning on the move.
My final report is currently in outline stage, with a number of the sections requiring revision to allow for the changes in technologies since 2006. These changes will include the use of mobile broadband, the launch of the iPhone and its clones, the evolution of OLPC (One Laptop per Child) technology (e.g. the Asus Eee) and growing acceptance of the netbook concept. Despite the rapid rate of change in technology and devices, and the substantial growth (40%) in Windows Mobile based devices in 2007 (Gartner 2007), a further 2007 survey showed that only 5% of American users (i.e. early adopters) used a mobile device to access the internet. More worryingly, a survey by Reevoo published in 2009 identifies user resistance to touchscreen interfaces in mobile phones.
The feedback from the COLMSCT project trials is also mixed. The use of podcasts is generally welcomed by the respondents and regarded as a more immediate and personal communication. However, PDAs and smartphones are not as popular; these have been reported by triallists as difficult and expensive to use, with the ‘on the move’ benefits of mobile access difficult to achieve due to battery life, technical difficulties, inadequate screen sizes and interface methods. The advantages of podcasting technology to students who do not attend tutorials (e.g. those overseas) and those requiring specific attention (e.g. TMA feedback, revision hints and tips) seem clear, and the use of this technology to provide a more immediate type of feedback and presentation is welcomed. Despite this, it seems that many students welcome the supplementary information provided, and the ability of mobile learning to allow the A/L to provide this securely and without delay is a significant factor.
While the use of mobile learning should be encouraged for distance learning, there is a need for greater research into its uses, the issues of pace and scale, its capabilities and limitations, and the ways students use, and choose not to use, mobile devices. This research needs to be allied with the identification of improvements in delivery, i.e. the ease of use of the devices and the supporting infrastructure.
The COLMSCT research showed that a number of suitable techniques could be usefully trialled and analysed and I have used text messaging, podcasting, smartphones and PDAs both as part of my project and as a member of another Mobile Learning research review by Jane Lunsford.
I have disseminated the results so far at course conferences (the T209 Information and communication technologies: people and interactions review, the M865 Project management rewrite, etc.) and drawn attention to the positive benefits of podcasting for group Tutor Marked Assignment (TMA) feedback and examination preparation, and to the relative simplicity of the production technique for Associate Lecturers. I have recommended its use at sessions with Staff Tutors and Course Chairs.
Positive feedback in questionnaires from T209 and M865 students mirrors the public popularity and growing availability of media podcasts and downloads, e.g. from the BBC, The Times, the Independent, etc.
As part of my completed COLMSCT project I intend to present my findings to the T209 and M865 course teams as well as making presentations at conferences.
As an Associate Lecturer on the technology courses T175 Networked living: exploring information and communication technologies and T209 Information and communication technologies: people and interactions I see a wide range of ICT skills and competencies in the level 1 and 2 students and the computing requirements of the technology courses.
The degree of ICT competence is a significant factor in the individual student’s approach to part-time distance learning, and this can be particularly difficult for OU students based overseas and those attempting to balance busy business lives with their personal commitments; studies are often fragmented in an effort to meet the requirements of their courses. In my introductory conversations with students I have also become aware that the once familiar ‘on the move’ OU study I experienced as a part-time student is often no longer feasible, due to changes in the nature of the course material (e.g. DVD format, course websites) and the means of access to host sites. In addition, technical security barriers (e.g. firewalls, group policies) often restrict students’ access to OU study material from the work environment.
Since I felt this was an important feature in modern OU study I applied for and was awarded a COLMSCT fellowship in 2006 to carry out research in mobile learning. Earlier reading of research in this area indicated that mobile learning could be a significant benefit to many OU students and staff, not only those in the technology field. I felt, as a user of technology over a number of years that the benefits arising from the use of mobile devices by the OU community would be clear and unequivocal.
I was recently awarded an OU Teaching Award (2009), nominated by the Mathematics, Computing and Technology Faculty, for my COLMSCT Fellowship work evaluating how students can be supported with hand-held technologies for Open University Technology study purposes.
The Open University's Virtual Learning Environment (VLE) makes many learning tools and services available via the internet but not everyone has easy access to the internet (e.g. students in the armed forces, prisons, residential care or hospitals). This project aimed to make more courses available to prisoners and other students who would otherwise be excluded from studying because they did not have internet access.
Most prisoners don’t have access to the internet and, as a consequence, many Open University course components are not available to those studying in prison. As course teams increase their use of the VLE, prisoners are finding that substantial parts of the Open University student experience are becoming inaccessible and their course choice is severely restricted. The university has been educating prisoners since the 1970s. If this is to continue, alternative approaches must be found for course components that require internet access.
For some time, the university's prison tutors have been contriving ways to help their students in prison overcome the obstacles that lack of internet access creates. This work has been done independently, with no way to communicate the techniques that have been devised to other tutors. The first phase of this project, therefore, saw the creation of a Prison Tutors forum and Prison Tutor Support wiki to facilitate communication between tutors and foster a nationwide, cross-faculty community of Open University prison tutors. Forum discussion covers many topics, not just those related to lack of internet access. The wiki is an evolving resource for Open University prison tutors, created by those selfsame tutors. It is used for recording such information as the computing facilities in individual prisons, security clearance processes and the days and times when tutorials are possible, as well as documenting existing alternative approaches. The success of both these resources means that they will continue in existence beyond the end of the project.
The second stage of the project involved trialling alternative approaches on a specific course and comparing the teaching and learning experiences of tutors and their students in prison. Nine offender learners in seven prisons registered for the course. Six students were interviewed at the start of the course. The interviews highlighted which course components would be difficult to undertake in each prison. The subsequent alternative approaches that were developed reflected the various computing facilities offered by the different prisons. The interviews were part of a staged approach to the research and responses fed into the development of questions for questionnaires that were completed by the students' tutors at the end of the course. These questionnaires established that, despite the difficulties they had encountered, all the tutors would be happy to support another student in prison. A course tutor pack was developed to support tutors on future presentations of the course. The pack contained such information as potential problem areas, liaising with prison staff and, most importantly, alternative approaches to course activities that required internet access.
To keep the task of providing alternative approaches manageable, the trial developed a framework that defined activities as Essential, Desirable or Optional (EDO). For essential course components an alternative approach had to be found to enable the student to complete the course successfully. Although students could pass the course without the desirable components, their overall grade was likely to be affected. Therefore, every effort was made to provide alternatives to as many desirable components as possible. Optional components were those either clearly marked as optional in the course or those where it was known that a large proportion of students outside prison did not engage with the activity. Alternatives to optional components were only provided if they required minimal effort.
By using the EDO framework, tutors could immediately see where they, or another intermediary, needed to deliver an alternative approach. As centralised support was available, tutors could select from a range of possible alternative approaches and choose one that was appropriate for their student.
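The EDO triage described above can be sketched as a simple classification rule. The activity names, field names and ratings below are invented examples, not part of the project itself.

```python
# Hypothetical sketch of the EDO (Essential/Desirable/Optional) framework:
# decide whether an internet-based course activity needs an offline
# alternative for a student in prison. All examples are invented.
from dataclasses import dataclass

ESSENTIAL, DESIRABLE, OPTIONAL = "essential", "desirable", "optional"

@dataclass
class Activity:
    name: str
    rating: str          # one of ESSENTIAL / DESIRABLE / OPTIONAL
    needs_internet: bool

def needs_alternative(activity: Activity) -> bool:
    """An alternative must be found for essential internet-based
    components, and every effort made for desirable ones; optional
    components are only covered if the effort is minimal."""
    if not activity.needs_internet:
        return False
    return activity.rating in (ESSENTIAL, DESIRABLE)

activities = [
    Activity("end-of-course assessment upload", ESSENTIAL, True),
    Activity("tutor-group forum discussion", DESIRABLE, True),
    Activity("optional extra reading links", OPTIONAL, True),
    Activity("printed course book chapters", ESSENTIAL, False),
]

for a in activities:
    status = "alternative needed" if needs_alternative(a) else "no action"
    print(f"{a.name}: {status}")
```

A tutor (or other intermediary) could run such a triage over a course's activity list to see at a glance where an alternative approach must be delivered.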
Using the EDO framework to rate the internet-based course activities as essential, desirable or optional is a new learning design process that has applications beyond those associated with internet access. Further research is needed to identify other learning scenarios where the framework can usefully be applied. Possible applications include
Further information about the Open University's courses available to offender learners can be found on the OU Prison Education site
Sample VLE tutor group forum content
'Alternative approaches to online activities in a prison environment' presentation at Open CETL conference, Milton Keynes 2008
'Breaking through the Internet barrier - studying with the OU in prison' poster at Open CETL conference, Milton Keynes 2008
Sample FirstClass tutor group forum content
'An objective evaluation of online forum usage amongst prison tutors in a distance learning environment' presentation at Education and Information Systems, Technologies and Applications (EISTA) conference, Orlando 2009
'Excluded from society – included in The Open University' poster at Making Connections conference, Milton Keynes 2009
'No internet? No problem!' presentation at Alt-C conference, Manchester 2009
'No internet? No problem!' short paper abstract for Alt-C conference, Manchester 2009
'One size doesn't fit all' presentation at Making Connections conference, Milton Keynes 2009
'Supporting students in prison – cottage industry or big business?' abstract for presentation at Open CETL conference 2009
'Supporting students in prison – cottage industry or big business?' presentation at Open CETL conference 2009
'What’s the alternative? Widening participation by making VLE-based distance learning available to those without internet access' presentation at The Cambridge International Conference on Open and Distance Learning, Cambridge 2009
In recent years there has been increasing emphasis on the development of communication and collaboration skills within OU undergraduate courses. This has arisen from the general adoption of the Skills agenda, partly as a result of pressure from employers and professional bodies to ensure that by graduation students possess more than knowledge of their subject, or even the ability to apply that knowledge to real-world situations.
Current attempts to encourage the ability to work with others range from engagement in simple communication activities using email or FirstClass conferencing, through participation in various collaborative and group-work activities, to engagement in fully fledged team-working activities.
The recently launched course M253: Teamworking in distributed environments, which had its first presentation from February to August 2005, was an attempt by the Department of Computing to give students experience of working in virtual teams, with no face-to-face contact whatsoever. It took a fairly innovative and experimental approach to the problem of how to deliver team-working skills at a distance, and inevitably invites reflection on how the process can be improved and on how lessons learned from our initial work can be incorporated into future presentations.
The BEST project was intended to analyse material from the first presentation of M253 in terms of both student and tutor experiences, using, amongst other resources, the archived team conferences to understand how effectively the mechanisms for establishing and maintaining teams, and for ensuring student participation in their shared tasks, have operated. This analysis was combined with other inputs: a survey of existing collaborative activities within current and planned OU courses, a wider review of the literature on virtual teams, interviews with professionals in organisations that operate virtual teams, and an investigation of more appropriate technology for supporting virtual teams. From these it was expected that valuable advice could be offered to the wider community interested in giving students an effective team-working experience, on the best ways of incorporating such activities into their courses.
The majority of the conclusions of this project can be found by following the links to the set of papers and presentations to be found on the Dissemination Activities page of this website.
This project has provided time and funding to stand aside from the routine work of course production and presentation and to consider in depth the implications of what, as an innovative course team, we have been attempting to deliver over the last two or three years.
I have been able to investigate the wider literature on developing team working skills in students, and the associated impact on approaches to tutoring - to look in more detail at the theoretical underpinnings of the subject, and to consider their implications for improving the current course. I have been able to talk to other people, both within and outside the OU, engaged in similar activities about how to best deliver such courses.
On the practical side I have been able to look at student and tutor behaviour as the student teams carry out their course activities - in particular to look at their reactions to the technology currently available to support the collaborative aspects of the course. In this context, I have had time to investigate the usefulness and usability of a range of possible alternative technologies which might improve the student experience, and to run a small-scale experiment as part of the development and testing of the collaborative features of the emerging OU VLE.
I have also been able to undertake a considerable amount of scholarly activity in the area of teaching and learning, which has led to several conference presentations and published papers, and in particular have developed a new model for the way in which collaborative learning should be approached at both the individual course and at the institutional level.
Little, if any, of this would have been possible without the project funding.
A selection of COLMSCT related dissemination activities to October 2008.
Stanley has been working since 2001 as a Senior Lecturer in the Department of Computing at the Open University. He has been an OU tutor since 1975 on a variety of courses. He was seconded to the OU from his position as Principal Lecturer in the Department of Computing at the University of Plymouth, where he was also Sub-Dean (Teaching and Learning) in the Faculty of Technology, to work on the production and presentation of M150, the OU’s new entry-level course in Computing. He has also been involved, since its inception, in the presentation of TM42X, the final-year individual project course for Computing and Technology degree students. More recently he has been a key member of the team which produced and presented M253, the Department of Computing’s innovative Team Working course.
While at Plymouth he was part of the team that developed the Student Centred Learning Initiative, and was also involved in the TLTP EFFECTS project, a professional development programme to integrate and embed Computing and IT into the routine delivery of teaching in HE. He was a Computing Subject Specialist Assessor for the HEFCE Quality Assessment programme, a member of the Computing Subject Benchmarking Panel, and a member of the Learning Development Group set up by the Committee of Professors and Heads of Computing in UK Universities.
From 1996 to 1999 he was the Contractor for an EU Tempus project set up to assist in the Restructuring of Degree Courses in Computing in Bulgarian Technical Universities, and has been an active member of the subsequent Socrates Thematic Network for European Computing Education and Training.
He is particularly concerned about the relevance of what is taught to Computing students in the Higher Education context and about the appropriate use of technology to assist in the processes of teaching and learning.
Oldfield, S. (2006) Teamworking in Distance Education. Presentation at European Association of Distance Teaching Universities, Tallinn, November 2006
Oldfield, S. (2007) Designing Courses to Develop Online Teamworking Skills: A Helical Model. Presentation at 2nd Informatics Education Europe Conference, Thessaloniki, Greece, Nov 2007
Oldfield, S. (2007) Learning about Online Collaboration: Pedagogical Perspectives. Presentation at ISSoTL, Sydney, Australia, 2-5 July 2007
Oldfield, S. (2008) Online Collaborative Activities: The Developmental Dimension. Presentation at Sloan-C International Symposium on Emerging Technology Applications, Carefree, Arizona, May 2008
Oldfield, S. and D. Morse (2006) Exploiting Connectedness in the Informatics Curriculum. Presentation at the 1st Informatics Education Europe Conference, Montpellier, France, 9-10 Dec 2006
Oldfield, S. and D. Morse (2008) C is for Collaboration: A Developmental Perspective. Presentation at IADIS e-Society conference in Carvoeiro, Portugal, April 2008
In October 2005 T171 reached the end of its final presentation, after one pilot presentation and nine full presentations. The aim of this project is to collate the vast amount of experience of online teaching and learning amassed by T171 tutors.
This will cover all areas of e-tutoring including the use of conferences to support both staff and students; running e-tutorials; best practice for e-TMAs and e-staff development. The study will include not only those things that have worked well but also the failures along the way and what has been learned from these.
The intention is to share the experience of e-tutoring the students who have passed through these online groups during the past nine presentations, through academic papers and conference presentations. It is hoped this will be of great benefit to tutors moving into online tutoring, both within the Open University and elsewhere in tertiary education.
Having left full-time education in 1974 with 7 ‘O’ levels and no wish to go to university, I decided in 1995 to give up a well paid job and become a full time student at Leeds Metropolitan University. I completed a BSc (Hons) in Computing in 1999 and enjoyed studying so much that I went on to take a PhD, completing in 2004. My research investigated the influence of learning style on text based computer conferencing and was conducted entirely with Open University students, mainly from T171.
I started teaching for Leeds Metropolitan University and the Open University in 1999. I joined the Open University to teach T171, then added U130, A171 and A172 to my teaching portfolio. I also taught at one T293 summer school before the course ended. My most recent appointment is for T175.
In addition to working as an AL I have peer monitored for T171 and A172 and will soon be monitoring T175. I have mentored A17* tutors, run staff development sessions for R07 both face-to-face and online, run student induction sessions for R07, co-moderated the R07 ICT help conference, provided ICT mentoring to R07 tutors, and assisted with a joint project run with JIVE to provide support for women on technology courses.
When I’m not glued to my computer doing Open University work I like to get out into the countryside and do some walking. I have recently trained as an Open Access Volunteer for Nidderdale AONB, which gets me out come rain or shine. Having studied for 11½ years without a break I promised myself a rest at the end of my PhD, but I did sneak in an Open University short course last year and a few weekend courses this year. I think I am addicted to study.
This work was carried out between October 2007 and December 2008.
A Framework for Teaching Ethics to ICS Students and Practitioners using Open Educational Resources (£3,500)
Successful bid submitted to the Higher Education Academy, Subject Centre for Information and Computer Science, ICS (Principal Investigator: Giselle M. d. S. Ferreira; Specialist Consultant: Prof. John Monk). This 8-month project (start date: 1 December 2008) exploits findings of the ‘Ethics’ pilot to inform the creation of a complete study unit in the area of ethics in ICS, to be made openly and freely available online, initially on the OpenLearn experimentation site LabSpace and, after peer review, on OpenLearn’s LearningSpace. Proposal and further details are available online at http://www.ics.heacademy.ac.uk/projects/development-fund/fund_details.php?id=117
Achieving Transformation, Enhanced Learning and Innovation through Educational Resources in Design: ATELIER-D (£180,000)
Successful bid submitted to JISC (Principal Investigator: Steve Garner, Design Group, Faculty of MCT). This two-year project (start date: 1 November 2008) engages six academic staff plus a research assistant and Steering Group in the construction of a virtual atelier that combines well-established practice in Art and Design education with new opportunities presented by ICT to create a new approach to learning and teaching Design. The project is founded on the three core OU Design courses (at Levels 1, 2 and 3) and their integration into a new Design Programme that will include the notion of atelier-based learning at its core. OpenLearn provides the platform and main tools for this project, which will also include amongst its second-year outputs, the creation of Open Educational Resources for wide dissemination of findings and good practice identified in the project. Proposal and further details are available online at the ATELIER-D Project Website and the JISC ATELIER-D project page. Project blog at http://designthinking.typepad.com/atelierd/
Ferreira, G. M. d. S. (2008) 'On the impact of open educational resources: two case-studies.' A working paper.
Ferreira, G. M. d. S. (2008) 'Out with the old, in with the new': Questions concerning the role of the teacher in 21st century education - Paper presented at Online EDUCA Berlin, Germany, 3-5 December 2008.
Ferreira, G. M. d. S. (2008) 'Out with the old, in with the new!' - Presentation from Online EDUCA Berlin, Germany, 3-5 December 2008.
Ferreira, G. M. d. S. (2008) 'What about the teacher? Open educational resources et al.' - Presentation from EADTU, Poitiers, France, 18-19 September 2008.
Ferreira, G. M. d. S. (2009) 'New spaces, new tools, new roles' - Abstract for the 16th International Conference on Learning, Barcelona, Spain, 1-4 June 2009.
Ferreira, G. M. d. S. (2009) 'Open educational resources and teaching in the 21st century: questions of authority' - Abstract for the International Council for Distance Education Conference (ICDE), Maastricht, The Netherlands, 7-12 June 2009.
Monk, J. and Ferreira, G. M. d. S. (2008) 'Resources for an experimental course in Ethics' Abstract for The Second International Workshop on Philosophy and Engineering (WPE-2008), London, UK, 10-12 November 2008.
The original scope of the project involved developing tools, techniques and applications to enable both staff and students to begin actually developing client side applications for their own mobile devices.
Through the two and a half years of research and development it became clear that this was an almost impossible task, as the technologies in mobile devices were accelerating beyond the timescales of the project. The initial work with Microsoft Pocket PCs, and later JAVA-powered phones, became outdated quickly, so the project's intention of allowing users to develop for the devices they actually owned failed because the software technology standards (such as MIDP 1.0) changed throughout the project. JAVA phone technologies were replaced much more quickly than anticipated, with far greater incorporation of web browsing and interaction, and the widespread take-up of new software standards (MIDP 2.0), offering much more flexibility with the device, such as Bluetooth, became commonplace. During the project the Microsoft Mobile software went through at least three iterations. Further, the author has seen an almost universal extinction of traditional Pocket PCs from companies such as ACER, TOSHIBA, SONY, HP (Hewlett Packard) and DELL, to the point where purchasing these devices now limits the user to a few models re-introduced by HP to cater for the previous generation of users; many of those users have since swapped to other phone-based (non-Microsoft) smart devices such as Blackberries and JAVA smart phones such as the Nokia E61. A further change over the course of the project has been a greater take-up of server-side technologies on both PCs and mobile devices. In the author's opinion, server-side technology would be the next step for any future project in mobile development, and would certainly alleviate technology concerns.
Though the project did not succeed in its primary goal, it showed, through the development of tutorials in both JAVA and C# and an information portal updated with content (for a limited time) on mobile development, that users who had not previously developed client-side applications for mobile devices can, and did, develop full applications, or in some cases at least came to appreciate the steps involved in developing for these devices. Some work was done on HCI (Human Computer Interaction) for mobile devices, but continual rewrites of the tutorials and the need to develop a portal to deliver the varied mobile content prevented a thorough investigation, which was in any case beyond the scope of the project. One key deliverable was a comprehensive list of almost all Pocket PC devices released, with their corresponding capabilities; this database would be invaluable for ascertaining the features, software and so on of any Pocket PC, old or new, that a user wanted to develop for.
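A device-capability database of the kind described above amounts to a filterable table. The following Python sketch is purely illustrative: the field names and device entries are invented for the example, not drawn from the project's actual deliverable.

```python
# Hypothetical sketch of a device-capability lookup; the fields and
# sample entries below are invented for illustration only.
devices = [
    {"model": "HP iPAQ hx2490", "os": "Windows Mobile 5.0", "bluetooth": True, "wifi": True},
    {"model": "Dell Axim X51v", "os": "Windows Mobile 5.0", "bluetooth": True, "wifi": True},
    {"model": "Toshiba e740", "os": "Pocket PC 2002", "bluetooth": False, "wifi": True},
]

def find_devices(**required):
    """Return the models whose recorded capabilities match every
    required field (e.g. find_devices(bluetooth=True))."""
    return [d["model"] for d in devices
            if all(d.get(k) == v for k, v in required.items())]
```

A developer deciding whether a tutorial's Bluetooth example could run on a given device would query `find_devices(bluetooth=True)` rather than trawling manufacturer specification sheets by hand.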
Briefing slides about the project and other deliverables (excluding the website, which is now offline) can be accessed from the Documents section below this page.
I have been actively working in mobile computing for several years, mainly at the application development and deployment level, and was surprised that more colleagues did not develop for phones and PDAs …
When I read about the COLMSCT project in 2005 I saw an opportunity to investigate mobile development in terms of how other teaching staff and students might get along with developing for, rather than merely using, mobile devices for learning and teaching. I spent two years working part time on the project, through one of the fastest-growing periods for mobile computing. Though this caused me many problems in keeping the learning material for staff and students up to date, I developed personally through the experience in several areas, which I feel has improved me as a researcher and teacher.
I presumed back in 2005/6 that mobile technologies were developing steadily and that providing a range of resources for academic staff to develop their own applications would be easy enough. Little did I know that, with the work spread over two years, the changing pace of technology would make much of my work obsolete within months; within eighteen months the tutorials were no longer suitable.
Whatever the outcome of the work, one thing is certain: the time as a Fellow allowed me to research more fully an area of technology of which I had previously only had practical experience. The range of devices, and the usability, development and running of applications on so many of them, really opened my eyes to the pace of change and improved my familiarity with new tools and technologies almost weekly.
The COLMSCT Fellowship programme became more than delivering tutorials and materials for others to learn about mobile technologies; it represented a consolidation of my own knowledge in an area of fast technological change.
Slater, S. (2007) JAVA based MIDP 2.0 tutorials: Tutorial 1
Slater, S. (2007) JAVA based MIDP 2.0 tutorials: Tutorial 2
Slater, S. (2007) JAVA based MIDP 2.0 tutorials: Tutorial 3
Slater, S. (2007) JAVA based MIDP 2.0 tutorials: Tutorial 4
Slater, S. (2007) JAVA based MIDP 2.0 tutorials: Tutorial 5
PowerPoint slides showing the changes made to the website as it was re-developed
Briefing slides

Using email, electronic communications and other emerging online tools to support effective interaction between distance learners and tutors. This project is based on the experiences of tutoring students on first-level Open University courses.
Email is rapidly becoming accepted as a preferred means of communication between students and tutors. Protocols and guidelines for its use exist within just about every e-enabled organisation. Just as we get used to the latest ’10 commandments’ and ‘codes of conduct’, other forms of electronic communication become available. From email there developed conferencing and forums, while instant messaging (IM) has evolved into a multimedia experience using audio and video. Then along come the Web 2.0 technologies of bookmarking, blogs, wikis, podcasts and social networking. Our students at the Open University are adept at using these technologies in their private, business and social lives, and it is right that we, their tutors, should exploit their use in the academic arena too. This project explored the potential for these tools to enhance the experience of student-tutor communication.
Email and e-communications
The project commenced as a review of my use of email with my student groups, prompted by a student who had sent over 100 emails during six months of an online course, sometimes up to four in a day. In one of the exchanges I noted a comment embedded in the text which said, “It was good to talk with you on the phone today …”. This made me realise that there may be merit in breaking out of a routine method of correspondence and varying the teaching and support methods. The project therefore evolved into an examination of e-communications in general, and then of the use of so-called Web 2.0 technologies (O’Reilly, 2004) as tools to support my students.
Making the technology fit the pedagogy
This led me to question whether technology was influencing, perhaps even determining my teaching style, or whether in fact the underlying pedagogy was the same but maybe I was adapting the technology to suit the circumstances I encountered. This led to a number of joint exercises with other COLMSCT Fellows on this theme where we explored and promoted different views on the subject.
Proactive engagement with students
I have been particularly careful not to require students to engage in additional academic activity, and as a result my efforts were channelled into proactive contacts with students. It was clear that I enjoyed using the online technologies to interact with my student groups, and I eventually discovered that the key element of my pedagogy was that I was being proactive, and in some ways creative, in using technology to meet student needs.
In early presentations about my experiences I was talking mainly about the opportunities that these technologies enabled for communicating with students. I was looking to enhance the experience of student-tutor communication to the extent that the widening gulf of distance learning might in some way be bridged. It was clear to me that students can be quite strategic when it comes to making choices about how much time and effort to spend on extra-curricular activities. I know this from my own experience as a past and continuing student with The Open University. The trick for me was to try to embed the use of these tools as seamlessly as possible into normal contacts.
Technologies used
In deciding which tools to use to engage my students I took the opportunity to experience a wider range of resources than I eventually used. For example, I have personally explored the Second Life virtual world, but I currently feel that the majority of my own students lack the time (as evidenced by their own study planning activities and course discussions) to develop into this area, so I have chosen not to push it at present. I am aware of many other initiatives within the university that are exploring this area.
Taking this forward, there are opportunities to further close the gap by incorporating audio and video conferencing as a tutorial tool for those students who do not or cannot attend face-to-face tutorials. Online tools such as Flashmeeting and Elluminate are two options being explored by The Open University, while Skype and instant messaging (IM) via a number of websites also provide options. As Second Life develops through other projects, it too can be added to the toolbox for proactive online student support. I have so far avoided using social networking environments such as MySpace and Facebook, since they can be time consuming, and research as well as my own experience suggests that students may not want to mix with tutors there (Hewitt and Forte, 2006).
Below is a table summarising my experiences in communicating with students via email, forums, blogs, web pages, podcasts, social bookmarking and social networking (photo sharing). In effect I was using web tools to manage my proactive contacts with students; the table simplifies the input and output of each activity.
| Tutor proactive input | Student engagement | Output/results |
| Weekly Tutor News message in tutor group forum – incorporating blog of tutor’s weekly activity (published within the First Class conferencing environment) | 90-95% of tutor group read the weekly messages. (verified using History feature of conferencing environment) | Student feedback on this is positive. This becomes the focus of regular engagement with tutor. A number of other tutors have expressed interest in this method and have sought additional details from me. I am aware of some who now do this regularly. |
| Twenty Questions Re-formulation of assignment questions as a series of ‘Have you done this…?’ questions. Prepared as preparation for forthcoming assignments. | Incorporated into Tutor News messages. | Students report satisfaction with these reminders. Now widely used by many tutors on both of my courses (T175 & M150). Some tutors are now preparing the template for me. |
| Short podcasts with general feedback on group performance in last assignment. | 50% of students say they use them and find them useful. | Students report listening to podcasts more than once each. Statistics now available with a new tool being developed by the Mathematics, Computing and Technology Faculty. |
| Tutor group web site Publication of student contributions to course activities. | Difficult to judge value – no useful student feedback given. | Effort involved in creating web site not justified. Not continued. |
| Photo sharing activity (Flickr) Linked to course themes in each Block of work | Students not content with following the set theme. Students stretched the theme to suit their own choices. | Direct links to course themes make this worthwhile, but need to prompt contributions from students. This is an area of social networking where the participants seem to set the rules rather than the teacher. |
| Social bookmarking (del.icio.us) URLs recommended in weekly Tutor News message and collated from assignment answers of students. Tags determined by tutor. | Students contribute automatically via references in assignment answers, which I incorporate into the online resource. | Easy to set up and quick to register links with course-related Tags. Student use of this facility not evident. May be useful in later presentations as the ‘stock’ of URLs builds up. |
The following is a chronological list of the dissemination activities I have taken part in and conferences I have attended and reported on.
| Date | Event & Location | Subject / Details |
| Nov / Dec 2005 |
First Class forum [online] COLMSCT E-communications Special Interest Group |
Contributed to the setting up of the SIG and engaged in online activities with colleagues. |
| 9 January 2006 |
International Seminar on E-learning, London South Bank University |
Attended conference. Speaker from University of New South Wales about online photo galleries which inspired my subsequent use of Flickr. |
| 26 January 2006 |
Open CETL Conference & Workshop, Milton Keynes |
Attended workshop presentation about the new Virtual Learning Environment (VLE). |
| 9 March 2006 | Netskills workshop at Manchester University - Mobile Learning: Education on Demand | Attended workshop. Notes posted to SIG forum. |
| 26 April 2006 | Open University Teaching and Learning Conference, Milton Keynes | Ran a workshop on the theme of ‘Email: Pushing the Boundaries of Accessibility’. |
| 8 & 9 June 2006 | Open CETL Conference, Milton Keynes |
Poster presentation entitled ‘You’ve Got Email: Now Wh@t?’. Attended a number of sessions. |
| 10 July 2006 |
OU Mobile Technology Group, Milton Keynes |
Attended discussion group on subject of podcasting. Presented my own experience of creating audio files for students. |
| 4 November 2006 |
OU Region 06 Staff Development event, Cambridge |
Gave presentation on ‘Using Technology to Support Students’. |
| 23 November 2006 |
Netskills Workshop at Letchworth Effective elearning with Moodle |
Attended workshop. |
| 9 – 12 December 2006 | International Society for the Scholarship of Teaching and Learning (ISSoTL) Conference 2006 , Washington DC | Invited to ‘synthesize’ (take notes of) one session at the conference. |
| 14 December 2006 | COLMSCT Community Event, Milton Keynes | Presented a report on my attendance at ISSoTL 2006. |
| 1 March 2007 | Colchester Sixth Form College | Spent morning with Head of IT to learn how to create and manage Moodle courses. |
| 1 & 2 May 2007 | OU Curriculum Teaching and Student Support Conference, Milton Keynes | Gave presentation on ‘Using Technology to Support Distance Students’. |
| 14 May 2007 |
Science faculty workshop CETL Suite, Milton Keynes |
Gave presentation on ‘Podcasting for Students: An AL perspective’. |
| 30 May 2007 |
‘Elearning at the Cusp’ Conference Staffordshire University |
Attended conference. |
| 9 June 2007 |
OU Region 01 Staff Development event for Faculty of Health and Social Care, Camden |
Ran workshop on ‘Using Technology to support students’. |
| 24 – 27 June 2007 | Distance Learning Administration (DLA) 2007 Conference, St Simon’s Island, Georgia, USA. |
Gave presentation on ‘Using Web 2.0 Technologies to support distance learners’. Short paper in conference proceedings, and conference blog published on conference website. |
| 5 July 2007 |
ISSoTL 2007, Sydney, Australia. |
Joint paper with Anne Adams and Jacquie Bennett accepted. Paper entitled ‘The impact of communication technologies on teaching and learning identities’. My avatar appeared ‘live’ at the conference during Anne Adams’ presentation of the paper. |
| 4 September 2007 |
ALT-C Conference 2007 (unable to attend due to family illness) |
Joint paper with Hilary Cunningham-Atkins and John Woodthorpe (COLMSCT Fellows) accepted for a symposium entitled ‘Finding the balance: learning technology for multiple generations’. |
| 25 September 2007 |
Science VLE Workshop, CETL Suite, Milton Keynes |
Gave presentation on Podcasting and took part in workshop. |
| 15 & 16 October 2007 |
2nd Open CETL Conference (Pushing the Boundaries), Milton Keynes |
Gave presentation on ‘Using Web 2.0 technology to support distance learners’. Ran workshop with Wendy Fisher and Bill Tait (COLMSCT Fellows) on ‘Making the technology fit the pedagogy’. |
| 1 & 2 November 2007 | SOTL Commons Conference, Statesboro, Georgia, USA | Gave joint presentation with Bill Tait and Wendy Fisher on ‘Making the technology fit the pedagogy’. My own part of the presentation focused on ‘My Web 2.0 Classroom’. |
| 13 November 2007 | Guest lecture to students at University of Houston (via Internet video conference, using Apple iChat) from Milton Keynes. | Provided a guest lecture on subject of ‘Using Web 2.0 to support students’. |
| 10 & 11 November 2007 |
Associate Lecturer Conference, Milton Keynes |
Poster displayed. ‘Using Web 2.0 technologies to support distance students'. |
| 13 February 2008 |
Department of Computing, Podcasting Workshop at CETL Suite, Milton Keynes |
Gave presentation on ‘Podcasts in Distance Education’ and took part in workshop. |
| 21 February 2008 |
EATING Talk, Technology Faculty, Milton Keynes |
Gave a talk on the subject of ‘Using Web 2.0 to support students’. |
| 17 April 2008 | Guest lecture to students at University of Houston (via Internet video conference, using Apple iChat) from home. | Provided a guest lecture on subject of ‘Using Web 2.0 to support students’. |
| 25 April 2008 |
OUSA Conference, Milton Keynes |
Hosted a poster presentation on subject of ‘Learning with Web 2.0’. |
| 29 & 30 April 2008 |
OU Conference, Milton Keynes |
Presented a workshop on ‘Creating a podcast for student engagement’ in company with Brendan Murphy (AL and PhD student). Gave a presentation on ‘Proactive Student Support Online’. |
| 29 June to 2 July 2008 | Education and Information Systems, Technologies and Applications (EISTA) 2008 Conference, Orlando, Florida, USA | Paper and presentation on ‘Using Web 2.0 Technologies to Support Distance Students’ within OU session entitled ‘Matching Technologies and Pedagogies for Supported Open Learning’. Short paper published in conference proceedings. |
| 24 & 25 September 2008 | 3rd Open CETL Conference (Building Bridges), Milton Keynes | Ran a workshop entitled: ‘Don’t just surf the web – take some students along for the ride too’. |
| 16 – 19 October 2008 | ISSoTL 2008, Edmonton, Alberta, Canada | Gave a presentation on ‘Proactive engagement with students in an online environment’. |
As an Associate Lecturer on courses with an online element I have always explored different ways of exploiting the technology to support my teaching. When I applied to be a COLMSCT Fellow I expected that I would be given time and opportunity to evaluate what I had already done in terms of reviewing technologies I had used, or specific cases or contacts with students. What actually happened was that I was lucky enough to be able to spend more time exploring a wide variety of different types of electronic communications and engaging in activities where I have been able to enhance the learning experience of my students.
What I also didn't realise at first was that there would be a whole bunch of people who would be interested in what I was doing. Being a COLMSCT Fellow has given me wider knowledge of the inner workings of the Open University and access to people, places and other projects that I didn't know existed. The result is that my own project has benefited from ideas and suggestions beyond those I originally envisaged.
I had no prior experience of presenting at academic conferences and have found that attending these as well gives you opportunities to have valuable conversations with like-minded people, and with influential people in your own area of interest. In my experience some of the best presentations at academic conferences are given by practitioners. You get to hear from the people actually involved in the work - those who are at the point of contact with students. My own presentations appear to have been well received and I have enjoyed being able to talk about my work and to present some ideas of my own.
Overall I would say that the most rewarding thing about working with the COLMSCT CETL is the contact with other people. This includes the central staff who are very supportive of Fellows and also the other Fellows who are conducting projects and research in areas of interest to me. It also includes what I believe to be an enhanced relationship with my own students with whom I enjoy the varied forms of electronic contact.
Beechener, K. CTSS Conference, 2 May 07 - Using Technology to Support Distance Learners
Beechener, K., Fisher, W. and Tait, B. Open CETL Conference Workshop, Making the technology fit the pedagogy, 16 Oct 07
Beechener, K. Learning & Teaching Conference, 26 Apr 06 - Email - Pushing the Boundaries of Accessibility
Beechener, K. Open CETL Conference, 15 Oct 07 - Using Web 2.0 Technology to Support Distance Learners
Beechener, K. Podcasting for students: an AL perspective, May 07
Beechener, K. Podcasting: the Associate Lecturer view, Feb 08
Beechener, K. Poster for AL Conference, Nov 07 - Using Web 2.0 Technologies to Support Distance Students
Beechener, K. SoTL Commons Conference 2007, Making the technology fit the pedagogy, 2 Nov 07
Beechener, K. Tutor News conference messages T175-06J
Beechener, K. Using Web 2.0 to support students, EATING Talk, Feb 08
The concept of a subject community is one that enables both students and academic staff to have some continuity and general skills and knowledge sharing outwith the framework/content of any particular course.
This COLMSCT Teaching Fellowship project explored the development of an Engineering e-community with student and staff (academic and AL) stakeholders. It aimed to provide enhanced student support that would not be bound by any particular course, presentation, duration or subject matter. The resource also has potential for use in promoting continuing Engineering professional development activities, often a requirement for Engineering students (defined as those studying courses within the Engineering subject profile).

At the start of the project, OU Engineering staff perceived that these students historically did not readily participate in the available (mainly course-based) online forums. This perception was investigated via an online student survey and follow-up student interviews. The resulting semi-quantitative information, and conclusions on usage trends and student opinion of existing forums, were used to inform decisions about the features of a future Engineering e-community. A detailed model for an e-community resource (structure, format and content) has been proposed, drawing on data obtained from Faculty staff, ALs and other CETLs (e.g. PILS). Theory and models of community development, and best practice from elsewhere both within and outwith the Open University, were recognised within the model. A mid-project decision was made to complete this model development in isolation from the SSR (Student Support Review) Pilot within the MCT Faculty Engineering Subject Area, which was launched at the mid-point of my COLMSCT project. A review of the potential integration of the proposed model within the final SSR implementation will take place during 2010/11, when both activities become more mature and the results of the SSR pilot investigations become available. The full report of the project is located in the resources section of this page.
Dissemination activities: Presentation of a review of the project findings to the MCT Faculty Engineering Programme Board (May 2010) - see documents section. Paper published in the proceedings of the HEA Engineering Subject Centre conference 'EE2010 Inspiring the next generation of engineers', ISBN 978 1 90763209 9 - see resources and documents sections. Other spin-off activities: as well as attending the EE2010 conference (July 2010) and the HEA Engineering Subject Centre STEMPRM Symposium (June 2009), I was invited to lead a Technology Associate Lecturer training workshop at the R05 AL Conference (May 2009) - see documents section of this project page.
Clay, K (2009) Poster presented at 4th Open CETL Conference, Milton Keynes, 15-16 December 2009
Clay, K. COLMSCT Project Executive Summary
Clay, K. Engineering Pathway and Community Building Sessions at AL Conference May 09
Clay, K. PPT presentation for Community Building Session at AL Conference May 09
Clay, K. PPT Presentation for Engineering Pathways Session at AL Conference May 09
Clay, K. Report on Symposium Attendance STEMPRM June 2009
Clay, K. (2010) Project Review Presentation to OU MCT Engineering Programme Board, 12 May 2010
This project aims to examine the interactions of moderators and students on two Open University, online technology courses in order to determine what kind of activities and approaches succeed in keeping students engaged and moving forward with the course.
I am examining the interactions of moderators and students on TT280 in order to determine what kind of activities and approaches succeed in keeping students engaged and moving forward with the course. This is in the context of an online course with no tutors. I have used content analysis on selected forums, using a draft coding scheme and some principles drawn from Collison, G., Elbaum, B., Haavind, S. and Tinker, R. (2000) Facilitating Online Learning: Effective Strategies for Moderators, Atwood Publishing: Madison, WI.
I have tested the results and emerging hypotheses on my various colleagues in the Web Applications Certificate forums.
I expanded this initial analysis by using activity theory to analyse the behaviours of both moderators and students. I expanded this further into the idea of runaway objects, posited by Yrjo Engestrom, which allowed me to produce a clearer picture of the relationship between moderators and students, and I went a stage further by incorporating Engestrom's ideas around wildfire activity, which have proved to be very fruitful.
I have already done some work on student outcomes which demonstrates a close association between level of conference participation and grade achieved. I have also done some work on the moderating role (These were published as What am I doing here? An example of reflection-in-action, collected conference papers of 11th Cambridge International Conference on Open and Distance Learning. Cambridge: Open University, 2005), and a small analysis of moderating behaviour with one specific student from which I drew up a coding scheme.
Some initial work has made it evident that moderators' responses depend to a degree on what the students are doing to start with, and to focus on this I have been drawing up a taxonomy of student behaviours, with a related taxonomy of moderator behaviours.
The student behaviours can be categorised as follows:
There is also a sort of subterranean behaviour, “seeking clarification”, but this can usually be subsumed under other behaviours.
In response the moderators' behaviour can be classified as follows:
Some of these can potentially be subdivided with fruitful results e.g. some "Seeking solutions" includes "tell me the answer" behaviour, which might provide the moderators with good cues for "teaching how to learn" type responses. It is also notable that moderators' behaviour in response to expressing feelings and seeking reassurance is not to respond directly.
Other moderator behaviour is not so much responsive as predetermined.
These classifications are still works in progress.
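In software terms, applying a coding scheme like this usually comes down to tallying how often each behaviour code occurs for each role. A minimal sketch of that tally step (the code labels and data here are purely hypothetical, not the project's actual scheme) might look like:

```python
from collections import Counter

# Hypothetical coded forum messages: each message has been tagged with a
# role and one behaviour code from a draft coding scheme (labels invented
# for illustration only).
coded_messages = [
    ("student", "seeking_solutions"),
    ("student", "expressing_feelings"),
    ("moderator", "teaching_how_to_learn"),
    ("student", "seeking_solutions"),
    ("moderator", "giving_information"),
]

def tally(messages, role):
    """Count how often each behaviour code appears for a given role."""
    return Counter(code for r, code in messages if r == role)

student_counts = tally(coded_messages, "student")
print(student_counts["seeking_solutions"])  # 2
```

From counts like these, one can compare the distribution of student behaviours with the distribution of moderator responses, which is essentially what the taxonomies above support.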
The image above gives a visual impression of the amount of material I have available. This shows about 2,500 pages, which is less than a third of the output of one presentation. You can also see my indefatigable assistants, Badger and Gopher, sizing up the task.
The deepest level of analysis has been afforded, however, by activity theory and in particular wildfire activity. This work is still ongoing, although I was able to make a preliminary report of my findings at the ELearn conference in October 2010. Combining the theory of wildfire activity with my observations of what students and moderators do allows a fresh insight into the way students work, and in particular the relationship between informal and formal learning. It allows insight into the paths a student follows even when undertaking a controlled piece of formal learning, and it allows speculation on the way in which the concept of the Zone of Proximal Development (ZPD) works. The common conception of the ZPD is that it belongs to the person. We have perhaps lost sight of Vygotsky's original insight that it is socially constructed, and socially owned. Combining this with the concept of feature space from wildfire activity theory may allow us a better understanding of how people learn, and thus better ideas about how to design learning for them.
Any further work will be reported on my blog Really Useful Knowledge. http://reallyusefulknowledge.blogspot.com/
Parsons, R. (2010) COLMSCT Final Report 'Identifying and applying moderating skills.'
Wildfire activities and moderating skills - poster 1
Wildfire activities and moderating skills - poster 2
Wildfire activities and moderating skills - poster 3
Wildfire activities and moderating skills - poster 4
Wildfire_Teaching - ELearn 2009

This project aims to investigate levels of student engagement in e-learning activities, relate this to academic performance and ultimately develop methods for improving the level of engagement where this is currently low.
To do this, U316 'The Environmental Web' will be used as a case study, accessing FirstClass tutor group conferences so that levels and patterns of participation can be analysed. Analysis will lead on to recommendations for changes to e-learning activities which can be quickly implemented across these courses, followed by a second round of analyses to determine effectiveness. This may be augmented by interview work with staff and/or students.
By developing e-learning aspects of courses appropriately, the student learning experience will be improved through levels of participation in associated activities, more successful learning overall and a stronger sense of progression and community. As the OU moves towards a greater emphasis on e-learning (including electronic submission of assignments and online discussions), it is essential this is implemented in the most effective manner possible and that aspects of teaching not appropriate for electronic delivery are identified. Similarly, it is important to identify aspects where electronic delivery can be offered as an alternative to existing formats without reducing the quality of teaching provided.
With COLMSCT producing its remaining output documents, my project’s final report has been posted as a downloadable teaching resource (see the link under 'Related resources' to the right hand side of this page) with details of how U316 (‘The Environmental Web’) forums were analysed, some key results and a set of suggestions and recommendations for tutors and other staff involved in courses with collaborative electronic activities. Following this, I posted the link to the U316 tutors forum (as well as sending my thanks to those tutors who allowed me access to their tutor group forums), as well as to the S216 (‘Environmental Science’) tutors forum as this course was initially also considered for inclusion in my project. So, what was the response?
Initial responses from tutors were very positive, including both those who were interested in applying some of the recommendations to their tutoring, and those taking a more academic interest due to their work on other related educational projects. Beyond this, I also heard from both course managers, who informed me that the report had been forwarded to the U316 course team with a view to informing course revisions, with S396 (‘Ecosystems’) also taking an interest as another course with online discussions. Clearly, the project has reached a wider audience within the environmental sciences; the next step is to make sure it moves further afield, as it should be applicable to any course with collaborative online activities.
Some background information about my work and non-work life, including the current stage of my COLMSCT work.
I started tutoring for the OU in 2003 after having been an ecology activity tutor for one week on the ‘Practicing Science’ summer school in 2002 – and loving it! I began with the online U316 ‘The Environmental Web’ and added S216 ‘Environmental Science’ a year later – both of which I hope to continue with. I have a PhD in ecology (looking at primary production in tropical freshwater for those who like to know such things…) from Leicester University and have lectured, either as staff or a guest lecturer, at the Universities of Leicester, Nottingham and Southampton.
When not working for the OU, I am a freelance ecological consultant working all over southern England (and beyond if work appears elsewhere). I cover most aspects of ecology and conservation management, with recent projects including chalk grassland monitoring, slow worm translocation, developing a dead wood habitat assessment method, and skylark survey. Through this and other related experience, I am a Chartered Environmentalist (CEnv) and a member of the Institute of Ecology and Environmental Management (MIEEM).
So, educational research has been quite a departure from my science background, but an interesting one. I’ve found that some skills such as project design and data analysis have been transferable, while I’ve learned others from scratch, such as interview design and transcript analysis. Keeps the grey matter lively.
My COLMSCT project is now about halfway through. The first year of data has been analysed, and when the next presentation of U316 starts in early 2008, I will collect data for that year and compare the two in light of suggested tutor actions which have been developed during the work to date and which will be sent to other tutors at the beginning of the year. Although there are major changes relating to e-learning in the OU as a whole (such as the development of a new Virtual Learning Environment), analysis will determine whether tutor-based (rather than more centralised) actions can improve the levels of student engagement in e-learning, and if so, to what extent and how this might affect students’ academic performance. I also have a paper in preparation, but more about that when it is finished, so watch this space!
When not being an OU Associate Teaching Fellow, tutor or ecological consultant, I like to get involved in voluntary conservation & environmental work when time allows. However, when I want something completely different, I go to capoeira classes (a Brazilian martial art with elements of dance and acrobatics – very difficult for those starting in our 30s!), and take part in historical re-enactment where I can be found wearing chainmail and running around a field as a sword-wielding Viking. Marvellous.
So, enough about me – I hope you like the brief life history – if you’d like to know more about my project, please do get in touch – my contact details are on the project’s front page.
Continuing the CETL journey - everyday use.
It was with interest that I read Yvonne Cook’s news item from the Open CETL Conference entitled ‘Continuing the CETL Journey’ (posted 17th Feb 2010). As Yvonne says, “One of the big unanswered questions, as the conference closed, is how to ensure this level of associate lecturer involvement is maintained beyond the life of the CETLs.” One possibility mentioned is extending the work of the CETLs beyond July 2010, something I (and I imagine many other COLMSCT Fellows), would be most interested in as my project has left me with new areas of potential research and currently, a partially completed proposal...
Another aspect, especially for those Associate Fellows who are also course tutors, is whether the work we did is being used on a day to day basis during tutoring activities. Certainly I have found that it is – whether giving informal advice to other tutors with queries about their students’ online participation, making suggestions about the content and design of e-activities, mentoring new tutors, or any one of a range of other uses, I have found that the relevance of our COLMSCT work is all too clear, and the experience it has provided all too valuable. As the Vice-Chancellor Martin Bean stated during his opening speech, not only has there been an impact on staff development and how we practise teaching and learning, but it has provided a challenge whereby we must “ensure that the valuable lessons we have learnt over the last four years are firmly embedded within the weft and warp of our teaching projects”.
To read more about the 4th Open CETL Conference please visit our 'News' Page.
The aim of this project is to improve the integration of online activities into Open University courses.
It is proposed that this will increase successful student participation, leading to enhanced learning opportunities for students and increased opportunities to deliver learning outcomes in areas where online distance learning has traditionally been found wanting – i.e. group work and peer assessment.
In addition, advice will be given to Course teams to provide tutors with more guidance on managing collaborative activities in order to improve the quality of the e-learning experience for students.
Investigating the impact of reduced face-to-face interaction on the underlying dynamics of the OU teaching system, and on the relationships that maintain motivation and retention, in order to identify appropriate interventions.
The bigger picture my project is concerned with is the inexorable move towards electronic rather than face-to-face interaction in OU teaching, and the effect of this on the feedback systems which have evolved to govern the quality of this teaching. The relationships I am concerned with are primarily those of influence between students, Associate Lecturers (ALs), staff tutors and course teams, particularly with regard to motivation and student retention.
The purpose of this investigation is not to decide whether electronic or face-to-face is ‘better’. The purpose is to discover where this changing situation has disrupted previously successful dynamics and where it has impacted adversely on the learning experience of particular categories of individual, in order to identify appropriate points of intervention, and to design processes to achieve such interventions. My aim is that independently of such interventions, the method of investigation itself will have a positive influence on the emerging situation.
A study to identify, describe and disseminate good practice for synchronous online tutorials, specifically for teaching in Maths/Science/Computing/Technology Areas.
This project investigated the use of synchronous conferencing tools (Lyceum, FlashMeeting) to support the delivery of a Level 2 physics course (S207 The Physical World). A large number of interactive activities were developed to exploit the audio/text/graphical communication tools in the two packages, and these were piloted over two presentations of the course with students who volunteered beyond their normal commitment to the course.
Feedback from the students was collected by a number of means, including a questionnaire, interviews, comments posted in discussion forums and comments written at the end of several of the live sessions. This, combined with reflection by the tutor, led to the identification of barriers to useful learning, which generally expressed issues of confidence:
The project report provides a section of suggestions for planning to overcome these barriers, within individual activities, over the course of a single session and over a series of sessions; it also includes comments from students charting the perceived benefits of the sessions and the growth in their confidence, despite the unfamiliarity of the environment. It is hoped that these suggestions can be adapted to other packages with different feature sets.
Examples are included of the types of activities, which tended to emphasise comprehension and application of knowledge, largely through problem solving, rather than introduce new knowledge.
The project also examined input technology which might be more effective for the creation of diagrams and the input of mathematical expressions. The use of a graphics tablet and a tablet PC in different sessions assisted in the quick input of graphics, sketches and mathematical expressions, although the quality of these was limited by the graphical capabilities of the software. A PC NotesTaker pen, which simultaneously scans an image as it is written with a special ballpoint pen, was found to be too disruptive for real-time input during a session.
It is intended that the lessons learnt from this project be disseminated to colleagues coming to terms with the use of such environments as their use increases within the University.
The SynchroSystem wiki (http://www.open.ac.uk/wikis/synchrosystem/Main_Page) gives examples of student experience and ideas for effective e-tutorials, as well as discussing other issues and packages in this area.
Andy Diament has been teaching physics and science on 1st and 2nd Level OU courses since 1992, currently on S207 The Physical World. He has a strong interest in the use of ICT to support learning. His 'main' job is as a lecturer in A Level Physics and Computing at Penwith College in Penzance. At this FE college, he has been instrumental in the establishment and development of a Virtual Learning Environment and is, or has been, involved in several projects to design online learning materials. His teaching experience also includes several years as a school science teacher, and he has previously carried out research using remote sensing techniques at Cambridge University (but that feels like another world now).
This project was designed to investigate, and find ways to improve, the distance learning experience of the OU student in prison.
The aim was to determine what influenced students to embark on an OU course or programme, and to understand how technology, or the lack of it, was affecting their choice of study and future decisions. Who had influenced their decisions, and what factors had encouraged or inhibited progress along their learning journey? What were the key motivators or de-motivators? What effect did their learning have on their prison life, and vice versa? What was the effect of the developing online initiatives, and what were the major barriers to learning? An additional aim was to investigate the experiences of the OU student on release from prison, and a longitudinal study is planned.
Method
The project involved face-to-face, semi-structured, in-depth interviews with 35 students in 10 prisons across 5 OU regions in England, and informal interviews and discussions with more than 50 staff (both internal and external to the OU) over a period of 6 months. The students interviewed were adults studying a range of OU courses at all levels from Openings to Post-graduate. Initial analysis of the interviews, using grounded theory, produced preliminary findings which were disseminated within the OU and beyond. There are plans for further analysis of the interviews and a longitudinal study of OU students released into the community.
Results
Results showed that although many dedicated staff (both internal and external to the OU) worked hard to support the students in prison, the students’ needs were not adequately met. The students came from a variety of educational backgrounds and many had no previous educational qualifications. The reasons for deciding to study with the OU were numerous, but that decision and later encouragement often involved one or two ‘special people’ whom the student identified as life-changing. Tutor support for students in prison varied; some of the best support occurred when tutors had established a good working relationship with the prison’s OU coordinator and made contact at an early stage. Lack of access to IT facilities was a major barrier to study. The amount and type of technology available to the students varied greatly and was not purely connected to prison security category. Due to increased online elements, the number of courses available to OU students in prison was falling, and there was no sufficiently coherent policy or funding to provide adequate support and guidance to meet their needs. There was a lack of awareness of the problems across all stakeholders, and a lack of communication at all levels, which meant that the situation was deteriorating rapidly and needed urgent attention.
Conclusions
The resultant impact from the dissemination of the results was swift (see Impact pages on right). The OU acknowledged the problems. A commissioned report led to the development of the Offender Learning Steering Group and a review of the OU’s Offender Learning strategy. The Offender Learning Steering Group Final Report (in 'Documents' on the right hand side of this page) was presented to the Learning, Teaching and Student Support Committee and all 16 recommendations were accepted (with one amendment). A new Offender Learning Coordinator post and an OU-wide coordinating group have been developed. Technical support has also been provided to implement the changes necessary for OU involvement in the trials of new technologies in prisons, initially in two English regions. Negotiations have taken place with key stakeholders, a new Prisons Scheme for England is being negotiated with the National Offender Management Service, and the Department for Innovation, Universities and Skills has agreed to provide additional funding for the next 3 years (at least). Updated schemes for other Nations may follow, though problems were worst in England.
Initial results of this research and a commissioned report by the Director, Students, led to the development of the Offender Learning Steering Group in October 2007 and a review of the Open University’s Offender Learning Strategy. That review was completed on 31st March 2008 and a report was presented to the Learning, Teaching and Student Support Committee in May 2008 (see Offender Learning Steering Group Final Report in 'Documents' on the right hand side of this page). All 15 of their recommendations were passed with only one amendment. A new Offender Learning Coordinator post, an Offender Learning Steering group and an OU-wide Offender Learning Development Group (OLDG) were all created in October 2008. A full-time post was created in Learning and Teaching Solutions (LTS) to provide technical support to Offender Learning and to implement the changes necessary for OU involvement in the trials of new technologies in prisons in a number of regions. An audit of all regions and nations has been completed and is being used to identify best practice for informing and supporting staff and students.
The OLDG, which is an extension of the original Prisons Liaison Group, not only meets regularly to discuss the issues and share good practice, but has also formed working groups to tackle some of the issues:-
Increased faculty involvement in the OLDG (4 new members in May 09) is just one way that awareness of Offender Learning is increasing across the University. Other activities which have increased awareness are:-
Conferences
Three conferences have been held - ‘Meeting the Needs' on 1st November 2007, the 'Offender Learning Colloquium' on 4th June 2008 and 'A Celebration of Offender Learning' (coinciding with the OU's 40th Anniversary celebrations) on 16th June 2009.
Websites
The original Prisons scheme website in SRS is being refurbished and a new Offender Learning intranet website is being developed (http://intranet.open.ac.uk/studentservices/tls/pages/LDT-OFL-OffenderLearningHome.php). There are plans for an external website which will be available from www.open.ac.uk.
Technical developments
The technical work, for the trials to provide secure web access to prisons, has now grown and a new IT group has been developed which involves many Units of the University (AACS, LDT, LTS, Communications etc.)
Partnerships with many external organisations have been developed and old partnerships have been renewed. We now have extremely good, trusting relationships with all key stakeholders. A new Prisons Scheme has been renegotiated with the National Offender Management Service (NOMS) and the Department for Innovation, Universities and Skills (DIUS) for prisons in England. This has resulted in:-
Corporate Marketing are investigating the Prison staff market.
There have been visits to Ministers and the House of Lords (see minutes to right) in a bid to improve awareness of the benefits of Higher Education in prison. Three Offender Learning Conferences, hosted by the CETL, have also helped to raise awareness of the OU’s involvement in Offender Learning. International experts, researchers, Government and Prison officials and ex-offenders have been invited to join OU teaching and support staff, central academic and executive staff in a bid to move discussions forward. Lord Ramsbotham provided a keynote speech for the 3rd Offender Learning Conference on 16th June 2009 (read out by Will Swann, Director of Students).
Dissemination has been via presentations at conferences within the OU, the UK, Europe and America. Two conferences were organized at the OU for internal and external staff involved in prison education. The first raised awareness of the problems and feedback from that conference provided evidence that a second conference was required with higher level staff, who could change policy. Colleagues from Sweden and Germany were invited to explain how they were improving communication to their offender learners. An international workshop (initially at the request of UNESCO) was organized to discuss the wider role of higher education in prisons and to build a larger, more powerful network including CNED, UNED and UNESCO to influence government and promote change for offenders world-wide.
Below are a selection of COLMSCT related dissemination activities to April 2009.
Papers, Presentations and Posters (External) - see 'Resources' below.
Papers, Presentations and Posters (Internal)
Open University Papers
News articles (internal and external to the OU) - see 'Resources' below
Websites
A prominent new external OU website is planned.
The European Prison Education Association is an organisation made up of prison educators, administrators, governors, researchers and other professionals whose interests lie in promoting and developing education and related activities in prisons throughout Europe in accordance with the recommendations of the Council of Europe.
The main aim of the project is to increase the participation of prisoners in lifelong learning, in order to enable their reintegration into society after release. The project is designed to reinforce the role of educational policies in prisons, taking into consideration the Council of Europe's Recommendation No. R (89) 12, Education in Prison. The project seeks to find solutions to identified needs of teaching and learning processes in European prisons, and to develop strategic policy statements addressed to the decision makers of penitentiary systems at European level.
Conclusions and Recommendations Report http://www.eurodesip.org/en/?cat=4
Anne is an Associate Lecturer on Y162 Starting with maths and MU120 Open mathematics, an Ofsted Inspector and independent education consultant. Her experience of prison education is varied; teaching IT and Skills for Life in a local prison, inspecting education in male and female prisons as an Adult Learning Inspector and tutoring prison students as an Associate Lecturer.
Anne started her COLMSCT research into the Digital Divide for Offender Learners in November 2006 and initial results have led to a review of the Open University’s Offender Learning Strategy. She has coordinated the work of the Offender Learning Steering Group (OLSG) and is currently the OU representative, leading a UNESCO international study into HE and Distance Learning in Prisons. She is also coordinating HE involvement on several pilot projects into secure web access in prisons in England which are helping to bridge the Digital Divide for Offender Learners.
Pike, A (2007) Presentation at the 1st Offender Learning Conference, 1st November 2007
Conclusions from 'Meeting the Needs', 2nd Offender Learning Conference, June 2008
Pike, A. (2007) Investigating the digital divide for HE distance learners in prison. Short paper submitted to ALT-C, Leeds, 9-11 September 2008
Anne Pike presentation at 2nd Open CETL Conference, October 2007
Poster presented at Associate Lecturer Conference 2007
Minutes from the House of Lords All-Party Penal Affairs Group 19th May
Offender Learning Steering Group: Final Report
My research explores students’ everyday experiences of ICT or new technologies, and how they apply those to their learning. The students are adults within a widening participation (WP) programme of distance education courses, the Openings Programme.
In a survey spanning five years, Kirkwood and Price (2005) suggest that there has been a fundamental shift in students’ access to new technologies, reflecting changes in attitudes and in wider society. Other data, however, show that the ‘digital divide’ still cuts across social classes (Citizens Online 2007). The aim is to collect authentic learner viewpoints to add to the debate about the use of ICT in widening participation.
The project included a paper-based survey of students and in-depth student interviews. In this context, ICT or ‘new technologies’ includes computers, mobile phones and other electronic devices, such as hand-held computers. The methodology builds upon that used by Conole et al. (2005) in the recent JISC LXP project (learner experiences of ICT).
The overall plan was to investigate students on the Maths and Science Openings courses, with the Arts Openings course for comparison. Students would be drawn from both WP and non-WP audiences for each course (determined by previous educational qualifications and post-code related demographic data).
Access to technologies, in terms of abilities and motivation as well as hardware, will impact upon student recruitment and retention, as well as course design and student support. Investigating this for a widening participation audience makes the work even more relevant. Various (sometimes anecdotal) reports are surfacing about OU students who do not have access to new technologies, but also about students who have technologies and wish to use them more widely. This is a complicated situation, which I aim to explore in my project. The results should inform ICT strategy on the Openings programme, and contribute to a richer picture of ICT use by this audience.
References
My main aim in becoming a COLMSCT Teaching Fellow was to become active in educational research. I had already gained the MA in Open and Distance Education from IET in 2002. My academic work as one of the developers of the Openings Programme had been rewarding, but this left little research time.
COLMSCT has provided a home for me as a new researcher, with like-minded colleagues, time and financial support. For the first time in many years, I feel part of a research community. My mentors, Anne Adams and David Robinson, have kept me on track. I am also an associate member of CALRG, and colleagues in IET have been very helpful. The luxury of this level of support has enabled me to learn the essential techniques of qualitative research and analysis. I can now use a grounded theory approach to analyse interview transcripts, including coding with Nvivo software. My project is quite a broad investigation, and requires a combination of data-gathering techniques: “Investigating the influence of Information and Communications Technologies on student learning in Maths, Science and Technology subjects for a widening participation audience.” So, I have carried out surveys and semi-structured interviews and I am about to use a multi-media pack with some students, who can record their study experiences using a disposable camera, audio-recorder and written diary. The results, so far, indicate that students make quite subtle decisions about when and how to use a computer in their studies. Even those students who are confident with a computer, with home internet access, may choose not to use it for certain tasks. Control over the study environment is just one factor that influences these decisions.
In personal terms, the transition from the analytical approach of the physical sciences to the more discursive style of educational research was a major achievement. Completing E835 Educational Research in Action addressed my initial concerns about the validity of qualitative research. My first paper was presented at BERA in September 2007, and others are at draft stage, ready for the 2008 conference round. I now feel confident as an active researcher, and I’m very glad I applied for the CETL Fellowship.
The aim of this project is to investigate students' increased use of collaborative learning resources via fOUndIt (http://foundit.open.ac.uk). Additionally it aims to address the issue of course resources becoming outdated, by exploring the use of fOUndIt as a gathering point for user-contributed news, ideas and resources.
I plan to trial a pilot of fOUndIt on an Open University Technology Level 1 course (T175 Networked living: exploring information and communication technologies) and through this, provide students with the opportunity to increase their experience of working with others online, engage in active research, gain access to more up-to-date course-related content and expand their knowledge of good quality web resources.
Through the project, and a related project by John Woodthorpe, I hope to identify best practice and issues raised in the use of Web 2.0 technologies to increase course content relevance, preparing the ground for the use of such teaching resources on the new course T215 and via Moodle, the new Virtual Learning Environment.
Shaw, J. (2010) COLMSCT Final Report ‘Investigating the integration of a user-contributed web-based teaching and learning resource (fOUndIt) with conventional course content and the impact of this on the e-learning experience.’
Shaw, J. (2010) COLMSCT Final Report, Appendix A
Shaw, J. (2010) COLMSCT Final Report, Appendix B
Shaw, J. and Woodthorpe, J. (2009) fOUndit: supporting subject communities by sharing online resources. PowerPoint presentation from the Association for Learning Technology Conference (ALT-C) 2009, 8-10 Sept, Manchester, UK.
Research was carried out to investigate how mathematics can be taught in a virtual world.
There have been three strands to this work.
Click on the links within "Related resources", below, to see more details.
Holding tutorials in Second Life
The researcher is a tutor on the Open University second-level course in Pure Mathematics (M208). Tutorials were held with volunteer students from this course in the virtual world Second Life, and the subjects learned covered linear algebra, group theory, real analysis, and general exam advice/revision. The researcher investigated issues such as how to teach in a virtual world, and how to overcome the further difficulties of communicating complex equations and diagrams.
During the sessions a special experiment was carried out: the tutorials were held in different environments/locations. When data were gathered, via the standard technique of telephone interviews, to analyse the effectiveness of the learning, the students were asked about their experiences. The surprising result was that one of the environments caused a real feeling of claustrophobia among some students. The results indicate the importance of environment when designing learning spaces.
Comparing the social world Second Life with the role-playing game RuneScape to investigate immersion in virtual worlds.
The terms ‘immersion’ and ‘immersiveness’ are used to describe the degree of involvement in a virtual world. They are usually applied to games, but apply equally well to social virtual worlds like Second Life. Although their meaning is intuitively understood by participants in those worlds, there is no agreed definition of the terms. The researcher hypothesised that the more immersed users are, the more they will concentrate on their own avatar, rather than look at other areas of the screen.
Eye tracking hardware can determine where a user is gazing at any time and hence record eye movements. During the Summer of 2008 the researcher carried out an experiment to compare Second Life with the Massively Multi-player Online Role Playing Game (MMORPG) called RuneScape, as a first step towards quantifying whether users became more immersed in a game or in a social virtual world.
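The hypothesis above suggests a simple quantitative proxy: the proportion of gaze samples that fall within the avatar's on-screen region. The source does not describe the actual Tobii analysis pipeline, so the following is only an illustrative sketch in Python, assuming gaze samples arrive as (x, y) screen coordinates and the avatar's position is approximated by a bounding box:

```python
# Hypothetical sketch: estimate an "immersion" proxy as the fraction of
# gaze samples landing inside the avatar's on-screen bounding box.
# Assumes gaze data as (x, y) pixel coordinates and a static box; a real
# Tobii export would provide time-stamped fixations and the avatar's
# region would move from frame to frame.

def gaze_on_avatar_ratio(samples, box):
    """samples: list of (x, y) gaze points; box: (left, top, right, bottom)."""
    left, top, right, bottom = box
    hits = sum(1 for x, y in samples
               if left <= x <= right and top <= y <= bottom)
    return hits / len(samples) if samples else 0.0

# Example: four gaze samples, with the avatar's box near the screen centre.
samples = [(400, 300), (410, 310), (50, 50), (405, 295)]
ratio = gaze_on_avatar_ratio(samples, box=(350, 250, 450, 350))
print(f"{ratio:.2f}")  # three of four samples fall on the avatar -> 0.75
```

Under the stated hypothesis, a higher ratio in one world than another would indicate greater relative immersion, which is the kind of quantitative comparison the Second Life versus RuneScape study aimed at.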
The researcher’s COLMSCT colleagues Catherine Reuben, Diane Ford, Frauke Constable, Laura Hills, Katherine Perry, and Claire Dunlop participated in this pilot study. He received help, advice and supervision from Dave Perry in the Institute of Educational Technology at the Open University. A Tobii T60 eye tracking monitor was used for the pilot, connected to a purpose-built computer running Windows. The results were analysed, and the researcher collaborated with Dr. Anne Adams in order to compare in a quantitative manner the relative immersion in the two different types of world. A paper was produced and presented by Dr. Adams at the HCI 2009 conference.
Utilising the 3-D nature of virtual worlds to illustrate mathematical concepts
The third strand of the researcher’s work involved investigating how a virtual world can go beyond normal classroom activities and utilise the 3-D nature of the world for mathematics. In a standard face-to-face classroom environment mathematicians, scientists and technologists have for generations explained 3-D concepts using 2-D surfaces, such as whiteboards or PowerPoint presentations. While in the real world video can of course also be used, in a virtual world you can in theory reproduce everything that a face-to-face session does, but also go beyond this to build and animate 3-D objects to be used in the virtual classroom.
Technology currently acts as a barrier to some activities in virtual worlds, as it takes an investment of time and effort to build objects, and much time is taken up with specific technical issues such as lag, students’ access, and the specifics of the actual virtual world being used (Second Life). So the researcher produced a proof-of-concept video. A virtual learning session was held in Second Life to explain mathematical concepts but, rather than invite students, the researcher recorded the session to produce a video. Thus the possibilities of teaching mathematics in virtual worlds using their 3-D nature were explored, while allowing wider access to the resultant video.
The researcher hopes the video is an illustration of what will be possible in the future, as technology, computing power, and virtual worlds advance in design and capabilities.
The researcher may have been the first person in the world to hold tutorials with students on a university level mathematics course in a virtual world.
His first ice-breaker/orientation meeting was held with students on 15th April 2008 in Second Life, and the first mathematics tutorial (on Linear Algebra) was held on 6th May 2008. The tutorials were for M208, the Open University Pure Mathematics course. That year tutorials were also held for M150, the course on Data, Computing and Information, but the researcher believes they were held later, and in any case it could be argued that they were about data, computing and information rather than mathematics as such. (The researcher would be interested if anybody has information about earlier official mathematics lectures or tutorials in virtual worlds; please contact him.)
Additional information - Schedule
Ice-breakers/Orientation meetings (3)
Mathematics tutorials (7)
Social worlds are the Multi-User Virtual Environments (MUVEs) which are primarily used for inter-personal relations, although there is an overlap between these and gaming worlds; for example, Second Life is often referred to as a game. Other acronyms in use are MMOG, for Massively Multi-player Online Game, and MMORPG, for Massively Multi-player Online Role Playing Game. These are used mainly, of course, for the pure virtual gaming worlds.
So a good place to start is with a list of virtual worlds.
If you're new to virtual worlds, you might like to review the work of Edward Castronova. Two things made people take an interest in virtual worlds. The first: in a seminal piece of work, Castronova, an economist, was the first to recognise that the total economic value of trading within virtual worlds (in particular games) was equivalent to the GDP of a small country. The second was the fact that virtual gaming became a multi-million dollar industry.
One of the purposes that social worlds are being put to is education. Many secondary and tertiary educational establishments, including universities worldwide, are investigating the use of virtual worlds for teaching, eLearning and distance education.
The Open University has made an investment in Second Life and there are multiple links from the main COLMSCT pages. Also see the main COLMSCT project page by Anna Peachey, Open Life: Teaching and Learning in Second Life, which has lots of relevant information.
Project aims:
This proposal fits into the COLMSCT priority of “Subject Communities and Identities in an Online Environment”. It also links into one of the research aims in the ICT department’s Unit plan to “Keep at the forefront of, and innovate in, eLearning” and fits squarely in our “Education and ICT” headline theme. One intended impact of the project on students is to enable and encourage them to recommend, rate and discuss resources. Involving students in directly providing and maintaining course resources will give them opportunities to develop their skills in finding and evaluating material relevant to their studies. Contributing, rating and commenting on resources should increase collaboration between students and allow communities to develop around subject areas. This will extend and complement the role of VLE tools such as blogs, wikis and MyStuff in increasing the interaction students have with course materials and with each other.
Shaw, J. and Woodthorpe, J. (2009) fOUndit: supporting subject communities by sharing online resources. PowerPoint presentation from the Association for Learning Technology Conference (ALT-C) 2009, 8-10 Sept, Manchester, UK.
Woodthorpe, J. (2010) COLMSCT Final Report. Appendix A.
Woodthorpe, J. (2010) COLMSCT Final Report. Appendix B.
An investigation of wikis and audio conferencing for supporting students’ collaborative learning in online course settings.
This project investigated new forms of collaboration technology, as used in Open University technology courses. The project aimed to evaluate the role of synchronous and asynchronous tools for supporting collaborative learning among students.
In particular, the project investigated the following tools, provided within the Open University's Virtual Learning Environment:
The project considered the benefits and issues of using these facilities for structured collaborative work as part of a course (e.g. online tutorials and group projects).
It sought the views and experiences of students and of tutors.
Wikis were investigated in the context of group projects in the course Information and communication technologies: people and interactions (T209).
Elluminate synchronous conferencing was investigated in the context of online tutorials in the course Networked living: exploring information and communication technologies (T175).
Benefits to students arising from the project will be improved facilities and environments for online collaboration and communication. The project will contribute to developing good online tools and good practice in using them for teaching and learning. This will help students to undertake collaborative learning activities with more engagement, enjoyment, and sense of community.
The project was aligned with COLMSCT’s theme: ‘e-Learning communities and identities' as it aimed to enhance collaborative learning and course community.
The project also contributed to the work of the Technology and Education Research Group, within the Maths, Computing and Technology Faculty (MCT). Colleagues from the MCT Faculty and COLMSCT made significant contributions to the work.
Chetwynd, F and Kear, K. (2009) Presentation at the CETL-MSOR Conference, Milton Keynes, UK, 7-8 September 2009.
Kear, K. (2008) eLearning Community talk - wikis and blogs in T175
Kear, K. (2008) Presentation at FELS 'Intellect' Research Group.
Kear, K. (2008) Presentation at the Open CETL Conference, Milton Keynes, UK, 24-25 September 2008.
Kear, K. (2009) Presentation at the Open CETL Conference, Milton Keynes, UK, 15-16 December 2009.
Kear, K. (2010) poster - wikis and Elluminate research
One of the fastest growing areas of online teaching and learning at the moment is the use of virtual worlds or MUVEs (Multi User Virtual Environments). The Open University has been working in Second Life, arguably the most versatile virtual world for education purposes, since 2006.
The Open University started with two islands in Second Life, Cetlment (a COLMSCT project island) and SchomeBase (belonging to the Schome project: www.schome.ac.uk). In early 2008 we moved from Cetlment to Open Life, located next to SchomeBase, and the two islands were developed together according to our experience, understanding, and observation of what works and doesn't work in this immersive environment. Spaces on Open Life, our formal teaching island, are naturally inclined towards creating communities and supporting smaller groups, with a variety of learning event spaces. Over 2008 the Open University representation in Second Life grew and developed into a lively, thriving presence. Anna's research focused on modelling this community according to real world concepts of community, and in 2009 we acquired a third island, Open Life Village, that has been developed using strong real world metaphors as a social presence for the University community.
Anna Peachey and Liz Thackray, both COLMSCT Teaching Fellows, use Second Life for tutorials with students studying T175 Networked Living: Exploring Information and Communication Technologies, and have worked on other course resources and events for this programme.

Anna is acting as liaison with all other interested parties in the University to converge in the development of our SL presence and capabilities, and providing an advice and reference point for those who have an interest in exploiting SL. In November 2008 Anna chaired ReLIVE08, an international conference on researching learning in virtual environments that was hosted at our real world campus in Milton Keynes. See the conference legacy website www.open.ac.uk/relive08 for further details.
Anna Peachey has been an Associate Teaching Fellow with the COLMSCT CETL since 2006. During the period of her fellowship she has seen the University presence in virtual worlds move from a single-project pilot to a substantial presence in Second Life, recently recognized by Linden Lab who chose to showcase the OU as an international case study (available from http://secondlifegrid.net.s3.amazonaws.com/docs/Second_Life_Case_OpenU_EN.pdf).
Anna has taken responsibility for overseeing and coordinating all University activity in virtual worlds and her company, Eygus Ltd, now project manages this process under contract with the Learning Innovation Office. In 2008 the project reached the final of the Times Higher Education Award for Outstanding Innovation in ICT. Anna is currently making plans for ReLIVE10, following on from the very successful ReLIVE08 (Researching Learning in Virtual Environments) Conference she chaired through the CETL in 2008 (www.open.ac.uk/relive08).
OU staff and students can access the internal Virtual Worlds project website at http://learn.open.ac.uk/course/view.php?id=5201.
Committee Memberships
Board Memberships
Book
Chapters
Papers
Invited Presentations
Workshops/Symposia/Presentations/Posters
Second Life is a multi-user virtual environment (MUVE), created and managed by a company called Linden Lab. It is a complete, immersive 3D environment, home to over 17 million residents who interact using customisable avatars.
Whilst it is very easy to have a Second Life for free, the environment does have a fully developed economy allowing residents for example to buy clothes for their avatar, buy land and build a house on it, attend parties or go to clubs or take part in world events. The currency is the Linden Dollar and there is a reasonably stable Lindex exchange rate between Linden dollars and US dollars.
Second Life is not a game (like, for example, World of Warcraft) in that avatars are not set tasks, required to achieve objectives or take part in activities. It is an online environment shaped by the residents and, therefore, is more flexible and suitable for applications such as education and business. There is a substantial educational community in Second Life: Universities from across the globe have chosen to establish a presence, paying Linden Lab for virtual islands that they can use for teaching and learning. The educational community is lively, with a lot of educational resources available through the Second Life website. There are several very active mailing lists for exchange of ideas and information and there are frequent educational events, including live link ups with conferences, inworld meetings and groups. 
As well as educational institutions, many businesses are now using Second Life for training, meetings and other interaction. Some enthusiasts have suggested that, especially now that the code for Second Life has been made Open Source, it is paving the way for Web 3.
The Open University completed its first project in Second Life in 2006-2007. Cetlment Island was established as a virtual campus with collaborative shared areas in the centre of the island and alternative teaching and learning spaces around the island. All the teaching and learning spaces had access to tools such as interactive whiteboards, chat tools, blog page links and other tools. There were laptops students could 'wear' to send emails, make blog entries, watch video material and listen to the radio.
The main plaza area housed a library and resource centre as well as some social spaces, including a very funky disco! There was an area to host tutorials to build further skills in using Second Life and sandbox areas for experimentation and practice. The centre of the island made use of real life metaphors for teaching and learning spaces, supported by other spaces that used less conventional metaphors.
Cetlment represented a pilot project for COLMSCT, specifically working with T175 students, and has now ceased to exist. A new OU island, Open Life, followed the Cetlment project and was located adjacent to SchomeBase. This island is funded by the Schome Project, which also runs Schome Park, an island on the Teen Grid. Open Life and SchomeBase were modelled together to provide an integrated and exciting immersive space, building on all the lessons we learned from Cetlment and our time investigating and observing many formal and informal education projects in Second Life. Student participation increased significantly over this period (2008-2009) and community development centred around the residential halls on the islands.
Anna Peachey and Liz Thackray, both COLMSCT Teaching Fellows, use Second Life for tutorials with students studying T175 Networked Living: Exploring Information and Communication Technologies, and have worked on other course resources and events for this programme, partly funded by an HEA grant.
Working through the Learning Innovation Office, Anna is acting as liaison with all other interested parties in the University to converge in the development of our SL presence and capabilities, and providing an advice and reference point for those who have an interest in exploiting SL. OU staff and students can access the LIO Virtual Worlds website at http://learn.open.ac.uk/site/openlife. In November 2008 Anna chaired ReLIVE08, an international conference on researching learning in virtual environments that was hosted at our real world campus in Milton Keynes. See the conference legacy website at www.open.ac.uk/relive08 for further details.
In 2009 Open Life will continue to be used for tutorials and for further research into how best to support teaching and learning in Second Life, particularly to explore innovative and diverse teaching methods in this immersive environment.
Sloodle, an open source development, is also a significant future tool for the Open University as it enables Second Life to be used within Moodle and therefore potentially within the OU VLE. Chats, calendar objects, assignment details and documents will be able to be 'rezzed' (made into 3D virtual objects) in Second Life directly from Moodle. This has implications, for example, for disabled students who are not able to use screen readers in Second Life. They would be able to join in a Second Life tutorial from a VLE conference window, which the screen reader could access, and contributions could be made from either platform.
Teaching in Second Life requires similar skills to synchronous teaching in a VLE environment. Facilitators can use the built-in text chat tools in Second Life or the built-in SL audio. It is also possible to use a combination of text and audio, and to use instant messaging to communicate privately with individual students during group sessions. Students learn effectively in SL by being active, interacting with other avatars and being set tasks, but Second Life is also frequently used for meetings or more formal presentations and lectures.
Anyone can create a Second Life account and hence use their own avatar to interact inworld.
This is done by going to the Second Life website, downloading the client software and creating the account. You can alter the appearance of your avatar at any time. The only thing you cannot do is change the name of your avatar. You can, however, create multiple avatars if you wish.
The document on the right "Getting online with Second Life" takes you through all the steps necessary to get online and start using Second Life.
The downsides of Second Life mostly revolve around technology. The client software is currently about 35 MB and, as with most online games, resides on the user's computer. This can cause problems if users do not have administrative rights for the machine they are using, as they cannot install the software, although this can sometimes be worked around by installing it on a memory stick and running it from there. The client software is updated regularly, meaning an optional installation of a new version of the software, usually about once every two weeks. The software also requires a reasonably high specification machine, with a recent graphics card and a broadband internet connection. Firewalls can also cause problems.
Open Life has a sandbox for anyone to practice SL skills such as building, an event area for special events, an info-building (the nOUbie Centre), a formal teaching and learning space, a dedicated building for the Open Degree and several small, informal meeting areas, as well as lots of entertaining secrets to discover. Open Life is right next door to Open Life Ocean, and flying or sailing across the ocean will get you to Open Life Village, our social island developed according to a real world village metaphor with houses, village green, pub etc.
The (internal) website for all our virtual world developments can be found at http://learn.open.ac.uk/course/view.php?id=5201
The aim of the project is to investigate the pedagogical effectiveness of 3D Multi-User Virtual Environments (MUVEs) such as Second Life and their role in enhancing the student's learning experience.
The key research questions that we are investigating are:
In the PAVE project (Pedagogical Effectiveness of Multi-user Virtual Environments), our aim has been to investigate the pedagogical effectiveness of 3D Multi-User Virtual Environments (MUVEs) and their role in enhancing the student's learning experience. Although our empirical studies have focused on Second Life, it is hoped that the results will be applicable to 3D virtual worlds in general. The key research questions that we have investigated are:
Immersive Virtual Worlds (IVWs) have gained a great deal of attention in the education community in recent years.
The 3-dimensional nature of IVWs enables activities and interactions which are not possible in 2-dimensional virtual environments, and may provide a platform for activities which are not easily accommodated in normal classroom environments. One of the better known IVWs is Second Life. The Open University has made a considerable investment in Second Life through the purchase of five islands and a number of experiments in teaching within the virtual environment.
At the present time, there is no primer for how to teach in an IVW. Many educators are beginning to explore the possibilities and to share information about what they are doing, but all are on a journey of discovery. That journey involves utilising knowledge and experience gained from other classroom experiences and from the general body of learning theories, but it also means confronting the unknown, taking risks and developing new approaches which may or may not be linked to previous knowledge and experience.
This project will aim to build on work currently being undertaken in Second Life in developing a toolkit and training materials for Open University ALs. Close collaboration will be maintained with Anna Peachey, who is developing the OpenLife Island and undertaking other work in Second Life.
More specifically, the project will have three main areas of activity:
A group of Open University Associate Lecturers will be recruited to trial the materials developed.
A selection of COLMSCT related dissemination activities to January 2009.
HEA SecondLife Masterclass - 10th June - University of Portsmouth
Among the many events notified by COLMSCT, this one caught my eye. I have been involved in various projects in SecondLife over the past 18 months and am interested in learning more. Included in the notification was a note inviting anybody interested in sharing their experience of SecondLife to contact the organisers. I thought about it, decided everybody would know more than me, thought about it again, phoned a friend, and decided to send an email outlining the work I had been doing and offering my services. Somewhat to my surprise, not only did I get an acknowledgement but found myself one of the 5 presenters at the event! Lesson learned - it's worth pushing doors as at worst they may not open.
On 10th June, about 30 people met together for the workshop. It was very much a hands-on experience with opportunity to share experiences and to learn from others. There were 4 workshops during the day: 2 had an emphasis on scripting and building, and the other 2 were more practice-focused, covering assessment and the practicalities of working in SecondLife.
The sessions ran in parallel, so I was able to attend the assessment workshop, and I led the afternoon workshop on lessons learned from practice.
In my session, I focussed initially on the tutorial activity being undertaken in T175 - if you want to know more, Anna Peachey and I are offering an opportunity to participate in a tutorial simulation at the OpenCetl conference in September. I then described a piece of work I have been involved in with colleagues at the University of Sussex where students have been developing learning experiences in SecondLife as part of an Interactive Learning Environments module. Finally, there was an opportunity to visit the OU OpenLife island and to show participants some of the development work being undertaken there.
Discussion topics included how we engage students in SecondLife and what equipment is needed to run SL (we saw it running on an EEE PC!). The hoary question of assessing groupwork raised its head. We also discussed the relevance of SecondLife as a tool, recognising that although SL is useful, it is not appropriate for use all the time and there is much that is better taught using more traditional methods.
It was good to meet some OU colleagues I had not met before and to have the opportunity to network with colleagues from other HEIs with common interests.
Do take the opportunity to put your head above the parapet and share your work if you get the chance. I very much enjoyed my day in Portsmouth and learned a lot from the experience.
I'm also attaching my presentation.
Liz Thackray
Students taking “Environmental monitoring, modelling and control” (T308) are required to undertake an environmental assessment project as their end of course assessment. The project integrates the various themes of the course and provides experience of the work carried out by many environmental technologists.
The key project aims are:
Burnley, S. J. (2008) Poster presentation from the 3rd Open CETL Conference 'Building Bridges' 24-25 Sep 2008.
Burnley, S.J. (2008) Poster presented at the International Society for Scholarship in Teaching and Learning Conference (ISSOTL), Edmonton, Canada. 16-19 October 2008.
Burnley, S.J. and Taherzadeh, S. (2007) Poster presented at the Science Faculty Teaching Innovations Day, March 2007.

Mathematics poses a number of challenges for learners and teachers alike in an online learning environment. This is mainly because mathematical notation is difficult to produce electronically without a significant investment in training to use specialist typesetting packages (such as LaTeX). There are, however, a number of eLearning type technologies that might be used to overcome this problem. Additionally, there are a variety of eLearning type products that could be used to enhance a student’s overall learning experience whilst studying mathematics.
These projects are exploring and piloting the teaching and learning of mathematics in an online environment; building internal awareness and academic capability in order to improve the learning opportunities we offer to students; and facilitating fuller academic engagement with teaching online, in order to underpin our aspirations to provide a national lead in this form of elearning. Both the CETL and Mathematics-related communities will benefit greatly from this collaboration.
James Gray, COLMSCT Associate Teaching Fellow
This project is addressing a problem that concerns many of us: to produce good looking word processed mathematics can often take longer than doing the mathematics itself.
Stuart Freake, COLMSCT Teaching Fellow
This project aims to investigate options for web–based communication between associate lecturers and students, both for e-tutorials and for one-to-one support, particularly the ability to communicate complex equations and diagrams.
Felicity Bryers, COLMSCT Associate Teaching Fellow
Developments in technology have already provided potential new methods of supporting students. The use of a synchronous online interface involving both voice and an interactive whiteboard offers an exciting opportunity to widen participation by offering tuition to remote students and others unable to attend face-to-face tutorials.
Abigail Kirk
This project has investigated the use of interactive sessions in Elluminate to support students’ learning of Mathcad.
Tim Lowe
The aim of the project is to develop an initial set of example questions using the STACK computer-algebra based e-assessment system and to trial these with students. The intention is that the questions are appropriately randomised and provide extensive, individualised feedback on the answers submitted by students, in addition to a full worked solution to the problem.
Gaynor Arrowsmith
To investigate logistic, technical and pedagogical aspects of electronic marking (eMarking) of Tutor-Marked Assignments (eTMAs) for mathematics courses.
Pat Bailey
The aims of the project are to work closely with the Mathematics Online (MOL) and MU123 teams to:
Jonathan Fine
There are problems with mathematics in electronic media. The most basic symptom of the problem is that we often cannot copy-and-paste mathematical content from one application to another.
Sarah Chyriwsky
A key objective of the Mathematics Online Project is the development of an electronic tutor marked assignment system that empowers associate lecturers rather than burdens them.
Bill Tait, COLMSCT Associate Teaching Fellow
The overall aim of the project was to evaluate Web Based Learning as a means of providing additional support for students on Open University courses.
This project is addressing a problem that concerns many of us: to produce good looking word processed mathematics can often take longer than doing the mathematics itself.
Online submission is not yet possible for most mathematics courses in the OU because of the extra burden that typing mathematics or writing mathematics electronically will place on students and tutors. In this project a first year tutorial group of mathematics students used ASCIIMathML to type their assignments and the author marked them online. ASCIIMath, as it is also known, provides a linear syntax that can be transformed into MathML and can provide a way of generating dynamic Web pages that allow students to input mathematical expressions that are rendered attractively. The interface is relatively simple and intuitive.
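To illustrate the idea of a linear input syntax being transformed into typeset notation, here is a toy sketch of such a translation. This is not the ASCIIMathML library itself (which is a JavaScript script that produces MathML in the browser); it is a minimal, hypothetical Python translator covering only a few ASCIIMath-style patterns, rendered here as LaTeX for brevity.

```python
# Illustrative sketch only: a toy translator for a tiny subset of
# ASCIIMath-style linear syntax. The real ASCIIMathML is a JavaScript
# library that generates MathML in the browser; this merely shows how
# a simple linear syntax can be mapped to typeset notation.
import re

# Each pair is (pattern in the linear syntax, LaTeX replacement).
REPLACEMENTS = [
    (r"sqrt\(([^)]*)\)", r"\\sqrt{\1}"),              # sqrt(x+1) -> \sqrt{x+1}
    (r"\(([^)]*)\)/\(([^)]*)\)", r"\\frac{\1}{\2}"),  # (a)/(b)   -> \frac{a}{b}
    (r"\balpha\b", r"\\alpha"),
    (r"\bpi\b", r"\\pi"),
]

def ascii_to_latex(src: str) -> str:
    """Translate a small ASCIIMath-like expression into LaTeX."""
    out = src
    for pattern, repl in REPLACEMENTS:
        out = re.sub(pattern, repl, out)
    return out

print(ascii_to_latex("sqrt(x+1)"))    # prints \sqrt{x+1}
print(ascii_to_latex("(x+1)/(x-1)"))  # prints \frac{x+1}{x-1}
```

The appeal for students is exactly this: the left-hand forms can be typed on an ordinary keyboard, while the rendering engine takes care of the presentation.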
This Final Project Report, available under Related Resources on the right hand side of this page, contains a description of the relevant software, the development of relevant resources and the feedback from the students who took part in the pilot, as well as a summary of the current impact of the project. The evaluation is limited by the numbers of students who took part.
Further details of ASCIIMath and the project can be found on the website http://www.wjagray.co.uk/maths/ASCIIMathMLinfo.html
A selection of COLMSCT related dissemination activities, to October 2007.
James Gray, COLMSCT Associate Teaching Fellow
wjag2@tutor.open.ac.uk
This project has investigated the use of interactive sessions in Elluminate to support students’ learning of Mathcad.
Mathcad is a package used by students on several Mathematics courses, and many students experience problems in learning to use it. It is difficult to support students’ Mathcad work via the means traditionally available within the Open University. For this reason, support sessions using the on-line system Elluminate were investigated.
The support was found to be effective in the following ways. Students brought their own questions/problems to the sessions and explained these to the tutor. They were able to work interactively through the relevant problems/tasks – in most cases the student was able to work live in Mathcad. The students’ understanding of the topics could be seen to improve during the tutorials themselves. Elluminate played the vital role of facilitating real-time interaction between tutor, student and Mathcad software.
A number of issues were identified and investigated. Many of these concerned ways in which the behaviour of tutor and student could make interaction and tutor input less effective. Possible ways of improving this were identified and evaluated. Other issues concerned the interaction between the student and Mathcad, the effects of transmission delays, and problems with application sharing and audio within Elluminate. In many cases solutions to these problems were identified and evaluated.
This project aims to investigate options for web–based communication between associate lecturers and students, both for e-tutorials and for one-to-one support, particularly the ability to communicate complex equations and diagrams.
The project will develop and trial e-tutorials in Electromagnetism, investigating a range of pedagogies and approaches.
This project explored how e-tutorials could be used to support students in a subject area that requires communication of mathematical equations and diagrams. The project was prompted by the need to provide effective tutorial support for a new Electromagnetism course.
Eight tutors each ran a week-long e-tutorial in a FirstClass forum during the presentation of the course. They used different formats and techniques, and the e-tutorials were archived for later analysis of participation and interactions. As well as these e-tutorials, students could also attend regional tutorials and inter-regional day schools. ALs were sent a questionnaire to ascertain their views about the different forms of tutorial support, and a sample of students were also surveyed.
Survey results indicate that providing a mix of e-tutorials, regional tutorials and day schools led to an increase in overall use of tutorial support. However, students and tutors generally preferred face-to-face support rather than on-line support. Participation in e-tutorials was low - only about 30% of students logged in to the e-tutorials, and less than 10% of students posted one or more messages. Nevertheless, for the hard core of regular users, which included a number of students who were unable to access face-to-face tutorials because of a disability or difficulty in travelling to the venues, the e-tutorials provided a welcome form of support. A report on the project is available in the documents section of this page.
Experience of running the e-tutorials in 2006 led to production of some guidelines for tutors and students about effective use of e-tutorials, which were circulated for the second presentation of the course. These are available in the documents section of this page.
This project was funded through piCETL. See http://www.open.ac.uk/picetl/activities/details/detail.php?itemId=4612257d02466&themeId=460260f480f84 for further details, papers and reports.
Stuart Freake, COLMSCT Teaching Fellow
S.M.Freake@open.ac.uk
Developments in technology have already provided potential new methods of supporting students. The use of a synchronous online interface involving both voice and an interactive whiteboard offers an exciting opportunity to widen participation by offering tuition to remote students and others unable to attend face-to-face tutorials.
Developments in technology have provided an opportunity for online synchronous tuition for mathematics courses. This allows students who are unable to attend face-to-face to take part in tutorials.
This project is exploring the use of online tuition using first Lyceum and later Elluminate to teach the first level course, Open Mathematics, MU120. One problem with teaching mathematics online is that it is not easy to write mathematics on an electronic whiteboard. The project has concentrated on developing a library of whiteboards for MU120 tutors to use in online tutorials, making it feasible for tutors to teach online without onerous preparation of resources.
The whiteboards were developed for Lyceum and were used by some tutors on the 2007 and 2008 presentations of the course. Feedback from their use is being evaluated.
During 2008, the Open University decided to replace Lyceum by Elluminate, and the 2008 presentation of MU120 was the pilot course. The library of whiteboards is being transferred into Elluminate. This has involved some rethinking to make the most of the new facilities offered by Elluminate. Work is continuing to encourage and train tutors to use online tuition.
Felicity Bryers, COLMSCT Associate Teaching Fellow
fab5@tutor.open.ac.uk
To investigate logistic, technical and pedagogical aspects of electronic marking (eMarking) of Tutor-Marked Assignments (eTMAs) for mathematics courses.
The investigation will include (i) the effect of eMarking on the quality of the marking and of the feedback to students; and (ii) the effect of electronic submission by students on the quality of their assignments, in particular, on the balance between form and content and on the development of good mathematical style.
It will also consider the functionality of a range of software and hardware approaches to the marking of eTMAs, primarily including Word 2007 and annotation through use of Tablet Pens and/or the inking facility within Word 2007.
The results will be used to inform the introduction of eTMAs within mathematics courses.
This project is closely linked to Sarah Chyriwsky's project on Marking tools for Electronic TMA – design, evaluation and pedagogic issues.
A key objective of the Mathematics Online Project is the development of an electronic tutor marked assignment system that empowers associate lecturers rather than burdens them.
Introduction
The maths online project is exploring ways to enable tutors to assess eTMAs while providing rich feedback, including mathematical notation and diagrams. At the same time, it is important that students are not restricted in the way they can prepare their assignments for electronic submission. In 2008, 24 tutors across a range of mathematics and statistics courses took part in a trial using Word 2007 for e-marking, augmented by tools developed in-house.
Aims
(i) to draw on tutors' experience of the previous Word 2007 mini-trial and of other COLMSCT projects in this area, to identify key technological and pedagogical issues informing the design and subsequent development of marking tools;
(ii) to design and collate a prioritised list of marking tools for development in preparation for the 08B extensive trial;
(iii) to assist with the 08B eTMA trial and, specifically, to design and implement the evaluation of the marking tools as part of this trial;
(iv) to report the findings of the evaluation and to make recommendations on the use of marking tools in mathematics eTMAs.
Additionally, tutor feedback was sought to investigate the impact of handling and marking eTMAs on the quality and quantity of the feedback given on assignments.
This project is closely linked with that of Gaynor Arrowsmith.
Evaluation
Tutors' perceptions of handling and marking eTMAs were sought via three formal questionnaires throughout the trial, informal regular reports and an online forum. Examples of some tutors' marking on paper were also compared with their electronic marking.
Results
Initial results were presented at the OpenCETL conference at the OU in September 2008. Overall conclusions based on all the feedback were presented at the International Conference of Distance Learning (ICDE) in Maastricht in June 2009.
Initial observations of the impact on the quality of feedback were presented at the Making Connections conference at the OU in June 2009. This was expanded on in a poster presentation at the ALT-C conference in Manchester in September 2009 and further issues were discussed at the 4th OpenCETL conference at the OU in December 2009.
The final report on the trial is now available (see link on right). The conclusions are reproduced here.
Conclusions
Depending on their style of paper-based marking and the nature of the assignment, tutors developed subtly individual approaches to e-marking. Preferences were also dictated by the hardware available to each tutor, as tutors are expected to use their own computer hardware. Their confidence with each software option for marking was also a factor in the approach adopted. For example, those with access to reliable inking technology made extensive, and in some cases exclusive, use of it, whereas others adopted a more mixed mode. The small graphics tablets were generally found to be unsatisfactory, whereas Tablet PCs were rated highly by users.
The format of the student’s work also influenced the marking style developed. Assignments submitted as scans of handwritten work, for example, were received by the tutor as a series of embedded images in a Word 2007 document, in which typed comments had to be applied via textboxes overlaying the student’s work.
Most tutors quickly and naturally sought a method of annotation which facilitated rapid communication of their reactions through comments and explanations placed precisely at the relevant point within the student’s work. However, this ‘flow’ of thought tended to be inhibited rather than aided by the technology, with tutors reporting difficulties in positioning textboxes, adding ticks, scrolling through lengthy assignments and handling multiple document formats. The impact of this was to make the e-marking experience less convenient and more time-consuming and stressful for the majority, as compared to paper marking. Those using approaches which preserved the ‘flow’ of marking, notably those using track changes or reliable methods of inking, found the e-marking experience more positive.
Subtle changes in approach became apparent towards the end of the trial, primarily through the tutors’ open responses to the questionnaire. Recognising that the effort required was unsustainable, some tutors regretfully suggested that they might need to use a less rich mix of annotations (ticks, highlights, typed or inked comments, etc.) and increase their re-use of comments across scripts, despite the potential for a perceived reduction in personalisation. Some tutors reduced their emphasis on positioning feedback close to the relevant point in the student’s work. Some suggested restricting the students’ choice in preparing assignments, for example by constraining them all to use Word 2007, or all to handwrite and scan.
In summary, the importance for tutors to be able to respond to what the student has written without essentially any interruption to the flow of their thoughts has been confirmed. The project has also provided additional insight into how tutors respond to mathematical eTMAs, which is both individual and complex, and into the facilities required to maintain quality correspondence tuition electronically. E-marking of maths eTMAs can place unreasonable demands on tutors, demands which aren’t sufficiently alleviated by the essentially software-only solution adopted in the trial.
Sarah Chyriwsky
COLMSCT-Centre@open.ac.uk
The aims of the project are to work closely with the Mathematics Online (MOL) and MU123 teams to:
a) Carry out a literature review of pedagogical research in using e-learning objects for teaching mathematics at this level.
b) Research and critically evaluate existing e-learning objects and software packages designed to encourage deep learning in mathematics.
c) Compile a list of existing resources that demonstrate good practice in this area (for example, applets from the Freudenthal Institute, NCTM Illuminations) and to provide a set of guidelines for evaluating learning objects that can be used by ALs and course teams.
d) Test a selection of these objects on students with different backgrounds, experiences and abilities.
e) Based on the results of (a) to (d) and focusing on the mathematics in MU123 and associated pre-course open content materials suggest:
(i) where existing e-learning objects can be used;
(ii) how existing e-learning objects might be extended or developed;
(iii) new e-learning objects that may be developed.
f) Evaluate the suggestions made in e) including trials with students.
The project is linked directly to the Unit plans for production of the new Level 1 course in mathematics (first presentation in 2010) and to the objectives of the Maths On-line project, and it may have additional benefits for courses in all faculties which require basic mathematical skills.
It is also in line with the move to electronic learning and the development of the VLE.
We hope that the main impact on students would be:
Pat Bailey
pb24@tutor.open.ac.uk
There are problems with mathematics in electronic media. The most basic symptom of the problem is that we often cannot copy-and-paste mathematical content from one application to another.
The three main forms for mathematical content are the TeX typesetting language, the MathML markup language, and bit-map or other graphics. In addition, traditionally the creation of mathematical content has had a steep learning curve.
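To make the contrast between the forms concrete, the same formula can be written in each. In TeX, the quadratic formula is authored as a short linear source, shown below; in MathML the same formula becomes verbose nested element markup (mfrac, msqrt and so on), and as a graphic it loses its source altogether, which is why copy-and-paste between applications breaks down.

```latex
% The quadratic formula as TeX mathematical markup.
% The author types this linear source; a TeX engine typesets it.
% Delivering the result only as a graphic discards this source.
\[
  x = \frac{-b \pm \sqrt{b^{2} - 4ac}}{2a}
\]
```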
The establishment of the Mathematics OnLine project (MOL) shows the importance of these problems to the OU, and their solution will prevent mathematical content from being a troublesome and expensive special case.
Printed materials for mathematics and upper level physics courses are created using TeX, and delivered electronically to students only as PDF files. The mathematical content so delivered cannot be copied into discussion forums other than as graphics. Many other courses use Structured Authoring (SA), which is now providing web page delivery of course materials.
This proposal investigates the use of new and emerging technologies that will help solve these problems, which are now becoming more acute as the OU moves towards a Virtual Learning Environment (VLE). The focus is on obtaining pedagogic experience, which can then be used to improve both the solutions and their presentation to students.
This project has two strands. The first, formulas in web pages, is to create some trial transformations, similar to those in SA, of existing TeX-based printed materials into interactive web pages for the VLE, and to evaluate their usefulness to students. This feedback will be used to refine the web pages and produce examples for authoring, transformation and delivery.
The second strand, authoring formulas, is to explore, evaluate and where helpful develop tools and teaching materials that help students and teachers to create mathematical formulae electronically. This will help us obtain standard tools for use as components in our learning environment, and provide a pedagogic framework for the students’ use of these tools.
The first strand will help bring print production for mathematics courses (authored in TeX) into the mainstream, as represented by Structured Authoring (SA), the OU XML Schema, the VLE and OpenLearn. This will give students more useful online learning materials. The second strand will contribute to OpenMark, and help meet the goals of the Mathematics OnLine project, particularly that of electronic TMAs. It will also produce a self-guided introduction to TeX mathematical markup, that can be published on OpenLearn.
The two strands are complementary because communication is a two-way process.
Jonathan Fine
j.fine@open.ac.uk
The overall aim of the project was to evaluate Web Based Learning as a means of providing additional support for students on Open University courses.
Web based learning materials were designed for use in an Open University course on JavaScript programming. Initial studies suggested that they should be implemented in the form of online tutorial learning objects but they also uncovered some pedagogical problems associated with the use of such objects. The problems were studied and a theory of object pedagogy was developed to provide a basis for the design process. Two tutorial objects were developed and deployed on a website for tutor and student evaluation. Feedback from these and other sources indicated that the objects did provide a useful resource for additional learning support. In addition, eXtensible Markup Language (XML) technology was investigated as a means of solving the pedagogical problems by allowing the objects to be easily repurposed, and this proved to be a valuable and enabling innovation. The overall conclusion was that web based learning materials are a viable option for learner support and that they can be adapted, and even extended to fully editable objects and mobile objects by the use of XML technology.
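The repurposing idea can be sketched briefly. The element names and render targets below are hypothetical, invented for illustration rather than taken from the project's actual schema; the point is simply that one XML description of a learning item can be transformed into more than one presentation.

```python
# Sketch of XML-based repurposing of a learning object, under an
# invented schema: the same XML source is rendered once as a web quiz
# fragment (answer hidden) and once as a printed worked example.
# Element names here are hypothetical, not the project's real schema.
import xml.etree.ElementTree as ET

SOURCE = """
<item>
  <prompt>What does the JavaScript typeof operator return for null?</prompt>
  <answer>"object"</answer>
</item>
"""

def as_quiz_html(xml_text: str) -> str:
    """Repurpose the item as a quiz fragment, withholding the answer."""
    item = ET.fromstring(xml_text)
    return "<p class='quiz'>{}</p>".format(item.findtext("prompt"))

def as_worked_example(xml_text: str) -> str:
    """Repurpose the same item as a question with its answer shown."""
    item = ET.fromstring(xml_text)
    return "Q: {}\nA: {}".format(item.findtext("prompt"),
                                 item.findtext("answer"))

print(as_quiz_html(SOURCE))
print(as_worked_example(SOURCE))
```

Because the pedagogical content lives in the XML rather than in any one presentation, adding a new output format (a mobile view, say) means writing one more transformation rather than re-authoring the material.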
The 'Final Project Report' and related deliverables can be accessed from Related Resources and Documents on the right hand side of this page.
A selection of COLMSCT related dissemination activities to December 2007.
Internal Presentations
External Presentations
I joined the COLMSCT CETL in October of 2006 to carry out a project on Web Based Learning. It sounded like a great opportunity to do something really interesting and it has turned out to be just that.
The project started slowly, getting software, hardware and my act together but eventually took off and a year later I was in Washington presenting a paper at the ISSoTL conference. By this time the project had matured to the extent that I had developed an interest in learning objects. At first this involved mainly the technology and amounted to writing software, which has always been an interest. But it quickly evolved into an engagement with the pedagogy that persisted throughout the project and beyond.
My main aim has been to find the most effective way of combining technology and pedagogy in the design of learning objects. I would like to develop a model that encompasses all the Internet technologies and makes the pedagogy accessible to academic practitioners – a kind of unified model. To this end I have examined a number of ideas, including adaptable learning objects and pedagogical components. I have tested the objects online with OU students and presented more papers at both internal University and external international conferences. I have now reached the stage where I am beginning to understand the problem and I believe I am homing in on a solution.
The most challenging aspect of my COLMSCT work has been making some sort of impact on the establishment. This is probably a slow process in which an individual sees little effect, but some changes are taking place, driven, I think, by the collective activities of the CETLs, and I now feel that my work is taking place in a more productive environment.
The most satisfying aspect of the project has been the effect on my own personal development. It has extended my expertise into the realm of pedagogy without which my work on the technology of web based learning and learning objects would be ineffective.
I originally graduated as a physicist, then completed a master’s degree and a PhD in Nuclear Physics before spending most of my career as a university lecturer in Computing. During this time, I published a number of papers on Radiation and Medical Physics and one textbook. Later, I became involved in educational development and changed my research interest to Internet Based Learning. I published six papers in refereed journals and delivered six presentations to national and international conferences on this and associated topics. I took early retirement from this post and spent a few years developing software for a LearnDirect project and delivering a range of short courses on Internet subjects. I also became an Associate Lecturer at the Open University and since then have tutored on a number of courses in the London and East of England regions. These include M206, MT262 and, currently, M301, M360, M150, TM420 and TM427. I enjoy tutoring and I also like programming. The COLMSCT project will enable me to extend my interest in the development of educational software.
The aim of the project is to develop an initial set of example questions using the STACK computer-algebra based e-assessment system and to trial these with students. The intention is that the questions are appropriately randomised and provide extensive, individualised feedback on the answers submitted by students, in addition to a full worked solution to the problem.
The student trials are to investigate:
a) whether the students find the system easy to use,
b) whether students feel the provision of practice questions using such a system would aid their learning, and
c) whether students trust the marking and feedback of the automatic computer-algebra based system.
The project is focussing on the second level applied mathematics course MST209 Mathematical Methods and Models. The foundations of this course are standard mathematical techniques that lend themselves well to this type of assessment. Most previous uses of such systems elsewhere have been for Level 1 courses and tend not to offer the extensive feedback in response to student answers that is intended here. It is hoped that such questions can be offered to students in future presentations of the course for practice, enabling them to attempt as many instances of a given type of question as necessary to gain mastery of the mathematical techniques required in its solution, and to receive instant diagnostic feedback on any errors they make. The system is, however, also of interest to other courses, such as MU123 Discovering Mathematics, which is due for its first presentation in 2010 and would like to make use of such systems. It is anticipated that such questions will also be of interest to the wider mathematical community.
Tim Lowe
T.W.Lowe@open.ac.uk
Teaching Fellows and Associate Teaching Fellows are part-time roles carried out for a fixed term. A Teaching Fellow is an existing member of the OU full-time staff with part of their time set aside to work in the Centre. Associate Teaching Fellows are recruited from the Associate Lecturer staff.
Fellows are working on projects relating to one of the COLMSCT themes:
Wendy Fisher, COLMSCT Teaching Fellow
The use of Tablet personal computers (PCs) to mark paperless assignments was designed to support lecturers in writing quality feedback, positioned at the point of learning, to engage students in learning. The project ran over eighteen months, and a perceptual evaluation assessed lecturers’ and students’ attitudes to the use of technology in the assessment process. Following the evaluation, findings were disseminated through presentations and papers to mathematics, science, computing and technology lecturers in higher education institutions within and outside the United Kingdom.
Ken Platts, COLMSCT Associate Teaching Fellow
The project was defined in response to feedback and complaints from students and colleagues about the increasingly common use of DVD technology, which requires either prolonged use of desktop or laptop computers or the personal printing of large amounts of course material so that it can be used in such places as trains, buses, aircraft or at work.
Tony Jones, COLMSCT Associate Teaching Fellow
Despite more than 35 years of collective experience of assessment in open and distance learning, the Open University’s Science Faculty has no systematic way of making that experience available to its staff. Newcomers, in particular, have few resources to help them write assignments and when staff leave or retire their know-how disappears with them. To address this problem, this project has established a wiki – a website that can be written to as well as read – to collect, organise and preserve good practice in the design of assignments.
James Gray, COLMSCT Associate Teaching Fellow
This project is addressing a problem that concerns many of us: to produce good looking word processed mathematics can often take longer than doing the mathematics itself.
Val Hancock
The Open University's Virtual Learning Environment (VLE) makes many learning tools and services available via the internet but not everyone has easy access to the internet (e.g. students in the armed forces, prisons, residential care or hospitals). This project aimed to make more courses available to prisoners and other students who would otherwise be excluded from studying because they did not have internet access.
Richard Seaton, COLMSCT Associate Teaching Fellow
The use of mobile technologies, including mobile phones, to deliver learning objects to support students taking distance learning programmes is causing much interest at the UK Open University (OU). The project, An investigation of the use of mobile technologies to support students' learning, has been developing learning objects to assist students on the T216 Cisco Networking course to practise subnetting, a particular feature of the addressing used to identify all devices on the internet and the majority of today’s computer networks.
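As a concrete illustration of the subnetting that these learning objects help students practise, Python’s standard ipaddress module can reproduce the arithmetic involved. The example network below is invented for illustration and is not taken from the T216 materials.

```python
import ipaddress

# A typical subnetting exercise: split a /24 network into equal
# subnets and read off each subnet's mask and usable host count.
network = ipaddress.ip_network("192.168.1.0/24")

# A /26 prefix borrows two host bits, giving four subnets of 64 addresses.
for subnet in network.subnets(new_prefix=26):
    hosts = subnet.num_addresses - 2  # exclude network and broadcast addresses
    print(f"{subnet}  mask {subnet.netmask}  usable hosts: {hosts}")
# → 192.168.1.0/26, 192.168.1.64/26, 192.168.1.128/26, 192.168.1.192/26,
#   each with mask 255.255.255.192 and 62 usable hosts
```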
Phil Butcher, Coordinator of the iCMA initiative and COLMSCT Teaching Fellow
The interactive computer marked assessment (iCMA) initiative was conceived in 2005 on the premise that developing models of eAssessment had the potential to be used in a much wider role than had hitherto been possible. The initiative followed closely on, and has been enabled by, the 2005 upgrade to the university’s OpenMark CAA system. This upgrade had been built on modern internet technologies and provided a platform which was able to harness the full potential of a modern multimedia computer.
Sally Jordan, COLMSCT Teaching Fellow
This project, with funding from both COLMSCT and piCETL, has investigated the way in which students engage with interactive computer-marked assignments (iCMAs). This is a huge task and one that is far from complete. However this page gives a flavour of the methodology and reports some early findings.
David Robinson, COLMSCT Teaching Fellow
A lot of work has been done on the use of conferences for student content and tutorial help. Less work has been done on e-tuition in practical subjects and even less work on the use of broadcast.
Verina Waights and Ali Wyllie
Nurses are required to make clinical decisions about patients' health and well-being, responding to changes in each patient's condition, which may occur within very small time-frames.
Keith McGraw, COLMSCT Associate Teaching Fellow
The aim of the project was to identify and share best practice on how to help OU Engineering students achieve their Transferable Skills Intended Learning Outcomes (ILOs) on previous versions (2003-2007) of the Professional Development Planning (PDP) courses: T191 Personal and career development in engineering, T397 Key skills for professional engineers and T398 Towards chartership: professional development for engineers.
Janet Dyke, COLMSCT Associate Teaching Fellow
The aim of this project was to determine whether formative feedback explicitly directed to improvement of study skills could help tutors provide students with the means to become more effective learners. The intended impact was that students would gain from the feedback by having increased knowledge of their strengths and weaknesses. Students should then be able to improve their skills and take them forward through the current and future courses. By setting formative targets related to course learning outcomes the tutor should be able to focus the student towards more self-directed learning.
Stanley Oldfield, COLMSCT Teaching Fellow
In recent years there has been increasing emphasis on the development of communication and collaboration skills within OU undergraduate courses. This has arisen from the general adoption of the Skills agenda, partly as a result of pressure from employers and professional bodies to ensure that by graduation students possess more than knowledge of their subject, or even the ability to apply that knowledge to real world situations.
Hilary Cunningham-Atkins, COLMSCT Associate Teaching Fellow
In October 2005 T171 reached the end of the final presentation after one pilot presentation and nine full presentations. The aim of this project is to collate the vast amount of experience of online teaching and learning amassed by T171 tutors.
Giselle Ferreira
This work was carried out between October 2007 and December 2008.
Stuart Slater, COLMSCT Associate Teaching Fellow
The original scope of the project involved developing tools, techniques and applications to enable both staff and students to begin actually developing client side applications for their own mobile devices.
Arlene Hunter, COLMSCT Teaching Fellow
The primary aim of this project was to develop and then implement an online interactive formative assessment framework, designed from a constructivist and interventionist perspective that would promote student engagement and understanding of academic progression from an extrinsic as well as intrinsic perspective.
Ali Wyllie and Verina Waights, COLMSCT Teaching Fellows
This project set out to investigate the use of an online decision-making maze tool as a form of eAssessment task that is more motivating and relevant to practice-based students.
Stuart Freake, COLMSCT Teaching Fellow
This project aims to investigate options for web–based communication between associate lecturers and students, both for e-tutorials and for one-to-one support, particularly the ability to communicate complex equations and diagrams.
Felicity Bryers, COLMSCT Associate Teaching Fellow
Developments in technology have already provided potential new methods of supporting students. The use of a synchronous online interface involving both voice and an interactive whiteboard offers an exciting opportunity to widen participation by offering tuition to remote students and others unable to attend face-to-face tutorials.
Abigail Kirk
This project has investigated the use of interactive sessions in Elluminate to support students’ learning of Mathcad.
Michael Isherwood
A major concern of the Course Team of M150 is Block 2, JavaScript programming.
Tim Lowe
The aim of the project is to develop an initial set of example questions using the STACK computer-algebra based e-assessment system and to trial these with students. The intention is that the questions are appropriately randomised and provide extensive, individualised feedback on the answers submitted by students, in addition to a full worked solution to the problem.
Ian Cooke
To produce e-tutorial teaching and support modules for M150
Sally Jordan (COLMSCT Teaching Fellow) and Barbara Brockbank (COLMSCT Associate Teaching Fellow)
This project has investigated the use of computer-aided assessment for checking and providing instantaneous feedback on questions requiring short free text answers, typically a sentence in length. The questions were initially written using software provided by Intelligent Assessment Technologies Ltd (IAT). This software uses natural language processing (NLP) techniques of information extraction, but an authoring tool is provided to shield the question author from the complexities of NLP. The IAT software was used with the Open University’s OpenMark e-assessment system, thus enabling students to be offered three attempts at each question with increasing feedback after each attempt. Feedback on incomplete and incorrect responses was provided from within the IAT authoring tool, using a flagging system developed during the project.
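The general shape of such a marking loop can be sketched in Python. This is emphatically not the IAT engine, which uses natural language processing rather than the crude keyword matching below; the model-answer concepts and feedback strings are invented, but the three-attempt, increasing-feedback pattern mirrors the OpenMark setup described above.

```python
# A deliberately crude stand-in for a short-answer marking engine:
# it just checks for required concepts, but reproduces the
# three-attempt, increasing-feedback pattern used with OpenMark.
REQUIRED = {"evaporation", "condensation"}  # hypothetical model-answer concepts
HINTS = [
    "Not quite. Think about what happens to the water.",
    "Hint: your answer should mention both evaporation and condensation.",
]

def mark(response, attempt):
    """Return (correct, feedback); feedback grows with each attempt."""
    found = {w for w in REQUIRED if w in response.lower()}
    if found == REQUIRED:
        return True, "Correct."
    if attempt < 3:
        return False, HINTS[attempt - 1]
    missing = ", ".join(sorted(REQUIRED - found))
    return False, f"The answer needed to mention: {missing}."

print(mark("Evaporation followed by condensation of the water", 1))
# → (True, 'Correct.')
```

A real engine must of course cope with synonyms, negation and paraphrase, which is precisely why the project used the IAT information-extraction software rather than keyword matching.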
Phil Butcher, COLMSCT Teaching Fellow
Elsewhere on this website Sally Jordan describes the creation and evaluation of a range of questions which were designed to require students to respond using short-answer free-text responses of up to 20 words. The resulting responses were marked by humans and a computational linguistics algorithm, Intelligent Assessment Technology’s Free Text system, and the accuracy of the outcomes compared. The comparison showed that the computer could perform at a level equivalent to human markers. This project extends this work to encompass two computer marking solutions which are more readily available.
Gaynor Arrowsmith
To investigate logistic, technical and pedagogical aspects of electronic marking (eMarking) of Tutor-Marked Assignments (eTMAs) for mathematics courses.
Keith Beechener, COLMSCT Associate Teaching Fellow
Using email, electronic communications and other emerging online tools to support an effective interaction between distance learners and tutors. This project is based on the experiences of tutoring students on first level Open University courses.
Kath Clay
The concept of a subject community is one that enables both students and academic staff to have some continuity, and general skills and knowledge sharing, outwith the framework or content of any particular course.
Ken Hudson, COLMSCT Associate Teaching Fellow
The aim of this project is to explore the value of online virtual experiments in enhancing student understanding of course material, retention and employability skills. A 'virtual lab' is defined as an 'e-learning activity based on conventional laboratory procedures, but delivered on-line to distance learners, to give them a more real experience of biological material, procedures and applicability, normally absent from a paper-based course'.
Pat Bailey
The aims of the project are to work closely with the Mathematics Online (MOL) and MU123 teams to:
Judith Jeffcoate, COLMSCT Associate Teaching Fellow
This project was designed to look at ways to support students through online study groups, using the facilities available in a virtual learning environment (VLE) such as Moodle.
Jonathan Fine
There are problems with mathematics in electronic media. The most basic symptom of the problem is that we often cannot copy-and-paste mathematical content from one application to another.
Rob Parsons, COLMSCT Associate Teaching Fellow
This project aims to examine the interactions of moderators and students on two Open University, online technology courses in order to determine what kind of activities and approaches succeed in keeping students engaged and moving forward with the course.
Mirabelle Walker, COLMSCT Teaching Fellow
This project was designed to investigate, and find ways to improve, the quality of written feedback being given on students’ tutor-marked assignments.
Frances Chetwynd
This project builds on the work of Mirabelle Walker (Fellow 2005 - 2007) and also the Formative Assessment in Science Teaching project (www.open.ac.uk/fast)
Dave Hubble, COLMSCT Associate Teaching Fellow
This project aims to investigate levels of student engagement in e-learning activities, relate this to academic performance and ultimately develop methods for improving the level of engagement where this is currently low.
Diane Butler, COLMSCT Teaching Fellow
The aim of this project is to improve the integration of online activities into Open University courses.
Karen Shipp, COLMSCT Teaching Fellow
Investigating the impact of reduced face-to-face interaction on the underlying dynamics of the OU teaching system and on the relationships that maintain motivation and retention, in order to identify appropriate interventions.
Christine Leach
A significant number of students studying higher level chemistry courses have some difficulty with the mathematics. This impacts on their studies particularly in areas involving physical chemistry which are more mathematically-based.
Andy Diament, COLMSCT Associate Teaching Fellow
A study to identify, describe and disseminate good practice for synchronous online tutorials, specifically for teaching in Maths/Science/Computing/Technology Areas.
Alice Peasgood. COLMSCT Teaching Fellow
My research explores students’ everyday experiences of ICT or new technologies, and how they apply those to their learning. The students are adults within a widening participation (WP) programme of distance education courses, the Openings Programme.
Jill Shaw
The aim of this project is to investigate students' increased use of collaborative learning resources via fOUndIt (http://foundit.open.ac.uk). Additionally it aims to address the issue of course resources becoming outdated, by exploring the use of fOUndIt as a gathering point for user-contributed news, ideas and resources.
Ralph Janke, COLMSCT Associate Teaching Fellow
The aim of this project is to examine the effectiveness of the usage of instant messaging technology between tutors and students as a complementary measure to face-to-face tutorials.
Ingrid Nix (COLMSCT Teaching Fellow) and Ali Wyllie (COLMSCT Teaching Fellow)
This project aims to produce a design for a continuum of topic-based computer-marked questions, from easy to difficult and from formative to summative. Students can choose, depending on their self-assessment, which questions on the continuum to engage with and to log their reflections as they do so. Our research questions will focus on how students respond to this method of selecting their learning journey, whether students agree with the mapping of the continuum according to the typology of questions which we have devised, and what design improvements can be suggested.
Sarah Chyriwsky
A key objective of the Mathematics Online Project is the development of an electronic tutor marked assignment system that empowers associate lecturers rather than burdens them
Breen Sweeney
Research was carried out to investigate how mathematics can be taught in a virtual world.
Alistair Willis
This project is part of the COLMSCT investigations into using online questions which require students to respond with short answers in free text. In particular, this project is looking into how to set questions that require the students to make two or three separate points, and obtain credit for each subpart. This project is working closely with the e-assessment projects of Sally Jordan and Phil Butcher.
John Woodthorpe
To evaluate a range of Web 2.0 technologies for the automated and semi-automated generation, delivery and maintenance of electronic resources in courses and the wider OU community.
Karen Kear
An investigation of wikis and audio conferencing for supporting students’ collaborative learning in online course settings.
Shailey Minocha
The aim of the project is to investigate the pedagogical effectiveness of 3D Multi-User Virtual Environments (MUVEs) such as Second Life and their role in enhancing the student's learning experience.
Robert Davis, COLMSCT Associate Teaching Fellow
The focus for the project is to provide an online self-assessment tool for developing an understanding of music analysis (mainly descriptive) and music notation.
Chris Barrett
SXR208 is a popular second level residential school in practical astronomy and planetary science. It plays a key part in the teaching of astronomy at level 2, providing vital training in observational skills. The host campus, the Observatori Astronomic de Mallorca (OAM), is located abroad, so travelling to the school and taking part in the practical activities is difficult or impossible for certain students for health or personal reasons. To cater for this group we propose to develop a remote participation mode for SXR208 activities that maintains the interactive, “hands-on” and group working character of the existing SXR208 projects. Students unable to attend in person in Mallorca would be able to work as part of a resident student group via two-way voice or video links direct to a dome and the computer lab at OAM.
Liz Thackray
Immersive Virtual Worlds (IVWs) have gained a great deal of attention in the education community in recent years.
Charlotte Schulze
This project is concerned with the design of joint online activities that result in some form of tangible outcome: a group of students would be given a specific task which they would have to jointly fulfill while communicating through an online forum. The aim would be to jointly draw up a document – this may even be a piece of primary research – which would subsequently be used as an integral part of the course. The most likely scenario would be that such a document could serve as a discussion piece in a TMA question. It may also be possible that different online activity groups undertake different types of tasks and the resulting documents could form the basis for critical comparative discussions.
Pete Thomas, COLMSCT Teaching Fellow
The primary aim of this project was to design, implement and evaluate a software tool for helping students learn and revise the diagramming skills of entity-relationship diagrams (ERDs), a crucial component of any database course. The tool, if successful, would be employed on the Computing Department’s third level database course, M359, from 2008. At the heart of the tool is an automatic marker for ERDs being developed as a separate research project within the Computing Department. The automatic marker not only awards a grade for a diagram but also provides information useful for feedback purposes.
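The underlying idea of marking a diagram by comparison with a model answer can be sketched as follows. This is a toy illustration, not the project’s automatic marker: the entities, relationships and scoring scheme are all invented, and the real marker must match diagrams as students actually draw them.

```python
# Represent an ERD as a set of entities plus a set of
# (entity, relationship, entity) triples, then compare a student's
# diagram against a model answer to produce a grade and feedback.
MODEL = {
    "entities": {"Student", "Course", "Tutor"},
    "relations": {("Student", "enrols_on", "Course"),
                  ("Tutor", "teaches", "Course")},
}

def mark_erd(student, model):
    """Return (score, total, feedback): one mark per model component
    found, plus comments on anything missing."""
    missing_e = model["entities"] - student["entities"]
    missing_r = model["relations"] - student["relations"]
    score = (len(model["entities"]) - len(missing_e)
             + len(model["relations"]) - len(missing_r))
    total = len(model["entities"]) + len(model["relations"])
    feedback = []
    if missing_e:
        feedback.append("Missing entities: " + ", ".join(sorted(missing_e)))
    if missing_r:
        feedback.append(f"{len(missing_r)} relationship(s) missing or wrong.")
    return score, total, feedback or ["Diagram matches the model answer."]

attempt = {"entities": {"Student", "Course"},
           "relations": {("Student", "enrols_on", "Course")}}
score, total, feedback = mark_erd(attempt, MODEL)  # 3 out of 5
```

Even this toy version shows why such a marker can support feedback as well as grading: the comparison naturally identifies which components are absent.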
Judy Ekins, COLMSCT Teaching Fellow
From 2007, the OU requires its students to be online and it is also adopting the Moodle VLE. This increases the feasibility of implementing electronic assessment. In order to improve retention on Level 1 Open University mathematics, the COLMSCT project involved piloting short interactive internet quizzes. The OU package OpenMark was used, enabling students to receive instant feedback, whereas previously they had to wait days or weeks. Students are allowed several attempts at each question, with appropriate teaching feedback after each attempt. At the end of each quiz, alongside the mark, relevant study advice is given to the student, including references to appropriate course material. A hint facility was also introduced for students who were unable to start a question. OpenMark has a variety of question types and is being integrated into the Moodle VLE, and so will be open source.
Stephen Burnley, COLMSCT Teaching Fellow
Students taking “Environmental monitoring, modelling and control” (T308) are required to undertake an environmental assessment project as their end of course assessment. The project integrates the various themes of the course and provides experience of the work carried out by many environmental technologists.
Bill Tait, COLMSCT Associate Teaching Fellow
The overall aim of the project was to evaluate Web Based Learning as a means of providing additional support for students on Open University courses.
Tom Argles
This project aims to deliver innovative teaching of spatial geological data in a digital context to a distance-learning community of students.
Bai Bin and Steve Swithenby
Bai Bin, Researcher from Research Center of Distance Education, Beijing Normal University, China, has been visiting the Open University for 3 months from early September 2008. Working with Professor Steve Swithenby, Director of the Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT), Bin’s visit is part of a collaborative research study in quality assurance of online higher education between UK and China.
Ben Mensah and David Robinson
The British Council project called ‘Resource packs in experimental science for teachers and trainee teachers to enable both face-to-face and distance learning in practical work’ is funded under the DelPHE initiative, which promotes links between HE establishments to address the millennium development goals.
COLMSCT is keen to build international as well as national connections with experts in the COLMSCT themes of:
Bai Bin, Researcher from Research Center of Distance Education, Beijing Normal University, China, has been visiting the Open University for 3 months from early September 2008. Working with Professor Steve Swithenby, Director of the Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT), Bin’s visit is part of a collaborative research study in quality assurance of online higher education between UK and China.
An intended outcome of Bin’s visit is to introduce some of the successful practices learned during his stay to Chinese colleagues who are interested in quality assurance in distance online education.
Whilst here, Bin has joined the OU level one course, S104, as a visitor to experience the online learning process of OU students. He will attend a similar course in China to compare course design, student support, and teaching and learning activities between similar courses in the two countries. Further research conclusions will be drawn to help understand the cultural gap. Bin’s research has been focused on institutional-level and course-level quality assurance in online education.
Fundamental changes in society and education are being wrought by new technologies, competition in higher education has intensified and the boundaries between the full time and part time sectors are blurring. In order to enhance the quality of education and respond flexibly to diverse student needs and the external social environment, online educational colleges must assure and enhance their academic quality.
Before Bin returns to China in late December he will submit a research report and give a presentation about "Quality assurance of online education in China".
In the meantime, as well as learning all he can from the OU about its quality assurance, Bin has attended two conferences, the 3rd Open CETL Conference: Building Bridges and the PBPL Annual Conference 2008, and is due to attend the ReLIVE08: Researching Learning in Virtual Environments conference later this month.
These conferences have given Bin the opportunity to learn about many innovations in teaching and learning such as ‘Second Life’, interactive computer marked assessments (iCMAs) and ‘Elluminate’.
During his visit, Bin’s Chinese supervisor, Professor Chenli from Beijing Normal University visited the University as well, meeting with colleagues from the Open CETL, OUW and IET. Steve Swithenby is due to visit China in the Spring.
If you would like to discuss Bin’s research, please email b.bin@open.ac.uk. Bin is on campus until 19 December 2008.
Bai Bin and Steve Swithenby
b.bin@open.ac.uk; s.j.swithenby@open.ac.uk
There are some basic principles of participation in experimentation and investigation that could usefully be taught in a generic way. The increasing availability of computer-based communications and mobile phones means that it is possible to consider presenting on-line the sort of experimental and investigative activities that used to be day school and tutorial based.
Providing investigations in an on-line environment is an essential component of courses in practical-based subjects that are designed for electronic delivery. The aim of these projects is to explore issues and approaches to the online curriculum and in particular to experimentation and investigation in an online environment.
Richard Seaton, COLMSCT Associate Teaching Fellow
The use of mobile technologies, including mobile phones, to deliver learning objects to support students taking distance learning programmes is causing much interest at the UK Open University (OU). The project, An investigation of the use of mobile technologies to support students' learning, has been developing learning objects to assist students on the T216 Cisco Networking course to practise subnetting, a particular feature of the addressing used to identify all devices on the internet and the majority of today’s computer networks.
David Robinson, COLMSCT Teaching Fellow
A lot of work has been done on the use of conferences for student content and tutorial help. Less work has been done on e-tuition in practical subjects and even less work on the use of broadcast.
Jessica Bartlett and Sarah Davies
The aim of this project is to significantly extend an initial trial of the potential of remote fieldwork and group working for fieldwork in science by linking groups of students at a home-base with groups of students in the field, using simple but innovative technologies.
Judith Jeffcoate, COLMSCT Associate Teaching Fellow
This project was designed to look at ways to support students through online study groups, using the facilities available in a virtual learning environment (VLE) such as Moodle.
Ralph Janke, COLMSCT Associate Teaching Fellow
The aim of this project is to examine the effectiveness of the usage of instant messaging technology between tutors and students as a complementary measure to face-to-face tutorials.
Chris Barrett
SXR208 is a popular second level residential school in practical astronomy and planetary science.