
Guidelines for question authors

Points to note

  • Variety is the spice of life; vary your question styles.
  • Wherever possible (accepting that it is not always possible), design generative questions. OpenMark allows numbers, words, images and sounds to be varied within questions, and allows questions to be chosen at random from banks of questions. With these features, students can revisit questions for more practice, and, should the questions be used for credit, different students can receive different variants of the same question.
  • Your questions should elicit responses that you have a good chance of predicting. OpenMark can store all student responses, which you may browse to check how your question behaves in the real world. By improving response matching after students have used a question, the question matures, and you can be reasonably confident that it is working as expected.
  • Most questions allow multiple attempts, with three attempts being the norm. A correct response at the first attempt usually scores 100%, with lower percentages awarded for correct responses given at the second or third attempt.
  • Scores are automatically recorded and can be made available to the student at the discretion of the author.
  • For an incorrect response at the first attempt, some authors simply indicate that the response is incorrect and allow students the option of correcting their misunderstanding themselves. Detailed feedback pointing towards how to arrive at the correct answer is most often given after the second incorrect response.
  • OpenMark mixes 'assessment for learning' with 'assessment for credit'. It is perfectly acceptable to refer to study materials in the feedback to wrong answers. Indeed, if marks can still be gained, this is a powerful incentive for students to re-read the materials.
  • Specific response matching, with associated feedback for known errors, should be used from the first incorrect response.
  • Include questions that test classic misconceptions – students learn through mistakes.
  • All questions end with an explanation screen which is the same for all students but prefaced by 'Your answer is correct.' or 'Your answer is incorrect.', depending on their response.
  • Remember that you will build students' confidence in their knowledge if you ask them to work out the answer for themselves rather than select it from a list. Examples of questions requiring constructed answers can be found under Numeric response, Text response and 2D response.
  • The design is such that the question and the explanation will fit in the same window. Scroll bars are provided automatically if required.
  • Questions may be presented as a series of related parts: for example, in Question n the student may be asked to label a diagram, and in Question n+1 to answer a text question based on that diagram. Access to Question n+1 can be prevented until Question n has been completed. Using this technique, chains of questions can be produced to guide students through complex logical tasks.
  • Finally, check how your questions work in practice. If every student scores 100% or 0% on the test, or on an individual question, is that what you intended? Equally, if your test is formative but only a small percentage of students use it, are you content?
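The attempt-based scoring scheme described above can be sketched in a few lines of Python. This is purely an illustration of the marking logic, not OpenMark's actual API (OpenMark itself is Java-based); the percentages awarded for second and third attempts are assumptions chosen for the example, and authors set their own values.

```python
# Illustrative sketch of attempt-based scoring, NOT OpenMark's actual API.
# The second- and third-attempt percentages below are assumed values.

ATTEMPT_SCORES = {1: 100, 2: 66, 3: 33}  # percent awarded per attempt
MAX_ATTEMPTS = 3  # three attempts is the norm

def score_question(correct_on_attempt):
    """Return the percentage scored, given the attempt number on which the
    student first answered correctly (or None if they never did)."""
    if correct_on_attempt is None or correct_on_attempt > MAX_ATTEMPTS:
        return 0
    return ATTEMPT_SCORES[correct_on_attempt]
```

For example, `score_question(1)` returns 100, while a student who only succeeds on the third attempt receives the lower mark, and a student who never succeeds scores 0.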

Contact us

Comments on OpenMark may be addressed to:

Chris Nelson
Product Development Manager
The Open University

Greg Black
Production Manager
The Open University