Archive for the ‘question analysis’ Category

Same assignment, different students 3

Friday, May 17th, 2013

You’ll be getting the idea…

The figures below show, for each question, the number of students who got it right at first attempt (yellow), second attempt (green), third attempt (blue), or not at all (maroon). So the total height of each bar represents the total number of students who completed each question.
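
For anyone who wants to draw a similar figure from their own data, here is a minimal sketch using matplotlib. The question numbers and counts are invented for illustration; they are not the real iCMA data.

```python
# Stacked bar chart: attempt outcomes per question (illustrative data).
import matplotlib.pyplot as plt
import numpy as np

questions = np.arange(1, 6)               # question numbers (invented)
outcomes = [
    (np.array([40, 35, 30, 20, 15]), "yellow", "right at 1st attempt"),
    (np.array([5, 8, 10, 12, 10]), "green", "right at 2nd attempt"),
    (np.array([2, 3, 5, 6, 8]), "blue", "right at 3rd attempt"),
    (np.array([1, 2, 3, 10, 15]), "maroon", "not at all"),
]

bottom = np.zeros(len(questions))
for counts, colour, label in outcomes:
    plt.bar(questions, counts, bottom=bottom, color=colour, label=label)
    bottom += counts                      # stack the next band on top

plt.xlabel("Question")
plt.ylabel("Number of students")
plt.legend()
plt.show()
```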

You can spot the differences for yourself and I’m sure you will be able to work out which module is which! However I thought you’d like to know that questions 24-27 are on basic differential calculus. Obviously still some work to do there…

Same assignment, different students 2

Thursday, May 16th, 2013

Following on from my previous post, take a look at the two figures below. They show how students’ overall score on an iCMA varied with the date they submitted. These figures are for the same two assignments as in the previous post (very similar assignments, rather different students).

The top figure is depressingly familiar. The students who submit early all do very well – they probably didn’t need to study the module at all! The rest rush to get the assignment done just before the due date – and lots of them don’t do very well.

I am very pleased with the lower figure. Here students are doing the assignment steadily throughout the time it is available – and, with the exception of a small number who were probably prompted to have a go on the due date by a reminder email we sent, they perform pretty similarly irrespective of when they submitted. This is how assignments should perform!
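
For what it’s worth, the underlying comparison is straightforward to reproduce. A minimal sketch, assuming a table of submissions with a score and a submission date – the column names and numbers are invented, not the real system’s schema:

```python
# Compare the scores of early submitters with last-minute submitters.
import pandas as pd

df = pd.DataFrame({
    "submitted": pd.to_datetime(["2013-02-01", "2013-03-15", "2013-04-28",
                                 "2013-04-29", "2013-04-30"]),
    "score": [95, 80, 55, 60, 50],
})
due_date = pd.Timestamp("2013-04-30")

df["days_before_due"] = (due_date - df["submitted"]).dt.days
early = df.loc[df["days_before_due"] > 14, "score"].mean()
late = df.loc[df["days_before_due"] <= 14, "score"].mean()
print(f"early mean: {early:.1f}, last-minute mean: {late:.1f}")
```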

I’m aware that my interpretation may seem simplistic, but we have other evidence that the first batch of students are overcommitted – they are also younger and have lower previous qualifications – so it all fits.

Finally, following yesterday’s JISC webinar on learning analytics, I’m beginning to think that this is how I should be describing the work that I’ve previously categorised as ‘Question analysis’ and ‘Student engagement’. However we describe this sort of analysis, we must do more of it – it’s powerful stuff.

Same assignment, different students

Sunday, May 12th, 2013

I’ve written quite a lot previously about what you can learn about student misunderstandings and student engagement by looking at their use of computer-marked assignments. See my posts under ‘question analysis’ and ‘student engagement’.

Recently, I had cause to take this slightly further. We have two interactive computer-marked assignments (iCMAs) that test the same material, and that are known to be of very similar difficulty. Some of the questions in the two assignments are exactly the same; most are slightly different. But when we see very different patterns of use, this can be attributed to the fact that the two iCMAs are used on different modules, with very different student populations.

Compare the two figures shown below. These simply show the number of questions started (by all students) on each day that the assignment is open. The first figure shows a situation where most questions are not started until close to the cut-off date. The students are behind, struggling and driven by the due date (I know some of these things from other evidence).

The second figure shows a situation in which most questions are started as soon as the iCMA opens – the students are ready and waiting! These students do better by various measures – more on this to follow.
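
The figures themselves are just daily counts from the activity log. Here is a minimal sketch of that counting step, assuming an event log with one row per question started (the column name is my invention):

```python
# Count how many questions were started on each day the iCMA was open.
import pandas as pd

log = pd.DataFrame({
    "started_at": pd.to_datetime(["2013-02-01 09:10", "2013-02-01 11:30",
                                  "2013-02-02 14:00", "2013-04-29 20:15",
                                  "2013-04-29 21:05"]),
})
per_day = log.groupby(log["started_at"].dt.date).size()
print(per_day)   # one count per calendar day; this is what gets plotted
```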

More about guessing and blank/repeated responses

Tuesday, February 7th, 2012

Depressingly, this post reports a similar finding to the last one.

For the question shown (one of a series of linked questions on the Maths for Science formative-only practice assessment), 62% of students are right at the first attempt but 22% remain incorrect after the allowed two responses. At the response level, whilst 60.2% of responses are correct, the other three options are selected in approximately equal numbers. The details are below:

P > 0.1: 12.4% of responses

0.1 > P > 0.05: 14.0% of responses

0.05 > P > 0.01: 60.2% of responses

P < 0.01: 13.5% of responses
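
One way to read these numbers: if the students who don’t know the answer are guessing among the three wrong options, the incorrect responses should be spread roughly evenly. Here is a minimal sketch of that check – only the percentages come from the data above; the total number of responses is an assumption invented for the illustration:

```python
# Are the wrong responses spread evenly, as pure guessing among the
# three distractors would predict?
from scipy.stats import chisquare

N = 1000                            # assumed total responses (illustrative)
wrong_pcts = [12.4, 14.0, 13.5]     # the three incorrect options
observed = [p / 100 * N for p in wrong_pcts]

# Null hypothesis: each wrong option is equally likely to be chosen
stat, p_value = chisquare(observed)
print(f"chi-square = {stat:.2f}, p = {p_value:.2f}")  # ~1.01, ~0.60
```

A large p-value here means the spread of wrong answers is consistent with guessing.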

So what’s this saying? (more…)

Is the difficulty calculus or negative numbers?

Friday, December 23rd, 2011

As you’ll have realised, I’m currently analysing responses to questions on calculus. Maths for Science teaches just very basic differentiation, and the position of the chapter on differentiation will be different in the revised edition of the course. In the new edition, the chapter will be where it belongs in a logical storyline, immediately after the chapter on graphs and gradient. In the old edition, we thought differentiation might be a step too far for some students, so we offered Differentiation (Chapter 10) as an alternative to Statistical Hypothesis Testing (Chapter 9). Add in the fact that we still allowed students to answer questions on both chapters if they wanted to, and the fact that many of the Chapter 10 questions are multiple choice, and it is really quite difficult to tell whether students are guessing answers to the Chapter 10 questions – even the summative ones. This is characteristically different from behaviour on earlier questions, where it is very obvious that most students are not guessing.
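
A useful baseline when judging this: on a multiple-choice question with two allowed responses, a pure guesser does surprisingly well. Here is a minimal sketch, assuming the guesser never repeats a wrong option:

```python
# Probability that a pure guesser is right within two attempts on a
# multiple-choice question, assuming a wrong first guess is eliminated
# before the second attempt.
def p_correct_within_two(n_options: int) -> float:
    p_first = 1 / n_options
    p_second = (1 - p_first) * (1 / (n_options - 1))
    return p_first + p_second

for n in (3, 4, 5):
    print(f"{n} options: {p_correct_within_two(n):.2f}")
# 3 options: 0.67 / 4 options: 0.50 / 5 options: 0.40
```

With four options, a guesser is right within two attempts half the time, so success rates on these questions need to be well above that before guessing can be ruled out.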

An extension of this is that it is quite difficult to assess the real difficulties students experience with the chapter on differentiation. We have some familiar stories, e.g. answers where a function has been differentiated correctly and values substituted correctly, but then answers have been given to an inappropriate number of significant figures. Hey ho – I think we get too hung up about this.

Now compare the two questions given below. (more…)

Estimating the gradient of curves – variants behaving differently

Friday, December 23rd, 2011

The screenshots below show two variants of the same question. I should start by emphasising that the question is only used formatively – it’s on the Maths for Science Practice Assignment.

The two variants behave very differently, and the reason for this has very little to do with students’ understanding. (more…)

Unit conversions

Monday, December 19th, 2011

As previously discussed, students aren’t great at giving correct units with their answers. However, they have real problems with unit conversions!

The unit conversion shown on the left is exceptionally badly answered, and a lot of students give up without even trying. The difficulties appear to stem from several different aspects of the problem, as the examples below illustrate.

First of all, let’s look at the problem students have in converting from one square (or cubic) unit to another, as on the right-hand side. Errors in this question fall into three basic categories (see the numerical sketch after the list):

(1) students who convert from km to m rather than from km^2 to m^2.

(2) students who do the conversion the wrong way round (i.e. convert from m^2 to km^2).

(3) students who do both of the above (i.e. convert from m to km) – actually more common than just doing the conversion the wrong way round.
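
To make the three categories concrete, here is a numerical sketch for an illustrative area of 5 km^2 (the value is invented; the conversion factors are standard):

```python
# Correct conversion and the three error patterns for 5 km^2 -> m^2.
area_km2 = 5.0

correct = area_km2 * 1000**2   # 1 km^2 = (1000 m)^2 = 1,000,000 m^2
error_1 = area_km2 * 1000      # (1) used the linear factor km -> m
error_2 = area_km2 / 1000**2   # (2) converted the wrong way round
error_3 = area_km2 / 1000      # (3) both errors combined

print(correct, error_1, error_2, error_3)
# 5000000.0 5000.0 5e-06 0.005
```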

I’ve known for ages about the difficulty some people have in converting squared or cubic units. It fascinates me. (more…)

More errors in finding the gradient of a graph

Friday, November 4th, 2011

Working yesterday on the chapter on Graphs and Gradient for the new edition of Maths for Science, I remembered the other student error that I have seen in iCMA questions. When asked to find the gradient of a graph like the one shown below, some students essentially ‘count squares’.

In this case, they would give the rise (actually (170-10) km = 160 km) as 16, since that is how many squares there are vertically between the red lines. By counting squares horizontally, they would give the corresponding run as 28 (which is correct apart from the fact that the units – seconds in this case – have been omitted).  So the gradient would be given as 0.57 instead of 5.7 km/s.
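
A quick sketch of the arithmetic, with the grid scales inferred from the numbers quoted above (a rise of 160 km over 16 squares means 10 km per square; a run of 28 s over 28 squares means 1 s per square):

```python
# Why 'counting squares' gives 0.57 rather than the correct 5.7 km/s.
rise_squares, run_squares = 16, 28
km_per_square, s_per_square = 10, 1   # inferred from the graph's axes

counted = rise_squares / run_squares  # ignores the axis scales and units
actual = (rise_squares * km_per_square) / (run_squares * s_per_square)
print(f"{counted:.2f} (no units) vs {actual:.1f} km/s")
```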

A related problem is students measuring the rise and run with a ruler, again rather than reading values from the axes. Perhaps we encourage both these behaviours by making an analogy between the gradient of a line and the gradient of a road. When finding the gradient of a road, we are concerned with actual lengths, not with representations of other quantities on a graph.

Errors in finding the gradient of a graph

Sunday, October 2nd, 2011

Consider the simple question shown below:

This question is generally well answered, but when students make a mistake, what do they do wrong? (more…)

BODMAS, BIDMAS, BEDMAS

Tuesday, September 27th, 2011

More on simple arithmetic skills that people don’t always understand as well as they think they do, leading to difficulties at a later stage.

In the OU Science Faculty we use the mnemonic BEDMAS (others use BODMAS or BIDMAS) to remind students of the rules governing the order of precedence for arithmetic operations:

Brackets take precedence over

Exponents. Then

Division and

Multiplication must be done before

Addition and

Subtraction.

When analysing student responses to iCMA questions, lack of understanding of the rules of precedence and related issues, whilst not contributing to as many errors as problems with fractions and/or units, is still up there as a common difficulty. Sometimes the problem can be attributed to poor calculator use, e.g. a lot of students interpret 3 + 6/3 as meaning (3 + 6)/3, perhaps because they don’t stop and think before using their calculator. This misunderstanding (seen in lots of variants of a question in summative use) led to a talk I used to give: ‘Why is the answer always 243?’. But it goes deeper than that! For example, even after teaching students how to multiply out brackets etc., many think that (x + y)^2 is the same as x^2 + y^2. Mistakes of this ilk are completely understandable, but they are nevertheless something to watch out for.
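
Python happens to follow the same precedence rules, so it makes a handy checker for both of these misunderstandings – a quick sketch:

```python
# Division binds more tightly than addition (BEDMAS in action).
print(3 + 6 / 3)      # 5.0 -- the correct reading
print((3 + 6) / 3)    # 3.0 -- the common misreading

# Squaring a sum is not the same as summing the squares.
x, y = 2.0, 3.0
print((x + y) ** 2)   # 25.0
print(x**2 + y**2)    # 13.0
```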