More on significant figures

I’ve said before that students are not good at giving answers to an appropriate number of significant figures. But what do they do wrong? The question shown on the left provides some insight.

The correct three options are B, C and E, and they are the three most commonly selected. However, options C and E are selected more frequently than option B. The two incorrect responses are selected approximately equally often – perhaps due to guessing; this question is in formative-only use.

Things get more interesting when you look just at responses from students who use all three attempts before getting the question right – or who fail to get it right at all. Whilst 69.2% of these responses include Option C and 65.1% include Option E, only 56.4% include Option B – and an almost identical proportion (56.0%) include the incorrect Option A. So leading and trailing zeroes cause problems.
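For anyone who wants to play with the rules themselves, here is a minimal sketch in Python (my own illustration, certainly not the code behind the iCMA question) of counting significant figures in a typed decimal, showing that leading zeroes never count whilst trailing zeroes after the decimal point do:

```python
def count_sig_figs(s: str) -> int:
    """Count significant figures in a plain decimal string like '0.00260'."""
    digits = s.lstrip("+-")
    if "." in digits:
        integer, fraction = digits.split(".")
        all_digits = integer + fraction
        # Leading zeroes are never significant...
        stripped = all_digits.lstrip("0")
        # ...but trailing zeroes after the decimal point are.
        return len(stripped)
    # Without a decimal point, trailing zeroes are ambiguous; this sketch
    # treats them as not significant.
    return len(digits.lstrip("0").rstrip("0"))

# The leading zeroes don't count; the trailing zero does:
assert count_sig_figs("0.00260") == 3
# Trailing zeroes in a whole number are ambiguous; this sketch counts 2:
assert count_sig_figs("2600") == 2
```

The two assertions at the end are exactly the cases students stumble over: zeroes at the front and zeroes at the back.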

Posted in mathematical misunderstandings

Significant figures and rounding

I’ve posted before about the difficulties students seem to have with significant figures and rounding. This post adds some data, to give you an appreciation of the size of the problem.

The data are for the question shown on the left (from a summative iCMA, with all the implications that has for students trying their best etc. – and the findings are very similar for all variants). In order to answer the question, students have to write a number in scientific notation to three significant figures, and in doing so they have to round the number up. Of the responses analysed:

323 responses (90.2%) were given in scientific notation

308 responses (86.0%) were given to three significant figures

270 responses (75.4%) had been rounded up.
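The required operation is easy to demonstrate. In Python, the '.2e' format gives scientific notation with two digits after the point – i.e. three significant figures – and it rounds rather than truncates (the number below is invented, not the one from the question):

```python
# 265987 is an invented example number, not the iCMA value.
x = 265987

# '.2e' = scientific ('e') notation with two digits after the decimal
# point, i.e. three significant figures, rounded rather than truncated.
answer = f"{x:.2e}"
print(answer)   # 2.66e+05 -- 2.65987... rounds UP to 2.66
```

A student who truncated instead of rounding would write 2.65 × 10⁵ – exactly the kind of error the figures above suggest a quarter of respondents make.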

Continue reading

Posted in mathematical misunderstandings

Scientific notation

The question shown on the right is quite well answered even though it is on the formative-only practice assessment. Errors, when they do occur, are mostly as I’d have predicted – 4.5% of responses give the wrong sign in the power of ten and 2.5% of responses raise 2.6 rather than 10 to the correct power. Then a small number give answers that are 10 times too big or 10 times too small, or fail to use the superscript function correctly.

However a comparison of the different variants of this question is interesting. A variant where students have to express 0.1578 in scientific notation appears to cause significantly more difficulty than the one shown above. At one level this is the opposite of what you might have expected, because it is all too easy to miscount the zeroes in 0.00026. It appears that ten to the minus one is a difficult power to understand.
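The relationship between the position of the first non-zero digit and the power of ten can be sketched as follows (the two numbers are the ones from the post; the helper function is my own illustration):

```python
import math

def to_sci(x: float) -> tuple[float, int]:
    """Split x into a mantissa and a power of ten (an illustrative sketch)."""
    exponent = math.floor(math.log10(abs(x)))
    mantissa = x / 10 ** exponent
    return mantissa, exponent

for x in (0.00026, 0.1578):
    mantissa, exponent = to_sci(x)
    print(f"{x} = {mantissa:.4g} x 10^{exponent}")
# 0.00026 = 2.6 x 10^-4   (easy to miscount the zeroes)
# 0.1578 = 1.578 x 10^-1  (ten to the minus one, the trickier variant)
```

Note that 0.1578 needs no zero-counting at all, which makes it all the more striking that students find it harder.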

Posted in mathematical misunderstandings

Is the answer 6, 9, 300 or 11809.8?

It was the analysis of questions such as the one above that led to one of my early insights into mathematical misunderstandings. I discovered then that a common answer to this question was 11809.8. This is caused by students finding 3¹⁰ and then dividing the whole thing by 5. So either students misunderstand the rules of precedence or they just can’t use their calculator. As you’ll see from above, I have since added targeted feedback in response to errors of this type.
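The two readings are easy to reproduce. Assuming the expression is 3^(10/5) – my reading of the question, consistent with the answers quoted in the title – the precedence error looks like this in Python:

```python
# The two readings of 3^(10/5), per the precedence error described above:

correct = 3 ** (10 / 5)   # evaluate the exponent 10/5 = 2 first, giving 3^2
wrong = 3 ** 10 / 5       # find 3^10 = 59049 first, then divide it all by 5

print(correct)  # 9.0
print(wrong)    # 11809.8
```

On a calculator, the 'wrong' version is what you get by pressing 3, x^y, 10, ÷, 5 from left to right without brackets.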

A more recent analysis has shown… Continue reading

Posted in mathematical misunderstandings

More about guessing and blank/repeated responses

Depressingly, this post reports a similar finding to the last one.

For the question shown (which is one of a series of linked questions on the Maths for Science formative-only practice assessment), 62% of students are right at the first attempt but 22% remain incorrect after the two allowed responses. At the response level, whilst 60.2% of responses are correct, the other options are selected in approximately equal numbers. The details are below:

P > 0.1 – 12.4% of responses

0.1 > P > 0.05 – 14.0% of responses

0.05 > P > 0.01 – 60.2% of responses (the correct option)

P < 0.01 – 13.5% of responses
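The four options amount to a simple set of bands. As a sketch (the P value of 0.03 below is invented, chosen to fall in the correct band):

```python
def p_band(p: float) -> str:
    """Return the band, from the question's four options, that p falls into."""
    if p > 0.1:
        return "P > 0.1"
    if p > 0.05:
        return "0.1 > P > 0.05"
    if p > 0.01:
        return "0.05 > P > 0.01"
    return "P < 0.01"

print(p_band(0.03))  # 0.05 > P > 0.01 -- the correct option in this variant
```

Since the bands partition the number line, a student who can read a P value off their calculation only has to compare it against 0.1, 0.05 and 0.01 – which makes the even spread of wrong answers look very like guessing.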

So what’s this saying? Continue reading

Posted in multiple-choice questions, question analysis, student engagement

‘A nice demonstration of a problem with multiple-choice questions’

That was my husband’s comment when we were analysing responses to the question shown on the left. Start by noting that although this is a drag-and-drop question, it is effectively a multiple-choice (or multiple-response) question – you choose from eight pre-determined options for each ‘blank’ that you are required to fill. It is also worth noting that this question is on the formative-only practice assessment.

62% of students get the question right at first attempt. However 23% are still wrong after three attempts, having received all our carefully crafted feedback. And whilst 54.1% of responses are completely correct, all the other responses appear to have been guessed from the plausible options. Not really how you want students to be answering questions. Not good! I wrote the question, so I’m allowed to say that.

Posted in drag and drop, multiple-choice questions

Formative or summative logarithms

I’ve posted before about the fact that whilst students usually engage quite well with formative-only iCMA questions, when the going gets tough they are inevitably more likely to guess than when the mark counts. When I eventually get to the end of my course writing (and the associated preference for blogging about maths misunderstandings, on the basis that this is relevant to the course writing too), I will talk about our changes of assessment strategy in the Open University Science Faculty. For now, I just want to reflect on the size of the formative vs summative effect. Don’t treat this too seriously, but I think the answer to ‘how big is the effect?’ may be 3%. Read on.

Consider the question shown on the right. Variants of this question occur both in the formative practice assessment and in one of the summative end-of-module assessments.

In the practice assessment, 74.3% of responses were correct whilst 6.2% gave the number given in the question (so for this variant, they gave an answer of 4 – presumably guessing). In the summative equivalent, 77.7% of responses were correct whilst 3.3% gave the number given in the question.

Posted in formative assessment, mathematical misunderstandings, summative assessment

Making Assessment Count

I attended another JISC webinar yesterday. These provide a wonderful opportunity to keep up to date with developments in assessment for minimal outlay of time (one hour) and at no cost. Yesterday’s webinar featured Gunter Saunders from the University of Westminster and Peter Chatterton from Daedelus eWorld Limited talking about the Making Assessment Count (MAC) Project.

The aims of the project include helping students to engage with the assessment process and using technology to help connect student reflections on assessment with their tutors. The technology in this case is E-reflect, which enables tutors to generate self-review questionnaires. Students complete these and get automated feedback. It’s all very simple multiple-choice stuff, but after receiving the feedback students write an entry in their learning journal and at this stage they enter into more meaningful dialogue with their tutor.

It took me a little while to work out what E-reflect is (and I’m still not sure I’ve got it right) but I’ve found a helpful website meant for University of Westminster staff and a YouTube channel (we were meant to watch one of these videos during the webinar, but I misunderstood the instruction and looked at the wrong one!).

I wonder if we could get something like this working at the OU? Just maybe. S154 Science starts here (the wonderful 10-credit module that I chair) is about to start its final presentation and the material will be re-used in the 30-credit S140 Introducing science. S154 uses a learning journal and I am determined to keep this aspect in S140 – but some students don’t like it. Definitely worth thinking about a slightly different approach.

Interestingly, given my previous post in which I explained one of the difficulties with teaching at a distance, during the webinar I found myself pointing out that personalised support doesn’t have to be face-to-face. Dialogue can take place electronically! I’m sure we could run a system like E-reflect (probably using Moodle Quiz) in an entirely online environment to encourage reflection and as a starting point for dialogue. I just need to think about whether this would be a good way forward for S140.

Posted in learning journals, reflection

Confusing letters – a distance-learning problem

I spend a lot of time arguing, especially in the context of students’ mathematical misunderstandings, that the Open University’s mostly mature distance-learning students are not that different from students at conventional universities. However we have a few difficulties that others don’t. One is that students who don’t attend face-to-face or Elluminate tutorials don’t hear others pronouncing words and symbols, or get the opportunity to do so for themselves. We try all sorts of strategies to ameliorate this, e.g. giving pronunciations within written text and using ‘talking glossaries’.

But the problem remains – I am very used to students thinking that ρ (rho) is a p, and have heard λ described as ‘that step-ladder thing’. I’ve posted previously about problems caused when students think that Gln is GIn. This post is about a confusion with the abbreviation for natural logarithms (i.e. logarithms to base e).

The first difficulty is that some people use the abbreviation ‘log’ when describing natural logarithms whilst others use ‘log’ to mean logarithm to base 10, so for safety I use ‘ln’ for natural logarithm. I thought that was a safe solution until my colleague Christine Leach (who has written iCMA questions for third level chemistry students) told me that she had had complaints from students who had typed ‘In’ or ‘1n’ instead of ‘ln’ into their answers – and we marked them wrong! Especially in a sans serif font, it really is very difficult to distinguish between the number 1, the letter I (upper case i) and the letter l (lower case L).
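One possible response – and this is just a sketch of a leniency rule I can imagine, not what our marking software actually does – is to normalise the likely mis-typings before comparing the student’s answer:

```python
def normalise_ln(answer: str) -> str:
    """Map the 'In' / '1n' mis-typings back to 'ln'.

    A deliberately crude sketch: a real marking rule would need to be
    more careful, since 'In' also appears inside ordinary words.
    """
    return answer.replace("In", "ln").replace("1n", "ln")

print(normalise_ln("In(7.39)"))  # ln(7.39)
print(normalise_ln("1n(7.39)"))  # ln(7.39)
```

Whether one should be lenient here is a separate question – but at least the error could be detected and fed back on, rather than silently marked wrong.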

Posted in font

Converting radians to degrees

The question shown on the left is actually quite well answered (81% of students got it right at the first attempt), especially since it is in the formative-only practice assessment rather than the summative end-of-module assessment.

However it is interesting that the most common error across all variants of this question (in 4.4% of responses) is to calculate the value of the fraction (3π/4 in the variant shown) rather than converting it to degrees.
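The intended calculation and the common error sit side by side in Python (using the 3π/4 from the variant shown):

```python
import math

radians = 3 * math.pi / 4
degrees = math.degrees(radians)   # equivalent to multiplying by 180/pi

print(round(degrees, 6))   # 135.0 -- the expected answer
print(round(radians, 3))   # 2.356 -- the value of the fraction itself,
                           # the error made in 4.4% of responses
```

In other words, the erring students stop after evaluating 3π/4 as a number, never applying the 180/π conversion factor at all.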

I find this particularly interesting because this error feels conceptually similar to  that made when students give the values at which a line has zero value rather than where its gradient is zero (see ‘Function or derivative?‘) or when they give twice a number rather than squaring it. I’m not sure whether there really is some conceptually similar misunderstanding or whether the students who make these mistakes  just haven’t a clue!

Posted in mathematical misunderstandings