Positive and negative feedback

Time to take a seasonal break from my rather tedious recent posts and to return to a reflection on feedback.

The column ‘Feedback’ (what else!) on the penultimate page of the Christmas and New Year New Scientist special (24/31 December 2011), includes the following:

‘John …recalls a senior manager urging staff to provide feedback for his latest project, ‘but it must be positive’…

John could not help but explain that negative feedback produced growth and stability and positive feedback produced burnout’

John is clearly a man after my own heart! Not only do scientists and senior managers use the adjectives ‘positive’ and ‘negative’ in this context to mean rather different things; those working in assessment are completely confused. The problem is that the impact of assessment feedback may be positive or negative (in both senses of both definitions) and the outcome is difficult to predict.

Happy Christmas and very best wishes for 2012.

Posted in feedback | Tagged , | Leave a comment

Is the difficulty calculus or negative numbers?

As you’ll have realised, I’m currently analysing responses to questions on calculus. Maths for Science teaches just very basic differentiation and the position of the chapter on differentiation will be different in the revised edition of the course. In the new edition, the chapter will be where it belongs in a logical story line, immediately after the chapter on graphs and gradient. In the old edition, we thought differentiation might be a step too far for some students so offered Differentiation (Chapter 10) as a choice with Statistical Hypothesis Testing (Chapter 9). Add in the fact that we still allowed students to answer questions on both chapters if they wanted to, and the fact that many of the Chapter 10 questions are multiple choice, and it is really quite difficult to tell whether students are guessing answers to the Chapter 10 questions – even the summative ones. This is characteristically different from behaviour on earlier questions where it is very obvious that most students are not guessing.

An extension of this is that it is quite difficult to assess the real difficulties that students experience with the chapter on differentiation. We have some familiar stories, e.g. answers where a function has been differentiated correctly and values substituted correctly, but then the answer has been given to an inappropriate number of significant figures. Hey ho – I think we get too hung up about this.

Now compare the two questions given below. Continue reading

Posted in calculus, derivative, mathematical misunderstandings, negative numbers, question analysis | Tagged , , , | Leave a comment

Estimating the gradient of curves – variants behaving differently

The screenshots below show two variants of the same question. I should start by emphasising that the question is only used formatively – it’s on the Maths for Science Practice Assignment.

The two variants behave very differently and the reason for this has very little to do with students’ understanding. Continue reading

Posted in gradient, mathematical misunderstandings, question analysis | Tagged , , | Leave a comment

Function or derivative?

The common student error in the question below is somewhat predictable – but I’m not sure why the students make the error that they do.

Whilst 60.4% of responses are entirely correct, 21.7% select the three options that are actually the places where the function rather than its derivative is zero. In the existing version of the Maths for Science End of Module Assignment, this question is assessing the contents of a chapter that some students choose not to study – so it is more than usually likely that a certain percentage of students are guessing, further encouraged by the fact that this is a multiple choice question.

So I’m not sure we can draw many conclusions from the errors that students make on this occasion. But if students really think that dy/dx is zero at the points where the graph crosses the horizontal axis, then they are confusing the function with its gradient. This is a pretty basic mistake – perhaps a bit like mistaking x to the power of 3 for x times 3. That might be rather interesting.
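The distinction can be made concrete with a small sketch. The cubic here is my own invented example, not a question from the course:

```python
# Where a function is zero is not where its derivative is zero.
# Example function: y = x^3 - 4x, so dy/dx = 3x^2 - 4.

def f(x):
    return x**3 - 4*x       # zero at x = -2, 0, 2 (graph crosses the axis)

def dfdx(x):
    return 3*x**2 - 4       # zero at x = +/- sqrt(4/3) (turning points)

# At the axis crossings, the gradient is NOT zero:
for x in (-2.0, 0.0, 2.0):
    print(f"f({x}) = {f(x)}, dy/dx = {dfdx(x)}")

# At the turning points, the gradient IS zero (but f is not):
for x in (-(4/3)**0.5, (4/3)**0.5):
    print(f"f({x:.3f}) = {f(x):.3f}, dy/dx = {dfdx(x):.3f}")
```

A student who picks the axis crossings when asked where dy/dx is zero is reading the first loop’s points as if they were the second’s.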

Posted in derivative, gradient, mathematical misunderstandings | Tagged , , | 1 Comment

Unit conversions

As previously discussed, students aren’t great at giving correct units with their answers. However, they have real problems with unit conversions!

The unit conversion shown on the left is exceptionally badly answered, and a lot of students give up without even trying. The difficulties appear to stem from different aspects of the problem, as illustrated by the examples below.

First of all, let’s look at the problem students have in converting from one square (or cubic) unit to another, as on the right-hand side. Errors in this question fall into three basic categories:

(1) students who convert from km to m rather than from km^2 to m^2.

(2) students who do the conversion the wrong way round (i.e. convert from m^2 to km^2).

(3) students who do both of the above (i.e. convert from m to km) – actually more common than just doing the conversion the wrong way round.
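The three error categories can be sketched in a few lines of Python. The starting value of 2.5 km^2 is an invented example, not taken from the actual question:

```python
# Correct conversion from km^2 to m^2, alongside the three common
# error patterns: 1 km = 1000 m, so 1 km^2 = 1000^2 m^2 = 10^6 m^2.

area_km2 = 2.5

correct = area_km2 * 1000**2   # km^2 -> m^2: multiply by 10^6
error_1 = area_km2 * 1000      # (1) converted km -> m, not km^2 -> m^2
error_2 = area_km2 / 1000**2   # (2) conversion done the wrong way round
error_3 = area_km2 / 1000      # (3) both errors combined: m -> km, linear

print(correct)   # 2500000.0
```

The factor of a million between the correct answer and error (2) is what makes squared-unit conversions so unforgiving.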

I’ve known for ages about the difficulty some people have in converting squared or cubic units. It fascinates me. Continue reading

Posted in mathematical misunderstandings, question analysis, unit conversions | Tagged , , , | Leave a comment

Use of capital letters and full stops

For the paper described in the previous post, I ended up deleting a section which described an investigation into whether student use of capital letters and full stops could be used as a proxy for writing in sentences and paragraphs. We looked at this because it is a time-consuming and labour-intensive task to classify student responses as being ‘a phrase’, ‘a sentence’, ‘a paragraph’ etc. – but spotting capital letters and full stops is easier (and can be automated!).
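As a sketch of the sort of automated check this involves (the responses below are invented examples, not real student answers):

```python
# Flag whether a free-text response starts with a capital letter
# and ends with a full stop - a cheap, automatable proxy for
# whether the student is writing in sentences.

def starts_with_capital(response: str) -> bool:
    text = response.strip()
    return bool(text) and text[0].isupper()

def ends_with_full_stop(response: str) -> bool:
    return response.strip().endswith(".")

responses = [
    "The rock is metamorphic because it has been altered by heat.",
    "metamorphic - altered by heat",
    "Heat and pressure",
]

for r in responses:
    print(starts_with_capital(r), ends_with_full_stop(r))
```

A real classifier would need to handle edge cases (responses beginning with a symbol or a number, ellipses, abbreviations), which is partly why the proxy proved inconclusive.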

I removed this section from the paper because the findings were somewhat inconclusive, but I was nevertheless surprised by how many responses finished with a full stop, and especially by the large number that started with a capital letter. See the table below, for a range of questions in a range of different uses (sometimes summative and sometimes not).

Question (module)          Started with a capital letter   Finished with a full stop
A-good-idea (AYRF)         1678 (60.9%)                    1118 (40.6%)
A-good-idea (S154 10J)      622 (60.0%)                     433 (41.8%)
Oil-on-water (S154 10J)     500 (53.9%)                     294 (31.7%)
Metamorphic (SXR103 10E)    297 (41.6%)                     166 (23.2%)
Sedimentary (SXR103 10E)    317 (39.9%)                     178 (22.4%)
Sandstone (S104 10B)        954 (58.2%)                     684 (41.7%)
Electric-force (S104 10B)   673 (56.7%)                     445 (37.5%)

(Each cell gives the number of responses, with the percentage of the total in brackets.)

Answers that were paragraphs were found to be very likely to start with a capital letter and end with a full stop; answers that were written in note form or as phrases were less likely to start with a capital letter and end with a full stop. Answers in the form of sentences were somewhere in between.

The other very interesting thing was that capital letters and full stops were both (sometimes significantly) associated with correct rather than incorrect responses.

Posted in short-answer free text questions, student engagement, Uncategorized | Tagged , | Leave a comment

Student engagement with assessment and feedback: some lessons from short-answer free-text e-assessment questions

Sorry for my long absence from this blog. Those of you who work in or are close to UK Higher Education will probably realise why – changes in the funding of higher education in England mean that the Open University (and no doubt others) are having to do a lot of work to revise our curriculum to fit. I’m sure that it will be great in the end, but the process we are going through at present is not much fun. I always work quite hard, but this is a whole new level – and I’m afraid my blog and my work in e-assessment is likely to suffer for the next few months.

It’s not all doom and gloom. I’ve had a paper published in Computers & Education (reference below), pulling together findings from our observation of students attempting short-answer free-text questions in a usability lab, and our detailed analysis of student responses to free-text questions – some aspects of which I have reported here. It’s a long paper, reflecting a substantial piece of work, so I am very pleased to have it published.

The reference is:

Jordan, S. (2012) Student engagement with assessment and feedback: some lessons from short-answer free-text e-assessment questions. Computers & Education, 58(2), 818-834.

The abstract is: 

Students were observed directly, in a usability laboratory, and indirectly, by means of an extensive evaluation of responses, as they attempted interactive computer-marked assessment questions that required free-text responses of up to 20 words and as they amended their responses after receiving feedback. This provided more general insight into the way in which students actually engage with assessment and feedback, which is not necessarily the same as their self-reported behaviour. Response length and type varied with whether the question was in summative, purely formative, or diagnostic use, with the question itself, and most significantly with students’ interpretation of what the question author was looking for. Feedback was most effective when it was understood by the student, tailored to the mistakes that they had made and when it prompted students rather than giving the answer. On some occasions, students appeared to respond to the computer as if it was a human marker, supporting the ‘computers as social actors’ hypothesis, whilst on other occasions students seemed very aware that they were being marked by a machine.

Do take a look if you’re interested.

Posted in short-answer free text questions, student engagement | Tagged , | 1 Comment

More errors in finding the gradient of a graph

Working yesterday on the chapter on Graphs and Gradient for the new edition of Maths for Science, I remembered the other student error that I have seen in iCMA questions. When asked to find the gradient of a graph like the one shown below, some students essentially ‘count squares’.

In this case, they would give the rise (actually (170-10) km = 160 km) as 16, since that is how many squares there are vertically between the red lines. By counting squares horizontally, they would give the corresponding run as 28 (which is correct apart from the fact that the units – seconds in this case – have been omitted). So the gradient would be given as 0.57 instead of 5.7 km/s.
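The arithmetic can be set out explicitly. This is a sketch using the figures quoted above, and assumes each vertical grid square spans 10 km:

```python
# Gradient of the graph: read values from the axes, then rise over run.

rise_km = 170 - 10      # read from the vertical axis: 160 km
run_s = 28              # read from the horizontal axis: 28 s

gradient = rise_km / run_s
print(round(gradient, 1))           # 5.7 (km/s)

# The 'counting squares' version: 16 squares up, 28 squares along
# (each vertical square spans 10 km, so 160 km is misread as 16).
squares_gradient = 16 / 28
print(round(squares_gradient, 2))   # 0.57 - a factor of 10 out, and unitless
```

Counting squares discards both the axis scale and the units, which is exactly why it gives 0.57 rather than 5.7 km/s.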

A related problem is students measuring the rise and run using a ruler, again rather than reading values from the axes. Perhaps we encourage both these behaviours by making an analogy between the gradient of a line and the gradient of a road. When finding the gradient of a road, we are concerned with actual lengths,  not the representations of  other things on a graph.

Posted in gradient, graphs, mathematical misunderstandings, question analysis | Tagged , , , | Leave a comment

Student misunderstandings

The second external meeting I attended last week, this time at the University of Warwick, was a meeting of the Institute of Physics Higher Education Group entitled ‘Conceptual understanding: beyond diagnostic testing’. The messages that I’ve come home with are that student misunderstandings may not be what we think they are – and that we need to find out more. Derek Raine’s talk ‘Metaphors and misunderstandings: addressing student misconceptions in physics’ started off with a (presumably apocryphal – and I’m sure I won’t do the story justice) tale of a famous actress being shown to a dressing room in a provincial theatre. Her hosts were embarrassed about the poor standard of their facilities and apologised that the dressing room had no door. ‘But’, she said, ‘if there’s no door, how do I get in?’ Yes, we really are sometimes that much at cross-purposes with our students.

In physics education research, much attention has been given over the years to the ‘Force Concept Inventory’ (FCI), where a series of questions is used to assess student understanding of the Newtonian concepts of force. At the meeting, Marion Birch described common trends in FCI results at the universities of Manchester, Hull and Edinburgh – two questions seem to cause particular problems wherever they are asked. More startling are the gender differences – women do less well than men, and two questions (different from those that are poorly answered by all) show particularly large differences. What Marion was describing was inarguable (though some of the women at the meeting wanted to argue…) but I want to know what is causing the results! Is the difference at the level of conceptual (mis)understanding, or is it something about these particular questions that is causing women more difficulty than men? This is just far too interesting to let go – we must find out what is going on.

The final presentation of the day was from Paula Heron (by video link from the University of Washington) on ‘Using students’ spontaneous reasoning to guide the design of effective instructional strategies’. I think we do need to start observing our students carefully, and asking about their reasoning, rather than just assuming that they answer multiple-choice questions in the way that they do because of a particular misconception.

Posted in Uncategorized | Tagged , , , , , , | 2 Comments

Better by design

Although I’m still trying to find time to rewrite S151 Maths for Science (hence the previous posts about students’ mathematical misconceptions) I am also incredibly busy with other things. Last week I attended two external meetings, both excellent, so I’ll take a bit of a break from the maths to talk about these. 

On Wednesday I was at a meeting of the JISC Learning and Teaching Practice Experts Group. I have only recently been invited to join this group, so I headed to Birmingham with some trepidation, but it was a superb meeting and I met some inspirational people (very few of whom I’d met previously) – and everyone was friendly. It’s difficult to pick the best from the excellent presentations and workshops I attended, but one that has already proved useful was a session led by Alan Masson of the University of Ulster, about the Viewpoints Project. From their own blurb: ‘the Viewpoints Project provides practitioners with a series of simple user-friendly tools that promote a creative and effective approach to the curriculum design process.’ Simple indeed – we spent some time working with sets of cards on ‘Good assessment and feedback practice’ (based on the REAP principles), discussing how we would use these in curriculum planning. But it was so effective (and has already led me to use cards in planning a curriculum change of our own). And so nice to see the encouragement to plan a curriculum holistically, based on sound principles (of assessment and feedback in this case, but there were other sets of cards too, which we’d have integrated if we’d had more time).

Posted in curriculum design | Tagged , , , , , | Leave a comment