Making the most of monitoring

The Open University (OU) has a large group of largely unheralded colleagues who put themselves forward for one of its most important roles: monitoring, which involves looking at a sample of another colleague’s marking and assessing it for standardisation against the marking criteria and for quality of feedback. This group monitored 36,000 assignments across the OU in 2019, 6 percent of the total received, including 3,735 from the School of Psychology & Counselling and nearly 1,000 from one module alone (DE300 ‘Investigating Psychology’).

Monitoring of assessment, for standardisation of marking and quality of feedback, is an essential part of maintaining academic standards in an era of close scrutiny of higher education provision (Bloxham and Boyd, 2012). In a distance learning institution, some students’ experience of communication with their tutor may exist only through the feedback provided on assignments, despite tutors’ best efforts to engage (Tsagari, 2019), so this feedback arguably matters even more than at campus-based institutions. The subjective judgements made by markers (Brooks, 2012), despite module teams providing detailed marking criteria, can also lead to standardisation issues, with assignments marked leniently or harshly. The monitoring process within the OU is therefore vital to ensure that discrepancies in marking and variability in feedback are picked up and addressed. The role is fundamental to the university’s principles and purposes of assessment: it helps to ensure the quality of teaching, contributes to the professional development of tutors, and validates assessment scores.

Monitoring as a dialogue

Monitoring has, in the past, sometimes raised questions about the difficulty of asking monitors to provide what is essentially peer review of, and judgement on, their colleagues’ work. Concerns about training for monitoring led to a previous scholarship project in Psychology and Counselling at the OU in 2016, ‘Monitoring: Good Practice’, which resulted in improvements to monitor training and a dedicated website. An important feature of monitoring for the university is that it should be a dialogic relationship, with the monitor and the tutor going on to discuss the process, perhaps working together to think more about the successful application of marking criteria and about effective feedback. However, further evidence was needed to critically evaluate the prevalence and effectiveness of the ‘dialogic process’ the Open University aspires to. To address this gap, a small-scale scholarship project, supported by FASSTEST, was carried out between February and July 2020 to explore monitoring from the experience of both monitors (n=24) and tutors (n=27) in the School of Psychology & Counselling. Content analysis of monitoring reports (n=46) across a range of undergraduate and postgraduate modules also looked for examples of good and poor practice.

Encouraging outcomes

Outcomes of the project were encouraging, particularly around the impact of monitoring on the professional development of both tutors and monitors: the majority of each group reported that their own practice had improved, whether from receiving monitoring reports (21/27 tutors) or from acting as a monitor (24/24 monitors). Monitors gained from seeing how other colleagues assessed, and often adopted (or, as some put it, ‘plagiarised’) examples of good practice as a result. Tutors, particularly those who were new to the university or to the module being monitored, reported benefiting from the feedback, especially in confirming that they were applying the marking criteria appropriately.

The dialogic process between monitors and tutors, however, was less evident, with only half of tutors having further contact with their monitor, usually because they did not feel it would be beneficial to follow up. As one tutor put it, ‘if you receive a great report, there is little more to be said’. Sometimes the dialogue was more about challenging the report, explaining and justifying the assessment by referring to knowledge the monitor may not have, such as the tutor’s relationship with the student, which might lead a tutor to give less feedback to a student who would feel overwhelmed by too much. Discussions between markers and assessors are recognised to improve standardisation (Grainger et al, 2008), so the OU’s expectation and encouragement of dialogue between monitors and tutors could result in improved assessment practice. Monitors worked hard to encourage further dialogue with tutors; reports were typically friendly and encouraging, and most monitors included their email contact details with a ‘please contact me to talk further’ message within the report. Whether further contact takes place is then left to the tutor, a decision that workload and time management pressures may well also affect.

As with any peer review process, there was a small minority of experienced tutors who did not feel monitoring was particularly beneficial after teaching a module for several years, and who sometimes questioned the judgement of monitors less experienced than themselves. Receiving a ‘good’ rather than an ‘excellent’ rating from a monitor was occasionally met with resistance. However, within any peer review or assessment system there will always be emotive responses alongside cognitive benefits, particularly when perceptions of experience are involved (Cartney, 2010).

[Image: a table viewed from above, with a laptop, cup of coffee, pens, a mouse and a magnifying glass. Photo by Ian Dooley on Unsplash.]

Content of reports

The content analysis of reports showed that the majority demonstrated very good practice: supportive, friendly, accurate and full of appropriate advice. There was, though, occasional evidence of cut-and-paste errors, where monitors had taken information from a previous or similar report and inadvertently not changed the name. While these were very few overall, there were some instances where tutors felt such errors undermined the value of the reports. Tutors also felt that monitors were sometimes unable to acknowledge the personal relationships tutors had with their students, which could explain a lower level of feedback, for example for students whom a tutor judges to cope better with a limited amount of information in each set of feedback (Forsythe and Johnson, 2017). There was also doubt about whether monitors looked back to previous reports to acknowledge what had been said before.

Being honest in peer review

It can also be difficult for a monitor to be completely honest in a report. Nearly all monitors acknowledged that they had monitored a friend or close colleague, although the majority did not consider this to be an issue. The ability to provide an unbiased, critical judgement of a friend’s or colleague’s assessment practice could nonetheless be affected (Abidin et al, 2018). On small modules, avoiding monitoring close colleagues can be difficult: with only two monitors on a module, they will inevitably end up monitoring each other. On larger modules, however, it can be avoided; notably, in other faculties monitors on larger modules are asked in advance to check their allocation and remove any colleagues they are familiar with.

Professionalising the monitor role

Perhaps one of the most important findings of this research, though, relates to the status of the monitor role. Nearly all monitors considered their role important and valued by the university and colleagues, but very few felt they were an important part of a team, and most reported having very little communication about their work. Half of monitors reported feeling isolated in their role. The monitor role also lacks the professional status that other roles (for example, those associated with academic conduct) command. A more recognised professional status for monitors, with better integration into Module Teams, could go some way towards making both monitors and tutors see the role as more valued.

Finally, changes to the monitoring system were implemented during the period in which this research was carried out. These changes, which affect monitoring levels and the way the report form is completed, are significant improvements, and early reports from monitors and tutors suggest they find the forms easier to use. It was encouraging that many of the findings of the research were being addressed by monitoring process changes taking place at the same time. This validates both the research findings and the decisions of the Open University’s Monitoring Implementation Group, which has designed a new website and handbook.

References

Abidin, A.N.Z., Masek, A., Mohd Faiz, N.S. & Sahdan, S. (2018) Exploring the elements of integrity in peer assessment, MATEC Web of Conferences, 150, 05002. https://doi.org/10.1051/matecconf/201815005002

Bloxham, S. & Boyd, P. (2012) Accountability in grading student work: securing academic standards in a twenty-first century quality assurance context, British Educational Research Journal, 38(4), pp. 615–634. https://doi.org/10.1080/01411926.2011.569007

Brooks, V. (2012) Marking as judgement, Research Papers in Education, 27(1), pp. 63–80. https://doi.org/10.1080/02671520903331008

Cartney, P. (2010) Exploring the use of peer assessment as a vehicle for closing the gap between feedback given and feedback used, Assessment & Evaluation in Higher Education, 35(5), pp. 551–564. https://doi.org/10.1080/02602931003632381

Forsythe, A. & Johnson, S. (2017) Thanks, but no-thanks for the feedback, Assessment & Evaluation in Higher Education, 42(6), pp. 850–859. https://doi.org/10.1080/02602938.2016.1202190

Grainger, P., Purnell, K. & Zipf, R. (2008) Judging quality through substantive conversations between markers, Assessment & Evaluation in Higher Education, 33(2), pp. 133–142. https://doi.org/10.1080/02602930601125681

Tsagari, D. (2019) Interface between feedback, assessment and distance learning written assignments, Research Papers in Language Teaching and Learning, 10(1), pp. 72–99. Available online at http://rplel.eap.gr

For further details please contact

Sue Nieland – Staff Tutor in Psychology and Counselling
