‘Unlearning’ how to do evaluation (and scholarship)?

In ALSPD (Associate Lecturer Support and Professional Development), whenever we deliver a learning and development activity with tutors, we seek to evaluate the impact it has had on tuition practice (and ultimately on student learning and the student experience). Evaluation and engaging with the Scholarship of Teaching and Learning (SoTL) are both critical to the evidence-led way we identify, prioritise and deliver learning and development opportunities for tutors. For some of the team, it is also a critical part of our professional identities in the Third Space. In this blog post, Jenny Hillman reflects on the research methods that often underpin evaluation and the value of ‘unlearning’ to develop critical reflexivity as an evaluator/researcher.

A culture of evaluation

Like many Educational Development / Learning and Development teams, we vary the approach we take to evaluating staff development depending on the size and scale of the activity (workshop, lightning talk, digital resource). The ‘hub’ team at the heart of ALSPD has iteratively developed the ‘art’ of evaluation – that is, clear, time-saving processes for efficiently carrying out surveys that are proportionate to the activity and can establish its value and potential impact. Our team managers collate attendance and engagement data across the academic year, and these are shared with key stakeholders via an infographic. Other colleagues have also developed methods for evaluating impact on teaching practice through longitudinal studies.

However, thinking critically about how we ‘do’ evaluation (and who with) is something which has long been (and continues to be) part of our team culture. Former team members have – for example – contributed expertise to wider discussions in the sector about approaches to evaluation in Educational Development. More recently, in the evaluation of development events for Practice Tutors, we have adopted an approach from Bamber and Stefani (2015) to explore the role we play as evaluators. This has helped us to explore our subjectivity and contextualise what we found, using a structured framework.

Critical reflections on how we do evaluation (and what to ‘unlearn’)

One of the most important challenges to what we think we know about how to evaluate comes from some of the wider literature on decolonisation. My colleague, Clemmie Quinn, recently carried out a short piece of desktop research which collated resources on this. We considered learning from the decolonisation of global development project evaluation, as well as wider research on the decolonisation of research methods (we’ve provided selected examples in the reference list below). This week, I also attended a workshop for staff and postgraduate researchers focused on democratic and participatory approaches to research. ‘Moving Beyond Traditional Research Methods’ was led by Dr Kris Stutchbury and Claire Hedges and offered case studies from research projects in the Centre for the Study of Global Development. We talked about evidence cafes, for example, as a tool for participatory research (thank you to a fellow participant for the link to this Open Learn course!). The session closed with a set of principles which invited us to think about the disruptive power of non-traditional methodologies, but also about the need for trialling and retesting their implementation to ensure quality.

This has probably raised more questions than answers for me as an evaluator (and indeed in my wider scholarship and research work):

  • How and when should we look beyond some of the canonical models (Kirkpatrick, Guskey, etc.) in the evaluation of staff development? In Educational Development, we might have moved beyond ‘happy sheets’, but have we really challenged the deeper, fundamental epistemological assumptions about how we generate findings?
  • When we are designing and evaluating activities for a diverse staff population, how relevant are our evaluation approaches? Or, in other words, how are colonial structures and ways of thinking influencing what we measure?
  • In what ways do methods such as focus groups and surveys privilege certain forms of knowledge over others?
  • How will we balance the time, effort (and possible failures) of implementing radically new evaluation methods against the ongoing need to evidence value and impact?
  • What do we need to unlearn about researcher/evaluator objectivity? How can we better work with our positionality?
  • What do we understand by ‘traditional’ research methods and why might we want to move beyond them? (a brilliant question posed by Kris Stutchbury and Claire Hedges at the above workshop!)

I feel I should conclude with a disclaimer that I am at an early stage in the journey to challenge my assumptions and test what I know about inclusive scholarship and evaluation. So far, however, what I have discovered seems less about what I can learn, and more about what I must ‘unlearn.’

We invite readers to continue the discussion by sharing any resources, reading, or reflections in a reply below.

References

Abram, C. ‘We need to decolonize organisational learning.’ Available at: We Need To Decolonize Organizational Learning & Evaluation | by Cheryl Abram | Medium (Accessed: 14 February 2025).

Bamber, V. and Stefani, L. (2015) ‘Taking up the challenge of evidencing value in educational development: from theory to practice,’ International Journal for Academic Development, 21(3), pp. 242–254. https://doi.org/10.1080/1360144X.2015.1100112

Emergence Collective (2021) ‘Decolonising Evaluation.’ Available at: Decolonizing evaluation (Accessed: 14 February 2025).

Hassnain, H. (2023) ‘Decolonizing Evaluation: Truth, Power, and the Global Evaluation Knowledge Base,’ Journal of Multidisciplinary Evaluation, 19(44). https://doi.org/10.56645/jmde.v19i44.803

Stevens, C. (2022) ‘Get talking about evaluation: Coordinating an approach to evaluating staff development events,’ Educational Developments. Available at: Ed-Devs-23.2.pdf

 
