In the build-up to the final date for submissions for the Widening Participation Conference 2014 (30 April – 1 May 2014, Milton Keynes, UK), theme leaders from the conference’s steering group have been fleshing out their themes in more detail, outlining potential areas of interest and raising questions.

In this post, John Rose-Adams, Research, Evaluation and Information Manager at the OU’s Centre for Inclusion and Collaborative Partnerships, who is leading the theme on ‘Measuring and demonstrating impact’, outlines some of the key issues.

Measuring and demonstrating impact

‘Impact’ is a term which, perhaps more than most, leads academics to contort their faces in disgust and decry the hateful impositions of the neoliberal legacy.

However strongly one feels about the emergence of impact as an overriding indicator linking research and teaching funding to institutions, we increasingly have to respond to the question of impact in the bids we write, the reports we file, the projects we evaluate, and the departments we restructure.

As Molas-Gallart and Tang note ‘It is no longer assumed, as it may have been in the past, that research expenditure will eventually and on its own dynamics lead to social and economic benefits’ (2007, p. ii). We must now carefully spell out the ways in which our research and other activity relates to social and economic benefits, both nationally and globally.

Large swathes of widening participation activity in HEIs across the UK are to some extent measured, evaluated, and reported on. But there is an emerging consensus from sources of funding critical to further work to widen participation – in particular the UK nations’ higher education funding councils – that not enough is being done to establish the impact of what we do.

This conference theme provides a space both for critical engagement with the concept of ‘impact’ and for reporting on research and evaluation which demonstrates impact and improvement in successfully widening participation in higher education. Paper authors in this theme are asked to offer their own definitions of impact, and to argue strongly for how and why specific measures of success are used.

Reference

Molas-Gallart, J. and Tang, P. (2007) Report of the ESRC Impact Evaluation Methods Workshop, 20 March 2007. Online at: http://www.esrc.ac.uk/_images/ESRC_Impact_Evaluation_Methods_Workshop_tcm8-3814.pdf [accessed 11 September 2013].

Submissions

Poster and paper submissions are welcomed which speak to one or more of the conference themes:

  • Innovation in Design and Pedagogy
  • Impact of Curriculum Reform
  • Curriculum Openness
  • Flexibility and Modes of Delivery
  • Measuring and Demonstrating Impact
  • Revisiting Theory

Full details about making your submission can be found on the conference website. The final date for submissions is 1 November 2013.
