Engaging Research

An Open Research University: Embedding public engagement within the research culture of the OU

A strategic approach for evaluating public engagement with research

Posted on Thursday, 24 July 2014 by Gareth Davies

Gareth Davies, The Open University

Before I joined the OU, my background was in risk-based decision-making. I looked forward to finding innovative ways of gathering evidence of the impact of public engagement with research (PER). However, it seemed that whenever PER was mentioned, evaluation would either become the elephant in the room or be quickly forgotten, and the conversation would drift towards public engagement in general rather than public engagement with research.

In my experience, this doesn’t arise from ill intent but rather from a lack of understanding about the affordances of different PER activities and the methods and techniques used to evaluate the impact of PER.

This seminar was an opportunity to test a theoretical framework that I believe has the capacity to address this issue.

As the evaluation researcher on Engaging Opportunities and the Catalyst for Public Engagement with Research, my focus is on developing appropriate plans for evaluating the impact of PER. This is made easier by tools such as the 6 Ps, which Richard Holliman described in this post.

The dimensions of engaged research.

The six dimensions of engaged research help researchers to consider:

  • Public: who they intend to engage with
  • Purpose: why they want to engage
  • Processes: how they intend to engage
  • Participation: how the public(s) will interact with their research
  • Performance: how performance will be measured
  • Politics: what wider and local political issues need to be considered

However, planning and carrying out evaluation can still be daunting for people who aren't familiar with qualitative and semi-quantitative methods of data collection and analysis. To address this, the theoretical framework offers a strategic approach for evaluating PER.

Affordances revealed

To introduce the group to the logic behind the framework, I asked them to discuss the value of the four kinds of activity we are using in Engaging Opportunities (open lectures, open dialogues, open enquiry, open creativity). I then asked them to position each activity on a schematic representing the levels of uncertainty that characterise the different stages of the research cycle (x axis) and the scale of impact (primary, secondary, tertiary) (y axis). This revealed the affordances that the group felt each type of activity had to offer. Encouragingly, the group did see a distinction between the affordances offered by the different activities, which made for an easy transition into the theoretical framework (based on the concepts described by Funtowicz and Ravetz (1990)).
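The positioning exercise can be sketched as a simple data structure. The coordinates below are hypothetical placements for illustration only, not the positions the seminar group actually chose:

```python
# Illustrative sketch of the positioning exercise.
# x axis: uncertainty at that stage of the research cycle (0 = low, 1 = high).
# y axis: scale of impact (1 = primary, 2 = secondary, 3 = tertiary).
# All coordinate values are hypothetical examples.
ACTIVITIES = {
    "open lectures":   {"uncertainty": 0.2, "impact": 1},
    "open dialogues":  {"uncertainty": 0.5, "impact": 2},
    "open enquiry":    {"uncertainty": 0.8, "impact": 2},
    "open creativity": {"uncertainty": 0.7, "impact": 3},
}

def by_impact(level):
    """Return (alphabetically) the activities placed at a given scale of impact."""
    return sorted(name for name, pos in ACTIVITIES.items()
                  if pos["impact"] == level)

if __name__ == "__main__":
    for level, label in [(1, "primary"), (2, "secondary"), (3, "tertiary")]:
        print(f"{label}: {by_impact(level)}")
```

Grouping the activities by axis position in this way makes the distinctions between their affordances easy to read off, which is essentially what the schematic did for the group on paper.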

Evidence of impact!

Assuming the group had articulated a set of simple PER objectives, in theory they could use this framework to determine:

  • the type of knowledge characterising each of their objectives
  • the types of activity they should consider using
  • the types of methods and techniques that would be suitable for evaluating the impact of their PER

To demonstrate, I gave an overview of the evaluation I had carried out for our 2013 (open creativity) media training programme, finishing with two questions:

  • Do you see value in this approach?
  • What’s the most pressing issue with the PER agenda?

Please use Comments to send me your thoughts.

This entry was posted in Evaluation, Presentations and tagged Catalyst, Engaging opportunities, engaging with school students, evaluation, PER Catalyst, Public engagement with research, SUPI by Gareth Davies.