Feedback loops: reflecting on five years of feedback from the curriculum design student panel 


If you’ve ever felt frustrated by a product that doesn’t seem to work for you, you’ll understand the importance of building opportunities for feedback into a design process. Feedback is certainly an essential part of our learning design process. Alongside various organisation-wide evaluation initiatives whose insights we draw on in our work, the learning design team runs the curriculum design student panel, which gives students opportunities to comment on a range of aspects of learning design. These comments feed directly back to our module teams. Since the panel was set up in 2016, its members have provided invaluable insights into their study preferences, motivations, environments and habits.

As the panel approaches its fifth birthday in July, now’s a good time to reflect on what the panel has achieved and how it’s evolved. It’s also a timely moment to highlight the importance of evaluation to the panel itself – in particular, how we’ve been able to use feedback from students to develop the panel and students’ experience of it.

How it started

The panel started out as a pilot project with the aim of helping module teams access students’ views easily. As with most large organisations, at the OU there are (rightly) several levels of sign-off usually needed before anyone can contact students. The panel has these built in – members give consent to be contacted and to share their feedback upfront – so that our module design teams can quickly gather student feedback and act on it.

Five years on, the panel is now part of our business-as-usual learning design process. It’s also grown: in 2016 we recruited around 500 student panel members; there are now more than 3,000.

How it’s going

The large size of the panel means that we now have students representing every level (Access to taught postgraduate), and every faculty/school (including Open Programme students). As a result, we can gather feedback from sample groups who represent the target audience of a particular learning activity.

In a typical student panel activity, members of a module team will identify an aspect of their learning materials that would benefit from student feedback. This could range from comments on how easy it is to interact with a new tool to thoughts on new source materials. Students from a particular subgroup of the panel membership (for example, students who have studied a level 2 science module) will be invited to take part. Behind the scenes, learning designers will be working with the module team, editors and designers to find an appropriate feedback methodology (from a simple survey to a fully interactive sample learning activity) and create the required materials. These are all tested before students are given access.

Once students have provided their feedback, the module team can access it in an anonymised format and build it into their ongoing learning design.

We aim to replicate the student experience as closely as possible. For example, all our sample activities are hosted on a site that’s very similar to the student VLE, as we know students will be familiar with its layout and functionality. We also design any sample learning activities to be as close to the live experience as possible, so that feedback can easily be applied and acted on.

An evolving approach

It’s essential that the panel continues to evolve so that it can serve both students and staff well, so every year we seek feedback on the panel itself from its members as part of our evaluation approach. The most recent evaluation led to some changes that we’ll monitor and seek feedback on in due course.

For example, students told us that they’d like to receive feedback about how their comments have contributed to module development. In response to this, we’ll be including this information in our regular newsletters and our student-facing site. Similarly, students told us that time was the biggest barrier to taking part in panel activities, so we’ll be monitoring this as we design activities to make sure we’re making the most of students’ time. Panel members also mentioned that they’d like more opportunities to comment in their own words, so we’ll build these into future activities.

Benefits for everyone

As part of our evaluation, we asked panel members about the skills they felt they’d gained from their involvement. Students told us they’d built educational knowledge, critical thinking and confidence – all attributes that will serve them well in their studies and beyond.

It’s not just panel members who benefit from the panel – of course, our module teams benefit too. Students’ comments provide them with focused feedback on new activities, tools, resources and formats. They also offer insights into students’ study habits and preferences, which enables module teams to tailor materials to meet students’ needs so that ultimately, future students also benefit.

Looking ahead

We’ll be marking the panel’s fifth birthday in July with a range of online activities, including quizzes (with prizes!).

Panel stats*

• Numbers of students on the panel have risen from 461 in 2016 to 3,138 in 2021.
• Since the panel began, members have contributed feedback via surveys, interviews, user and developmental testing, workshops and critical reading.
• Curriculum design student panel members have provided feedback and insights via 83 different activities.
• They’ve also responded to 22 quick questions of the month, with some of these responses feeding into wider research and recommendations.

*All statistics from Curriculum Design Student Panel 2019-20 Evaluation.
