Designing a bootcamp – what did we get from the experience?

We’ve spent quite a bit of time in 2021 working on a Learning Design Bootcamp alongside colleagues from several other HE institutions. In a typical year, as hosts we would have physically hosted sessions for the bootcamp as well as contributing ideas to the programme of activities and providing an OU LD flavour to the proceedings. Running it during a pandemic has, of course, been a rather different experience, both for us as designers and for the participants. We’ve already shared the participant story – now we want to share some thoughts on the experience for us as designers and, importantly, what we’ve taken away from the process.


Hands-on support for module teams

Whilst we carry out fixed and clearly defined Learning Design activity as part of our core offer, we can also provide more hands-on input to support module teams with specific challenges they are facing in their design. Below are three examples where we’ve helped module teams progress, either with a specific tool or with broader aspects of Learning Design.

Guiding Authors

Chris Cox

For one FBL module, several challenges had been identified in the Learning Design Workshop in the early stages of development. I’d focused my attention on a few key areas to take forward into the LD Plan. However, it became clear that more guidance on writing and structuring the learning was needed. Plenty of guidance and resources were available on their workspace, but the team needed something more immediate.

In consultation with my colleagues, I decided to take TEL101 – a module designed to introduce staff to Learning Design and what we do – and adjust it for the team. The module was quite short, so it matched the shorter study weeks the team needed, and it was well structured to deliver learning while engaging students with different activity styles at a deep level. I took a copy of TEL101 and produced a ‘behind the scenes’ commentary version: each important LD point was highlighted and the design principles explained, to help the team think about how to write, balance activities, tie activities to learning outcomes, and see good practical examples of LD.

I also produced a visualisation to give the team a starting point and an approach to writing that they could return to, plus a chart coding the activity types – a pedagogic map of the invisible, underlying structure of the module – to help them think more clearly about what each activity in a section was achieving, and how it helped students stay on course and succeed.

I presented these at a Module Team meeting; the authors found them helpful, and the visualisation helped things click. The LD team now have another tool – ‘LD101 Behind the Scenes’ – to help guide module teams in a similar position.

OneNote for PDP

Dot Coley

Soon after joining the team, while researching how OneNote could be used as an ePortfolio tool for one of my FASS modules, I joined forces with Sue Lowe to share knowledge and support the WELS PDP pilot. The benefit was two-fold: drawing on existing experience to feed into my FASS work, and using my technical background to explore the impact of potential compatibility issues for the existing pilot. Part of this included joining our academic partners in delivering PDP support tutorials – not only assisting with immediate issues, but also helping to keep the focus on who our students are and what challenges others may face in the future.

During this piece of work, another FASS academic colleague requested advice on using OneNote, and I took the lead on advising how it might work in practice. I used my previous research to present key information alongside a practical demonstration of the WELS pilot templates, showing how flexible OneNote is and how notebooks can be structured to meet individual needs. I also introduced the Curriculum Design Student Panel and explained how it can help with testing early concepts.

Explaining how the approach has been used elsewhere provided an opportunity to present evidence of what has worked so far and what challenges we have faced. I was also able to talk about how the WELS pilot ran alongside live modules – with a smaller group of students – ahead of implementing it more widely, and I used ICEBERG to emphasise the importance of embedding reflective activities into the learning journey and allowing for them within workload planning.

Linking the concept to use across a whole qualification, I stressed the importance of starting gradually at Level 1, scaffolding students in their reflection and reducing support over the duration of the qualification, so that by the end of Level 3 they can reflect independently – which is especially important for those moving on to study at postgraduate level.

Further work is continuing in WELS, and a new phase is now underway. I am supporting the academic team on a Teaching Excellence funded project aiming to train previous pilot students to become PDP coaches, offering peer-to-peer mentoring.

Helping a module team to get ‘unstuck’

Katharine Reedy and Mark Childs

A degree apprenticeship team was struggling to make progress with designing the ‘practice’ module at the start of the qualification. This was due to the sheer volume of content that students need to learn. By mapping skills content across both level 1 modules, and visualising the high-level student journey in terms of how they would engage with the material, an approach was found that enabled the team to come up with a workable structure for module ‘blocks’. This was achieved in a workshop facilitated by two Learning Design team members, which enabled the module team to think through the whole student learning journey and come to a consensus about what skills they need to develop and use at each stage.

Gathering student perspectives: a case study of capturing student feedback

One of our key areas of growth over the past year or so has been gathering more student input into the design process. To this end, we have a number of core approaches to gathering student perspectives during design phases:

  • we administer the curriculum design student panel, which provides us with access to over 1500 OU students for rapid feedback
  • we support and advise on developmental testing, frequently working in partnership with colleagues in LTI Academic, editors and the module team
  • we advise module teams on use of Real-Time Student Feedback for gathering input during presentation from students

The examples below demonstrate how we go about deploying some of these approaches in our design work with module teams.

Student perspectives on aspects of module design at postgraduate level

Gill MacMillan (Senior Learning Designer)

In this example, we were able to use the Curriculum Design Student Panel to gather student feedback – during the design phase – on specific aspects of the structure of a postgraduate module:

  • Effectiveness of a weekly introductory slidecast/video, in which an academic author sets the scene and outlines the key focus and discussions to be covered in that week
  • Usefulness of a ‘Planning your week’ overview table – outlining the activities in the week ahead (timings, type of activity etc.) in order to help students plan their study time
  • Student preference between two alternative layouts for the study planner

A relevant cohort of students was identified, and while the number who responded was relatively small, the feedback was very consistent and gave a clear steer from students on all three aspects:

  • The majority of students found the introductory slidecast/video easy to use, informative and more effective than reading online text, and said they would start the week by looking at it in order to understand the aspects of the topic to be covered. (In addition, an issue was raised about the visibility of the transcript link when viewing the video on an iPad. This was fed back to the Learning Systems team who were already aware of the issue and have made improvements to the functionality).
  • All the students found the ‘Planning your week’ overview useful
  • The majority expressed a clear preference for the second alternative layout

In terms of the overall findings and impact, the Module Chair was happy with the results and the process, recognised the usefulness of getting this direct student feedback, and went on to implement the findings. So there was a clear impact from the panel’s input, and the Module Chair summarised this impact in a follow-up message to those students who had participated. Now that the module is live, Real-time Student Feedback is being used to get feedback from students as they study, enabling us to get further feedback on these, and other, aspects of the module design.

Testing innovative assessment

Yvonne Murphy (Learning Designer)

This testing was carried out as part of the developmental testing that the LTD team in LTI Academic leads on. We work in partnership with the LTD team initially, and then co-ordinate the input of D&P colleagues in the testing. LTI Academic liaise with the module team to establish how the testing will run, recruit the students and set up the evaluation. Learning Designers then support by making sure the content is set up in an authentic manner on the VLE, and work with D&P to get the materials ready and presented to an appropriate standard.

For this particular testing relating to activities for a new WELS module (E309), a key part of our role was the qualitative analysis of student feedback.

The developmental testing itself consisted of two student activities:

  • Creating a ‘digital badge’ using a PowerPoint template with embedded audio.
  • Creating an infographic comparing two datasets using a PowerPoint template with embedded audio.

Both activities used Office 365 and required students to work collaboratively in pairs. Eleven student participants completed the E309 activities and the associated questionnaire; some students also gave additional feedback by phone.

This gave us a large amount of qualitative data from the student questionnaire, which we analysed and captured in a table summarising issues per participant per activity. Following the initial analysis, we contacted some of the students and arranged phone calls to explore their feedback more thoroughly.
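The kind of tabulation described above – grouping reported issues by participant and activity – can be sketched in a few lines of Python. The participant IDs, activity names and issue labels below are invented for illustration; they are not the actual E309 data.

```python
# Minimal sketch of summarising qualitative questionnaire feedback
# into a table of issues per participant per activity.
# All data here is invented for illustration.
from collections import defaultdict

# Each response records (participant, activity, issue noted).
responses = [
    ("P01", "digital badge", "could not share via Office 365"),
    ("P01", "infographic", "unsure how to record audio"),
    ("P02", "digital badge", "unsure how to record audio"),
    ("P03", "infographic", "could not share via Office 365"),
]

def summarise(responses):
    """Group reported issues by (participant, activity)."""
    table = defaultdict(list)
    for participant, activity, issue in responses:
        table[(participant, activity)].append(issue)
    return dict(table)

summary = summarise(responses)
for (participant, activity), issues in sorted(summary.items()):
    print(f"{participant} | {activity}: {'; '.join(issues)}")
```

A summary like this makes it easy to spot issues that recur across participants (for example, audio recording difficulties), which is the pattern-finding step that fed into the risks and mitigations below.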

Some of the identified risks and opportunities were as follows:

  • Risk/issue: Students using a personal Office 365 account could not use the ‘share’ function with a partner on an OU student account, and vice versa (sharing is not possible outside the organisation), so Office 365 sharing could not be used to collaborate. Mitigation: advise students to share files via email and on module e-groups.
  • Risk/issue: PowerPoint had to be installed and the desktop version used for audio (audio functionality is not available online). Mitigation: provide instructions to save the PowerPoint file to the desktop rather than using the online version.
  • Risk/issue: Students were unsure how to record audio. Mitigation: provide additional resources – a one-page visual document on recording audio, a session in the first tutorial on Office 365 and PowerPoint, and a Camtasia screencast on basic navigation.
  • Risk/issue: Students lacked confidence in their IT and PowerPoint skills. Mitigation: use the digital badge activity as a ‘practice’ (dummy) TMA to prepare students for creating a poster in TMA01, and provide additional support and resources (as above).
  • Risk/issue: Some pairs were unable to collaborate due to clashes in time commitment, with one partner doing most or all of the work. Mitigation: students to produce individual infographics for assessment, but to collaborate on the preparation and research.

The module team implemented all of the recommendations arising from the opportunities documented above (except the one-page visual document, as it was felt that enough resource was provided by the screencast, tutorial, template and instructions).

Flavours of design – what’s in a name?

Our team at The Open University has gone through a rebrand. Where before we had Learning Design (LD) and TEL Design components to our team, we are now all Learning Designers. A small change in terminology, but hopefully a big change in mindset. So why have we done this?

First, it better encapsulates what we’re actually doing and seeking to do as a team. Many of the activities carried out as part of TEL Design were actually LD activities happening at a more detailed level. For instance, visualising the student learning journey, planning for a block of study, designing a collaborative online activity or helping the module team to review student workload expectations. You could see these activities as implementation of the Learning Design.

Secondly, it helps to reinforce that LD is not a one-off activity: you need to continue making design decisions and reviewing how you are progressing toward the planned design throughout module development and production, and even into presentation. The online LD tools we have support this approach. Encouraging teams to engage with these and continue to review their design is something we’ll continue to push on.

And lastly, having two design teams gave the sense that these were two very different activities whereas they were always really a continuation of the Learning Design process, with TEL Design going into more detail and identifying appropriate uses of technology as we did so.

Whilst we won’t be changing our approach to design overnight, we will be looking at how we can get even more value out of LD workshops, and at how we can enable the subsequent design activities to tie in more closely with the outputs from those workshops.

Through this new blog we’ll share some of our experiences as we adjust to the new team name, and any enhancements we make to our processes. It promises to be a busy and exciting time for us!