Feedback loops: reflecting on five years of feedback from the curriculum design student panel 

If you’ve ever felt frustrated by a product that doesn’t seem to work for you, you’ll understand the importance of building opportunities for feedback into a design process. It’s certainly an essential part of our learning design process: alongside various organisation-wide evaluation initiatives whose insights we access as part of our work, the learning design team runs the curriculum design student panel, which provides opportunities for students to comment on a range of aspects of learning design. These comments feed directly back to our module teams. Panel members have provided invaluable insights into their study preferences, motivations, environments and habits since the panel was set up in 2016.

Active learning: making learning engaging

We’ve probably all sat through enough ‘death by PowerPoint’ slide decks to know what happens when we’re presented with information but don’t have the chance to engage with it. In the best-case scenario, we simply don’t learn anything. But often we leave the meeting or class worse off – with unanswered questions, frustration and reduced confidence in the tutor or meeting organiser.

Workload mapping part 3: concurrency and activity makeup

In this series of posts, we’ve been looking at student workload mapping. This final post looks at the other neat things we can do once we’ve mapped out a module. 

Our example student, Alex, has had their workload smoothed out in the previous posts. Now that we’re sure the volume of learning and teaching for this module is manageable, we can start checking that it fits in with the wider context of their studies, and that the studies themselves are suitably varied and engaging. We’re able to do this with our existing mapping data through Concurrency and Activity mapping.

Concurrency mapping 

The Open University has an increasing number of students studying at full-time equivalent (FTE) intensity, i.e. 120 credits a year. As the majority of modules run throughout the academic year, this means their modules overlap one another. While proactive workload mapping has smoothed each module individually in our examples, once modules run concurrently, assessments and even small dips and spikes can combine and be magnified to the same damaging proportions we discussed in our first post.

By taking the mapped workload from both modules and laying the week-by-week workloads over one another, we can see the concurrent workload for students studying both modules. In this case, a small overrun in both modules in week 8 has generated an unwanted spike that could lead to the same negative outcomes we saw with Alex, our example student, in part 1.
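
For readers who like to see the mechanics, here is a minimal sketch of that overlay step. The weekly figures and the 5% tolerance are invented for illustration; they are not drawn from real modules.

```python
# Hypothetical week-by-week workloads (hours) for two concurrently studied modules.
# The figures and the tolerance are invented for illustration only.
module_a = [20, 19, 20, 20, 19, 20, 20, 22, 19, 20]
module_b = [19, 20, 19, 20, 20, 19, 20, 22, 20, 19]

weekly_target = 20          # agreed workload per module per week
tolerance = 1.05            # flag anything more than 5% over the combined target
combined_target = 2 * weekly_target

for week, (a, b) in enumerate(zip(module_a, module_b), start=1):
    combined = a + b
    flag = "  <-- concurrency spike" if combined > combined_target * tolerance else ""
    print(f"Week {week:2d}: {a} + {b} = {combined} hours{flag}")
```

The same overlay extends to students taking more than two modules by summing additional calendars.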

We might also see this around assessments, where a higher student-directed workload (from revisiting material, researching and drafting an assignment) eats into the overall study time available for the week. This is a particular concern at level 1, where students are still building their time management skills – and may struggle to prioritise conflicting assessments across multiple modules.

Our example student Alex may opt to prioritise the assessment on the core module of the qualification, and devote less time to an optional one – or feel overwhelmed by the sudden influx of self-directed workload and perform worse on both. While part of the solution to this is scaffolding and studentship activities, which build study skills throughout the module, maintaining an awareness of potential concurrency during design allows both calendars to be nudged towards a more harmonious alignment. This is most useful when a new module is being designed, and a relationship between that and an existing module can be predicted. 

Activity makeup 

Back in part 2, we mentioned mapping directed activities. While doing this, we divide those activities into:

  • Assimilative – read/watch/listen (this category includes most non-directed teaching material) 
  • Interactive/Adaptive – explore/experiment/simulate 
  • Experiential – practice/apply/experience 
  • Communicative – debate/discuss/share
  • Finding and handling information – analyse/collate/discover 
  • Assessment – write/present/report 
  • Productive – create/build/produce 

An aspirational makeup of these activity types is decided right at the beginning of the module design process – taking into account the subject, student demographics and more. Module mapping allows us to revisit that aspiration during design and see if it’s on track:

In this case, we can see the module has ended up with more productive and assimilative activities than initially planned. If we wanted to, we could filter this down to a unit or week level to see if particular sections are skewing the results and suggest structural tweaks. Alternatively, this may just be the natural evolution of the module’s teaching direction as it develops – and might not be a cause for concern. 
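
For the curious, the comparison itself is simple to automate. In this rough sketch the percentages are invented and the five-point drift threshold is an arbitrary choice, not a team standard:

```python
# Hypothetical aspirational vs mapped activity makeup (percentage of study time).
planned = {
    "Assimilative": 40, "Interactive/Adaptive": 10, "Experiential": 10,
    "Communicative": 10, "Finding and handling information": 10,
    "Assessment": 10, "Productive": 10,
}
mapped = {
    "Assimilative": 45, "Interactive/Adaptive": 8, "Experiential": 9,
    "Communicative": 7, "Finding and handling information": 6,
    "Assessment": 10, "Productive": 15,
}

for activity_type, target in planned.items():
    actual = mapped[activity_type]
    drift = actual - target
    note = "  <-- worth a closer look" if abs(drift) >= 5 else ""
    print(f"{activity_type:35} planned {target:2}%  mapped {actual:2}%  drift {drift:+}%{note}")
```

The same loop can be run per unit or per week to spot which sections are skewing the overall makeup.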

Sense checking against student profiles is a quick way of taking the pulse of a module’s activity makeup. In this case (mapped from our example level 1 module) we’re happy to see that Alex would enjoy a broad spread of activities while studying this module – but we would suggest boosting the finding and handling information activity time slightly, in order to build better towards expectations at level 2. We would also like to see more communicative activities at level 1, to help Alex integrate into the student community.

While Alex is a figment of our imagination, much of the data in this series of posts has come from modules at various stages of development. Quantifiable factors in learning and teaching will never tell the whole story – but hopefully we’ve demonstrated the difference that can be made through proactive evaluation and student-focused thinking.

If you enjoy a good graph as much as we do, or would like to know more about module mapping and our other evaluation work, let us know on Twitter via @OU_LD_Team.

 

Workload mapping part 2: mapping in learning design

In this series of posts, we’re looking at student workload mapping. This second post explains how we monitor workload during module design, and where we might make recommendations to authors.  

Overall workload for a module is agreed right at the beginning of learning design, with set times to aim for based on the level of study, credits and duration of the module. In the first post in this series we looked at the case of Alex and the three-week workload lump. For that level 1, 60-credit, 30-week module, the weekly workload should have looked something like this:

  • Module-directed workload: 13 hours
  • Student-directed workload: 7 hours
  • Total workload: 20 hours
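
For anyone wondering where the 20 hours comes from, it follows from the usual convention of roughly 10 notional study hours per credit (our working assumption here):

```python
# Quick sanity check of the weekly target, assuming roughly 10 notional study
# hours per credit (a common UK convention, assumed here for illustration).
credits, weeks, hours_per_credit = 60, 30, 10

weekly_total = credits * hours_per_credit / weeks   # 600 hours over 30 weeks = 20 h/week
module_directed = 13                                # figure quoted above
student_directed = weekly_total - module_directed   # leaves 7 h/week student-directed

print(f"{weekly_total:.0f} hours a week: {module_directed} module-directed, "
      f"{student_directed:.0f} student-directed")
```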

This balance is taken into consideration when planning the overall structure of a module, with authors dividing topics, units and blocks across the 30 weeks in as even a distribution as possible. 

Slight variations in the workload are to be expected, and can creep in unnoticed during drafting, when authors start blocking out the details of teaching activities. Detailed workload mapping starts here, with the aim of informing tweaks for the next set of drafts. If we take another look at Alex’s module, the results might look something like this: 

So, how do we go about it? 

The process changes slightly as a module fills out and nears presentation, but the core elements of mapping are: 

  • Word counts 
  • Reading speeds 
  • Multimedia assets 
  • Directed activities 

Word counts are a good early measure of study time, with text often representing a sizeable chunk of teaching in distance education. Converting those counts into time relies on estimated reading speeds for different types of content. Introductory material might be read at a ‘normal’ speed, while high cognitive load sections (dense definitions or models that might need slower or repeated readings) need more time allocated per word.
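
As an illustration of how that conversion might work in practice (the words-per-minute figures below are placeholders chosen for the example, not our actual calibration):

```python
# Convert word counts into estimated study minutes using per-content-type reading speeds.
# The speeds are illustrative assumptions, not calibrated values.
READING_SPEEDS_WPM = {
    "introductory": 250,          # read once at a comfortable pace
    "standard": 180,              # typical teaching text
    "high_cognitive_load": 100,   # dense definitions or models, slower or repeated reading
}

def reading_minutes(word_count: int, content_type: str) -> float:
    """Estimate study minutes for a block of text."""
    return word_count / READING_SPEEDS_WPM[content_type]

print(f"{reading_minutes(1500, 'introductory'):.0f} minutes")         # ~6 minutes
print(f"{reading_minutes(1500, 'high_cognitive_load'):.0f} minutes")  # ~15 minutes
```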

Multimedia assets are mapped in two ways. AV material is timed on its duration multiplied by two, allowing for pausing and taking notes. Images receive a flat figure depending on their nature, with less time allocated for decorative or illustrative images than for detailed infographics and diagrams. 

Directed activities are written with an estimated time as part of best practice (e.g. Activity 1.1 – 40 minutes), which we then sense-check and use for mapping. While doing so, we also look at the different types of activities being used, categorising them as we go. We’ll look at this in more detail in the final post in the series. 
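
Putting those rules of thumb together, a week’s directed workload can be rolled up along these lines. All of the figures, including the flat per-image allowances, are invented for illustration:

```python
# Roll up one week's directed workload: AV duration doubled for pausing and note-taking,
# a flat figure per image, authors' activity estimates, plus the reading estimate above.
av_minutes = [12, 8]                                   # two video/audio assets
image_minutes = {"illustrative": 1, "infographic": 5}  # assumed flat per-image figures
images = ["illustrative", "illustrative", "infographic"]
directed_activity_minutes = [40, 25]                   # e.g. Activity 1.1 - 40 minutes
reading_minutes_total = 540                            # from the word-count estimate

weekly_minutes = (
    sum(duration * 2 for duration in av_minutes)       # 40 minutes of AV study
    + sum(image_minutes[kind] for kind in images)      # 7 minutes on images
    + sum(directed_activity_minutes)                   # 65 minutes of activities
    + reading_minutes_total
)
print(f"Estimated module-directed workload: {weekly_minutes / 60:.1f} hours (target 13)")
```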

At the end, we are able to see not just the workload of a module, but also the top-level composition: 

As you can see, text/reading content accounts for a large proportion of the workload in Alex’s module. We can also see that the crunch points in weeks 5 and 6 are largely down to higher proportions of directed activities and AV content. Our feedback for the next phase of drafting would be: 

  • Week 1 may benefit from slightly more study time (perhaps one more hour) to help quickly acclimatise level 1 students to the expected 7-hour mark. Check that induction and studentship activities have been accounted for. 
  • Look to reduce overall workload for weeks 5 and 6. Planned activities and AV currently account for a disproportionate amount. 
  • Week 10 may be a little text-heavy, which could affect engagement. 

How much we map is often down to the specific needs of the module. In some cases, the first 7 weeks will give enough of an idea to even out the workload as the rest of it develops.  In other cases, the full module is mapped out to look for peaks and troughs. 

Fortunately, in the case of Alex’s module, the crunch in weeks 5 and 6 is identified early and smoothed out in subsequent drafts before first presentation. Alex has a consistent experience, a better work/life/study balance, and the quality of the learning and teaching itself once again becomes the primary determinant of success. 

In the final post in this series, we’ll look at how we use mapping data to measure concurrency (multiple module study) and activities – and the opportunities that opens up. 
