Listening to students in real time: how feedback leads to success

Like all robust development processes, our learning design approach includes a number of opportunities to learn from feedback. One is our curriculum design student panel (aimed at capturing students’ views on learning design ideas before they’re live). Another is real-time student feedback (RTSF), which gathers feedback from students as they’re studying, with the aim of ensuring they get the support they need. Short questionnaires focusing on recently studied topics are embedded into students’ online study planners, so students can reflect and comment on their experiences and receive extra guidance based on those comments.

Our module teams can learn from RTSF too. In this post, we take a look at how students and module teams can benefit from RTSF and some of the impacts it’s making.

Engagement with others online: students’ views of course design

As learning designers, it’s essential that we explore students’ needs and goals. That way, we can make sure that learning activities address these needs and support students to reach their goals. For example, each time we design a new module, we take time to explore student data and course teams’ experience to build up student profiles or personas that can be referred to throughout the module design process.

Learning from one another: the value of students’ insights

Students are at the heart of our approach to learning design. We focus on helping our module authoring teams make evidence-based decisions, and insights from students form an important part of that evidence. This is the reasoning behind our curriculum design student panel (CDSP), which was highly commended in the ALT Learning Technologist of the Year (team) awards in 2019, described by the judging panel as ‘a model for others to follow’.

Learning analytics: the patchwork quilt visualisation

At the Open University, we have developed a suite of learning analytics (LA) visualisations called ‘Action for Analytics’ (A4A: slides from a presentation giving more detail), designed to help those responsible for producing modules see the effects of their designs. For example, it’s possible to track how often the videos we produce for a module are actually watched, and therefore judge whether producing more would be a good investment.

[Image: sample learning analytics graph of students’ resource use over time on an OU module]
Tracking how many students have accessed a particular section of the module in any given week. Weeks run along the bottom; blue bars mark weeks containing assignments.

This has been very successful: colleagues outside the Learning Design team (mostly academics) can track what is going on in their modules in real time, and can also see the effects of changes as they are brought in.

However, the tool is limited to a set of ‘baked in’ dashboards, so it’s not possible to split the data above into students who ended up failing the module and those who passed, and compare the two graphs. Such a comparison could give useful insight into the value of individual parts of a module and whether students are actually accessing them.
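To make the missing comparison concrete, here is a minimal sketch of the kind of split we have in mind. It is illustrative only: the activity log, its column names and the outcome labels are all invented, and this is not part of A4A itself.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical activity log: one row per student per week, with the
# student's final module outcome ("pass" or "fail") attached.
log = pd.DataFrame({
    "student_id": [1, 1, 2, 2, 3, 3],
    "week":       [1, 2, 1, 2, 1, 2],
    "accessed":   [1, 1, 1, 0, 1, 0],
    "outcome":    ["pass"] * 4 + ["fail"] * 2,
})

# For each outcome group, count how many students accessed the
# module in each week, then plot the two series side by side.
weekly = (log.groupby(["week", "outcome"])["accessed"]
             .sum()
             .unstack("outcome"))
weekly.plot(kind="bar")
plt.xlabel("Study week")
plt.ylabel("Students accessing the module")
plt.title("Weekly access: passed vs failed (illustrative data)")
plt.tight_layout()
plt.show()
```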

Drilling down into the data: A4A isn’t the only route to exploring statistics about students on modules. A number of databases underlie the visualisations, and these can be accessed directly by specialist staff. Using our access rights, we have been experimenting with bespoke visualisations that aren’t in the current suite but that we think could help those writing and designing modules. These are currently prototypes, but they show some promise.
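In outline, the workflow behind these prototypes is a direct query followed by local reshaping. The sketch below is hypothetical: it uses an in-memory SQLite database as a stand-in for the real databases, and the table and column names are invented for illustration.

```python
import sqlite3
import pandas as pd

# In-memory stand-in for the real activity databases; the actual
# storage, table names and access mechanism will differ.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE section_access (student_id, section_id)")
conn.executemany("INSERT INTO section_access VALUES (?, ?)",
                 [(1, "s17"), (1, "s18"), (2, "s18"), (3, "s18")])

# Aggregate raw access events into per-student, per-section counts.
df = pd.read_sql_query(
    "SELECT student_id, section_id, COUNT(*) AS visits "
    "FROM section_access GROUP BY student_id, section_id",
    conn,
)

# Pivot into the students-by-sections matrix the bespoke
# visualisations start from: 1 = visited at least once, 0 = never.
matrix = (df.pivot(index="student_id", columns="section_id",
                   values="visits")
            .fillna(0).gt(0).astype(int))
print(matrix)
```

A matrix like this, one row per student and one column per section, is the raw material for the patchwork quilt shown below.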

[Image: screenshot of the patchwork quilt visualisation]
Patchwork quilt visualisation. Sections of the module are arranged in columns. Rows at the top represent individual students, showing the sections each has visited at least once. At the bottom, these individual visits are collated to show percentage access to each element for various groups: withdrawers at the top, still-registered students below them, and students of low socio-economic status (SES) below that.

In this visualisation, individual students are shown one per row at the top. If a student has accessed an element of the course (one section per column), the corresponding cell is blue; if they have never accessed it, the cell is white. At the bottom, students are grouped (e.g. ‘withdrawers’ and ‘registered’, i.e. not withdrawn), and the cells are coloured with hot colours showing low usage and cool colours showing high usage.
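As a sketch of how such a quilt can be drawn, the code below uses randomly generated stand-in data; the matrix, group labels and colour map are illustrative choices, not our production implementation. The top panel renders the per-student binary matrix, and the bottom panel collates it into percentage access for two groups.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical binary access matrix: rows are students, columns are
# module sections; 1 = visited at least once, 0 = never visited.
rng = np.random.default_rng(0)
access = (rng.random((60, 20)) > 0.3).astype(int)
withdrew = rng.random(60) < 0.25          # invented group labels

fig, (top, bottom) = plt.subplots(
    2, 1, sharex=True, gridspec_kw={"height_ratios": [4, 1]})

# Top panel: one row per student; blue = visited, white = not.
top.imshow(access, aspect="auto", cmap="Blues", vmin=0, vmax=1)
top.set_ylabel("Individual students")

# Bottom panel: percentage of each group visiting each section, on a
# reversed warm-cool map so that hot colours mean low usage and cool
# colours mean high usage, as in the prototype.
groups = np.vstack([
    access[withdrew].mean(axis=0) * 100,   # withdrawers
    access[~withdrew].mean(axis=0) * 100,  # still registered
])
im = bottom.imshow(groups, aspect="auto", cmap="coolwarm_r",
                   vmin=0, vmax=100)
bottom.set_yticks([0, 1])
bottom.set_yticklabels(["Withdrawn", "Registered"])
bottom.set_xlabel("Module sections")
fig.colorbar(im, ax=bottom, label="% visited at least once")
plt.show()
```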

Example interpretation: As an example of its use, the last column is the block assignment. It can clearly be seen that section 18 (the second column from the right, expanded at the upper left) attracts a high percentage of students visiting it at least once. Section 17 (third from the right) attracts considerably fewer students, especially among withdrawers. This is likely because section 18 is included in the assignment whereas section 17 is not, and, as a result, students are choosing to skip section 17. From a design point of view, should it be included at all?

More granularity: In our work investigating this graphic, we think it will become even more useful when the granularity improves; at present we can only see that students have accessed a whole section. For example, it would be much more useful to see how far they got within a section itself: did they give up halfway through? Improvements in the learning analytics that the VLE records should help with this.

Next steps: This is a work in progress; we are already making the patchwork quilt visualisation more sophisticated, and we have plans for other experiments.

Richard Treves, Senior Learning Designer.

Carl Small, Analyst.