Engagement with others online: students’ views of course design

As learning designers, it’s essential that we explore students’ needs and goals. That way, we can make sure that learning activities address these needs and support students to reach their goals. For example, each time we design a new module, we take time to explore student data and course teams’ experience to build up student profiles or personas that can be referred to throughout the module design process.

Learning from one another: the value of students’ insights

Students are at the heart of our approach to learning design. We focus on helping our module authoring teams make evidence-based decisions, and insights from students form an important part of that evidence. This is the reasoning behind our curriculum design student panel (CDSP), which was highly commended in the ALT Learning Technologist of the Year (team) awards in 2019 and described by the judging panel as ‘a model for others to follow’.

Learning analytics: the patchwork quilt visualisation

At the Open University, we have developed a suite of learning analytics (LA) visualisations called ‘Action for Analytics’ (A4A: slides from a presentation giving more detail), designed to help those responsible for producing modules see the effects of their designs. For example, it’s possible to track how often the videos we produce for a module are actually watched, and therefore judge whether producing more would be a good investment.

[Figure: sample learning analytics graph of students’ resource use over time on an OU module. It tracks how many students have accessed a particular section of the module in any week; weeks run along the bottom, and blue bars mark weeks containing assignments.]

This has been very successful: colleagues outside the Learning Design team (mostly academics) can track what is going on in their modules in real time, and can also see the effects of changes as they are brought in.

However, the tool is limited to a set of ‘baked in’ dashboards, so it’s not possible to split the data above into students who went on to fail the module and those who passed, and compare the two graphs. Such a comparison could give useful insight into the value of individual parts of a module and whether students are actually accessing them.
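With direct access to the underlying data, the kind of split described above is straightforward to compute. The sketch below is purely illustrative, with made-up student IDs, weeks, and outcomes; it counts distinct students accessing the module each week, separated by final outcome:

```python
from collections import defaultdict

# Hypothetical access log: (student_id, week) events, plus final outcomes.
access_log = [
    ("s1", 1), ("s1", 2), ("s1", 3),
    ("s2", 1), ("s2", 3),
    ("s3", 1),
]
outcomes = {"s1": "pass", "s2": "pass", "s3": "fail"}

def weekly_access_by_outcome(log, outcomes):
    """Count distinct students accessing the module each week,
    split by final outcome (e.g. pass vs fail)."""
    groups = defaultdict(lambda: defaultdict(set))
    for student, week in log:
        groups[outcomes[student]][week].add(student)
    return {group: {week: len(students) for week, students in weeks.items()}
            for group, weeks in groups.items()}

counts = weekly_access_by_outcome(access_log, outcomes)
print(counts["pass"])  # {1: 2, 2: 1, 3: 2}
print(counts["fail"])  # {1: 1}
```

Plotting the two resulting weekly series side by side would give exactly the pass/fail comparison the baked-in dashboards cannot.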

Drilling down into the data: A4A isn’t the only route to exploring statistics about students on modules. A number of databases underlie the visualisations, and these can be accessed directly by specialist staff. Using our access rights, we have been experimenting with bespoke visualisations, not in the current suite, that we think could help those writing and designing modules. These are currently prototypes but show some promise:

[Figure: screenshot of the patchwork quilt visualisation. Sections of the module are arranged in columns; rows at the top represent individual students, showing sections they have visited at least once. At the bottom, these individual visits are collated to show percentage access to each element for various groups: withdrawers at the top, still-registered students below, and low socio-economic status (SES) students below that.]

In this visualisation, individual students are shown one per row at the top.  If they have accessed any element of the course (one section per column) the corresponding cell is blue.  If they have never accessed it, it’s shown white.  At the bottom, students are grouped (e.g. ‘withdrawers’ and ‘registered’ – not withdrawn) and cells are now coloured with hot colours showing low usage and cool colours showing high usage.
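The structure described above can be sketched in a few lines. This is a minimal illustration using invented data, not the actual A4A pipeline: the top of the quilt is a binary visited/not-visited matrix (one row per student, one column per section), and the bottom collates those rows into percentage access per group:

```python
# Hypothetical data: each student maps to the set of sections visited at least once.
visits = {
    "s1": {1, 2, 3, 4},
    "s2": {1, 2, 4},
    "s3": {1},
}
groups = {"withdrawers": ["s3"], "registered": ["s1", "s2"]}
sections = [1, 2, 3, 4]

# Top of the quilt: one binary row per student (1 = visited, 0 = never accessed).
rows = {s: [1 if sec in visits[s] else 0 for sec in sections] for s in visits}

# Bottom of the quilt: percentage of each group that visited each section.
def group_percentages(members):
    return [round(100 * sum(sec in visits[m] for m in members) / len(members))
            for sec in sections]

summary = {g: group_percentages(members) for g, members in groups.items()}
print(rows["s2"])             # [1, 1, 0, 1]
print(summary["registered"])  # [100, 100, 50, 100]
print(summary["withdrawers"]) # [100, 0, 0, 0]
```

Rendering `rows` as blue/white cells and `summary` on a hot-to-cool colour scale reproduces the quilt layout.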

Example interpretation: As an example of its use, the last column is the block assignment. It can clearly be seen that section 18 (second column from the right, expanded to the upper left) attracts a high percentage of students visiting it at least once. Section 17 (third from the right) attracts considerably fewer students, especially among withdrawers. This likely reflects the fact that section 18 is covered by the assignment while section 17 is not, so students are choosing to skip it. From a design point of view, should it be included at all?

More granularity: In our work investigating this graphic, we think it will become even more useful with improvements in granularity. At present we can only see that students have accessed a whole section; it would be much more useful to see how far they got within a section itself – did they give up halfway through? Improvements in the learning analytics the VLE records should help with this.

Next steps: This is a work in progress. We are already making the patchwork quilt visualisation more sophisticated, and we have plans for further experiments.

Richard Treves, Senior Learning Designer.

Carl Small, Analyst.

Analytics4Action: emphasis on the ACTION

It is no secret that we live in a world driven by data. The insight gained from analytics underpins success in almost every industry, from strategic consumer research in commercial sales to the in-play behavioural patterns of leading men and women in sports; a great deal of importance is afforded to the reporting of “The Stats” and the stories they tell.

Effective data reporting connects an audience with a subject, providing a level of understanding that would usually only come from experience. Having worked in analysis roles across multiple fields, this concept was not new to me; however, prior to joining The Open University I had never considered the sheer necessity for data within education.

Analytics – The Gift and The Curse

At The Open University we have a variety of tools that offer insight into different aspects of the institution. At any moment in time, Open University staff can use data to map out the journey of prospective students, or to see the number of students attaining their sought-after qualifications. It is extremely easy to spend hours fascinated, absorbing the wealth of numbers and eye-catching dashboards on offer, using them to comprehend exactly what is happening at The Open University. We readily accept the gift of this intricate and insightful data, but so what?

Data for data’s sake

Imagine running a small business. You have an efficient analyst on your team who provides you with a monthly report of all the data you need. You know the numbers inside out and could reproduce the graphs from memory with ease; there is no doubt that you are in tune with the business at an operational level. This, however, is only flint for the fire. A data table on its own will not drive change, nor will an aesthetically pleasing graph bring you more customers. The actions taken from the insight the data provides are the spark needed to create the fire of success.


This thought process is core to the Learning Design team at The Open University. The Analytics4Action programme ensures Open University staff have an understanding of students on live courses, during presentation. At a distance learning institution (in which face-to-face interaction between those learning and those teaching is severely reduced) this understanding is empowering. It allows decisions to be made that can positively influence the development of a course, which will in turn contribute to the development and success of our students – which, naturally, is the common goal for everyone here at The Open University.

In data we trust

In summary, we should always recognise the importance of accurate and well-presented data; however, what we should really focus on is the potential for change granted by accurate and well-presented data.