Supporting success through assessment

Research in assessment practice points to a prevailing emphasis on summative assessment as a means of testing learner knowledge. However, this strategy does not necessarily develop knowledge and skills. In fact, when learners take a strategic approach and engage only with the knowledge that will apply directly to summative tasks, the result is a shallow approach to learning. Such a strategy can limit a learner’s ability to apply knowledge across a range of contexts or to engage with deeper-level tasks. What matters is feedback that allows learners to prepare for assessment by reflecting on it and creating personalised strategies for learning.

Formative assessment

Formative assessment enables this process: it creates dialogues that are exploratory and informative, and encourages learners to practise, with guidance, new ways of engaging with subject tasks. Assessment that opens up this territory is most effective because it involves the learner in the process of learning and prompts them to consider how to develop their responses accordingly. Engaging learners in the process of learning, and helping them develop skills and approaches before summative assessment occurs, is invaluable: it produces learners who are best placed to respond to summative tasks and reach higher attainment levels. It is no wonder, then, that academic research points to a shift, or rebalancing, between formative and summative assessment, with wider use of formative assessment advocated (Gibbs, Hakim and Jessop, ‘TESTA in 2014: A way of thinking about assessment and feedback’, p. 22).

Gibbs is clear that formative feedback should be ‘useful and meaningful’ (Gibbs, 2010). As well as arguing that learners should receive plenty of opportunities to obtain such feedback, he points out that it should be ‘forward thinking’ (Gibbs, 2010), in that it directs learners to a similar task so that they can apply what has been learned. This ability of assessment to help learners look forward while developing awareness of their skills through dialogue is an essential component of learning. If this does not occur at the formative stage, much of the meaning of developmental learning is lost, and without tasks to support it the educational context can seem empty and uninterested in developing learning potential.


Transparency is also important when preparing learners for success. This is particularly the case with learning outcomes, which build a framework through which students experience subject knowledge and an educational space. Learners need to be aware of how these outcomes operate and how they can use the framework to test out their knowledge, discuss it and become more familiar with it, adapting their learning and sense of self within it. Higher Education Academy research has shown that a greater emphasis on written instruction does not necessarily achieve this; rather, understanding develops through an exchange of dialogue, from tutor to student and from peer to peer. This has a bearing on how learners come to understand the ways in which they will be judged and graded. If the space between producing tasks and comprehending grades and feedback is too unclear, the connection to learning becomes unstable. Emphasising subject knowledge only in the products of learning, it would seem, misses the process of learning.

Designing assessment for learning

So, what does this mean for the design of learning? Tasks and activities that allow learners to reflect on learning outcomes, on the framework by which they will be judged and on the ways in which tutors construct expectations of them are fundamental. Students need to understand this, and appropriate strategies of learning should encompass it. As well as discursive arenas, involving learners in peer assessment and self-assessment can help them understand criteria. It is also important that learners have the chance to reflect on feedback and use it appropriately in a safe space before summative assessment occurs. The language used for these purposes needs to be transparent, understandable by learners and informative. Learning tasks should also give learners the opportunity to gain appropriate knowledge in an appropriate context. For example, discussing a concept can be a worthwhile independent activity, but facilitated group discussion can enhance both understanding and the practice of discussion itself: important for employability and transferable skills. Incorporating activities that promote reflection on thinking, self-awareness as a learner, tutor feedback, and subject and experiential perspectives helps to create a student-centred learning journey which can be used in both formative and summative assessment tasks.


By understanding the range of opportunities that the process of assessment opens up for learners, we can shape a learner journey that best prepares them for success. The enhancement this generates is fundamental to the student experience. Not only does it promote wider engagement in learning and subject knowledge, it also enables learners to prepare for employment by applying knowledge across a range of contexts and developing strategies to solve problems effectively. As the Higher Education Academy report ‘A Marked Improvement’ makes clear:

Assessment design is influential in determining the quality and amount of learning achieved by students, and if we wish to improve student learning, improving assessment should be our starting point.

(A Marked Improvement, 2012, p. 10)



Where is podcasting in higher education?

Podcast: an audio file made available for automatic download via web syndication. Typically released as a series, new instalments of which subscribers can receive automatically.
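The ‘web syndication’ in this definition usually means an RSS feed: a podcast app subscribes to a feed URL and discovers new episodes from the audio files listed in it. As a minimal sketch (the feed content and URLs below are invented for illustration), episodes can be pulled out of such a feed with Python’s standard library:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical podcast RSS feed. Each <item> is an episode;
# the <enclosure> element carries the downloadable audio file.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Learning Podcast</title>
    <item>
      <title>Episode 1: Assessment</title>
      <enclosure url="https://example.org/ep1.mp3" type="audio/mpeg" length="123"/>
    </item>
    <item>
      <title>Episode 2: Feedback</title>
      <enclosure url="https://example.org/ep2.mp3" type="audio/mpeg" length="456"/>
    </item>
  </channel>
</rss>"""

def list_episodes(feed_xml: str):
    """Return (title, audio_url) pairs for each episode in the feed."""
    channel = ET.fromstring(feed_xml).find("channel")
    episodes = []
    for item in channel.findall("item"):
        title = item.findtext("title")
        enclosure = item.find("enclosure")
        episodes.append((title, enclosure.get("url")))
    return episodes

for title, url in list_episodes(FEED):
    print(f"{title} -> {url}")
```

A real podcast client does essentially this on a schedule: re-fetch the feed, compare against episodes already seen, and download any new enclosures automatically.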

During the years following the coining of the term in 2004, the world started exploring the potential of podcasts. YouTube was an independent curiosity, MySpace was the social media site of choice, and domestic broadband was allowing increasing numbers of people to download and engage with media in new ways. As with most manifestations of the technological zeitgeist, universities hopped on board, and started eagerly exploring the educational potential of the new box of toys.

The result was that academic interest peaked around 2008, then settled down as it became clear that listener numbers weren’t increasing at the same explosive rate as the rest of the internet. YouTube, Facebook and Twitter crashed into the space, and have justifiably taken up head-scratching time with regard to their application and impact on learning and teaching.

In the background, though, podcasting has quietly but steadily grown, and is once again tickling at the edges of university awareness. Several factors have contributed to this. Reports published by RAJAR (Radio Joint Audience Research) show that more than 11% of the UK population listen to a podcast each week. In the US, the numbers are even higher, with Edison Research’s Podcast Consumer report giving a staggering 26% of the population as monthly listeners.


The steady rise can be at least partly attributed to the following trends:

Technology – Smartphones have passed from luxury to ubiquity, and have transformed the concept of on-the-move media consumption. More storage, cheap mobile data, and built-in podcast support on newer devices allow consumers to fill car journeys, gym sessions or the washing up with their on-demand entertainment of choice.

Directories – Apple Podcasts, accessed mostly through iTunes, has been a dominant force in podcast visibility over the last few years, and although its market share is shrinking (estimated to currently sit at around 50% according to BluBrry’s blog) it is still the most comprehensive aggregator of podcast content. So much so, in fact, that nearly all of the other podcast aggregation services supplement their own directories with Apple’s. This counteracts the challenge of the decentralised and scattered infrastructure of podcast hosting services by pulling the disparate elements together into one accessible place.

User generated content – Many of us will have had our first encounter with user generated content in the form of the shaky VHS footage in ITV’s ‘You’ve Been Framed’. Since then, YouTube, Facebook and Tumblr (to name but a few) have given everyone a platform to broadcast home-made content to the world – with cheap, easy-to-use cameras and tools letting them capture it. Broadcasting has moved from the domain of the few to something done unthinkingly by the many, each time we post a picture or video to Facebook or Twitter.

So, with the environment having moved on, where does that leave education? Universities have hopped on the bandwagon to some extent, with Oxford University in particular publishing encouraging statistics on engagement with its own podcast content. This output, however, like that of the majority of other universities, consists mostly of recorded lectures, seminars, interviews and discussions: indisputably valuable, but mostly an alternative way of presenting ‘traditional’ content. The silver bullet of bespoke HE teaching through podcasts remains elusive.

The Open University and others have used the medium of online learning to create new ways of educating; others leverage videos and screencasts to great effect for training and teaching. Codecademy offers guided simulations for programming and web development. Each of these emerging media has presented new opportunities and directions for teaching. What can podcasts offer?

Here in the Learning Design team, we’d like to give podcasts another look, to see what new ways of teaching the unique advantages of straight-to-student syndication and casual consumption might enable. It’s early days at the moment, but we’re looking forward to sharing what we find with you.

Perfect partners: how we work with others to support learning design

Over the last year I’ve been reflecting on how much the success of our work depends on effective partnerships with a range of people – within and external to the Open University. As Learning Designers, it’s inherent in our role that we work collaboratively with curriculum teams to elicit and capture ideas for module creation. There are also many others – staff and students – with whom we form productive and fruitful relationships. All this is with a view to achieving the best possible outcomes for students.

In this blog post I have set out some of the ways that Learning Designers work in partnership with others, and what the impact is…

Defining partnership

According to Healey et al. (2014), “Partnership is a relationship where everyone involved is actively engaged in – and stands to benefit from – the process of learning and working together.” The key here is parity between partners, with all concerned having a stake in what happens. They go on to say that “Working and learning in partnership with students is a specific form of student engagement and is a way of doing things rather than an outcome in itself”.

Students as partners in curriculum design

I will start with students, since they are the whole reason why we do what we do. However, it’s only in the last couple of years that we have been able to engage students in the design process at an early stage, in the way that we would like to.

The Curriculum Design Student Panel was set up in 2016 to provide a means for staff involved in curriculum design and innovation to gain student input more easily. Over the two years of its existence, the panel has more than trebled in size, to around 1900 students. Recruited twice-yearly, panel members get the opportunity to take part in surveys, workshops and various kinds of usability and experience testing. The interaction is intended to mimic face-to-face student-teacher informal discussions. Activities are run via the panel’s We Learn workspace, through OU Live, and on occasion face-to-face. A Community Forum provides a space for panel members to get to know each other and share experiences.

Over the last year, students have been involved in around a dozen projects. For example, they have:

  • contributed their own ‘learner profiles’, which have fed into development of the OU’s new Online Student Experience student personas as well as being used in Learning Design workshops and academic staff development sessions
  • provided input on the terminology and vocabulary used in learning and teaching materials, which has fed into a glossary for OU staff to use when developing learning materials and module websites
  • shared their experience of informal learning through platforms such as OpenLearn and FutureLearn. This data is being used by colleagues in the Open Media and Informal Learning team to inform development
  • expressed preferences for additional formats of online module material, which will shape future decisions about what we make available, and ongoing development
  • taken part in the Jisc national research project exploring how students use digital technology to support their learning, the results of which – when available – will feed into decisions the OU makes about its digital environment.

The findings from all these pieces of research help us to understand the study habits and preferences of OU students better.

OU staff partnerships

I have already referred to our partnerships with OU faculties. Over the last year, we have:

  • run 15 Learning Design workshops
  • worked with around 100 other modules (at various stages of production) on the more detailed design that follows a workshop
  • supported over 40 module teams with use of data and analytics for their modules, as well as training around 130 people.

As part of this two-way relationship we have been able to get feedback on our workshop format and bring in improvements that should help academics with the design process. Our analytics work forms a key part of module design evaluation and is much valued by academic colleagues.

In addition, we work proactively with the Academic Professional Development team to develop and deliver an expanding portfolio of training, including sessions on:

  • Introduction to Learning Design
  • Selecting online tools
  • Learning analytics.

Within Learning & Teaching Innovation (LTI) we have many partners in module production – editorial staff, graphics media developers, interactive media developers, video and audio, online services and Library colleagues. When it comes to development, we work with the other TEL teams who are engaged in Learning Innovation, Online Student Experience and Learning Systems.

Our team supports the Enhanced Employability and Career Progression (EECP) project and has made input into the Employability Framework. Over the last year we have developed closer links with the Careers & Employability team – again, all with the student in mind, so that employability skills and attributes can be well integrated into the curriculum and the role of different experts clearly understood by those involved in module creation.

One group with which we are planning to develop closer working links during the coming year is the Associate Lecturer (AL) community. We will be taking opportunities to participate in AL Staff Development events, with a view to exploring AL perceptions of curriculum design as well as sharing the learning design approaches we use.

Partnerships in research

A rich body of research on learning design and learning analytics underpins our practice, and we maintain close connections with the IET Learning Analytics and Learning Design team, as shown by some of the joint scholarship and research which has been published over the last year or so, listed on the Learning Design publications page.

Learning Design is a key strand of the IDEAS project, an international collaboration between the University of South Africa and the Open University. As well as piloting OU Learning Design approaches with two UNISA module teams, three workshops have been delivered to staff at UNISA and at the University of Pretoria. There is potential to extend our work with African partners in future, as well as in other parts of the world.

External and commercial partnerships

The Learning Design team engages regularly with commercial opportunities. This can include running workshops and training for visiting groups (for example, from China). It also involves visiting other institutions and over the last year we shared Learning Design approaches with universities and education providers in the UK and overseas (for example, Denmark, Turkey, Spain).

Professional partnerships

We have active partnerships with colleagues at other institutions across the sector, including UCL, Northampton, Greenwich, Guildford, and the University College of Estate Management, as well as with Charles Sturt University (CSU) in Australia. OU Learning Designers are involved in organising the regular meetings of the Learning Design Cross-Institutional Network – a national and international special interest group that provides a stimulating forum to share Learning Design research and practice. We are also connected with the Association for Learning Technology (ALT), with good opportunities for networking and sharing practice.


It takes time and effort to nurture and maintain the connections that will lead to truly ground-breaking work. In my experience though, cross-boundary partnership working is where the magic really happens. Learning Designers are ideally placed to bring together different stakeholders in the design process, facilitate mutual learning, and generate creative and innovative solutions which benefit student learning. In fact, we are the perfect partners.


Healey, M., Flint, A. and Harrington, K. (2014) Engagement through partnership: students as partners in learning and teaching in higher education. Higher Education Academy, UK (accessed 29 August 2018).


Learning analytics: the patchwork quilt visualisation

In the Open University, we have developed a suite of Learning Analytics (LA) visualisations called ‘Action for Analytics’ (A4A: slides from a presentation giving more detail), designed to help those responsible for producing modules see the effects of their designs. For example, it’s possible to track how often the videos we produce for a module are watched, and therefore to see whether producing more would be a good investment.

Sample learning analytics graph of student resource use over time from an OU module, tracking how many students have accessed a particular section of the module in any week. Weeks run along the bottom; blue bars mark weeks containing assignments.

This has been very successful: our colleagues outside the Learning Design team (mostly academics) can track what is going on with their modules in real time, and can also see the effects of changes as they are brought in.

However, the tool is limited to a set of ‘baked in’ dashboards, so it’s not possible, for example, to split the above data into students who ended up failing the module and those who passed, and compare the two graphs. That could give useful insight into the value of individual parts of a module, and into whether students are accessing them at all.

Drilling down into the data: A4A isn’t the only route to exploring statistics about students on modules. A number of databases underlie the visualisations, and specialist staff can access these directly. Using our access rights, we have been experimenting with producing bespoke visualisations, not in the current suite, that we think could help those writing and designing modules. These are currently prototypes but show some promise:

Screenshot showing patchwork quilt visualisation
Patchwork quilt visualisation. Sections of the module are arranged in columns; rows at the top represent individual students, showing the sections they have visited at least once. At the bottom, these individual visits are collated to show the percentage access to each element for various groups: withdrawers at the top, still-registered students below, and low socio-economic status (SES) students below that.

In this visualisation, individual students are shown one per row at the top. If a student has accessed an element of the course (one section per column), the corresponding cell is blue; if they have never accessed it, the cell is white. At the bottom, students are grouped (e.g. ‘withdrawers’ and ‘registered’, i.e. not withdrawn), and cells are coloured with hot colours showing low usage and cool colours showing high usage.
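The mechanics behind the quilt can be sketched with a small synthetic example. The student IDs, section names, access flags and grouping below are all invented for illustration; the real data comes from the VLE databases mentioned above.

```python
# Sketch of the data behind a patchwork-quilt view, using invented data.
# access[s][c] is True if student s ever opened section c (a blue cell).
access = {
    "s1": [True, True, False, True],
    "s2": [True, False, False, True],
    "s3": [False, False, False, True],
    "s4": [True, True, True, True],
}
sections = ["Section 16", "Section 17", "Section 18", "Assignment"]
withdrawers = {"s2", "s3"}            # hypothetical grouping of students

def pct_access(group):
    """Percentage of students in `group` who visited each section at least once
    (one value per column, i.e. one collated cell in the quilt's lower rows)."""
    rows = [access[s] for s in group]
    return [round(100 * sum(col) / len(rows)) for col in zip(*rows)]

registered = set(access) - withdrawers
print(dict(zip(sections, pct_access(withdrawers))))
print(dict(zip(sections, pct_access(registered))))
```

The collated rows at the bottom of the quilt are just these per-group percentages, with a colour scale (hot for low, cool for high) applied to each value.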

Example interpretation: As an example of its use, the last column is the block assignment. It can clearly be seen that section 18 (second column from the right, expanded up to the left) attracts a high percentage of students visiting it at least once. Section 17 (third from the right) attracts considerably lower numbers, especially among withdrawers. This is likely because section 18 is included in the assignment and section 17 is not, so students are choosing to skip section 17. From a design point of view, should it be included at all?

More granularity: In our work investigating this graphic, we think it will become even more useful when the granularity improves; at present we can only see that students have accessed a whole section. It would be much more useful to see how far they got within a section itself: did they give up halfway through? Improvements in the learning analytics the VLE records should help with this.

Next steps: This is a work in progress; we are already making the patchwork quilt visualisation more sophisticated, and we have plans for other experiments.

Richard Treves, Senior Learning Designer.

Carl Small, Analyst.