Hands-on support for module teams

Whilst we carry out fixed and clearly defined Learning Design activity as part of our core offer, we can also provide more hands-on input aimed at supporting module teams with specific challenges they are facing with their design. Below are three such examples where we’ve provided support to help module teams to progress either with a specific tool, or with broader aspects of Learning Design.

Guiding Authors

Chris Cox

For one FBL module, several challenges had been identified in the Learning Design Workshop in the early stages of development. I’d focused my attention on a few key areas to take forward into the LD Plan. However, it became clear that more guidance on writing and structuring the learning was needed. Plenty of guidance and resources were available on their workspace, but the team needed something more immediate.

In consultation with my colleagues, I decided to take TEL101 – a module designed to introduce staff to Learning Design and what we do – and adjust it for the team. The module was quite short, matching the shorter study weeks one of the modules needed, and was well structured to deliver learning while engaging students with different activity styles at a deep level. I took a copy of TEL101 and produced a ‘behind the scenes’ commentary version: each important LD point was highlighted and its design principles explained, to help the team think about how to write, balance activities and tie activities to learning outcomes, and to show good practical examples of LD.

I also produced a visualisation to give them a starting point and an approach to writing that they could come back to, and a chart coding activity types – a pedagogic map of the invisible, underlying structure of the module – to help them think more clearly about what each activity in a section was achieving to help students stay on course and succeed.

I presented these at a Module Team meeting, which the authors found helpful – and the visualisation helped things click. The LD team now have another tool – ‘LD101 Behind the Scenes’ – to help guide module teams in a similar position.

OneNote for PDP

Dot Coley

Soon after joining the team, while researching how OneNote could be used as an ePortfolio tool for one of my FASS modules, I joined forces with Sue Lowe to share knowledge and to support the WELS PDP pilot. The benefit was two-fold: drawing on existing experience to feed into my FASS work, and using my technical background to explore the impact of potential compatibility issues for the existing pilot. Part of this included joining our academic partners in delivering PDP support tutorials, not only assisting with immediate issues but also helping to keep the focus on who our students are and what challenges others may face in the future.

During this piece of work, another FASS academic colleague requested advice on using OneNote, and I took the lead on advising how it might work in practice. I used my previous research to present key information alongside a practical demonstration of the WELS pilot templates, showing how flexible OneNote is and how notebooks can be structured to meet individual needs. I also introduced the Curriculum Design Student Panel and explained how they can help with testing early concepts.

Explaining how the approach has been used elsewhere provided an opportunity to present evidence of what has worked so far and what challenges we have faced. I was also able to talk about how the WELS pilot ran alongside live modules – with a smaller group of students – ahead of implementing it more widely, and I used ICEBERG to emphasise the importance of embedding reflective activities into the learning journey and allowing for them within workload planning.

Linking the concept to use on a whole qualification, I stressed the importance of starting gradually in Level 1, scaffolding students in their reflection and reducing support throughout the duration of the qualification, so that by the end of Level 3 they can reflect independently – which is especially important for those moving on to study at postgraduate level.

Further work is continuing in WELS, and a new phase is now underway. I am supporting the academic team on a Teaching Excellence funded project aiming to train previous pilot students to become PDP coaches, offering peer-to-peer mentoring.

Helping a module team to get ‘unstuck’

Katharine Reedy and Mark Childs

A degree apprenticeship team was struggling to make progress with designing the ‘practice’ module at the start of the qualification. This was due to the sheer volume of content that students need to learn. By mapping skills content across both level 1 modules, and visualising the high-level student journey in terms of how they would engage with the material, an approach was found that enabled the team to come up with a workable structure for module ‘blocks’. This was achieved in a workshop facilitated by two Learning Design team members, which enabled the module team to think through the whole student learning journey and come to a consensus about what skills they need to develop and use at each stage.

Supporting success through assessment

Research in assessment practice points to an emphasis on summative assessment to test learner knowledge. However, this assessment strategy does not necessarily develop knowledge and skills. In fact, a strategic approach by learners of engaging only with knowledge that will apply directly to summative tasks encourages a shallow approach to learning. It is a strategy that can limit a learner’s ability to apply knowledge to a range of contexts or to engage with deeper-level tasks. What is important is feedback that allows learners to prepare for assessment by reflecting on it and creating personalised strategies for learning.

Formative assessment

Formative assessment enables this process: it creates dialogues that are exploratory and informative, and encourages learners to practise new methods of engaging with subject tasks (with guidance). Assessment that creates this territory is most effective, because it involves the learner in the process of learning and prompts them to consider how they can develop their responses accordingly. Engaging learners in the process of learning, and facilitating the development of skills and approaches before summative assessment occurs, is especially valuable because it produces learners who are best placed to respond to summative tasks and reach higher attainment levels. It is therefore no wonder that academic research points to a shift, or re-balancing, towards formative assessment in relation to summative, with wider use of formative assessment advocated (Gibbs, Hakim and Jessop, TESTA in 2014: A way of thinking about assessment and feedback, p. 22).

Gibbs is clear that formative feedback should be ‘useful and meaningful’ (Gibbs, 2010). As well as arguing that learners should receive plenty of opportunities to obtain such feedback, he points out that it should be ‘forward thinking’ (Gibbs, 2010), in that it directs learners to a similar task so that they can apply what has been learned. This ability of assessment to help learners look forward while developing awareness of their skills through dialogue is an essential component of learning. If this doesn’t occur at a formative stage, much of the meaning of developmental learning is lost. And without tasks to support it, the educational context can seem empty and uninterested in the development of learning potential.

Transparency

Transparency is also important when preparing learners for success. This is particularly the case with learning outcomes, which build a framework for students to experience subject knowledge and an educational space. It is important that learners are aware of how these operate and how they can use such a framework to test out their knowledge, discuss it and become more familiar with it, adapting their learning and sense of self within that framework. Higher Education Academy research has shown that a greater emphasis on written instruction does not necessarily facilitate this; rather, it is through an exchange of dialogue that understanding occurs (as an exchange of communication from tutor to student and from peer to peer). This has a bearing on the ways in which learners come to understand how they are to be judged and graded. If the gap between the production of tasks and the comprehension of grades and feedback is too wide, the connection to learning becomes unstable. It would seem, therefore, that emphasising subject knowledge alone in the production of learning misses the process of learning.

Designing assessment for learning

So, what does this mean for the design of learning? Tasks and activities that allow learners to reflect on learning outcomes, the framework in which they are to be judged and the ways in which tutors construct expectations of learners are fundamental. This needs to be understood by students, and appropriate strategies of learning should encompass it. As well as discursive arenas, building learner understanding of criteria through inclusion in peer assessment and self-assessment can be helpful. It is important that learners also have the chance to reflect on feedback and use it appropriately in a safe space before summative assessment occurs. Language used for these purposes needs to be transparent, understandable by learners and informative. Learning tasks should also give learners the opportunity to gain appropriate knowledge in an appropriate context. For example, discussing a concept can be a worthwhile independent activity, but the facilitation of group discussion can enhance both understanding and the practice of discussion itself: important for employability and transferable skills. Incorporating activities that promote reflection on thinking, awareness as a learner, tutor feedback, and subject and experiential perspectives helps to create a student-centred learning journey which can be used in both formative and summative assessment tasks.

Conclusion

Understanding the range of opportunities open to learners through the process of assessment brings considerable advantages to the learner journey and best prepares learners for success. The enhancement this generates is fundamental to the student experience. Not only does it promote wider engagement in learning and subject knowledge, but it also enables learners to prepare for employment by applying knowledge across a range of contexts and developing strategies to problem-solve effectively. As the Higher Education Academy report ‘A Marked Improvement’ makes clear:

Assessment design is influential in determining the quality and amount of learning achieved by students, and if we wish to improve student learning, improving assessment should be our starting point.

(A Marked Improvement, 2012, p. 10)


Gathering student perspectives: a case study of capturing student feedback

One of the key areas of growth in our activity over the past year or so has been gathering more student input into the design process. To this end, we have a number of core approaches to gathering student perspectives during design phases:

  • we administer the curriculum design student panel, which provides us with access to over 1500 OU students for rapid feedback
  • we support and advise on developmental testing, frequently working in partnership with colleagues in LTI Academic, editors and the module team
  • we advise module teams on use of Real-Time Student Feedback for gathering input during presentation from students

The examples below demonstrate how we go about deploying some of these approaches in our design work with module teams.

Student perspectives on aspects of module design at postgraduate level

Gill MacMillan (Senior Learning Designer)

In this example, we were able to use the Curriculum Design Student Panel to gather student feedback – during the design phase – on specific aspects of the structure of a postgraduate module:

  • Effectiveness of a weekly introductory slidecast/video, in which an academic author sets the scene and outlines the key focus and discussions to be covered in that week
  • Usefulness of a ‘Planning your week’ overview table – outlining the activities in the week ahead (timings, type of activity etc.) in order to help students plan their study time
  • Student preference between two alternative layouts for the study planner

A relevant cohort of students was identified, and while the number who actually responded was relatively small, the feedback was very consistent and gave a clear steer from students on all three aspects:

  • The majority of students found the introductory slidecast/video easy to use, informative and more effective than reading online text, and said they would start the week by looking at it in order to understand the aspects of the topic to be covered. (In addition, an issue was raised about the visibility of the transcript link when viewing the video on an iPad. This was fed back to the Learning Systems team who were already aware of the issue and have made improvements to the functionality).
  • All the students found the ‘Planning your week’ overview useful.
  • The majority expressed a clear preference for the second alternative layout.

In terms of the overall findings and impact, the Module Chair was happy with the results and the process, recognised the usefulness of getting this direct student feedback, and went on to implement the findings. So there was a clear impact from the panel’s input, and the Module Chair summarised this impact in a follow-up message to those students who had participated. Now that the module is live, Real-time Student Feedback is being used to get feedback from students as they study, enabling us to get further feedback on these, and other, aspects of the module design.

Testing innovative assessment

Yvonne Murphy (Learning Designer)

This testing was carried out as part of the developmental testing that the LTD team in LTI Academic lead on. We work in partnership with the LTD team initially, and then co-ordinate the input of D&P colleagues in the testing. LTI Academic liaise with the module team to establish how the testing will run, recruit the students and set up the evaluation. Learning Designers then support by making sure the content is set up in an authentic manner on the VLE, and work with D&P to get the materials ready and presented to an appropriate standard.

For this particular testing relating to activities for a new WELS module (E309), a key part of our role was the qualitative analysis of student feedback.

The developmental testing itself consisted of two student activities:

  • Creating a ‘digital badge’ using a PowerPoint template with embedded audio.
  • Creating an Infographic comparing two datasets using a PowerPoint template with embedded audio.

Both activities used Office365 and required students to work collaboratively in pairs. There were 11 student participants who completed the E309 activities and the associated questionnaire; some students gave additional feedback by phone.

This gave us a large amount of qualitative data from the student questionnaire, which was analysed and captured in a table summarising issues per participant per activity. Following the initial analysis, we contacted some of the students and arranged phone calls to explore their feedback more thoroughly.

Some of the identified risks and opportunities were as follows:

  • Risk/issue: Students using a personal Office365 account could not ‘share’ with a partner using an OU student account, and vice versa (the Office365 share function cannot share outside the organisation), so the ‘share’ function could not be used to collaborate.
    Opportunity/mitigation: Advise students to share via email and on module e-groups.
  • Risk/issue: PowerPoint had to be installed and the desktop version used, as the audio functionality is not available online.
    Opportunity/mitigation: Provide instructions to save the PowerPoint to the desktop rather than using the online version.
  • Risk/issue: Students were unsure how to record audio.
    Opportunity/mitigation: Provide additional resources: a one-page visual document on recording audio; a session in tutorial one around Office365 and PowerPoint; a Camtasia screencast on basic navigation.
  • Risk/issue: Students lack confidence in IT and PowerPoint skills.
    Opportunity/mitigation: Use the digital badge activity as a ‘practice’ (‘dummy’) TMA to prepare students for creating a poster in TMA01, and provide additional support and resources (as above).
  • Risk/issue: Students were unable to collaborate with their partner due to clashes in time commitment – one partner did most or all of the work.
    Opportunity/mitigation: Students produce individual infographics for assessment but collaborate on the preparation and research.

The module team implemented all of the recommendations arising from the opportunities documented above (except the one-page visual document, as it was felt that enough resource was provided by the screencast, tutorial, template and instructions).

Where is podcasting in higher education?

Podcast: An audio file made available for automatic download via web syndication. Typically available as a series, new instalments of which can be received by subscribers automatically.
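As a rough illustration of the mechanics behind that definition (not part of any real feed – the feed content, titles and URLs below are invented for the example), here is a minimal Python sketch of what a podcast ‘subscription’ does under the hood: fetch the feed’s RSS XML, look at each episode’s enclosure, and pick out any audio file not already downloaded.

```python
import xml.etree.ElementTree as ET

# A tiny, invented RSS 2.0 feed. Real podcast feeds follow the same shape:
# a <channel> containing <item> elements, each with an <enclosure> that
# points at the audio file.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Learning Podcast</title>
    <item>
      <title>Episode 1</title>
      <enclosure url="https://example.org/ep1.mp3" type="audio/mpeg" length="123456"/>
    </item>
    <item>
      <title>Episode 2</title>
      <enclosure url="https://example.org/ep2.mp3" type="audio/mpeg" length="234567"/>
    </item>
  </channel>
</rss>"""

def new_episodes(feed_xml, already_downloaded):
    """Return (title, audio URL) pairs for episodes not yet downloaded."""
    channel = ET.fromstring(feed_xml).find("channel")
    episodes = []
    for item in channel.findall("item"):
        title = item.findtext("title")
        enclosure = item.find("enclosure")
        if enclosure is not None:
            url = enclosure.get("url")
            if url not in already_downloaded:
                episodes.append((title, url))
    return episodes

# A podcast app simply repeats this check on a schedule: anything new in
# the feed since last time gets fetched automatically.
print(new_episodes(SAMPLE_FEED, {"https://example.org/ep1.mp3"}))
```

This ‘check the feed, fetch what’s new’ loop is all that ‘automatic download via web syndication’ amounts to, and it is why a podcast can reach listeners with no platform sitting in between.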

During the years following the coining of the term in 2004, the world started exploring the potential of podcasts. YouTube was an independent curiosity, MySpace was the social media site of choice, and domestic broadband was allowing increasing numbers of people to download and engage with media in new ways. As with most manifestations of the technological zeitgeist, universities hopped on board, and started eagerly exploring the educational potential of the new box of toys.

The result was that academic interest peaked around 2008, and then settled down as it became clear that listener numbers weren’t increasing at the same explosive rate as the rest of the internet. YouTube, Facebook and Twitter crashed into the space, and have justifiably taken up head-scratching time with regard to their application and impact on learning and teaching.

In the background, though, podcasting has quietly but steadily grown, and is once again tickling against the edges of university awareness. Several factors have contributed to this. Reports published by RAJAR (Radio Joint Audience Research) show that more than 11% of the UK population listen to a podcast each week. In the US, the numbers are even higher, with Edison Research’s Podcast Consumer report giving a staggering 26% of the population as monthly listeners.

source: https://www.edisonresearch.com/podcast-consumer-2018/

The steady rise can be at least partly attributed to the following trends:

Technology – Smartphones have passed from luxury to ubiquity, and have transformed the concept of on-the-move media consumption. More storage, cheap mobile data, and built-in podcast support on newer devices allow consumers to fill car journeys, gym sessions or the washing up with their on-demand entertainment of choice.

Directories – Apple Podcasts, accessed mostly through iTunes, has been a dominant force in podcast visibility over the last few years, and although its market share is shrinking (estimated to sit at around 50%, according to BluBrry’s blog) it is still the most comprehensive aggregator of podcast content. So much so, in fact, that nearly all of the other podcast aggregation services supplement their own directories with Apple’s. This counteracts the challenge of the decentralised and scattered infrastructure of podcast hosting services by pulling the disparate elements together into one accessible place.

User generated content – Many of us will have had our first encounter with user-generated content in the form of the shaky VHS footage in ITV’s ‘You’ve Been Framed’. Since then, YouTube, Facebook and Tumblr (to name but a few) have given everyone a platform to broadcast home-made content to the world – with cheap, easy-to-use cameras and tools letting them capture it. Broadcasting has moved from the domain of the few to something done unthinkingly by the many, each time we post a picture or video to Facebook or Twitter.

So with the environment having moved on, where does that leave education? Universities have hopped on the bandwagon to some extent, with Oxford University in particular publishing encouraging statistics on engagement with their own podcast content. This, however, like the majority of other universities’ output, consists mostly of recorded lectures, seminars, interviews and discussions. Indisputably valuable, but mostly existing as an alternative way of presenting ‘traditional’ content. The silver bullet of bespoke HE teaching through podcasts remains elusive.

The Open University and others have used the medium of online learning to create new ways of educating. Lynda.com and others leverage videos and screencasts to great effect for training and teaching. Codecademy offers guided simulations for programming and web development. Each of these emerging media has presented new opportunities and directions for teaching. What can podcasts offer?

Here in the Learning Design team, we’d like to give podcasts another look, to see what the unique advantages of straight-to-student syndication and casual consumption could enable in terms of new ways of teaching. It’s early days at the moment, but we’re looking forward to sharing what we find with you.