The art of conversation: why collaboration matters in online learning 

If you’ve ever witnessed an awkward role play exercise in a training session, you may dread the idea of collaborative learning activities. The good news is that when it comes to online learning, you can plan and manage collaborative activities to ensure that nobody’s embarrassed and everyone benefits – possibly in ways they hadn’t anticipated.  

You may have already designed some online collaborative activities. If you found it tricky, you’re not alone. They can feel contrived or lacking in value, and you may have worried that they’ll distract students or take up too much of their time. Students can be wary of them too, especially if the activities seem bolted on rather than built in, or if they look intimidating. So, in this blog post, we look at why collaboration matters. We also provide some examples you can use as a basis for building collaborative activities into your online learning, and you can download our collaborative activities guide for more ideas.

What students say about … click-and-reveal discussion activities

You may be familiar with click-and-reveal discussion activities. These are simple learning activities that ask students to consider a question and then click to reveal an answer, some further discussion or feedback. This type of question-and-feedback device can be seen in face-to-face classroom environments, when a lecturer poses a question to the class and then provides verbal feedback on students’ responses. In distance learning delivered in print, questions can be posed within the text and answers provided at the end of the book, or feedback provided directly after the question. In an online virtual learning environment, students read a question onscreen, answer it (sometimes using a fill-in text box) and then click a button to reveal a discussion or feedback.

A click-and-reveal activity in the style of the Open University's VLE
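
To make the interaction pattern concrete, here is a minimal, hypothetical sketch in TypeScript of a click-and-reveal activity on a plain web page. This is not OU VLE code, and the function name createClickAndReveal and the element id 'activity' are invented for the illustration; it simply shows the three parts described above: a question, an optional answer box and a button that reveals the discussion.

```typescript
// Hypothetical sketch of a click-and-reveal activity (not OU VLE code).
// The question and answer box are always visible; the discussion stays
// hidden until the student clicks the reveal button.

function createClickAndReveal(
  container: HTMLElement,
  question: string,
  discussion: string
): void {
  const questionPara = document.createElement("p");
  questionPara.textContent = question;

  // Optional free-text answer box: students can type, jot bullet points, or skip it.
  const answerBox = document.createElement("textarea");
  answerBox.placeholder = "Type your answer here (optional)…";

  const revealButton = document.createElement("button");
  revealButton.textContent = "Reveal discussion";

  const discussionPara = document.createElement("p");
  discussionPara.textContent = discussion;
  discussionPara.hidden = true; // hidden until the student chooses to reveal it

  revealButton.addEventListener("click", () => {
    discussionPara.hidden = false;
    revealButton.disabled = true;
  });

  container.append(questionPara, answerBox, revealButton, discussionPara);
}

// Example usage on a page containing <div id="activity"></div>
const target = document.getElementById("activity");
if (target) {
  createClickAndReveal(
    target,
    "What are the main barriers to engagement in online collaborative activities?",
    "Common barriers include activities that feel bolted on rather than built in, time pressure, and wariness about working with unfamiliar peers."
  );
}
```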

Some students enjoy the fast reward provided by click-and-reveal discussion activities, whilst others find them dissatisfying and low value. Is this divided opinion due to the type of activity itself, or does it relate more to how the activities are designed? To understand more about how Open University (OU) students engage with click-and-reveal discussion activities and how well the activities aid their learning, we asked our Curriculum Design Student Panel about their experiences via a short survey. The panel comprises OU students whose experiences and opinions inform design decisions on OU modules.

What did students tell us?

We received 362 responses to the survey, and of those only 6% (22) reported never having encountered a click-and-reveal discussion activity in the OU modules they had studied. When asked how often they attempted to answer the question posed, 58% of students (210) responded ‘always’ or ‘often’.

Results of a survey about students' use of click-and-reveal activities

A detailed analysis of the data revealed that the most experienced students (those studying at Level 3 or postgraduate level) were less likely to engage with click-and-reveal discussion activities than Level 1 and 2 students. Also, Faculty of Business and Law students interacted with these activities much less than average, and Faculty of Wellbeing, Education and Language Studies students engaged with them much more than average.

The survey revealed that students’ attitudes to click-and-reveal discussion activities fall into four roughly equal categories: highly positive experiences; limited engagement; a feeling that the activities would benefit from modification; and negative experiences.

Students who had a highly positive experience found click-and-reveal discussion activities a useful means to actively engage with and reflect on the module content. They saw the activities as a low-risk way to test their knowledge, and valued the fact that the revealed discussion presents the material in an alternative way that aids understanding.

Students who engaged to a limited extent constrained their engagement by only:

  • writing brief answers, for example as a bulleted list
  • thinking about the answer but not actually writing anything down
  • reading the revealed answer without attempting the question at all
  • attempting the activity if it related to an assessment.

Students who felt the activities would be more useful if modified said they would work better if they could be assessed: by making them multiple choice, by using AI to analyse free-text responses, or through peer evaluation via tutor group forums or a ranking system. Some also mentioned that using the questions as a prompt to generate revision notes was a positive aspect of the activities.

The activities prompted a negative experience for some students because they interrupted their flow and focus on the reading. Students also raised issues with the content of some activities: the revealed answers sometimes did not match what they felt they had learned up to that point; some activities were superficial in design or in the answers given, and so appeared to lack value; and the discussion could be demotivating when it highlighted a lack of understanding.


Top tips for designing effective click-and-reveal discussion activities

  • Create authentic learning experiences

Avoid the tendency to use click-and-reveal discussions simply as a means to present more reading in the guise of an activity, as this feels inauthentic to students. Instead, use them where they are an appropriate way for students to test and expand their understanding.

  • Use a variety of activity types

Don’t rely on click-and-reveal discussion activities as a predominant activity type. Use a variety of activities to keep students engaged and aid retention (Van Ameijde et al., 2015).

  • Use feedback to deepen learning and incentivise engagement

Feedback must:

  • relate to what students have been asked to do
  • be at an appropriate level
  • enable students to learn, even if they have not provided the expected answer.

For example, use epistemic ‘triple loop’ feedback where there is a need to develop independent learning (Kirschner and Neelen, 2018), but also use activities that provide less intensive forms of feedback, such as pattern-matching activities, multiple-choice questions or peer evaluation.

  • Challenge students but match activities to what they’ve learned

It is important to ensure that activities represent actual knowledge or skills students could have acquired at that point and are useful for the students’ learning.


Thanks

Thanks to Learning Design team friend Mark Childs for his contributions to this research and its outputs.

References

Van Ameijde, J., Cross, S. and Weller, M. (2015) Designing for student retention: The ICEBERG Model and Key Design Tips [Online]. Available at https://blog.edtechie.net/learning-design/designing-for-retention-the-iceberg-model/ (Accessed 10 September 2020).

Kirschner, P. A. and Neelen, M. (2018) ‘No feedback, no learning’, 3-Star Learning Experiences [Online]. Available at https://3starlearningexperiences.wordpress.com/2018/06/05/no-feedback-no-learning/ (Accessed 10 September 2020).

Practical tools for developing student-centred learning

Over the last decade, The Open University has developed its approach to designing and evaluating student-centred learning through a wealth of research and scholarship. The basis for much of what we do derives from the tools developed through the OU Learning Design Initiative.

Since this project ended, we have not stood still, but have continued to iterate on and adapt the set of tools and resources for facilitating and evaluating the design of courses (see Rienties et al., 2017, for a summary of learning design and learning analytics from 2007 to 2017). Our work includes feedback from academic and professional colleagues as well as our own experience of what works in practice. However, it has students at its heart. In this post, we’ll outline the tools we’ve developed to help us focus on students’ needs and provide links and guidance on how you can use them. All of the resources are available to download.

Starting with student needs

In a recent blog post, we shared some of our resources for developing online learning. These are licensed under Creative Commons to enable anyone to freely use or adapt them. But what makes a good tool? The authors of the OULDI-Jisc Project Evaluation report commented that “… to change practice, a tool must challenge rather than help replicate or consolidate existing practice” (Cross et al., 2012, p. 80). So, part of the reason for using tools in workshops and meetings is to facilitate new insights.

For us, the learning design process starts with creating learner profiles or personas using this or a similar student profile template. Profiles are based on data and evidence, and ensure the material being designed caters for the needs of those who will be studying it. This may throw up some design challenges in terms of accessibility or skills needed for learners to succeed. It’s important to shine a light on these from the start so you can address them during the design process.

The activity types framework is a tried and tested way of prompting teams to think about designing active learning experiences for students. Our activity types cards take different learning activity types and suggest what practical activities students might do and what tools or technology may best enable the learning. We use them in module design workshops to stimulate ideas on how to engage learners actively with subject content.

Planning a piece of learning

There are two other resources you can use to help you plan courses:

  • The activity planner template is designed to prompt you to think about constructive alignment and active learning. We use it to map out a piece of learning topic by topic or week by week. It can be printed out or updated online using the editable PDF.
  • The OU general course planner spreadsheet was designed by Lawrence Kizilkaya, Product Development Manager in the Open University’s Learning Innovation team, and is particularly suitable for planning MOOC content. When planning each week of the course, we can see how activity types are distributed, with an automatic calculation of how much time is planned for each. Best of all, you don’t need to be an Excel expert to use it! There’s also an optional ‘student journey planner’ to consider questions such as how content and experience will be presented, how learners will communicate and collaborate, how they will be guided and supported through the course, and how they will reflect on and monitor their progress.

Using our resources

Here’s a suggested way of using these resources to plan a piece of learning – either in a workshop or individually.

STEP 1

  • Download and (if desired) print out the activity types cards.
  • Download a copy of the activity planner template or the OU general course planner.
  • Consider the overall aim and learning outcomes for the course or module. What are its key features? What are the needs of learners? (Refer to any student profiles you’ve created).
  • Reflect on the course. How long is it, what needs to be covered, and what proportion of time would you like students to spend on different learning activities?

STEP 2

  • Break your module into meaningful chunks, such as key concepts that you want students to learn, units of time, or thematic blocks.
  • Write headings for these in the left-hand column of the planner template you’re using.
  • Make a note of what learning outcomes and skills will be taught or developed through the activities.
  • Consider what students will do to demonstrate their learning. Note your ideas in the assessment column. Check that these ideas will help students meet the learning outcomes.
  • Use the activity types cards to think about what you want students to do, how much time they will need to do it, and what tools and resources will be needed. For example, how will collaborative activities work, how much time will students need to carry out their own research, what opportunities will you build in for students to apply their learning to real-life situations?
  • Begin to sketch out the activities you might use to teach each of these concepts (if using a hard copy of the activity planner template, turn it over if you need more space). Describe what you’ll ask students to do and make a quick note of any tools or resources you might use. Make a note of approximate timings too.

This structured planning process focuses on the student journey and keeps the needs of learners at the centre of your learning design. It also means you’ll end up with a visual resource that can be shared with others to get a common understanding of the design plan, and to act as a reference point.

We recommend involving students and staff in the process of reviewing and testing designs. Data such as learning analytics can also provide you with an insight into students’ behaviour and help you evaluate your design at a later stage.

Thanks

We would like to acknowledge the contribution of Ryan Green (Art Worker) in producing the activity planner template and activity types cards.

References

Cross, S., Galley, R., Brasher, A. and Weller, M. (2012) OULDI-JISC Project Evaluation Report: the impact of new curriculum design tools and approaches on institutional process and design cultures. OULDI Project [Online]. Available at http://oro.open.ac.uk/34140/1/ (Accessed 10 September 2020).

Rienties, B., Nguyen, Q., Holmes, W. and Reedy, K. (2017) ‘A review of ten years of implementation and research in aligning learning design with learning analytics at the Open University UK’, Interaction Design and Architecture(s), no. 33, pp. 134–154 [Online]. Available at http://oro.open.ac.uk/51188/ (Accessed 10 September 2020).

Engagement with others online: students’ views of course design

As learning designers, it’s essential that we explore students’ needs and goals. That way, we can make sure that learning activities address these needs and support students to reach their goals. For example, each time we design a new module, we take time to explore student data and course teams’ experience to build up student profiles or personas that can be referred to throughout the module design process. 

It’s also important that we understand students’ experiences of their learning, such as the times and days they study, and how and if they work with other students. Without this, we risk making assumptions that could cause students to struggle or even drop out. In 2018, our Learning Experience & Technology team conducted a series of focus groups with students at a range of distance learning institutions, including the OU. This post takes a look at what they learnt.

How do students view their courses? 

The focus groups involved only a small sample (22 students in total: eight from the OU and 14 from a range of other distance learning providers), but they did provide an opportunity to find out how some students see their courses when given the chance to talk about their experiences.

The six hours of student discussion provided a lot of data that didn’t make it into the videos. Mark Childs, a lecturer in microcredentials and technology enhanced learning at the OU’s Institute of Educational Technology, analysed the data to identify trends and other information that could inform our work.

What does face-to-face teaching mean? 

Mark analysed the students in two groups, those under 30 and those over 30 (30 was the median age), to look for differences in their experiences. Two differences stood out:

  1. The younger students tended to have jobs with less flexibility, so the synchronous elements tended to be less accessible to them.
  2. The younger students used ‘face-to-face’ to describe videoconferencing, the logic being that you can see the other person’s face.

‘I’ve noticed this last one before, in a project I did at Loughborough University,’ Mark says. ‘I now try and drop using the phrase “face-to-face” as it’s obviously ambiguous, and use “co-located” – meaning physically present – instead. However, it’s difficult to stop the habit.’

To provide some focus to the analysis, Mark looked at how the students engaged with other people. He used qualitative analysis to review the transcripts, then identified themes and grouped similar ones together. What emerged was that the students saw their engagement with others on the course as being in three main scenarios: 

  • Synchronous one-to-ones with tutors 
  • Synchronous sessions with tutorial groups led by a tutor 
  • Asynchronous discussions with other students 

The students also talked about a fourth strand that they created for themselves. Only two of the 22 students said they had an opportunity to meet others on their course in the same space: these were one from the University of Sheffield and one from the OU. The others felt they missed out on this experience, so many of them talked about how they had created their own opportunities for doing this.  

It’s important to note that students may not have been aware of the course design element. They could also have forgotten it, or just not mentioned it when asked. And of course, there’s no way the views of 22 students can represent all the students at those universities. However, it provides an interesting start for a more representative study. 

Course design is just the beginning 

Mark’s research also suggested that course design doesn’t end with what we provide to the student. Students take what we provide and re-create it to suit themselves. 

Examples of how other students had created their own co-located learning experiences were: 

  • Physical social learning spaces. Two of the younger non-OU students had visited Google Campus, which is a place in London where technology start-ups have an office space, but anyone can use the downstairs café. This meant that there were people around who were working and studying and could offer advice. It was also an atmosphere that they felt supported their learning, because it offered both social and quiet areas. 
  • Creating their own blend of spaces. The older non-OU students didn’t discuss Google Campus but did talk about how they mixed libraries, cafes and their offices to create a blend of different types of spaces to work in to meet their needs for quiet but shared places and more social shared spaces.  
  • Creating their own co-located peer groups. One of the younger OU students filled in for her lack of co-located contact with others by identifying others through the module’s Facebook group and arranging to meet up with them in a monthly study group. 

However, the key element of all of these co-located learning spaces was that they were a) convenient and b) low cost. This was consistent across both age groups, and for both OU and non-OU students. In other words, meeting other learners in a physical space is something learners like to do, but it’s not considered worth it if they have to travel a long way or pay for it.

Creating connections 

One element that particularly needs further investigation is that some of the students who created their own co-located groups were also those who weren’t taking advantage of the online discussions with other students. Many of them stated that they would value co-located social learning because it would enable them to touch base with the course providers and to meet fellow students. This raises the question of how these opportunities are presented in their interactions with teachers and students during their module study.

So far, the study has raised some interesting areas for further investigation, and a potential framework from which to conduct it. It also reminds us of the value of understanding students’ perceptions: these students saw the structure and support for their courses in a very different way from that intended.

Useful resources 

Hidden learning spaces: What learning analytics cannot tell us [external link] 

Learning from one another: the value of students’ insights