What students say about … click-and-reveal discussion activities

You may be familiar with click-and-reveal discussion activities. These are simple learning activities which ask students to consider a question and then click to reveal an answer, some further discussion or feedback. This type of question-and-feedback device can be seen in face-to-face classroom environments when a lecturer poses a question to the class and then provides verbal feedback on students’ responses. In distance learning delivered in print, questions can be posed within the text and answers provided at the end of the book, or feedback provided directly after the question. In an online virtual learning environment, students read a question onscreen, answer it (sometimes using a fill-in text box) and then click a button to reveal a discussion or feedback.

A click-and-reveal activity in the style of the Open University's VLE
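
For readers curious about the mechanics, the sketch below shows this interaction pattern as a minimal TypeScript snippet for a plain web page (not the OU VLE’s own authoring tools): a question, an optional free-text answer box, and a button that reveals the discussion once clicked. The question and feedback text are invented for illustration.

```typescript
// Minimal click-and-reveal activity sketch: a question, an optional free-text
// answer box, and a button that reveals the discussion/feedback once clicked.
// Illustrative only; a real VLE would use its own authoring tools.

interface RevealActivity {
  question: string;
  discussion: string; // the feedback shown after the click
}

function renderActivity(container: HTMLElement, activity: RevealActivity): void {
  const questionEl = document.createElement("p");
  questionEl.textContent = activity.question;

  const answerBox = document.createElement("textarea"); // optional fill-in box
  answerBox.placeholder = "Write your answer here before revealing the discussion…";

  const discussionEl = document.createElement("p");
  discussionEl.textContent = activity.discussion;
  discussionEl.hidden = true; // feedback stays hidden until the student clicks

  const revealButton = document.createElement("button");
  revealButton.textContent = "Reveal discussion";
  revealButton.addEventListener("click", () => {
    discussionEl.hidden = false; // show the feedback
    revealButton.disabled = true; // prevent repeated clicks
  });

  container.append(questionEl, answerBox, revealButton, discussionEl);
}

// Hypothetical example usage:
renderActivity(document.body, {
  question: "What factors might affect how students engage with online activities?",
  discussion: "You might have considered time pressure, confidence, and how the activity relates to assessment.",
});
```

The design point is simply that the discussion stays hidden until the student has had the chance to commit to an answer of their own.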

Some students enjoy the fast reward provided by click-and-reveal discussion activities, whilst others find them dissatisfying and low value. Could this divided opinion be due to the type of activity, or does it relate more to how the activities are designed? To understand more about how Open University (OU) students engage with click-and-reveal discussion activities and how well they aid their learning, we asked our Curriculum Design Student Panel about their experiences with this type of activity via a short survey. The panel comprises OU students whose experiences and opinions inform design decisions on OU modules.

What did students tell us?

The survey received 362 responses, and of those only 6% (22) reported never having encountered a click-and-reveal discussion activity in the OU modules they had studied. When asked how often they made an attempt to answer the question posed, 58% of students (210) responded ‘always’ or ‘often’.

Results of a survey about students' use of click-and-reveal activities

A detailed analysis of the data revealed that the most experienced students (those studying at Level 3 or postgraduate level) were less likely to engage with click-and-reveal discussion activities than Level 1 and 2 students. Faculty of Business and Law students also interacted with these activities much less than average, while Faculty of Wellbeing, Education and Language Studies students engaged with them much more than average.

The survey revealed that students’ attitudes to click-and-reveal discussion activities fall into four roughly equal categories: highly positive experiences; limited engagement; a feeling that the activities would benefit from modification; and negative experiences.

Students who had a highly positive experience found click-and-reveal discussion activities a useful means to actively engage with and reflect on the module content. They found the activities to be a low-risk way to test their knowledge, and valued the ‘reveal discussion’ part for presenting the material in an alternative way that aids understanding.

Students who engaged with the activities to a limited extent chose to constrain their engagement by only:

  • writing brief answers, for example as a bullet-point list
  • thinking about the answer but not actually writing anything down
  • reading the revealed answer without attempting the question at all
  • attempting the activity if it related to an assessment.

Those students who felt the activities would be more useful if modified said that the activities would work better if they could be assessed: either by making them multiple choice, by using AI to analyse free-text responses, or through peer evaluation (via tutor group forums or a ranking system). Some also mentioned that using the questions as a prompt to generate notes for revision was a positive aspect of the activities.

The activities prompted a negative experience for some students because they interrupted their flow and focus on the reading. Issues were also raised with the content of some activities: they sometimes provided answers that students did not feel matched what they had learned up to that point, were superficial in design or in the types of answers given (and so appeared to lack value), and could be demotivating when the discussion highlighted a lack of understanding.


Top tips for designing effective click-and-reveal discussion activities

  • Create authentic learning experiences

Avoid the tendency to use the click-and-reveal discussion simply as a means to present more reading under the guise of an activity, as this feels inauthentic to students. Instead, use these activities where they are an appropriate method to allow students to test and expand their understanding.

  • Use a variety of activity types

Don’t rely on click-and-reveal discussion activities as a predominant activity type. Use a variety of activities to keep students engaged and aid retention (Van Ameijde et al., 2015).

  • Use feedback to deepen learning and incentivise engagement

Feedback must:

  • relate to what students have been asked to do
  • be at an appropriate level
  • enable students to learn, even if they have not provided the expected answer.

For example, use epistemic ‘triple loop’ feedback where there is a need to develop independent learning (Kirschner and Neelen, 2018), but also use activities which provide less intensive forms of feedback, such as pattern-matching activities, multiple-choice questions or peer evaluation.

  • Challenge students but match activities to what they’ve learned

It is important to ensure that activities represent actual knowledge or skills students could have acquired at that point and are useful for the students’ learning.


Thanks

Thanks to Learning Design team friend Mark Childs for his contributions to this research and its outputs.

References

Van Ameijde, J., Cross, S. and Weller, M. (2015) ‘Designing for student retention: the ICEBERG model and key design tips’ [Online]. Available at https://blog.edtechie.net/learning-design/designing-for-retention-the-iceberg-model/ (Accessed 10 September 2020).

Kirschner, P. A. and Neelen, M. (2018) ‘No feedback, no learning’, 3-Star Learning Experiences [Online]. Available at https://3starlearningexperiences.wordpress.com/2018/06/05/no-feedback-no-learning/ (Accessed 10 September 2020).

Engagement with others online: students’ views of course design

As learning designers, it’s essential that we explore students’ needs and goals. That way, we can make sure that learning activities address these needs and support students to reach their goals. For example, each time we design a new module, we take time to explore student data and course teams’ experience to build up student profiles or personas that can be referred to throughout the module design process. 

It’s also important that we understand students’ experiences of their learning, such as the times and days they study, and how and whether they work with other students. Without this, we risk making assumptions that could cause students to struggle or even drop out. In 2018, our Learning Experience & Technology team conducted a series of focus groups with students at a range of distance learning institutions, including the OU. This post takes a look at what they learnt.

How do students view their courses? 

The focus groups involved only a small sample (22 students in total: eight from the OU and 14 from a range of other distance learning providers), but they did provide an opportunity to find out how some students see their courses when given the chance to talk about their experiences.

The six hours of student discussion provided far more data than made it into the videos. Mark Childs, a lecturer in microcredentials and technology enhanced learning at the OU’s Institute of Educational Technology, analysed the data to identify trends and other information that could inform our work.

What does face-to-face teaching mean? 

Mark analysed the students separately depending on whether they were under 30 or over 30 (30 was the median age), to look for differences in their experiences. Two stood out: 

  1. The younger students tended to have jobs with less flexibility, so the synchronous elements tended to be less accessible for them.
  2. The younger students used ‘face-to-face’ to describe videoconferencing, the logic being that you can see the other person’s face.

‘I’ve noticed this last one before, in a project I did at Loughborough University,’ Mark says. ‘I now try to drop the phrase “face-to-face”, as it’s obviously ambiguous, and use “co-located” – meaning physically present – instead. However, it’s difficult to stop the habit.’

To provide some focus to the analysis, Mark looked at how the students engaged with other people. He used qualitative analysis to review the transcripts, then identified themes and grouped similar ones together. What emerged was that the students saw their engagement with others on the course as being in three main scenarios: 

  • Synchronous one-to-ones with tutors 
  • Synchronous sessions with tutorial groups led by a tutor 
  • Asynchronous discussions with other students 

The students also talked about a fourth strand that they created for themselves. Only two of the 22 students said they had an opportunity to meet others on their course in the same space: one from the University of Sheffield and one from the OU. The others felt they missed out on this experience, so many of them talked about how they had created their own opportunities for doing this.

It’s important to note that students may not have been aware of the course design element. They could also have forgotten it, or just not mentioned it when asked. And of course, there’s no way the views of 22 students can represent all the students at those universities. However, it provides an interesting start for a more representative study. 

Course design is just the beginning 

Mark’s research also suggested that course design doesn’t end with what we provide to the student. Students take what we provide and re-create it to suit themselves. 

Examples of how other students had created their own co-located learning experiences were: 

  • Physical social learning spaces. Two of the younger non-OU students had visited Google Campus, which is a place in London where technology start-ups have an office space, but anyone can use the downstairs café. This meant that there were people around who were working and studying and could offer advice. It was also an atmosphere that they felt supported their learning, because it offered both social and quiet areas. 
  • Creating their own blend of spaces. The older non-OU students didn’t discuss Google Campus, but did talk about how they mixed libraries, cafés and their offices to create a blend of different types of spaces to work in, meeting their needs for both quiet-but-shared places and more social shared spaces.
  • Creating their own co-located peer groups. One of the younger OU students made up for her lack of co-located contact with others by identifying fellow students through the module’s Facebook group and arranging to meet up with them in a monthly study group.

However, the key element of all of these co-located learning spaces was that they were a) convenient and b) low cost. This was consistent across both age groups, and for both OU and non-OU students. In other words, meeting other learners in a physical space is something learners like to do, but it’s not considered worth it if they have to travel a long way or pay for it.

Creating connections 

One element that particularly needs further investigation is that some of these students who created their own co-located groups were also those who weren’t taking advantage of the online discussions with other students. Many of them stated that they would value co-located social learning because it enabled them to touch base with the course providers and to meet fellow students. This raises the question as to how these opportunities are presented in their interactions with teachers and students during their module study. 

So far, the study has raised some interesting areas for further investigation, and a potential framework from which to conduct this. It also reminds us of the value of understanding students’ perceptions: these students saw the structure and support for their courses in a very different way from that intended.

Useful resources 

Hidden learning spaces: What learning analytics cannot tell us [external link] 

Learning from one another: the value of students’ insights