Learning Analytics – Recent work at the OU

Garron Hillaire
Title: Prototyping Visual Learning Analytics Guided by an Educational Theory Informed Goal
Abstract: Prototyping can support the creation of data visualizations throughout the research and development process: paper prototypes produced by sketching, designed prototypes built with graphic design tools, and functional prototypes that explore how the implementation will work. One challenging aspect of data visualization work is coordinating the expertise of people in a variety of roles to produce data visualizations guided by an educational theory informed goal (ETIG) in order to better support research. Collaboration requires concessions: typically, everyone seeks to follow the best practices established within their own discipline. This paper illustrates how to rethink this interdisciplinary approach so that it adheres more strictly to educational research goals, and how we may at times need to break away from best practices, with the intent of evaluating the novel decisions that result. A case study of the creation of a self-reported emotional measure illustrates this type of collaboration. Taking this approach led to a clear departure from best practices in the scale selection for the visualization in order to support the ETIG.


Thomas Ullman
Title: Reflective Writing Analytics – Empirically Determined Keywords of Written Reflection
Abstract: Despite their importance for educational practice, reflective writings are still analysed and assessed manually, which constrains the use of this educational technique. Research has recently started to investigate automated approaches to analysing reflective writing. Foundational to many of these approaches is knowledge of the words that are characteristic of the genre. This research presents keywords that are specific to several categories of a reflective writing model. The keywords were derived from eight datasets containing several thousand instances, using the log-likelihood method. Both performance measures, accuracy and Cohen’s κ, were estimated with ten-fold cross-validation. The keywords reached an average accuracy of 0.78 across all eight categories and fair to good inter-rater reliability for most categories, even though the approach did not rely on sophisticated rule-based mechanisms or machine learning. This research contributes to the development of automated reflective writing analytics built on data-driven empirical foundations.
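The keyword extraction and evaluation mentioned in this abstract can be sketched in a few lines of Python. The snippet below is a minimal illustration of the general techniques named there (Dunning-style log-likelihood keyness and Cohen’s κ), not Ullman’s actual pipeline or data; the toy “reflective” and “descriptive” sentences, the function names, and the labels are invented for the example.

```python
import math
from collections import Counter

def log_likelihood_keywords(target_tokens, reference_tokens, top_n=10):
    """Rank words by Dunning's log-likelihood (G2) keyness: how much more
    frequent a word is in the target corpus than in a reference corpus."""
    target_counts = Counter(target_tokens)
    reference_counts = Counter(reference_tokens)
    c, d = len(target_tokens), len(reference_tokens)
    scores = {}
    for word, a in target_counts.items():
        b = reference_counts.get(word, 0)   # frequency in the reference corpus
        e1 = c * (a + b) / (c + d)          # expected frequency in the target
        e2 = d * (a + b) / (c + d)          # expected frequency in the reference
        g2 = 2 * (a * math.log(a / e1) + (b * math.log(b / e2) if b else 0))
        scores[word] = g2
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

def cohens_kappa(labels_a, labels_b):
    """Chance-corrected agreement between two sets of categorical labels,
    e.g. keyword-based predictions versus manual codes."""
    n = len(labels_a)
    observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    categories = set(labels_a) | set(labels_b)
    expected = sum((labels_a.count(cat) / n) * (labels_b.count(cat) / n)
                   for cat in categories)
    return (observed - expected) / (1 - expected)

# Toy sentences standing in for "reflective" and "descriptive" writing.
reflective = "i felt that i should have asked why i reacted this way".split()
descriptive = "the session covered the main topics listed in the handbook".split()
print(log_likelihood_keywords(reflective, descriptive, top_n=5))

# Agreement between hypothetical automatic and manual category labels.
predicted = ["reflective", "descriptive", "reflective", "descriptive"]
manual = ["reflective", "descriptive", "descriptive", "descriptive"]
print(cohens_kappa(predicted, manual))
```

In this toy run the first-person pronoun scores highest, which mirrors the intuition that reflective writing is marked by self-referential language.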


Quan Nguyen
Title: Unravelling the dynamics of instructional practice: A longitudinal study on learning design and VLE activities
Abstract: Learning analytics has the power to provide just-in-time support, especially when predictive analytics is married with the way teachers have designed their course, the so-called learning design. Although substantial progress has recently been made in aligning learning analytics with learning design, there is still a shortage of empirical evidence on how teachers actually design their courses and how this influences the way students learn. This study investigates how learning design is configured over time and how it affects student activities, analysing longitudinal data from 38 modules with a total of 43,099 registered students over 30 weeks at the Open University UK, using social network analysis and panel data analysis. Our analysis unpacked the dynamic configurations of learning design between modules over time, which allows teachers to reflect on their practice in order to anticipate problems and make informed interventions. Furthermore, by controlling for the heterogeneity between modules, our results indicated that learning design can explain up to 60% of the variability in students’ online activities, reinforcing the importance of pedagogical context in learning analytics.
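As a rough illustration of the “controlling for heterogeneity between modules” step, the sketch below applies a fixed-effects (within-module) estimator to invented panel data. Everything here is an assumption for illustration: the number of modules, the weekly learning design measure x, the VLE activity measure y, and the effect sizes are made up and do not reflect the study’s data or exact method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical panel: n_modules modules observed over n_weeks weeks.
# x = hours of planned learning design activity for the week,
# y = average student VLE activity in that week (both invented).
n_modules, n_weeks = 6, 30
module_effect = rng.normal(0, 5, n_modules)            # unobserved heterogeneity per module
x = rng.uniform(0, 10, (n_modules, n_weeks))
y = 2.0 * x + module_effect[:, None] + rng.normal(0, 1, (n_modules, n_weeks))

# Fixed-effects ("within") estimator: demean x and y per module so that
# stable between-module differences drop out of the regression.
x_w = x - x.mean(axis=1, keepdims=True)
y_w = y - y.mean(axis=1, keepdims=True)
beta = (x_w * y_w).sum() / (x_w ** 2).sum()            # slope of y on x within modules

# Share of within-module variability in y explained by the design measure.
residual = y_w - beta * x_w
r_squared = 1 - (residual ** 2).sum() / (y_w ** 2).sum()
print(f"within-module slope = {beta:.2f}, within R^2 = {r_squared:.2f}")
```

Demeaning each module’s series removes stable between-module differences, so the estimated slope and within-module R^2 reflect only how week-to-week changes in the design measure track week-to-week changes in activity.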


Thea Herodotou
Title: Implementing Predictive Learning Analytics on a Large Scale: The Teacher’s Perspective
Abstract: We describe a large-scale study of the use of predictive learning analytics data by 240 teachers across 10 modules at a distance learning higher education institution. The aim of the study was to illuminate teachers’ uses and practices around predictive data, in particular to identify how predictive data were used to support students at risk of not completing or failing a module. Data were collected from statistical analysis of 17,033 students’ performance at the end of the intervention, teacher usage statistics, and five individual semi-structured interviews with teachers. Findings revealed that teachers endorse the use of predictive data to support their practice, albeit in diverse ways, and highlighted the need to devise appropriate intervention strategies to support students at risk.
