
IET researcher and project officer blogs

Everything In Balance

Will Woods's blog - Mon, 17/04/2017 - 15:55

Martin Weller has written some excellent posts on disruption and disruptive innovation. In his most recent blog post, on disruption and the unenlightenment, he argues that “knowledge of any area itself is viewed as a reason not to trust someone.” I’ve come across this myself; more precisely, I’ve seen others place a higher value on knowledge that is unencumbered by context. In our own environment, for example, business acumen is valued more highly than knowledge of the higher education sector. This has been reflected over the past decade in job descriptions and recruitment processes in HE, and it also applies to politics, where Farage and Trump are seen to gain value from coming from outside the political system. Within higher education it has resulted in a rash of appointments of people from outside the sector to senior positions.

This is not necessarily a bad thing. I see the higher education sector as an ecosystem: too much inbreeding within too small a gene pool leads to stagnation and mutation – in HE this shows up as confirmation bias, since meetings with the same cohort provide no novel insight or new interpretation of the original plan. On the other hand, too much migration and churn leads to a different but equally serious problem, in which specialist knowledge is lost to the organisation and the sector, so decisions are not based on a full evaluation of the evidence. The past influences the future, so there is a balance to be struck. When you bring new people and talent into an organisation you create opportunities for cultural advancement and change. Ideas can move across domains in ways that make new things possible. People ask questions like “why can’t you do it like that?” and you realise that, because of past difficulties, you had mentally blocked off an opportunity.

As an example, I have had some of my richest conversations recently with Rosie Jones, the new Director of Library Services. During her induction we discussed using gaming approaches in the workplace to stimulate new thinking, as we both have backgrounds in serious gaming.

[Image: Animal Crossing]

I have now begun applying some of these approaches in events that I am facilitating for Leadership in Digital Innovation. I wouldn’t have been able to make the mental leap without her fresh perspective on some of the organisational issues, adopting what Dave Coplin might describe as non-linear thinking.

 

My point is that stimulation is a good thing, as it can build the conditions for a new system to emerge – but disruption by its nature means that, as Martin describes it, “there is no collaboration, working alongside, improving here”. It’s what Bernard Stiegler describes in his interview How to Survive “Disruption” as “a form of widespread dispossession of knowledge, of life skills and indeed of livelihood across Europe through the rapid political, social and technological changes to work and everyday life.”

Crucially for both education and politics we must seek to understand, value, and then challenge the current system in order to create the system we need.

 


LAK17: it’s a wrap

Scattered between my research presentations at LAK17 was my work as a member of the executive of the Society for Learning Analytics Research (SoLAR). The executive met daily during the conference – it is our only chance each year for face-to-face meetings. The LAK conferences also provide a venue for the society’s AGM and, despite the size of the room where it was held, it was standing room only for most of the meeting.

The executive also have a role to play in decisions about the conference itself, as well as acting as reviewers on the programme committee and chairs for the different sessions. Next year, at LAK18 in Sydney, I shall be taking on a bigger role, as one of the programme chairs for the conference.

The picture shows me with half the SoLAR Executive at the post-LAK17 review meeting.


Learning analytics community Europe

The European FP7-funded learning analytics community exchange (LACE) project came to an end last June. Since then, we have become a special interest group (SIG) of the Society for Learning Analytics Research (SoLAR) and we are now the learning analytics community Europe (LACE).

Although the loss of large-scale funding has meant scaling down our activities, we have still been active and our Twitter account reflects some of that work – including presentations on European learning analytics work in China, Japan and South Korea.

The LAK17 conference provided a chance for eight of the international team to get together and plan our next event, a workshop in our ethics and privacy in learning analytics series (EP4LA) that we are submitting to this year’s ECTEL conference.


LAK Failathon poster

Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.

We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.

Despite the low production quality of the poster (two sheets of flip-chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality obviously appealed to participants and we won the best poster award. :-)

Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.

 


Learning analytics: where is the evidence?

Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project’s Evidence Hub, we found that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.

The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.

Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.

Abstract

Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
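As a rough illustration of how the headline figure above is arrived at, here is a minimal sketch of tallying Evidence Hub entries by the polarity of their findings. The items shown are invented placeholders, not the project’s data; only the final arithmetic (7% of N=123) comes from the paper.

```python
# Hypothetical sketch: invented items, not the actual LACE Evidence Hub data.
from collections import Counter

# Each entry: (proposition the evidence addresses, polarity of the finding)
evidence = [
    ("supports learning", "positive"),
    ("supports teaching", "positive"),
    ("used ethically", "negative"),
    ("deployed widely", "neutral"),
    # ... the real Evidence Hub held 123 such mapped items
]

counts = Counter(polarity for _, polarity in evidence)
total = sum(counts.values())
for polarity, n in counts.most_common():
    print(f"{polarity}: {n}/{total} ({100 * n / total:.0f}%)")

# The paper's headline: 7% of N=123 items were negative, i.e. roughly
print(round(0.07 * 123), "negative items")  # -> 9
```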


LAK17: Failathon

Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.

We also took a consciously international approach, so the workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.

Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.

If you can’t access the workshop outline behind the paywall, contact me for a copy.

Abstract

The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.


LAK17: doctoral consortium

A very busy week in Vancouver at the LAK17 (learning analytics and knowledge) conference kicked off with the all-day doctoral consortium on 14 March (funded by SoLAR and the NSF). I joined Bodong Chen and Ani Aghababyan as an organiser this year and we enjoyed working with the ten talented doctoral students from across the world who gained a place in the consortium.

  1. Alexander Whitelock-Wainwright: Students’ intentions to use technology in their learning: The effects of internal and external conditions
  2. Alisa Acosta: The design of learning analytics to support a knowledge community and inquiry approach to secondary science
  3. Daniele Di Mitri: Digital learning shadow: digital projection, state estimation and cognitive inference for the learning self
  4. Danielle Hagood: Learning analytics in non-cognitive domains
  5. Justian Knobbout: Designing a learning analytics capabilities model
  6. Leif Nelson: The purpose of higher education in the discourse of learning analytics
  7. Quan Nguyen: Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
  8. Stijn Van Laer: Design guidelines for blended learning environments to support self-regulation: event sequence analysis for investigating learners’ self-regulatory behavior
  9. Tracie Farrell Frey: Seeking relevance: affordances of learning analytics for self-regulated learning
  10. Ye Xiong: Write-and-learn: promoting meaningful learning through concept map-based formative feedback on writing assignments

The intention of the doctoral consortium was to support and inspire doctoral students in their ongoing research efforts. The objectives were to:

  • Provide a setting for mutual feedback on participants’ current research and guidance on future research directions from a mentor panel
  • Create a forum for dialogue aimed at building capacity in the field with respect to current issues in learning analytics, ranging from methods of gathering analytics and interpreting them in relation to learning, through ethical considerations, to relaying the meaning of analytics so that it has an impact on teaching and learning
  • Develop a supportive, multidisciplinary community of learning analytics scholars
  • Foster a spirit of collaborative research across countries, institutions and disciplinary backgrounds
  • Enhance participating students’ conference experience by connecting participants to other LAK attendees

CBA: impact on student engagement

Our new paper – lead author Quan Nguyen – is now available online in Computers in Human Behavior. It examines the designs of computer-based assessment and their impact on student engagement, student satisfaction and pass rates.

Computers in Human Behavior is locked behind a paywall, so contact me for a copy if you can’t get access to the paper.

Abstract

Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students’ engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students’ time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.

Nguyen, Quan; Rienties, Bart; Toetenel, Lisette; Ferguson, Rebecca and Whitelock, Denise (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior (Early access).
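For readers unfamiliar with the method, the sketch below illustrates the kind of fixed-effect model the abstract describes: dummy variables per module absorb between-module heterogeneity, so the remaining slope captures the within-module relationship between assessment workload and time spent in the VLE. This is not the authors’ code; the variable names and synthetic data are assumptions for illustration.

```python
# A minimal sketch (not the authors' code) of a fixed-effect analysis of the
# kind the abstract describes. Variable names and data are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
rows = []
for module in range(10):                      # 10 hypothetical modules
    baseline = rng.normal(5, 1)               # module-specific engagement level
    for week in range(30):                    # 30 weeks of presentation
        assessment_hours = rng.uniform(0, 4)  # assessment workload that week
        vle_hours = baseline + 0.8 * assessment_hours + rng.normal(0, 0.5)
        rows.append({"module": module, "week": week,
                     "assessment_hours": assessment_hours,
                     "vle_hours": vle_hours})
df = pd.DataFrame(rows)

# C(module) adds a dummy per module, absorbing between-module heterogeneity,
# so the slope reflects the within-module link between assessment and VLE time.
fit = smf.ols("vle_hours ~ assessment_hours + C(module)", data=df).fit()
print(fit.params["assessment_hours"])  # within-module effect of assessment load
print(fit.rsquared)                    # share of variance explained, cf. the 69%
```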


Lea’s Box and Eco

Just back from a couple of trips to Luxembourg, where I was one of the team carrying out final reviews for the Lea’s Box and Eco projects. This was my third year reviewing Lea’s Box, but I joined the Eco team only for their final review.

Lea’s Box was ‘a 3-year research and development project (running from March 2014 to [January] 2016) funded by the European Commission. The project focussed on (a) making educational assessment and appraisal more goal-oriented, proactive, and beneficial for students, and (b) on enabling formative support of teachers and other educational stakeholders on a solid basis of a wide range of information about learners.’

Eco was ‘a European project based on Open Educational Resources (OER) that gives free access to a list of MOOC (Massive Open Online Courses) in 6 languages […] The main goal of this project is to broaden access to education and to improve the quality and cost-effectiveness of teaching and learning in Europe.’


Introducing learning analytics

After talking about learning analytics at the BETT show, I was invited to write about them for the Public Service Executive magazine. The hard copy of PSE goes out to 9,000 subscribers, while the online version goes out to a database of 50,000.

This article provides a short introduction to learning analytics for people considering introducing analytics at their institution. It introduces six areas for action, and briefly outlines what needs to be done in each of these:

Areas for action

  • Leadership and governance
  • Collaboration and networking
  • Teaching and learning
  • Quality assurance
  • Capacity building
  • Infrastructure

 


Dimensions of personalisation in TEL

New paper out in the British Journal of Educational Technology, co-authored with a host of people: lead author Liz FitzGerald, plus Natalia Kucirkova, Ann Jones, Simon Cross, Thea Herodotou, Garron Hillaire and Eileen Scanlon.

The framework proposed in the paper has six dimensions:

  1. what is being personalised
  2. type of learning
  3. personal characteristics of the learner
  4. who/what is doing the personalisation
  5. how personalisation is carried out
  6. impact / beneficiaries

Abstract

Personalisation of learning is a recurring trend in our society, referred to in government speeches, popular media, conference and research papers and technological innovations. This latter aspect – of using personalisation in technology-enhanced learning (TEL) – has promised much but has not always lived up to the claims made. Personalisation is often perceived to be a positive phenomenon, but it is often difficult to know how to implement it effectively within educational technology.

In order to address this problem, we propose a framework for the analysis and creation of personalised TEL. This article outlines and explains this framework with examples from a series of case studies. The framework serves as a valuable resource in order to change or consolidate existing practice and suggests design guidelines for effective implementations of future personalised TEL.

FitzGerald, Elizabeth; Kucirkova, Natalia; Jones, Ann; Cross, Simon; Ferguson, Rebecca; Herodotou, Christothea; Hillaire, Garron and Scanlon, Eileen (2017). Dimensions of personalisation in technology-enhanced learning: a framework and implications for design. British Journal of Educational Technology (early view).


Barcelona: FutureLearn Academic Network

On 27 January, I travelled to Pompeu Fabra University (UPF) in Barcelona for a meeting of the FutureLearn Academic Network (FLAN) on The Educator Experience. This was the first FLAN meeting to take place outside the UK, and it was held at UPF’s Poblenou Campus. The event was organised by CLIK (Center for Learning, Innovation and Knowledge) and members of the Educational Technologies section within UPF’s Interactive Technologies Research Group.

During the meeting, FutureLearn partners reflected on the impact and research possibilities of MOOCs in the field of education. Sir Timothy O’Shea, Principal and Vice-Chancellor of the University of Edinburgh, gave the keynote speech, describing Edinburgh’s developing MOOC strategy, including the production of 64 online master’s courses.

I talked about our recent report, MOOCs: What the Research of FutureLearn’s UK Partners Tells Us.

 


BETT 2017: learning analytics

On 25 January, I presented at the BETT trade show on ‘An action plan for learning analytics’. If you would like to introduce learning analytics at your institution, where should you start? Drawing on recent studies that consulted experts worldwide, I outlined an action plan for analytics and identified the key points to keep in mind.

My talk formed part of the HE Leaders Summit, a section of the event that was designed to address some of the most significant challenges currently facing senior leaders across Higher Education.


MOOCs: What the UK research tells us

Our latest quality enhancement report, MOOCs: What the Research of FutureLearn’s UK Partners Tells Us, came out in late January 2017. The report was co-authored with Tim Coughlan, Christothea Herodotou and Eileen Scanlon. It follows an earlier report on what MOOC research from The Open University tells us.

The report provides brief summaries of, and links to, all accessible publications stored in the repositories of FutureLearn’s UK academic partners that deal with research on MOOCs. Where these publications made recommendations that could be taken up, these recommendations are highlighted within the report. Full references for all studies are provided in the bibliography.

Studies are divided thematically, and the report contains sections on MOOCs as a field, pedagogy and teaching, accessibility, retention, motivation and engagement, assessment and accreditation, study skills, MOOCs around the world, and sustainability.

The report contains 59 recommendations that have emerged from the publications and each of these is linked to the research study that generated it.

MOOC priority areas

  1. Develop a strategic approach to learning at scale.
  2. Develop appropriate pedagogy for learning at scale.
  3. Identify and share effective learning designs.
  4. Support discussion more effectively.
  5. Clarify learner expectations.
  6. Develop educator teams.
  7. Widen access.
  8. Develop new approaches to assessment and accreditation.


Symposium with Gothenburg

On 23 January I presented at a joint symposium involving The Open University and the University of Gothenburg. Eleven participants from Gothenburg met with ten Open University researchers. Eight presentations, four from Gothenburg and four from The Open University, allowed discussion of areas of mutual interest.

My presentation focused on what the research carried out by UK partners in the FutureLearn platform tells us. I presented a longer version of the talk to the FutureLearn Academic Network (FLAN) later in the week, so it is embedded in a later blog post.


Ringing the changes in HE

Will Woods's blog - Tue, 14/02/2017 - 19:31

I’ve been working with a group of colleagues across the Open University, in a very collegiate spirit, to develop a coherent Vision and Plan for Learning and Teaching. We are also developing a complementary vision for our leadership in digital innovation. We are doing this at a time of unprecedented change for UK Higher Education, not simply because of the HE Bill and TEF and the changes those bring with them (N.B. despite the OU not entering TEF this year, we still have a lot of work to do: lobbying for changes, supporting the four nations agenda and national policies, and preparing for the time when we will enter TEF, which involves collecting and interpreting data to better differentiate part-time learners, their prior experience and level of knowledge, and their learning gain) but also because of the wider changes resulting from the UK’s exit from the EU and the implications of changes in U.S. policy. This makes it challenging to construct a vision that is grounded but also fixed on the far horizon, so that it can guide actions for transformation.

As far as innovation is concerned, we’ve been looking to the Educause report “Building a Culture of Innovation in Higher Education: Design and Practice for Leaders” as a tool to help us identify areas to prioritize. There is a series of near-horizon and far-horizon goals that we wish to achieve through this process. Near-horizon goals aim to improve the current system of learning and teaching at the OU, while far-horizon goals simultaneously build the conditions from which a new system can emerge (figure 1).

Figure 1 – Shifting from Improvement to Innovation (extracted from Educause “Building a Culture of Innovation in Higher Education: Design and Practice for Leaders”)

There is an element of crystal-ball gazing to all of this endeavour (although some market research and academic research is also involved). I was taken with a recent post by Joshua Kim for Inside Higher Ed which resonates with some of my feelings about HE. It’s called Why Our Higher Ed Transformation Crowd Should Read ‘The Upstarts’ and emphasizes that the antecedents of transformative change are rarely understood in advance. We can create the conditions, but we cannot predict the impact (or the lack of it).

All this work has come to the attention of others in high places, and so I am having my own personal transformative change. I’m leaving my role as Head of Incubation at the end of this month to take up a new role as Head of Strategy and Policy (with continued responsibility for co-ordination of incubation and innovation). I’m going to miss the Learning and Teaching Development team, which includes the Learning Design team that I’ve been managing for the past few months; they are great people doing fantastic but hugely undervalued work.

This change consequently means aligning and co-ordinating the Learning Design and TEL Design (Technology Enhanced Learning Design) teams, to give a coherent organisational approach and vision for Learning Design, with clear ownership of and responsibility for aspects of LD under Rebecca Galley (Head of TEL). We are also defining homes for the enabling elements of LD, including data, which is becoming increasingly valuable for decision making.

From next month I’ll be managing the Strategic Planning and Policy team. I will also be moving away from the academic side of the business and from the Institute of Educational Technology to focus on this new role within the Learning and Teaching Innovation Portfolio. I’m also in my second week of the master’s course in Online and Distance Education, to better understand the theory behind what I’m doing. It’s a seriously well-constructed course and I’m really enjoying my tutor group chats. I think I’m becoming slightly addicted to this online learning thing, but we’ll see if I remain enthusiastic after my first exam!

Crucially, though, despite all the changes I’m keeping a hot desk in the Jennie Lee building so that I can continue to network with academic colleagues (...and steal their coffee and biscuits)!

 

 


Research Evidence on the Use of Learning Analytics: Implications for Education Policy

The final report on our study of learning analytics for European educational policy (LAEP) is now out.

Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.

The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.

Learning Analytics: Action List

Policy leadership and governance practices

  • Develop common visions of learning analytics that address strategic objectives and priorities
  • Develop a roadmap for learning analytics within Europe
  • Align learning analytics work with different sectors of education
  • Develop frameworks that enable the development of analytics
  • Assign responsibility for the development of learning analytics within Europe
  • Continuously work on reaching common understanding and developing new priorities

Institutional leadership and governance practices

  • Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
  • Develop practices that are appropriate to different contexts
  • Develop and employ ethical standards, including data protection

Collaboration and networking

  • Identify and build on work in related areas and other countries
  • Engage stakeholders throughout the process to create learning analytics that have useful features
  • Support collaboration with commercial organisations

Teaching and learning practices

  • Develop learning analytics that make good use of pedagogy
  • Align analytics with assessment practices

Quality assessment and assurance practices

  • Develop a robust quality assurance process to ensure the validity and reliability of tools
  • Develop evaluation checklists for learning analytics tools

Capacity building

  • Identify the skills required in different areas
  • Train and support researchers and developers to work in this field
  • Train and support educators to use analytics to support achievement

Infrastructure

  • Develop technologies that enable development of analytics
  • Adapt and employ interoperability standards

Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.


Tweeting in 2016

Twitter identifies my top tweet, my top mention and my top media tweet. My followers appear to be most interested in globalised online learning.


Developing a strategic approach to MOOCs

Our introductory article for the JIME special issue on MOOCs focused on the research carried out in the area by the UK universities that are FutureLearn partners.

‘Developing a strategic approach to MOOCs’ draws on the work carried out at these universities to identify nine priority areas for MOOC research and to show how these can be developed in the future:

  1. Develop a strategic approach to MOOCs.
  2. Expand the benefits of teaching and learning in MOOCs.
  3. Offer well-designed assessment and accreditation.
  4. Widen participation and extend access.
  5. Develop and make effective use of appropriate pedagogies.
  6. Support the development of educators.
  7. Make effective use of learning design.
  8. Develop methods of quality assurance.
  9. Address issues related to privacy and ethics.

Ferguson, Rebecca; Scanlon, Eileen and Harris, Lisa (2016). Developing a strategic approach to MOOCs. Journal of Interactive Media in Education, 2016(1), article no. 21.

Abstract

During the last eight years, interest in massive open online courses (MOOCs) has grown fast and continuously worldwide. Universities that had never engaged with open or online learning have begun to run courses in these new environments. Millions of learners have joined these courses, many of them new to learning at this level. Amid all this learning and teaching activity, researchers have been busy investigating different aspects of this new phenomenon. In this contribution we look at one substantial body of work, publications on MOOCs that were produced at the 29 UK universities connected to the FutureLearn MOOC platform. Bringing these papers together, and considering them as a body of related work, reveals a set of nine priority areas for MOOC research and development. We suggest that these priority areas could be used to develop a strategic approach to learning at scale. We also show how the papers in this special issue align with these priority areas, forming a basis for future work.
