
IET researcher and project officer blogs

Ringing the changes in HE

Will Woods's blog - Tue, 14/02/2017 - 19:31

I’ve been working with a group of colleagues across the Open University, in a very collegiate spirit, to develop a coherent Vision and Plan for Learning and Teaching. We are also developing a complementary vision for our leadership in digital innovation. We are doing this at a time of unprecedented change for UK Higher Education, not simply because of the HE Bill and TEF and the changes those bring with them (N.B. although the OU is not entering TEF this year, we still have a lot of work to do: lobbying for changes, supporting the four nations agenda and national policies, and preparing for the time when we will enter TEF, which involves collecting and interpreting data to better differentiate part-time learners, their prior experience and level of knowledge, and their learning gain), but also because of the wider changes resulting from the UK’s exit from the EU and the implications of changes in U.S. policy. This makes it challenging to construct a vision that is both grounded and fixed on the far horizon, so that it can guide actions for transformation.

As far as innovation is concerned, we’ve been looking to the Educause report “Building a Culture of Innovation in HE: Design and Practice for Leaders” as a tool to help us identify areas to prioritize. There are near-horizon and far-horizon goals that we wish to achieve through this process. Near-horizon goals aim to improve the current system of learning and teaching at the OU, while far-horizon goals simultaneously build the conditions from which a new system can emerge (figure 1).

Figure 1 – Shifting from Improvement to Innovation (extracted from Educause “Building a Culture of Innovation in Higher Education: Design and Practice for Leaders”)

There is an element of crystal-ball gazing to all of this endeavour (although some market research and academic research is also involved). I was taken with a recent post by Joshua Kim for Inside Higher Ed which resonates with some of my feelings around HE. It’s called Why Our Higher Ed Transformation Crowd Should Read ‘The Upstarts’ and emphasizes that the antecedents for transformative change are rarely understood in advance. We can create the conditions, but we cannot imagine the impact (or the lack of it).

All this work has come to the attention of others in high places, and so I am having my own personal transformative change. I’m leaving my role as Head of Incubation at the end of this month to take up a new role as Head of Strategy and Policy (including continued responsibility for co-ordination of incubation/innovation). I’m going to miss the Learning and Teaching Development team, which includes the Learning Design team that I’ve been managing for the past few months; they are great people doing fantastic but hugely undervalued work.

This change consequently means aligning and co-ordinating the Learning Design and TEL-Design (Technology Enhanced Learning Design) teams so that there is a coherent organisational approach and vision for Learning Design, with clear ownership of and responsibility for aspects of LD under Rebecca Galley (Head of TEL). We are also defining the homes for the enabling elements of LD, including data, which is becoming increasingly valuable for decision making.

From next month I’ll be managing the Strategic Planning and Policy team. I will also be moving away from the academic side of the business, and from the Institute of Educational Technology, to focus on this new role within the Learning and Teaching Innovation Portfolio. I’m also in my second week of the Masters course in Online and Distance Education, to better understand the theory around what I’m doing. It’s a seriously well-constructed course and I’m really enjoying my tutor group chats. I think I’m becoming slightly addicted to this online learning thing, but I’ll see if I remain enthusiastic after my first exam!

Crucially though, despite all the changes, I’m keeping a hot desk in the Jennie Lee building so that I can continue to network with academic colleagues (…and steal their coffee and biscuits)!


Research Evidence on the Use of Learning Analytics: Implications for Education Policy

The final report on our study of learning analytics for European educational policy (LAEP) is now out.

Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.

The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.

Learning Analytics: Action List

Policy leadership and governance practices

  • Develop common visions of learning analytics that address strategic objectives and priorities
  • Develop a roadmap for learning analytics within Europe
  • Align learning analytics work with different sectors of education
  • Develop frameworks that enable the development of analytics
  • Assign responsibility for the development of learning analytics within Europe
  • Continuously work on reaching common understanding and developing new priorities

Institutional leadership and governance practices

  • Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
  • Develop practices that are appropriate to different contexts
  • Develop and employ ethical standards, including data protection

Collaboration and networking

  • Identify and build on work in related areas and other countries
  • Engage stakeholders throughout the process to create learning analytics that have useful features
  • Support collaboration with commercial organisations

Teaching and learning practices

  • Develop learning analytics that makes good use of pedagogy
  • Align analytics with assessment practices

Quality assessment and assurance practices

  • Develop a robust quality assurance process to ensure the validity and reliability of tools
  • Develop evaluation checklists for learning analytics tools

Capacity building

  • Identify the skills required in different areas
  • Train and support researchers and developers to work in this field
  • Train and support educators to use analytics to support achievement

Infrastructure

  • Develop technologies that enable development of analytics
  • Adapt and employ interoperability standards

Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.


Tweeting in 2016

Twitter identifies my top tweet, my top mention and my top media tweet. My followers appear to be most interested in globalised online learning.


Developing a strategic approach to MOOCs

Our introductory article for the JIME special issue on MOOCs focused on the research work carried out in this area by the UK universities that are FutureLearn partners.

‘Developing a strategic approach to MOOCs’ uses the work carried out at these universities to identify nine priority areas for MOOC research and how these can be developed in the future:

  1. Develop a strategic approach to MOOCs.
  2. Expand the benefits of teaching and learning in MOOCs.
  3. Offer well-designed assessment and accreditation.
  4. Widen participation and extend access.
  5. Develop and make effective use of appropriate pedagogies.
  6. Support the development of educators.
  7. Make effective use of learning design.
  8. Develop methods of quality assurance.
  9. Address issues related to privacy and ethics.

Ferguson, Rebecca; Scanlon, Eileen and Harris, Lisa (2016). Developing a strategic approach to MOOCs. Journal of Interactive Media in Education, 2016(1), article no. 21.

Abstract

During the last eight years, interest in massive open online courses (MOOCs) has grown fast and continuously worldwide. Universities that had never engaged with open or online learning have begun to run courses in these new environments. Millions of learners have joined these courses, many of them new to learning at this level. Amid all this learning and teaching activity, researchers have been busy investigating different aspects of this new phenomenon. In this contribution we look at one substantial body of work, publications on MOOCs that were produced at the 29 UK universities connected to the FutureLearn MOOC platform. Bringing these papers together, and considering them as a body of related work, reveals a set of nine priority areas for MOOC research and development. We suggest that these priority areas could be used to develop a strategic approach to learning at scale. We also show how the papers in this special issue align with these priority areas, forming a basis for future work.

Researching MOOCs: JIME special issue

I was one of the editors of a special issue of the Journal of Interactive Media in Education (JIME) on Researching MOOCs. The special issue draws on the work of the FutureLearn Academic Network (FLAN), which is made up of academics at universities that are FutureLearn partners.

Other editors were Eileen Scanlon (The Open University) and Lisa Harris (University of Southampton).

The special issue contains five papers.

 


Dr Bektik: Duygu’s viva

On 14th December, Duygu Bektik defended her thesis successfully, and now only minor corrections stand between her and her doctorate.

Learning Analytics for Academic Writing through Automatic Identification of Meta-Discourse

When assessing student writing, tutors look for the ability to present well-reasoned arguments, signalled by elements of meta-discourse. Some natural language processing systems can detect rhetorical moves in scholarly texts, but no previous work has investigated whether these tools can analyse student writing reliably. Duygu’s thesis evaluates the Xerox Incremental Parser (XIP), sets out ways in which it could be changed to support the analysis of student writing, and proposes how its output could be delivered to tutors. It also investigates how tutors define the quality of undergraduate writing and identifies key elements that can be used to identify good student writing in the social sciences.
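
To give a flavour of the general idea (not of XIP itself), here is a minimal, hypothetical Python sketch that flags sentences containing meta-discourse markers associated with argumentation. The marker list and function name are invented for illustration; XIP works with a far richer grammar of rhetorical moves.

import re

# Hypothetical markers of meta-discourse signalling argumentation.
# XIP uses a much richer grammar of rhetorical moves than this word list.
ARGUMENT_MARKERS = [
    "however", "therefore", "in contrast", "consequently",
    "this suggests", "we argue", "it could be argued",
]

def flag_metadiscourse(text):
    """Return (sentence, matched markers) pairs for sentences that signal argumentation."""
    sentences = re.split(r"(?<=[.!?])\s+", text)
    flagged = []
    for sentence in sentences:
        lowered = sentence.lower()
        hits = [marker for marker in ARGUMENT_MARKERS if marker in lowered]
        if hits:
            flagged.append((sentence, hits))
    return flagged

essay = ("Smith (2010) reports rising enrolment. However, the data are incomplete. "
         "We argue that retention, not enrolment, is the better measure.")
for sentence, hits in flag_metadiscourse(essay):
    print(hits, "->", sentence)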

Duygu was supervised by me and by Denise Whitelock, Anna De Liddo and Simon Buckingham Shum (now at the University of Technology Sydney).

Her viva examiners were Paul Mulholland and Gary Wills (University of Southampton). The viva was chaired by John Richardson.


European research and innovation priorities

On 13 December, I joined a Foresight Workshop on Learning Technologies in Luxembourg. The workshop was designed to help the European Commission to set and define future European strategic research and innovation priorities.

The workshop began with a series of ‘Moonshots’. Individual experts presented ambitious, yet realistic, targets for EU-funded learning technology research and innovation up to 2025. For each of these, we considered: What is the problem? How is it dealt with now? What difference would it make if this problem were addressed successfully?

We went on to merge our individual Moonshots into Constellations and then into Galaxies. We made links between the different ideas, linking them to other international activities and trends, as well as to previous EU-funded work. I was interested to see that many of the experts from across Europe presented ideas associated with blockchain for learning, a trend that was picked up in our recent Innovating Pedagogy report.

My moonshot focused on a series of problems: access to tertiary education is unequal, most people in Europe do not complete tertiary education and many people in Europe need to develop new skills. Massive open online courses (MOOCs) offer a potential solution, but these new approaches to learning require new approaches to teaching. Teachers need training and support to work effectively in these new environments. They also need proven models of good practice. Improving educator effectiveness on these courses has the potential to increase Europe’s capacity to respond to its priority areas. It also has the potential to open up education for millions by developing and sharing knowledge of how to teach at scale.

 


Innovating Pedagogy 2016

Great to see this year’s Innovating Pedagogy 2016 report out. This report, which I co-authored with others at The Open University, highlights ten trends that will impact education over the next decade. These include Design Thinking, Productive Failure, Formative Analytics and Translanguaging. The report also presents evidence to inform decisions about which pedagogies to adopt. The pedagogies range from ones already being tested in classrooms, such as learning through video games, to ideas for the future, like adapting blockchain technology for trading educational reputation.

This year, the report has been written in collaboration with the Learning Sciences Lab, National Institute of Education, Singapore.

The ten trends covered this year are:

  1. Learning through social media: Using social media to offer long-term learning opportunities
  2. Productive failure: Drawing on experience to gain deeper understanding
  3. Teachback: Learning by explaining what we have been taught
  4. Design thinking: Applying design methods in order to solve problems
  5. Learning from the crowd: Using the public as a source of knowledge and opinion
  6. Learning through video games: Making learning fun, interactive and stimulating
  7. Formative analytics: Developing analytics that help learners to reflect and improve
  8. Learning for the future: Preparing students for work and life in an unpredictable future
  9. Translanguaging: Enriching learning through the use of multiple languages
  10. Blockchain for learning: Storing, validating and trading educational reputation

Policies for using Big Data

The PELARS project (Practice-based Experiential Learning Analytics Research And Support) invited me to Brussels for their Policies for using Big Data event on 9 November. The workshop aimed to raise awareness of the potential for analysis of data produced by learning technologies to catalyze the effective design of adaptive teaching, learning and assessment at scale. It also aimed to bring together people interested in exploring the state of the art of learning analytics, and to inform them about opportunities and barriers to adoption.

I chaired the panel discussion at the event, and was also able to talk to participants about the LACE project, following a presentation on LACE by Hendrik Drachsler.


JISC effective learning analytics

Following my visit to Korea, it was great to see Il-Hyun Jo at the 8th UK Learning Analytics event, which was organised at The Open University by JISC.

Il-Hyun talked about the problems associated with learning analytics in a country where grades are allocated in relation to a normal distribution curve – so if one student’s grades go up, another student’s grades will go down – and where competition to enter universities is so intense that retention is not viewed as a problem.
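
To make the zero-sum nature of that kind of norm-referenced grading concrete, here is a minimal Python sketch. The grade quotas are hypothetical values of my own, not the actual proportions used in Korean universities; grades are assigned purely by rank, so one student can only move up a band by pushing another student down.

# Minimal sketch of norm-referenced ("graded on a curve") allocation.
# The quota proportions below are hypothetical, purely for illustration.
GRADE_QUOTAS = [("A", 0.10), ("B", 0.30), ("C", 0.40), ("D", 0.20)]

def allocate_grades(scores):
    """Assign grades by rank: fixed proportions, regardless of absolute scores."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    grades, start = {}, 0
    for grade, share in GRADE_QUOTAS:
        end = start + round(share * len(ranked))
        for student in ranked[start:end]:
            grades[student] = grade
        start = end
    for student in ranked[start:]:  # any rounding remainder gets the lowest grade
        grades[student] = GRADE_QUOTAS[-1][0]
    return grades

print(allocate_grades({"Kim": 91, "Lee": 90, "Park": 89, "Choi": 88, "Jung": 87,
                       "Kang": 86, "Cho": 85, "Yoon": 84, "Jang": 83, "Lim": 82}))

Even if every student in this example scored one point higher, the distribution of grades would be identical, which is why retention and improvement look so different in that context.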

 


MOOCs and Open Education around the World

The book MOOCs and Open Education Around the World, to which I contributed a chapter, has been very successful. Most recently, it won a DDL Distance Education Book Award. This award is presented in recognition of a print or digital book, published within the last three years, that describes important theoretical or practical aspects of distance education and can help those working in or researching the field. The primary focus of the book must be directly related to distance education.

AECT Division of Distance Learning (DDL) Distance Education Book Award, 2016 – First Place. MOOCs and Open Education around the World, Editors: Curtis J. Bonk, Mimi M. Lee, Thomas C. Reeves and Thomas H. Reynolds. NY: Routledge. Presented at the 2016 Conference of the Association for Educational Communications and Technology, Las Vegas.


LASI Asia

While I was in Seoul in September, I took part in the Asian Learning Analytics Summer Institute (LASI Asia). I was joined there by members of the LACE team, who included the event as part of the LACE tour of Asia, which also took in Japan and Korea.

During LASI Asia, I gave a talk about what is on the horizon for learning analytics. This went into more detail, and was aimed at a more specialist audience, than my talk at e-Learning Korea. I also took part in a couple of panel discussions. The first was on how to build an international community on learning analytics research, and the second was on the achievements of learning analytics research and next steps.

Abstract

There is general agreement that the importance of learning analytics is likely to increase in the coming decade. However, little guidance for policy makers has been forthcoming from the technologists, educationalists and teachers who are driving the development of learning analytics. The Visions of the Future study was carried out by the LACE project in order to provide some perspectives that could feed into the policy process.
The study took the form of a ‘policy Delphi’, which is to say that it was not concerned with certainty about the future, but rather focused on understanding the trends and issues which will be driving the field forward in the coming years. The project partners developed eight visions of the future of learning analytics in 2025. These visions were shared with invited experts and LACE contacts through an online questionnaire, and consultation with stakeholders was carried out at events. Respondents were asked to rate the visions in terms of their feasibility and desirability, and to identify the actions which should be taken in the light of their judgements. 487 responses to visions were received from 133 people. The views of the respondents on how the future may evolve are both interesting and entertaining. More significantly, analysis of the ratings and free-text responses showed that, for the experts and practitioners who engaged in the study, there was a consensus around a number of points which are shaping the future of learning analytics.

1. There is a lot of enthusiasm for Learning Analytics, but concern that its potential will not be fulfilled. It is therefore appropriate for policy makers to take a role.
2. Policies and infrastructure are necessary to strengthen the rights of the data subject.
3. Interoperability specifications and open infrastructures are an essential enabling technology. These can support the rights of the data subject, and ensure control of analytics processes at the appropriate level.
4. Learning analytics should not imply automation of teaching and learning.

The full results of the study are published in a report at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/.

In this session, the visions explored by the LACE study will be presented and the conclusions discussed, and the audience will take part in an impromptu mapping of the most desirable and feasible vision of the future for learning analytics in Asia.
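
For readers who want a concrete picture of how feasibility and desirability ratings of this kind can be tallied to surface a ‘most desirable and feasible’ vision, here is a rough Python sketch. The visions, ratings and 1–5 scale below are invented for illustration; they are not the LACE study’s data or its analysis method.

from collections import defaultdict
from statistics import mean

# Hypothetical responses: (vision, feasibility 1-5, desirability 1-5).
# Invented numbers for illustration only.
responses = [
    ("Vision 1: open analytics infrastructure", 4, 5),
    ("Vision 1: open analytics infrastructure", 3, 4),
    ("Vision 2: fully automated teaching", 4, 1),
    ("Vision 2: fully automated teaching", 3, 2),
    ("Vision 3: analytics abandoned", 2, 1),
]

scores = defaultdict(lambda: {"feasibility": [], "desirability": []})
for vision, feasibility, desirability in responses:
    scores[vision]["feasibility"].append(feasibility)
    scores[vision]["desirability"].append(desirability)

# Rank visions by the sum of mean feasibility and mean desirability.
ranked = sorted(
    scores.items(),
    key=lambda item: mean(item[1]["feasibility"]) + mean(item[1]["desirability"]),
    reverse=True,
)
for vision, ratings in ranked:
    print(f"{vision}: feasibility {mean(ratings['feasibility']):.1f}, "
          f"desirability {mean(ratings['desirability']):.1f}")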


Learning analytics in Korea

I was invited to speak at e-Learning Korea 2016 in Seoul on 21-22 September. My presentation focused on the visions of the future work that I had carried out as part of the LACE project.

Abstract

Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in April 2006, we had begun developing learning analytics for 2016, we might not have planned specifically for learning with and through social networks (Twitter was launched in July 2006), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. In this talk, Rebecca will introduce a range of different scenarios that explore different ways in which learning analytics could develop in the future. She will share the results of an international Policy Delphi study, which was designed for the systematic solicitation and collation of informed judgments on visions of learning analytics in 2025. The study explored underlying assumptions and information leading to differing judgments on learning analytics, and brought together informed judgments about the field. The findings of the Policy Delphi, together with other studies, are now being used to develop action plans that will help us to develop analytics to support learners and educators in the future.


MOOCs: what the research tells us

MOOCs: What the Open University research tells us recommends priority areas for activity in relation to massive open online courses (MOOCs). It does this by bringing together all The Open University’s published research work in this area from the launch of the first MOOC in 2008 until February 2016.

The report provides brief summaries of, and links to, all publications stored in the university’s Open Research Online (ORO) repository that use the word ‘MOOC’ in their title or abstract. Full references for all studies are provided in the bibliography.
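
As a trivial illustration of that selection rule, a filter along the following lines would pick out records mentioning ‘MOOC’ in the title or abstract. The record structure here is an assumption made for the sake of the example, not ORO’s actual export format.

# Hypothetical publication records; ORO's real export format will differ.
records = [
    {"title": "Developing a strategic approach to MOOCs",
     "abstract": "Nine priority areas for MOOC research."},
    {"title": "Learning design at a distance",
     "abstract": "No mention of the keyword here."},
    {"title": "Retention in open courses",
     "abstract": "A study of MOOC completion rates."},
]

mooc_papers = [
    r for r in records
    if "mooc" in r["title"].lower() or "mooc" in r["abstract"].lower()
]
print([r["title"] for r in mooc_papers])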

Studies are divided thematically, and the report contains sections on the pedagogy of MOOCs, MOOCs and open education, MOOC retention and motivation, working together in MOOCs, MOOC assessment, accessibility, privacy and ethics, quality and other areas of MOOC research.

The report identifies ten priority areas for future work:

  1. Influence the direction of open education globally 
  2. Develop and accredit learning journeys 
  3. Extend the relationship between learners and the university
  4. Make effective use of learning design
  5. Make use of effective distance learning pedagogies
  6. Widen participation
  7. Offer well-designed assessment 
  8. Pay attention to quality assurance 
  9. Pay attention to privacy and ethics
  10. Expand the benefits of learning from MOOCs

Innovating Pedagogy

OU Innovation Report series - Wed, 30/11/2016 - 15:00

The series of reports explores new forms of teaching, learning and assessment for an interactive world, to guide teachers and policy makers in productive innovation.

View the 2016 Innovating Pedagogy report

This fifth report, produced in collaboration with the Learning Sciences Lab at the National Institute of Education, Singapore, proposes ten innovations that are already in currency but have not yet had a profound influence on education.

You can see a summary of each innovation using the menu on the right.  Please add your comments on the report and the innovations.

See themes from previous years