
IET researcher and project officer blogs

Caring Counts in the Workplace launch, Glasgow: 8 June 2015

Dr Beck Pitt's blog - Wed, 24/06/2015 - 17:10

Originally posted on OEPScotland:

by Beck Pitt and Caroline Anderson (OEPS project)

Beck, Bea and Caroline were at Citizen M, Glasgow on Monday for The Open University (OU) in Scotland’s symposium to launch Caring Counts in the Workplace, an exciting new open educational resource (OER) to enable managers to support carers in balancing their caring and work roles. Nicely timed at the beginning of Carers Week, and in the year that Carers Scotland celebrates its 50th anniversary, the day brought together carers, support workers and employers to learn about the new course, find out more about how it was developed, its benefits to both employer and employee, and its potential for transforming lives. This post offers a snapshot of some of the rich and interesting discussion and events from the day.

Caring Counts in the Workplace builds on the success of, and accompanies Caring Counts: a reflection and planning…



1000 citations and counting

1002 citations according to Google Scholar, and 20,548 downloads from the university’s Open Research Online repository (ORO).


LACE Spring Briefing

On 15 April, the LACE project held a one-day briefing and workshop in Brussels on Policies for Educational Data Mining and Learning Analytics. The event was originally planned to take place in the European Parliament, but a security alert required a move to the nearby Thon Hotel.

The day began with a welcome from Julie Ward, MEP for the North West of England and member of the Culture and Education Committee. She was followed by Robert Madelin (Director-General of DG Connect) and Dragan Gašević (president-elect of SoLAR). Their talks were followed by overviews of the current European-funded learning analytics projects: LACE, Lea’s Box, PELARS and WatchMe.

During the afternoon discussion and review session, participants from across Europe worked together in three separate discussion groups to review specific issues related to the use of learning analytics in schools, universities and workplace training.

I worked as rapporteur in the universities workshop (pictured), which had 186 participants, including people from England, Estonia, Germany, the Netherlands, Norway, Scotland and Sweden. Our policy recommendations included:

  • Privacy and ethical issues are important. Encourage institutions to develop policies covering privacy, ethics and data protection. However, this is a broader issue than educational policy making and legislation. We should aim to influence the wider debate.
  • Guard against data degradation – develop and make available methods of retaining data over time
  • Develop data standards and encourage their use, so that data is collected and described consistently
  • Address the problem of over-claiming and mis-selling by vendors – institutions do not necessarily have access to the expertise that allows them to interpret and assess these claims
  • Identify procedures for due diligence around intervention strategies, the competencies staff need, and the certification opportunities relating to these
  • Identify requirements for data collection, and structures for doing this on a sector or national basis
  • Support the development of standard datasets at national or international level, against which other data can be compared to see if performance is above or below the norm
  • Identify behaviours in the field of education that regional or national governments should support and encourage
  • Identify ways of preventing the providers of educational tools from selling our own data back to us
  • Take into account that it is not just the data we are concerned about: once data is removed from its context it does not necessarily make sense, so it needs to be associated with metadata produced using standardised conventions

 


Teacher-led inquiry and learning design: BJET special issue

In mid March, the British Journal of Educational Technology (BJET) published our special issue on learning design, learning analytics and teacher inquiry.

This special issue, edited by Yishay Mor, Barbara Wasson and myself, developed from an Alpine Rendezvous workshop we ran in 2013 that dealt with the connections between learning design, learning analytics and teacher inquiry.

This special issue deals with three areas. Learning design is the practice of devising effective learning experiences aimed at achieving defined educational objectives in a given context. Teacher inquiry is an approach to professional development and capacity building in education in which teachers study their own and their peers’ practice. Learning analytics use data about learners and their contexts to understand and optimise learning and the environments in which it takes place. Typically, these three—design, inquiry and analytics—are seen as separate areas of practice and research. In this issue, we show that the three can work together to form a virtuous circle. Within this circle, learning analytics offers a powerful set of tools for teacher inquiry, feeding back into improved learning design. Learning design provides a semantic structure for analytics, whereas teacher inquiry defines meaningful questions to analyse.

Contents

 

  • Learning design, teacher inquiry into student learning and learning analytics: a call for action (Yishay Mor, Rebecca Ferguson and Barbara Wasson)
  • Informing learning design with learning analytics to improve teacher inquiry (Donatella Persico and Francesca Pozzi)
  • A method for teacher inquiry in cross-curricular projects: lessons from a case study (Katerina Avramides, Jade Hunter, Martin Oliver and Rosemary Luckin)
  • Supporting teachers in data-informed educational design (Susan McKenney and Yishay Mor)
  • Forward-oriented designing for learning as a means to achieve educational quality (Patrizia M.M. Ghislandi and Juliana E. Raffaghelli)
  • Analysing content and patterns of interaction for improving the learning design of networked learning environments (Pablo A. Haya, Oliver Daems, Nils Malzahn, Jorge Castellanos and Heinz Ulrich Hoppe)
  • How was the activity? A visualization support for a case of location-based learning design (Javier Melero, Davinia Hernández-Leo, Jing Sun, Patricia Santos and Josep Blat)
  • Scripting and monitoring meet each other: aligning learning analytics and learning design to support teachers in orchestrating CSCL situations (María Jesús Rodríguez-Triana, Alejandra Martínez-Monés, Juan I. Asensio-Pérez and Yannis Dimitriadis)

Mor, Yishay, Ferguson, Rebecca, & Wasson, Barbara. (2015). Editorial: learning design, teacher inquiry into student learning and learning analytics: a call for action. British Journal of Educational Technology, 46(2), 221-229.


Rhetorical analysis and tutors’ grades

One of my doctoral students, Duygu Simsek (now Duygu Bektik), presented her work at LAK15.

Simsek, Duygu; Sandor, Ágnes; Buckingham Shum, Simon; Ferguson, Rebecca; De Liddo, Anna and Whitelock, Denise (2015). Correlations between automated rhetorical analysis and tutors’ grades on student essays. In: 5th International Learning Analytics & Knowledge Conference (LAK15), 16-20 March 2015, Poughkeepsie, NY, USA, ACM.

When assessing student essays, educators look for the students’ ability to present and pursue well-reasoned and strong arguments. Such scholarly argumentation is often articulated by rhetorical metadiscourse. Educators will necessarily be examining metadiscourse in students’ writing as signals of the intellectual moves that make their reasoning visible. Therefore, students and educators could benefit from powerful automated textual analysis that can detect rhetorical metadiscourse. However, there is a need to validate such technologies in higher education contexts, since they were originally developed for non-educational applications. This paper describes an evaluation study of a particular language analysis tool, the Xerox Incremental Parser (XIP), on undergraduate social science student essays, using the mark awarded as a measure of the quality of the writing. As part of this exploration, the study presented in this paper seeks to assess the quality of the XIP through correlational studies and multiple regression analysis.
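To give a sense of what a correlational study and multiple regression of this kind can look like, here is a minimal sketch in Python. The per-essay counts of rhetorical moves, the column names and the grades are all invented for illustration; this is not the XIP study’s actual data or pipeline.

```python
# Minimal, illustrative sketch: correlating automated rhetorical-move counts
# with tutor grades, then fitting a multiple regression. All data and column
# names are hypothetical, not taken from the XIP study.
import pandas as pd
import statsmodels.api as sm
from scipy.stats import pearsonr

# Hypothetical per-essay counts of rhetorical moves detected by a parser,
# alongside the tutor-awarded mark.
essays = pd.DataFrame({
    "summary_count":  [4, 2, 7, 1, 5, 3, 6, 2],
    "contrast_count": [2, 1, 3, 0, 2, 1, 4, 1],
    "emphasis_count": [5, 3, 6, 2, 4, 3, 7, 2],
    "grade":          [68, 55, 74, 48, 65, 58, 78, 52],
})

# Correlational study: Pearson r between each feature and the grade.
for feature in ["summary_count", "contrast_count", "emphasis_count"]:
    r, p = pearsonr(essays[feature], essays["grade"])
    print(f"{feature}: r = {r:.2f}, p = {p:.3f}")

# Multiple regression: do the rhetorical features jointly predict the grade?
predictors = sm.add_constant(essays[["summary_count", "contrast_count", "emphasis_count"]])
model = sm.OLS(essays["grade"], predictors).fit()
print(model.summary())
```

In a real study the essay set would of course be far larger, and checks on the regression assumptions (collinearity between move counts, residual behaviour) would be needed before interpreting the coefficients.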

Duygu’s slides


Twitter stream

Always good to have a presentation tweeted by your pro vice chancellor :-)


European perspectives on learning analytics

As part of the Learning Analytics Community Exchange (LACE) project’s engagement with LAK15, we brought participants from across Europe together to talk about European perspectives on learning analytics.

Alejandra Martínez Monés from Spain talked about past work carried out as part of the European Kaleidoscope Network of Excellence that has implications for the development of learning analytics internationally. Alan Berg from The Netherlands provided links to a series of initiatives designed to bring researchers and practitioners together across national boundaries. Kairit Tammets introduced learning analytics work in Estonia, and Anne Boyer offered a French perspective. Members of the LACE project talked about their work to pull together research, practice and evidence across Europe.

Ferguson, Rebecca; Cooper, Adam; Drachsler, Hendrik; Kismihók, Gábor; Boyer, Anne; Tammets, Kairit, & Martínez Monés, Alejandra. (2015). Learning Analytics: European Perspectives. Paper presented at LAK15, Poughkeepsie, NY, USA.

Since the emergence of learning analytics in North America, researchers and practitioners have worked to develop an international community. The organization of events such as SoLAR Flares and LASI Locals, as well as the move of LAK in 2013 from North America to Europe, has supported this aim. There are now thriving learning analytics groups in North America, Europe and Australia, with smaller pockets of activity emerging on other continents. Nevertheless, much of the work carried out outside these forums, or published in languages other than English, is still inaccessible to most people in the community. This panel, organized by Europe’s Learning Analytics Community Exchange (LACE) project, brings together researchers from five European countries to examine the field from European perspectives. In doing so, it will identify the benefits and challenges associated with sharing and developing practice across national boundaries.


Mapping mobile learning

Andrew Brasher's Learning Design blog - Thu, 14/05/2015 - 11:46

During January and early February I helped run a field trial of an Android app for the MASELTOV project. The app is known as the MApp (short for MASELTOV app), and the aim of the field trial was to answer some research questions about participants’ use of the app to support their cultural and language learning (there’s a quick overview of the trial and a report that includes some initial data analysis).

We collected a huge amount of data from participants’ phones during the three weeks of the trial. Having done some analysis of participants’ self-reports of what they did, I am now starting to look at the quantitative data collected from their phones. This includes identifying which MApp services were being used, and where and when they were being used.

So I’m starting to think about how best to present this data, using maps and sequences of maps over time. My initial ideas are listed below, with a rough sketch of the first idea after the list…

  1. A map showing each participant’s use of various tools over a day.
    So we will have 17 (participants) x 21 (days) = 357 maps.
    Each map would show use of a variety of services over time and space.
    * Aim: to understand each individual’s overall usage of the MApp
  2. A map showing use of a particular tool by all users over space and time
    E.g. for the forum tool, show where and when all participants are using it over a day (or longer?)
    * Aim: to inform development of particular services through knowledge of their usage patterns
  3. A map to show locations where participants frequently access (particular) services
    * Aim: to compare with interview data, and to test our implicit assumptions
  4. A map to show services which are used when participants are on the move
    i.e. which service is being used, the mode of transport, and the journey undertaken.
    * Aim: to compare with interview data, to test our implicit assumptions, to see if there’s anything surprising occurring.
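Below is a very rough sketch of what the first idea might look like in code: one participant’s use of different MApp services over a single day, plotted as points in space and coloured by service. The file name, column names and values are all my own assumptions for illustration, not the MASELTOV data schema.

```python
# Rough, illustrative sketch of idea 1: one participant's use of different
# MApp services over a single day, plotted as points in space and coloured by
# service. The file name and column names are hypothetical assumptions, not
# the MASELTOV project's actual data schema.
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of the phone logs collected during the trial.
logs = pd.read_csv("mapp_usage_logs.csv", parse_dates=["timestamp"])

participant = "P01"
day = "2015-01-20"
subset = logs[(logs["participant"] == participant)
              & (logs["timestamp"].dt.date == pd.Timestamp(day).date())]

fig, ax = plt.subplots(figsize=(6, 6))
for service, points in subset.groupby("service"):
    # One colour per service, so usage patterns can be compared at a glance.
    ax.scatter(points["longitude"], points["latitude"], label=service, s=20)

ax.set_xlabel("Longitude")
ax.set_ylabel("Latitude")
ax.set_title(f"{participant}: MApp service use on {day}")
ax.legend(title="Service")
plt.savefig(f"{participant}_{day}.png", dpi=150)
```

Repeating this for each participant and each day of the trial would generate the 357 maps mentioned in idea 1; idea 2 would simply swap the grouping, filtering on a single service and colouring by participant instead.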

Heriot Watt Workshop: Thinking about Open

Dr Beck Pitt's blog - Tue, 05/05/2015 - 08:56

Originally posted on OEPScotland:

Yesterday Bea, Martin, Pete and I visited Heriot Watt University for the first of our Thinking About Open workshops. We had a great day with the Heriot Watt team exploring different facets of openness and sharing examples and experiences. A slide deck of activities is available on the OEPS project Slideshare account.

In the morning participants explored the concept of openness and examined different examples of open practice. We also utilised an adapted version of an activity Catherine Cronin had developed, which encourages people to reflect on their own practices and which generated some interesting discussion (thank you, Catherine!). You can see some of the morning’s activity below. During the afternoon we looked at examples of where openness is making a real difference (including OpenStax College textbooks and the DigiLit project in Leicester) before finishing up with a look at Creative Commons licensing. There was a strong interest in…



Examining engagement in MOOCs

My main paper at LAK15 analysed engagement patterns in FutureLearn MOOCs. In it, Doug Clow and I began by carrying out a replication study, building on an earlier study of Coursera MOOCs by Kizilcec and his colleagues. Although our cluster analysis found two clusters that were very similar to those found in the earlier study, our other clusters did not match theirs. The different clusters of learners on the two platforms appeared to relate to the pedagogy (approach to learning and teaching) underlying the courses.

Ferguson, Rebecca, & Clow, Doug. (2015). Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs). Paper presented at LAK15 (16-20 March 2015), Poughkeepsie, NY, USA.

Abstract

Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
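For readers unfamiliar with this style of analysis, here is a minimal, generic sketch of clustering learners by their week-by-week engagement using k-means. The toy data, the coding of engagement and the choice of k are illustrative assumptions only, not the procedure or results reported in the paper.

```python
# Minimal, generic sketch of clustering learners by weekly engagement,
# in the spirit of the study described above. The data, the engagement
# coding and the number of clusters are illustrative assumptions, not the
# paper's actual method or findings.
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical per-learner vectors, one value per course week:
# 0 = no activity, 1 = visited steps, 2 = visited steps and completed assessment.
engagement = np.array([
    [2, 2, 2, 2, 2, 2],   # completes everything, every week
    [2, 2, 1, 0, 0, 0],   # drops out around the middle of the course
    [1, 0, 0, 0, 0, 0],   # samples the first week only
    [2, 1, 2, 1, 2, 2],   # dips in and out but keeps returning
    [0, 0, 1, 2, 2, 2],   # starts late, then engages fully
    [2, 2, 2, 2, 1, 0],   # active throughout but fades at the end
])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(engagement)
print("Cluster assignments:", kmeans.labels_)
print("Cluster centres (mean weekly engagement):")
print(np.round(kmeans.cluster_centers_, 2))
```

In practice the number of clusters is not fixed in advance: solutions for different values of k are compared (for example using silhouette scores), and the resulting clusters are then interpreted against the course design to give them meaningful labels such as those listed above.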

 


Ethics and privacy in learning

Learning together: sculpture on the Marist campus

The 5th international Learning Analytics and Knowledge conference (LAK15) at Marist College in Poughkeepsie NY opened with two days of workshops.

Among these, on 16 March, was one on Ethics and Privacy in Learning, run by the EU project I am working on at the moment, the Learning Analytics Community Exchange (LACE).

Organisers: Hendrik Drachsler, Adam Cooper, Tore Hoel, Rebecca Ferguson, Alan Berg, Maren Scheffel, Gábor Kismihók, Christien Bok and Weiqin Chen.

Drachsler, Hendrik; Cooper, Adam; Hoel, Tore; Ferguson, Rebecca; Berg, Alan; Scheffel, Maren; Kismihók, Gábor; Manderveld, Jocelyn and Chen, Weiqin (2015). Ethical and privacy issues in the application of learning analytics. In: 5th International Learning Analytics & Knowledge Conference (LAK15): Scaling Up: Big Data to Big Impact, 16-20 March 2015, Poughkeepsie, NY, USA.

Workshop outline

We aim to understand ethics and privacy issues in learning analytics with greater clarity, to find ways of overcoming these issues and to research challenges related to ethical and privacy aspects of learning analytics practice. This interactive workshop aims to raise awareness of major ethics and privacy issues. It will also be used to develop practical solutions for learning analytics researchers and practitioners that will enable them to advance the application of learning analytics technologies.


e-Access'15 conference part 2

Dr Nick Freear's blog - Wed, 22/04/2015 - 15:53

Continued from the previous post…

In the second part of the plenary Kathleen Egan, Programmes Manager at Age UK London, presented the valuable work done by the charity on digital inclusion. She discussed a project to use teenagers as volunteer mentors, and referred to the Digital inclusion evidence review, 2013 (PDF).

The final session of the morning was by Paul Smyth, Head of IT Accessibility at Barclays Bank. Paul Smyth and his colleagues have achieved a high level of buy-in and ownership of Web accessibility from the bank's senior management.

Afternoon

I was happy to have lunch with Roger Wilson-Hinds and Natasha Beauharnais. They are both involved with Georgie Phone, a suite of low-cost Android mobile phone apps for the vision-impaired. These look very promising – I must take a look. And, I think I've met Roger at previous events...

The afternoon started for me with setting up for the round-table discussion that I was chairing, titled "OU Media Player: Mainstreaming video accessibility". My colleague, Peter Devine, put a lot of work into an A0 poster, which I hope the participants found useful.

I'm fairly used to giving presentations. However, this was my first time chairing a discussion, and I was feeling nervous. We had a good number of attendees for the discussion – between 9 and 11. People attended from a wide variety of organizations, including the RNIB, the University of Southampton, and the Worshipful Company of Information Technologists.

Here are some of the questions that were asked and discussed:

  • Did I have documentation on how to make a media player accessible? (Answer: not yet)
  • How had we made the player accessible? (Answer: heavily customized MediaElement.js + testing + iterate...)
  • What formats/ file types did the player support? (Answer: generally those formats supported natively by browsers - via HTML 5; and those supported by Flash. So: mp3 audio, mp4, m4v and FLV video)
  • Did it support, eg. YouTube/Vimeo? (Answer: the underlying MediaElement.js framework does support YouTube; OU Media Player doesn't yet – watch this space)
  • Was it going to be open sourced? (Answer: yes)
  • When? (Answer: at the time of the conference I couldn't say. Now I can say, within the next 6 months)
  • What about DRM? (Answer: not considered yet. )

What I learnt about chairing a panel:

  • Have ideas jotted down for things to discuss - if there is a lull in conversation (I found there was a lull, and I wasn't quite prepared for it)
  • Be prepared to drive or guide things;
  • Try to include everyone, not just the most vocal (not sure I managed that);
  • Find a way to take notes;
  • Do a "register" at the start, so that you have everyone's contact details (should be obvious);
  • Go round at the start asking people to introduce themselves, explain what they want from the discussion (I think I did this – not sure how well though);
  • Hand out "feedback questionnaires" (meant to print some, ran out of time);

Possibly useful links:

Thank you to all who attended the discussion. And, thank you to the event organisers, Dan Jellinek and his team.

Go West: #OER15 Calling!

Dr Beck Pitt's blog - Tue, 21/04/2015 - 18:47

Originally posted on :

Next week team OER Hub will be Westward bound as we head to Cardiff, Wales for OER15!

This year’s conference is focused on “Mainstreaming Open Education” and will be held at the Royal Welsh College of Music and Drama on Tuesday 14 and Wednesday 15 April. Keynotes include Cable Green, Sheila MacNeill, Josie Fraser and our very own Martin Weller.

With six conference tracks (including Impact Research, OEP and Policy, and Open Education in Schools and Colleges) and no fewer than seven parallel sessions of workshops and panel presentations happening simultaneously, there’s plenty to check out. Head on over to the conference schedule for an overview and the abstracts for talks.

We’re also excited to be participating in several sessions across the conference:

Tuesday 14 April 

  • Get a double dose of Hub goodness with Bea debuting the latest findings from our impact research PLUS Beck talking about whether openness…



Design for Life

Will Woods's blog - Wed, 08/04/2015 - 09:39

I’ve finally worked some Manics lyrics into my blog post..

Why the title Design for Life? – Well there is a very good reason why I’ve not been blogging these past three months. I’m currently on secondment to the big production engine of the Open University (Learning and Teaching Solutions) and heading up the development of a new area there called TEL (Technology-Enhanced Learning). This new sub-unit is responsible for the aspects of Learning Design as we apply it within the Open University context, so we’re referring to this as “TEL Design”.


Learning Design in its purest form is technology and pedagogy neutral, but within TEL Design we are seeking to use evidence-based approaches to the production and presentation of modules so that they are designed appropriately, considering learning outcomes (LOs), the tuition/support approach, assessment and the overarching student experience. The OU has already been doing this for some time through the Learning Design team in IET, who have been working with module teams on activity planning and module mapping processes to ensure that a sound pedagogic approach is being considered which is appropriate for the level of study and disciplinary context. This work, however, needs to be scaled up, as we have perhaps in the order of 100-150 modules being refreshed every year from a provision of around 600 modules which form the OU curriculum. I’m simplifying what is a very complex activity: working with module teams to turn these sound pedagogic approaches into practical module/course designs suitable for each disciplinary context, which form part of a coherent student journey and consider appropriate use of technologies.

So tackling each area of my role:

Evidence-based Practice

Within TEL we have a group of around 20 very seasoned practitioners in module production or aspects of teaching, either within or outside the OU. This group has excellent experience of what works, what is appropriate for the design of online learning activity, and what enhances the student experience: in other words, lots of empirical knowledge.

(a) We are building a library of examples of best practice

(b) We are establishing, through a survey, the evidence bases that are currently being used within the OU to identify what is “best practice”. (We have huge amounts of evidence to draw upon – see our Learning Analytics colleagues such as Doug Clow for details of that work!)

(c) We are considering where evidence is needed and of what type. For example, in some cases a “deep dive” approach may be more suitable. We are working with colleagues in IET on projects exploring analytics and evidence to support decision making, both for improving design during production and for in-presentation adaptation and improvement.

(d) We are also considering how much to rely on phronesis, or discretionary practitioner judgement. There is a lot of interesting literature in this area; I particularly enjoyed “The McNamara Fallacy and the Problem with Numbers in Education”, an article by Carl Hendrick on the dangers of using data for decision making on very complex models.

Scholarship, Development and Training

I’ve been working on a development plan for “TEL Design” practitioners. I’ve been co-ordinating work on this, looking at job descriptions both internally and externally and mapping the skills and competencies into a framework which also maps to the UK Professional Standards Framework for teaching and supporting learning (UKPSF) from the Higher Education Academy. During March 2015 I released a questionnaire to TEL staff to ask them to rate themselves against areas of this framework. We are currently creating plans to meet these needs through:

(a) Identifying the most urgently needed skills and competencies required by the majority of people and consequently running workshops and training sessions to up-skill our staff; for example, Grainne Conole ran a workshop for us in March on “Strategies for designing and evaluating effective learning activities”.

(b) Exploring what specific skills and competencies are required by individuals and creating personal development plans (PDPs) to meet those needs

(c) Using practice-based approaches to improve competency, for example mentoring, and encouraging staff to engage in the HEA fellowship programme through the OU OpenPAD scheme as a way of encouraging reflection and improving practice.

(d) Developing a scholarly culture within the unit. This includes encouraging TEL staff to be involved in publishing and attending events relevant to their practice, and recognizing and rewarding achievement in areas of specialism and knowledge within the TEL group.

Strategy and Culture

This is by far the biggest challenge as we are having to carve out a shape for the design process within the OU’s current production methodology and management processes.

The good news is that we are doing this at a time when the curriculum systems are being refreshed, when the OU curriculum itself is being refreshed through a curriculum fit for the future programme, and when the Learning and Teaching Vision and Plan 2025 provides us with the imperative to establish an evidence-based design approach within production and presentation. I’m also located within a unit which is currently going through a re-focus process, so the design processes can be considered within an overhaul of the whole production life-cycle to make it more efficient and effective. In order to make this stick we need a cultural change, and that’s perhaps my biggest challenge. OU module production has become very risk-averse and procedural, and people are necessarily used to the safety blanket of knowing what’s coming up six months before they need to start work. Things need to change here, and the ability to be agile and adaptive is increasingly important.

We are doing this successfully in MOOC design, where the timescales are shorter and the methods used are bespoke and usually sit outside the regular process. The challenge now is to make that the norm rather than the exception.

I don’t have all the answers here but I have a number of ideas which I’m currently exploring.  I’m also looking, with my colleagues in TEL and LD, at the activity structure for the TEL Design workshops and I’m considering a model which I want to share for discussion. More on that in my next blog post.


ResearchGate

Download figures from Research Gate

I don’t engage very heavily with either ResearchGate or academia.edu, for several reasons:

(1) Time is limited, and there are only so many networks I can engage with

(2) All my work is available via my institutional repository (ORO) or via this blog

(3) Neither ResearchGate nor academia.edu seems to be particularly open about its business model. How are they making money out of my time and my resources?

I thought for a while that ResearchGate might be making money via targeted job ads, but they’re currently suggesting I might be interested in the post of associate dean for veterinary research at Ross University, Saint Kitts and Neots. As my only qualifications for that job are that I once had a pet cat and I like visiting tropical beaches, I don’t think their targeting algorithms are very sophisticated.

Despite my overall lack of engagement, both sites now know a fair amount about me and my work, and my co-authors often upload papers. This means I sometimes get email updates on my downloads. This week, apparently, my work was downloaded 101 times, with 72 people downloading a technical report on social learning analytics and 16 downloading an editorial that came out this week. I even get a national breakdown of downloads (see pic). In addition to those shown, my work was accessed from Taiwan (3), Italy (3), Canada (2), Finland (2), South Korea (2), New Zealand (2), Indonesia (1), Romania (1) and Ecuador (1). That’s 20 countries this week.

Meanwhile, back at the institutional repository, my work has been downloaded over 1,000 times this month. I’m not sure what to make of this. If these figures are typical (I’ve no idea if they’re high or low), then there is an enormous number of scholars out there doing an enormous amount of reading. It also looks as if the digital divide is growing – I see no African countries at all on that download list, and this reflects my experience at conferences.