
Recent blogs from IET staff

Nothing is deserved, everything is accepted

Professor Martin Weller's Blog - Tue, 03/05/2016 - 17:35

In a recent post I mentioned how I’d been at two conferences and academics had bemoaned the state of the relationship with IT services. At the risk of making academics seem like a bunch of whingers, a second theme occurred (perhaps people just like moaning to me), which was the precariousness of the academic researcher. I write this as a tenured Prof (whatever tenure means now), so it is not self-pity or a self-serving motivation that drives this but concern at the direction universities are hiccuping their way towards.

I’ve become increasingly disturbed by the way universities (in the UK, but I suspect it’s commonplace) treat researchers. For nearly all forms of employment there is the 4-year rule, which states “Any employee on fixed-term contracts for 4 or more years will automatically become a permanent employee, unless the employer can show there is a good business reason not to do so”. Lucky, lucky researchers, however, are exempt from this. In 2008 people were saying the fixed-term contract was a thing of the past, but with austerity, the introduction of fees and general uncertainty in the higher education sector, its use seems to have increased. This is particularly true for researchers who are employed on external funding. Researchers are employed on a specific project, and when that project ends, unless there is another project, their employment is terminated. This may make sense for a big 3-year project, where you don’t want to employ a large team after the funding ends. But many researchers exist on a diet of short- and medium-term projects, hopefully with no gaps in between. My understanding, but I’m no expert in employment law, is that the project manager would have a good case for being made permanent at the end of a 4-year project, whereas the researcher would not. I appreciate project managers and researchers equally, but it seems nonsensical to have a surfeit of permanent project managers and a deficit of full-time researchers.

The Research Concordat proposes that: “Research posts should only be advertised as a fixed-term post where there is a recorded and justifiable reason.” However, finding a ‘justifiable reason’ is not difficult for universities, and the Concordat is not the same as employment law. In 2014, 67% of researchers were on fixed-term contracts and 39% had been at their institution for more than four years, which indicates that since the Concordat’s introduction in 2012 we haven’t really seen a significant reduction in the use of fixed-term contracts.

Effectively universities are exploiting a loophole in employment law to keep researchers on a series of short, fixed-term contracts. I want to argue that this is bad at an individual, institutional and universal level.

For the individual, it is no way to live, being continually only 6 months or so away from being unemployed. Getting a mortgage, deciding to put down roots, and just feeling secure are very difficult in this context. It also means focus on, and loyalty to, any one project or institution is difficult – if you’re sensible you are always looking for the next job.

At an institutional level the short-term approach can be costly. A project ends, you lose the staff, then three months later you get a new project. You then have to recruit new staff, which, with advertising and interviewing, often takes 3-6 months. That’s 3-6 months of your new project lost. It is estimated that it costs £30K to recruit a new member of staff – pretty much a researcher’s salary for a year, when they could have been doing other things for you anyway. It also makes the establishment of a research culture much more difficult: community is a nebulous thing, easily undermined by the loss of two or three key individuals (and full-time researchers are often the ones who give most to the local community, because they are unencumbered by many of the other duties and roles of senior staff).

At the more universal level it is detrimental for research at universities as a whole. This lack of readily available research staff makes universities less agile and flexible, since everyone is either fully committed to an existing project or new staff must be recruited, with the difficulties described above. If you have a one-year project, you don’t want to lose 3 months of it recruiting staff. Increasingly we are seeing independent researchers or small research companies offering services. As more research involves using IT rather than expensive equipment, it can be done by a few people working at home. Without the need for the large overheads of universities, they can be cheaper, and offer researchers better contracts and pay. Apart from the heavy-duty STEM projects, research then becomes outsourced from the university, or the university is simply bypassed. This would be a shame: research is an integral part of the university identity, and is often allied with teaching. You want your best teachers and researchers in the same space. But the short-term gains universities are opting for with fixed-term contracts undermine their longer-term viability.

My feeling is that this has become habit and confused with employment law and best practice. It is possible to make the situation better for individuals, institutions and the overall research environment, but it requires some effort to address it. Now is the time, before it becomes too embedded and the damage at all levels too substantial.

The title comes from Martin Amis’s essay on Kafka. As internet kids like to say, I’ll just leave this here: “He deals in savage inequities that are never resented, pitiful recompenses that are tearfully cherished.”


10 years of Edtechie – the imposter gang

Professor Martin Weller's Blog - Mon, 02/05/2016 - 11:11

Today marks ten years of blogging here at edtechie. I had started a few blogs before, but this was the time I stayed with it. That ten years later I’m still doing an activity which is not part of my formal job description, is not recognised and is usually undertaken in my own time is a testament to the power of blogging in itself. But I’m not going to make this a ’10 reasons why you should blog’ post. I was struck by a comment Sava Singh made in her presentation at OER16, when she said that even complaining about how Twitter used to be better in the old days is a sign of privilege. She’s right, we old-timers have a temporal privilege – anyone coming into blogging now is starting out in a very different context. I recall Pink Floyd saying that they were lucky that when they started there weren’t many bands around, so they were given time, and people who might not otherwise have listened to them came to them for want of anything else. This is a very different scenario for an artist now, who must compete within the deluge of daily releases. And so with blogging: I had the good fortune to be able to build up a reputation when there wasn’t much around; it would be a very different story now. So, I’m aware that my story is not necessarily applicable now, but it’s the only story I have. So apologies in advance, this is a self-indulgent post.

As I’ve considered writing this post over the past week or so, I’ve reflected on why I personally like blogging. I don’t mean all the reasons we often give people, such as establishing an identity, increasing dissemination, keeping a record of your process – all those are valid extrinsic motivators – but what is it about blogging that appeals to me? I came to the conclusion that blogging was where I felt I really belonged. I had found my academic tribe.

People talk a lot about imposter syndrome now. Again, I appreciate I have a set of privileges which mean it is only a fraction of what others may feel (white, European, male), so please interpret this in light of how it shaped my blogging reaction only: I was comprehensive-school educated, working class, first generation at university. I was educated at a range of polytechnics, which post-1992 became new universities: Hatfield, Kingston, Teesside. All good places, but not the key to a network of influence. I didn’t feel any sense of being an outsider whilst studying, because fellow students at Polys tended to be similar to me in upbringing. I definitely did feel it when I started working at the OU. I remember my first coffee break after joining in 1995 – everyone was Oxbridge educated, older than me and generally middle class. One colleague recounted a story of how Edith Wharton had bought him a train set when he was young. Yeah, we’ve all got stories about our family’s friendship with a famous author, haven’t we? I felt like (and probably was) a yob. For the first year at least I expected someone to tap me on the shoulder and say, “sorry, we made a mistake”.

But everyone was friendly and supportive, and such feelings subsided. It was with the advent of the web, and encouraged by my OU colleagues John Naughton and Tony Hirst (probably both outsiders also), that I took up blogging. At the time blogging amongst academics was still relatively rare. I used to tell people excitedly “I have a blog”. Now that would be akin to saying “I have a microwave” – not guaranteed but not worthy of comment. Blogs were like little beacons shining across the globe that would splutter into life and look for fellow signals to respond to. I fell in with the North American and UK ed tech blogging crowd. And this is why I think blogging resonates with me – I generally like bloggers. I don’t like all bloggers and I don’t dislike non-bloggers, but there is something about the approach to blogging – the informal use of language, the sense of fun, the support, willingness to try new things and the personal, social element – that appeals to me. I think nearly all of the bloggers who influenced me (George Siemens, Stephen Downes, Audrey Watters, Jim Groom, Bon Stewart, Alan Levine, Scott Leslie, Josie Fraser to name just a very few) were outsiders to formal academia to an extent. Indeed I think you had to be an outsider in those early days to get blogging. That’s probably why there was an inverse relationship between online and academic reputation. Blogging was the refuge of the outsider. This is less true now that it is an accepted part of a communications strategy and you can take courses on being an effective blogger. It is now more professionalised, but I still think it represents a more democratised, open space than formal academia, and I still make new connections with people here. As a tenured Prof at a big university I can’t really claim outsider status any more, I’m one of ‘them’ now. But blogging was where I found an authentic voice and I still cherish that. Bloggers are still my kind of people.

I don’t know what its role really is in relation to my ‘proper’ work, but I’m okay with that now. When I retire I expect that the three people who turn up to my retirement party (under duress) may point to formal publications as an indication of my work, but I can think of no higher honour than if they declared “he was an alright blogger”.


IT services – we need to talk

Professor Martin Weller's Blog - Mon, 25/04/2016 - 17:33

I was at two conferences recently (OEGlobal and OER16). At both of them I ended up in a (different) group bemoaning the IT services in their university. I didn’t initiate either of these conversations I should add. Also, please do not interpret this post as having a pop at people in IT services, I know lots of good people there. Rather it is about how universities have created the environment where academics and IT are now in a rather dysfunctional relationship. Across many universities the complaints seemed to be rather similar:

  • Security is used in rather the same way governments use terrorism – as a means of controlling things and removing freedoms
  • Increasingly academics have no control over their machines, and cannot install or trial new software
  • Even basic tasks are often highly frustrating and time consuming
  • Support has been centralised so there is no local advice or help
  • Senior IT managers have been brought in from other sectors with little understanding of the university culture
  • Increasingly academics are circumventing official systems to buy their own machines, or host their own services, often in their own time and at their own expense
  • There is little room for experimenting with tools beyond the VLE

Listening to these complaints (and occasional horror stories) made me rather wistful. As IT has become increasingly part of the central operation of every university’s teaching and research environment, it seems that it has moved further away from the people who actually need it for those functions. It has become a thing in itself, and the academics (and students) merely an inconvenience to its smooth operation. This is not to blame those in IT services: they are operating in the context that universities have established for them. If there is a security breach, it will be the IT manager who is in trouble, not the academic who wanted to play around with a cool new tool. It must be frustrating for lots of people in IT too; I’m sure they’d like to be experimenting with these tools as well.

We have to get back to having dialogue, and having IT people who understand the needs of universities (and equally academics who understand the demands of IT systems). The need for innovation in universities is often trumpeted, but it doesn’t arise from stony soil, but rather from the stinky, messy fertiliser of failed attempts with less than perfect ideas and tools. Innovation is not necessarily synonymous with digital technology, but often it is deeply associated with it. If you don’t have freedom to explore this stuff then increasingly universities will struggle to compete with ed tech companies who have more flexibility and freedom.


Innovating Pedagogy: 弗格森

Innovating Pedagogy – now available in Chinese.

I now know that my surname translates as 弗格森

Plan Ceibal

The main reason for my visit to Uruguay was to attend the First International Workshop on New Metrics for Evaluation: Towards Innovation in Learning. This event was organised by the Centre for Research at the Ceibal Foundation in collaboration with INEEd, the ICT4V centre and the education division of the Inter-American Development Bank.

The workshop had four objectives, which the organisers framed as:

1. Using data for research and evaluation: towards an open and collaborative process for analysis, research and improving education.
2. Presenting experiences in the use of information systems for improving learning outcomes.
3. Presenting innovative approaches for evaluation and assessment of learning outcomes.
4. Policies, projects and programs for technology integration and data use in education.

It was a fascinating event, with representatives from countries across South and Central America, including speakers from Brazil, Chile, Colombia, Ecuador, Mexico, Nicaragua and Uruguay. Other speakers from outside the continent were Dragan Gasevic from Edinburgh, Neil Selwyn from Monash in Australia and Gilles Dowek from France.

I was particularly interested to find that Uruguay runs a ‘One Laptop per Child’ programme based on premises of equality and justice. Uruguay sees access to computers and the Internet as a right. You should have them in your classroom, just as you should have electricity in your classroom. Plan Ceibal has supplied 600,000 people (a fifth of the population) with laptops or tablets. Every child gets one when they start school, and they get a replacement every three years, with secondary school children now receiving Chromebooks. Internet is available nationwide – no one should be more than 400 metres from the Internet. There is a maintenance programme and a disposal programme, a teacher training programme, a learning management system, a suite of software, and a programme of video-conferenced English lessons, arranged in conjunction with the British Council.

I was also interested in Neil Selwyn’s talk, focusing on analytics and Big Data from a sociological perspective. He posed six questions:

  1. What are the potential gains, and what are the potential losses?
  2. What are the unintended consequences or second-order effects?
  3. What underlying values and agendas are implicit?
  4. In whose interests is this working? Who benefits, and in what ways?
  5. What are the social problems that data is being presented as a solution to?
  6. How responsive to a ‘data fix’ are these problems likely to be?

These wider questions of politics and power have not yet been taken up to any extent by the learning analytics community, but they look set to be bigger issues as the field matures.

My talk was on learning analytics, the state of the art and what the future might look like.

I also took part in a round-table discussion with Neil, Gilles and Dragan on issues related to learning analytics.

The back channel – mostly in Spanish – used the hashtag


Universidad ORT

During a visit to Uruguay, I was lucky enough to be invited to visit the Institute of Education at the ORT University in Montevideo. There, I gave a presentation to faculty members and postgraduate students on Innovating Pedagogy.

Innovating pedagogy within the OU

For the past four years, The Open University has produced an Innovating Pedagogy report annually. This series explores new forms of teaching, learning and assessment for an interactive world, to guide educators in productive innovation. As one of the report authors, I presented a quality enhancement lunchtime seminar on 23 March 2016 (part of the QELS series). In the seminar, I introduced the themes that have emerged from this series of reports – scale, connectivity, reflection, extension, embodiment and personalisation – and how these connect with modules (courses) run by the OU. The seminar included examples of innovative pedagogies in use at the OU, and identified others that could be used in future.

Learning analytics expert workshop: Amsterdam

On 15-16 March 2016, I co-ordinated a Learning Analytics Expert Workshop in Amsterdam, jointly run by the LAEP project and the Learning Analytics Community Exchange (LACE).

Fifty people attended the workshop, including invited experts (expert presentations), representatives of current European-funded projects in the field of learning analytics (project presentations), and representatives of the European Commission.

The workshop dealt with the current state of the art in learning analytics, the prospects for the implementation of learning analytics in the next decade, and the potential for European policy to guide and support the take-up and adaptation of learning analytics to enhance education.

The workshop began with a review of current learning analytics work by participants and went on to consider how learning analytics work can be taken forward in Europe (presentation on the LAEP project).

Participants at the workshop identified immediate issues for learning analytics in Europe. They set out considerations to be taken into account when developing learning analytics, made recommendations for learning analytics work in Europe and then identified both short- and long-term policy priorities in the area.

Immediate issues for LA in Europe

Framework for development: A European roadmap for learning analytics development would help us to build and develop a set of interoperable learning analytics tools that are tailored for the needs of Europe and that have been shown to work in practice.

Stakeholder involvement: There is a need to bring different people and stakeholders on board by reaching out to groups including teachers, students, staff, employers and parents. Our current engagement with stakeholders is too limited.

Data protection and surveillance: As legislation changes and individuals become more aware of data use, institutions need to understand their responsibilities and obligations with regard to data privacy and data protection.

Empirical evidence and quality assurance: More empirical evidence is needed about the effects of learning analytics, in order to support a process of quality assurance.

Considerations for the development of LA
  1. Learning analytics can change or reinforce the status quo
  2. Learning analytics should enhance teaching, not replace it
  3. It is our duty to act upon the data we possess
  4. Desirable learning outcomes must be identified
  5. Be clear why we are collecting and analysing data
  6. Bring the data back to the learner
  7. Intelligent systems need human and cultural awareness
  8. Impressive data are not enough
Recommendations for LA work in Europe
  1. Undertake qualitative studies to understand how learning analytics can be aligned with the perceived purpose of education in different contexts, and which aspects of different educational contexts will support or constrain the use of learning analytics.
  2. Publicise existing evaluation frameworks for learning analytics and develop case studies that can be used to enrich and refine these frameworks
  3. Develop forms of quality assurance for learning analytics tools and for the evidence that is shared about these tools.
  4. Identify the limitations of different datasets and analytics and share this information clearly with end users.
  5. Explore ways of combining different datasets to increase the value of learning analytics for learners and teachers.
  6. Extend to different sectors of education the work currently being carried out in the higher education sector to identify the different elements that need to be taken into account when deploying learning analytics.
  7. Develop analytics, and uses for analytics, that delight and empower users.
Short-term policy priorities

Workshop discussion

Innovative pedagogy: Top priority is the need for novel, innovative pedagogy that drives innovation and the use of data to solve practical problems.

Evidence hub: Second priority is to secure continuing funding for a site that brings together evidence of what works and what does not in the field of learning analytics.

Data privacy: Participants considered that a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society.

Orchestration of grants: The European grants system could better support the development of learning analytics if grants were orchestrated around an agreed reference model.

Crowd-sourced funding support: Set up a system for crowd-sourcing funding of tools teachers need, with EU top-up funding available for successful candidates.

21st-century skills: Focus on developing learning analytics for important skills and competencies that are difficult to measure, particularly 21st-century skills.

Open access standards: Standards need to be put into practice for analytics across Europe, with an open access forum that will enable the creation of standards from practice.

Ambassadors: We need more outreach, with ministries and politicians spreading the word and encouraging local communities and schools to engage.

Long-term policy priorities

Teacher education: Top priority in the longer term was for media competencies and learning analytics knowledge to be built into training for both new and existing teachers.

Decide which problems we want to solve: In order to develop the field of learning analytics we need to have collective discussions on the directions in which we want to go.

Facilitate data amalgamation: More consideration is needed of how to combine data sources to provide multi-faceted insights into the problems we seek to solve.

Identify success cases and methodologies that give us a solid foundation: We need a coordinated approach to quality assurance and to the identification of successful work.

Several accounts of the workshop are available online, dealing with the morning of day one, the afternoon of day one, day one as a whole, the morning of day two, the afternoon of day two and day two as a whole.

Artificial Perception

Will Woods's blog - Fri, 15/04/2016 - 11:21

I’ve been listening to educational technology hype recently with an eyebrow raised, particularly in respect of the ideas being expressed around artificial intelligence and the role of intelligent agents in replacing humans. One of the most recent examples of this is Mark Zuckerberg saying at the F8 conference, “Our goal with AI is to build systems that are better than people at perception.” The Telegraph provides a summary of his keynote and the F8 conference.

Sit back and reflect on his statement for a moment.

perception /pəˈsɛpʃ(ə)n/ noun
  1.   the ability to see, hear, or become aware of something through the senses. “the normal limits to human perception”
  2.   the way in which something is regarded, understood, or interpreted. “Hollywood’s perception of the tastes of the American public”

What is perception? A personal view of the world, shaped by our emotional state and environment? An entirely subjective reality? And what do we mean by better perception? Is it seeing the world logically, without the trappings of emotion? Is it about the ‘wisdom of crowds’? If it’s the latter, then we know this is gradually being debunked: we are seeing greater confirmation bias within social media circles (I referred to this in a previous post as ripples in the pond), and there is evidence of the undermining effect of social influence. However, there is no doubt that artificial intelligence will have access to a greater dataset and will have the ability to interpret data in ways that would be impossible for humans. My question, though, is whether that will translate into better outcomes.
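That ‘undermining effect of social influence’ can be sketched with a toy simulation (a hypothetical model for illustration only, not the method of any study referred to here): independent guesses average out close to the truth, while letting each guesser drift towards the group consensus collapses the diversity of opinion without making the collective estimate any better.

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 100.0

# Each agent starts with an independent, noisy estimate of the true value.
agents = [random.gauss(TRUE_VALUE, 20.0) for _ in range(500)]

independent_error = abs(statistics.mean(agents) - TRUE_VALUE)
independent_spread = statistics.stdev(agents)

# Social influence: each round, every agent moves part-way towards the
# group mean (a crude model of seeing everyone else's answers).
estimates = agents[:]
for _ in range(10):
    group_mean = statistics.mean(estimates)
    estimates = [e + 0.3 * (group_mean - e) for e in estimates]

influenced_error = abs(statistics.mean(estimates) - TRUE_VALUE)
influenced_spread = statistics.stdev(estimates)

# Diversity of opinion collapses under influence...
print(f"spread: {independent_spread:.1f} -> {influenced_spread:.1f}")
# ...while the error of the collective estimate barely changes,
# because pulling towards the mean leaves the mean itself untouched.
print(f"error:  {independent_error:.2f} vs {influenced_error:.2f}")
```

The crowd ends up looking far more confident (a much narrower spread of answers) while being no more accurate – which is roughly the confirmation-bias worry raised above.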

Invention comes from creative friction, discourse, questioning. In a world where we are all synthesized down within a crucible above the flame of artificial intelligence, what happens to inspiration, interpretation, challenge? This is of course a dystopian future that people in the AI world are keen to promote, because it creates a big dream of the future and a strong emotional connection.

But we do need to be concerned, because at a minimum a possible future predicted by Gartner may see smart machines replacing millions of humans. At the same time we should be rational: we must recognize the myths around AI, and that its usefulness lies in supporting human endeavours, especially around tackling big data challenges.

…so what of humanity?



Effective social teaching and learning

Will Woods's blog - Fri, 15/04/2016 - 08:24

Eric and I introduce the group to our social media session (that’s me on the right). Image courtesy of Ian Roddis.

Several months of planning, and a few nights waking up in a sweat, have led to a successful one-day social media event, which I co-chaired with colleagues from Learning and Teaching Solutions (LTS), on “Embedding Social Media to effectively support OU learners”.

There were two reasons it took so long to arrange:

  1. I wanted to introduce external perspectives to the topic to refresh our thinking. To this end my fantastic co-chair Beccy Dresden got in touch with Eric Stoller and we brought him to work with us. You’ll get a sense of his work from his blog. The thing I most like about Eric is his passion and enthusiasm for effective knowledge of, and use of, social media (more on that later).
  2. I wanted to tackle this problem at three levels in order to get actionable outputs, from both a top-down and a bottom-up perspective; by that I mean (i) the Vice Chancellor, (ii) the people at Director/AD level responsible for learning & teaching, communications and marketing, and (iii) the people who work directly in support of academic practice around module production and presentation.

I structured the day to begin with a conversation with the Vice Chancellor about the Open University and its use of social media for a variety of strategic purposes. We then held a wider conversation, which I chaired, with a group of senior OU staff from both academic and non-academic areas on “Embedding social media to effectively support OU learners”, facilitated by Eric Stoller. In the afternoon Beccy chaired sessions with academic support staff, which began with a keynote by Eric, followed by parallel sessions on “Social media for professional development” with Eric and Lawrie Phipps (JISC) and “Exploring the possibilities for social media within distance learning material” hosted by Beccy Dresden and Steve Parkinson from Learning and Teaching Solutions (OU), and concluded with a plenary/roundup.

I began the morning session by introducing four provocations:

Provocation 1 – “Do we need a social media strategy for learning?”
Provocation 2 -“How and when do we embed social media practice within our modules and across the curriculum?”
Provocation 3 – “What can we learn from others?”
Provocation 4 – “Can we use social media to bridge the informal/formal divide?”

Embedding social media to effectively support OU learning with Eric Stoller from Open University

We then had an introductory chat about our different perspectives on social media, and Eric followed this up by giving a talk which went into more detail, starting with: why does social media matter?


We kept the presentations short to allow plenty of time for discussion, and the session had a lot of stimulating and interesting perspectives, thanks in large part to Eric’s facilitation. Eric asked me before the session what type of conversation to expect: “..sometimes it’s a conversation about org culture and daring to dream/experiment that is needed…sometimes it’s more about choosing which tools are relevant right now and how to apply them in strategic / worthwhile ways.” I said that it was a bit of both, and that turned out to be the case. Eric was also interested in the variety of perspectives and knowledge; for example, some people in the room, such as Ian Fribbance, have used social media effectively in their practice for some time. The OU has some examples of great use of social media within pockets of the curriculum, but there are also pockets of skepticism around social media, particularly about its relevance within formal learning and teaching. In fact one person at the meeting had never used social media and didn’t want to try it, to which Eric exclaimed “This is 2016! I’ll not force you to use social media, but we will talk later!”. The OU is also a place where practice is diverse and where OU academics don’t necessarily engage directly with students – that aspect is managed through tutors (or ALs) – so there can be a disconnect.

Here are my key takeaways from the session:

  1. We aren’t using social media consistently and effectively to support and facilitate our discourse within the Open University, and that has consequences for our engagement with our learners and more widely within our teaching communities.
  2. Things are improving. Uses of social media that have in the past been treated as ‘renegade’ are now being seen as exemplars of good practice, which is encouraging – e.g. the use of Facebook within Social Science to support 26,000 learners.
  3. It sounds like assessment may be the key to unlocking a cultural shift towards using social media more effectively – that, and a push from certain individuals at senior level (a point echoed by Eric).
  4. We don’t need a formal strategy (considered to be constricting). LTS have already created a “manifesto” as a grass-roots approach, so what the group thought would be most valuable is an enabling framework within which people could, optionally, experiment with social media within their own contexts.
  5. We need to ensure that academic staff are developed and supported to be digital scholars, which includes using social media effectively, so we see a need to build this into the “academic excellence” objective that is currently being formulated.
  6. We need to ensure that we consider appropriate platforms and risks when using social media so we see a need to build these elements into the “leadership in digital innovation” objective that is currently being formulated.
  7. We need to provide greater support for ‘grass roots’ initiatives and to remove barriers to adoption. This includes advocacy at senior level, but also enabling joined-up thinking and grass-roots initiatives such as the special interest group for social media.
  8. We need to continue to engage with external perspectives to help us to see how we compare, and to ensure that we are leading the way around social learning.

Eric is reporting his thoughts back to the Vice Chancellor, and we are now exploring how we can work with the Pro-Vice Chancellor (Learning and Teaching Innovation), the Pro-Vice Chancellor (Research and Academic Strategy) and the Head of Digital Engagement in particular to form an action plan to take this work forward – with thanks to Simon Horrocks, Beccy Dresden and the LTS team in particular, who are supporting this work and considering the next steps.

Watch this space.

Should bid proposals be open access?

Professor Martin Weller's Blog - Fri, 08/04/2016 - 08:57

I was at a UNESCO OER meeting in Paris last week (impersonating an important person) and a topic that came up a couple of times was the waste of resource that we just accept. Someone highlighted all the EU funded projects which are difficult to search, or find outputs for. They were from an AI, machine learning background so they wanted access to this to discern patterns and create links between projects.

In The Battle for Open I talk about how much effort is wasted in the current bid-writing process:

Some of the inherent waste in current practice often goes unnoticed, because it is accepted practice that academics have been enculturated into. For example, some researchers can spend considerable time, months even, developing research bids to submit to funders. Stevenson (2013) calculated 3 months for a proposal, but the Research Councils UK found that 12 days for a conventional proposal was the average (RCUK 2006). The success rates of bids are decreasing as it becomes more competitive; for instance, the ESRC state that only 17% of bids were successful in 2009–10 (ESRC 2010). If a bid is unsuccessful then sometimes it will be modified and submitted elsewhere, but often it is simply abandoned and the researcher moves on to the next one. That equates to a lot of lost time and knowledge. The RCUK report in 2006 estimated that £196 million was spent on applications to the eight UK research councils, most of which was staff time. The number of applications increases every ­year – ­there were 2,800 bids submitted to ESRC in 2009–10, an increase in 33% from 2005–6, so this figure is likely to have increased significantly. Some of these 2,800 proposals were studentships, which have a higher success rate, but even taking an optimistic figure of 800 bids accepted to account for studentships, this still leaves 2,000 failed bids. If we take RCUK’s figure of 12 days as an average per bid, then this equates to 65 years of effort, and this is just one of several major research councils in the UK and Europe to whom researchers will be bidding. Obviously this is just an indicative figure, and there are many assumptions in its calculation that one could challenge, but nevertheless, the nature of research as it is currently conceived has a lot of waste assumed within it. This is not to suggest that the ­peer-­review process is not valid, but that the failure to capitalise on rejected bids represents a substantial waste of resources. 
As with open source software and OER approaches to teaching, open approaches to research may provide a more efficient method.
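The quoted estimate can be checked in a few lines. As a rough replication of the arithmetic – using only the post’s own indicative figures, which it acknowledges one could challenge – a minimal sketch:

```python
# Rough check of the bid-waste arithmetic quoted above, using the
# post's own indicative figures (ESRC 2009-10, RCUK 2006).
bids_submitted = 2800   # bids submitted to ESRC in 2009-10
bids_accepted = 800     # optimistic figure, allowing for studentships
days_per_bid = 12       # RCUK average effort for a conventional proposal

failed_bids = bids_submitted - bids_accepted   # 2,000 failed bids
wasted_days = failed_bids * days_per_bid       # 24,000 days of effort
wasted_years = wasted_days // 365              # ~65 years

print(f"{failed_bids} failed bids, roughly {wasted_years} years of effort")
```

Those 24,000 days work out to roughly 65 person-years, which is where the figure in the passage comes from.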

That was 65 years of wasted academic effort for just one research council in one country, and many of these bids are never revisited. That is a very inefficient way to operate. While research bodies have tackled some aspects of openness – for example, mandating that publications are open access, and providing searchable databases of funded projects (e.g. the ESRC one) – they don’t tackle this waste problem. The simple solution is to make all bids openly available too (I’m not aware of a funder who does this, but please let me know if there is one). Maybe not all aspects: individuals and institutions may want to keep salary costs or overheads private, but the main idea and methodology could be made available. Others could then build on these, as well as allowing the type of meta-interpretation my friend at UNESCO was interested in.

But this probably wouldn’t be easy to realise, and it really gets at the difference between an open culture and a more circumspect one. The research system overall may benefit, but there would be risks to individuals. For example, research teams in more expensive countries may never get funded, because funders would know that if a proposal is good, someone else could take it and adapt it for half the price. Would people be cautious about what they shared in research bids? People do alter and resubmit bids, so would openness undermine that practice?

There would be some adjustment required, but if we’re using CC-BY (maybe even CC-NC) then the original party would be credited. The point of research is often not just that you have the idea, but that you have the ability and expertise to conduct it, so it wouldn’t simply be a case of the lowest bidder. This would be a more radical step towards an open research culture. Part of me is just sad at all those very good research proposals that never see the light of day.


Types of OER user

Professor Martin Weller's Blog - Thu, 07/04/2016 - 08:45

For the GO-GN we are relaunching our webinar series. These will be the first Wednesday of every month, 4pm UK time. They are aimed at anyone with an interest in OER research, and will feature external guests, GO-GN students talking about their work and also research advice sessions. So, put a reminder in your calendar, details will appear on the GO-GN website.

I did the first of the new series, using it as an excuse to trial my talk for OEGlobal and OER16. It looked at types of OER user, based on the findings of the OER Research Hub. With the OER movement being 15 years old now (depending on when you date its inception), I’m interested in the strategies for engagement with OER. In the talk I propose three types of user:

  • OER Active – these generally know what you mean if you use the term ‘OER’. They are engaged, have knowledge of licences and act as advocates. An example might be a community college teacher who adopts an open textbook and becomes an OER champion.
  • OER as facilitator – these are people who want to achieve a particular goal, and are only interested in OER in as much as it allows them to realise that goal. This might be flipping a classroom, saving students money or increasing retention.
  • OER consumer – this group just want high-quality resources and will use OER amongst a mix of other media. They don’t really care about licences, but they do care about good, easy-to-use material. An example might be a learner considering entering formal education and seeing if the subject is for them.

If these groups have any validity, then they have implications for OER strategy. I would suggest that thus far most of the attention has been focused on the OER active group. This has been a successful strategy, but there may be limits. You can’t make everyone an OER convert. To reach the other groups different (but complementary) approaches are required.

For instance, the OER as facilitator group want packaged solutions. It may be that we can identify five or so key aims here, e.g. teachers who want to flip their classrooms, those who want to create distance education-style all-inclusive courses, particular subject areas, etc. For these, a packaged OER-based solution can be created so they can more readily achieve their goal. This is the type of activity that commercial providers offer: they know that teachers are busy people, and offering convenience is a key benefit. For the OER consumer there is a need to improve the overall OER brand. Usually OER project funds are spent on producing good-quality material, but we don’t have a very good cross-OER brand, so maybe there is a need to bring in marketing, SEO and promotion expertise, so OER can compete with publishers who have whole departments dedicated to this.

The replay of the presentation is here.


The blog has now moved to

Dr Gill Kirkup's blog - Thu, 17/03/2016 - 12:42

Gynoid Times has now moved to:


What are the research questions for OER?

Professor Martin Weller's Blog - Thu, 17/03/2016 - 09:16

When we developed the OER Research Hub project with Hewlett, we came up with 11 hypotheses that they and we felt represented questions that it would be useful to find answers to. Some worked better than others to be honest, but it was a good way to shape the research of that project. We got the questions largely right I think, and this led to more people wanting to collaborate with us.

But it was still very much our interpretation of what was significant, and this was back in 2011. A lot has changed in the OER world since then – we’ve had MOOCs, open textbook projects are getting solid results, we’ve seen the demise of JORUM in the UK, lots of new players have entered the arena, etc. So it would be a good time to revisit the key research questions for the OER community. This isn’t for any project we are running, so it’s not “what should the OER Hub research?” but, more widely, what does the community as a whole feel are the research questions that should be addressed? For the OER Research Hub there was a focus on trying to establish evidence for what were perceived as long-held beliefs about OER. It may be more targeted now: for example, if it could be shown that OER have an impact on a very specific aspect of education (such as retention), that would be a key piece for influencing decisions.

To this end, we’re running a couple of workshops at OEGlobal and OER16 to explore the research questions for the OER community. I’m sure you have an opinion regarding key research questions, so please complete this mega-short form to let us know. And if you’re at either of those conferences please come along, but if not, we’ll run some online discussion also.


LAEP2016: Weds pm

Dr Doug Clow's blog - Wed, 16/03/2016 - 15:36

Liveblog notes from Wednesday afternoon, 16 March 2016, at a two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam, organised by the LAEP project and the LACE project.

Foresight exercise

Small group: Learner profile 4 – Marie Martin; Educator profile 4 – Remy Depardieu.

Concern for Marie about coping with the course. If LA could provide reassurance, based on previous students with her background and how they coped, that would be useful. Maybe recommend crash course, update on technology. Reputation and learning about interaction, but managing those courses requires digital courses in the first place. Also social media skills to manage the reputation. Maybe invest in software that would help her do the course but would hide her identity. Like Tor for biometric tracking. Black market eyeballs! A LA blocker. How can she get good analytics? Blocks her identity, but not her activity. Having other people doing it for you?! We get quite dark thinking about selling her eyeballs to other people. She wants to be able to take the course but not be identified. Identity assurance, to avoid anonymous bullying. You need policy measures, to prevent sale of valuable identity information. She should be able to pay extra to not have her data sold on. Policymakers perspective, we should have a strong framework. Verifying your identity makes sense, to prevent anonymous bullying and cheating, but it doesn’t have to be public. Maybe they can’t make it public unless it’s authorised. What are the main barriers? This biometric registration, for a leisure course – what if she wanted to take a course about her sexuality. It’s different if you are going to be certified or increase your pay grade.

It’s about privacy, and consent. This is about data. The other parts are easy – she can build up skills through a series of Khan Academy courses, or something. But attitudes towards sharing course participation? If you’re going to take a course like this, would I want a group course, or a personal? You want it just in time, just enough, personal. With a group, you have to take them in to account. She wants a one-to-one with Remy. There seems to be a mismatch between what she wants as a learner and what’s available from the school/commercial system. They’re not being customer-friendly. Maybe they have a business model to sell the information on to others, e.g. advertisers. Examples of very intimate tracking. Posture prompts, all sorts of crazy smart gadgets.

Policy changes? Register privately. Consent to it. As a consumer, if she doesn’t like the implications, she should just not register. There’s bound to be a company that will let you register privately.

Plenary feedback

Learner profile 3: Thomas Müller

It’s not good for Thomas.

Disruption for the household is coming through gaming. Stealing resources from the boy next door.

The scenario of unemployment for drivers is not far-fetched, with self-driving vehicles.

Here we don’t see data sources for LA. His career, market-wise, interesting to know whether to go in to this new market, but is it too late? An average person has 7.3 jobs over their lives, likely more in future. Draw on the idea of giving the data back to the learner, career advice. There’s a trajectory. Big data could match and gather data around similar people with this profile, and the decisions those people had made, how that impacts on their choices and outcomes. Give them wise advice. Maybe getting in to an industry when it’s already peaking may not be a smart move. In Maastricht, the mines closed, the Government made investment for a university.

I love this example best! He wants to become a beer brewer, look at LEA’s Box Facebook page, I wrote some articles about LA for guitar players, chefs, painters. What can tech do to support him. Statistical data are so far away from things this guy wants to learn. It’s about taste, fine skills to make a good beer. Also about C21st skills like making good PR to sell your product. The microbrewery scene is all about your commercial appearance. We have a mixture of skills, competences not necessarily related to your subject matter, and extremely difficult to be treated by statistical information and data, like taste. This is a great example of how to support people who want to become beer brewers, wine makers. In Austria, we have schools that teach people how to make good wine. We can learn a lot from that approach. Tricky thing for next couple of years is bringing big data together with creativity, C21st skills, overarching need to be successful in this world. Empowering learners to be successful in our society.

Job market intelligence, disconnection between education and the job market. Redesigning the curriculum is the area to explore here. Hinting at some automated system to suggest career to follow. How do we avoid the SatNav problem – road is blocked, everyone go that way, but it’s blocked. Add some jitter to the advice. Also have to understand what the job is. Universities, there is no limit on university staff. Beer brewers there’s a finite limit. Link to existing quality frameworks – vocational training quality frameworks, a reference we can use and this should be linked to, instead of reinventing it. Who is not on a standard career track? [a handful – suspect most people didn’t hear] Outcome analysis linked to the job market, you’ll always suggest the paths lots of people have done. But most interesting are people who’ve followed a different path.

Teacher profile 3: Maria Koch

Working on training for midwives.

Two aspects. Using LA to look at background knowledge, aptitude. Which 10% are likely to have an aptitude for it. An approach that might be feasible, not sure it’s the right thing to do, because she doesn’t have the expert knowledge, the idea is to use monitoring and video capture of expert midwives in their practice. Matching learner behaviour to expert behaviour is a role for LA. Barriers, lack of expert knowledge is one. How to conceptualise the role of different kinds of education and training. Engaging taxi drivers, maybe if they need a job. Privacy is an aspect, medical practice in real life, how to manage that? VR may be a way, simulate. Risk of lawsuits with intensive monitoring, used as evidence against them if things go wrong. Policy changes, some consideration about policies to allow or require monitoring to go ahead and be appropriately QAed. Maybe indemnify against lawsuits from the capture process.

Similar feedback from another group. Particular aspect around quality assurance. Course development, short time, lack of domain expertise.

Learner profile 4: Marie Martin

Post-human impressionist painter. Wants to learn combination of cooking with 3D printer, but has reservations about tech and concerned about need to give biometric data which might embarrass her and damage her artist reputation.

In terms of learning the technology, LA would help build tools, figure out her skillset, bring up to speed with the technology. Buying it isn’t a problem, doesn’t have to steal it from her grandchildren. The company that she wants to take the course with, she doesn’t want to share her information, she should be able to take a course direct with Remy (the educator) so investigating idea to hire an identity to take the course for her. Like World Of Warcraft players hire people from other countries to build up their character, now. LA Blocker (TM). In a way it’s cheating? It’s like she’s wearing a mask, Remy wouldn’t see Marie, but Bob. There’s an arms race with that. She’s willing to pay extra for premium service, why is this not possible? I’ll pay extra to have full attention. Is it just a financial problem? No, she has power as the consumer, can just find another course. Maybe it’s a market solution, maybe some regulation about what has to be public, or consent.

E.g. schools with base level analytics, or higher-quality. Should there be taxation on the higher to fund the lower? A degree of elitism here, should we have redistribution. You pay for extra services, get an advantage, that propagates across generations. If it’s commercial enterprises that charge a premium, they pay taxes and VAT on their profits, not a problem. If universities, it’s budget allocation. That’s their own freedom of choice. I don’t think you should ‘punish’ universities on the choices they make freely. This is a course driven by personal interest, she doesn’t need to be certified, there’s a different element of policy here. Also policy element, does she understand what the company will do with her data. It should be possible to be anonymous.

Educator profile 4: Remy Depardieu

Well established teacher. Had problems with incident exploding on YouTube. Problem was failure to pick up the impact of institutional change, which should have been picked up. Maybe you need a computer to say he drops off when his learners are more diverse [but maybe you should have noticed that anyway]. Maybe worth keeping an eye out matching the time of organisational change in the analytics.

Our group focused on the internationalisation of education. Already happening. Coursera, MOOC platforms, brings the problem of cultural context. This is a problem that you can try to tackle. Two approaches, one is to avoid it at all, you just generalise, take off all the edges, one size fits all. Interesting for cuisine, already not one size fits all. The other is to take it in to account. I say something, I don’t know how you hear it. LA system could help the teacher with. Trying to bring over this message, take care, there are cultural dimensions in what you’re saying.

Metaphor for standards. It has to do with every learning module you are trying to make. Here everything is in the public domain, you can see a love-hate model. Is that happening now? Here it was an incident, triggered debate. Can the system inform Remy he has a student with a creative response.

We have standardised computerised tests for your driving licence. Because someone ticks the box, yes I will stick to the speed limits, is it real? I know I have to tick that there. I can drive whatever I want. There’s a legal issue there. You consent to obeying the laws where you pass your tests. In this case, creative side of things, it’s different. This student is being provocative. It’s a course on French cuisine. We’re assuming this is leisure-based course, but should be a way for a teacher to respond to this, support cross-cultural work. Company loses reputation, maybe a refund.

Can analytics be so smart? Not even that would be enough. You have to develop algorithms that smart. Behaviour and how it relates – is it like persistence? Keeping on a solution that won’t help them with their goals. That’s different from this one example. It depends on the learning outcome, is it specific, or more broad. If it’s just cooking, Sprite’s fine, if it’s specifically this sort of Coq au Vin, it isn’t.

Foresight exercise 2

Small group:  Learner profile 7 – Giovanni Zanardi; Educator profile 7 – Laura Botticelli.

We’re back to zero with pen and paper, below zero, huge pushback on use of technology in schools. We have to start with policies. My education friends say the teacher is doing analytics anyway, just not with the sophisticated data. Teachers have the rich data in their heads, they know the background of their students, their parents, their interests. There is some analytics in there. Like for K12, not later? Most of these cases, the decisions about using the tools, it’s based on individual events with high impact, and beliefs about whether they are good, not on data about the efficacy. Focus on true knowledge! How can students take PISA (?) tests? I guess they can’t. We randomise, get data out of everyone.

Looking at educator. She doesn’t want to [use LA], she likes chalk and talk. To what extent are they allowed to do what they want? There’s a national curriculum. But the ways of working may not be standardised, may just be learning goals, but the teacher can decide how those goals are achieved. I expect they would say about digital skills … but not here.

In the learner profile, he must be doing things in his free time. Even nowadays we have a backup, can restore to the previous days. This is a political decision. We need policy. Strong EC decision here, to anchor curriculums in core European competencies, including digital ones, in every subject. This could be against competition laws. From research perspective, a great natural experiment on learning tools. But you can’t collect data on the schools? Do it with pen and paper. But then it’s a different outcome. Why are we talking curriculum? The personalised learning, they’ve gone back from that. Like Poland rolling back the reforms. Reduce number of years, reduce the number of dropouts. Any evidence of improved PISA scores, because they don’t drop out? 15yos take it. Then 16yo are now out. Suggestion to impose a common European curriculum on the member states.

If we can show that students are unemployable, can use that to follow up. Wait until they graduate, see if they get jobs. Doesn’t say the workplace has no technology. Wait until the companies start complaining. That may take a while. They are developing replacements. Like Steiner, Montessori learning. Adapt to corporate surroundings with technology. Maybe they can be the innovative, out of the box thinkers. Not enough energy to power our devices. Risking the education of a generation based on a thought that there won’t be a replacement for power? Too high a risk! Do you ask for an operation without anaesthetics? No! Why leave our students without a chance.

[Diversion discussion about Brexit!]

There’s an institute campaigning against ICT. More serious is he’s at risk of exclusion because of his device. Thinking they may be sent out of school just for wearing a device, it’s kind of a discrimination. Many schools have a practice of collecting cellphones; they can’t do it if it’s implanted, you have to send them home. We already don’t let people go to school with a scarf. It’s discrimination.

What about analytics? We don’t have data! Have EU try to convince leaders of Italy that their policy is wrong. Perhaps more subtly. Track student mobility, in tertiary education. Immigration, tracking employability of Italian graduates across Europe. Maybe people will move, people with kids move away to get a better education, a brain drain. It’s already the case in southern Italy. People move north, to the cities.

What do we do about this institute? They might have a point. [laughter]

Plenary feedback

Learner profile 5: Åsa Anderson, Educator profile 5: Kristofer Palm

8yo being ranked swiftly. Many things wrong with this. Not too little use of LA, but too much. Too much measuring, leads to stress on children. Very behaviourist, quick responses, very granular. Maybe broad feedback is Ok. Comparison issue, normative measures are problematical. Notion of, what’s it measuring – what you can measure, rather than measuring the right thing. In this case, people selling a product, wizardry, needing to make sure that user is bringing a sense of pedagogy, what the child needs. She’s cheating!

Is help from a friend cheating? Getting questions about using social media in class – that’s called living now! Real challenge if your organisation relies on IP, if they are chatting on Twitter. What do you recruit employees for? Recruiting me and everyone who follows and responds to me online?

This scenario, white gleaming classroom with chrome. It’s antiseptic. Bleached white wood too. It’s cool and interactive? Question about diversity again. More efficient sausage factory, make more sausages that are the same. How do you maintain creativity if very specific goals. This is what marketing companies are selling to us, you’re skeptical. We’re not skeptical, we’re antagonistic – we could do this, we don’t want to. IBM are doing this already.

The teacher is wondering if they’re doing what they’re trained for. Most would like to have better control over classroom, especially new teachers. Some type of support. Not sure they would like to have the motion-tracking support of knowing this child is bored or not, on or off tasks. Sometimes good to let them off task a bit, important part of being a teacher. That freedom is a bit lacking. As a teacher, I can see you talking to me. This is an erosion of trust, your role is to trust your colleague. The student is gaming the system, but maybe that’s Ok. Inside education you can do that more than outside. Maybe she’s not really cheating.

Learner profile 6 – Natalia Eglitis, Employer profile 6 – Lina Xin

Natalia wants to go abroad, does she have the chance. Is this LA, or just personal choice? Do we have to have policy about this, or is it just what you do? If you share it with FB, that’s your choice. If you share it, employees know it, it’s also a disadvantage.

You have this on LinkedIn with voting. But maybe in the future you will want to see evidence. Here she might be good theoretically in building bridges, but maybe we should see the data. More and more our students will have to prove they are capable citizens. Stanford doing online student transcripts with linked evidence, that’s quite fun.

It’s a matter of personal choice to apply, doubt we have a role as teachers. Why did she not just go to Portugal?

Why would you want them to speak Chinese if they have an implant/device that can do it for you? It’s not just about speaking. They don’t have to learn the language, it’s not so important, there’s many things you can do beside talk.

Like influx in Sweden of refugees, big market for language skills for them. If not sure how many are asylum seekers, how many will have to be moved back.

Learner profile 7 – Giovanni Zanardi, Educator profile 7 – Laura Botticelli

In this world, there’s retraction from use of technology in schools. Two incidents. Legislation response.

Our discussion said, Ok, can’t really understand how they retract so fast, so believe this is a political movement to something else. The reports from the institute is leading us to believe there is a political thing behind this with subsidiary agents working against use of technology in school system. Reflected in frustration of the learner, he would like to use tech but not allowed. Device implanted in his hand, he may be expelled. Raises interesting questions on discrimination, exclusion of students. One thing is to collect cellphones from students. Another thing is to send them home based on what they are wearing. Link to wearing hijabs and the debate there. It is definitely controversial. For the educator, brilliant profile, lot of sentiments teachers have today. She is relieved not to use tech in her instruction any more. Some problems, Brexit victims, trying to reach them, other places there are programs that are better to catch up with problems like this. She’s retracting to comfortable position of chalk and talk curriculum.

How to crack this one? All educators are analysing their pupils, very richly, dense data from human interaction. We're lacking overview. Thinking supranational, recommendations from European level, tracking how the Italian education system is performing now. Are the outcomes from the Italian system, students graduated, still attractive at the European employment level? Of course, if one goes that far, you could impose a central European curriculum, common curriculum, including technology use.

It’s a provocative approach. Say in 2027 PISA says no difference? There won’t be PISA, they can’t do that. Say they make an exception?

The teacher and parent perspective is great. We can’t believe that technology is extracted everywhere, only from the education system. This could be reality in some places in 2025. In eastern Hungary, don’t even have windows, that would be an improvement. The context will be different. I’d guess 80% of teachers would love it. If they were used to it, they would not love it.

Informal use of it? Could Brexit parents let them use tech at home? Is all of Italy offline? No, only schools. The teacher could go to them and say, maybe you should try to learn Italian at home.

[tea break]

Learner profile 8: Elena Fechter, Educator profile 8: Ernst Bild

Conversation about trust and trusted third party, role for Commission to look at those. Philosophical conversation: many questions.

Issue: we’re too much talking about elites, an elite picture of our learners; parents want to access data of my 25yo just passing final exams (!). I think most people are so smart, already elite. I’m one of the average guys. I do not know any 15yo boy or girl interested in whether they’re in the top 3% of their peers. That is not a reality for me in my environment. The arguments I see appear to be too much driven from an elite, societal perspective. Conversation, you must be above the average – that’s the way of thinking – half of them must be below the average. The median. (!) How to translate our love of big data, we have them, partially – what can we learn from them? I’m pessimistic about getting from only this approach – focusing on the elite, coming from number 3 to number 2 or 1, on the one hand, and from this charming big data approach – to a very realistic learner-centred approach, in real-life settings. All of those scenarios, they scare me a bit. It’s from a bad science fiction movie.
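The aside about averages (half of a group must be below the median, but not necessarily the mean) can be made concrete with a short sketch; the scores below are invented for illustration:

```python
from statistics import mean, median

# Hypothetical exam scores, skewed by one high outlier.
scores = [50, 55, 60, 62, 65, 70, 95]

avg = mean(scores)   # ~65.29: the outlier pulls the mean upwards
mid = median(scores)  # 62: by definition, half the scores sit at or below

below_mean = sum(s < avg for s in scores)    # 5 of 7 below the mean
below_median = sum(s < mid for s in scores)  # 3 of 7 strictly below the median

print(avg, mid, below_mean, below_median)
```

With a skewed distribution, well over half the group can sit below the mean, which is why "half must be below the average" is only guaranteed for the median.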

Why’s it bad? The more data we have, the more precious people feel about their data. Elena wants to own it?

It’s a dystopia – it’s not meeting either of their needs. People obscuring their own presence, that goes on today. People are not always going to be forthcoming with information if they don’t trust the environment. That’s not just a future problem. Not having the information might be realistic.

We’re talking about learning analytics, analysing data. That is only one part. We must be very cautious about what we personally think we can do with these analyses.

Should EU prevent students providing data?

There should be a minimal subset of data, to bring up the quality of education.

Learner profile 9 – Juan Hernandez-Santiago; Manager profile 9 – Marianne Salome-Hernandez

Spain established an awesome system. Open access database, it’s visible for everyone. The manager of the university has a great one. Silicon Valley model of outsourcing, online coaches from abroad, feedback.

The issue was, even though they have a great system, you can’t add something from outside the system. Even though an open system, it’s closed to the realities of Juan’s opportunities. CVs no longer exist. We don’t know how to solve his big issue.

For Marianne, she should look at more quality. Question is how trustworthy are her measures, approval scores. Bit of a lack of trust in this data. Maybe needs to see how to get more quality and trust into the system. Make sure that her selling point is to make the brand of the university, not just cheap.

LA has a big flaw here, hasn’t predicted there won’t be Alonso in 10 y.

The right hand side, Oviedo is not so good a university right now. If the data said so? [unclear] Satisfaction is high, so the throughput is good, high quality. If you’re selecting a certain group of students, perhaps they’re paying a lot, of course, they are very happy. If this system works, the data is reliable. If that many are satisfied, you’re doing something wrong. You want people to make mistakes to learn. Some people will not be satisfied, that’s a byproduct of learning. It’s not how many mistakes you made, it’s that you have a degree, which lets you be well paid. But this isn’t saying that.

Maybe thanks to LA and going global, many people in Chile and Peru don’t know the reputation of Oviedo. It’ll never happen. Why not? JISC national infrastructure, self-declared stuff from students to bring their own data. Bring diversity, do analytics over that diversity, will solve those problems.

Validation of foresight exercise

Each participant has three stars: if you like a scenario, add a star at the top; if you dislike it, add star at the bottom.

[Lots of star sticking.]

Plenary: Potential for European policy related to learning analytics

Imagine you meet the minister for education at the airport, have one minute to convince them what action should happen now, and wish list.

The minister will say, don’t ask me for money. You can ask anything, but no money.

Teacher union perspective about the dynamic in the community. You need a reference model.

In this example. Ministry of education, I met our minister in the summer last year. At a conference of teachers. E-education summer summit. I put on the wish list, we should shift teacher education, still driven by people of my age or older, to a level where media competence, the ideas of learning analytics, and perils, dangers, data mining – all this is treated in a way next generation teachers can use the solutions we have, and will have in a couple of years from now. So they can use them effectively for the benefit of their students. More comprehensive, complete, realistic teacher training. There are already efforts to do that. This doesn’t cost money, there’s teacher education anyway. It must be modernised.

On the now list – clear statement from privacy commissioners about the benefits and controls to protect learners, teachers, society.

Two things. First, orchestration of grants around a reference model, so we don’t have replication. Need to start being more efficient. That’s an immediate thing. [Can you say, you’re wasting money?] Sure you can. They will pay attention. Can say, by assessing projects based on number of LOC, you are ensuring lots of LOC get written. You have projects ending, particularly LACE; if you don’t continue the Evidence Hub immediately, you’ll be inefficient because it’s not visible. It’s important to continue the LACE project.

On evidence, the risk we have is LA takes us down an objective view of education, promotes the objectively measurable things. One is on thinking about C21st skills. Curious, plenty of people writing the scenarios are engaged with this, but they didn’t come through. Gathering evidence for the relationship of C21st skills to LA will be a powerful tool for a realisation of LA that does not favour a particular view of education. It could be done now, it’s on the agenda…

Evaluation. Another challenge, thinking about answering the question, was the LA successful? If we focus on attainment, we miss the relationship between LA and process. Framing the evidence about success around process, not just headline results that mask benefits and disbenefits. If we say do PISA differently, focus on what students are doing in the classroom, without extra cost, that’s a heck of an exercise. But teaching only happens where the process is: in the classroom. The final destination doesn’t help the teacher to act. Have to think about action.

Cost benefit. If I’m investing 100k in a solution, at school level, what are the consequences of that expenditure. We could have a mandate, 2% of ICT budget should be spent on LA. That could be a recommendation – do we want to do that? I don’t know it’s mature enough to say that should be an aim.

A Kickstarter type thing for LA companies. It’s almost asking for money. European approach, different stakeholders, companies, SMEs that want to build on LA to create opportunities for innovation. Biggest issue is what we do when the budget ends. SMEs are supposed to take over, it’s not so easy because of the way the funding works. Not an MIT Media Lab model, but where private/public opportunity beyond the grants. It doesn’t have to be connected to research. It can be a detriment to connect to people who think too deeply about things. Partly innovation is about disconnection, self-belief. It’s a different type of grant, bit like Innovation in Action in H2020, but not as formal. Crowdsourced, more market need would be good. Not sure how to formulate it. So e.g. crowdfunding, EU tops it up, then gets something back if it works. There are existing loan schemes like that. It doesn’t need to be crowdfunding. Take H2020, small innovative companies, plan on 1 page. Make application approach like that. Not 100 page document, why not 1-pager, 3 milestones, 10 letters of recommendation. Would save process costs.

More comprehensive approach, more scientifically based, identify the success cases. Bring this to the reference model. Solid foundations to reference models. Need to identify which are the success cases, taking into account cultural differences, curriculum. Maybe not just one methodology. Perhaps we should have stronger international coupling. Easy to deploy in your institution. Could be the LACE Evidence Hub, or more general? Methodologies is a key aspect.

One problem, we discuss LA nicely here. What’s going on outside? People know very little about this area. What ministries, politicians can do, is spread the word. Ministries can try to delegate this task to local communities, in schools you should start discussing it. In 2025, these people will be living in that world. If we do not start it now, it’s already too late. An ambassador? It could be. Part of the national curriculum somehow. Include in the teacher’s curriculum. Raising awareness, organise conferences. The field is not major enough. Collecting good practices is still difficult.

Technology standards and architecture. Implicit in orchestration of grants. Several years since we lost the workshop, no standardisation track in Europe that’s open access. Problem was that it was a route to take outcomes into pseudo-standardisation, not drawing from practice. But now are creating profiles for standards, JISC work in UK, xAPI in NL. Put them into practice for analytics, what will work from capture, data science, to users. No mechanism to create those common profile standards so people can use them across the market. Open access forum to allow standards to be created from practice.
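To make the profiles point concrete, here is a minimal xAPI-style activity statement sketched in Python. The actor/verb/object shape comes from the xAPI specification; the learner name, email, and course URI are invented for illustration and are not part of any agreed profile:

```python
import json

# A minimal xAPI-style statement: actor, verb, object.
# The verb URI is from the ADL verb registry; the actor and
# activity identifiers below are made-up examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/courses/dutch-101",
        "definition": {"name": {"en-US": "Dutch for beginners"}},
    },
}

print(json.dumps(statement, indent=2))
```

A profile, in this sense, pins down which verbs and activity definitions a community agrees to use, so that statements from different tools can be analysed together.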

Technology, interoperability is important. But is that a solution? Heterogeneous teams. Practice. What happens if you put IT people in the school, they make UML diagram of how they work. Innovative pedagogy. Heterogeneous teams. Maybe we could suggest, in the NL, teacher has EUR3000 for education. Take that money, put it in to a challenge or focus, put technicians in to schools to work out, we need technology, but we need pedagogy driving that. This combined approach needs to be addressed.

We are overstepping territory. Creating the solution, also creating the problem. Nobody else is. I would ask the minister, what is on your agenda, your long-term agenda? What do you want to change? Then we know their priorities, we can create that, use it to create solutions.

They will say – I want to be top one in PISA. Any solution for that I will take.

Gender bias, youth mobility, inclusion, early school leaving – that can all be supported. Follow-on question: how will LA help solve those problems.

Data privacy, ethical standards. Openness. Transparency. LA is part of a hierarchy of data-driven decision-making.

I don’t love data. We need to be smarter. Not tomorrow, in 2y, but in 10y. Need to find smarter solutions than having statistics, like Google Flu Trends. Tried to estimate flu cases from Google searches. That is a technical problem but has little to do with human learning. It’s so very different. Assignment to classes. There’s a big big danger that if we have an imperfect set of data gathered through whichever channels, we are biasing people to make decisions based on imperfect data. A big danger I see. We need big data for that, ways to treat it, smarter than Google Flu Trends.
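The Google Flu Trends worry here is about unrepresentative data channels. A tiny sketch, with numbers invented purely for illustration, shows how an estimate drawn from a biased capture channel can badly mislead:

```python
# Invented numbers for illustration only.
# True population: 1000 students, 20% genuinely at risk.
population = [1] * 200 + [0] * 800

# Suppose the analytics only see students who log in to the LMS,
# and at-risk students log in far less often: the captured sample
# contains only 20 of the 200 at-risk students.
captured = [1] * 20 + [0] * 600

true_rate = sum(population) / len(population)   # 0.20
estimated_rate = sum(captured) / len(captured)  # ~0.03

print(true_rate, round(estimated_rate, 3))
```

The analytics here would report roughly 3% at risk against a true 20%, because the channel systematically under-samples the very group the decision is about.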


Top three actions – innovative pedagogy, evidence hub, and ethics.

Top three wish list: Teacher education, decide which problems we are solving – third is data privacy, ethical standards, open and transparent.

Next steps

Will send out a draft report for comments.

One further thing. To see how LA can support systemic change. Education ministers want change; the big trouble is inertia. Bringing change is difficult, many stakeholders. The question could be, how can LA and tech that provides evidence support systemic change? If you do research, research on that. Then we need money. But that’s how you justify asking for money!

Thanks to all, safe journeys home.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

LAEP2016: Weds am

Dr Doug Clow's blog - Wed, 16/03/2016 - 11:15

Liveblog notes from Wednesday morning, 16 March 2016, at the two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam, organised by the LAEP project and the LACE project.

Summary of yesterday

Garron and Jenna summarise. Garron – one point was the need for presence of teachers in the field, how to connect to their problems. Another thing was around data privacy and considerations to take in further efforts.

Jenna – a lot of talk about the human element in learning analytics, to serve processes, about more than just numbers. What do we mean by education? How does LA add to that? Student data goes beyond simple activity data; rich datasets needed. Debate – if all you have is activity data, can you still gain valuable insight? A question about what is it that we want to know, and what data do we need. What we did not talk about is also important. A lot of the focus on tools, practices, teachers. Big demographic we’re missing: the students, the people LA is supposed to serve. Big question for today, how is LA servicing them, how will they gain. Secondly, much focus on HE, a lot of us come from that so perhaps natural. It also focuses on [other areas], we should keep that variety in mind. Finally, activity in the afternoon brought good, broad suggestions, but work today on explicit policy suggestions. How do we get from ideas to policy?

Konstantin Scheller

From the Commission, DG Education and Culture, funding this project. Some context about why you’re here, and what will happen with the outcomes. In 2013, EC had a policy communication called Opening Up Education, first to mention LA. Most member states didn’t have it on their agendas. Now, some countries do. It was already a field; we noticed at that time there was little understanding of how this could be in practice, no attempts to get to policy. This research task exists to provide input to policymakers. It will be a document, which will come to member states – e.g. Estonia, Denmark – what you say will reach the ministries, the Ministers, the message will be there, and go on to Commission policy. How can we bring that vision to reality, what are the policy issues we should address, what are the big questions? Should we not get involved at all? Perhaps cause more harm than good if no deep understanding. You’re not just talking to researchers, you’re talking indirectly to education systems about what they should be doing. High-level messages. What should policymakers have in their minds? Usually they don’t know so much. We are all not experts, you are the experts. Important for us to understand what concerns you, what might be coming in a few years. All the issues that you see, that need to be addressed at a high level – privacy, standards. Thanks for helping us to make good policy.

Visions of the Future

Doug Clow (me) gave a presentation on the LACE project’s Visions of the Future study.

LAEP Visions of the Future of Learning Analytics from Doug Clow

Planning Exercise

What are your current concerns over the next 3 years in moving LA forwards in your own practice, in your own environment?

[small group work]

Simple solutions? What if we focused on simple things, like e.g. happiness, people are willing to share things like emoticons on social media. Walking before we can run.

How do you convince people that LA is a good thing? Teachers, students, policymakers, others? Young kids and parents – they’re very involved at that stage. Anything with connotations that don’t go down well. But being on top of what the child is doing, keeping a close eye on their progress, parents are very supportive of that. Subtle differences making a big one, between being cared about, and being surveilled.

Procrastination can be an important thing! Need to be allowed to play your procrastination card today. Don’t want to be reminded, just to gorge on Netflix. We could do stuff technologically, but is that going to put people off? A friend had a heavyweight spreadsheet about everything they do – right down to e.g. sex?!

With LA, we can have knowledge like e.g. you’re not going to do well in this class. That’s already sending a signal not to make the effort. It should be a prompt to get more help. We have to find a way to use the analytics in a way that’s more human, more responsive to the needs of the person. Two things: not going too far, but also about the tone. User experience stuff is important. As researchers, we’re interested in the detailed nuances, but that’s only interesting to other researchers. So e.g. standing tables, people who are tall use them differently to people who are short – uninteresting to people apart from furniture designers.

Must be some parents who are more controlling than others? Absolutely. End user experience, for the child, or the parent. A 3 or 4 yo has limited ability to agree to how their data is used. Have to bring in another stakeholder.
Supporting good pedagogy, formative assessment, what teachers have always done, but enhancing that, and it’s not too onerous.

Near future?

A: Last 10 y, schools required to have a learning platform. Now fragmented school system, change hard in England. Leaders in the school system influence, self-improving school system. My school is a national support school, so a chance to roll out what we do to other schools. Ofsted still there. Networks of schools, no local government. Now headteachers who are system leaders, messy, slower to effect change.

B: Creating systems that support small group work around design engineering tasks. To have a system that can support interest-driven work by students. So the role of teacher is more of a coach, students have more of an opportunity to reflect on how the group work is going. Ask why did our project not work, create documentation on the fly to create their reports, learning portfolios. Lots of those tools missing. Create tools to support students, give them the chance to reflect. We’re known for thoughtful interaction design, to teach people to be more reflective practitioners.

C: From a policy perspective, there will be increased demand for services like learning analytics; educational systems are under pressure to cater for the refugee crisis, shift of demand in skills in the labour market, shifting funding regimes. Solutions lie in scalable systems, relieve the teachers from some of their more tedious duties. Efficiency perspective on this is important from policy perspectives. Those aspects will be very much sought after in years to come. Funding will be less explorative, less quality improvement of learning, more on efficiency and easing the burden on the systems as a whole. It’s kind of a bleak message. Looking for leap from structured subjects (STEM) to e.g. language or history teaching? There’s some.

D: Assessment processes – formative assessment. MCQs is easy. But more qualitative, essay writing, we can do more to support humans.
It could be more efficient if supervisors get help from text analysis. It should be reliable.

E: Interaction could be a thing – face readers to train communication, it’s already used, e.g. for security guards. Also in sales. A program called Communicate, learning interaction, moving to face readers. One of the modules is on negotiation. How to be a better person!

Plenary feedback

(a) There are question marks. Europe has shown that privacy is important. Definition of personal data will be larger. New category of data, anonymous and also pseudonymous data. Third thing, role of the processor, may change in 3y, controller has more obligations. Helping parents, users understand who is responsible for what. A bad scenario would be organisations that are privacy-sensitive are cautious, meaning slow; organisations that are not will be first on the market, will be easy and compelling to the end user, so worst scenario with companies offering software that is not really privacy-compliant. We already have this worst-case scenario. We have this in the US, adaptive smart courses, but there’s no data protection. That is why they’re successful, they don’t care about [data protection]. We must be careful not to have regulations that force us to fall too far behind. Putting ourselves at a competitive disadvantage … but maybe we should? Playing out at the moment with LMS features?

(b) Three themes. Bringing people on board. Going to teachers, students and staff in institutions. Issue of partners working with teachers, employers, trade unions. Second, issues around legislation. New rules, guidelines from EU perspective. And navigating existing legislation – is what we do compliant? Third, resource creation and developing and exploiting opportunities. Building new ones, new settings, or developing what we already have, getting benefit for individual users. Quality!

(c) We also had three. First, institutional support. Teachers are experimenting with small-scale projects.
To scale up, need institutional support to scale up and experiment. More time for teachers to spend on these.

Second point, more evidence. If we have the opportunity to experiment, get good practices. We should see whether it works. Will help to convince people. Should be an explicit part of grants – include an evaluation phase, not just design of a tool, where you evaluate and share results with the community. Maybe should also include one or two people who spend more than 10% of their time, so a less scattered project team. Critique of proposals: for smaller startup experiments, you need lots of paperwork and time passes before you start. Once everything is settled, a few years to start. If you could start from day 1 you would have results more quickly. Maybe something in the bureaucracy. SoLAR had an Open Learning Analytics framework. Need this framework, monitor how you cover that, and that’s missing. Lots of universities are building dashboards. If we don’t have a framework … we can’t distribute the grants. In H2020, require that things will be open, no intermediate step. If you go for commercial, you have to have a business plan right away. Academic projects often just build it. But that’s changed in H2020, looking for product, not algorithms. Maybe an EU framework? Is quality the same as empirical evidence? It’s about assessing quality in general.

(d) 3y is one generation of students, and 0.1 of a generation of teachers. We have to socialise the idea with [all stakeholders], currently building silos to protect themselves. “My data” could be a problem. What are the sentiments coming in? Scary developments. Frank Pasquale’s book on the Black Box Society. Nobody knows what these are doing. Demands for safe spaces – students must not be frightened, shocked, scared. Will LA have to deal with that? If that happens, we have to distinguish ourselves from that. We are not Fb manipulating your emotions, we’re not Google exploiting your data. We’re not an unaccountable algorithm, we have humans.
We need a different story from the big data. The wave of sentiment created by them will sweep us away.

I disagree. We’ve been doing large-scale implementation of predictive analytics. Have interviewed 10 tutors over time; they’re saying, I don’t trust the data. Later say, yes, the 10 were at risk, they dropped out. Then later say, yeah, 10 new people identified. There has to be some human interpretation. We’re losing students because people are not acting on the data, even though we know it’s accurate?

This is a critical point. One thing is data, stats, this student is at risk. What can I do? What is realistic? Is there any guidance about what I could do within the resources I do have? What are my chances to reach this person? These are the critical questions for the next year. Superficial information, from red flag to this is what you can do. My reading of the policy to students is that what they do with a red flag is go to your tutor who knows e.g. you have just moved house. At the moment the OU is explicit that there is human mediation.

The purpose of the red flag is not the concern of the student, but ROI. Natural churn rate with adult education. Is this the right use of LA, to ID someone at risk, because as a teacher we’re concerned? Or is it the OU interested in maintaining higher profit? Not trying to defend the OU! We know there’s a 2-week time window to flag students at risk; they eventually drop out. Thousands of students. Even if 10k at risk, don’t have enough to call on them. We are already at a stage to use LA, but how do we implement it without the structures? DeepMind is looking for a job now! Institutional interest in higher retention. Also a strong issue for the student. If they’re leaving on a mature, active decision. If they fail and didn’t realise they were in trouble, that’s different. So LA could help them make adult decisions about how they are working. It’s an example of how we have the data, don’t know how to apply it.
Should be, we know what to apply, then what data do we need.

(e) Finding resources, bringing people on board. Many of us are developing tools, trying to implement them. At some point, our project ends, what then? I’m already involved; our digital portfolio product, what do we want to do when the project is done? Did I do a lot of effort to develop tools, and then …? Another issue is related to emotional or social aspects. We have emphasis on WBL, you have to be confident in the workplace also. There’s a gap between little projects, then bigger. Role for Commission in talent-spotting after project finished? At the end, you have to do a sustainability plan; would like to write it will be used in our teaching system. But we only have it in our system because the project is there. Other side of the coin, problem of everything is either open or in a business plan. Was supposed to be a solution. It can be open-sourced, but the system still has to be adapted, fine-tuned to change our program. In European research funding, it’s balancing on an edge. Project duration 3y, 4y. First develop prototype, go into schools. Doing some research, but also real life of the school. If you want to engage with them, to use what you have, you need a professional product, and it costs to realise that product. Have to support that each day. Make sure they don’t delete their databases. Easy to use configuration tool. It’s a difficult balance with real life. Then the project is over, and we don’t have resources to support them like a company does. Also SMEs, they want to have profit, they do not want to give a licence for free. Have a strong collaboration, but what happens next? Example in the WATCHME project, we’re working through it. We have concerns how this will end up. We have a strong sustainability plan, data market analysis, but who will take responsibility for that? Reference model idea. Do a reference implementation based on the best of those, give a basis for companies to compete.
(f) Motivational and policy effects at many different levels. IT department having control over the data; institutional policy not allowing faculties to have access to complete student data. Institutional policies on continuous assessment as a motivational factor to drive other experimental analytics. Government and agency policy considerations, e.g. how student funding is affected by levels of completion, examples from Norway about the effect of introduction of continuous assessment. Use of different learning models, PBL, systematic change of institutions. Commercial vendors are already there with quality assurance, it’s a real pressing issue. DataShop example, anonymised data. Best practice guidelines, evidence, should be packaged to the level where decisions are made.

(g) Our own settings, but general themes: when thinking about tech, future, important to consider the danger of getting carried away and doing too much too soon. Restraint! Not doing too much. Restraint in the level of LA. There are things we can do, but are they going to encourage broad take-up? Perhaps more focus on shallower depth across more institutions, rather than some racing ahead at high levels. Restraint in tone of LA. End user experience of the learner – how they react to automated messages if they’ve been red-flagged. Nuance in learners – in my setting, age of pupils, the parents are also in play. Teachers as well as end users. Marking, how possible it is for technology to mark essays. If you are a student, compare feedback from a machine versus human. From teacher perspective, nice to have machine marking. Underlying message: wait for LA to mature? At a low level, that maturity is already there. Not being distracted by the possibilities – running before you walk. Counter-argument: leadership in UK has to take more LA. If we’re restrained and we’re the experts, not in line with the increase on the throttle. Difficult to summarise different settings.
My setting, I’m a headteacher in a fragmented school system, it’s becoming more fragmented. Key part of the hierarchy is now gone (local government). Not sure the message from the Government in England is clear. The Australians, NZ, have policy downwards. We may say restraint, that’s a competitive disadvantage compared to others. I see a system, looking across institutions, shallow, but everyone’s involved; an alternative where some institutions are at a very high level, very sophisticated, but very patchy.

Interlude

So in 2025 … that’s not so far away, but in technology terms, it could be.

Case studies

We have nine case studies. Provocative, really weird. We want you to think out of the box. Hope for a heated discussion. One row on scenario 1 – one focus on role of the learner, one on teacher. Another row on scenario 2: learner, learner, teacher. Until lunch.

[This group scenario 2: Tayla Özdemir, Jan Zoetemelk.]

This is a very weird scenario. A bit dystopian. Think it’s supposed to be. Very big brother watching you. There’s potential for LA to help Tayla develop her Dutch skills. Over the years, problem here is borders between countries, qualified in one country and moving to another. We want LA to be international: free movement of people and skills, can be used wherever they are. Movement of skills and data? Would need legislation as well. The data has gone in this scenario. But if it’d been held centrally in one country … except here they’re outside the EU. Could create a training program for her; assuming willing to pay off the chip over time, could work with a social worker. LA could evaluate if she has the skills to go forward. In Germany, they have community translators, you can work as e.g. a certified medical translator. A program like that she could apply for, become a trainee to be recertified. This happens today! Recertification is a C20th idea. This is a case of someone having the skills, not the certification.
In 2025, maybe there’s the previously-wired skills tests, so we can test that to be admitted, do these courses. This is already happening, with a competency framework in Dutch. Language skills: she can’t afford the translator. Self-testing, then recommender systems. Recognition of the value of diplomas. But you can’t run the tests without the language. Wouldn’t it be cheaper to just give her the app? If she works as a trainee, she can pay off the app, do the self-testing. Microfinance for skills assessment and a translator tool. You would still want a live evaluation, field experience. Or peer evaluation. How about increased demand for social workers – have them train in Dutch but also be introduced to the system, for more credits. An intensive language course that also solves the recruitment crisis. Quite targeted language courses – they need special vocabulary. A lot of field work. A local community language is then a strength.
  • Analytics – used for self-testing, recommend resources, field experience with Dutch peers.
  • Main barriers – funding, but if LA can identify likely success, can be more confident of state outlay on her training. If for religious reasons she will refuse implant, will have problems getting a job. Moral or attitude training towards accepting it. [!]
  • Policy – in the training programs.

Plenary feedback [after musical chairs exercise] [the scenarios will be online later …]

Scenario 1: Learner – Jack Wood

We found it overwhelmingly grim. Potentially quite realistic. We went surreal in our answer, being creative, bringing in spying agencies. It almost feels like LA is irrelevant – so many core fundamentals are just wrong. It highlights the inequality: evidence to say this is unfair, this student is falling behind because of it. But a few points – the LA could detect that this student always responds, could maybe adapt the materials. There are other things you could be doing. I thought it was quite relevant. If the secret service have the algorithms and data, that could be outsourced and useful for the whole community. If it’s done anyway, we should embrace it for wider use. The NSA have not saved lives with their data. Policy: you could see people with poor devices and give them better ones. But we should take a step back and think of overall policy. Access to devices – is that as core as access to heat, water etc? It may not be LA, but access to tech becomes almost a right.

Scenario 1: Teacher – Jane Philips

We got really depressed by this scenario. Someone must be below the class average! Teachers could be subjected to this, but we’ll act subversively, do something else that they would like to do. Their roles are reduced here. Not unrealistic – tutors at e.g. University of Phoenix are paid on results. You can measure this stuff. It was too much about LA; with a teacher you should not mention LA, it should support what they’re doing. We should not even bother them with that. If the teacher doesn’t understand why she should be doing that, she will try to game the system. Using marketeer language, there’s a real risk.

Scenario 2: Learner – Tayla Özdemir

Chance for policy to be changed to let people learn more widely.
Option 1: LA could be used to short-circuit the recertification process by putting them in realistic scenarios, comparing data with benchmarks of approved Dutch practitioners, and through that seeing the specific areas needing training. It might take less than training from scratch. We discussed the same route. Concept of stealth assessment: as teachers, we do assessment, but it’s not the goal – we want to train them to be good, then do some assessment. Can see how she adapts to the cases. If she does have her history, she will adapt quickly. If not, it will just take more time. There are highly talented people coming from across the world to Europe, and we’re not using their skills. Language is a way of thinking; they look at concepts in a different way. Not learning Chinese any more will not help us. Like the Babel Fish in The Hitchhiker’s Guide to the Galaxy.

Scenario 2: Employer – Jan Zoetemelk

The employer has no more money, great demand, rising tensions. Could watch behaviour in action. The problem is she has to take the implant [for translation]. What about policy? How different is this from smartphones now? Some have it, some don’t. It’s just not in your head. It is fundamentally different – it’s a physical violation, you don’t need to do it anyway, there are other ways to contact your brain. More about whether we want to regulate this: why don’t we regulate it today? It’s just better quality data. We do need to regulate. If we don’t, society will make its own mind up; people are already turning off their phones in shopping centres because of unregulated tracking. If you regulate by societal trust, you end up with extremes. Now you have free choice to disconnect. It’s difficult to do. But the choice is the same here. It’s not a fair choice. My niece and nephew have to have a device, it has to be an iPad, at age 12. This is already happening. What has all this to do with LA? Analogy of Facebook – some employers ask for access to employees’ Facebook.
Not about prohibiting it, but what can you use that data for? That is: can the employer request that data? That’s an ethical question. For me personally, I care about ethics. I’m thinking about Tayla: she would love to work as a social worker, but she fails on language. The first problem is how to teach her Dutch so she can interact, given the resource context. How can learning technologies support me in this attempt at teaching her? Ethics is an important part of it. We need to see the cultural aspect. Language is also about how people behave, how they think; you approach people in a different way. You would not get that from a language translator. That’s about education, social behaviour. This is not a viable solution for the problem we have right now, with refugees. We need to approach this at a different level. At the end, we’ll discuss the scenarios, and we’ll ask you to vote on which ones are likely, and which ones you would hate.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

LAEP2016: Tues pm

Dr Doug Clow's blog - Tue, 15/03/2016 - 15:56

Liveblog notes from the Tuesday afternoon session at a two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam, organised by the LAEP project and the LACE project.

LAEP Project

Rebecca Ferguson welcomes everyone back.

Doug Clow (me!) gave a short tribute to Erik Duval, a pioneer and leading figure in learning analytics, who sadly died recently.


Rebecca takes over to talk about the background to the LAEP project.

Three research questions: what’s the state of the art, what are the prospects for implementation, and what’s the potential for European policy?

Once we have a sense of what’s happening now, what do we think could happen – what’s feasible? And what’s desirable – what do we want to happen?

Project process: literature review, glossary, inventory of learning analytics, case studies, expert workshop (this event!), then final report.


Glossary

Terms developed by Adam Cooper before he moved to Tribal. [Definitions developed by the team, including Doug Clow!] Used the LAK dataset, frequently-used keywords. Available online, suggestions welcome! Help us improve definitions, and expand it.

Literature Review

Also by Adam. There are many literature reviews, but few focused on implementation. Gone through much literature about implementation; five areas: underpinning technology (interoperability, sharing expertise, avoiding lock-in); policy, codes of practice and governance; skills and literacies; culture, values and professional practice; leadership and change management.

Not much work on what’s good about data warehouses; needs exploring further.

Lots of work on ethics and privacy. Lots of data has been gathered without much knowledge – e.g. the LMS/VLE is gathering data you may not have thought about. How do we tell learners? And what? Data governance. Rights of ownership. Andrew Cormack tweets:

#laepanalytics looking at “ownership” of data. Is that a useful concept? Law talks instead about ownership of rights in information…

— Andrew Cormack (@Janet_LegReg) March 15, 2016


Need for open and shared analytics curriculum. Also research on use of visualisation.

Why are we educating people? To some extent we agree, but to some extent it varies by context. What are we trying to achieve, and how can we link analytics to that so it really works for people? How do we embed LA into professional training, so people are confident and empowered, rather than bored by algorithms and statistics? Help to make informed decisions.


Andrew Brasher takes over. Desk research looking at the current state-of-the-art of the actual implementation of LA across the world.

Broad-but-shallow collection of informative examples across three categories. Policy documents (N=14); Practices (N=13); Tools (N=19). One-page descriptive information about each. This is one view. Please look online and add to our view on the Cloudworks LAEP Inventory space.

Case Studies

Jenna Mittelmeier and Garron Hillaire take over.

Garron starts. We worked on six case studies, collectively. Three cases: Blue Canary, UTS, Kennisnet. Blue Canary, a startup acquired by Blackboard. UTS is committed to being a data-driven university, and has a centre focused on that. Kennisnet, NL school work.

Jenna – we also looked at Apereo, Norway’s work on SLATE, their national centre on LA, and the Open University UK ethics policy specifically on LA. These are in-depth, seeking different goals. There are key themes related to policy from these.

First is a holistic view of stakeholders – e.g. at OU involving students, at UTS engaging with academics and teachers, in Norway on taking LA research in to practice and engaging industry, teachers and students. It’s not just about those who create the policies, but leading with those stakeholders.

Secondly, how national or institutional practices for the evaluation of schools can deter students and teachers from adopting LA. The evaluation practice favoured old or outdated teaching methods. Can lead to fear of adopting new and flashy tools, such as LA.

Garron – different challenges. Data privacy and collaboration across data. There’s need for definitions about how to secure data, and at the same time, collaborating across organisations. This isn’t a checklist for policies to match, but areas of challenge around LA adoption.

Rebecca takes up again. Talk to Jenna and Garron, or Andrew, about those.

European projects (Lightning Presentations)

Anouschka Van Leeuwen, Utrecht University

Teaching analytics – not many focus only on the teacher perspective. For the teacher to provide adaptive feedback, they need to be informed about students and their progress. My thesis, which came out last year, is also on this topic. Investigated in secondary education whether providing tools to teachers changed their behaviour. Found that LA made teachers in some cases intervene more, and in other cases give more specific feedback. Working on a postdoc project investigating functions of teaching analytics: the best division of work between teacher and analytics. Much interpretational freedom, or advice from the system about when to intervene? Comparing different functions of LA and their effect on teacher behaviour. Using our own environment, where learner activity feeds teacher dashboards. Also a blended learning dashboard for a Higher Education context – but currently done by hand, provided weekly, seeing how this can help them iteratively. Looking for software or experience that can help.

Baltasar Fernández Manjón, BEACONING

It’s a new EUR 6m H2020 project, using games and gamification in different domains and settings. Classroom and physical spaces, breaking the barriers. It started in January. Aim to improve LA, running large pilots; need to prove that those interventions are effective. Working with content providers, in a very diverse consortium, including serious games providers, researchers, and others. Extending LA that presupposes a controlled environment to an uncontrolled one. Accessibility – using LA to improve the accessibility of serious games for people with cognitive disabilities. Also geolocalised information, included in LA. The idea is to cover all the aspects of this very diverse environment. Aim to reuse some of the RAGE project’s infrastructure – we are the only partner in both. The idea is to build something that’s easy to deploy.

Hendrik Drachsler, LACE

Who knows about the LACE project? [most people put hands up] Hendrik cheers! It’s a Community Support Action, focused on pressing issues like interoperability. There’s an outstanding report about that on the LACE website, and many deliverables. Working heavily on ethics and privacy; just released a checklist – eight points – to provide a handy tool. It’ll be presented at the LAK conference. Also working with many of you. The sustainability plan is building up a SIG on LA, so LACE becomes Learning Analytics Community Europe; there’s a proposal to SoLAR. European projects can align to this and become board members – a melting pot to bring all the stuff together. Proposal coming out soon; will say more about this at LAK.

Michael Kickmeier-Rust, LEA’s Box

From Graz University of Technology. Our community has suffered a big loss with Erik Duval. He leaves an indelible mark on this community. I’ve learned a lot today about very cool projects. What I have is just a cardboard box. Very simple. Lea is a little character. Our project brings LA into real educational settings, into schools. We find situations where data is sparse. In most schools, in Europe, we don’t have a lot of data. Learning and teaching is a social process, an analogue process. Teachers use learning apps, an LMS, perhaps Moodle. We try to provide a central hub, a platform where teachers can throw things into the cardboard box, and which gives them the best possible analytics about their students’ progress and learning. That’s not an easy task. Also a focus on psychological theories to improve analytics. A good metaphor is SPSS – a powerful statistics tool that can do a lot. What we try – the little girl is the metaphor – is to translate this powerful SPSS into meaningful answers for real-life teachers.

Katerina Riviou, PBL3.0

Started in January, 3y, Erasmus+. LA, Learning Semantics, Problem-Based Learning. Also interested in MOOCs. Want to use PBL with innovative use of LA and LS, maybe in a MOOC context. Five countries, four universities and one company. The final aim is to come out with pilots, best practices and policy recommendations in the context of PBL.

Baltasar Fernández Manjón (again), RAGE

H2020 about serious games. 20 partners, EUR 9m. Developing the industry in Europe. Building assets, modules and services for the games industry, which is small. Developing a new ecosystem, promoting things already happening in the US. We are building a full infrastructure to streamline the process of applying LA to games. Games have been using telemetry data for years, but in a proprietary way, mainly used for increasing revenue. Here, working on providing game trackers that can be used with e.g. Unity3D. Using the Apereo learning record store and xAPI, promoting this into an industry afraid of using standards because they don’t want data to become a commodity. If you have a game, it will be easy to put in a new tracker, so there’s an easy way to deploy it into your services. We’re using Docker, push-button deployment. Running this in large experiments with police in the UK, Randstad, and the University of Hull in the UK. Mainly industry-oriented. Want to streamline the process, but it may be useful for research.
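As a rough illustration of the tracker-to-LRS idea (not RAGE’s actual tracker API – the player, game and activity IDs below are invented, though the verb and activity-type IRIs are standard xAPI vocabulary), a game event expressed as an xAPI statement might look like this:

```python
# Hypothetical sketch: the shape of an xAPI statement a game tracker
# might send to a Learning Record Store (LRS). Actor, game and score
# are made up for illustration.
import json

statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Player",
        "mbox": "mailto:player@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/games/interview-sim/level-3",
        "definition": {
            "name": {"en-US": "Interview simulation, level 3"},
            "type": "http://adlnet.gov/expapi/activities/simulation",
        },
    },
    "result": {
        "score": {"scaled": 0.85},  # 0.0-1.0 per the xAPI spec
        "completion": True,
        "success": True,
    },
}

# A tracker would POST this JSON to the LRS's statements endpoint.
print(json.dumps(statement, indent=2))
```

Because statements share one vocabulary, the same LRS can aggregate events from many games without proprietary formats – which is the interoperability point the pitch makes.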

Noelia Cantero, SHEILA

Supporting HE to incorporate learning analytics. Erasmus+. Led by Dragan Gašević in Edinburgh. So new, there’s no logo yet. Want to impact policy development. Want many case studies. Using the ROMA approach – Rapid Outcome Mapping Approach. Four institutional strategies, one for each: OUNL, Carlos III Madrid, Estonia. Student stakeholders, and also the EU Association of QA agencies (?). Started in January.

Tom Broos, STELA

STELA, Erasmus+. Successful transition from secondary to higher education. KU Leuven, working with TU Delft, TU Graz, Nottingham Trent University, SEFI. Focused on transition, supporting students with LA. First-year students are in a new type of education, evolving from learning dependency to learning autonomy. Frequent high-quality feedback may help them, so we try to support that. A multidisciplinary team, also involving educational sciences, teachers, student counsellors. Our focus is holistic, taking into account a set of courses, all requiring different learning strategies. We have to make something that can adapt to the learning strategy that’s best for each specific course. Our plan is research first, then to have something implemented in all these partner institutions, seeing real benefits.

Jeroen Donkers, WATCHME

From Maastricht University. Long acronym: workplace e-assessment competency-based higher multi-professional education. Third year of the project. Large project, many partners – some technical partners, some educational partners. Our research is in the domain of workplace-based learning: learning to become medical specialists, veterinary specialists, or teachers. Two partners in each domain. It’s important to get insight into how you are working and learning. The way is to use workplace-based assessment. Your teachers are your peers, walking around with you. We use an eportfolio system to collect this information. That’s the evidence and context of learning. On top of this system we add personalised student models, used to inform the student and supervisor. It’s MEBN (multi-entity Bayesian networks) – from general knowledge to very personalised models. We use these to produce visualisations and messages that turn up in the portfolio. Will test this in practice this year.

Daniel Spikol, PELARS

Project-based learning analytics for research and support. We talked at lunch about fun. This project is making sense of how students build physical computing projects. Try to analyse the different stages of how they plan, build and reflect. Last year of an FP7 project. 11 partners, SMEs to universities. Also trying to generate results – a first attempt at a data card for students to reflect on their project. Also a visualisation tool for researchers. Multi-modal data from how students move their arms, gaze, how they interact with sentiment buttons. Also reflective tools to plan, document and reflect on their work, which generate a visualisation. The idea is to enable students to learn to make better decisions in small groups. Students don’t reflect on how they got started or how they ended.


Bart Rienties takes over again. We have this list – our inventory – of Policies, Practices and Tools. Sit with your group, and see if you have identified one of these that isn’t on the list. Put it on a green piece of paper if you’re sure it’s the best ever but we’ve not mentioned it; on a yellow one if it’s “OK, I heard about this policy, I’m not sure about it, but it would be potentially interesting”. Green = essential, yellow = maybe.

Two observations:

Marieke Van Der Schaaf: noted a gap – teacher practices are not represented here. Another gap – data management/data security/data protection/ownership.

Jan-Paul Van Staalduinen: work at TU Delft, by a different group to his, on aggregated data from social media – crawling data for learners from Facebook and Twitter. Also work on a programming course on EdX, where they get permission and then crawl the learners’ GitHub profiles, looking for evidence of their programming improving. The learners are happier to give permission to scientists than to the government. General discussion about ethical issues.

Next exercise: reorder the post-its. Then think about which learning analytics practice you as a group want to defend.

Practice pitches

Next exercise: Pitching the practice to different groups in turn.

Pitch 1: Daniel Spikol: Fun!

Fun is important in LA. Fun leads to delight. For learning and for teaching. Teachers and learners should have fun. Empowering. Relaxing. Keeps the human factor in! (AlphaGo is not delighted that it beat Lee Sedol three times, but he is pleased to have beaten it once after that.) Intrinsic motivation. Fun as guide to your skills.

Pitch 2: Mark Brown: Address the demand side

We understand there are tools for LA, and there are people who argue that LA is all about the learner, and they should be the focus. Our argument is that the teacher matters most. At the moment there is an oversupply of tools – too many tools! – and a problem of lack of demand. So to address the demand side, classic for new innovations, we’re proposing to focus on the educators, the teachers. You can define whatever level. We want to develop a toolkit for teachers to help them understand the choices available to them to improve learning through evidence. And a series of accelerator events, funded by the Commission, to support implementation of the toolkit.

That will address the demand side, people understanding more about it, to make more sensible, wise and informed decisions about the tools. And to influence, as stakeholders, the design of those tools in the future.

Q: In England, the system is very fragmented, hard to influence teachers because we have dispersed teacher training system, hard to influence schools. Influence over teachers is hard.

We thought about that! It’s easily resolved for the UK, you won’t be entitled to funding from later in the year because you’ll be out [laughter]

More seriously: schools are tightly bound and assessment-driven, so opportunities for teachers are quite limited now. Policy has to operate at a level that enables this. In the HE environment there’s more opportunity; we might pitch at the HE level.

Q: Also smaller number of institutions. Thousands of schools in England alone.

Ye-es. IBM have a contract for the state of Georgia in the US. It’s very driven by standardised testing. But parents were key stakeholders – what they get, what they want to know. Without interpretation, the data you provide might be meaningless, or pose challenges.

Pitch 3: Jocelyn Manderveld: SURFnet practices!

SURFnet practices! Top down, and bottom up. Happening in the Dutch landscape. Bottom up: lots of experiments, 2012-2014, with funding in place for new ones from 2015 on. At the top, at the management layers, they get the notion that we should do something with this, with the quality. They’re also reluctant – afraid of legal issues, security, consent etc. We are trying to bridge the gap between top down and bottom up. We organise workshops to get a dialogue started between management and researchers: data scientists, teachers, deans of faculties, etc. Is your institute ready to use LA? Get the discussion going. Focus on leadership: without leadership in the institute, it stays at the bottom and doesn’t reach its full potential. As well, we make sure we have a simple, secure infrastructure in place. So if those at the top want to do something but don’t want to invest in the tooling, they can still do some experiments of their own. Then there’s communication between the researchers and the management layer at the top. We are building this up now.

Q: To convince the top level that they should do something, do you have evidence that there’s positive things? From the experiments?

Yes, but also at this meeting, the researchers talk about the data, what’s happening. We have the management saying this is interesting – and they can bring their own evidence. I’ve noticed in the last two years: we know the Purdue list, and things in America. Typically Dutch, or European – yeah yeah, it’s American, but does it work for me? It is better – there are three people here from Utrecht University. Talking to the dean about LA. They were surprised, so we’re bringing them together.

This is the mandate of SURFnet. Eventually we hope – SURFnet is steered by the universities – we hope they will say this is what we need.

Q: And use the infrastructure you provide?

We are developing that now. It’s really there for experiments, not big use. Our mission is also to deliver education services to institutes that the market doesn’t. Security and privacy.

Pitch 4: Dirk Tempelaar: Only start LA if you have rich data!

Our main idea was taken away for the interview! This is our second one. The idea: if you want to start with LA, only start if you have rich data. If you don’t, forget LA. If you base yourself on activity data – what Blackboard or an LMS brings you, click data, time on task – there’s no chance to make any interesting model from that. What you need is more than that. We have some pilots in NL coordinated by SURF combining formative assessment with LA – assessment for learning – which brings very rich data. In my context, we have 1/3 surveys and dispositions (dispositional LA), [1/4] formative assessment, [1/6] clicks etc. Dispositions are most powerful at the start [for predicting learner success], with a bit of the others [such as demographics]. But then formative assessment is the dominating data for predictive models. The message from these studies: if you only have LMS data, don’t do LA.
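The point of the pitch can be sketched on toy data. In this entirely synthetic illustration (invented numbers, not the Maastricht data), a formative-assessment score that tracks underlying ability predicts pass/fail far better than a click count that doesn’t:

```python
# Synthetic sketch of "rich data beats click data": compare the best
# single-threshold predictor built from click counts with one built
# from formative-assessment scores. All numbers are invented.
import random

random.seed(0)

def make_student():
    ability = random.random()                      # latent ability, unobserved
    clicks = random.randint(20, 200)               # activity: unrelated to ability here
    formative = ability * 70 + random.gauss(0, 5)  # formative score: tracks ability
    passed = ability > 0.5
    return clicks, formative, passed

students = [make_student() for _ in range(500)]
clicks = [s[0] for s in students]
formative = [s[1] for s in students]
passed = [s[2] for s in students]

def best_threshold_accuracy(values, labels):
    """Accuracy of the best single-threshold classifier on one feature."""
    best = 0.0
    for t in sorted(set(values)):
        acc = sum((v > t) == y for v, y in zip(values, labels)) / len(labels)
        best = max(best, acc, 1 - acc)  # allow either direction of the rule
    return best

print("clicks-only accuracy: %.2f" % best_threshold_accuracy(clicks, passed))
print("formative accuracy:   %.2f" % best_threshold_accuracy(formative, passed))
```

The clicks-only predictor hovers near chance because activity carries no signal in this toy setup, while the formative score separates the groups well – the same contrast the pitch draws between LMS click data and richer assessment data.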

Q: Do you have evidence, or is that anecdotal evidence from your approach?

Maastricht published this in Computers in Human Behavior, looking at how rich the data is in learning analytics.

Final exercise: One-minute pitch for ideas

All in one large circle, with a person in the middle pitching their idea.

Pitch A: Dirk Tempelaar.

Learning analytics needs rich data. What is it? More than LMS data which has clicks and time on task data. If you only have that, forget about doing any LA. What you need is rich data sources that are complementary to this data. Two examples from my experience. Data from formative assessment, quite rich data. Disadvantage is it comes late in the course. Second source is dispositions data. Then you can predict which will drop out.

Alan Berg: I agree. Have to break the population into subsets based on self-regulation.

I don’t think so. Our context is small group. Bring data to tutors, discuss it with their students. Don’t need to group students, depend on individual action.

Hendrik Drachsler: Challenge! I would say, who is against rich datasets? Sure, everyone wants that. Research shows activity in MOOCs leads to success. There are some effects that are there. You can say a lot already from activity data.

Activity data can be dangerous! In one case, very non-linear effects of activity data: the inactive include both those at the high end, who don’t need to participate, and those at the low end. We’re only interested in dropout – but activity data groups those together. The effects are non-linear, but if you combine sources you can sort that out.

HD: OK, that’s good. We combined MOOC data with an LRS; with only activity data, without assessment and grades, we can do very little indeed. I second it.

Ed Foster: I would back that up. Rich data makes sense. But staff can make use of simple things quickly. Just having activity data is beneficial – that sense of it happening. We have seen correlation between activity and progression and attainment. Good to make things easy.

I will use activity data, but combine it with disposition data. That even predicts the activity data. But once I have the first formative tests, I forget the dispositions and activity data.

Pitch B

We’ve had a lot of discussions about rich data, good analysis. One question: how do we get teachers to use data? To use LA? That’s a critical issue for this whole group. In Denmark at least, some teachers are quite conservative. How do we get the teachers to work with the data? A critical issue for policymakers. We need good data to convince them, but how do we trigger the movement? That’s my issue.

Riina: Easy. You evaluate teachers on LA. That’s the only way they do things. If it’s measured it gets done.

[some disagreement]

Someone1: It’s about starting the dialogue about improving their work from a perspective they think is relevant.

S2: Find the champions. Bill Gates, teaching awards now, world’s best teachers.

Alan Berg: U Mich, have fellowship program for LA. Sponsor specific evangelists.

We have a trigger in Denmark, demanding all municipalities buy an LMS within a year and a half. Bringing data in to the classroom. But still we need training, data, support.

Riina: My comment was funny, but has truth too.

Some municipalities setting up activity goals for these LMSs.

S3: Do teachers have time to do LA?

S1: If it’s a different thing, it takes extra time; not if it’s part of their regular job.

S3: Do we need every teacher to use LA? Is it relevant for every purpose?

Maybe one should see it as an investment in education. If you invest, you might save time later on, you know more about students, what works, what instruction methods work. You might save time in the long term.

S4: It gives you new ways to make decisions, changing your practice.

Riina: It’s a good question, does every teacher need to use LA. Maybe have different roles within a school. Like with ICT, we have some leading teachers, some have support, you have different levels of users. Maybe the same for analytics.

Pitch C

It’s teachers again. We’ve seen long lists of tools. The question is, if you give this list to teachers, what will they do with it? They look at it, ask what they should do. Our idea is to give an evaluation framework to the teacher to help them decide what tool to take for their specific context.

Bart: Can you buy it today?

Yes. Give us 3 weeks.

S5: What is the requirement for teachers?

It’s not necessarily a computer system. It’s about helping them ask the right questions. So say I’m a teacher in a university, I want to do something in assessment, and I have a budget. But also ask what functionalities you need. Helping teachers to set requirements for LA.

S7: Does the teacher have that decision? Not in Spain.

It depends on the teaching context. Sometimes, my budget is zero. Use open source software, or zero cost – that is one requirement.

S8: Isn’t it embedded in learning tools, and they already have trouble using those?

Then the decision is made in the wrong place. The teacher is responsible for teaching.

Ed Foster: Our institution, someone has to make decisions across the institution. There needs to be a basket of stakeholders involved, drawing in the computing angle too.

S9: The problem is an uninspiring list. The danger is to build a technical solution that makes it even more uninspiring. Challenge to find testimonials, user stories – this teacher had this problem, solved it with that tool, had this outcome. Bring it to life.

To list success of recommendations.

S10: If it’s recommended, did it solve your problem – so users can see whether it helped.

Use this data for future recommendations.

Pitch D

The teacher has already been mentioned. For us, it was a big thing, reflecting on what we heard today. It was about data, models, projects, but not about the connection with education. It was there sometimes, but fragile. When starting from the supply side, the IT side of LA, you always have to sell something. How can you get a team of teachers to make the change? It’s also possible to start at the other side, the demand side. Our suggestion is to put in the foreground the teacher and the things he thinks are important. If you have a professional dialogue about that, maybe in the background an LA question crops up.

S10: Sometimes people don’t know what’s good for them, can’t see in the future. We didn’t know we needed Powerpoint, now we know better [laughter] Have to educate them about what they really need.

S11: The data is in the background. The teacher’s needs are in the front, analytics in the back. There’s no technology there, the teacher is in front.

S12: Counter-argument, it’s a very economic point of view, to sell something. I support this idea to say we need to say what are the teacher’s needs. Glad you have a woman teacher, we have 90% of women in the profession.

The association of board members of primary education have on their website a day of men.

Alan Berg: I agree. But the conversation is that the tiger is coming in, how do you control the flow of water so it goes where you want? I’m not sure this is the tactic to control the market trend?

You have a choice, to anticipate market trends, or you can choose not to move immediately but to take space to have those professional dialogues. Something will come up, you will have the tool to facilitate those questions.

AB: Generational difference. Popper’s idea: old scientists die, the new generation brings in new ideas. Probably going to see that with LA. Same as students with their telephones bringing in apps.

Pitch E


Our proposition is simple: buy this policy maker HAL 3000. No! It relates to fun!

We think we have to have a human side to LA. Lots of discussion is more focused on performance metrics, how this affects teachers, learners, policymakers. We need to remember some kind of delight, motivation about education and LA systems. To empower people. It should work on empowerment. The human side – rich data is important. It’s not just the activity data.

S12: Is it feasible to make it fun?

Delight is feasible. Fun is a harder question. If you want teachers to learn to use LA, has to be delightful.

Marieke: Positive experiences, also for the teacher before they move to use more LA.

Riina: What would be the incentive?

Time management.

S13: Can we pitch LA as doing the boring administration stuff that you knew you were going to have to do when you signed up as a teacher – you thought the rest was exciting enough to sign up for. It’s back to the 80s vision that the computer does the boring work, freeing us to do the exciting stuff. I think it’s still convincing.

S14: We go to schools. We thought one of the biggest incentives we could offer was the coolest, newest, most sophisticated rocket-science LA – but the biggest incentive was: make this system let me print all the record cards for all my students with one click of a button, and I’ll use your system. That was the incentive! LA per se might have big, big positive aspects for education. There are simple and trivial side aspects. We must be creative enough to find those and sell them to the teachers. The simple things are the better selling points.

Bart: What about policy? How do we go back to Brussels, how can we make LA fun, and embedding it in policy?

Doug: We must mandate fun!

Empower teachers to do something simple, give them a view to do more complex things.

Bart: Does it need policy then?

If everyone in Denmark is put into SAP … it’s beyond policy for people to want to use it. Policy is one thing, everyday use is another.

Mark: Reconceptualising policy – it’s what you create.

Riina: One tool that policymakers love is the PISA study. Can LA beat the PISA study?

Alan Berg: You can add extra information to the PISA study, LA can do a lot of it.

Hendrik: PISA is a bad example. It’s a policy that came over us – a push for kids to get good results for the honour of the country. LA would make that really bad. We should get away from summative assessment, towards formative assessment. Get away from ranking the kids, ranking the teachers. [laughter]

Peeple’s just been re-launched hasn’t it?

Pitch F: Mark Brown

The uncomfortable truth: there is myth, propaganda about LA. We want to present the person who matters most. It’s the teacher. The engine of innovation in education. They subvert the system to ensure it works! The glue that holds it together. We’re swamped with tools, with the supply side. Look at the demand side. We need to be respectful of teachers. What teacher hasn’t had data to inform their decision-making? Our pitch is a toolbox for teachers and accelerators throughout Europe for only ten million.

Alan: We can do it for nine. I come in for building communities. I did this for Apereo. Instead of thinking of teachers alone, think of all communities together. Stop thinking about subpopulations, but whole population.

Once we’re successful with the toolbox for teachers and accelerators, then we’ll do one for educational leaders. It’s an enabling discourse: LA is not done to you, but something you can shape and inform.

Alan: It’s a multidisciplinary team. Let’s build those. Teaching community, developers, under a wider umbrella. At the atomic level, don’t get the advantage of the sum of the parts.

Yep. You don’t always know what you don’t know. Accept the need to take an ecological view. Our toolbox, one would be identifying the stakeholders.

S15: Every group is talking about the teacher. We don’t want to put this on top of the teacher – but we’re leaving out the student. The teacher puts this on top of the student! It has to move from a teacher focus to the flipped classroom. For younger students, the teacher has the higher responsibility, but at university, the student takes responsibility for it. I cannot choose for myself; I want the system to buddy me.

S16: Can we talk about not a toolbox, but a sandbox. Teachers, students, whoever come together as part of the process, try to play with LA. A new way of education.

S17: When we were developing the JISC Code of Practice, the National Union of Students were an active stakeholder. They produced briefing paper for student unions, saying when your university introduces LA, here’s what’s good, here’s what’s not.

Toolbox is different from a play box. Maybe opportunities to play. How you engage in K12 is harder and more complex than in HE.

Pitch G

We do this every morning in the NL, all the children in a circle. How do we bridge the gap to institutional use? Some bottom up needed as well as top down. We need top leadership, management saying this is important, we want to use data to help students. At the bottom, you have enthusiastic people wanting to do experiments. A lot of work is happening at the bottom, but at the management level they don’t know it’s happening. Let’s start working together from bottom up to top down.

Daniel: This would be a question for policy. To make top and bottom work together. Initiatives from governments, or EU, from other things.

Listening today to all stories, there’s a lot happening in the LA field in Europe, in different institutes. There’s no kind of institutional approach. I haven’t heard one example of institution saying data are important.

Bart: Open University UK does this!

Yes, but I haven’t heard this today. Legal issues, ethical issues. But need policy as an institute to stimulate people to work with this. The experiments are really good, but we have to get this at the top level. Otherwise, we’ll still be debating the same topics, and in 5y we’ll still be there.

Riina: This is the strategy you use at SURF. There is a policy in place in the NL that has given you a mandate.

Yes, true, but we are still looking for universities to put this in their vision for education. We are working on this; it’s interesting for them, [tentative], there are issues to address. We want data to become an important issue, the way assessment is an important issue.

S18: We had this big project, in the beginning, with Masters students, did in-house stakeholder analysis. How many depts have stakes? 22 of them. But, 14 or 15 of them could kill the project at any moment. Without knowing! We have to operate in this environment. If I would address something on organisational level, this would be it.

Pitch H

LA is a trend whether we do something or not. As we use digital tools, the analytics parts are there. All of these are built in, as we get new versions, new products, they are there. They give nice pictures and graphs. But teachers and managers need tools and information to separate the ones that are evidence-based to support their learning, from those that just have pretty pictures. We think providing information about practices that work, tools that help to improve teaching would be helpful.

Riina: How do customers know yours is not a pretty picture?

My solution? I’m doing consultancy to implement digital solutions. I’m afraid of how educated the customers are. They don’t have the tools or knowledge to separate one system from another. That’s frightening.

Marieke: That’s close to the other discussions.

?Barbara: Who’s going to quality assure?

Mark: I had a vision of the London Underground map, trains on different paths. With that map, I have a problem, I’m colourblind. For me it’s not a single pathway, it’s more complex than that. Ecological metaphor, the layers we have to think about.

Adam: I talk about evidence. This is about providing evidence to help. Who’s going to gather evidence? How are we going to get the contextual information? You can sort of move towards this with the case studies of this project, but the nuance needs to be there. So communities of practice [are important]. Allow self-organisation, the emergence of new practice through those communities, giving them space to change in ways not orchestrated, or instrumental, but emerge through that community.

Bart: Final exercise: things you liked on one side, things you didn’t on the other.

[As ever, apologies to people whose name I haven’t managed to capture, or whom I have misquoted. These are very rough notes to capture some of the discussion, and may not be as accurate a transcript as they might appear.]

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

LAEP2016: Tues am

Dr Doug Clow's blog - Tue, 15/03/2016 - 11:17

The LAEP project and the LACE project with JRC-IPTS are jointly running a two-day expert workshop on “The implications and opportunities of learning analytics for European educational policy”, at the Allard Pierson Museum, Amsterdam.

These are liveblog notes from the Tuesday morning session.

Implications and Opportunities of Learning Analytics for European Educational Policy (LAEP)

Riina Vuorikari, JRC-IPTS

Riina gives some introduction and background.

Why is JRC-IPTS organising this? The JRC is the Joint Research Centre of the European Commission, its in-house research centre; there are 7 institutes. The Institute for Prospective Technological Studies (IPTS) is one of them, based in Sevilla.

Looking at ICT in education, through various studies and projects, like LAEP. It’s to find evidence to help policymakers make better policy at the European level. Have invited ministries of education, school networks, etc, to understand how to support it. Policies already include 2013 COM on Opening up Education, E&T 2020, Digital Agenda, New skills and Jobs, EU Recommendation on Key Competences for Lifelong Learning. And Digital Single Market agenda.

Mission is to provide EU policies with independent, evidence-based scientific and technical support throughout the whole policy cycle. This is your chance to tell the funders what you want! Also work with DG Employment and Inclusion.

Working on digital transformation of education and training. Working plan with many acronyms, frameworks, looking at national settings and at the European level. They work with teachers, schools, HE institutions, and the skills digital citizens need. This is a supporting study, a short one, tendered and awarded to OU UK.

Riina introduces Jonathan, who works with her.

Expert Introductions (Lightning Presentations)

Rebecca Ferguson (OU UK) introduces the day. Fast moving, four-minute presentations.

Adam Cooper (Tribal Group, UK)

Used to work on LACE and on LAEP. Now works for Tribal, a commercial venture in the UK: student records systems and data management. Working on pilots organised by JISC, trying to get universities started with LA in practice. The pilots are useful for us as a supplier, to experiment with process and software. A standard data-mining process, to systematise how we work with clients. For adoption at scale, balancing repeatability is going to be tricky. Business understanding is very important, including the objectives, critical success factors, and who’ll be involved. We translate those and go through an iterative cycle. Different data are available, and there are different levels of interest in using them. We have an ensemble of data streams, can build predictive models around those sources, linked to the business objectives, and work towards a single indicator. Exploring how to build this, and how transparent to make it.

We have software, including a dashboard. That’s not very flexible alone. We have to accommodate understanding. Our software application does a lot of the heavy lifting.
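A toy illustration of the “ensemble of data streams rolled up into a single indicator” idea described above might look like the sketch below. This is hypothetical: the stream names and weights are invented, and a real system such as Tribal’s would fit a predictive model to outcomes rather than use a fixed weighted sum.

```python
# Hypothetical sketch: roll several per-student activity streams up
# into a single 0-1 indicator. Stream names and weights are invented
# for illustration; a real system would learn them from outcomes.

def normalise(values):
    """Scale a list of raw counts to the 0-1 range (min-max)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def single_indicator(streams, weights):
    """streams: dict name -> per-student counts (same student order).
    weights: dict name -> relative importance, summing to 1.
    Returns one 0-1 engagement score per student."""
    scaled = {name: normalise(vals) for name, vals in streams.items()}
    n_students = len(next(iter(streams.values())))
    return [sum(weights[name] * scaled[name][i] for name in streams)
            for i in range(n_students)]

# Invented example data: five students, three activity streams.
streams = {
    "vle_logins":  [12, 3, 25, 0, 9],
    "submissions": [4, 1, 5, 0, 3],
    "library_use": [2, 0, 7, 1, 0],
}
weights = {"vle_logins": 0.5, "submissions": 0.3, "library_use": 0.2}
scores = single_indicator(streams, weights)  # lowest score flags student 3
```

The transparency question raised above then becomes concrete: a weighted sum like this is easy to explain to staff, whereas a fitted predictive model is usually more accurate but more opaque.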

Alan Berg (University of Amsterdam, NL)

Program manager for LA. Amsterdam has had a 3y LA project, with 8 pilots. One pilot was to collect student activity streams from a learning record store. Four were dashboards, two were predictive models. Now looking at scaling up. Looking at data warehousing, so each faculty can decide what they do with the results. It’s important to have an educational API. We can put a specialised team in.

Doing a hackathon with JISC at LAK16 on their architecture: building a generic architecture, how to cluster students, how to cluster student activity into standardised views.

Alan Berg again (Apereo)

I also work for the Apereo Foundation, on the board of directors. A consortium of 90-100 universities in open source software. We have our own suite of LA tools. It’s growing and maturing. The foundation is a do-ocracy – if you see a problem, you volunteer to solve that problem. Marist College donated some software in this framework. It’s closely related to the SoLAR initiative around Open Learning Analytics. Gaining traction around the world. Some components are involved in the JISC architecture.

Alex Rayón Jerez (Deusto University, Spain)

Research group in three areas – LA, game-based learning, remote experimentation. One of the most important issues is which questions to ask of the educational data. Our activity is data science. We start with proper questions of the data we have gathered in the same data hub, and build domain knowledge. We work first with STEM learning, and with competency-based learning. We have some EU projects – SCALA, KODETU, Make World. Making predictive models for their students. Also SNOLA – the Spanish Network of Learning Analytics.

Andrew Cormack (JISC)

I was a law student when I came across LA; the question “do you consent to LA?” worried me. As a student, it didn’t make sense. As a service provider, it doesn’t help either – it tempts me towards what I can get away with doing. Had a dig around in European law. Two phases – a pattern-finding stage: what factors help students do well, what factors can we include in learning. Then a pattern-using phase. The pattern-finding phase is legally a legitimate purpose; it has to be necessary, and minimise individual impact. Then once you have the pattern – e.g. teaching maths at 9am on Saturday doesn’t work – that could lead to improvement in course design. Some patterns you can use without dealing with individual students – e.g. move it later on Saturday, or more wifi, or provide more support to non-traditional students. That doesn’t need personal data, doesn’t need consent. If you want to intervene with individual students, to provide support or more challenge – then you can offer a meaningful consent choice: do you want the personalised service, or the vanilla service? In conversation with a law professor, the big change, the point, is moving from an ambition to minimise the impact on the individual, to maximising it. That’s a really interesting point to take to the individual: do we agree together? There’ll be a paper in the JLA next month.

Anne Boyer (University of Lorraine, France)

Computer science prof. Big university in the East of France. Lead the KIWI research group, working on automatic analysis of digital traces. AI, modelling, prediction. User behaviour modelling and prediction to get a better understanding of learner, teacher and their interaction. Design recommender systems: what action would improve this? Project Pericles to recommend OERs. Work on memory too. Delivery of pertinent indicators to learners and teachers. Automatic detection of learning communities. Adaptive learning to improve competencies and skills in foreign languages.

Barbara Wasson (University of Bergen / SLATE)

Representing the ministry. We are the new centre in Norway; we won the national competition for the centre for learning analytics and learning sciences. Norway was the first country to put digital literacy beside the 3Rs, in the 2000s. Now digital competence. All our schools are connected; we were first to do national tests digitally. National programme on assessment for learning. Lots of situations for analytics!

Centre came from government report on MOOCs in 2014. One recommendation was a research group looking at that. We started in January. Our mandate is wide – research, competence centre, input to ministry, and to educational sector.

Projects starting: MAP LA – mapping LA in Norway. Big Data in HE, starting with the University of Bergen, first to go entirely to digital exams, by the end of next year. The push came from students, who had an uproar about exam formats. Now assessments are underway, not just exams. iComPAss – taking an EU project into firefighter education, open learner models and analytics.

Charlotta Grönqvist (Sanoma, Finland)

Educational publisher. We make books and digital material for the K12 market, in 5 markets: Belgium, Sweden, Finland and Poland (?and NL). Market leader in many countries, 1m teachers, 10m pupils. We make content. More than 50% is either hybrid or digital. We use LA to improve the content, to make it better, to make life easier for the teacher, to make life more fun and efficient for the pupils, to improve the learning impact. Process: create content, publish on a digital platform (bingel is their most famous one), pupils exercise, they analyse the content and answers – see which questions work and which don’t, which are too hard, how long they take, why pupils answer them wrong. The aim is to improve the content. Other benefits – it’s fun for students if they get it right, very frustrating if they answer correctly but don’t get it. Makes teachers more effective. Good for the company to make our products better, and faster.

Challenges around changing market, and projects with universities.

Dirk Tempelaar (Maastricht University, NL)

We are one of the teachers applying tools. I’m teaching mathematics & statistics to 1st year students, collaborating with Amsterdam and Eindhoven in SURF projects. Digital testing, formative assessment and LA. My class is 1,100 first year students. We use a blended learning approach, using e-tutorials with assessment to personalise the learning. Since the groups are so large, they are also diverse – students with different backgrounds. In an Anglo-Saxon system you would stream; the law doesn’t allow us to do that, but we still need to personalise. We do it using e-tutorials and assessment tools, and give feedback. At Maastricht, we use dispositions analytics (Buckingham Shum & Deakin Crick, LAK2012). The aim of the research is to find what data is most predictive at what stages. Especially in the first stages, learning dispositions in combination with background data are very predictive of learning outcomes and can help interventions. Later on, formative assessments provide useful information.

Ed Foster (Nottingham Trent University, UK)

[Coming later]

Gábor Kismihók (University of Amsterdam, NL)

In the business school, a small multidisciplinary research group. Eduworks network. Many disciplines from sociology to data mining. Project on goal setting: from the area of human resource management, there’s evidence this is important for success. So we apply this in an educational context. Set goals, measure performance and behaviour, how they meet those goals. Looks very promising. Did a pilot study here, scaling up at the University of NSW in Australia. Another project in Eduworks, around Self-Regulated Learning in a MOOC context. In organisational psychology, there are existing measures for SRL. We want to look for evidence of these measures in large datasets from MOOCs. Ongoing project, looks promising so far. Also working on adaptive assessment, how to analyse data from these systems. Want to stress the importance of privacy and ethics, a fundamental issue. This is a serious barrier.

Ian Dewes (Dunchurch Infant School, UK)

My children in my school are from the age of 3 to 7. Their education is different to university; we don’t have large datasets. LA is still something we’ve found useful. There are parallels to be drawn between very young children and those going to university. Student retention at university, analysing early signs of dropout. Very young children can’t drop out, but because of Early Years Education (3-5 yo), there’s a lot of play and choice in their learning. My children couldn’t sit down quietly like you lot are doing – well done. [laughter] LA helps us monitor this. Children doing their own activity. There’s a trend of increasing accountability in learning in England. We assess their learning through observations. We were struggling with this (photos, writing) until we used LA, picking up activity on iPads. “Can’t see the wood for the trees” – some children were observed doing the same thing multiple times, some were avoiding being observed. LA helped us – we can analyse the observations, use the digital trace well; it has had a big impact on outcomes. Next step: not just existing digital traces, but converting the datasets we have into digital datasets. Children read at home with their parents; we want to digitise that so we can analyse it better. LA is something schools are doing without realising it. We are now much more aware – data privacy, data visualisation. Lots of scope to improve what we do and the outcomes for our children.

Jocelyn Manderveld (SURFnet, NL)

Dutch national research network. Mission to provide educational innovation in the use of ICT. LA programme: Innovation Programme 2015-2018, working with Dutch HEIs to get them working with LA. Several topics. Learning analytics readiness – developing instruments, workshops to debate with IT and education depts. We see in the NL lots of researchers, but LA is not deployed at large scale for whole institutions. We are trying to scale this up. The most important thing is what the question is – what you want to know from your data. It’s easy to collect a big warehouse full of data; algorithms will always give you results. We will publish a report soon looking at pedagogical models and what the right questions are to ask. Then we have privacy. We have a big task addressing these issues. Important to look not at the hindrances but at the possibilities. What’s possible according to EU law, Dutch law. Another report. We have a large part on architecture and tools, developing infrastructure, making tools available so institutes can offer them. Also looking at data loggers, so students can own their own data. We are also looking at research – what data really say something about student success. At a national level, bringing educationalists, data scientists, all together to solve problems.

Jonatan Castaño Muñoz (EC JRC IPTS)

More focused on MOOCs. Quantitative methods in educational research. Example of MOOCKnowledge. Interested in complementarity with learning analytics.

Kristel Rillo (Ministry of Education and Research, Estonia)

We have a lot of data. If you torture data enough, it will confess. We can do anything, but we want to do it wisely. As a ministry, we have a very good database about teachers and students. We have a lifecycle. We haven’t used it as wisely as we could – e.g. for career planning. We want to decrease dropout and have better career planning. Vocational education is always a challenge. We have data in our system, the LMS – there are only two big ones in Estonia, so we can do it in Excel. We are following the Polish experiment, or experience, to see how it can be useful for us. In general education, we are moving towards formative assessment. Good to analyse numbers, but when you have [more detailed feedback, it’s better]. Admission Information System: we can see who studies e.g. ICT. E-tax database: a need to follow the labour market. Need to respect privacy, take the data that’s available to us. We took graduates and data from the e-tax system, employment taxes, and measured how successful they were after university. We have a lot of data! We need to do something useful with it.

Kristian Ørnsholt (Danish Ministry for Children, Education and Gender Equality, Denmark)

We’re working with central-level data and local data – so on two levels. One big initiative is a data warehouse, 3-4y of work, available publicly on the web. Aimed at school leaders, with data about the goals we’re working towards in the educational system. The latest initiative is some new dashboards, targeted towards parents and choice of school. We bring relevant data to choosing schools, and to discussing the class with your teacher – e.g. on pupils’ wellbeing, attendance rates, and so on. We have national tests, including adaptive tests. We have a centralised IT infrastructure, with an ID number for every pupil in the Danish schools. This ID number we try to distribute to support exchanging local data. We want to help the market, local schools and municipalities to exchange data more easily. We don’t want to be a Big Brother ministry gathering data, but to help locally – to bring the data we do have centrally more into the game locally. Working a lot on formulating standards for data exchange, and also trying to make platforms where we can share the central data alongside local data. We are starting a project on how to measure progression – student progression within the frame of our national curriculum. It is complex; we experience a lot of demand for standards for how to measure pupils’ progress. It’s huge work. Interested in hearing from other countries about how to measure that.

María Jesús García (Ministerio de Educación, Cultura y Deporte, Spain)

Head of online teacher training in Spain. Three different pathways. Online tutored courses, on a traditional Moodle platform, take 2 months, open twice a year, March and September. Our big thing is MOOCs; we are launching one per month, usually in Spanish, but launching our first in English. educaLAB moocintef. Looking at how learners interact, from Facebook groups to our ministry’s professional networking system. Whatever we do, it has to be social. We are interested in analysing what’s going on, what’s more motivational. Agreement with IPTS for MOOCKnowledge. Apart from collecting data, we want to analyse it. Portfolios, debates, interactions, digital artefacts the learners make. We write reports, but then what? That’s why I’m here.

Mark Brown (Dublin City University, Ireland)

Director of the National Institute for Digital Learning (NIDL). Institutional hat too. Project called PredictED. We have a national centre for data analytics, more general than education. Partnership between my team and them. Simple project: sending email to 1st year students based on data from the VLE/LMS, Moodle. 6y of data gathering to establish patterns. Based on that, sent emails weekly based on risk/activity. Very simple intervention. Will talk about the politics at institutional and national level.

I wasn’t centrally involved, but my staff were. My role was to take these innovations and connect them up. The President wants us to be doing LA; this delivered a project. CEOs, VCs – what they know at one level doesn’t have the same depth as the knowledge in, e.g., this room. Some of the claims made about the project are weak, if not somewhat risky, to say the least. A national newspaper article appeared, claiming the initiative had increased achievement by 3%. Correlation and causation issue. Not everyone does educational research. Only about 15% of students responded to the survey, so the percentages quoted are misleading – but they are used around the institution without that level of critique.

Be critical of the focus on LA; turn it onto what we care about. An Australian report, by Shane Dawson – look at the process as a way of rethinking teaching and learning.

Susan Flocken (European Trade Union Committee for Education)

Teacher union’s point of view. We’re developing a policy document on the use of ICT in the C21st. My main point is quality. That’s the main issue for us. Focusing on the effectiveness of the teaching profession, what makes it attractive, what the role of teachers is, assuring academic freedom. ICT is only a tool, not the main purpose – it’s about learning. Learning analytics is not about the marks at the end, it’s about the process: have they learned to learn? Heard a lot about the economic link, the labour market – but it’s about preparing for a life outside school and university. It’s important to involve those working in the field. Many of those working on this only see ICT or the labour market. So ask the teachers, the professors, education staff. Working conditions, working environment – their working environment is the students’ learning environment. Not enough skilled women in digital skills; we want to do something about that. Other inequalities too. Access, special needs (teachers and learners). A big problem is the risk of privatisation and commercialisation of public education. Education is a public good. Critical point: it’s not about making money out of it. Also looking at Internet safety, and promoting open education as a means of adding value to teaching and learning. It’s not about saying teachers are not interested in ICT; we need to provide the environment, training, and CPD for teachers who have been teaching for many years.

Tim Vogelsang (iversity, Germany)

MOOC provider. iversity is a European provider. I lead the marketing and business intelligence team; my passion is analytics – which I can do publicly. Two examples – peer grading, group creation. I’m not so interested in plotting graphs as in having a concrete problem and solving it. Peer grading: in a course with 1,000-2,000 students, they grade themselves, many grade each other; how do you derive a grade for each person – means or averages, or something like Google PageRank? Finding the approach that best improves the learning is the challenge there. The second one is group creation. The question is how to create learning groups in a MOOC. Used k-means clustering to put people together who are similar. The real challenge is that you want similarity on some factors, but diversity on others. Maybe the same time zone, so they can collaborate, but a mixture of e.g. male and female. Except in e.g. Muslim countries that want male and female separate.
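The group-creation problem described here – similar on some factors, diverse on others – can be sketched with a simple heuristic. This is a hypothetical illustration, not iversity’s implementation: plain k-means doesn’t handle the diversity constraint, so the sketch buckets learners on the similarity factor (time zone) and then interleaves on the diversity factor (gender). All names and data are invented.

```python
# Hypothetical sketch of MOOC group creation: bucket learners by time
# zone (the similarity factor, so groups can meet), then deal them out
# alternating by gender (the diversity factor) so each group is mixed.
from collections import defaultdict

def form_groups(learners, group_size=4):
    """learners: list of (name, utc_offset, gender) tuples.
    Returns a list of groups (lists of names)."""
    # 1. "Cluster" on the similarity factor: bucket by UTC offset.
    by_zone = defaultdict(list)
    for name, offset, gender in learners:
        by_zone[offset].append((name, gender))

    groups = []
    for members in by_zone.values():
        # 2. Interleave genders so consecutive members alternate,
        #    giving each sliced group a mix.
        females = [n for n, g in members if g == "f"]
        males = [n for n, g in members if g == "m"]
        ordered = []
        while females or males:
            if females:
                ordered.append(females.pop())
            if males:
                ordered.append(males.pop())
        # 3. Slice the ordered list into fixed-size groups.
        for i in range(0, len(ordered), group_size):
            groups.append(ordered[i:i + group_size])
    return groups

# Invented example: six learners in two time zones.
learners = [("a", 1, "f"), ("b", 1, "m"), ("c", 1, "f"),
            ("d", 1, "m"), ("e", 8, "f"), ("f", 8, "m")]
groups = form_groups(learners, group_size=2)
```

A production version would replace step 1 with k-means over several features; the point of the sketch is that the diversity constraint typically needs a separate post-processing step rather than the clustering itself.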

Topi Litmanen (Claned, Finland)

A startup, 20 people, in Helsinki, Dubai, Singapore, London. A learning environment that can be used for e-learning. Pilots and customers: universities providing e-learning, from medical education to dance education. We try to provide tools to make the learning process visible for the student and the teachers. In our environment, you can embed your e-learning materials, or upload videos, documents, Powerpoints; we provide automatic keywords and topics, and track everything that the learners do, how they spend their time. We do analytics on the interactions between different learners, focused on collaboration. An example screenshot: a teacher view of a course on medical education. We provide data to teachers by looking for groups of students who act similarly, or have similar motivational patterns. Here it is students with similar challenge evaluations, at the topic level. We try to make the learning process visible for the teacher; the information might be used for giving supporting materials, or more teaching on topics experienced as challenging. We give the data back to the learner, via a learning tracker tool. The next phase is to use the data to provide suggestions for individualised learning paths. If you have a learning goal, what materials would be useful? Suggest materials outside the course to support your learning process. Our differentiation from other providers is that we try to provide a solution that doesn’t have to be hard-coded in.

This work by Doug Clow is copyright but licenced under a Creative Commons BY Licence.
No further permission needed to reuse or remix (with attribution), but it’s nice to be notified if you do use it.

Openness as feature

Professor Martin Weller's Blog - Thu, 03/03/2016 - 10:41

(going with the “if in doubt, use one of Alan’s pics” approach)

Sorry, this is two ‘open’ posts in a row, I’ll blog something else soon (if you want something very different, I’ve started a film a week blog, it’s reassuringly uninformed).

There have been a few announcements recently that made me reflect on the co-option of ‘open’ in a commercial sense. The first was Amazon’s Inspire announcement where they look to be getting into the OER game. Amazon & OER, that is big time and has Battle for Open written all over it. It could be amazing, it could miss the point of OERs altogether. Audrey Watters blogged her reaction to it, but I guess we’re playing a wait and see game at the moment. I will say, as far as I know, the Amazon team haven’t spoken to people in the OER world and haven’t previously engaged with that community (not that they need to of course, they’re Amazon, but they might learn something useful).

The second was actually an old article (from 2014, practically prehistory I know), that I only recently came across. It was predicting how SOOCs (selectively open online courses) would be better than MOOCs, because SOOCs would have “an entrance requirement designed to reduce the unwanted diversity.” As the kids say: I can’t even. Unwanted diversity? Selectively open?

One more – a piece in Inside Higher Ed about Coursera beginning to charge for more of its MOOCs. The piece says that learners can explore freely but “To turn the course materials into an actual course, learners have to pay.” The Coursera blog said: “We are on a mission to change the world by providing universal access to the best learning experience. … The changes that we are making this year will move us toward sustainability and enable continued investment in our learning experience, without compromising our commitment to transforming lives for people around the world.”

What these highlight to me is that openness is a feature when you’re developing a business model or technology. Will it get you more money or users? If yes, then adopt it. If no, like any feature it can be dropped. Compare this with universities and non-profit organisations, for whom openness is a principle. It is embedded in what they do, and matches their core mission (or should do, although the increasing commercialisation of universities may see more feature-based thinking). So while any announcement by a big company that they are adopting open gets headlines and is exciting, it is worth examining to what extent it is a feature versus a principle.


Positive openness

Professor Martin Weller's Blog - Tue, 01/03/2016 - 15:54

I’ve been mulling over something for a while on how openness ain’t what it used to be. I’m not sure I’ve got it, but a few strands are converging.

Firstly, the way openness is framed now is really as free. Tressie McMillan Cottom gave a good presentation at ICDE last year, in which she highlighted that the new forms of openness do not create the equality many had assumed. For instance, it is mainly elite universities that adopt open source LMSs, whereas poorer community colleges sign up with commercial providers. And both for OERs and MOOCs, the learners who use them most tend to be well educated already and from privileged backgrounds. Simply making something ‘open’ itself does not lead to equality or democratisation, and in fact may increase inequality.

This arises because ‘open’ has become largely synonymous with ‘free’. But openness is something much richer and more complex than this. In order to make things truly open, then free may be the least interesting element in the overall equation. You need to provide support structures, to specifically meet the needs of the audiences you feel might benefit from open. And that costs money. So not only does the equating of open to mean free underplay other elements, but it also falsely gives people the impression that this is a cheap option. It is not. Support for openness usually requires people, and they are often the most expensive component in an education system. Whether that is supporting learners on MOOCs, or supporting teachers to adapt OERs, the ‘build it and they will come’ philosophy only applies to people for whom the route is already easy. It also gives a lower return. The audiences you might want to benefit from open approaches are likely to drop out more, get lower grades, earn less than the guaranteed successful people. So now openness is more expensive and gives lower success rates. It then becomes less of a Silicon Valley dream investment, but it does become more of a social good.

The conflation of open with free has nearly always been to the detriment of the people openness is intended to benefit. So we need to get away from it: Accept that openness costs money somewhere in the system if you want to do it properly, or stop calling it open education.

My colleague Rob Farrow has been coming at this from a philosophical perspective. He has been thinking about how open is currently framed as an absence – e.g. the removal of restrictive copyright. In the presentation below he frames it using Isaiah Berlin’s concept of positive and negative liberty. Negative liberty is the removal of constraints, whereas positive liberty carries agency. Both are required, but I think Rob’s point is that we’ve focused on the negative liberty aspect hitherto and now need to move to the positive liberty aspect:

Constellations of Open from Robert Farrow

This might be a useful way of thinking about the type of supported openness I mean here. When the Open University was founded it developed a model called “Supported Open Learning”. Note that it wasn’t just “Open Learning” – without that ‘supported’ part the rest falls away. This is my resolution for this year, to look at open education ventures and ask ‘yes, but where’s the positive openness?’

