
Learning Analytics symposium

KMi Podium, Level 4, Berrill Building, The Open University, Milton Keynes, UK [map]

1.25-4.30pm BST, Thursday 25 October 2012 [convert times]

#StormSLA twitter archive, question archive, and visualization (thanks Martin Hawksey)

This hybrid face-to-face/webinar is part of SoLAR Storm — the new virtual research lab convened by the Society for Learning Analytics Research, to build research capacity in this new field by networking PhD researchers with each other and the wider community.

1.25pm Welcome from Simon Buckingham Shum


1.30pm – Visualising Social Learning in the SocialLearn Environment

Bieke Schreurs and Maarten de Laat (Open University, The Netherlands), Chris Teplovs (Problemshift Inc. and University of Windsor), Rebecca Ferguson and Simon Buckingham Shum (Open University UK)

DeLaat-SoLARStormOct2012.pdf / Network Analysis Tool + SocialLearn demo movie

The OUNL team will talk about work in progress from a SocialLearn research internship held by Bieke Schreurs. The Network Awareness Tool (NAT) was developed initially for rendering the normally invisible non-digital networks underpinning informal learning (in particular for teacher professional development). The work reported here describes how NAT was adapted to render social networks between informal learners in the OU’s SocialLearn platform, in which different social ties can be filtered in and out of the network visualization, and moreover, enriched with topics.
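The tie filtering described above can be sketched with a simple edge-list representation. This is a minimal illustration only: the tie types, topics, and function names are invented and do not reflect the Network Awareness Tool's actual implementation.

```python
# Hypothetical edge list for a SocialLearn-style network; the tie types
# and topics are invented for illustration.
edges = [
    {"src": "ana", "dst": "ben", "tie": "follows", "topic": "statistics"},
    {"src": "ben", "dst": "carol", "tie": "comments", "topic": "statistics"},
    {"src": "ana", "dst": "carol", "tie": "messages", "topic": "history"},
]

def filter_network(edges, ties=None, topic=None):
    """Keep only edges matching the requested tie types and topic."""
    return [e for e in edges
            if (ties is None or e["tie"] in ties)
            and (topic is None or e["topic"] == topic)]

# Filter different social ties in and out, enriched with topics.
print(len(filter_network(edges, topic="statistics")))            # 2
print(len(filter_network(edges, ties={"follows", "messages"})))  # 2
```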

Biographies: Bieke Schreurs is an intern on the SocialLearn project, working on the development of social network analytics. Since 2010, she has worked at the Ruud de Moor Centre as a researcher and project manager. She is involved in projects to support primary and secondary schools in their professionalisation process, both in the Netherlands and internationally. Her research focus is on networked learning, and on ways to support and encourage this process.

Maarten de Laat is a full professor at the Open University of The Netherlands, and is director of the social learning research programme. His research concentrates on teacher professional development, knowledge creation through (online) social networks and the impact the technology and social design has on the way these networks work and learn. He has published and presented his work extensively in research journals, books and conferences.

Chris Teplovs is Chief Research Scientist at Problemshift, Inc.  He specialises in the design and implementation of interactive analytics representations of complex data with particular emphasis on network learning, learning analytics, and formative e-assessment.  He is currently a Visiting Scholar in the Faculty of Science at the University of Windsor, Canada.

2.15pm – Data Wrangling – Bridging the Gap between Analytics and Users

Gill Kirkup, The Open University, UK


Gill will talk about the ‘data wrangling’ programme that she leads. Data wranglers provide a bridge between those who produce learning analytics and those who make use of those analytics.

Biography: Gill Kirkup is a Senior Lecturer in Educational Technology in the Institute of Educational Technology at The Open University. Her research interests are in gender and lifelong learning (e-learning and distance education), students’ use of learning technologies in their domestic and work environments, and the use by home-based staff of technologies for teaching.

3pm – What Can Learning Analytics Contribute to Disabled Students’ Learning and to Accessibility in e-Learning Systems?

Martyn Cooper, The Open University, UK


This presentation explores the potential of learning and academic analytics to improve accessibility in e-learning systems and to enhance support for disabled learners:

  • The accessibility focus here is on identifying problems of access to learning systems (VLEs, etc.) for disabled students. (By extension, learning analytics could possibly identify other interaction issues that have a wide impact on learning for any student.)
  • Enhanced support for disabled students could be achieved both by extending the general benefits of learning analytics to disabled students and by focusing dedicated support on them.

Several use cases are outlined in order to make the argument for learning and academic analytics as a set of tools in these areas. Ways in which these approaches could be technically achieved are outlined, and issues are then highlighted that will need to be addressed if these approaches are to be deployed at enterprise level.

There will be a discussion of accessibility metrics that highlights ways in which a learning analytics approach offers some advantages over traditional accessibility evaluations against standards such as W3C’s Web Content Accessibility Guidelines (WCAG 2.0).
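One way such an analytics-based accessibility metric could work is to compare activity completion rates between students who have and have not declared a disability, flagging large gaps for accessibility investigation. A minimal sketch, with invented data, schema, and threshold:

```python
# Hypothetical attempt log: (activity, declared_disability, completed).
# Data and threshold are invented for illustration.
attempts = [
    ("quiz_1", False, True), ("quiz_1", False, True),
    ("quiz_1", True, False), ("quiz_1", True, False),
    ("quiz_2", False, True), ("quiz_2", True, True),
]

def flag_gaps(attempts, threshold=0.3):
    """Flag activities where disabled students' completion rate lags badly."""
    stats = {}
    for act, disabled, done in attempts:
        total, comp = stats.get((act, disabled), (0, 0))
        stats[(act, disabled)] = (total + 1, comp + int(done))
    flagged = []
    for act in {a for a, _, _ in attempts}:
        rates = {}
        for disabled in (False, True):
            total, comp = stats.get((act, disabled), (0, 0))
            if total:
                rates[disabled] = comp / total
        if len(rates) == 2 and rates[False] - rates[True] > threshold:
            flagged.append(act)
    return flagged

print(flag_gaps(attempts))  # ['quiz_1']
```

Unlike a one-off WCAG audit, a metric like this runs continuously against real interaction data, which is part of the advantage the presentation argues for.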

Biography: Martyn Cooper is a Senior Research Fellow, in the Institute of Educational Technology (IET), at The Open University in the UK.  His academic background is Cybernetics and Systems Engineering.

He has a B.Sc. in Cybernetics & Control Engineering with Mathematics (from Reading University). So far he has managed to bypass all higher degrees. (It took him 12 years to get his Bachelor's degree!)

His research focuses on:

  • applications of new and emerging technologies to enable and empower disabled people
  • effective access to a wide range of technologies by disabled people

3.45pm – Good Pedagogical Practice driving learning analytics: OpenMentor, Open Comment and SAFeSEA

Denise Whitelock, The Open University, UK


There is a recognition that e-assessment accompanied by appropriate feedback to the student is beneficial for learning. However, one of the problems with both tutor and electronic feedback to students is that a balanced combination of socio-emotive and cognitive support is required from the teaching staff and the feedback needs to be relevant to the assigned grade. This presentation will discuss two technology-enhanced feedback systems that were based upon models of good pedagogical practice (OpenMentor and Open Comment). It will also introduce my most recent project, Supported Automated Feedback for Short Essay Answers (SAFeSEA), which involves the natural language processing problems of how to ‘understand’ a student essay well enough to provide accurate and individually targeted feedback and how to automatically generate that feedback.

Biography: Denise Whitelock is a Senior Lecturer in Information Technology who specialises in building feedback models for e-assessment systems. Her e-assessment research has been funded by a number of external bodies. The most influential in informing policy was the e-Assessment Roadmap, which stressed the need for automated text recognition systems and which assisted the JISC with the planning of their e-assessment strand. Denise directed the Computer Assisted Formative Assessment Project (CAFA) at The Open University, and has investigated the use of feedback with electronic assessment systems with a large number of course teams. She has built on this expertise to construct an electronic feedback system known as eMentor, which won an Open University Teaching Award. This system provides tutors with feedback on the comments they have made on their students’ assignments and coursework. It has now been transformed into OpenMentor, and the JISC are funding the transfer of this technology to Southampton University and King’s College London. She has recently been awarded an EPSRC grant to provide an effective automated interactive feedback system that yields an acceptable level of support for university students writing essays in a distance or e-learning context.

4.30pm – Close and Refreshments

For a researcher, SocialLearn is a vehicle and innovation platform for probing, and experimenting with, the future shape of the learning landscape.

Learning analytics are, bluntly, ‘merely’ technology for assessment. And one of the pivotal debates raging at present is about our assessment regimes. It boils down to: what does “good” look like? “Attainment”, “Progress”, “High impact intervention”, “Quality Students”… all of these rest on foundational assumptions about what it is we’re trying to accomplish. I am always struck by the slide Guy Claxton uses:

As discussed at length in many other reports and books, a false dichotomy can spring up in lazy (or politically charged) debates, between on the one hand mastery of discipline knowledge and on the other, transferable 21st century skills — as though the two must be in conflict, or that the 21st century doesn’t also need subject matter knowledge. (See the Assessment and Teaching of 21st-Century Skills project for one perspective on this, and Whole Education for another.) What we know of course is that both are vital, and that while it’s possible to teach just enough for the learner to regurgitate knowledge to pass exams, what we need for society is pedagogy that sparks the desire to learn for life, equips learners to conduct their own inquiries, empowers them to apply their knowledge in the real world, and prepares them to thrive under conditions of unprecedented uncertainty compared to those for which our current education systems were designed.

Learning analytics and automated assessment are key technology jigsaw pieces if the global thirst for learning is to be met. You simply can’t do it with paper, bricks+mortar, and human tutors marking everything. It doesn’t scale. Every VLE (or LMS for our non-English readers) now ships with a rudimentary analytics dashboard providing basic summary statistics on relatively low level user activities. The learning analytics horizon is around defining and operationalizing computationally tractable patterns of user activity that may serve as proxies for higher order cognition and interaction.
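The "basic summary statistics on relatively low level user activities" such dashboards report amount to little more than event counts. A sketch, assuming a hypothetical VLE event-log schema:

```python
from collections import Counter

# Hypothetical VLE activity log; the event schema is invented for
# illustration of the kind of low-level counting a dashboard reports.
events = [
    {"user": "ana", "action": "view_resource"},
    {"user": "ana", "action": "post_forum"},
    {"user": "ben", "action": "view_resource"},
    {"user": "ben", "action": "view_resource"},
    {"user": "ana", "action": "reply_forum"},
]

def dashboard(events):
    """Rudimentary summary statistics: events per user, events per action."""
    per_user = Counter(e["user"] for e in events)
    per_action = Counter(e["action"] for e in events)
    return per_user, per_action

per_user, per_action = dashboard(events)
print(per_user["ana"], per_action["view_resource"])  # 3 3
```

The research challenge the paragraph describes is precisely that counts like these say nothing about higher-order cognition; they are raw material, not proxies.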

The Social Learning Analytics programme focuses on what we hypothesize to be domain-independent patterns typical of the Web 2.0 sphere of activities, including social networking analytics, social discourse analytics, and dispositional analytics. We won’t say any more about those for now [learn more].

But here at the OpenU there’s a huge amount of learning technology innovation going on [overview]. While we care deeply about learning analytics for lifelong/lifewide learning, we’re also working on the complementary analytics which will enable very large numbers of learners to receive automated feedback on their mastery levels of a given topic, but through rich user interfaces that move us far beyond the old multiple-choice quiz.

In work led by Phil Butcher on OpenMark, for instance, you’ll find the pedagogical principles underpinning automated assessment for online learners who may be anywhere:

The emphasis we place on feedback. All Open University students are distance learners and within the university we emphasise the importance of giving feedback on written assessments. The design of OpenMark assumes that feedback, perhaps at multiple levels, will be included.

Allowing multiple attempts. OpenMark is an interactive system, and consequently we can ask students to act on feedback that we give ‘there and then’, while the problem is still in their mind. If their first answer is incorrect, they can have an immediate second, or third, attempt.

The breadth of interactions supported. We aim to use the full capabilities of modern multimedia computers to create engaging assessments.

The design for anywhere, anytime use. OpenMark assessments are designed to enable students to complete them in their own time in a manner that fits with normal life. They can be interrupted at any point and resumed later from the same location or from elsewhere on the internet.

The three screens below illustrate some of the interaction paradigms that OpenMark makes possible; see their website for many more.

It’s also possible to provide formative assessment on short, free text answers to questions, and evaluation of human vs. automated assessment of learner responses is encouraging, e.g.

Philip G. Butcher & Sally E. Jordan (2010). A comparison of human and computer marking of short free-text student responses. Computers & Education, 55, pp. 489-499. DOI:

Abstract: The computer marking of short-answer free-text responses of around a sentence in length has been found to be at least as good as that of six human markers. The marking accuracy of three separate computerised systems has been compared, one system (Intelligent Assessment Technologies FreeText Author) is based on computational linguistics whilst two (Regular Expressions and OpenMark) are based on the algorithmic manipulation of keywords. In all three cases, the development of high-quality response matching has been achieved by the use of real student responses to developmental versions of the questions and FreeText Author and OpenMark have been found to produce marking of broadly similar accuracy. Reasons for lack of accuracy in human marking and in each of the computer systems are discussed.
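The "algorithmic manipulation of keywords" the abstract attributes to the Regular Expressions and OpenMark systems can be very roughly sketched as pattern matching against acceptable and unacceptable phrasings. The question and patterns below are invented for illustration and are not taken from either system:

```python
import re

# Hypothetical marking scheme for the question "Why does ice float on water?"
# Accept patterns capture acceptable phrasings; reject patterns catch
# common misconceptions. All patterns are invented for illustration.
accept = [
    r"\b(ice|solid)\b.*\bless dense\b.*\b(water|liquid)\b",
    r"\bdensity\b.*\b(ice|solid)\b.*\b(lower|less)\b",
]
reject = [r"\bmore dense\b"]

def mark(response):
    """Return True if the response matches an accept pattern and no reject pattern."""
    text = response.lower()
    if any(re.search(p, text) for p in reject):
        return False
    return any(re.search(p, text) for p in accept)

print(mark("Ice is less dense than water"))  # True
print(mark("Ice is more dense than water"))  # False
```

As the abstract notes, patterns of this kind are refined against real student responses gathered from developmental versions of the questions.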

Work by Paul Piwek provides automated assessment of critical thinking skills. Embedded in the OU’s Moodle-based VLE (using OpenMark) is an integrated drag-and-drop interface for students to complete simple argument map templates in their analysis of a target text. The screen below shows how the learner drags statements into the template to demonstrate their ability to parse the text into the top-level claim and supporting/challenging premises/sub-premises. Learners can then check their map, which locks the correctly placed elements in position but leaves them to try again with those they got wrong.
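The check step described above can be sketched as a comparison of the learner's placements against an answer key, keeping correct placements and returning the rest for another attempt. The slot names and statement identifiers here are hypothetical:

```python
# Hypothetical answer key mapping template slots to statement identifiers.
answer_key = {
    "claim": "s1",
    "premise_1": "s2",
    "premise_2": "s4",
    "challenge": "s3",
}

def check_map(placements):
    """Split the learner's placements into correct ones and ones to retry."""
    correct = {slot: s for slot, s in placements.items()
               if answer_key.get(slot) == s}
    retry = [s for slot, s in placements.items()
             if answer_key.get(slot) != s]
    return correct, retry

# The learner swapped the two premises; only those come back for retry.
correct, retry = check_map({"claim": "s1", "premise_1": "s4",
                            "premise_2": "s2", "challenge": "s3"})
print(sorted(correct))  # ['challenge', 'claim']
print(sorted(retry))    # ['s2', 's4']
```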

So, the exciting thing is when analytics for learning-to-learn come together with analytics for discipline mastery. Without one or the other, we miss the defining contours of the new learning landscape: deep grasp of material, moving from logging low-level actions to higher-order cognition, tracking social learning dynamics, and dispositions for lifelong/lifewide learning.

My current mantra…

Our learning analytics are our pedagogy.

Let’s make them fit to equip learners for complexity.

Some of the thinking behind SocialLearn is set out in a chapter in the new book edited by a team at the Open University’s Knowledge Media Institute, with contributions from some of the thought leaders on the future of education:

Collaborative Learning 2.0: Open Educational Resources

Current advances and convergence trends in Web 2.0 have changed the way we communicate and collaborate, and as a result, user-controlled communities and user-generated content through Web 2.0 are expected to play an important role for collaborative learning.

Collaborative Learning 2.0: Open Educational Resources offers a collection of the latest research, trends, future development, and case studies within the field. Without a solid theoretical foundation and precise guidelines on how to use OER and Web 2.0 for collaborative learning, it would certainly be very difficult to obtain all the benefits that these “user-generated content, resources and tools” promise. The purpose of this handbook is to understand how OERs and Web 2.0 can be deployed successfully to enrich the collaborative learning experience and ensure a positive outcome in terms of user generated knowledge and development of skills.

In Chapter 17, we set out the rationale that motivates the creation of a platform like SocialLearn…

This chapter examines the meaning of “open” in terms of tools, resources, and education, and goes on to explore the association between open approaches to education and the development of online social learning. It considers why this form of learning is emerging so strongly at this point, what its underlying principles are, and how it can be defined. Openness is identified as one of the motivating rationales for a social media space tuned for learning, called SocialLearn, which is currently being trialed at The Open University in the UK. SocialLearn has been designed to support online social learning by helping users to clarify their intention, ground their learning and engage in learning conversations. The emerging design concept and implementation are described here, with a focus on what personalization means in this context, and on how learning analytics could be used to provide different types of recommendation that support learning.

Read the whole chapter as an open eprint on the OU’s Open Research Online server:

Ferguson, R. and Buckingham Shum, S. (2012). Towards a Social Learning Space for Open Educational Resources. In: Okada, A., Connolly, T. and Scott, P. (Eds.), Collaborative Learning 2.0: Open Educational Resources. Hershey, PA: IGI Global, pp. 309–327. Eprint:

Social Learning Analytics symposium
— student work-in-progress

A SoLAR Storm event hosted by the Open University SocialLearn Research Project

KMi Podium, Level 4, Berrill Building, Open University, UK [map]

12.30 Lunch
Programme: 2-5pm Tues 26 June [convert timezones]

Tweet on #StormSLA if you’re attending,
so we can see you on TwitterMap

This is the chance to catch up with the research programme on Social Learning Analytics that we are developing.

We’ll start by giving an introduction, and then two of our fabulous SocialLearn Interns will present their work in progress, which moves this programme forward.

This hybrid face-to-face/webinar is part of SoLAR Storm — the new virtual research lab convened by the Society for Learning Analytics Research, to build research capacity in this new field by networking PhD researchers with each other and the wider community.


12.30 – Lunch

2pm – Social Learning Analytics: Five Approaches

[Tweet comments/questions #StormSLA @r3beccaf @sbskmi]

Rebecca Ferguson and Simon Buckingham Shum, Knowledge Media Institute & Institute of Educational Technology, The Open University, UK

This paper proposes that Social Learning Analytics (SLA) can be usefully thought of as a subset of learning analytics approaches. SLA focuses on how learners build knowledge together in their cultural and social settings. In the context of online social learning, it takes into account both formal and informal educational environments, including networks and communities. The paper introduces the broad rationale for SLA by reviewing some of the key drivers that make social learning so important today. Five forms of SLA are identified, including those which are inherently social, and others which have social dimensions. The paper goes on to describe early work towards implementing these analytics on SocialLearn, an online learning space in use at the UK’s Open University, and the challenges that this is raising. This work takes an iterative approach to analytics, encouraging learners to respond to and help to shape not only the analytics but also their associated recommendations.

Key Reference:

Ferguson, R. and Buckingham Shum, S. (2012). Social Learning Analytics: Five Approaches. Proc. 2nd Int. Conf. Learning Analytics & Knowledge, (29 Apr-2 May, Vancouver, BC). ACM Press: New York. Eprint:

Bios: Simon was Programme Co-Chair for the 2012 Learning Analytics conference, serves on the new Society for Learning Analytics Research, and is a regular invited speaker on the topic, including at EDUCAUSE and Ascilite. His particular interests are in what learning analytics may be blind to, analytics for informal/social learning, and whether analytics can help build the learning dispositions and capacities needed to cope with complexity and uncertainty — the only things we can be sure the future holds.

Rebecca is a research fellow in the UK Open University’s Institute of Educational Technology, focused on Educational Futures. She works as research lead on the SocialLearn team, developing and researching initiatives to improve pedagogical understanding of learning in online settings, to design analytics to support the assessment of learning in these settings, and to extend the university’s ability to support learning in an open world.

2.45pm – A Self-Training Framework for Automatic Exploratory Discourse Detection

[Tweet comments/questions #StormSLA @zhongyu_wei]

Zhongyu Wei, Department of System Engineering & Engineering Management, The Chinese University of Hong Kong

With the development of online learning platforms such as SocialLearn, learning resources are uploaded to the internet at a dramatically increasing rate, which makes it difficult for individuals to identify the information they need. Ferguson and Buckingham Shum (2011) seek a qualitative understanding of the context and meaning of this information, identifying “exploratory dialogues” based on sociocultural discourse analysis (Mercer, 2004) to help users decide whether a resource is useful.

In this project, we extend a previously proposed self-training framework (He and Zhou, 2011) to detect exploratory dialogues in online learning materials automatically. We cast the detection of exploratory dialogues as a binary classification task that classifies a given piece of text as exploratory or non-exploratory. We first train an initial maximum entropy (MaxEnt) classifier on a small, manually annotated dataset. The trained classifier is then applied to a large amount of unseen data. Texts classified with high confidence (referred to as pseudo-labeled instances) are added to the training data pool to iteratively update the classifier. Apart from incorporating pseudo-labeled instances directly into the MaxEnt training process, we also explore the use of pseudo-labeled features to constrain the MaxEnt training. Our extensive experiments on transcribed text from online conferences and learning paths data downloaded from the SocialLearn platform show that with the self-training framework, the performance of MaxEnt improves significantly. The improvement is more prominent when fewer annotated training instances are available. The proposed approach will be integrated into the SocialLearn platform to highlight exploratory discourse in learning paths.
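The self-training loop the abstract describes can be sketched in miniature: train a maximum-entropy (logistic regression) classifier on a small labelled set, classify unlabelled text, and feed high-confidence predictions back as pseudo-labeled instances. The data, features, and thresholds below are toy stand-ins, not the authors' dataset or implementation:

```python
import math
from collections import defaultdict

def features(text):
    """Bag-of-words feature vector (toy stand-in for the real features)."""
    bag = defaultdict(float)
    for w in text.lower().split():
        bag[w] += 1.0
    return bag

def predict(weights, x):
    """P(exploratory | text) under a logistic (MaxEnt) model."""
    z = sum(weights[w] * v for w, v in x.items())
    return 1.0 / (1.0 + math.exp(-z))

def train(data, epochs=200, lr=0.5):
    """Fit logistic regression by stochastic gradient ascent."""
    weights = defaultdict(float)
    for _ in range(epochs):
        for text, y in data:
            x = features(text)
            err = y - predict(weights, x)
            for w, v in x.items():
                weights[w] += lr * err * v
    return weights

labelled = [            # 1 = exploratory, 0 = non-exploratory (toy data)
    ("why do you think that is true", 1),
    ("what evidence supports this claim", 1),
    ("hello everyone see you later", 0),
    ("good morning all", 0),
]
unlabelled = ["what is your evidence for that", "good morning everyone"]

weights = train(labelled)
for _ in range(3):  # a few self-training iterations
    # Keep only confident predictions as pseudo-labeled instances.
    pseudo = [(t, int(predict(weights, features(t)) > 0.5))
              for t in unlabelled
              if abs(predict(weights, features(t)) - 0.5) > 0.3]
    weights = train(labelled + pseudo)

print([round(predict(weights, features(t))) for t, _ in labelled])  # [1, 1, 0, 0]
```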

Preparatory material: LAK11 paper introducing Exploratory Discourse Learning Analytics: [paper] [replay]

Key References:

R. Ferguson and S.Buckingham Shum. 2011. Learning analytics to identify exploratory dialogue within synchronous text chat. In Proceedings of the 1st International Conference on Learning Analytics and Knowledge, pp. 99–103. ACM. Eprint:

N. Mercer. 2004. Sociocultural discourse analysis. JAL, 1:137-168.

Y. He and D. Zhou. 2011. Self-training from labeled features for sentiment analysis. Information Processing & Management, 47(4):606–616.

Bio: Zhongyu WEI is a Ph.D. student in the Department of System Engineering & Engineering Management at The Chinese University of Hong Kong, under the supervision of Prof. Kam-Fai Wong. He received both M.Eng. and B.Eng. degrees in Computer Science and Engineering from Harbin Institute of Technology, China. His research focuses on social network analysis, information retrieval and machine learning. He is now visiting Knowledge Media Institute as an intern, working together with Dr. Simon Buckingham Shum, Dr. Yulan He and Dr. Rebecca Ferguson.

3.30pm – Prototyping Learning Power Modelling in SocialLearn

[Tweet comments/questions #StormSLA @shaofutw]

Shaofu Huang, Graduate School of Education, University of Bristol

Many educational experts argue that good learners in the 21st century will need a strong set of skills and attitudes to learning. To measure this, research on the Effective Lifelong Learning Inventory (ELLI) has identified seven dimensions of an effective learner’s “power” [i]. The seven-dimension structure was designed both to assess a learner’s learning power and to open a window for coaching conversations that stimulate further development of the learner’s learning [ii].

Now being used by learners at many levels of education and in many countries, ELLI models learning power according to what learners think about themselves. EnquiryBlogger, a blogging platform for learners that has been in development since 2010, enables users to link their online activities with learning power dimensions [iii]. This study, however, asks whether and how we can model learning power according to what learners do in SocialLearn.

This study looked like a transplanting process to me at the beginning: from learner self-reporting to automated platform reporting. It still does, but I have now expanded my understanding of what “transplanting” means: it involves not only different languages, but also different ways of assuring quality and different interests. I will first introduce the learning power dimensions developed with ELLI as background, then focus on the method and research design of this study and its interdisciplinary nature.

Preparatory material: LAK12 paper introducing Dispositional Learning Analytics: [paper] [replay]

Key References:

[i] Ruth Deakin Crick, Patricia Broadfoot, and Guy Claxton, ‘Developing an Effective Lifelong Learning Inventory: The ELLI Project’, Assessment in Education: Principles, Policy & Practice 11 (2004): 247–272.

[ii] Ruth Deakin Crick, ‘Learning How to Learn: The Dynamic Assessment of Learning Power’, Curriculum Journal 18 (2007): 135–153.

[iii] Rebecca Ferguson, Simon Buckingham Shum, and Ruth Deakin Crick, ‘EnquiryBlogger: Using Widgets to Support Awareness and Reflection in a PLE Setting’ (presented at the PLE Conference 2011, Southampton, UK, 2011),

Bio: Shaofu Huang is a PhD student in the Graduate School of Education, University of Bristol. His broad research interests include authentic pedagogy, participatory learning and systems thinking. He had six years’ experience of teaching, learning design and teacher training before starting his postgraduate study in the UK. He has been involved in the recent development of the Learning Warehouse and many ELLI data analysis projects since 2010. Shaofu is now working as a SocialLearn intern with Simon Buckingham Shum and Rebecca Ferguson.

4.15 – Tea break

4.40pm – Open Discussion

5pm – Close

Learning Analytics for Learning Power

Knowledge Media Institute, The Open University, Milton Keynes, UK
3 year fully-funded PhD (Oct. 2012-Sept.2015), Stipend: £40,770 (£13,590/year)

Supervisors: Simon Buckingham Shum & Rebecca Ferguson


Intrinsic motivation to engage in learning (whether formal/informal, or academic/workplace) is known to be a function of a learner’s dispositions towards learning. When these are fragile, learners often disengage when challenged, and are passive, lacking vital personal qualities such as resilience, curiosity and creativity needed to thrive in today’s complex world. Learning Analytics seek to improve learning by making the optimal use of the growing amounts of data that are becoming available to, and about, learners [1,2]. Dispositional Learning Analytics seek specifically to build stronger learning dispositions (note that these are not ‘learning styles’, which have a dubious conceptual basis [3]). One particularly promising approach models dispositions as a 7-dimensional construct called Learning Power, measured through self-report data [4]. A web application generates real time personal and cohort analytics, which have been shown to impact learners, educators, and organizational leaders, and the underlying platform pools data from >50,000 profiles, which in combination with other datasets, enables deeper analytics. As a form of Social Learning Analytic [5], in combination with Discourse-Centric Analytics [6-7] and Social Network Analytics for learning [8], our strategic goal is to provide a suite of analytics that can help learners grow in Learning Power, and ultimately, build their capacity as life-long, life-wide learners.
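As a rough illustration of how a self-report learning-power profile of this kind might be scored: average each dimension's Likert items and rescale. The dimension names follow the published ELLI work, but the item-to-dimension mapping and the responses below are invented:

```python
# The seven learning power dimensions from the ELLI literature.
DIMENSIONS = ["changing & learning", "critical curiosity", "meaning making",
              "creativity", "learning relationships", "strategic awareness",
              "resilience"]

# Hypothetical mapping of questionnaire items to dimensions.
ITEM_DIMENSION = {"q1": "resilience", "q2": "resilience",
                  "q3": "critical curiosity", "q4": "creativity"}

def profile(responses, scale_max=5):
    """Average Likert responses (1..scale_max) per dimension, scaled 0-100."""
    totals, counts = {}, {}
    for item, score in responses.items():
        dim = ITEM_DIMENSION[item]
        totals[dim] = totals.get(dim, 0) + score
        counts[dim] = counts.get(dim, 0) + 1
    return {dim: round(100 * totals[dim] / (counts[dim] * scale_max))
            for dim in totals}

print(profile({"q1": 4, "q2": 2, "q3": 5, "q4": 3}))
# {'resilience': 60, 'critical curiosity': 100, 'creativity': 60}
```

A profile like this is what the web application mentioned above would render as personal and cohort analytics; the PhD challenge below is to infer such dimensions from behavioural traces rather than self-report alone.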

PhD Challenge

A key next step in this research programme is to harvest and analyse data from the traces that learners leave as they engage in social digital spaces, and to explore its relationship to other data sets and data streams. This PhD will therefore fund a technically strong candidate to design, implement and evaluate “Learning Analytics for Learning Power”. The project will deploy iterative prototypes in authentic use contexts, considering OU platforms as a starting point (e.g. SocialLearn [9]; Cohere [10]; EnquiryBlogger [11]), but open to data streams from new kinds of digitally instrumented interactions (e.g. from Pervasive Computing; Augmented Reality; Quantified Self). A possible outcome is an analytics architecture open to diverse forms of input, grounded in a theoretically robust framework, generating visual analytics with transformative power for reflective learners.

Seize the Day!

This is a fantastic opportunity if you’re passionate about the future of learning and want to engage with next generation pedagogy. You’re already a great web and database developer, and now looking to develop a research career in Learning Analytics, one of the fastest growing topics in technology-enhanced learning. Your supervisors will be Simon Buckingham Shum and Rebecca Ferguson, who are active contributors to the field of Learning Analytics research. The PhD will be in collaboration with learning scientists and practitioners in the global research network, who are well placed to provide pedagogical input, user communities and impact evaluations. In particular, this work will be done in partnership with Ruth Deakin Crick, University of Bristol (Graduate School of Education & Centre for Systems Learning & Leadership), whose research into modelling and assessing learning power provides a foundational element for this project.

Open U. is the place to do research in learning technology, this being an institutional mission-critical challenge. The Knowledge Media Institute is an 80-strong state of the art research lab, prototyping the future for the Open University and the many other organisations with whom KMi partners. KMi is renowned for its creative, can-do culture, and its high impact on the OU’s strategic thinking and technical capacity [Locate KMi]. The Institute of Educational Technology (where Rebecca Ferguson is based) is home to world leading research on learning technology, as well as conducting institutional research to inform the university’s core business. You will be part of a comprehensive doctoral training programme in computing and educational technology, participating in a dynamic research community, with opportunities to connect with groundbreaking people and ideas — limited only by your energy and imagination.

To Apply…

  • Contact us if you have informal queries. We’re in Vancouver next week at the 2nd International Conference on Learning Analytics & Knowledge, or get in touch online: s.buckingham.shum or r.m.ferguson atsign
  • Read the background papers, and research technical approaches which will enable you to locate your ideas within the literature on learning analytics. See for instance the ACM Learning Analytics Conference (2011/2012), and SoLAR resources, and other research conferences around data mining and visualization.
  • Open U. Research website gives an overview of the quality of research conducted here, and the Research Degrees Handbook which will answer many practical questions
  • Draft a proposal outlining how you might tackle this challenge, highlighting which of the above approaches you might focus on, and the skills/experience that you bring (plus any that you recognise you will need to acquire). 4 pages max. This is not a binding document, but shows us how clearly you can think and write.
  • Complete the PhD application form [Word doc]
  • Email these with your CV and a cover letter, cc’ing Simon Buckingham Shum & Rebecca Ferguson, to: Ortenz Rose <>
  • Deadline 7 June: You will be competing for one of several studentships being offered across all KMi research topics.


  1. Ferguson, R. (2012). The State Of Learning Analytics in 2012: A Review and Future Challenges. Technical Report KMI-12-01, Knowledge Media Institute, The Open University, UK.
  2. Buckingham Shum, S. (2011). Learning Analytics: Dream, Nightmare or Fairydust? Keynote Address, Ascilite 2011 followed by a Networked Learning Conference 2012 online discussion:
  3. Coffield, F., Moseley, D., Hall, E. and Ecclestone, K. (2004). Should We Be Using Learning Styles? What Research Has To Say to Practice. London: Learning and Skills Research Centre, 1540/05/04/500. Eprint:
  4. Buckingham Shum, S. and Deakin Crick, R. (2012). Learning Dispositions and Transferable Competencies: Pedagogy, Modelling and Learning Analytics. Proc. 2nd Int. Conf. Learning Analytics & Knowledge. (29 Apr-2 May, 2012, Vancouver, BC). ACM Press: New York. Eprint:
  5. Ferguson, R. and Buckingham Shum, S. (2012). Social Learning Analytics: Five Approaches. Proc. 2nd Int. Conf. Learning Analytics & Knowledge, (29 Apr-2 May, Vancouver, BC). ACM Press: New York. Eprint:
  6. De Liddo, A., Buckingham Shum, S., Quinto, I., Bachler, M. and Cannavacciuolo, L. (2011). Discourse-Centric Learning Analytics. Proc. 1st Int. Conf. Learning Analytics & Knowledge. Feb. 27-Mar 1, 2011, Banff. Eprint:
  7. Ferguson, R. and Buckingham Shum, S. (2011). Learning Analytics to Identify Exploratory Dialogue within Synchronous Text Chat. Proc. 1st Int. Conf. Learning Analytics & Knowledge. Feb. 27-Mar 1, 2011, Banff. Eprint:
  8. Haythornthwaite, C. and de Laat, M. (2010). Social networks and learning networks: using social network perspectives to understand social learning. In: 7th International Conference on Networked Learning (Aalborg, Denmark, 3-4 May 2010). Eprint:
  9. Ferguson, R. and Buckingham Shum, S. (2012). Towards a Social Learning Space for Open Educational Resources. In: Okada, A.; Connolly, T. and Scott, P., Eds.: Collaborative Learning 2.0: Open Educational Resources. Hershey, PA: IGI Global, pp. 309–327. Eprint:
  10. De Liddo, Anna; Sándor, Ágnes and Buckingham Shum, Simon (2012, In Press). Contested Collective Intelligence: Rationale, Technologies, and a Human-Machine Annotation Study. Computer Supported Cooperative Work (CSCW). Eprint:
  11. Ferguson, R., Buckingham Shum, S. and Deakin Crick, R. (2011). EnquiryBlogger: using widgets to support awareness and reflection in a PLE Setting. 1st Workshop on Awareness and Reflection in Personal Learning Environments. PLE Conference 2011, 11-13 July, Southampton, UK. Eprint:

© 2008 SocialLearn Research Blog