OPLRN proposal

From Institute of Educational Technology Public Wiki

Status: This is not the latest version, as the proposal was reformatted for submission on 22/8/2008. The Open University and Carnegie Mellon University are working on a proposal to develop infrastructure, community and activity to help share research findings on the design and use of OERs. This outline proposal describes the network element of the proposal.

A PDF version of the submitted proposal is available: [1]

OLnet Aims

OLnet aims to foster a global network for sharing methodologies and evidence on the effectiveness of OERs.

OLnet Summary

The Challenge

The aims of the Hewlett Foundation's Open Educational Resource (OER) program in 2007 were stated as to: 1. Sponsor High-Quality Open Academic Content; 2. Break Down Barriers to Open Educational Content; and, 3. Encourage People Worldwide to Use Open Educational Resources.

The resulting OER movement has been successful in promoting the idea that knowledge is a public good, expanding the aspirations of organizations and individuals to publish OERs. The Open Learning Network (OLnet) proposed here aims to extend this cultural shift, to evaluate the impact of OERs on teaching and learning, and to facilitate more transformative educational practices. This will be achieved by linking the design, use and evaluation of OERs. OLnet will provide an infrastructure and develop a community concerned with improving OER design, developing methods for assessing robustness, and encouraging greater adoption of evaluative techniques that provide evidence of OERs in use.

The very openness of OERs means that we cannot be sure how they are being used, in what ways people are getting the best out of them, or whether they are meeting the original goals of the developers. OER authors are often not in the best position to address these questions, partly because their position inhibits neutrality and partly because of the need for expertise and resources to do the research.

The approach of the OLnet is to build on the growing pool of OERs and associated services, developing a complementary infrastructure to investigate and report on the issues around OER deployment and evaluation. There is an opportunity to go beyond isolated, individual views of an OER's effectiveness, by aggregating data, sharing evaluation know-how, and mediating dialogue and debate within the community.

Two Analogies

Here are two analogies to help illustrate the need we see for OLnet:

  • International Development: helping the helpers. One way to help developing countries is to get out there in the front lines. That's great — clearly someone has to be doing that. But fragmented, individual efforts can be made more effective by agencies who connect frontline groups with each other to share know-how and build collective intelligence about a region or problem of common concern. The OER movement lacks the feedback loops and socio-technical fabric to help this happen; providing them will be one of the key aspects of OLnet.
  • A research portal. In more established fields such as cancer research, there is a consensus map of the structure of the field, the major research questions, and the different sub-communities and associated methodologies. It is possible to place oneself on the map, and to coordinate effort in a well understood way. In contrast, OER research is a relatively young field which has not yet been fully articulated and defined. OLnet will help researchers in the area to articulate the scope of the field and typical methodological stances. At various points OLnet will trigger a series of questions and provide mechanisms to enable those in the field to progress answers to those questions. What is the OER research map? What is the OER design process? What does it mean to validate an OER? What are the central challenges that all agree on? OLnet seeks to create a structured 'place' where questions such as these can be debated, hopefully enabling more effective coordination of action around issues and OERs of common interest.

Research Questions

The driving research question behind OLnet pinpoints what we see as the next evolutionary step in the OER movement, namely:

  • How can we build a robust evidence base to support the design, evaluation and use of OERs?

This high level question is refined into three sub-issues:

  • How to improve the process of OER reuse/design, delivery, evaluation and data analysis?
  • How to make the associated design processes and products more easily shareable and debatable?
  • How to build a socio-technical infrastructure to serve as collective, evolving intelligence?


The Open University is open to people, places, methods and ideas. It promotes educational opportunity and social justice by providing high quality education to all those who wish to realize their ambitions and fulfill their potential. Through academic research, pedagogic innovation and collaborative partnership it seeks to be a world leader in the design, content and delivery of supported open and distance learning (http://www.open.ac.uk/about/ou/p2.shtml)

The University has given rigorous attention to monitoring and evaluating its practices and to sharing best practice with key networks. Relevant examples are:

  1. The Institute of Educational Technology at the University plays a major role in institutional research and evaluation;
  2. The Centre for Research in Education and Educational Technology has the experience and capacity to mount major educational research projects, often with partners throughout the world;
  3. The Knowledge Media Institute leads projects on innovations in technology and the application of tools to learning and sensemaking;
  4. Many staff contribute to the work of relevant international bodies such as the Open Courseware Consortium, the European Association of Distance Teaching Universities and the International Council for Distance Education; and
  5. The OpenLearn initiative, funded by the Hewlett Foundation 2005-2008, established a collection of open educational resources, tools, and links with the OER community that form a foundation for further work.

The University is committed to developing, with others, open content solutions that are effective and sustainable for both users and providers as we stand on the threshold of a new era of global educational delivery. We believe our experience enables us to extend the research and evidence base for OERs for all, alongside the continuing investment in OER use and production within the University.

Workplan Overview

In the first year of OLnet we will:

  • Establish a shared model for how to research OERs;
  • Build the infrastructure to enable a network to work together and share data and methods;
  • Focus on learning design and evaluation of resources in use as approaches to data gathering;
  • Grow capacity through a fellowship programme;
  • Work with organizations to involve the wider community;
  • Draw in expertise through OER Challenge summits and sponsored professorships to expand knowledge and understanding of the area.

In years two and three of OLnet we will:

  • Expand the infrastructure
  • Engage the wider community through a program of OER fellowships
  • Establish partner networks
  • Report and synthesize research results
  • Aggregate findings from experience and data

Work areas

There will be six key areas of work

1. Management

Good project direction, management and structure are vital to the success of the OLnet. We will establish a management structure that recognizes the distributed nature of the network with core services initially held at the OUUK. A steering group will be established to provide an overview of project actions and advise on options and connections. Key appointments include a Network Liaison Manager responsible for the community connections and establishing Human Infrastructure.

2. Infrastructure building.

The infrastructure will provide a dynamic and evolving mechanism for supporting and developing the OER community. We will build a large-scale Web 2.0 infrastructure that enables the sharing of evidence of effectiveness through data, designs and tools, and that tracks social interactions, agreements and disagreements through supported discourse. The result will be a generic open source infrastructure that can support any project.

3. Capacity building

The fellowship programme will range from small scale stipends, to open fellowships and open professorships. Typically recipients of open fellowships will spend part of their time in supported travel and internships followed by a period researching aspects of OERs. Open professorships will give space and support to experts in related fields to work with OERs on a flexible basis. They will apply their expertise from their own domains to address specific and challenging OER research and development issues. Fluidity between roles and levels of expertise will be supported by an architecture of participation enabling contribution at all levels.

4. Research projects

This area covers identified actions to create data, demonstrate methods and test hypotheses. Research projects will involve project owners, invited experts and fellowship holders. Core support will come from OLnet research fellows. OLnet will support research that promises to provide feedback to OER developers in relatively short time cycles. A fundamental objective of OLnet is to provide actionable research results that improve the quality and facilitate the 'travel' of OERs (i.e. the movement of OERs from one context to another). Three initial projects are identified in this proposal in order to enable a rapid start. An agile approach will be used to bring in other activities, ensuring flexibility in response alongside identified outcomes.

5. Network building

To complement the dynamic technical infrastructure, a series of face-to-face and virtual events will create presence for OLnet. OLnet is first and foremost a community endeavour. Events, regardless of format, will facilitate, build and strengthen the growing OER community. These will be carefully choreographed over the lifespan of the project to trigger activity, facilitate and sustain a range of activities, and ensure synergies are developed across the different components of the networks and the associated areas of work. Events will support:

  1. The debate, development and adoption of measures of effectiveness
  2. The use of tools and services to evaluate one's own resources or materials that one would like to use
  3. Tackling hard issues at the core of the OER movement and problem solving as a community
  4. Developing and supporting best practices in development, reuse, and transfer

While the form of our events may morph and grow over time to meet the evolving needs of the community, we have listed several events here that we plan to hold over the course of this grant.

  • Annual Summit: Bringing together developers, researchers and foundations to discuss current issues, the latest OLnet findings, and other issues relevant to the larger community.
  • Developers' Workshop: Share best design practices with OER developers, helping them to create resources that will be effective.
  • Evaluators' Workshop: Bring together evaluators (whether course developers, institutional decision makers, or funders) to learn the latest tools, techniques and methods of evaluation.
  • Co-sponsor relevant events throughout the OER community.

We also aim to proactively link to a wide range of existing networks and communities and to opportunistically link into relevant conferences.

6. Evaluation

A central facet of OLnet is an ongoing, formative evaluation strand. This will be the vehicle for driving the overarching research questions being addressed by the initiative - from evaluation of the network itself (what works, and any barriers to its development) through to more specific questions on the sub-projects and the use and effectiveness of individual OERs. An evaluation framework will be developed at the start of the project in open consultation with the OER community. It will act as a blueprint for the evaluation, but will be designed to be evolving and dynamic. Findings from the various strands of the evaluation will be iteratively fed back into the OER community as soon as they become available so that they can genuinely shape and inform practice. Similarly, findings from the research and development activities across OLnet will help to shape and adapt the evaluation framework. We see the evaluation occurring at a number of nested levels. At the highest level is evaluation of the OLnet initiative itself - what's working and what isn't. There will also be evaluation of each of the above strands of work:

  • management (how well is the project working, what communication and dissemination mechanisms are being used and how effective are they, how are project ideas being developed and taken forward, what barriers and enablers are emerging in terms of trying to realise the overall OLnet vision),
  • infrastructure building (how is the infrastructure being designed and how well is it realised in practice, what technical and organisational issues arise)
  • capacity building (the fellowship programme will drive the capacity building activities, evaluation will focus on how effective this programme is, how it works in terms of the different types of sponsorship - from individual small-scale stipends through to the open professorships)
  • research projects (each research project will have its own evaluation component; the project-level evaluation will aggregate the findings from across the projects to produce a meta-synthesis of overarching themes, in particular looking at emergent commonalities and differences across the projects)
  • network building (this will concentrate on how effective network activities are: do some types of activities or events have more impact? what are the critical success factors? how important are individual contextual factors - timing of event, structure, who is involved, etc.?)

The evaluation framework will develop the above list of research questions and will provide a comprehensive set of methods for data collection. These will include surveys, web statistics, feedback forms, focus groups, observation, video and audio recording, interviews, etc. However, the focus is on providing a facilitative evaluation framework that can be taken up and used by the OER community: the framework provides guidance and support, but the emphasis is on an action-based, community approach to undertaking the research.


The structure for the work is shown in the diagrams on the following page. We see ourselves as investigating two related cycles: the cycle of information about the effectiveness of resources (the OER effectiveness cycle), and the cycle of information flow and community about research (the OLnet effectiveness cycle).

The relationship between these areas is overlapping and synergistic. The OU is taking the lead in researching the OLnet effectiveness cycle and Carnegie Mellon is taking the lead in researching the OER effectiveness cycle. In research question terms, Carnegie Mellon is taking the lead on the question "How to improve the process of OER reuse/design, delivery, evaluation and data analysis?" and the OU is taking the lead on the questions "How to make the associated design processes and products more easily shareable and debatable?" and "How to build a socio-technical infrastructure to serve as collective, evolving intelligence?" The two institutions, jointly, will be responsible for defining and creating the OLnet infrastructure that supports both effectiveness cycles.

These figures might, therefore, be treated as a first attempt at providing a common framework for OER designers, researchers and software developers to locate themselves in relation to each other.

Key diagrams: locating yourself on the map

We sketch below a series of diagrams that might help coordinate activity by providing a common frame of reference. The idea is that they should be expressive enough for an OER practitioner/researcher to locate their contributions, whether in OER creation, delivery, evaluation, data analysis, or methods/software support for any of the above.

Firstly, there is the basic OER lifecycle. We do not expect this to be too controversial as an idealised model: one moves from design or selection of an OER, to implementation, to deployment, and through evaluation generates data which informs another design iteration. This may happen rapidly or slowly, with anything from one to hundreds of learners, generating informal or formal data (see later discussion on recognising the diversity of forms of evidence). For each stage of the cycle there is an associated set of tools, resources, etc. available within the OLnet community. Each stage can also generate specific outputs, such as a design representation or a new evaluation instrument, which can be put back into the OLnet community for others to use. So, for example, a user might query an existing OER repository such as OpenLearn as a means of selecting an OER for use. Someone else might develop a new survey instrument for evaluating the use of, say, Science-focused OERs, which they then make available to the OLnet community for others to use to evaluate their use of Science OERs.

Our central argument is that all too often the feedback loop (from evaluation, to data collection, to cumulative design improvement) is broken; these are links that should be reforged. This design cycle is focused on OERs as the objects of interest, with other tools facilitating its transition at different stages, hence the label "OER Effectiveness Cycle".


We want to emphasize that we see the cycle as reflexive: OERs are not the only objects of interest in an epistemic community of practice. Any of the design representations or other artifacts generated by, or used to analyze, OER design can themselves become "social objects", that is, artifacts shared, deployed, evaluated and improved on by the community. There will be sub-communities focused on designing better OER Learning Design Patterns (see Research Project P2), better social learning tools, better evaluation tools, better dataset analysis tools, and so forth. Thus, the very infrastructure that we use to accomplish this process -- the OLnet -- becomes the object of reflection, hence the label "OLnet Effectiveness Cycle".

The second figure shows the emergence of these specialist communities, whose focus is a particular node or arc in the OER or OLnet cycle. Five illustrative OER effectiveness cycles are shown. The figure shows how OLnet provides a facilitative infrastructure and network to enable connections to be made across different sub-projects or activities, so that outputs from one activity can be taken up and reused by another. For example, a design representation in one cycle can be picked up and used as a starting point for a different OER cycle, or evaluation findings on the use of one OER can be used to inform and shape the design of a different OER. Some of these connections are shown by the arrows between OER cycles in the diagram. In addition to this transfer of knowledge and outputs between OER cycles, OLnet will also aggregate outputs in different ways - for example, by grouping people and sub-communities in different ways, aggregating tools and resources, collating designs, evaluations and case studies, and performing a range of relevant meta-synthesis studies. This will ensure that the whole is greater than the sum of its parts and that maximum benefit is derived both at the level of individual OER cycles and across the overall OLnet.


Plan – year 1 (plus outline year 2)

By area and date, the planned activities are:

March 2009
  • Management: Recruitment of core staff.
  • Infrastructure: Day 1 environment (based on OLI tools, OpenLearn tools and Cloudworks); Future Technology Workshop; requirements analysis and assessment of tools; consultation and engagement with the OCWC; gathering of tools and approaches (workshops, materials, methods, etc.).
  • Projects: Start of the research programme. Project 1: Integrating pedagogies and technologies that support individual learning and group knowledge building; Project 2: Open learning designs; Project 3: The OER effectiveness cycle.
  • Capacity: 2x expert professorships; 2x identified fellowships.
  • Events: Masterclass to prime “OCW” cohort (P3); launch of the Research Network at the Hewlett Meeting.

June 2009
  • Management: Virtual steering group established.
  • Infrastructure: Quickstart guide; prototype environment.
  • Projects: Workshop+ approach; data invitation – mix your information.
  • Capacity: Call for open fellowships; call for open professorships.
  • Events: OER Challenge Summit: use and evaluation of OERs – the methodological challenge.

October 2009
  • Infrastructure: Data visualization tools; usability study; feature review.
  • Projects: Second cohorts.
  • Capacity: Community links through e.g. OCWC, CoL, TESSA.
  • Events: OER Challenge Summit: design and OERs.

December 2009
  • Populate: Group cognition and OERs data and methods; OER design set and use data; OCW case study of the use of the effectiveness cycle.
  • Report: A model for research engagement; assessment of the effectiveness cycle; infrastructure review.

Year 2 (outline)
  • Stage 2 refinements
  • Outreach model
  • Network expansion
  • Data aggregation
  • Summative events with the OCWC
  • Operating at scale

Research projects:

P1: Integrating pedagogies and technologies that support individual learning and group knowledge building. – lead CMU: Champion Candace Thille

The OLI pedagogy and technology is designed to support and evaluate individual learning of content and practices in complex domains. The Knowledge Forum pedagogy and technology developed by Marlene Scardamalia and Carl Bereiter is designed to support and evaluate group knowledge building (knowledge building is a process of sustained idea improvement fostered by communities in which participants take responsibility for the advancement of community knowledge). We will integrate the two pedagogies and technologies in an exemplar OER so that teachers and learners can participate in both environments simultaneously and so that we can collect a single data store of their use in the integrated environment. As the integrated environment is used to support teaching and learning, we will use the OLI and Institute for Knowledge Innovation and Technology (IKIT) analytic tools to evaluate the ways in which the integrated approach improves both individual learning and group knowledge building. There are many important related research issues. For example: if learning is enhanced for a few students, do their contributions to the collaborative space enhance the work of the group? How can we assess the growth and spread of ideas? Can we keep ideas alive and improving in a worldwide open community?

Marlene Scardamalia will serve as an Open Professor. The teachers using the integrated OER will receive fellowships for their participation in the use and evaluation research. OLI and IKIT technical staff will integrate the learning and data collection environments.

Outcomes: on two levels:

On the specific OER level:

  1. An OER that exemplifies a combined pedagogical approach of fostering individual learning and group knowledge building.
  2. Data on the effectiveness of the specific OER in improving teaching and learning.
  3. Data on the conditions for effective travel of the OER.
  4. Capacity building in the areas of new pedagogical approaches, evaluation and design for the teachers using the integrated OER.

On the OLnet network level:

  1. A learning environment that facilitates the simultaneous participation of learners in individual learning and group knowledge building practices.
  2. Contribution to OLnet’s collection of evaluation methods, data analysis tools and design practices.
  3. Contribution to OLnet’s evaluation data collection.
  4. Analytic tools for visualizing both individual knowledge gain and emergent processes of knowledge building over time.

P2: Learning design of OERs – lead OU: Champion Grainne Conole.

Brief: Learning design is an approach that considers learning materials as having both a final product, the educational resource, and a design that captures the intent of the product. This design is often implicit and has not been valued as a product in itself. OERs challenge that position as it becomes important to communicate why material has been developed so that users can make best use of the material and also see the designs as shareable in themselves. Designs matter both to educators, to understand potential reuse, and to users to help them select material relevant to their context. The design is a key part of the effectiveness cycle. In this project we will establish collections of designs linked to available OERs. Current work in the OU has established use of Compendium (a free knowledge mapping tool from OpenLearn) as a representation tool, and a workshop-based approach to involving the educator and producer community. A community site Cloudworks (http://cloudworks.open.ac.uk) is now in use within the OU to share designs at differing levels of sophistication. This platform is being opened to a wider community.

There is a wider interest in the approach and related methods, such as pedagogic patterns. We will collaborate with that community through an Open Professorship to enhance the understanding of how OERs work and connect from design to models of use.

Outcomes, on two levels:

On the specific OER level:

  1. Design-based description in a shareable representation.
  2. Data on the interpretation of design and how it can support reuse.
  3. Materials to capture the approach and enhance participation.
  4. Use data on designs in relation to OERs.

On the OLnet network level:

  1. A collection of social objects that enable research discussions.
  2. Community building through sharing of representations and interlinking with the OER community.
  3. Integration of tools (e.g. from Cloudworks) within the OLnet framework.

P3: The OER effectiveness cycle – lead CMU: Champion Candace Thille.

Brief: Establish a framework for evaluating and improving the effectiveness of OERs in improving teaching and learning in various contexts. Working with OER producers and users from the OCWC as participatory researchers, we will move several representative OERs through the OER effectiveness cycle as the OER travels to a new context. We will document the evaluation methods, the data analysis and design practices, and the results of the evaluation, showing the mechanisms through which the OER improves teaching and learning and the conditions for effective travel. An open professor will conduct the research on the mechanisms through which the OER improves teaching and learning and on the conditions of travel. The OER producers/users involved in this project will be awarded fellowships to support their participation.

Outcomes: on two levels:

On the specific OER level:

  1. Data on the effectiveness of the specific OER in improving teaching and learning.
  2. Data on the conditions for effective travel of the OER.
  3. Capacity building in the areas of evaluation and design for the specific OER producers and users.

On the OLnet network level:

  1. A framework of progressive refinement for evaluating and improving the effectiveness of OERs in improving teaching and learning.
  2. Contribution to OLnet’s collection of evaluation methods, data analysis tools and design practices.
  3. Contribution to OLnet’s evaluation data collection.


OLnet International Fellowships

OLnet will establish and run an International Fellowship programme that will fund a mix of stipends, scholarships and internships, allowing recipients to spend time working with established experts or bringing in identified expertise. The fellows will come from institutions other than the network hubs, and successful applicants will become Network members. The fellowships will be hosted and/or part-sponsored by either the OLnet hubs or other appropriate locations (e.g. OCW members, centres of expertise, or placements with developing areas). A tentative model for such a fellowship, based on the OU’s experience in running a funded International Fellowship scheme for open and distance learning (http://www.open.ac.uk/international-fellowships/index.shtml), is:

  • 3-6 months overall duration
  • 2-4 weeks funded placement in an institution with training, mentoring and sharing of expertise.
  • 3-5 months project-based work on negotiated release to contribute to the evidence base for OLnet

This model will imply a shared funding base between OLnet and the participant’s institution. We will also adopt a “bench fee” model to enable the involvement of different hosts – for example, MIT, Cambridge University and the Open University of the Netherlands would all be suitable placement sites – and we hope to draw upon existing schemes from the OU and Cambridge University (www.welf.org.uk) to add to the number of OER-focussed fellowships within the Network.

OLnet Open Professorships

As a Network we should harness our skills and capacity to reach large numbers of people through a programme of ‘open lectures’ delivered using our infrastructure, offering such opportunities to distinguished thinkers and writers in a way that parallels the Gresham Professorships offered by Gresham College in the City of London.

We therefore propose to establish and support several Open Professorships that enable the distinguished recipients to develop multimedia learning resources on a research topic that fits within the Network’s themes.

Open professorships would last between 3 and 6 months, depending on the individual and their project. The money associated with each visiting professorship would pay for some travel and subsistence expenses for the duration of their tenure and for the appropriate development and delivery of the learning resources they would author. In other words, the visiting professors would bring their ideas and the Network would supply the expertise and assistance needed to convert them into the most accessible style and format.

The Open Professors or others would also be expected to lead time limited on-line discussions and debates surrounding the topic of their ‘lectures’. In time an archive of material authored by some of the most noted OER researchers and scholars of their time and incorporating significant public participation would be generated and accessible to anyone able to use the Internet.

Organisational and Technical Infrastructure

Key Requirements

To advance the OER movement in the manner proposed requires both an organisational and technical infrastructure to build an epistemic community — a reflective community of practice dedicated specifically to advancing understanding not only of its field, but of what can/should count as “knowledge” in the field. Educational technology, and OERs specifically, are young, interdisciplinary design fields lacking widely adopted design methods, patterns, or evaluation criteria. The infrastructure must therefore foster appropriate forms of discourse and memory: structures for sharing, indexing, recovering and debating the community’s collective intelligence on the relative merits of different OER design and evaluation approaches. Nor can the infrastructure fossilize as soon as the project’s startup funding ends: it must be a sustainable social and conceptual network that can evolve through the contributions of many people.

Design Process

The project will follow a participatory design, action research approach. We will engage the OER practitioner and researcher community from the start in a series of face-to-face and online consultations, this pre-submission summary being the first step. We will be rapidly prototyping and releasing early and often in response to community feedback.

What might “OER Collective Intelligence” look like?

An epistemic community is interested in claims and supporting evidence, but also in counter-claims and differing interpretations of the same evidence. While many projects are engaged in building collective intelligence, few know how to deal well with contested knowledge — other than by enabling comments, threaded fora, blogs and wikis. While the low levels of structure in such tools create very low entry thresholds for new users who want to post a comment or make an edit to a document, they provide correspondingly poor support for an end-user who wants to know the current state of the evidence base, or of a debate. The software has no conception of the kinds of discourse that underpin the way the community works.

We therefore propose to add an additional layer of structure designed specifically to support epistemic community discourse and memory. The infrastructure built to support the OLN will enable learners, educators, researchers, analysts and other decision makers to ask questions such as ‘What evidence is there that this OER is effective?’ but also, equally important, ‘Does the community agree with the claims made? What counter-examples are there?’ In addition, as a socio-technical infrastructure, it will be architected to support diverse forms of evidence, with an architecture of participation that sets a low threshold for contribution, and community structures to help those who wish to move from the periphery to become more active, central players.

Technical Infrastructure

Three requirements currently shape our thinking:

1. Recognise multiple levels and types of evidence

2. Enable structured knowledge negotiation and integration

3. Facilitate large scale participatory research

Recognise multiple levels and types of evidence

Practitioners and researchers come from many intellectual traditions, each of which has shades and nuances. What “counts” as legitimate evidence in order to make claims varies accordingly. Hard pragmatics often constrain the kinds of data that can be collected, from anecdotal stories from the field that are compelling but isolated, to more structured case studies, through to narrower qualitative and quantitative laboratory evaluations that strip out the “noise” of real life in order to yield insights into specific group or cognitive processes.

A collective memory that is owned by such a diverse community will not only honour these different kinds of narrative, but highlight the weight of evidence around a given issue by providing a way to build and navigate the layers of narrative through an engaging website. Thus, we can imagine a user interface onto the evidence base that makes clear which of the following “evidence layers” underpin a particular OER:

  • Anecdote: text/images/video from the field (e.g. We’ve just completed the first trial of this OER and it’s been a disaster — but we have some clues as to why, which we’re chasing up...)
  • Case study: anecdotal with informal evidence (e.g. ...and we have the following screencasts and interview MP3s that we’re happy to share because we need help to analyse them...)
  • Case study: structured research methodology and data analysis (e.g. This article/website tracks a cohort of trainee teachers for 3 months, as they sought to apply an OER. Video analysis using Grounded Theory leads us to propose 3 key factors that influence their success...)
  • Controlled experiment: qualitative and/or quantitative data (e.g. 48 undergraduate chemistry students grouped by ability and cognitive style used the ChemTutor OER to complete Module X. Statistical analysis combined with think-aloud protocols supports the hypothesis, based on Learning Theory X, that higher ability students would benefit most...)

The mere presence of one or more evidence layers as the foundation for an OER is an approximate cue to the level of validation it has received, but is not, of course, a guarantee of its suitability for a given context (content may be culture-specific; conclusions may be controversial; the methodology may be flawed; etc). Indeed, successful OER deployment and reuse turns on contextual factors. Our task will be to create ways for the community to filter the OER evidence base around meaningful context.

We will implement dissemination of new results via structured web feeds targeting phenomena of interest to an epistemic community. While it will be trivial to subscribe to research alerts when a specific resource changes (e.g. “Monitor this OER on midwifery”), once integrated with the underlying network of claims and evidence, one might subscribe to “Prediction: OERs will be convertible to study credits in major universities by 2020”, “Challenge: Who can deliver an open midwifery course in 6 months?”, or an ongoing debate (“Algebra-1 tutor is more effective than a human”).
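
Such structured alerts amount to matching new entries in the claims network against standing subscriptions. A hedged sketch, in which `Entry` and `Subscription` are invented types standing in for whatever feed format the project adopts:

```python
from dataclasses import dataclass
from typing import Optional

# Invented types for structured research alerts; not part of any
# existing feed standard.
@dataclass
class Entry:
    kind: str      # e.g. "result", "prediction", "challenge", "debate"
    resource: str  # the OER or claim the entry concerns
    text: str

@dataclass
class Subscription:
    kind: Optional[str] = None      # match entries of this kind (None = any)
    resource: Optional[str] = None  # match entries about this resource

    def matches(self, entry):
        return ((self.kind is None or self.kind == entry.kind) and
                (self.resource is None or self.resource == entry.resource))

def deliver(entries, subs):
    """Route each new entry to every matching subscription."""
    return [(s, e) for s in subs for e in entries if s.matches(e)]

# Subscribe to anything about a midwifery OER, and to every "prediction":
subs = [Subscription(resource="oer:midwifery"), Subscription(kind="prediction")]
entries = [Entry("result", "oer:midwifery", "First trial completed"),
           Entry("prediction", "oer:algebra-1", "Convertible to credits?")]
matched = deliver(entries, subs)  # two (subscription, entry) pairs
```
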

Enable structured knowledge negotiation and integration

As many commentators have noted, despite the digital revolution, the manner in which research is disseminated and reviewed is largely still in the era of the printing press: with minor refinements, the prose research article is in essence the same as when the first journals emerged in the late 1600s. While prose articles can now move in an instant around the planet, and some researchers now share their thoughts more regularly and at a finer granularity via blogs, our knowledge infrastructure is barely exploiting the potential of the digital network.

Search engines and digital libraries process text, hyperlinks and citations to match keywords and link bibliographies (cf. Citeseer and Google Scholar). However, these still do not permit the kinds of basic queries that any student or researcher brings to a field:

  • Which work supports or challenges this?
  • What is the intellectual lineage of this idea?
  • What data is there to support this specific claim or prediction?
  • Who else is working on this problem?
  • Has this approach been used in other fields?
  • What logical or analogical connections have been made between these ideas?
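
Once claims and evidence are stored as typed links rather than prose, the first two queries above reduce to simple graph traversals. A toy sketch, with all claim names and link types invented for illustration:

```python
# A toy claim network of typed links between ideas, of the sort the
# queries above require.
links = [
    ("study-A", "supports",   "claim:oer-improves-retention"),
    ("study-B", "challenges", "claim:oer-improves-retention"),
    ("study-C", "supports",   "claim:oer-improves-retention"),
    ("claim:oer-improves-retention", "builds-on", "claim:open-content-reuse"),
]

def related(claim, link_type):
    """Which work supports or challenges this claim?"""
    return [src for (src, t, dst) in links if dst == claim and t == link_type]

def lineage(claim):
    """Trace the intellectual lineage of an idea via 'builds-on' links."""
    chain = []
    while True:
        parents = [dst for (src, t, dst) in links
                   if src == claim and t == "builds-on"]
        if not parents:
            return chain
        claim = parents[0]
        chain.append(claim)

print(related("claim:oer-improves-retention", "supports"))
# ['study-A', 'study-C']
```
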

While the landscape of Web 2.0 social tools is changing weekly, development of tools for more structured discourse is less frenetic, since they target a more specialist audience focused on long term, high quality knowledge construction.

Open U. infrastructure. A semantic annotation tool able to support such queries was first developed as a proof of concept in an OU e-science project around modelling and visualizing scholarly discourse (2001-04). The Hewlett Foundation has, through the OpenLearn project, funded the extension of the Compendium knowledge/argument mapping tool, and the creation of the Cohere annotation and argument-mapping tool, which delivers these capabilities within a Web 2.0 paradigm. Cohere enables an analyst to create “ideas”, optionally backed up by URLs, and then forge meaningful connections between ideas in order to create a conceptual network. These connections can in turn be supported or challenged by others. The result is the overlaying of the emerging conceptual network (built as new research results are reported) with the social network (the community behind the ideas). A recent development at Open U. is the SocialLearn Web 2.0 platform, likely to be deployable during the project's first year, should it be judged suitable.

Other relevant work. Web-centric, structured deliberation tools are being explored by the Global Sensemaking community (see their tools listing). We also note the work on scaffolding students' scientific discourse and other forms of argumentation in the Computer-Supported Collaborative Learning research community. The Univ. Toronto OISE research led by Scardamalia on structured discourse with school pupils has produced the Knowledge Forum infrastructure, which is distinctive in providing analytic indicators based on measures of the community's discourse. If KF can be transferred to support OER researchers, and integrated technically, its tools may provide us with a way to measure the coherence of the OER user community.

Web 2.0 synthesis. The OPLRN's members can of course be referred to any of the above tools as standalone hosted web applications. Whether we can integrate them will depend on their APIs, and whether we can extend them will depend on licensing and other considerations. This project's approach is to build a confederation of loosely coupled, interoperable tools, in which "the data is the platform", with participants contributing/accessing from diverse tools via an API. The project will review the above tools, and depending on functionality, usability, licensing and technical issues, will integrate one or more of the above with other tools (e.g. Twitter, blogs, wikis, listserv), and the OER document archive (e.g. an Eprint archive).
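
One way to read "the data is the platform" is that each tool keeps its native payload format, and a thin normalisation layer maps every payload into a common contribution record that any confederated tool can read via the shared API. A speculative sketch, with both payload shapes invented for illustration:

```python
# Normalise heterogeneous tool payloads (shapes invented here) into one
# common contribution record for the loosely coupled tool confederation.
def normalise(source, payload):
    if source == "blog":
        return {"title": payload["post_title"],
                "url": payload["permalink"],
                "tags": payload.get("tags", [])}
    if source == "eprint":
        meta = payload["metadata"]
        return {"title": meta["title"],
                "url": meta["uri"],
                "tags": meta.get("keywords", [])}
    raise ValueError("unknown source: %s" % source)

record = normalise("blog", {"post_title": "OER trial notes",
                            "permalink": "http://example.org/oer-trial"})
print(record["title"])  # OER trial notes
```
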

Facilitate large scale participatory research

The preceding sections focused on key technical foundations for the envisaged social, knowledge-integration infrastructure. While the project’s core staff will certainly be end-users, with a high degree of expertise, the challenge is to foster an active community. How will we foster the skills and work practices to populate and manipulate these tools in a large-scale, open-ended community of OER researchers and practitioners? How can we make it easy for them to contribute meaningfully, with growing motivation as they feel community ownership, to calls for large scale participation in OER design and evaluation initiatives?

Relevant principles from human-centred knowledge structuring are:

  • Start with the tools people use. The Web 2.0 paradigm underpinning the infrastructure will enable as far as possible the use of diverse tools to submit material to the evidence base (as described above, ranging from informal videoblogs and stories, to well designed experimental studies). So, we will investigate ways to enable bloggers to submit from their blogs, social bookmarkers to submit via tag combinations, and academics to generate metadata as a by-product of uploading papers to eprint archives.
  • Active facilitation and mentoring. Research into technology adoption clarifies the key role of early adopters/experts actively modelling how tools can be used in simple ways, and how they relate to everyday practice. In addition, we will create webinars and screencasts that show the range of materials that we will welcome.
  • Knowledge gardening. It has been demonstrated time and again that while untrained participants may be willing to post comments, edit a page, or tag contributions to shared repositories, it is often fruitless to expect more fine-grained levels of information structuring. Compared to a leisure social network, one might expect greater motivation to structure contributions within an epistemic community of inquiry, but this cannot be depended on. We envisage that creating coherent views of the literature will require an architecture of participation like Wikipedia’s, whereby a small team of editors, ‘gardeners’ (or ‘curators’, to use a museum metaphor), works to identify, annotate and, most likely, further structure quality contributions. These will then be promoted from the ‘wilder borders’ to one or more carefully maintained ‘gardens’: filtered views of the network, maintained by and for a particular community. While the project’s core team can play this role in the early days, we envisage that OER research groups will appoint at least one of their number to become ‘literate’ in tagging, linking and other techniques that maximise the visibility and impact of their research results (just as researchers have learnt to disseminate their work via the Web, first on personal homepages, then in institutional document archives and libraries, and now via richly tagged and indexed blogs).
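
The gardening workflow in the last bullet can be pictured as a two-pool model: everything lands in the wild borders, and curators annotate and promote items into the filtered garden view. A minimal illustration, with all field names hypothetical:

```python
from dataclasses import dataclass, field

# Two-pool sketch of the 'gardening' workflow described above.
@dataclass
class Contribution:
    title: str
    source: str                      # "blog", "bookmark", "eprint", ...
    tags: set = field(default_factory=set)
    promoted: bool = False

def promote(item, curator_tags):
    """A gardener adds structure (tags) and moves the item into the garden."""
    item.tags |= curator_tags
    item.promoted = True

def garden_view(pool):
    """Filtered view of the network: promoted contributions only."""
    return [c for c in pool if c.promoted]

pool = [Contribution("First ChemTutor trial", "blog"),
        Contribution("Untagged bookmark", "bookmark")]
promote(pool[0], {"oer", "case-study"})
print([c.title for c in garden_view(pool)])  # ['First ChemTutor trial']
```
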

Logic Model