Crowd Learning

Crowd learning describes the process of learning from the expertise and opinions of others, shared through online social spaces, websites, and activities. Such learning is often informal and spontaneous, and may not be recognised by the participants as a learning activity. In this model virtually anybody can be a teacher or source of knowledge; learning occurs flexibly and sporadically, may be driven by chance or by specific goals, and always has direct contextual relevance to the learner. It places responsibility on individual learners to find a path through sources of knowledge and to manage the objectives of their learning. Crowd learning encourages people to be active in setting personal objectives, seeking resources, and recording achievements. It can also develop the skills needed for lifelong learning, such as self-motivation and reflection on performance. The challenge is to provide learners with ways to manage their learning and to offer valuable contributions to others.

5 Responses to Crowd Learning

  1. Mark Gaved says:

    The Oxford Internet Institute is holding a conference on crowdsourcing on 25–26 September 2014; the abstract submission deadline is 14 March 2014.

    From: David Sutcliffe (david.sutcliffe@oii.ox.ac.uk)

    Dear all,
    The journal Policy and Internet will be holding its third conference (co-convened by the OII, in collaboration with the ECPR) next 25-26 September in Oxford, on the subject of crowdsourcing. We are currently calling for abstracts.

    Conference: http://ipp.oii.ox.ac.uk/
    Call: http://ipp.oii.ox.ac.uk/2014/call-for-papers
    Abstract deadline: 14 March 2014.

    Location: Thursday 25 – Friday 26 September 2014, Oxford Internet Institute, University of Oxford.
    Convenors: Helen Margetts (OII), Vili Lehdonvirta (OII), David Sutcliffe (OII), Sandra Gonzalez-Bailon (Annenberg, UPenn), Andrea Calderaro (EUI / ECPR).

    Contact: policyandinternet@oii.ox.ac.uk
    #ipp2014

    ** Rationale **

    Crowdsourcing – the provision of goods by large numbers of people contributing via an online platform – is used to generate and sustain policy ideas, labour markets, business investment, charitable donations, knowledge commons (such as Wikipedia), cultural goods and artefacts, libraries, government transparency, public management reform, education, scientific development and the institutions of democracy itself. This pattern of technology-enabled institutional change, where a known few are replaced by an indefinite many, has deep and diverse implications for government, business, civil society, democratic life and public policy-making. Researchers and policy-makers have barely begun to examine the opportunities and challenges that the crowdsourcing model presents.

    The Internet, Politics, Policy 2014 conference is dedicated to facilitating discussion on crowdsourcing across disciplinary boundaries. The conference calls for papers on the observed and potential implications of crowdsourcing for politics, policy and academic practice. Perspectives are welcomed from across science, social science and the humanities as well as from academic and policy-making communities. We aim to identify both what is novel in crowdsourcing, and the ways it enables and extends existing social and political processes.

    ** Topics **

    The conference aims to attract papers from a range of disciplines analysing crowdsourcing-related phenomena. We welcome both theoretical and empirical papers reporting original research on crowdsourcing and related concepts such as microwork, peer production, human computing, co-creation, open innovation and e-government. We particularly welcome comparative approaches and papers drawing on new empirical findings and novel research methods.

    Topics of interest include (but are not limited to):

    How is crowdsourcing changing politics? Topics of interest include citizen participation in government and the political process, and online collective action.

    Uses of big data in evidence-based public policy, including probabilistic, conditional and predictive policy-making, and the use of social media data for government self-improvement.

    Online labor markets, new organizational forms, and the blurring of boundaries between work and play, as well as the economics of crowdsourcing more generally.

    Co-production and co-creation of public policy, through (for example) the use of feedback facilities, rating, ranking and reputation applications.

    Crowdsourcing for conflict management, peace building and humanitarian intervention, including crisis mapping.

    Crowdsourcing for educational, scientific and technological development, such as citizen science, crowd-funding, massive online open courses, and the methodological, epistemological and ethical issues involved.

    New methods for analyzing crowdsourcing, such as computational social science and big data analytics, including sentiment analysis, topic classification, sampling from social media platforms, and inferring from socially generated data to the wider population.

    Ethical issues arising from the use of such methods, such as de-anonymisation, privacy, and inequalities created by the use of predictive analytics in decisions concerning individuals.

    When crowds turn into mobs: online hate groups, organized cyberbullying, their dynamics and effective policy responses.

    Perspectives from any academic discipline are welcomed, including:
    political science, economics, law, sociology, medicine, information science, communications, philosophy, computer science, physics, psychology, management, organization science, geography and humanities. Papers should attempt to frame their object of study in relation to established concepts and theories. ‘Crowdsourcing’ need not be the central concept in a paper as long as it deals with the issues and topics identified in this call.

    ** Proposal submission **

    * Paper proposals

    Paper proposals should consist of a title and a 1,000-word extended abstract that specifies and motivates the research question, describes the methods and data used, and summarises the main findings. Abstracts will be peer reviewed, and the authors of accepted proposals are expected to submit full papers prior to the conference. Applicants will have the opportunity to co-submit their paper to the journal Policy and Internet, which will operate a fast-track review process for papers accepted to the conference.

    Paper submissions can also be considered for a Best Paper Award (sponsored by the journal Policy and Internet). The prize will be awarded at the closing session of the conference. As the winning paper is intended to be published in a future issue of the journal, authors should indicate whether they would like their paper to be considered for the prize.

    * Poster proposals

    Posters should summarise in a visually engaging manner the purpose, methods and results of an original piece of research. All accepted submissions will be considered for a Best Poster Award. The prize will be awarded at the closing session of the conference.

    ** Important dates **

    Extended abstract submission deadline: 14 March 2014
    Decisions on abstracts: 14 April 2014
    Full paper / poster submission deadline (for accepted abstracts): 15 August 2014
    Conference dates: Thursday 25 – Friday 26 September 2014

    IPP2014: Crowdsourcing for Politics and Policy: http://ipp.oii.ox.ac.uk/

    ***

    David Sutcliffe

    Managing Editor
    Oxford Internet Institute
    University of Oxford
    http://www.oii.ox.ac.uk/
    Tel: +44 (0)1865 612334

    Managing Editor
    Policy and Internet Journal
    http://onlinelibrary.wiley.com/journal/10.1002/(ISSN)1944-2866
    http://blogs.oii.ox.ac.uk/policy/

  2. Victor Morgan says:

    I’m in favour of the inherently democratic approach to learning implicit in this development. However, I feel that this summary is a trifle too bland. It does not engage with the problems inherent in the developments in this area. In my observation the most salient problems are those of accuracy/evaluation of evidence and of coherence of argument. All this is evident already in popular manifestations such as the various wikis and the fatuous, ill-informed and sometimes downright aggressive comments that appear on almost any website that invites comment (e.g. comments on Google Play software, or book reviews on Amazon). This wider context has established an ‘etiquette of discourse’ that too readily carries over into what one hopes are more controlled environments. How one reshapes expectations and practices and establishes conventions for more fruitful and constructive interactions is surely the pressing issue here. This goes well beyond the technologies that make these interactions possible. It is about the shaping of social conventions in new contexts, and that is and always will be an area of contention.

    • Liz FitzGerald says:

      Hi Victor,

      Thanks for your comments – and yes, you’re completely correct in identifying some of the main problems that one encounters when utilising crowd learning (another name for it reflects further issues: “the stupidity of the swarm”). One way around the problem of accuracy and coherence is through peer moderation and the use of reputation systems. For instance, the iSpot site has both a reputation system and a published ‘etiquette’ to ensure high-quality responses from the ‘crowd’ (it also helps that many people in the wider community are from a professional or semi-professional background). The h2g2 site is likewise a good example of a peer-reviewed online encyclopaedia (yes, the inclusion of a link to Wikipedia about this is an intentional irony ;) ).
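      To make the idea concrete, here is a minimal sketch of how a reputation-weighted moderation scheme of this kind might work. The class names, weights and update rule are illustrative assumptions only, not iSpot’s actual mechanism: endorsements from trusted members count for more, and authors gain a little reputation when their contributions are endorsed.

      # Illustrative sketch of reputation-weighted peer moderation
      # (assumed names and weights; not iSpot's actual system).
      from dataclasses import dataclass, field

      @dataclass
      class Contributor:
          name: str
          reputation: float = 1.0  # neutral starting reputation

      @dataclass
      class Contribution:
          author: Contributor
          text: str
          endorsements: list = field(default_factory=list)

          def score(self) -> float:
              # Agreement from high-reputation members counts for more than
              # drive-by votes, which helps surface accurate contributions.
              return sum(e.reputation for e in self.endorsements)

      def endorse(contribution: Contribution, endorser: Contributor) -> None:
          # Record the endorsement and reward the author with a small,
          # endorser-weighted reputation gain.
          contribution.endorsements.append(endorser)
          contribution.author.reputation += 0.1 * endorser.reputation

      # Example: one expert endorsement outweighs one from a newcomer.
      expert = Contributor("expert", reputation=5.0)
      novice = Contributor("novice")
      author = Contributor("author")
      answer = Contribution(author, "This moth is Biston betularia.")
      endorse(answer, expert)
      endorse(answer, novice)
      print(answer.score(), round(author.reputation, 2))  # 6.0 and 1.6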

      I suspect that as the WWW becomes even more full of junk/pictures of cats/inane comments and unasked-for opinions, we will place more value on those sites that offer “quality” content, by whatever mechanism that is achieved. I’d hope this would include items of better (factual) accuracy and a more coherent narrative/argumentation. One way of doing this *may* be personalisation (which still hasn’t gone away, although we still don’t seem to know which mechanisms are “best”, despite lots of people having tried various approaches for some time) or better-managed content. Personally, I can’t wait!

      • Liz FitzGerald says:

        Incidentally, I wrote a journal paper about an authoring framework for user-generated content for location-based learning, in 2012 – you can find the details here if you’re interested. The underlying rationale was that this would help improve the quality of user contributions – although we haven’t yet tested this framework. If you’d like to discuss it in more detail, and/or would like to look at using it yourself, please get in touch.

  3. admin says:

    First trial of a tool that allows students to grade their classmates’ homework and to receive credit for their work: http://bit.ly/16dY3Uf
