I visited the University of Deusto in Bilbao, Spain, to give a keynote at the learning analytics summer institute there (LASI Bilbao 2016) on 28 June 2016. The event brought people together from the Spanish Network of Learning Analytics (SNOLA), which was responsible for organising the event, in conjunction with the international Society for Learning Analytics Research (SoLAR).
What does the future hold for learning analytics? Europe’s priorities for learning and training will need to support relevant and high-quality knowledge, skills and competences, developed throughout lifelong learning. More specifically, they should improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as employability, active citizenship and well-being. This is a tall order and, in order to achieve it, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can avoid potential pitfalls and develop an action plan that will drive the development of analytics that enhance both learning and teaching.
The series of LACE workshops on Ethics and Privacy in Learning Analytics (EP4LA) keeps expanding.
I worked with María Jesús Rodríguez-Triana on the programme for one of these events, which she ran with Denis Gillet at the 12th Joint European Summer School on Technology Enhanced Learning (JTEL Summer School) in Estonia, on 20 June.

Workshop outline
This 90-minute workshop aims to give participants an overview of the ethical and privacy issues in learning analytics. It also aims to raise participants’ awareness of how to implement LA solutions, whether as researchers, practitioners or developers. It will consist of three parts:
Part 1 – Introduction: presentation of frameworks and guidelines for ethics and privacy in learning analytics.
Part 2 – Framework analyses: participants will be grouped to work on a specific framework. The teams will categorise the ethical and privacy issues: those that participants are currently addressing in their practice, those that could be covered with low-to-medium effort, and those that constitute a challenge.
Part 3 – Discussion: an open discussion will follow, exploring the complexity of each framework and looking for potential ways of addressing the issues identified.
We had been asked to validate the Cert Ed and the Professional Graduate Certificate in Education courses run by the college, which does not currently have the authority to award qualifications at this level. Like many other colleges in England, it asked the OU to validate its courses, so that students completing those courses could receive certificates from The Open University.
Through its Royal Charter, the OU is able to validate the programmes of institutions that do not have their own degree-awarding powers or that wish to offer OU awards. Validation is an iterative process, carried out over a period of time, culminating in an event that brings together participants. The process covers ten areas:
- Rationale, aims and intended learning outcomes of the programme of study
- Curriculum and structure of the programme of study
- Teaching and learning
- Admissions and transfer
- Staffing, staff development and research
- Teaching and learning resources
- Other resources for students
- Programme management and monitoring
- Programme specification and handbook
The process requires close scrutiny of relevant documentation, discussions with staff and students involved with the programmes, and tours of the facilities. A very interesting day, and a chance to get a detailed overview of how two qualifications work in practice.
I was invited to write a paper for Distance Education in China, a journal which reaches out to Western academics and is willing to take on the task of translating papers from English. My paper was based on Augmented Education, a book I wrote with Kieron Sheehy and Gill Clough, published by Palgrave in 2014.

Abstract
Digital technologies are becoming cheaper, more powerful and more widely used in daily life. At the same time, opportunities are increasing for making use of them to augment learning by extending learners’ interactions with and perceptions of their environment. Augmented learning can make use of augmented reality and virtual reality, as well as a range of technologies that extend human awareness. This paper introduces some of the possibilities opened up by augmented learning and examines one area in which they are currently being employed: the use of virtual realities and tools to augment formal learning. It considers the elements of social presence that are employed when augmenting learning in this way, and discusses different approaches to augmentation.
Ferguson, Rebecca (2016). 增强学习的可能性与挑战 [Possibilities and challenges of augmented learning]. Distance Education in China, 6, pp. 5–13.
Next stop after the LAK conference was Kuala Lumpur in Malaysia. There I took part in an expert workshop on 2–3 May 2016, organised by the Commonwealth of Learning, developing guidelines for the quality assurance and accreditation of massive open online courses.
The purpose of this review is to identify quality measures and to highlight some of the tensions surrounding notions of quality, as well as the need for new ways of thinking about and approaching quality in MOOCs. It draws on the literature on both MOOCs and quality in education more generally in order to provide a framework for thinking about quality and the different variables and questions that must be considered when conceptualising quality in MOOCs. The review adopts a relativist approach, positioning quality as a measure for a specific purpose. The review draws upon Biggs’s (1993) 3P model to explore notions and dimensions of quality in relation to MOOCs — presage, process and product variables — which correspond to an input–environment–output model. The review brings together literature examining how quality should be interpreted and assessed in MOOCs at a more general and theoretical level, as well as empirical research studies that explore how these ideas about quality can be operationalised, including the measures and instruments that can be employed. What emerges from the literature are the complexities involved in interpreting and measuring quality in MOOCs and the importance of both context and perspective to discussions of quality.

Workshop participants
Australia: Adam Brimo
Japan: Paul Kawachi
New Zealand: Nina Hood
My final presentation at the LAK16 conference was another session organised by the Learning Analytics Community Exchange (LACE) project that built on our Visions of the Future work. This panel session brought participants together to discuss the next steps for learning analytics and where we are heading as a community.

Abstract
It is important that the LAK community looks to the future, in order that it can help develop the policies, infrastructure and frameworks that will shape its future direction and activity. Taking as its basis the Visions of the Future study carried out by the Learning Analytics Community Exchange (LACE) project, the panelists will present future scenarios and their implications. The session will include time for the audience to discuss both the findings of the study and actions that could be taken by the LAK community in response to these findings.
Ferguson, Rebecca; Brasher, Andrew; Clow, Doug; Griffiths, Dai and Drachsler, Hendrik (2016). Learning Analytics: Visions of the Future. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
This paper explores the potential of analytics for improving accessibility of e-learning and supporting disabled learners in their studies. A comparative analysis of completion rates of disabled and non-disabled students in a large five-year dataset is presented and a wide variation in comparative retention rates is characterized. Learning analytics enable us to identify and understand such discrepancies and, in future, could be used to focus interventions to improve retention of disabled students. An agenda for onward research, focused on Critical Learning Paths, is outlined. This paper is intended to stimulate a wider interest in the potential benefits of learning analytics for institutions as they try to assure the accessibility of their e-learning and provision of support for disabled students.
Cooper, Martyn; Ferguson, Rebecca and Wolff, Annika (2016). What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students’ Learning? In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
Our second LACE workshop of LAK16 was the highly successful Failathon. The idea for this workshop emerged from an overview of learning analytics evidence provided by the LACE Evidence Hub. This suggested that the published evidence is skewed towards positive results, so we set out to find out whether this is the case.
A packed workshop discussed past failures. All accounts were governed by the Chatham House Rule: they could be reported outside the workshop as long as the source of the information was neither explicitly nor implicitly identified.

Abstract
As in many fields, most papers in the learning analytics literature report success or, at least, read as if they are reporting success. This is almost certainly not because learning analytics research and activity are always successful. Generally, we report our successes widely, but keep our failures to ourselves. As Bismarck is alleged to have said: it is wise to learn from the mistakes of others. This workshop offers an opportunity for researchers and practitioners to share their failures in a lower-stakes environment, to help them learn from each other’s mistakes.
Clow, Doug; Ferguson, Rebecca; Macfadyen, Leah and Prinsloo, Paul (2016). LAK Failathon. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
A busy week at the Learning Analytics and Knowledge 2016 (LAK16) conference began with a workshop on Ethics and Privacy Issues in the Design of Learning Analytics. The workshop formed part of the international EP4LA series run by the LACE project.
The workshop included a series of presentations, and I talked briefly about findings related to ethics and privacy that had emerged from the LACE Visions of the Future study.

Abstract
Issues related to ethics and privacy have become a major stumbling block in the application of learning analytics technologies on a large scale. Recently, the learning analytics community at large has addressed the EP4LA issues more actively, and we are now starting to see learning analytics solutions that are designed with these issues in mind, rather than treating them as an afterthought. The 2nd EP4LA@LAK16 workshop will take the discussion on ethics and privacy for learning analytics to the next level, helping to build an agenda for the organizational and technical design of LA solutions, addressing the different processes of a learning analytics workflow.
Drachsler, Hendrik; Hoel, Tore; Cooper, Adam; Kismihók, Gábor; Berg, Alan; Scheffel, Maren; Chen, Weiqin and Ferguson, Rebecca (2016). Ethical and Privacy Issues in the Design of Learning Analytics Applications. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
Learning at Scale: Using an Evidence Hub To Make Sense of What We Know

Abstract
The large datasets produced by learning at scale, and the need for ways of dealing with high learner/educator ratios, mean that MOOCs and related environments are frequently used for the deployment and development of learning analytics. Despite the current proliferation of analytics, there is as yet relatively little hard evidence of their effectiveness. The Evidence Hub developed by the Learning Analytics Community Exchange (LACE) provides a way of collating and filtering the available evidence in order to support the use of analytics and to target future studies to fill the gaps in our knowledge.
Ferguson, Rebecca (2016). Learning at Scale: Using an Evidence Hub To Make Sense of What We Know. In: L@S ’16 Proceedings of the Third (2016) ACM Conference on Learning @ Scale, ACM, New York.
Together with Mike Sharkey (Blackboard) and Negin Mirriahi (University of New South Wales), I chaired the Practitioner Track of the LAK16 conference in Edinburgh and edited the Practitioner Track proceedings.
Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. The primary goal of the LAK practitioner track is to share thoughts and findings that stem from learning analytics project implementations. The proceedings of the practitioner track from LAK’16 contain 12 short papers reporting on the piloting and deployment of new and emerging learning analytics tools and initiatives.
Papers accepted in 2016 fell into two categories.
- Practitioner Presentations: sessions designed to focus on the deployment of a single learning analytics tool or initiative.
- Technology Showcase: an event that enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.
Both types of paper are included in the proceedings.
Along with other members of the LACE project (Tore Hoel, Maren Scheffel and Hendrik Drachsler), I co-edited a special section of the Journal of Learning Analytics (Vol. 3, No. 1), which focused on ethics and privacy in learning analytics.
The section contained eight papers:
- Developing a Code of Practice for Learning Analytics
- Learning Analytics in Small-scale Teacher-led Innovations: Ethical and Data Privacy Issues
- LEA in Private: A Privacy and Data Protection Framework for a Learning Analytics Toolbox
- A Data Protection Framework for Learning Analytics
- The Role of a Reference Synthetic Data Generator within the Field of Learning Analytics
- De-Identification in Learning Analytics
- Privacy-driven Design of Learning Analytics Applications – Exploring the Design Space of Solutions for Data Sharing and Interoperability
- Student Vulnerability, Agency and Learning Analytics: An Exploration
The volume also included our guest editorial.

Abstract
The European Learning Analytics Community Exchange (LACE) project is responsible for an ongoing series of workshops on ethics and privacy in learning analytics (EP4LA), which have been responsible for driving and transforming activity in these areas. Some of this activity has been brought together with other work in the papers that make up this special issue. These papers cover the creation and development of ethical frameworks, as well as tools and approaches that can be used to address issues of ethics and privacy. This editorial suggests that it is worth taking time to consider the often intertwined issues of ethics, data protection and privacy separately. The challenges mentioned within the special issue are summarised in a table of 22 challenges that are used to identify the values that underpin work in this area. Nine ethical goals are suggested as the editors’ interpretation of the unstated values that lie behind the challenges raised in this paper.
Ferguson, Rebecca; Hoel, Tore; Scheffel, Maren and Drachsler, Hendrik (2016). Guest editorial: ethics and privacy in learning analytics. Journal of Learning Analytics, 3(1), pp. 5–15.
First visit to the excellent Black Country Living Museum in Dudley today following a wonderful night at Dudley Town Hall for the C&BLE Summer 1930s Ball with the Bratislava Hot Serenaders. So very good!
It’s the museum’s annual 1940s weekend and you can check out some of my photos from today.
And if you’re curious to hear what BHS sound like…
They are playing a number of UK dates (inc. Edinburgh Festival) so if they’re heading your way go see (and dance)!
I’ve been involved in supporting several workshops recently for the Open University around Leadership in Digital Innovation. This is one of the six strands of the new “Students First” strategy. We’ve run various workshops and events around this and we already have some great ideas coming through. The most recent workshop was for a select group of OU leaders and focused on the leadership challenges (in my opinion we are all leaders, and personal leadership is what we should be developing here!).
The event was led by Dave Coplin, the Chief Envisioning Officer at Microsoft, and included a video by Martha Lane Fox, Chancellor of the Open University and creator of dot.everyone — and I’ve just seen that she is now on the board of Twitter.
I was leading the online discussion which took place during the event and I thought it might be worth sharing with you some of my key takeaways (now I’m getting hungry).
Martha gave a great talk about the dangers of complacency and how organisations either are digital or they are not (digital DNA?). The thing that most resonated with me, however, and was echoed by others, was how we must be “…always and relentlessly focused on users”. This may seem obvious to most, but in many ways it is easy for organisations to inadvertently do things that lead to greater separation from users. I believe, for example, that we have been neglectful when it comes to user testing compared to the rigorous approaches we had previously. We also don’t represent the users at senior level in the way we once did, and I’ve been calling for the Open University to consider a “chief customer officer”, rather than, or complementary to, a chief operating officer, so that the emphasis is on advocacy for students. Some universities are creating a PVC (Student Experience) role for similar reasons. The introduction of the TEF, with quality measured against student satisfaction, sharpens the focus in this area, and as we look at student co-creation, co-production, student evangelists, student champions and student evaluators, we also need to consider student advocacy.
Dave Coplin provided an inspiring and provocative talk on themes such as the end of the divide between work and life: most people have access to better technology at home than at work, yet we are forced to commute in order to use lower tech in offices. He talked about us as a Victorian workforce, still largely pinned to our desks in order to use connected technologies. He talked about email: how it relies on us as the filter to the conversation moving further into the organisation, how most emails are not confidential, and how we should ditch email because it is not the right technology. He talked about leadership changing to become about empowerment rather than control. He talked about the lack of information flows across the organisation, about the potential for connectivism in work, about AI and predicting the future, and about non-linear thinking. He mentioned Skype Translator and how we no longer need to learn languages (yes, we all get the babelfish idea, but here I got uncomfortable about technologies reducing our ability for human discovery and improvement; language learning changes our brains, and perhaps we shouldn’t be so quick to lose that opportunity, Dave? To be fair, he did say that we still need to develop core skills). He finished by saying that we need to focus on outcomes, not process, and concluded with the elephant powder anecdote, which made a very good point about people doing stuff that adds no particular value.
After Dave’s provocations I led the online discussion, with around six or seven people engaging in a stimulating conversation on topics including:
- How we are process driven, and how this affects how we manage change: we tend to have process-led change, which means we tackle little bits rather than the bigger goals, and this seems to take away the creativity.
- How technology, when supporting our organisation, should be in the background and sometimes it appears to be in the foreground.
- The perceived tension between our regulatory and quality requirements and the need to take risk and innovate. We later concluded at our table that this was largely a demon of our own making (i.e. an internal perception rather than a reality) and that many universities find ways of working with QAA and regulatory bodies to manage the balance.
- Trust being a critical factor for the empowerment of staff at all levels.
Finally, there was a panel discussion with Peter Horrocks (Vice-Chancellor), Hazel Rymer (Acting Pro-Vice-Chancellor, Learning and Teaching Innovation) and Dave Coplin. Key quotes from that were “as Facebook say, done is better than perfect”, “take the users with us on the journey”, “students as digital creators”, “everyone should have the opportunity to feed back”, “we need to challenge what we provide which is paid for versus what is given for free”, “we have gold standard bureaucracy”, “we must always and relentlessly focus on the user” and finally, a little controversially for a university, “we should investigate what we can burn” (what are we doing that is of little value?).
I’d like to hear your thoughts on these provocations, in the meantime I’m going to work with others across the OU to continue the discussion #OUDigitalInnovation
The main reason for my visit to Uruguay was to attend the First International Workshop on New Metrics for Evaluation: Towards Innovation in Learning. This event was organised by the Centre for Research at the Ceibal Foundation in collaboration with INEEd, the ICT4V centre and the education division of the Inter-American Development Bank.
The workshop had four objectives, which the organisers framed as:
1. Using data for research and evaluation: towards an open and collaborative process for analysis, research and improving education.
2. Presenting experiences in the use of information systems for improving learning outcomes.
3. Presenting innovative approaches for evaluation and assessment of learning outcomes.
4. Policies, projects and programs for technology integration and data use in education.
It was a fascinating event, with representatives from countries across South and Central America, including speakers from Brazil, Chile, Colombia, Ecuador, Mexico, Nicaragua and Uruguay. Other speakers from outside the continent were Dragan Gasevic from Edinburgh, Neil Selwyn from Monash in Australia and Gilles Dowek from France.
I was particularly interested to find that Uruguay runs a ‘One Laptop per Child’ programme based on premises of equality and justice. Uruguay sees access to computers and the Internet as a right. You should have them in your classroom, just as you should have electricity in your classroom. Plan Ceibal has supplied 600,000 people (a fifth of the population) with laptops or tablets. Every child gets one when they start school, and they get a replacement every three years, with secondary school children now receiving Chromebooks. Internet is available nationwide – no one should be more than 400 metres from the Internet. There is a maintenance programme and a disposal programme, a teacher training programme, a learning management system, a suite of software, and a programme of video-conferenced English lessons, arranged in conjunction with the British Council.
- What are the potential gains, and what are the potential losses?
- What are the unintended consequences or second-order effects?
- What underlying values and agendas are implicit?
- In whose interests is this working? Who benefits, and in what ways?
- What are the social problems that data is being presented as a solution to?
- How responsive to a ‘data fix’ are these problems likely to be?
These wider questions of politics and power have not yet been taken up to any extent by the learning analytics community, but they look set to be bigger issues as the field matures.
My talk was on learning analytics, the state of the art and what the future might look like.
I also took part in a round-table discussion with Neil, Gilles and Dragan on issues related to learning analytics.
The back channel – mostly in Spanish – used the hashtag #edumetricas
During a visit to Uruguay, I was lucky enough to be invited to visit the Institute of Education at the ORT University in Montevideo. There, I gave a presentation to faculty members and postgraduate students on Innovating Pedagogy.
For the past four years, The Open University has produced an Innovating Pedagogy report annually. This series explores new forms of teaching, learning and assessment for an interactive world, to guide educators in productive innovation. As one of the report authors, I presented a quality enhancement lunchtime seminar on 23 March 2016 (part of the QELS series). In the seminar, I introduced the themes that have emerged from this series of reports – scale, connectivity, reflection, extension, embodiment and personalisation – and how these connect with modules (courses) run by the OU. The seminar included examples of innovative pedagogies in use at the OU, and identified others that could be used in future.
Fifty people attended the workshop, including invited experts (expert presentations), representatives of current European-funded projects in the field of learning analytics (project presentations), and representatives of the European Commission.
The workshop dealt with the current state of the art in learning analytics, the prospects for the implementation of learning analytics in the next decade, and the potential for European policy to guide and support the take-up and adaptation of learning analytics to enhance education.
The workshop began with a review of current learning analytics work by participants and went on to consider how learning analytics work can be taken forward in Europe (presentation on the LAEP project).
Participants at the workshop identified immediate issues for learning analytics in Europe. They set out considerations to be taken into account when developing learning analytics, made recommendations for learning analytics work in Europe and then identified both short- and long-term policy priorities in the area.

Immediate issues for LA in Europe
Framework for development: A European roadmap for learning analytics development would help us to build and develop a set of interoperable learning analytics tools that are tailored for the needs of Europe and that have been shown to work in practice.
Stakeholder involvement: There is a need to bring different people and stakeholders on board by reaching out to groups including teachers, students, staff, employers and parents. Our current engagement with stakeholders is too limited.
Data protection and surveillance: As legislation changes and individuals become more aware of data use, institutions need to understand their responsibilities and obligations with regard to data privacy and data protection.
Empirical evidence and quality assurance: More empirical evidence is needed about the effects of learning analytics, in order to support a process of quality assurance.

Considerations for the development of LA
- Learning analytics can change or reinforce the status quo
- Learning analytics should enhance teaching, not replace it
- It is our duty to act upon the data we possess
- Desirable learning outcomes must be identified
- Be clear why we are collecting and analysing data
- Bring the data back to the learner
- Intelligent systems need human and cultural awareness
- Impressive data are not enough
Recommendations for LA work in Europe

- Undertake qualitative studies to understand how learning analytics can be aligned with the perceived purpose of education in different contexts, and which aspects of different educational contexts will support or constrain the use of learning analytics.
- Publicise existing evaluation frameworks for learning analytics and develop case studies that can be used to enrich and refine these frameworks.
- Develop forms of quality assurance for learning analytics tools and for the evidence that is shared about these tools.
- Identify the limitations of different datasets and analytics and share this information clearly with end users.
- Explore ways of combining different datasets to increase the value of learning analytics for learners and teachers.
- Extend to different sectors of education the work currently being carried out in the higher education sector to identify the different elements that need to be taken into account when deploying learning analytics.
- Develop analytics, and uses for analytics, that delight and empower users.
Short-term policy priorities

Innovative pedagogy: Top priority is the need for novel, innovative pedagogy that drives innovation and the use of data to solve practical problems.
Evidence hub: Second priority is to secure continuing funding for a site that brings together evidence of what works and what does not in the field of learning analytics.
Data privacy: Participants considered that a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society.
Orchestration of grants: The European grants system could better support the development of learning analytics if grants were orchestrated around an agreed reference model.
Crowd-sourced funding support: Set up a system for crowd-sourcing funding of tools teachers need, with EU top-up funding available for successful candidates.
21st-century skills: Focus on developing learning analytics for important skills and competencies that are difficult to measure, particularly 21st-century skills.
Open access standards: Standards need to be put into practice for analytics across Europe, with an open access forum that will enable the creation of standards from practice.
Ambassadors: We need more outreach, with ministries and politicians spreading the word and encouraging local communities and schools to engage.

Long-term policy priorities
Teacher education: Top priority in the longer term was for media competencies and learning analytics knowledge to be built into training for both new and existing teachers.
Decide which problems we want to solve: In order to develop the field of learning analytics we need to have collective discussions on the directions in which we want to go.
Facilitate data amalgamation: More consideration is needed of how to combine data sources to provide multi-faceted insights into the problems we seek to solve.
Identify success cases and methodologies that give us a solid foundation: We need a coordinated approach to quality assurance and to the identification of successful work.
Several accounts of the workshop are available online, dealing with the morning of day one, the afternoon of day one, day one as a whole, the morning of day two, the afternoon of day two and day two as a whole.
I’ve been listening to educational technology hype recently with an eyebrow raised, particularly in respect of the ideas being expressed around artificial intelligence and the role of intelligent agents in replacing humans. One of the most recent examples of this is Mark Zuckerberg at the F8 conference saying: “Our goal with AI is to build systems that are better than people at perception.” The Telegraph provides a summary of his keynote and the F8 conference.
Sit back and reflect on his statement for a moment.

perception /pəˈsɛpʃ(ə)n/ noun
- the ability to see, hear, or become aware of something through the senses. “the normal limits to human perception”
- the way in which something is regarded, understood, or interpreted. “Hollywood’s perception of the tastes of the American public”
What is perception? A personal view of the world, shaped by our emotional state and environment? An entirely subjective reality? And what do we mean by better perception? Is this seeing the world logically, without the trappings of emotion? Is it about the ‘wisdom of crowds’? If it’s the latter, then we know that this is gradually being debunked: we are seeing greater confirmation bias within social media circles (I referred to this in a previous post as ripples in the pond), and there is evidence of the undermining effect of social influence. However, there is no doubt that artificial intelligence will have access to a greater dataset and will be able to interpret data in ways that would be impossible for humans. My question, though, is whether that will translate into better outcomes.
Invention comes from creative friction, discourse, questioning. In a world where we are all synthesized down within a crucible above the flame of artificial intelligence, what happens to inspiration, interpretation, challenge? This is, of course, a dystopian future that people in the AI world are keen to promote, because it creates a big dream of the future and a strong emotional connection.
But we do need to be concerned, because at a minimum a possible future predicted by Gartner may see smart machines replacing millions of humans. At the same time, we should be rational: we must recognize the myths around AI, whose usefulness lies in supporting human endeavours, especially around tackling big data challenges.
…so what of humanity?