
OU awarded £250,000 to counter misinformation


The Open University (OU) has been awarded £250,000 from the Engineering and Physical Sciences Research Council to track the manipulation of online information, particularly about COVID-19 and climate change.

The project, CIMPLE, which stands for Countering Creative Information Manipulation with Explainable AI, is part of a €1.6M collaborative project funded through CHIST-ERA, a consortium of international research funding organisations.

Replacing the “black box” approach

The project aims to research and develop innovative knowledge-graph based solutions for the detection and tracking of online information manipulation, while taking into account various AI-explainability needs and requirements. Explainable AI is artificial intelligence (AI) in which the process and results of the solution can be traced and understood by humans. It contrasts with the concept of the "black box" in machine learning where even its designers might struggle to explain why an AI arrived at a specific decision.
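To make the contrast concrete, here is a minimal, purely illustrative Python sketch (not the CIMPLE system, and not a real fact-checking model): a "black box" checker that returns only a verdict, next to an explainable checker that also returns the human-readable evidence behind its decision. The rules, phrases, and example claim are hypothetical.

```python
# Illustrative sketch: "black box" vs explainable misinformation checking.
# All rules and claims below are made-up examples for demonstration only.

def black_box_check(claim: str) -> str:
    """Returns a verdict with no insight into how it was reached."""
    return "questionable" if "miracle cure" in claim.lower() else "unverified"

def explainable_check(claim: str) -> dict:
    """Returns a verdict plus the human-readable reasons behind it."""
    text = claim.lower()
    reasons = []
    if "miracle cure" in text:
        reasons.append('contains the sensational phrase "miracle cure"')
    if "scientists say" in text and "study" not in text:
        reasons.append("appeals to unnamed experts without citing a study")
    verdict = "questionable" if reasons else "unverified"
    return {"verdict": verdict, "reasons": reasons}

claim = "Scientists say this miracle cure stops the virus"
print(black_box_check(claim))    # verdict only
print(explainable_check(claim))  # verdict with supporting reasons
```

A real explainable system would derive such reasons from learned models and knowledge graphs rather than hand-written rules, but the interface difference is the point: the second function lets a human trace why the claim was flagged.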

Tracking COVID-19 and climate change information

CIMPLE will use computational creativity techniques to generate powerful, engaging, and easily understandable explanations of complex AI decisions and behaviours. These explanations will be tested in the domain of detection and tracking of manipulated information, with COVID-19 and climate change as the two main use cases, both of which are common targets of misinformation.

New approach to Explainable AI

Professor Harith Alani, the OU’s Principal Investigator on the project, said: “AI is being increasingly used to automatically detect misinformation to cope with the sheer scale of the problem. But research shows that presenting users with true/false decisions on claims circulating online is often inadequate and ineffective, particularly when a black-box algorithm is used.”

Explainability is of significant importance in the move towards trusted, responsible and ethical AI, yet the understandability of such explanations, and their suitability to particular users and application domains, has so far received very little attention.

Professor Alani adds: “Acceptance of AI approaches is sometimes dependent on various human factors. In some contexts, such as misinformation detection, we need to find more creative and engaging ways to explain the auto-generated credibility assessments.”

CIMPLE starts in April 2021 and will run for three years, involving five partners from five countries:

  • EURECOM, France (CIMPLE Co-ordinator)
  • The Open University, UK
  • INESC-ID, Portugal
  • University of Economics, Czech Republic
  • webLyzard technology, Austria

Read more about research in the OU's Knowledge Media Institute

Find out more about what the OU is doing to support the response to Coronavirus

Contact our news team

For all out of hours enquiries, please telephone +44 (0)7901 515891

