Responsible metrics

Responsible metrics refers to the sensitive and considered use of quantitative measures in research assessment. Three central documents set out the principles of responsible metrics.

San Francisco Declaration on Research Assessment

Commonly abbreviated to DORA, the San Francisco Declaration on Research Assessment primarily focuses on the misuse of the Journal Impact Factor in research assessment. It sets out 18 recommendations that it asks funders, research organisations, publishers, metrics suppliers and researchers to uphold.

The Open University is a signatory of DORA. Our adoption of DORA can be tracked via the Open University DORA Implementation Plan.

The Leiden Manifesto

The Leiden Manifesto for Research Metrics was published as a comment in Nature and proposes 10 principles for the measurement of research performance:

  1. Quantitative evaluation should support qualitative, expert assessment.
  2. Measure performance against the research missions of the institution, group or researcher.
  3. Protect excellence in locally relevant research.
  4. Keep data collection and analytical processes open, transparent and simple.
  5. Allow those evaluated to verify data and analysis.
  6. Account for variation by field in publication and citation practices.
  7. Base assessment of individual researchers on a qualitative judgement of their portfolio.
  8. Avoid misplaced concreteness and false precision.
  9. Recognize the systemic effects of assessment and indicators.
  10. Scrutinize indicators regularly and update them.

The Metric Tide Report

Published in 2015, The Metric Tide is a UK-based report on the role of metrics in research assessment and management. Its executive summary defines responsible metrics across five dimensions:

  • Robustness: basing metrics on the best possible data in terms of accuracy and scope;
  • Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;
  • Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;
  • Diversity: accounting for variation by field, and using a variety of indicators to support diversity across the research system;
  • Reflexivity: recognising the systemic and potential effects of indicators, and updating them in response.

Contact us

Library Research Support team