Category Archives: Metrics

“Measuring research: what everyone needs to know” – new ebook available

Want to know more about quantitative research indicators (a.k.a. bibliometrics) such as Journal Impact Factor, h-index or altmetrics? We have a new ebook in stock containing a thorough discussion of this field, the pros and cons of various indicators and the future of measuring research:

Sugimoto, C. R. and Larivière, V. (2018) Measuring research: what everyone needs to know (OU login required)

It also answers questions such as:

  • What are the data sources for measuring research?
  • How is impact defined and measured?
  • How is research funding measured?
  • What is the relationship between science indicators and peer review?

 

Investigating signing DORA in response to funder policy changes

Adopting a responsible metrics approach is seen as good practice across the research community.

However, there is now an additional need for The Open University to sign up to an external responsible metrics statement, such as the San Francisco Declaration on Research Assessment (DORA) or the Leiden Manifesto, or to develop one of its own. Certain major funders have changed their policies, which could impact our eligibility to receive research funding:

“We [The Wellcome Trust] are committed to making sure that when we assess research outputs during funding decisions we consider the intrinsic merit of the work, not the title of the journal or publisher.

All Wellcome-funded organisations must publicly commit to this principle. For example, they can sign the San Francisco Declaration on Research Assessment, Leiden Manifesto or equivalent.

We may ask organisations to show that they’re complying with this as part of our organisation audits.”

(The Wellcome Trust, 2019)

 

“cOAlition S* supports the principles of the San Francisco Declaration on Research Assessment (DORA) that research needs to be assessed on its own merits rather than on the basis of the venue in which the research is published. cOAlition S members will implement such principles in their policies by January 2021.”

(cOAlition S, 2019)

* cOAlition S is a group of funders co-ordinated by Science Europe. It includes UKRI, Wellcome, the European Research Council (ERC), the European Commission and The Bill and Melinda Gates Foundation. They are responsible for Plan S, a radical proposal regarding open access to research publications, from which the above quote is taken.

 

The Library Research Support team recently brought a paper to Research Committee investigating the University’s options for responding to these policy changes. We are looking into how publication metrics are used at The OU and whether any current practices are in tension with these policy changes. All being well, the aim is for The Open University to sign DORA.

We will keep you updated on our progress and would welcome any feedback on this issue.

 

References

cOAlition S (2019) Plan S: Principles and Implementation. Available at: https://www.coalition-s.org/principles-and-implementation/

The Wellcome Trust (2019) Open access policy 2021. London: The Wellcome Trust. Available at: https://wellcome.ac.uk/sites/default/files/wellcome-open-access-policy-2021.pdf

Plan S – a primer

What is Plan S?

Plan S is a radical proposal regarding open access (OA) to research publications.

It was created by cOAlition S, a group of research funders co-ordinated by Science Europe. It includes UKRI (UK Research and Innovation), Wellcome, the European Research Council (ERC), the European Commission and The Bill and Melinda Gates Foundation.

What does Plan S propose?

The crux of Plan S is that peer-reviewed research publications resulting from grants that the coalition allocates:

“must be fully and immediately open and cannot be monetised in any way”

cOAlition S believe they have a duty of care towards research as a whole. Thus they favour OA because it helps research function more efficiently and have greater impact on society. They feel there is no justification for keeping research publications behind paywalls and that progress towards OA needs accelerating.

More specifically, Plan S requires that all peer-reviewed research publications funded via calls posted from 1st January 2021 must be:

  • Published in an OA journal where the content is OA immediately (gold OA)

OR

  • Published in an OA repository where the content is OA immediately (green OA with no embargo)
      • At The OU, authors could comply by depositing their work in ORO, as long as the work meets all other Plan S requirements

Making research data and other outputs OA is encouraged and a statement clarifying policy regarding monographs and book chapters is expected by the end of 2021.

Other headlines include:

  • Publication in hybrid journals (i.e. subscription-based journals that charge a fee to make articles OA) will not be supported…
    • …unless the journal moves towards becoming fully OA within a defined timeframe under a “transformative arrangement”
  • Authors or their institutions must retain copyright
    • CC BY is the preferred licence
  • Publishers should charge reasonable fees for OA and make the structure of these fees transparent
    • Funders may even standardise and cap the fees they pay
  • A commitment to the responsible evaluation of research when allocating funds
    • The coalition states it will judge research on its own merit and not on things like the journal it was published in or metrics such as Journal Impact Factor
  • Compliance with Plan S will be monitored and non-compliance will be sanctioned

However, the devil is in the detail – there are a lot of elements to Plan S and we recommend reading it yourself to see which aspects might impact you.

What are people saying about Plan S?

There have been a LOT of reactions to Plan S and these are, predictably, mixed. Some of the themes I have noticed are:

  • Many people support the aims of Plan S
  • There is concern it is too STEM-focused and will negatively affect arts, humanities and social sciences (AHSS) researchers
  • There is concern regarding some of the implementation detail
    • e.g. the technical specifications regarding publications, OA repositories and other OA platforms
  • Some believe it will impinge on academic freedom
    • i.e. to choose where and how to publish
  • There is concern about the effects it will have on smaller publishers and learned societies
  • The timescale is too ambitious
  • We have been here before
    • There have been statements, reports and policies made in the past which did not push through the radical change anticipated

 

What is next for Plan S?

There is still a lot of uncertainty regarding the detail and implementation of Plan S, so all concerned will need to keep a watching brief.

What are responsible metrics?

“Responsible metrics” refers to the ethical and appropriate use of citation-based metrics (e.g. citation counts, Journal Impact Factor, h-index), altmetrics (e.g. how many times research is mentioned, used, saved and shared on blogs, social media and social bookmarking services) and other quantitative means of evaluating research.
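To make one of these indicators concrete, here is a minimal sketch (in Python, purely illustrative and not tied to any particular data source) of how an h-index is calculated: a researcher has an h-index of h if h of their publications have been cited at least h times each.

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank   # this paper still supports an h equal to its rank
        else:
            break      # once counts fall below rank, h cannot grow
    return h

# Five papers with these citation counts give an h-index of 3:
# three papers have at least 3 citations each, but not four with 4.
print(h_index([12, 6, 5, 3, 1]))  # -> 3
```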

It applies to everyone involved in using or producing these metrics, e.g.:

  • researchers
  • funders
  • institutions (i.e. universities and other bodies that employ researchers)
  • publishers
  • organisations that supply metrics

The idea is to offer guidelines for good practice that help prevent scenarios such as:

  • a journal article being judged solely on the journal it is published in rather than on its own merit
  • universities focusing on improving their place in a ranking list, when the completeness of data and appropriateness of measures the list uses are contested
  • employers using arbitrary metric thresholds to hire and/or fire staff
  • the assessment of research in general being skewed by the fact that metrics can be gamed and/or lead to unintended consequences

Adopting a responsible metrics approach is seen as good practice across the research community.

The Metric Tide, an important report published in 2015, helped foreground and frame discussion of responsible metrics (in the UK at least). It states:

“Responsible metrics can be understood in terms of a number of dimensions:

Robustness: basing metrics on the best possible data in terms of accuracy and scope;

Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;

Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;

Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system;

Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response”

Other important milestones in responsible metrics include the San Francisco Declaration on Research Assessment (DORA), formulated in 2012, and The Leiden Manifesto for research metrics, which was published in 2015.

Expect to hear more about this issue as research funders begin to implement the principles of responsible metrics and demand that organisations receiving grants from them do likewise – see Plan S and Wellcome’s Open access policy 2021.

Learn about research metrics across disciplines with The Metrics Toolkit

The Metrics Toolkit is a new resource that allows you to learn more about a wide variety of research metrics (a.k.a. quantitative research indicators). It will also help you decide which ones to use in particular cases, according to the type of impact you hope to measure, the kind of research output involved (e.g. journal articles, books, datasets) and the academic discipline in which you are working.

The Metrics Toolkit was developed by two academic librarians and a director of a metrics company, in conjunction with an advisory board of four researchers and one academic librarian. The project is supported by Indiana University – Purdue University Indianapolis, Altmetric and FORCE11.

July ORO downloads – how do people get to ORO?

This is the second of three posts looking at the benefits and functions of the institutional repository through the lens of the top monthly downloads. This post looks at the different ways people get to the Open Access papers in ORO.

In June and July the top 50 downloads in ORO had another new entry:

Burel, Grégoire; Saif, Hassan; Fernandez, Miriam and Alani, Harith (2017). On Semantics and Deep Learning for Event Detection in Crisis Situations. In: Workshop on Semantic Deep Learning (SemDeep), at ESWC 2017, 29 May 2017, Portoroz, Slovenia.

In which the authors “introduce Dual-CNN, a semantically-enhanced deep learning model to target the problem of event detection in crisis situations from social media data.”

The paper was added to ORO on the 14th June and was 13th on the top downloads list in June with 211 downloads, and 24th with 146 downloads in July.

Referrals from social media seem to have had a significant impact on the downloads this paper received, most notably from Twitter. On June 25th the Accel.AI (Artificial Intelligence network) Twitter account tweeted a direct link to the paper:

This was retweeted by Massimiliano Versace

and then he retweeted himself retweeting @AccelerateAI

The following day it was tweeted by Vineet Vashishta (a “Top 10 influencer on #MachineLearning & #DataScience”) – this amassed the most retweets and likes.

The tweets (and their retweets) seem to have had a direct impact on downloads, especially the last one, which appears to have resulted in over 100 downloads of the paper.

This seems to tie in with a previous analysis of ORO downloads and the beneficial impact of the patronage of a Twitter Heavyweight. The lead author Grégoire Burel, Research Associate in KMi in STEM, added:

“It seems to be a ‘completely out of the blue’ case. We have a follow up paper (‘Semantic Wide and Deep Learning for Detecting Crisis-Information Categories on Social Media’) that will be presented soon at ISWC17 (21-25 October) so it would be interesting to see if it gets picked up again after we publish it to ORO”

I’ll certainly be keeping an eye on it!

Search and Referrals

Whilst the majority of traffic coming to ORO is from a direct search in Google, there is an increasing trend for referrals to ORO, both from social media and from other referring websites like Google Scholar. In 2014, 15% of traffic came from referrals; this year (to date) it’s up to 25%.

This shift in traffic from direct search to referrals is interesting.  A Forbes article back in May, The Trend To Facebook Referrals Is A Risk To Google Search, called it context search:

“People often want answers to their questions within the context of their community. So “searches” are changing. People are going back to what they did before Google existed – they are asking for information from their friends. But online. And primarily using Facebook.”

I find that quite compelling and so far this year:

  • Referrals from social media have a lower bounce rate (71%) than search (78%)
  • Referrals from social media have a higher average session duration (1:45 minutes) than search (1 minute).
  • Referrals from social media have more pages per session (2.13) than search (1.61).

However, results from general referrals (e.g. from clicking a link on a website) compare as well as or better than referrals from social media:

  • 66% bounce rate
  • 1:45 minutes a session
  • 2.16 pages per session

So maybe it’s not so much about someone you (kind of) know on social media giving you a tip, as actually knowing you’ve found what you were looking for.
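For anyone unfamiliar with these Google Analytics terms, here is a minimal sketch (in Python, with made-up session data) of roughly how the three figures above are derived: bounce rate is the share of sessions that viewed only one page, and the other two are simple per-session averages. This is a simplification of how Google Analytics actually computes them.

```python
# Illustrative only: each tuple is (pages_viewed, seconds_on_site)
# for one session from a given traffic source.
sessions = [(1, 0), (3, 140), (1, 0), (2, 95), (4, 210)]

bounces = sum(1 for pages, _ in sessions if pages == 1)
bounce_rate = bounces / len(sessions)                       # share of one-page visits
avg_duration = sum(s for _, s in sessions) / len(sessions)  # mean seconds per session
pages_per_session = sum(p for p, _ in sessions) / len(sessions)

print(f"Bounce rate: {bounce_rate:.0%}")              # Bounce rate: 40%
print(f"Avg session duration: {avg_duration:.0f}s")   # Avg session duration: 89s
print(f"Pages per session: {pages_per_session:.2f}")  # Pages per session: 2.20
```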

Top downloads list for July:

Library advice regarding the responsible use of quantitative research indicators


We have written some Library advice regarding the responsible use of quantitative research indicators (the preferred term for metrics/bibliometrics). This outlines our approach to such indicators, representing current good practice and acting as a guide for future activities.

We believe that this is important because, whilst useful in certain circumstances, quantitative research indicators need to be understood to be used fairly and effectively.

Library Services provides support and guidance to The OU research community regarding quantitative research indicators. If you would like any more information on this subject, please see our website or contact library-research-support@open.ac.uk

Web of Science

Web of Science is a renowned abstract and citation database. It has wide coverage and facilitates analysis of its content, which can help identify links between past and current research, between collaborators or between funding and research impact, for example.

It is well-known among researchers but, for those of you who have yet to use it in anger, here are a few reasons for taking a closer look:

It has a lot of content

Web of Science contains over 33,000 journals. It also covers books, conference proceedings, data sets and patents back to 1900. Whilst its index is not as big as that of Scopus and some other sources, its size is still considerable and makes it worth considering as a place to start finding literature for your research or as a dataset for analysis.

The content is tightly curated

Many people believe that one of Web of Science’s main strengths is that its content is chosen by a team of experts. The idea is that only high-quality and relevant publications are included. Furthermore, it is generally felt that the metadata and citation data are of high quality.

It is a source of bibliometrics

You can easily see the citation count for an article (highlighted red here) on the search results page:

You can also click Create Citation Report from a search results page to get more insight into the citation data:

Furthermore, Web of Science provides Journal Impact Factor scores. Journal Impact Factor is a contested metric (see this Science article or this piece on Occam’s Typewriter to get a flavour of the criticisms) but, for better or worse, it is still used and it can be useful to know how to access it. Simply click on a journal’s title from a search results page to see its Impact Factor and related metrics:
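As a reminder of what that number represents: a journal’s Impact Factor for a given year is the number of citations received in that year by items the journal published in the previous two years, divided by the number of citable items (articles and reviews) it published in those two years. A minimal sketch, with made-up numbers for a hypothetical journal:

```python
# Illustrative only: the 2018 Impact Factor of a hypothetical journal.
citations_in_2018 = 450     # 2018 citations to the journal's 2016-17 items
citable_items_2016 = 120    # articles and reviews published in 2016
citable_items_2017 = 130    # articles and reviews published in 2017

impact_factor = citations_in_2018 / (citable_items_2016 + citable_items_2017)
print(f"2018 Journal Impact Factor: {impact_factor:.1f}")  # -> 1.8
```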

You can also link through to Journal Citation Reports from Web of Science, which allows you to explore Journal Impact Factor and other journal-level metrics in more detail. Follow the “Journal Citation Reports” link (highlighted red here) at the top of the page:

Indeed, if metrics are your bag, then Web of Science is one of the main sources worth investigating. You can get more general information on our bibliometrics page.

However, as an abstract and citation database, Web of Science requires you to link out to access the full text of articles. Indeed, there is no guarantee that The OU will subscribe to the full text – use the “Checks if the OU offers full text” link for an article (highlighted red below) to find out:

You can log into Web of Science (OU student or staff credentials required) to investigate for yourself.

Scopus – a large abstract and citation database for research

Scopus is a large, multi-disciplinary abstract and citation database of peer-reviewed literature, including scientific journals, books and conference proceedings.

Here are just a few of the features that make it worth considering as a researcher:

It has a lot of content

Scopus boasts over 65 million records and claims to be the biggest database of its kind (we understand that Google Scholar and Microsoft Academic may index more records but that their content is not curated in the same way). This alone means it is worth investigating if you want to discover literature for your research.

It doesn’t cover everything (no database does) and its subject coverage isn’t equal (there is more content in the sciences than in the arts, for example), but it can still provide a good starting point for a lot of people. Learn more on the Scopus Content page.

It has powerful search features

As well as intuitive basic search features, Scopus allows you to search by author and affiliation (i.e. the university, company or other organisation that an author works for). It also has a potent advanced search feature, which allows for the construction of complex searches – really useful if you are after something specific. Learn more on the Scopus Features page.
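As a purely illustrative example (a hypothetical query, not taken from the Scopus documentation – see the Scopus Features page for the full field-code syntax), an advanced search combining field codes might look like this. It finds post-2015 records mentioning “machine learning” in the title, abstract or keywords from authors affiliated with a given institution:

```
TITLE-ABS-KEY("machine learning") AND AFFIL("open university") AND PUBYEAR > 2015
```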

It is a source of bibliometrics 

Scopus records the citations that publications receive, as well as providing metrics on things like social media mentions, uses on Mendeley and CiteULike, and mentions in the mass media.

You can easily see the metrics for an article by looking at the “Metrics” box on its “Document details” page:

Indeed, if metrics are your bag, then Scopus is one of the main sources worth investigating. You can get more general information on our bibliometrics page.

However, as an abstract and citation database, Scopus usually requires you to link out to access the full text of articles. Indeed, there is no guarantee that The OU will subscribe to the full text  – use the “Checks if the OU offers full text” link  for an article (highlighted red below) to find out:

You can log into Scopus (OU student or staff credentials required) to investigate for yourself.

 

The Secret Life of Repository Downloads

The download data of Open Access content in ORO can tell some fascinating stories, and the counts from December and January are no exception… it really is amazing what you can discover with a bit of digging!

The first one that jumped out at me from the December list is a journal item published back in 2002 by Dr Sara Haslam in FASS:

Haslam, Sara (2002). Written in blood: Family, sex and violence in Wuthering Heights and Jane Eyre. The English Review, 13(1) pp. 8–11.

A “steady performer” that averages between 20 and 30 downloads a month. But December and January saw a spike, with 100 downloads in December and 124 in January, which took it into the top 50 list (see below). Looking at the referrals, I noticed a large number coming from open.edu, or OpenLearn to you and me. A quick search found this page, which had a link to the ORO page for the article.


Sara was the academic consultant on the OU/BBC co-production “To Walk Invisible” and this was one of the OpenLearn pages supporting the programme – a great example of connecting ORO and OpenLearn – how joined up!

Looking at Google Analytics to see how many hits the ORO page got from OpenLearn tells us the ORO page was visited 251 times in the week immediately following broadcast (29th December to 4th January). The actual PDF of the article was downloaded 115 times. So, roughly, half the visitors coming to ORO from OpenLearn were interested enough to download the paper!

Mapping the site visits and downloads of the paper gives us this graph.

The graph shows that the greatest spike came immediately after broadcast of the programme. But there is a tail of site visits and downloads that coincides with the availability of the programme on iPlayer. It’s a great example of connecting Open Learning and Open Research.

The second story comes from the January downloads and relates to a paper co-authored by Dr Mathijs Lucassen in WELS:

Fleming, Theresa M.; Bavin, Lynda; Stasiak, Karolina; Hermansson-Webb, Eve; Merry, Sally N.; Cheek, Colleen; Lucassen, Mathijs; Lau, Ho Ming; Pollmuller, Britta and Hetrick, Sarah (2017). Serious Games and Gamification for Mental Health: Current Status and Promising Directions. Frontiers in Psychiatry, 7, article no. 215.

This one went through the roof, with 604 downloads in January making it the second most downloaded item that month (full top 50 below). It was added to ORO on the 10th January and almost immediately picked up on Twitter by @andi_staub.

The download pattern shows a remarkable correlation between that tweet and the number of ORO downloads for that article.


Initially I was suspicious that a single tweet could have that impact, even though it did get plenty of likes and retweets. But Andreas Staub is apparently a Top 20 influencer in the world of FinTech. FinTech (Wikipedia told me) “is an industry composed of companies that use new technology and innovation with available resources in order to compete in the marketplace of traditional financial institutions and intermediaries in the delivery of financial services” and received $19.1bn in funding in 2015.

So why might a FinTech influencer be interested in this research? Mathijs gave me the lowdown:

People do seem very interested in serious gaming in mental health…I wonder if it is because people are aware of the addictive potential of commercial games, so they wonder how can a game be therapy?  There are some really interesting ones out there (in addition to SPARX – I was a co-developer – Professor Sally Merry has led this work), like “Journey to the Wild Divine” a ‘freeze-framer’ game based on bio-feedback in a fantasy setting. The program is a mind and body training program, and uses biofeedback hardware (e.g. a user’s heart rate) along with highly specialised gaming software to assist in mindfulness and meditation training (e.g. a user has to learn to control their body in certain ways in order to progress through the game)…Plus programs like “Virtual Iraq” (to assist service men and women with Post Traumatic Stress Disorder with their recovery).

There was one other thing about the downloads for this paper. It was published in an Open Access journal, so I’d have expected most downloads to come from the journal site. But the majority of downloads (at least in January, following this tweet) were from ORO.


This indicates to me that institutional repositories can be as good as any other platform, whether publisher platforms or commercial academic social networking sites, for disseminating your research. Full Top 50 lists for downloads are below: 2016-12-monthly_downloads 2017-01-monthly_downloads