Monthly Archives: July 2016

Revisiting The Metric Tide

As we reported last summer, The Metric Tide is an independent report commissioned by HEFCE that was published in July 2015. It looks at the current and future roles of quantitative evaluation of research (bibliometrics, altmetrics and the like) and has a particular focus on the use of these measures in relation to the next REF.

Nearly a year on, we will look at its recommendations and reception further, especially in the light of other documents regarding research assessment.

The Metric Tide recommends that metrics are used responsibly and provides a definition of responsible use:

“Robustness: basing metrics on the best possible data in terms of accuracy and scope

Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment

Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results

Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system

Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.”

The report also recommends that metrics play a supporting role alongside expert qualitative evaluation (i.e. peer review) rather than replace it. As such, it can be seen to be broadly in line with the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto for Research Metrics.  To many, these documents represent moves towards a new balance in research assessment.

The recommendations in The Metric Tide could be adopted as part of the next REF, but this is not guaranteed. Indeed, it has been observed that developments in the UK higher education and research environment could affect whether and how the report’s recommendations are implemented.  Since those observations were made there have been many further developments, and publication of Lord Stern’s review of the REF, which could have a particular impact, is still awaited.

The chair of the review wrote that the report has increased healthy debate and provides evidence to help the UK higher education community shape research assessment. However, even amongst those who agree with the report’s method and findings, it has been argued that the environment surrounding evaluation of research in UK higher education makes implementing metrics responsibly very difficult.

Get your ORCID working for you!

As I write, over 2,300,000 ORCIDs have been registered by researchers globally.  ORCIDs have gained traction with publishers, funders and universities as a way of persistently identifying individual researchers across different infrastructures and enabling the flow of data across them.  This will save everyone time and effort and increase the quality of datasets in scholarly communications.

However, simply registering for the 16-character alphanumeric identifier at ORCID.org is not enough to get the real benefit of ORCIDs.  A certain amount of configuration of your ORCID record will get it up to date and keep it up to date without additional intervention.  Here’s how…

Getting it up to date – one off additions

1. Add works by search and link.

Go to your ORCID record.  In the Works section click Add works > Search & link and identify a useful source of publications data from the list (a short sketch for checking the results via the ORCID public API follows the list):

  • CrossRef Metadata Search – all articles with a Digital Object Identifier (DOI) should be retrievable from CrossRef Metadata Search so that’s a very good place to start (you will need to authorise CrossRef to read your ORCID record and Add works to it).
  • Scopus to ORCID – import publications associated with your name/Scopus ID in Scopus. Coverage is good, especially in STEM; it is weaker in AHSS, although Elsevier are improving coverage in these disciplines (you will need to authorise Scopus to read your ORCID record and Add works to it).
  • Europe PubMed Central – import publications from the Europe PMC database of biomedical and life sciences research literature (you will need to authorise Europe PMC to read your ORCID record and Add works to it).
  • ResearcherID – if you have a ResearcherID in Web of Science you can import all the publications recorded there by linking your ResearcherID to your ORCID (you will need to authorise ResearcherID to read your ORCID record).  You can also add publications to your ResearcherID from ORCID – see this guide from Queensland University of Technology on Linking your publications to your ORCID ID via ResearcherID.
  • MLA International Bibliography – given the gaps in Scopus, this is a great source for researchers in literature and modern languages (you will need to authorise MLA to read your ORCID record and Add works to it).
  • Additionally, research datasets with a DOI can be added to your ORCID record using the Search and Link – DataCite option, where you can search for and locate datasets (you will need to authorise the linking and log into DataCite using your ORCID iD in the top right corner).
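All of these wizards write to the same Works section, so once you have authorised one or two of them it is worth checking what has actually landed on your record. Any public ORCID record can be read back through the ORCID public API without a token; the sketch below is a minimal example in Python, assuming the current v3.0 public endpoint and using the placeholder iD from ORCID’s own documentation.

```python
# Minimal sketch: list the works on a public ORCID record and which source added each one.
# Assumes the ORCID public API v3.0 (pub.orcid.org); replace the placeholder iD with your own.
import requests

ORCID_ID = "0000-0002-1825-0097"  # placeholder iD used in ORCID's documentation

resp = requests.get(
    f"https://pub.orcid.org/v3.0/{ORCID_ID}/works",
    headers={"Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

for group in resp.json().get("group", []):
    # Each group is one work; it may carry several assertions (e.g. CrossRef and Scopus).
    for summary in group.get("work-summary", []):
        title = summary["title"]["title"]["value"]
        source = (summary.get("source") or {}).get("source-name") or {}
        print(f"{title}  [added by: {source.get('value', 'unknown')}]")
```

If the same paper shows up once per wizard, don’t worry – ORCID groups assertions that share an identifier, which is covered in the de-duplication section below.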

2. Add works by importing a BibTeX file – e.g. you can add all your records in ORO by exporting them as a BibTeX file and then importing them into ORCID (a small script for the export step is sketched after these instructions).

  • In ORO go to Browse>OU Author
  • In your list of publications click “Export as BibTeX”
  • You will get a webpage of all your publications in BibTeX format.
  • If you are using Google Chrome, right click and “Save as”, choose an appropriate name and save as a text document somewhere handy.
  • If you are using IE, right click, then Select all>Copy.  Open NotePad and paste the data into a file. Click File>Save as, choose an appropriate name and save somewhere handy.
  • Go to your ORCID profile
  • In the Works section click Add works>Import BibTeX
  • Find your file and import the publications.
  • Click save or ignore for each publication depending on whether or not you want to retain it in your ORCID record.
  • Further instructions can be found on the ORCID website: http://support.orcid.org/knowledgebase/articles/390530-import-works-from-bibtex-files-website-user
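If you maintain more than one profile, it can be handier to script the export step than to save the page by hand. ORO is an EPrints repository, so the BibTeX export is just a web page you can download; the sketch below is a minimal example, with the export URL left as a placeholder to be copied from your own ORO author page.

```python
# Minimal sketch: download an ORO BibTeX export page and save it ready for ORCID import.
# The export URL is a placeholder - copy the real one from your ORO author publication list.
import requests

EXPORT_URL = "https://oro.open.ac.uk/cgi/..."  # placeholder, not a real export path

resp = requests.get(EXPORT_URL, timeout=30)
resp.raise_for_status()

with open("my_oro_works.bib", "w", encoding="utf-8") as f:
    f.write(resp.text)

# Rough sanity check before importing into ORCID: count the BibTeX entry markers.
print(resp.text.count("\n@"), "entries saved to my_oro_works.bib")
```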

(NOTE: we are working on functionality to automatically update ORO with publications in ORCID and vice versa.)

3. Add works manually – don’t do this!  …unless you really have to and the works can’t be added by any of the methods above.

Keeping it up to date – automated additions

The real benefit of ORCID is that it facilitates the automated flow of data across systems – so set up your ORCID record to automatically update when new records are added from trusted data sources.

In the Works section, go to Search & link to add these feeds to your ORCID record.

  • CrossRef Auto-update – any new article with a Digital Object Identifier (DOI) that is associated with your ORCID will be automatically pulled into your ORCID record.
  • Scopus Author ID Updates – any new items in Scopus associated with your Scopus ID will be pulled into your ORCID record. (Pending clarification from Elsevier – Chris 20/02/2017.  Auto-updates are not currently configured in Scopus – this is a longer-term goal for Scopus & Mendeley – Chris 26/05/2017)
  • DataCite Auto-update – any new dataset with a Digital Object Identifier (DOI) that is associated with your ORCID will be automatically pulled into your ORCID record.

Again you will need to make sure you have authorised these trusted organisations to make changes to your ORCID record.

De-duplication

Having added publications to your ORCID record from several sources, you may find that some of them are duplicated. If you import items that share an identifier (e.g. a DOI) ORCID should combine them in a single listing and allow you to select your preferred version for display.

If this de-duplication doesn’t work you can easily remove duplicates by using the bin icon.
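ORCID’s grouping relies on works sharing an identifier, and the same idea is easy to reproduce if you ever want to audit a list of works outside ORCID (for example a BibTeX or CSV export). The sketch below is a small, hypothetical example that groups works by DOI and flags any DOI appearing more than once; the works and DOIs shown are made up for illustration.

```python
# Minimal sketch: flag works that share a DOI, mimicking ORCID's identifier-based grouping.
# The works list is illustrative; in practice it would come from an export of your record.
from collections import defaultdict

works = [
    {"title": "UML in practice", "doi": "10.1234/example.001"},
    {"title": "UML in practice (conference version)", "doi": "10.1234/example.001"},
    {"title": "Innovating Pedagogy 2015", "doi": None},
]

by_doi = defaultdict(list)
for work in works:
    if work["doi"]:                      # works without a DOI cannot be grouped this way
        by_doi[work["doi"].lower()].append(work["title"])

for doi, titles in by_doi.items():
    if len(titles) > 1:
        print(f"Duplicate candidates for {doi}: {titles}")
```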

Checking Access Permissions

At any stage in the process of authorising trusted organisations you can check who can and cannot add items to your ORCID profile by viewing Account Settings on your ORCID record, which will list all the organisations you have granted permission to.  At any point you can withdraw these permissions.

You are in control… so use your ORCID

ORCID.org is very clear in its commitment to researcher control and privacy – things can only happen to your ORCID record if you expressly give permission for them to happen.

Whenever you can add your ORCID – whether it is to bids, publications, datasets or peer review – add it!  Adding your ORCID to your scholarly works of whatever type asserts your ownership of them and allows them to be pushed around the scholarly communications infrastructure without further intervention.

The San Francisco Declaration on Research Assessment (DORA)


The San Francisco Declaration on Research Assessment (DORA) is an initiative to change the way that research is evaluated.

It centres on the belief that there is an over-reliance on bibliometrics, such as the Journal Impact Factor, which is seen by many as flawed. Other key themes are that research needs to be evaluated on its own merit (not on which journal it’s published in) and that we need to make the best of the flexibility afforded by online publishing.

DORA makes a general recommendation that bibliometrics should not be used as a “surrogate measure” for the quality of research and then makes a number of specific recommendations for publishers, academic institutions, research funders, organisations that supply metrics and individual researchers. All the aforementioned are invited to sign DORA.

For a university, signing DORA would mean it is obliged to:

  • Be explicit about the criteria it uses for the assessment of research and researchers
  • Reinforce that the content of research is what is important (rather than what metric scores it has or what journal it is published in)
  • Consider the value and impact of all research outputs
  • Consider using a variety of measures for the assessment of research and researchers

However, a professor at Imperial College London recently lamented that few UK universities have signed up, calling on more to do so or to explain their reasons for refusing.

*EDIT* For another perspective on the lack of UK universities signing up to DORA, see Is signing DORA that responsible? where Elizabeth Gadd asks “whether signing the DORA principles will give us a better outcome than developing our own locally relevant, properly debated, carefully implemented and monitored principles”.

Do you think DORA will help research?

Do you think the OU should sign up?

ORO Downloads – June 2016

I’m a bit reluctant to publish lists of top downloads from ORO as they only tell the story of those items that get an exceptional number of downloads.  Sometimes these numbers are questionable and can be the result of non-human downloads that haven’t been filtered out by either the EPrints software (on which ORO runs) or the Jisc service IRUS, which we also use to capture download counts.  But, more importantly, such lists don’t capture the more modest downloads accruing on ORO – the repository has, if you like, a long tail of downloads, where the majority of downloads are actually gained by lots and lots of outputs each receiving a smaller number of downloads.

So I’ve expanded our list from 15 to 50 to see what we capture.  It’s still the exceptions (the top 50 makes up about 0.6% of the total Open Access items in ORO) but there are some interesting stories to tell.

First and foremost is the top of the list: Petre, Marian (2013). UML in practice. In: 35th International Conference on Software Engineering (ICSE 2013), 18-26 May 2013, San Francisco, CA, USA.  Marian’s paper has been very popular in ORO for a very long time, but last month was extraordinary, with 1,784 downloads. Fortunately, I think we can explain this: ORO is telling me that the item is being referred to from Wikipedia, and it appears the paper was added as a reference to the Wikipedia page on UML sometime in August last year.  Additionally, in June traffic to the ORO item appears to have been referred from social media sites like Twitter and Facebook, and sites like Y Combinator (a start-up incubator) and Feedly (an aggregator service).  I think it’s fair to say the presence of the paper on Wikipedia has led to its greater dissemination across various web platforms – maybe it’s time for the OU to have a Wikipedian in Residence!

Secondly we have the fifth item on the list: Sharples, M.; Adams, A.; Alozie, N.; Ferguson, R.; FitzGerald, E.; Gaved, M.; McAndrew, P.; Means, B.; Remold, J.; Rienties, B.; Roschelle, J.; Vogt, K.; Whitelock, D. and Yarnall, L. (2016). 创新教学报告2015 —探索教学、学习与评价的新形式 [Innovating Pedagogy 2015]. 开放学习研究 [Open Learning] (2016.1) pp. 1–18.  This is the Chinese translation of the 2015 Innovating Pedagogy Report, which appears in the Chinese-language journal Open Learning.  Last year we wouldn’t have accepted this item in ORO as is: rather than creating a discrete item with associated metadata in the source language, we would have just added the file alongside the original English-language version.  However, last year during Open Access Week we were challenged to accept these items as discrete records in ORO to support discoverability and make ORO a more global resource.  We changed our policy and the benefit is evident here.  Interestingly, it seems that the majority of downloads, at least in June, are coming from the U.S.A. (187) rather than China (43).

Thirdly, there are three theses on the list.  Theses occasionally rank a bit higher than this and make the top 15 – but they are consistently highly downloaded.  Institutional repositories have a major role in supporting the dissemination of materials that do not get published via the standard routes of academic publishing – I’m thinking particularly about theses and reports that may not get a good platform for dissemination elsewhere.  So whilst there has been such an emphasis recently on ORO and the HEFCE Open Access Policy, we shouldn’t lose sight of the key function ORO can play in the dissemination of these other research outputs.

Finally, and perhaps most fundamentally, this Top 50 draws into stark contrast the benefits different faculties get from ORO.  There is only one item from the Science Faculty on this list.  The route to Open Access for Science is well supported by disciplinary repositories and Gold Open Access publishing (frequently, but not always, funded by the RCUK block grant).  Notwithstanding the requirement for a university to be aware of, and showcase, all its published research outputs, the value of the institutional repository can be discipline specific – and we need to pay close attention to that when advocating its usage.

 

A checklist for evaluating journals to publish in

For a recent OU training session I co-ran entitled “Finding and evaluating external scholarship literature” I was asked to speak very briefly about evaluating journals to publish in.

Attendees were introduced to Think Check Submit, which provides “a simple checklist researchers can use to assess the credentials of a journal or publisher”; they then used it to evaluate a scholarship journal they had found and were asked to reflect on its usefulness.

I also distributed a more comprehensive checklist that I synthesized from other universities’ guidance. It covers all the major questions I found and categorises them.

Please feel free to download and use this list as an RTF file or as an ODT file (it is available under a CC0 license). RTF, for example, is compatible with most word processing software including Microsoft Word.

I would be really interested to know:

  • Whether this checklist is useful
  • How it compares to Think Check Submit
  • Whether there are any points you would add or take away

Let me know your thoughts!

Does using ORO provide a citation advantage?

There have been several studies indicating that deposit in an institutional repository (like our own ORO) does provide a citation advantage.  The one I usually refer to when I’m promoting ORO is Lars Kullman’s “The Effect of Open Access on Citation Rates of Self-Archived Articles at Chalmers”, which finds that “self-archived articles have a 22% higher citation rate than articles that were not self-archived”.

Testing this for ORO is something I’ve often wanted to do, but haven’t got round to before now, not least because it’s a tricky thing to investigate.  There are a number of things to bear in mind when evaluating the impact deposit in a repository may have on citations of any particular paper; they include:

  1. The disciplinary variance of citations – some disciplines cite more heavily than others and this might skew the results if a certain discipline is more heavily represented in the dataset being analysed.
  2. Citations increase over time – obviously older papers that have been in circulation longer will accrue more citations.
  3. Items deposited in the institutional repository as Open Access may also be Open Access in another repository (e.g. arXiv) or published Gold Open Access on the publisher’s website.  So it might not be their presence in ORO that is making the difference!
  4. Authors may self-select which papers are made Open Access in a repository, i.e. only the better papers are made openly available.  And these (you would hope!) are the ones that will get more citations.

And at this point I normally give up and go back to the nitty gritty of running ORO – I’m not a researcher.  However, we’ve recently been using the bibliometrics tool SciVal (an Elsevier product based on Scopus data) and evaluating how that can support individuals, research groups and the University to benchmark their research.  So I decided to run some data through it and see what came out, and this is what I found.

Dataset | Total Outputs | Citations per Publication | Field Weighted Citation Impact
ORO research outputs with DOI | 10,008 | 20.9 | 1.88
OU research outputs with DOI | 11,684 | 16.1 | 1.65
OU research outputs | 16,627 | 14.9 | 1.48
  • Date range: 1996-2015
  • Field Weighted Citation Impact (FWCI) – a relative measure of citations normalised by discipline, publication type and publication date, where 1 is the average; e.g. a FWCI of 1.48 means the set has been cited 48% more than expected (a worked sketch of the idea follows this list)
  • All tables ranked by FWCI
  • Data available here: ORB_OROSciValData
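SciVal’s exact normalisation isn’t something I can reproduce here, but the idea behind FWCI is straightforward: divide the citations a publication has actually received by the citations expected for publications of the same field, type and year, then average across the set. The sketch below illustrates that idea with made-up numbers; it is not SciVal’s actual calculation.

```python
# Illustrative sketch of the Field Weighted Citation Impact idea (not SciVal's exact method).
# "expected" stands for the average citations of publications of the same field, type and year.
papers = [
    {"citations": 30, "expected": 10.0},   # per-paper FWCI 3.0 - cited 3x the field average
    {"citations": 5,  "expected": 10.0},   # per-paper FWCI 0.5 - cited half the field average
    {"citations": 12, "expected": 12.0},   # per-paper FWCI 1.0 - exactly as expected
]

per_paper_fwci = [p["citations"] / p["expected"] for p in papers]
set_fwci = sum(per_paper_fwci) / len(per_paper_fwci)

print(f"Set FWCI: {set_fwci:.2f}")  # 1.50, i.e. cited 50% more than expected
```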

OK, let’s unpick that a bit.

  • OU research outputs is everything in Scopus/SciVal that has been affiliated to an OU researcher.  Scopus/SciVal covers only a subset of everything that is published and is skewed towards the sciences – coverage of the Arts and Social Sciences is not great!
  • OU research outputs will capture all outputs from authors with an OU affiliation – these may not be research-contracted staff (e.g. academic-related staff or Associate Lecturers), some of whom may not be able to deposit in ORO.
  • ORO currently captures around two thirds of everything that gets published by OU researchers.
  • ORO captures research outputs of current OU researchers published before they joined the OU – so some ORO items will not have an OU affiliation and therefore not be in the SciVal/Scopus dataset.
  • The ORO research outputs with DOI are only those items in ORO with a DOI – there are a lot of items in ORO that don’t have DOIs – but a DOI list was the only way I could import them into SciVal to do the analysis (a small sketch of that filtering step follows this list)!  I don’t know, but maybe there is a correlation between possession of a DOI and the citability of an output.
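The DOI filtering itself is simple to reproduce if you have an export of ORO records, since SciVal builds publication sets from lists of identifiers such as DOIs. A minimal sketch, assuming a hypothetical CSV export with a “doi” column (the file and column names will vary):

```python
# Minimal sketch: pull the DOIs out of a repository export ready for loading into SciVal.
# Assumes a hypothetical CSV export with a "doi" column; real exports will differ.
import csv

total = 0
dois = set()
with open("oro_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        total += 1
        doi = (row.get("doi") or "").strip().lower()
        if doi:
            dois.add(doi)

with open("oro_dois_for_scival.txt", "w", encoding="utf-8") as out:
    out.write("\n".join(sorted(dois)))

print(f"{len(dois)} records with a DOI out of {total}")
```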

So that’s some context, but the data does seem to indicate that items in ORO get a citation advantage compared to all OU research outputs indexed in SciVal/Scopus.  So I dug a bit deeper and tried to get some data on those publications that were only in ORO or only in the SciVal/Scopus OU dataset.

Dataset | Total Outputs | Citations per Publication | Field Weighted Citation Impact
ORO outputs only (with DOI) | 3,003 | 27.0 | 2.03
OU research outputs not in ORO (with DOI) | 4,679 | 13.0 | 1.41

That appears to be even more emphatic, but it may well be self-selecting – people may only be putting in ORO the stuff they want people to see!

My appetite well and truly whetted, I tried to see whether being Open Access in ORO differed from just being in ORO (i.e. metadata only).

Dataset | Total Outputs | Citations per Publication | Field Weighted Citation Impact
ORO Open Access with DOI | 2,926 | 16.1 | 1.98
ORO research outputs with DOI | 10,008 | 20.9 | 1.88
ORO Metadata only with DOI | 7,116 | 22.8 | 1.84

Note: you’ll see the numbers don’t add up – that’s because there were a number of duplicates that appeared in both sets (i.e. they had been deposited in ORO as both open access and metadata only). 
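For anyone wanting to reproduce these splits, it is just set arithmetic on lists of DOIs: the difference in each direction gives the “only in ORO” and “not in ORO” subsets, and the intersection is where overlaps (and hence the double counting noted above) come from. A minimal sketch with hypothetical file names, each assumed to hold one DOI per line:

```python
# Minimal sketch: split two DOI lists into "only in A", "only in B" and the overlap.
# File names are hypothetical; each file is assumed to hold one DOI per line.
def load_dois(path):
    with open(path, encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}

oro = load_dois("oro_dois.txt")        # e.g. DOIs of items deposited in ORO
ou = load_dois("ou_scopus_dois.txt")   # e.g. DOIs of OU-affiliated items in Scopus/SciVal

print("Only in ORO:      ", len(oro - ou))
print("Not in ORO:       ", len(ou - oro))
print("In both (overlap):", len(oro & ou))
```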

Not much difference at first glance: it might appear that just getting the metadata visible to Google is enough to get a research output noticed and cited.  However, the FWCI value for Open Access is a bit higher than for metadata only, especially when considered in relation to Citations per Publication.  I’ve written here before about how Arts and Social Sciences full text gets downloaded more than STEM and, given differences in citation practices, I wonder if that is happening here too.

Finally, I tried to tie this up by seeing whether appearance in high impact factor journals had a determining influence on the varying citation measures across these datasets.  So I used one of the metrics in SciVal (Publications in Top 10 Journal Percentiles % – SNIP) to see whether this might be the case.

Dataset | Total Outputs | Citations per Publication | Field Weighted Citation Impact | Publications in Top 10 Journal Percentiles % – SNIP
ORO outputs only (with DOI) | 3,003 | 27.0 | 2.03 | 25.4
ORO Open Access with DOI | 2,926 | 16.1 | 1.98 | 22.1
ORO research outputs with DOI | 10,008 | 20.9 | 1.88 | 24.3
ORO Metadata only with DOI | 7,116 | 22.8 | 1.84 | 25.2
ORO and OU with DOI | 2,926 | 18.2 | 1.81 | 23.9
OU research outputs with DOI | 11,684 | 16.1 | 1.65 | 22.0
OU research outputs | 16,627 | 14.9 | 1.48 | 20.1
OU research outputs not in ORO (with DOI) | 4,679 | 13.0 | 1.41 | 18.6

And generally yes, there is a trend that those outputs published in journals in the top percentiles in their subject area do get a higher FWCI value.

However, there is one notable exception: Open Access items in ORO have a higher FWCI value despite having a lower journal percentile ranking than metadata-only deposits.  They also have a very similar journal percentile ranking to, but a significantly higher FWCI value than, all OU research outputs with DOI.

So I think there are a couple of things to take away:

  • All items deposited in ORO do appear to get more citations than those that aren’t, however this may be due to the fact that these items are more likely to be published in more prestigious journals in the field, rather than the fact they have been deposited in ORO.  So it’s likely that there is a self-selecting element here – researchers may be more inclined to deposit items in ORO if they appear in more prestigious journals.
  • Open Access items in ORO appear to contradict this general trend. Open access deposit in ORO may lead to more citations irrespective of the apparent quality of the journal they appear in.