Research data sharing: ensuring greater research integrity?

You may have read in the news recently about a scandal concerning the doctoring of research data in a lab run by a top UK academic. Earlier this month UCL released details of the inquiries into misconduct, which were undertaken in 2014 and 2015. Of the 60 papers reviewed, the panels found evidence of misconduct in 15. This included “cloning”, whereby features were copied and pasted throughout an image, and some of the data fabrications were reportedly fundamental to the conclusions reached by the authors.

This news story struck me as a prime example of why data sharing is so important for improving research integrity. If the data underpinning the papers in question had been made publicly available in a trusted research data repository, it seems unlikely that misconduct on this scale would have happened. Data sharing should encourage greater transparency of results – making researchers less likely to falsify findings or fabricate data, and making any such misconduct far quicker to spot. Would a culture of data sharing also have instilled a sense of responsibility in researchers to “do the right thing” rather than cutting corners?

Sharing research data can seem like an onerous task; however, if a possible outcome of data sharing is greater research integrity, then it needs to be recognised as an important part of all researchers’ work.

Posted in Research Data Management | Leave a comment

What is a systematic review and how does it differ from a ‘regular’ literature review?

There are many different types of literature review, and a lot of terminology surrounding them.

This creates confusion, particularly around systematic reviews. The term, strictly speaking, refers to a specific and particularly rigorous method that has its origins in biomedicine and healthcare (although it has been adapted and used in other disciplines). However, many people use the term to refer to a ‘regular’ literature review that is methodical and comprehensive.

In short, if somebody asks you to carry out a systematic review, it is worth clarifying exactly what they have in mind.

Here, we will spell out the differences between ‘regular’ literature reviews and systematic reviews as we see them:

‘Regular’ literature reviews

A regular literature review involves finding, analysing and synthesizing relevant literature, then presenting it in an organised way to the reader.

Regular literature reviews can be methodical and comprehensive. They can involve attempting to find all the literature there is on a topic, recording results and reflecting on strategies. We could even describe them as being “systematic” in an informal way but they do not employ the full formal methods of a systematic review, as outlined below.

Systematic reviews

In biomedicine and healthcare a systematic review aims to be exhaustive, objective, transparent and replicable, employing specific methods to reach these goals. It typically involves stages such as:

  • Creation of a structured research question to guide the process
  • Writing a protocol or following a previously established protocol, which sets out the methods the systematic review will use
    • A protocol covers things like which databases will be used, why they will be used, what keywords will be used, what other search techniques will be used. The protocol is usually developed through testing and is often peer-reviewed
  • A methods section, including:
    • A list of all databases and/or journals that were searched
    • The exact keywords, limiters etc. that were used
    • When each search was undertaken
    • How many results each search found
  • Comparison of the titles and abstracts of articles found against inclusion criteria
  • Meta-analysis may be undertaken
    • In this context, meta-analysis refers to the statistical analysis of data from comparable studies
  • Reporting on the results of all included studies, highlighting any similarities and differences between them
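To give a flavour of the statistics behind the meta-analysis stage, here is a minimal, illustrative sketch in Python of a fixed-effect, inverse-variance-weighted pooling of study results. The effect sizes and variances below are invented for illustration, not taken from any real review:

```python
# Fixed-effect meta-analysis sketch (illustrative numbers only).
# Each study contributes an effect estimate and its variance; studies are
# weighted by inverse variance, so more precise studies count for more.

def pooled_effect(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)
    return pooled, pooled_variance

# Three hypothetical comparable studies (e.g. mean differences)
effects = [0.30, 0.50, 0.40]
variances = [0.01, 0.04, 0.02]

estimate, var = pooled_effect(effects, variances)
print(round(estimate, 3), round(var, 4))  # → 0.357 0.0057
```

Real meta-analyses involve much more than this (heterogeneity checks, random-effects models, sensitivity analyses), but the core idea – precision-weighted averaging across comparable studies – is as simple as the sketch above.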

A systematic review is often preceded by a scoping review: a relatively brief search of relevant databases, which aims to tell you whether your research question, in its current form, is worth pursuing or whether it needs changing. This process also tells researchers whether a recent or ongoing review of the topic already exists – if it does, then a new systematic review may not be necessary.

The description above is necessarily brief and partial. We recommend that you consult guidance such as that produced by the Centre for Reviews and Dissemination (CRD) for a fuller explanation of how systematic reviews work in biomedicine and healthcare.

As mentioned, the systematic reviews method has been adapted by other disciplines. For example, the Campbell Collaboration have adopted the method, defining systematic reviews and producing guidance with a focus more on the social sciences. There are also books (e.g. this book we have in print at the Library) and articles (e.g. this article which is open access) on systematic reviews in the social sciences.

If you want to know more about systematic reviews, you can also watch the recording of the online training session by Library Services (OU login required).

Posted in Library research support | Tagged , , , | Leave a comment

What was the first academic journal?

Well, like most things in academia, the question of which academic journal came first is contested.

Usually, it’s seen as being between Journal des sçavans, which was based in Paris, and Philosophical Transactions, produced by the Royal Society of London.

It is not contested that Journal des sçavans was the first of these to be published. Its first issue appeared on January 5th 1665, whereas the first issue of Philosophical Transactions was published 60 days later.

However, the issue seems to be whether historians consider Journal des sçavans to be a “true” academic journal, with some believing that it didn’t contain enough original science to count.

Either way, the history of scholarly communications is fascinating and I look forward to digging into it some more to see what it can tell us about the future of academic publishing.

References

History of Philosophical Transactions. (n.d.). Retrieved June 27, 2019, from The Secret History of the Scientific Journal website: https://arts.st-andrews.ac.uk/philosophicaltransactions/brief-history-of-phil-trans/

McClellan, J. E. (2005). Scientific Journals. In A. C. Kors (Ed.), Encyclopedia of the Enlightenment (online). Retrieved from https://www.oxfordreference.com/view/10.1093/acref/9780195104301.001.0001/acref-9780195104301-e-652 (subscription-based resource)

The Royal Society. (2019). History of Philosophical Transactions. Retrieved June 27, 2019, from The Royal Society website: https://royalsociety.org/journals/publishing-activities/publishing350/history-philosophical-transactions/

Spinak, E., & Packer, A. L. (2015). 350 years of scientific publication: from the “Journal des Sçavans” and Philosophical Transactions to SciELO. Retrieved June 27, 2019, from SciELO in Perspective website: https://blog.scielo.org/en/2015/03/05/350-years-of-scientific-publication-from-the-journal-des-scavans-and-philosophical-transactions-to-scielo/#.XRSE8uhKiUm

Swoger, B. (2012). The (mostly true) origins of the scientific journal. Retrieved June 27, 2019, from Scientific American Blog Network website: https://blogs.scientificamerican.com/information-culture/the-mostly-true-origins-of-the-scientific-journal/

Posted in Academic journals | Tagged , , , , | Leave a comment

ORDO best practice #2 Archiving a website

Continuing my series on best practice in ORDO, this time I’m going to trumpet The Robert Minter Collection: https://doi.org/10.21954/ou.rd.7258499.v1 which was deposited by Trevor Herbert in December 2018. According to the ORDO record:

This is a copy of the data underlying the website ‘The Robert Minter Collection: A Handlist of Seventeenth- and Eighteenth-Century Trumpet Repertory’ which contained a database of music collected by Robert L. Minter (1949-81).

Minter’s interest was in the collection of sources that contribute to our understanding of the trumpet at various points in its history before the twentieth century.

This is regarded as one of the world’s largest fully catalogued datasets about early trumpet repertoire.

The website in question was created in 2008 and is no longer active; however, it had been archived by the Internet Archive, most recently in May 2017. In 2018, Trevor approached the Library for help archiving the data contained on the website because he was aware that, although the Internet Archive had maintained much of the information, not all functionality and content had been preserved; most crucially, the database itself was no longer searchable.

ORDO was deemed a good fit for creating an archive of the content of the website. It allows the deposit of any file type and enables in-browser visualisation of many of these so it is not always necessary to download documents in order to view them. By depositing the material in ORDO, Trevor also obtained a DOI (Digital Object Identifier) – a persistent, reliable link to the record which will be maintained even if the materials are no longer available for any reason. Any materials added to ORDO are guaranteed to be maintained for a minimum of ten years.

Within the record there are four files: an Access database, a CSV copy of the data, a zip file containing information about the collection, database and website, and a list of the files in the zip. The description in the record makes it clear to potential users what they are accessing and how the files can be used. Since it was deposited in December, the collection has been viewed 139 times and downloaded 18 times. Now that deserves a fanfare!

Posted in Best practice in ORDO, ORDO, Research Data Management | Leave a comment

Plan S – a primer

What is Plan S?

Plan S is a radical proposal regarding open access (OA) to research publications.

It was created by cOAlition S, a group of research funders co-ordinated by Science Europe, whose members include UKRI (UK Research and Innovation), Wellcome, the European Research Council (ERC), the European Commission and the Bill & Melinda Gates Foundation.

What does Plan S propose?

The crux of Plan S is that peer-reviewed research publications resulting from grants that the coalition allocates:

“must be fully and immediately open and cannot be monetised in any way”

cOAlition S believe they have a duty of care towards research as a whole. Thus they favour OA because it helps research function more efficiently and have greater impact on society. They feel there is no justification for keeping research publications behind paywalls and that progress towards OA needs accelerating.

More specifically, Plan S requires that all peer-reviewed research publications funded via calls posted from 1st January 2021 must be:

  • Published in an OA journal where the content is OA immediately (gold OA)

OR

  • Published in an OA repository where the content is OA immediately (green OA with no embargo)
      • At The OU, authors could comply by depositing their work in ORO, as long as the work meets all other Plan S requirements

Making research data and other outputs OA is encouraged and a statement clarifying policy regarding monographs and book chapters is expected by the end of 2021.

Other headlines include:

  • Publication in hybrid journals (i.e. subscription-based journals that charge a fee to make articles OA) will not be supported…
    • …unless the journal moves towards becoming fully OA within a defined timeframe under a “transformative arrangement”
  • Authors or their institutions must retain copyright
    • CC-BY is the preferred license
  • Publishers should charge reasonable fees for OA and make the structure of these fees transparent
    • Funders may even standardise and cap the fees they pay
  • A commitment to the responsible evaluation of research when allocating funds
    • The coalition states it will judge research on its own merit and not on things like the journal it was published in or metrics such as Journal Impact Factor
  • Compliance with Plan S will be monitored and non-compliance will be sanctioned

However, the devil is in the detail – there are a lot of elements to Plan S and we recommend reading it yourself to see which aspects might impact you.

What are people saying about Plan S?

There have been a LOT of reactions to Plan S and these are, predictably, mixed. Some of the themes I have noticed are:

  • Many people support the aims of Plan S
  • There is concern it is too STEM-focused and will negatively affect AHSS researchers
  • There is concern regarding some of the implementation detail
    • e.g. the technical specifications regarding publications, OA repositories and other OA platforms
  • Some believe it will impinge on academic freedom
    • i.e. to choose where and how to publish
  • There is concern about the effects it will have on smaller publishers and learned societies
  • Some feel the timescale is too ambitious
  • Some point out that we have been here before
    • There have been statements, reports and policies made in the past which did not push through the radical change anticipated

What is next for Plan S?

There is still a lot of uncertainty regarding the detail and implementation of Plan S, so all concerned will need to keep a watching brief.

Posted in Funding bids, Library research support, Metrics, Open Access | Tagged , , , , , | Leave a comment

Call for Data Champions!

The Library is launching a new Data Champions programme, and we are looking for PGR students and staff who are interested in taking part.

What would this involve?

Data Champions are expected to:

  • Lead by example – make data open (via ORDO or other data repositories); share best practice through case studies and blog posts, and share Data Management Plans on the Library Research Support website 
  • Promote OU Research Data Management (RDM) services and tools within your unit
  • Provide discipline specific data management advice and support to colleagues
  • Attend and contribute to Library-run events 
  • Contribute to The Orb, Open Research Blog 
  • Offer feedback to Library Services to support RDM service development

What’s in it for me?

Data Champions will benefit from the following: 

  • Boost CV – increase funding opportunities by having RDM “expert” status  
  • Increase visibility – dedicated profile on the Data Champions webpage, opportunity to contribute to the successful Open Research Blog 
  • Opportunity to network with colleagues from across the OU 
  • Be instrumental in developing the OU Research Data Management Service and improving the culture of data sharing at the OU 
  • Receive 100 GB of data storage on ORDO as default 
  • Attendance for one Data Champion per year to the annual Figshare Fest conference in London 

Do I need to be a data expert?

No  – we’re looking for a range of people from different disciplines who work in different ways with different types of data. You could be a research student, early career researcher, professor, member of research support staff or an IT specialist. You might have experience compiling surveys, collecting lab-based data, harvesting big data or creating video data. Whoever you are and whatever your area of interest, we’d love to hear from you.

Don’t worry if you don’t consider yourself a data expert; your knowledge of your specific area is invaluable, and training and support will be given.

What’s the time commitment?

We expect the Data Champion role to require a commitment of 1-3 hours a month, but this can be flexible according to the amount of time you are able to give.

How do I apply?

Send an email to library-research-support@open.ac.uk by 31st July with the subject “Data Champions”, stating what type of research you are involved with and whether there’s any particular contribution you’d like to make.

When do I start?

We are going to launch the programme with a Data Champions Forum in September. This will be an opportunity to meet the other Data Champions, find out more and help shape the Data Champions programme.

Posted in Data Champions, Research Data Management | Leave a comment

What are responsible metrics?

“Responsible metrics” refers to the ethical and appropriate use of citation-based metrics (e.g. citation counts, Journal Impact Factor, H-index), altmetrics (e.g. how many times research is mentioned, used, saved and shared on blogs, social media and social bookmarking services) and other quantitative means of evaluating research.
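The metrics themselves are often very simple to compute, which is part of why they are so easily over-used. The H-index mentioned above, for instance, is just the largest number h such that a researcher has h papers with at least h citations each. A quick illustrative sketch in Python (the citation counts are invented):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# A hypothetical researcher with five papers
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The ease of calculation is exactly why responsible-metrics guidance matters: a single number like this compresses away field differences, career stage and publication norms, so it should inform expert judgement rather than replace it.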

It applies to everyone involved in using or producing these metrics e.g.:

  • researchers
  • funders
  • institutions (i.e. universities and other bodies that employ researchers)
  • publishers
  • organisations that supply metrics

The idea is to offer guidelines for good practice that help prevent scenarios such as:

  • a journal article being judged solely on the journal it is published in rather than on its own merit
  • universities focusing on improving their place in a ranking list, when the completeness of data and appropriateness of measures the list uses are contested
  • employers using arbitrary metric thresholds to hire and/or fire staff
  • the assessment of research in general being skewed by the fact that metrics can be gamed and/or lead to unintended consequences

Adopting a responsible metrics approach is seen as good practice across the research community.

The Metric Tide is an important report published in 2015, which helped foreground and frame discussion of responsible metrics (in the UK at least). It states:

“Responsible metrics can be understood in terms of a number of dimensions:

Robustness: basing metrics on the best possible data in terms of accuracy and scope;

Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment;

Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results;

Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system;

Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response”

Other important milestones in responsible metrics include the San Francisco Declaration on Research Assessment (DORA), formulated in 2012, and The Leiden Manifesto for research metrics, which was published in 2015.

Expect to hear more about this issue as research funders begin to implement the principles of responsible metrics and demand that organisations receiving grants from them do likewise – see Plan S and Wellcome’s Open access policy 2021.

Posted in Metrics | Tagged , , , , , , , , , , , | Leave a comment

ORCID Training

An ORCID is a 16-digit persistent identifier for researchers and contributors.  Its purpose is to:

(1) disambiguate researchers with similar names in any system (e.g. Web of Science, ORO or ORDO)

(2) aid data transfer across systems, saving you from re-keying information (e.g. if your ORCID is linked to a set of publication information in one system, then simply adding your ORCID to another system allows all that information to be pulled across automatically). That’s the idea, anyway!
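For the curious: the final character of an ORCID iD is actually a check character, generated from the first 15 digits using the ISO 7064 11,2 algorithm described in ORCID’s support documentation. A small Python sketch, using ORCID’s published sample iD 0000-0002-1825-0097:

```python
def orcid_checksum(base_digits):
    """Compute the final check character of an ORCID iD (ISO 7064 11,2).

    base_digits: the first 15 digits of the iD; hyphens are allowed.
    """
    total = 0
    for ch in base_digits.replace("-", ""):
        total = (total + int(ch)) * 2
    result = (12 - total % 11) % 11
    # A result of 10 is written as the letter X
    return "X" if result == 10 else str(result)

# First 15 digits of ORCID's sample iD 0000-0002-1825-0097
print(orcid_checksum("0000-0002-1825-009"))  # → 7
```

This is why systems can catch most mistyped ORCIDs immediately: a single wrong digit almost always produces a mismatched check character.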

Thanks for reading!

…if you would like to know more – then come along to one of our re-scheduled training sessions: 3rd July, 10.30-11.30, face to face in Library Seminar Room 1; or 4th July, 10.30-11.30, in Library Seminar Room 1 or online via Adobe Connect.

My Learning Centre Registration: Claiming your research publications: ORCIDs at the OU.

Posted in Library research support, ORCID | Leave a comment

ORDO best practice #1 Documenting data

Over the coming months I’m going to focus on some examples of best practice on ORDO. The creators of all the items in this series will receive a reusable Figshare coffee cup as a way of thanks and congratulations.

The first items I’m going to focus on are the OpenMARS Database datasets (https://doi.org/10.21954/ou.rd.c.4278950.v1), deposited by James Holmes (STEM) earlier this year. From the data record:

“The Open access to Mars Assimilated Remote Soundings (OpenMARS) database is a reanalysis product combining past spacecraft observations with a state-of-the-art Mars Global Circulation Model (GCM). The OpenMARS product is a global surface/atmosphere reference database of key variables for multiple Mars years.”

Since their deposit in February, these datasets have been downloaded a total of 291 times, making them some of the most popular items on ORDO. This is a fine reward for all the hard work that went into preparing them for sharing.

What’s so good about them?

There are four datasets, which are published individually and also grouped together as a collection. The most impressive thing about them is the excellent documentation that accompanies them:

  • On the landing page for each dataset is a description, which clearly details the provenance of the dataset and information about the OpenMARS project
  • Each dataset has a PDF reference manual that can be read in the browser. As the datasets are large (~25GB each) and use a file format (.nc) that requires specialist software and does not display in the browser, this means users can decide whether the data is useful before downloading
  • The documentation within the reference manual is very detailed and includes information on access (using a sample Python script included in the dataset), structure of the dataset, provenance and quality assurance
  • The datasets clearly reference the funding body – the European Union’s Horizon 2020 Research and Innovation programme

Is it FAIR?

The gold standard for research data is that it should be FAIR – Findable, Accessible, Interoperable and Re-usable. These datasets fulfil all but one of the criteria detailed in Sarah Jones and Marjan Grootveld’s FAIR data checklist (original version at https://doi.org/10.5281/zenodo.1065991). They only fall down on the fact that the data are not in a widely available format, but considering the nature of the data this would be very difficult to achieve, and since the reference manuals are very accessible, the issue is mitigated. See the completed checklist.

And finally, a word from James…

‘Adding datasets produced by our team at the Open University that will be of interest to multiple different users was really simple to do using the ORDO system, and the team that manage it were very helpful if I had any questions during the process. Thanks!’

Posted in Best practice in ORDO, ORDO, Research Data Management | Leave a comment

CANCELLED – Shut Up and Write sessions for postgraduate researchers (PGRs)

*Edit – 28.05.19 – Just to let you know that, unfortunately, the Shut Up and Write pilot has been cancelled due to extremely low interest.

This means that there will be no Shut Up and Write on Wednesday 29th May or Wednesday 5th June.

Please accept our apologies for any inconvenience caused.

We will investigate whether to try running it at a different time later in the year.

If you have any feedback, please contact library-research-support@open.ac.uk *

Library Services are starting Shut Up and Write sessions for postgraduate researchers (PGRs) on campus in Milton Keynes*. Sessions involve meeting with other PGRs in the Library building, writing for 25 minutes at a time, then taking a 5-minute break. The idea is to make academic writing more productive and social.

If you are a PGR then simply turn up, bringing anything you need to write and to make yourself comfortable.

The first session is Wednesday 1st May, 13.00-15.00, using desks on the second floor of the Library. Signs will be put up on the day to guide you.

Subsequently, sessions will take place every Wednesday, 13.00-15.00 in the same place (unless notified otherwise). This will run on a pilot basis for 6 weeks in the first instance. If successful, Shut Up and Write will be continued.

Contact library-research-support@open.ac.uk if you have any questions.

*Details of the Betty Boothroyd Library’s location can be found on our Contact us page and on the campus map.

Posted in Library research support, Research Students | Tagged , , , , , | Leave a comment