We analysed 145,000 data availability statements (DASs) submitted by research authors to 176 Wiley journals between 2013 and 2020, drawing on the same dataset we previously used to identify the impact of new journal policies on trends in the use of DASs. We looked at the URLs and DOIs contained within those DASs to identify the most common repositories (and other locations) used by researchers to store and share the new research data they create. We resolved the DOIs and captured their destinations, along with those of the URLs. We mapped destinations to research disciplines and ranked them to show the data services and repositories most often used by researchers who choose to submit to Wiley journals. We share this information as source data and in dynamic figures here, as inspiration and direction for journal teams and research authors.
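The extraction step described above can be sketched in a few lines of Python. This is a minimal illustration, not the pipeline we used: the regular expressions and helper names are assumptions, and real DOI resolution means following HTTP redirects from the doi.org handle service, which is only hinted at here by constructing the resolver URL.

```python
import re
from urllib.parse import urlparse

# Illustrative patterns: a DOI prefix is '10.' plus a 4-9 digit registrant code.
DOI_RE = re.compile(r"\b10\.\d{4,9}/[^\s\"<>]+")
URL_RE = re.compile(r"https?://[^\s\"<>]+")

def extract_locations(das):
    """Pull DOIs and non-resolver URLs out of a data availability statement."""
    dois = DOI_RE.findall(das)
    # Keep direct URLs only; doi.org links are already covered by the DOI list.
    urls = [u.rstrip(".,;)") for u in URL_RE.findall(das) if "doi.org" not in u]
    return dois, urls

def resolver_url(doi):
    # DOIs resolve to their destination via the doi.org handle service.
    return "https://doi.org/" + doi

def repository_host(url):
    # Map a destination URL to its hostname, e.g. 'datadryad.org',
    # as a first step towards ranking repositories.
    return urlparse(url).netloc.lower()
```

In a full pipeline, `repository_host` would feed a lookup table mapping hostnames to named repositories and disciplines.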
Data availability statements can provide useful information about how researchers actually share research data. We used unsupervised machine learning to analyse 124,000 data availability statements submitted by research authors to 176 Wiley journals between 2013 and 2019. We categorised the data availability statements and looked at trends over time. We found expected increases in the number of data availability statements submitted over time, and marked increases that correlate with policy changes made by journals. Our open data challenge is now to use what we have learned to present researchers with relevant and easy options that help them to share, and make an impact with, new research data.
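The idea of categorising statements without labelled training data can be sketched with a toy k-means clustering over word-count vectors. This is a stdlib-only illustration of the general technique, not the model we used; the example statements and the deterministic seeding are assumptions made to keep it self-contained.

```python
import re
from collections import Counter

def tokenise(text):
    return re.findall(r"[a-z]+", text.lower())

def vectorise(docs):
    """Bag-of-words count vectors over the shared vocabulary."""
    vocab = sorted({w for d in docs for w in tokenise(d)})
    return [[Counter(tokenise(d)).get(w, 0) for w in vocab] for d in docs]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def kmeans(vecs, k, iters=20):
    """Plain k-means; seeds deterministically from the first k vectors."""
    centroids = [list(v) for v in vecs[:k]]
    labels = [0] * len(vecs)
    for _ in range(iters):
        for i, v in enumerate(vecs):
            labels[i] = min(range(k), key=lambda c: dist2(v, centroids[c]))
        for c in range(k):
            members = [v for v, l in zip(vecs, labels) if l == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return labels

# Hypothetical statements: two repository-style, two available-on-request.
statements = [
    "data openly available in a public repository",
    "data available on request from the corresponding author",
    "data available in a public repository that issues dois",
    "data available on request due to privacy restrictions",
]
labels = kmeans(vectorise(statements), k=2)
```

With these inputs the two repository statements land in one cluster and the two on-request statements in the other; a production approach would use TF-IDF weighting and a much larger vocabulary.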
Aim: We wanted to understand how well journal teams, comprising editors, managing editors, reviewers and publishers, perform across five Essential Areas of peer review, according to a self-assessment of their own editorial and peer review processes. We also wanted to identify and share the best practices that journals use and to recognise potential obstacles that could be overcome. Methods: Journals used a Self-Assessment tool to assess their peer review processes, answering questions and giving themselves a quantitative score and a qualitative explanation for their rating across the five ‘Essential Areas’ of Integrity, Ethics, Fairness, Usefulness and Timeliness. Wiley colleagues independently rated the journals to distinguish best practices and identify potential obstacles. Results: We examined the responses of 132 journals that completed the Self-Assessment exercise. Journals tended to rate themselves more highly than the study authors did. The greatest variation between journal self-rating (SA-score) and the study authors’ rating (R-score) was in the Essential Area of Usefulness; the smallest was in the area of Ethics. We identified a set of best practices that could help improve peer review in each of the Essential Areas. Conclusion: The Self-Assessment encourages journals to reflect on and change their peer review processes and offers practical guidance on how to do this. Journals benefit from greater awareness of the technical solutions that exist to help them. The Self-Assessment also highlights how journals can be inconsistent in the way their processes operate, with one policy in place for authors and a different policy, or none, in place for reviewers and editors. Rather than being content with the status quo, journals should strive to improve processes in the light of changing community expectations and technological advances.
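The SA-score versus R-score comparison amounts to measuring, per Essential Area, how far self-ratings sit from independent ratings. A minimal sketch of that calculation follows; the scores and the 1-5 scale are entirely hypothetical, invented for illustration, and are not data from the study.

```python
from statistics import mean

# Hypothetical 1-5 scores for two journals; illustrative only.
sa_scores = {"Integrity": [5, 4], "Ethics": [4, 4], "Fairness": [5, 5],
             "Usefulness": [5, 5], "Timeliness": [4, 4]}
r_scores = {"Integrity": [4, 4], "Ethics": [4, 4], "Fairness": [4, 4],
            "Usefulness": [3, 3], "Timeliness": [3, 4]}

def rating_gap(sa, r):
    """Mean absolute difference between SA-score and R-score per Essential Area."""
    return {area: mean(abs(a - b) for a, b in zip(sa[area], r[area]))
            for area in sa}

gaps = rating_gap(sa_scores, r_scores)
```

With these made-up numbers the largest gap falls in Usefulness and the smallest in Ethics, mirroring the shape of the reported finding.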
It is easy to argue that open data is critical to enabling faster and more effective research discovery. In this article we describe the approach we have taken at Wiley to support open data and FAIR (Findable, Accessible, Interoperable and Reusable) data with the implementation of four data policies: “Encourages”, “Expects”, “Mandates” and “Mandates and Peer Reviews Data.” We describe the rationale for these policies and levels of adoption so far. In the coming months we plan to measure and monitor the implementation of these policies via the publication of data availability statements and data citations. With this information, we’ll be able to celebrate adoption of data-sharing practices by the research communities we work with and serve, and we hope to showcase researchers from those communities leading in open research.