MODERATOR:
Jonathan Schultz
Editor, Science Editor
Director
Journal Operations
American Heart Association
Baltimore, Maryland
SPEAKERS:
Sarah Brooks
Interim Associate Editor
American Journal of Political Science
Professor
The Ohio State University
Columbus, Ohio
Sowmya Swaminathan
Head of Editorial Policy
Nature Research
Berkeley, California
Hashi Wijayatilake
Managing Editor
PLOS Biology
REPORTER:
Dana Compton
Editorial Director
American Society of Civil Engineers
Reston, Virginia
The topic of this year’s Science Editor Symposium at the Council of Science Editors (CSE) Annual Meeting was “Reproducibility & Reporting Guidelines.” Speakers in this session described new initiatives their journals and organizations are taking to help ensure the research they publish is rigorous, accessible, and reproducible.
As described in a feature article in this issue of Science Editor, Dr Sowmya Swaminathan gave an excellent summary of the Nature Research journals’ experience with reproducibility initiatives such as checklists for transparent reporting, peer review of code, and registered reports. If you haven’t yet read the article Dr Swaminathan authored with her colleagues, “Three Approaches to Support Reproducible Research,” I highly encourage you to do so.
We learned during the session at the CSE meeting, as one would expect, that different journals are taking different approaches to these issues. Perhaps the most stringent and thorough policy is the American Journal of Political Science (AJPS) Verification Policy, which Dr Sarah Brooks, AJPS Editor, described to the audience. AJPS is a high-impact journal, ranked at the top of the 50 highest-impact political science journals.
Efforts toward replicability and verification became a focus in political science in 1995, with the publication of Gary King’s “Replication, Replication.”1 AJPS’ own efforts have progressed steadily since then. In 1995, the editors first began to request that authors make their data publicly available. In 2012, they implemented a requirement that authors upload their datasets to the AJPS Dataverse. Most recently, in 2016, they established replication guidelines requiring external verification as a condition of publication.
Under its current policy, no study will be published in AJPS before verification by an independent third party.2 For quantitative analysis, AJPS relies on the Odum Institute for Research in Social Science,3 at the University of North Carolina, Chapel Hill. For qualitative analyses, the process is conducted by the Qualitative Data Repository4 at Syracuse University. When verification has been achieved, the replication dataset is awarded open science badges for “open materials” or “open data,” and the accepted paper can move forward to publication.
Dr Brooks acknowledged that AJPS’ stringent policy comes with both costs and benefits. On the one hand, the policy places a heavy documentation burden on authors, who often need multiple resubmissions before their analyses can be verified. When asked about replication failure, Dr Brooks indicated that the Odum Institute will typically continue working back and forth with the authors to resolve any issues until replication is achieved. This process inevitably adds time to publication: the resulting delay averages 50–65 days, some of which is understandably due to author response time. Demands on editorial office staff are also increased.
On the other hand, Dr Brooks said, the AJPS policy is good for science, especially political science. It establishes a high bar for analytical rigor and produces datasets for replication as well as teaching purposes. For AJPS, these benefits outweigh the inherent costs.
Dr Brooks also outlined some of the challenges AJPS has faced since implementing its policy, including limitations of computational reproducibility and terminological confusion. She warned that replication should not be allowed to distract from other serious data issues. It’s essential to ensure data quality before worrying too much about replicability. As the saying goes, “garbage in, garbage out.”
Countering Publication Bias
PLOS Biology is taking a multi-pronged approach to promoting reproducibility and reporting, which Dr Hashi Wijayatilake, Managing Editor, described to the audience. Like AJPS, PLOS Biology is a highly selective journal. Its efforts are aimed at countering publication bias and promoting open research practices and an open publication process.
Dr Wijayatilake summarized two methods by which PLOS Biology hopes to counter publication bias. The first is its Complementary Research Policy.5 Under this policy, the editors commit that “scooped” manuscripts will still be considered for publication if they confirm, replicate, extend, or are complementary to a study published within the previous 6 months. Such manuscripts must not be derivative, but rather independent studies relying on their own data. At the heart of this policy is the notion of “the importance of being second,” as described in an editorial by the journal’s staff editors.6 The value of these manuscripts lies in organic replication, which may be even better than a post-hoc replication study.
Another effort to counter publication bias is PLOS Biology’s upcoming launch of Registered Reports in collaboration with the CHDI Foundation7 (a not-for-profit organization that focuses on Huntington’s disease research and drug development). Study proposals are assessed on experimental design and on plans for ethical approval and data sharing, among other criteria. If a registered report passes peer review, PLOS Biology commits to publishing it regardless of the study’s outcome. This removes the pressure to achieve a particular result, a pressure that can be toxic and can undermine rigor and replicability.
Open Research Practices
Dr Wijayatilake discussed a number of PLOS Biology policies in support of open research:
- a data policy that requires authors to make all data underlying their findings fully available without restriction at the time of publication;
- a materials sharing policy under which the journal strongly encourages deposition of materials in repositories;
- strong encouragement for authors to use Research Resource Identifiers (RRIDs) to cite and uniquely identify research resources; and
- a partnership with protocols.io that enables authors to share protocols and methodological details, which are then linked directly from the Methods sections of their articles.
Open Publication Process
In support of open publication, PLOS Biology is an official partner with bioRxiv, which enables automatic preprint posting of submitted research articles for authors who opt in during the PLOS submission process. Conversely, authors posting preprints to bioRxiv may choose to concurrently submit to the PLOS journals through a transfer service. The PLOS journals have also launched published peer review.8 In this model, authors may choose at acceptance whether to publish the peer review history for their paper. Reviewers may choose whether to reveal their identities.
By the end of the session, attendees had heard about a broad array of initiatives undertaken by three selective, high-impact journal publishers. The efforts presented by Dr Swaminathan, Dr Brooks, and Dr Wijayatilake ranged in complexity and stringency, but all are aimed at ensuring their journals publish the most rigorous research possible. Despite drawbacks such as extra work for authors and delays to the publication timeline, these publishers have not observed harm to their journals. They have concluded that the positive impact on science outweighs the associated costs.
References and Links
1. https://gking.harvard.edu/files/abs/replication-abs.shtml
2. https://ajps.org/2019/05/22/verification-verification/
3. https://odum.unc.edu/
4. https://qdr.syr.edu/
5. https://journals.plos.org/plosbiology/s/journal-information
6. https://doi.org/10.1371/journal.pbio.2005203
7. https://chdifoundation.org/
8. https://blogs.plos.org/plos/2019/05/plos-journals-now-open-for-published-peer-review/