Nosek BA, et al. SCIENTIFIC STANDARDS. Promoting an open research culture. Science. 2015 Jun 26;348(6242):1422-5. PubMed.


Comments

  1. The TOP Guidelines are expressly designed with journals’ needs in mind. We invite journals to sign on at any of three levels of support because we wanted them to be able to express some level of support immediately, with minimal changes to the way they already do business. Thus, the barrier to entry for supporting the guidelines is low. Over time, as the overall culture in science changes (which we believe it is already doing), we expect journals will become comfortable requiring additional measures by authors and reviewers to support reproducibility. These practices could include requiring pre-registration of study hypotheses and analytic plans, and/or making software code for analyses and raw data available in a third-party repository. Among the most stringent practices would be requiring that results be reproduced by an independent party prior to publication.

    We are extremely heartened to see such widespread support for the TOP Guidelines so quickly, which tells us the field was ready to receive them. In just a few weeks since we began reaching out, 114 journals and 41 organizations have backed the guidelines as signatories. Many more are considering signing on but must go through formal (and sometimes lengthy) approval procedures before expressing public support. All of us on the committee are strong believers that reproducibility is a critical feature of rigorous science and that these guidelines, and others like them, will be instrumental in strengthening the practice of science. The “Principles and Guidelines for Reporting Preclinical Research” sponsored by NIH, Science, and Nature have paved the way for guidelines in other areas. Whereas those guidelines are aimed at preclinical researchers, the TOP Guidelines apply to all domains of science—basic and applied, biological, social and behavioral, and beyond—because the principles underlying them apply equally to all branches of science. With buy-in from such a long list of scientific organizations and publications across so many different areas of science, including the leading journals in a number of fields, we cannot help but think that reproducible science is indeed well on its way to becoming de rigueur, which is good for all of us.

    View all comments by Patricia Mabry
  2. We think these recommendations are very good and, if widely adopted, could benefit science in general and research in neurodegenerative diseases in particular. One area that has been problematic is the use of animal models in Alzheimer’s disease (AD) research, because some publications, and even clinical trials, have moved forward with neither confirmation of initial results nor disclosure of negative results that might have cautioned others not to pursue certain avenues. Sometimes trials have moved forward even without strong evidence for target engagement by the drug or molecule being tested. Underpowered, poorly designed, unblinded initial animal studies have sometimes led investigators to false conclusions, and because investigators do not always include complete methodology or negative results in their publications, others cannot replicate the studies.

    NIH has been addressing these issues for years and has tried to educate the community and peer reviewers to be more aware of these problems. NIA is in the process of setting up a new facility, a Preclinical Database, to catalog both positive and negative results from animal studies that examine potential clinical interventions for Alzheimer’s disease. This should be operational in the near future, and we will be encouraging scientists to submit their results for inclusion in this database, which will be publicly available. The International Alzheimer’s Disease Research Funders Consortium has set up a Working Group on Reproducibility as well. NINDS is currently recruiting a Rigor and Reproducibility Officer, who will create programs to address the reproducibility issue and increase oversight of the research the institute funds. Other NIH institutes are also taking steps to address the situation.

    View all comments by Creighton Phelps
  3. There certainly is a great need to promote reproducibility within the scientific literature. As the authors note, reproducibility problems are likely due to the various incentive systems that drive researchers, including those of institutions and universities, granting agencies, and publishers. While the article focuses on standards for journals and publishers (certainly one place to start), there is as great, or perhaps an even greater, need for reforms in how institutions handle promotion and tenure, and in granting agencies. Publication in “high-impact” and “high-profile” journals is so highly valued by most committees on promotion and tenure, and by reviewers of grant applications, that it drives researchers to eschew publishing “negative” findings or findings with less impact, even though these are also critical for moving science forward. Reforms in publication requirements are unlikely to solve the problem completely without considerable, parallel reforms within institutions and granting agencies.

    View all comments by Bruce Lamb
  4. Transparency, openness, and reproducibility, as the authors emphasize in the first paragraph, are of critical importance to advancing all areas of scientific research, so I think everyone would embrace them. However, implementing strategies to promote these ideals is the problem, and subsequent paragraphs in this essay are on the mark as to why this is so. Notably, the lack of academic, publishing, and funding incentives to promote transparency, openness, and reproducibility is a key factor, or barrier, as the essay notes. The table summarizing the TOP guidelines is clear, and achieving level 3 would be desirable, but this is daunting to implement because of the costs involved, even if the scientific research culture were aligned to accept the most stringent standards. That said, in my view the Alzheimer’s Disease Neuroimaging Initiative (ADNI) comes closest among NIH-funded research studies to approaching, though not completely meeting, level 3 TOP guidelines (see Weiner et al., 2015). I do not know of any research programs that fully implement level 3 TOP guidelines, and I expect that the cost of doing so could double the NIH budget. For studies such as ADNI, which makes all the data it generates accessible to the public, the cost of efforts to promote transparency, openness, and reproducibility, while high, is very worthwhile, but this may not be the case for all research, as the essay notes. Thus, ADNI shows that the scientific culture, at least in the arena of Alzheimer’s disease biomarker research, is willing to embrace the concepts outlined in this essay, and in many ways this reflects the attitudes of the ADNI Core Leaders, who adopted the principles of openness and transparency as core ADNI values when they signed on in 2004.

    Indeed, this essay focuses on journals and publishing, but I think changing how academia measures and incentivizes success in research will be far more challenging than changing journal practices, since academia rewards the success of individuals, not teams, which is a significant barrier to promoting transparency, openness, and reproducibility.

    View all comments by John Trojanowski


This paper appears in the following:

News

  1. New Journal Guidelines Aim to Boost Transparency in Research