6 Toward Minimal Reporting Standards for Preclinical Biomedical Research
Pages 73-90



From page 73...
... • Many of the tools that support reproducible research are already available through institutional libraries (e.g., data sharing, checklists, preregistration, preprints, sharing code, sharing data, incentives, metrics), and existing research support staff are available to provide expert assistance.
From page 74...
... • Consider approaches to compare reporting of rigor elements proposed in grant applications to those included in publications.
• Suggest stakeholder actions to encourage transparent reporting and practical next steps toward establishing minimal reporting standards for preclinical biomedical research.
From page 75...
... Methodology Standards as a case example of an effort to develop minimal standards for the design, conduct, analysis, and reporting of research and the limitations of checklists in changing behavior.

WHAT TRANSPARENT REPORTING MEANS FOR REVIEWERS
Benedict Kolber, Associate Professor, Duquesne University

"Transparency will be the legacy of this rigor, reproducibility, transparency movement," Kolber said.
From page 76...
... IMPROVING ASSESSMENT OF REPRODUCIBILITY
Richard Nakamura, Former Director of the Center for Scientific Review, National Institutes of Health

Several factors have negatively impacted reproducibility in recent years, Nakamura said. As background, he said that after the congressional effort to double the total NIH budget over the course of 5 years ended in 2003, "all of science in the United States underwent somewhat of a recession." As a result, grant success rates were low and cuts to grant funding were high.
From page 77...
... To understand the potential impact of this increased burden on reviewers, the Center for Scientific Review surveyed reviewers about the extent to which they look at the primary literature cited by grant applicants. Nakamura said that 90 percent of reviewers responded that they had checked the original papers cited.
From page 78...
... ENGAGING RESEARCH SUPPORT STAFF
Franklin Sayre, STEM Librarian, Thompson Rivers University

Basic and clinical researchers are supported by a cadre of research support staff, including statisticians, computer scientists, librarians,
From page 79...
... As a STEM librarian, Sayre said that he regularly works with graduate students and postdoctoral fellows who are seeking guidance on how to implement a required checklist, or who are interested in designing reproducible research. He described his role as happening within a "black box" that sits among research policy, incentives, and infrastructure on one side, and reproducible, rigorous research on the other.
From page 80...
... Many of the tools to support reproducible research are already available through institutional libraries, she said, such as institutional repositories and support for data management and data curation. In addition, libraries are "natural partners" with other research resources such as the institutional Office of Research, Clinical and Translational Science Awards Program Hubs, high-performance computing centers, and biostatistics cores.
From page 81...
... A second Research Reproducibility Conference was held in 2018, designed specifically to teach researchers the skills needed for reproducible research, including working with reporting guidelines and minimum reporting standards.

University of Florida

At the University of Florida, where Rethlefsen currently works, she is deploying the same strategy to identify existing resources, establish partnerships, and drive change.
From page 82...
... She emphasized that sustaining grassroots or "volunteer" efforts is challenging, and support from institutional leadership is needed for success.

APPLYING A SYSTEMATIC FRAMEWORK TO DEVELOPING MINIMAL REPORTING STANDARDS
Michael Keiser, Assistant Professor, University of California, San Francisco

Keiser shared his perspective on transparent reporting as an early career researcher, drawing on Platt's systematic and transparent approach to science, which Platt termed "strong inference" -- a model of inquiry that relies on alternative hypotheses rather than a single hypothesis to avoid bias (Platt, 1964)
From page 83...
... Transparent reporting should include information on the logic and reasoning that went into a study analysis, he said. Data science tools are already available to encode and share relevant information, including preregistration in Registered Reports, software version control using Git, data repositories through Zenodo, and logic models using Jupyter notebooks.
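The workshop summary does not detail those tools, but as a hedged illustration of what "encoding relevant information" can mean in practice, a provenance header at the top of a Jupyter notebook or analysis script might look like the following sketch (the package list and seed value are illustrative assumptions, not from the source):

```python
# Minimal provenance header for an analysis notebook: record interpreter
# and package versions and fix a random seed so stochastic steps can be
# re-run under the same conditions (illustrative sketch only).
import platform
import random
from importlib.metadata import PackageNotFoundError, version

ANALYSIS_PACKAGES = ["numpy", "pandas"]  # hypothetical dependency list


def report_environment() -> dict:
    """Collect version information for transparent reporting."""
    env = {"python": platform.python_version()}
    for pkg in ANALYSIS_PACKAGES:
        try:
            env[pkg] = version(pkg)
        except PackageNotFoundError:
            env[pkg] = "not installed"
    return env


SEED = 12345  # illustrative fixed seed, reported alongside the results
random.seed(SEED)

if __name__ == "__main__":
    for name, ver in report_environment().items():
        print(f"{name}: {ver}")
```

Committing such a header to a Git repository, and archiving the repository in Zenodo, is one way the tools he named can work together.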
From page 84...
... THE IMPACT OF MINIMAL STANDARDS ON IMPROVING METHODOLOGY
Steven Goodman, Professor of Medicine and Health Research and Policy and Co-Director of METRICS, Stanford University

Goodman briefly shared his perspective as a research educator on some of the critical gaps in the training of research scientists. Many laboratory scientists, early career as well as some senior investigators, have a limited understanding of the "basic elements and formal logic and purpose of experimental design," he said, including blinding, randomization, sample size determination, and other aspects.
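As a concrete illustration of two of those design elements, a short sketch of randomized allocation with a blinding code might look like this (the subject count, group names, and seed are illustrative assumptions, not from the source):

```python
# Sketch: randomized allocation plus blinding codes, so the experimenter
# never sees group names directly during data collection.
import random

subjects = [f"animal_{i:02d}" for i in range(1, 13)]  # 12 hypothetical animals
groups = ["treatment", "control"]

rng = random.Random(2019)  # fixed, reported seed makes the allocation auditable
assignments = groups * (len(subjects) // len(groups))  # balanced 6-and-6 design
rng.shuffle(assignments)

# The unblinded key is held by a third party until analysis is complete.
unblinded_key = dict(zip(subjects, assignments))

# Experimenters work only from opaque codes ("A"/"B"), not group names.
code_for = {"treatment": "A", "control": "B"}
blinded_labels = {subj: code_for[grp] for subj, grp in unblinded_key.items()}
print(blinded_labels)
```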
From page 85...
... Methods used for imputing missing data should produce valid confidence intervals and permit unbiased inferences.… Single imputation methods, such as last observation carried forward, baseline …

BOX 6-2
Patient-Centered Outcomes Research Institute (PCORI) Methodology Standards Topic Areas

Cross-Cutting Standards
• Formulating Research Questions
• Patient-Centeredness
• Data Integrity and Rigorous Analyses
• Preventing and Handling Missing Data
• Heterogeneity of Treatment Effects

Design-Specific Standards
• Data Registries
• Data Networks
• Causal Inference Methods
• Adaptive and Bayesian Trial Designs
• Studies of Medical Tests
• Systematic Reviews
• Research Designs Using Clusters
• Studies of Complex Interventions
• Qualitative Methods
• Mixed Methods Research
• Individual Participant-Level Data Meta-Analysis (IPD-MA)
From page 86...
... However, they are not necessarily easy to assess. As an example, he challenged participants to consider exactly how they might assess compliance with the standard that reads, "Single imputation methods, such as last observation carried forward, baseline observation carried forward, and mean value imputation, are discouraged." He added that assessing applicable standards can require "a fair amount of sophisticated judgment." The adherence of final reports to the PCORI Methodology Standards was evaluated and presented at the Eighth International Congress on Peer Review and Scientific Publication (Mayo-Wilson et al., 2017)
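A small sketch makes the discouraged practice concrete: last observation carried forward (LOCF) fills a missing value with the last observed one and then treats it as real data, which understates uncertainty downstream. The toy dataset below is an illustrative assumption, not from the source.

```python
# Sketch: why the PCORI standard discourages single imputation such as LOCF.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "subject": ["A", "A", "A", "B", "B", "B"],
    "week":    [1, 2, 3, 1, 2, 3],
    "outcome": [10.0, 12.0, 13.0, 9.0, 11.0, np.nan],  # subject B drops out
})

# The discouraged single-imputation method: carry the last value forward.
df["locf"] = df.groupby("subject")["outcome"].ffill()
print(df)

# Subject B's week-3 value becomes 11.0 and is treated as observed, so any
# confidence interval computed from the filled column is too narrow; multiple
# imputation instead draws several plausible values and pools the estimates.
```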
From page 87...
... can serve as reminders, but they are not sufficient for solving adaptive sociocultural problems and do not substitute for knowledge or understanding. In the absence of knowledge and understanding, enforcing minimal reporting standards may require significant effort and produce limited results.
From page 88...
... Deborah Sweet suggested that involving trainees and postdoctoral fellows in the review process would be helpful, given that they are the scientists actually carrying out the laboratory experiments and therefore best suited to determine whether a manuscript provides sufficient information to allow them to reproduce or replicate the study.

Addressing Underpowered In Vivo Studies

Thomas Curran asserted that it is "unethical to conduct a bad animal experiment." He reiterated a point made several times during the workshop that researchers may add an underpowered animal study or use an inappropriate animal model in response to a request by a peer reviewer.
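To make "underpowered" concrete, a prospective sample-size calculation shows how many animals per group a design needs before the experiment is run. A minimal sketch using statsmodels follows; the effect size, significance level, and power target are illustrative assumptions, not values from the workshop.

```python
# Sketch: prospective sample-size calculation for a two-group animal study.
# A study is underpowered when its group sizes are too small to reliably
# detect the effect of interest at the chosen significance level.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.8,         # assumed standardized effect (Cohen's d)
    alpha=0.05,              # two-sided significance level
    power=0.80,              # conventional 80% power target
    ratio=1.0,               # equal group sizes
    alternative="two-sided",
)
print(f"animals needed per group: {n_per_group:.1f}")  # roughly 26 per group
```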
From page 89...
... He described a case example in which a paper in a high-profile journal was retracted due to concerns about a single image in a panel of dozens. He postulated that the image, related to an animal experiment, may have been added in response to peer review.
From page 90...
... problems of irreproducibility, somewhat similar to the "bug bounties" used to identify security vulnerabilities in technology products and services, he said. He suggested that training grants could cover attempts by trainees to reproduce studies in their field of research and could even require it as a way to enhance training in rigorous research.

