Practicalities of Conducting Replication Studies in Preclinical Cancer Biology
The first results from the Reproducibility Project: Cancer Biology offer insights into the time and cost of conducting replication studies in preclinical cancer biology.
While many agree that a therapeutic should not be pursued unless it is grounded in reproducible preclinical results, the costs of replication have, to date, not been factored in when research is funded. With this first estimate of the resources required, the Reproducibility Project: Cancer Biology lays the foundation for a research funding structure that could provide appropriate incentives for completing replication studies.
Unlike other assessments of reproducibility, the results of the Reproducibility Project: Cancer Biology are an open, growing dataset of systematic experiments. This open format is necessary for constructive discussion of reproducibility among researchers, industry leaders, policy makers, funding agencies and regulatory agencies that will shape the way preclinical research is done.
The cost of irreproducible research, especially in preclinical biology, has recently garnered considerable attention. Irreproducible research expends financial and personnel resources and makes it difficult to distinguish truly significant advances from findings that cannot be reproduced. Freedman et al (1) estimate that approximately $28 billion is spent each year in the United States on preclinical research that yields results that cannot be reproduced.
Still unknown, however, is the cost of performing the independent replication studies themselves, which could help to avoid wasted resources and to determine which scientific findings are best suited for the development of novel therapeutics. The Reproducibility Project: Cancer Biology (RP:CB) is a collaboration between Science Exchange and the Center for Open Science (COS) to independently replicate key experiments from high-impact, published cancer biology studies (2). It was initiated in response to multiple reports from the pharmaceutical industry indicating that more than 70% of published findings could not be reproduced (3,4).
The first five replication studies were recently published in eLife (https://elifesciences.org/collections/reproducibility-project-cancer-biology) and represent a major landmark in the reproducibility discussion. These are the first replications to generate an open dataset that can be used to examine the rate of reproducibility in this field and to study the factors associated with the reproducibility of experimental results.
The project also gives insight into the time and cost of undertaking independent replication studies. The goal of this commentary is to use this first sample of studies to discuss the practicalities of conducting independent replications and the most common roadblocks that affected how long they took to complete.
Science Exchange and independent replication
One of the key factors determining the validity of the RP:CB was the use of highly qualified scientists to perform the replication studies. Science Exchange manages a network of more than 3,000 screened and verified contract research organisations (CROs), academic labs and government facilities that are available to conduct experiments on behalf of scientists.
While Science Exchange primarily functions as a marketplace through which a broad array of researchers, from large biopharmaceutical companies to academic labs, can outsource experimental research, it also serves as a unique venue for reproducibility initiatives. An important application for clients of Science Exchange is the ability to have replication studies run independently by capable laboratories.
Science Exchange has been involved in various reproducibility initiatives, including antibody and reagent validation, the Movember Foundation-Prostate Cancer Foundation Reproducibility Initiative (5,6), the Reproducibility Initiative (7) and a Gates Foundation-funded 3ie programme supporting the validation of HIV prevention research. As with these other initiatives, the RP:CB tapped into the extensive Science Exchange network to use the existing capabilities of experienced and completely independent labs for this large set of reproducibility studies.
Practical components of independent replication
The timeline for the RP:CB replications began upon initial contact with the original authors. The RP:CB Core Team requested comments on the proposed protocols, raw data from the original experiments and reagents for the replications themselves. The requested information included vendor and catalogue numbers for specific reagents and details about protocols that were not published in the original manuscript.
The RP:CB Core Team also requested relevant cell lines, plasmid constructs, peptides or antibodies, when these were available, from the authors of the original study. The goal was to reduce the number of potential sources of variation from the original study as much as possible. The average time to receive protocol information and/or data was 81 (±21) days (Table 1).
This two- to three-month delay in sharing information and data was typically the result of information decay and personnel changes in the original laboratory. In the time between the original publication and our outreach, the corresponding author was often no longer working in the same laboratory and may not still have had access to the data or reagents.
Once both the replicating labs and the original authors had the opportunity to comment on the proposed experimental work, the Registered Reports were submitted to eLife for peer review. The average time between the initial submission and acceptance of each Registered Report was 92 (±17) days.
We identified 11 service providers to perform the experimental work for the seven studies, with each replication study requiring between two and five experimental services. There was often a delay between the time that the protocols were approved through peer review and when the replicating lab received all of the necessary materials and reagents to begin experimental work.
Once experimental work began, projects lasted, on average, 192 (±84) days (approximately 6.5 months). Six of these replication studies included an in vivo experimental component. The greater-than-six-month experimental period for these seven replications included delays such as optimising the timing of IVIS imaging of tumours (8) and confirming the expression of mutations (9). The costs of the first five replications ranged from $11,700 to $65,940, averaging $33,700.
Discussion and conclusions
While other authors have helped to sound the alarm about irreproducible research, the RP:CB is the first study of its kind to make replication results openly available to the scientific community. This open dataset allows each community member to re-analyse the data, evaluate the quality-control checks that were performed and vet the conclusions for themselves in a very tangible way. This degree of open access is unique for published research in the biological sciences.
It is important to highlight that these are the first of several replications to be published. The RP:CB Core Team will perform a meta-analysis of all of the final reports to identify the factors that are associated with both reproducible and irreproducible studies.
Preliminary results suggest that reproducibility can be hampered by a lack of detail in the materials and methods of the original studies. For example, exact conditions for compound synthesis, precise protocols for xenograft injection, detailed methods for tissue homogenisation and extraction, conditions for western blotting and detailed conditions for reverse transcription-polymerase chain reaction (RT-PCR), if omitted from the original manuscript, must be left to the discretion of the replicating lab.
Solutions to reduce the time and barriers involved in conducting replications include making raw data and full protocols available with the publication, so that authors following up on a finding are not required to request this information. In addition, depositing unique reagents that are not commercially available in repositories such as Addgene, JAX and ATCC would reduce barriers for follow-on studies as well as replication studies. Lastly, introducing efficiencies that speed up the peer review process could avoid delays during publication.
In addition to the forthcoming meta-analysis of factors contributing to reproducibility, the RP:CB will publish a study highlighting the timing, costs and roadblocks encountered over the course of the entire project.
These studies show that the important work of replication needs to be done and should follow automatically from any exciting new finding. It is clear from the first set of replications that the results did not always align with the original publications. Given that most of the high-impact studies included in the RP:CB likely represent many years of trial and error and protocol optimisation, the costs of conducting replications are a small fraction of the grant awards that likely fuelled the original papers.
The costs associated with these replications are reasonable estimates for funding agencies to use for studies of this scope as they begin to make replication plans a required component of grant applications. The NIH issued a notice in 2015 stating that the “NIH and AHRQ plans to require formal instruction in scientific rigour and transparency to enhance reproducibility for all individuals supported by institutional training grants, institutional career development awards, or individual fellowships” beginning in early 2017 (NIH NOT-OD-16-034).
This notice falls short of requiring evidence of independent replication for grant applications. In order for replication studies to become part of the framework of science, funding needs to be allocated so that principal investigators are able to include independent replication studies as part of their ongoing research programmes.
Finally, the ultimate consequence of irreproducible preclinical studies is a very low follow-on success rate for clinical trials in which human subjects volunteer to participate. The success rate of company-sponsored, FDA registration-enabling development programmes progressing from Phase I to FDA approval between 2006 and 2015 was only 9.6% across all disease areas, and for oncology the success rate was even lower (5.1% (10)). These figures are lower than those in a previous report by Hay et al (11), which measured a 10.6% overall success rate and 7% for oncology drugs.
The important and compelling by-product of more robust and reproducible pre-clinical studies will be more efficient use of the total life science R&D spend ($71.1 billion in 2016 (12)) and presumably higher success rates of drugs to treat disease. DDW
—
This article originally featured in the DDW Winter 2017/18 Issue
—
Dr Nicole Perfito is a Core Team member for the Reproducibility Project: Cancer Biology and has contributed to other reproducibility projects, including the Prostate Cancer Foundation-Movember Foundation Reproducibility Initiative. She completed her PhD in Biology at the University of Washington, and her research at Princeton, the Max Planck Institute and the University of California, Berkeley focused on reproductive physiology and neuroendocrinology. Currently, Nicole leads a team of staff scientists to deliver outsourced experimental services to leading R&D organisations.
Dr Rachel Tsui has a PhD in Biochemistry from UC San Diego, focusing on both signal transduction and computational biochemistry, and a BA in Chemistry from Boston University. At Science Exchange, Rachel collaborates with procurement and R&D leadership to optimise research outsourcing. Before Science Exchange, Rachel worked as a sales and marketing consultant in the pharmaceutical and healthcare space for ZS Associates, with a focus on marketing strategy for pipeline and launched oncology drugs.
Dr Elizabeth Iorns is the Founder & CEO of Science Exchange, the Co-Director of the Reproducibility Initiative and is a part-time partner at Y Combinator. Elizabeth has a PhD in Cancer Biology from the Institute of Cancer Research (UK), and before starting Science Exchange in 2011 was an Assistant Professor at the University of Miami (where she remains an Adjunct Professor). Elizabeth has received a range of honours and recognition, including the Kauffman Foundation Emerging Entrepreneur Award, one of Nature Magazine’s ‘Ten People Who Mattered’ and one of WIRED’s ‘50 Women Who Are Changing The World’. Elizabeth is focused on the development of innovative models to promote the quality and efficiency of scientific research.
References
1 Freedman, LP, Cockburn, IM, Simcoe, TS. 2015. The Economics of Reproducibility in Preclinical Research. PLoS Biol 13(6): e1002165. doi:10.1371/journal.pbio.1002165.
2 Errington, TM, Iorns, E, Gunn, W, Tan, FE, Lomax, J, Nosek, BA. 2014. An Open Investigation of the Reproducibility of Cancer Biology Research. eLife 3: e04333. doi:10.7554/eLife.04333.
3 Begley, CG, Ellis, LM. 2012. Drug Development: Raise Standards for Preclinical Cancer Research. Nature 483(7391): 531–33. doi:10.1038/483531a.
4 Prinz, F, Schlange, T, Asadullah, K. 2011. Believe It or Not: How Much Can We Rely on Published Data on Potential Drug Targets? Nat Rev Drug Discov 10(9): 712. doi:10.1038/nrd3439-c1.
5 Chroscinski, D, Cherukeri, S, Tan, F, Perfito, N, Lomax, J, Iorns, E. 2015. Registered report: the androgen receptor induces a distinct transcriptional program in castration-resistant prostate cancer in man. PeerJ 3: e1231. https://doi.org/10.7717/peerj.1231.
6 Shan, X, Danet-Desnoyers, G, Fung, JJ, Kosaka, AH, Tan, F, Perfito, N, Lomax, J, Iorns, E. 2015. Registered report: androgen receptor splice variants determine taxane sensitivity in prostate cancer. PeerJ 3: e1232. https://doi.org/10.7717/peerj.1232.
7 Iorns, E, Gunn, W, Erath, J, Rodriguez, A, Zhou, J, Benzinou, M, The Reproducibility Initiative. 2014. Replication Attempt: “Effect of BMAP-28 Antimicrobial Peptides on Leishmania Major Promastigote and Amastigote Growth: Role of Leishmanolysin in Parasite Survival”. PLoS One 9(12): e114614. doi:10.1371/journal.pone.0114614.
8 Aird, F, Kandela, I, Mantis, C. Reproducibility Project: Cancer Biology. 2017. Replication study: BET bromodomain inhibition as a therapeutic strategy to target c-Myc. eLife 6:e21253.
9 Horrigan, SK, Courville, P, Sampsey, D, Zhou, F, Cai, S, Reproducibility Project: Cancer Biology. 2017. Replication study: Melanoma genome sequencing reveals frequent PREX2 mutations. eLife 6:e21634.
10 BIO, Biomedtracker, Amplion. 2016. Clinical Development Success Rates 2006-2015.
11 Hay, M, Thomas, DW, Craighead, JL, Economides, C, Rosenthal, J. 2014. Clinical development success rates for investigational drugs. Nat Biotech 32: 40-51. doi:10.1038/nbt.2786.
12 IRI, 2016. 2016 Global R&D Funding Forecast. R&D Magazine Winter 2016.