Could The Keys To Precision & Personalised Medicine Be Rooted In Predictive Safety & Research Methods?

By Robert G. Hunter

To meet the massive challenges of future healthcare, perhaps no two facets hold greater promise than biomarkers for precision medicine and systems biology for personalised patient care.

Although a high level of market fragmentation typically obscures the point, each of these solutions ultimately spans the vast terrain from drug discovery and development, to safety testing and clinical trials, to clinical practice.

However, there is a perspective from which these lifecycles can be viewed quite fully, and from which the two disciplines can in fact be seen to work together as mechanistic and predictive complements – namely the landscape of in vitro and in silico tools and methods rooted in the field of predictive toxicology and safety assessment.

Much of the promise of future precision medicine rests on robust biomarkers, and market forecasts continue to reflect high expectations, but actual clinical biomarker performance has been questionable and new approvals slow. What are the key issues with clinical biomarkers, and how do they relate to research?

According to biomarker and diagnostics expert Abdel-Baset Halim, PharmD, PhD, DABCC, FACB, the clinical lab community has a much higher awareness of the difficulties ahead and much lower expectations for the next five years. He reminds us that while most drug developers and physicians believe that most, if not all, biomarker assays can be standardised to produce consistent results, the clinical laboratory community recognises that even among decades-old tests, only a few qualify as standardised: total cholesterol, creatinine and glycosylated haemoglobin (1-3).

They know that widely divergent results are possible when tests are performed in different labs, using different methodologies or platforms, or even within the same lab using different lots of reagents. In fact, variability in results from LDTs (laboratory-developed tests) can be significantly less than that from lab-independent IVD (in vitro diagnostic) kits utilised by different labs. (Note the contrast to predominant industry and regulatory views.)

They understand that even in the best cases, such as the HER2 biomarker, the odds of identifying HER2-positive patient candidates can vary significantly. And this remains true despite substantial improvements in quality systems, oversight via certifications such as CLIA (Clinical Laboratory Improvement Amendments), and accreditation and proficiency testing provided by organisations such as CAP (College of American Pathologists). Halim cautions that this wide gap in expectations can severely impede future progress of molecular diagnostics and precision medicine.

Where in the research community do we find such terra firma to underpin our hope for future standards, and how could they possibly relate to the clinic?

New drug approvals are still anaemic despite ever-increasing budgets. The payoff from the genomics revolution remains unclear after more than a decade. Research results are largely not reproducible across the board, despite a plethora of analytical technologies and approaches, or perhaps as a result of this seemingly endless diversity. Can we hope to find clarifying leadership from regulators?

In fact we can, if we are willing to broaden our thinking a little further to include the environmental dimension. The most recent vision for precision medicine, as elucidated by the EU IMI and the US National Research Council, calls for integrating molecular, phenotypic and environmental data. And in fact, EU AXLR8 and US Tox21 are each well-funded, state-of-the-art programmes to develop and refine screening and testing, not only for chemicals but also for drugs. (Incidentally, the two have a demonstrated record of collaborating with each other and with similar efforts in Japan.)

How did this happen and why? Essentially there was no way to meet public demands for greater testing of environmental chemicals, increasing since the 1980s and 1990s, other than by creating these new standards based on in vitro and in silico technologies. And now, in addition to these regulatory uses, a host of drug research niches are using these same technologies for screening, safety assessments and prioritisation of new compounds. And standards of perspective and even of practice are beginning to emerge.

Can we envision developing standard methods that would not only expand these research niches, but also provide links to clinical biomarkers and other aspects of precision medicine? To answer this provocation, it helps to understand the research niches and industry superstructures already in place, and to focus on at least three questions: First, are they scientifically credible? Second, are they sustainable, technologically and economically? Third, is there a basis for agreement and evidence of results?

In response, yes – strong science is demonstrated across a wide range of technologies, including stem cells and primary cells, ’omics, imaging, liquid handling and automation, consumables, and bioinformatics (Figure 1).

Figure 1 In vitro and in silico technologies

Each of these is a thriving, dynamic market. In addition, cell-based assays (Figure 2) are an important segment that has been studied by many market researchers, and yet confusion has resulted from inconsistent treatment of kits versus home-brew assays, inclusion of consumables and equipment, and so on.

Figure 2 Overview of in vitro assays

Some themes have become key niches quite quickly, for example 3D cell culture. Although fairly fragmented, in total this market is substantial and growing at a very healthy rate, which we recently estimated at $5 billion in 2012, growing at roughly 15% per year to $10 billion in 2017 (Figure 3).

Figure 3 In vitro and in silico methods for toxicology revenue growth estimated at 15% CAGR
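
As a quick arithmetic check of that forecast, compounding $5 billion at 15% per year over the five years from 2012 to 2017 does land at roughly $10 billion:

```python
# Sanity check of the forecast: $5bn compounding at ~15% CAGR for 5 years.
base, cagr, years = 5.0, 0.15, 5
print(round(base * (1 + cagr) ** years, 2))  # -> 10.06 ($bn)
```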

The landscape consists of large multinational companies such as Agilent, PerkinElmer and ThermoFisher, plus a myriad of smaller companies and start-ups. Many stem cell innovators built businesses in the testing sector on the way to developing stem cell therapeutic products.

And yes, there is strong evidence of key agreement enabling results. Several unifying aspects exist, including primary/secondary compound screening strategies, weight-of-evidence assessment frameworks and integrated testing strategies. Furthermore, in silico (bioinformatics) solutions typically span the in vitro landscape, as described below, and increasingly play a unifying role.

Overview of in silico

Landscape

In silico or bioinformatics solutions are used in at least two basic ways: as companions to in vitro methods for mechanistic analysis, and as standalone systems for predictive analysis. Naturally these are often combined. As an aside, historically predictive approaches (also called ‘non-mechanistic’) have been contrasted with mechanistic approaches. However, our research shows that increasingly the two are used in tandem, and that they are being viewed as complementary parts of an overall toolkit. Thus our use of the term Predictive Toxicology or ‘PredTox’ as a discipline is intended to encompass both predictive and mechanistic approaches, and to connote an evolution of the combined toolkit. In keeping with the ‘Tox21’ vision, the key attribute of this overall evolution is the steadily growing ability to use in vitro methods to predict relevant in vivo outcomes.

In practice

Ultimately, this must interface with systems medicine. Systems medicine acknowledges the complexity of our biology, intending to use responsive, refined and targeted approaches to cure disease and maximise wellness. It leverages the systems biology being developed in research applications. How is this complexity being reduced to practice? Consider the following example.

Fast forward: model-based drug development

It is becoming standard practice for pharmaceutical companies and the FDA to estimate clinical trial doses using computer models, to evaluate why adverse events occur and to determine the potential basis for variability in patient response. This was a key goal defined in FDA’s 2004 Critical Path Initiative report.

In June 2013, the FDA issued a regulatory letter to the Critical Path Institute’s (C-Path) consortium, the Coalition Against Major Diseases (CAMD), stating its decision to deem CAMD’s quantitative clinical trial simulation tool a “fit-for-purpose” drug development tool for Alzheimer’s disease (AD). This new tool will make it possible to simulate clinical trials by integrating all relevant data, enhancing future study efficiency and hopefully efficacy as well.

And in September 2013, the FDA announced a licensing agreement for use of PhysioLab Modeler software in its drug safety research, specifically for drug-induced liver injury (DILI). This software is used to author models and edit depictions of the body’s functional pathways paired with mathematical algorithms. The mathematical representation of normal and diseased biological systems makes it possible to simulate outcomes of specific interventions or treatments over any clinically relevant period of time using computer-generated patients.

In contrast to prior limitations of simulating a single generic virtual human, the PhysioLab platform can generate any number of virtual patients representing combinations of disease status, genetics, ageing, lifestyle or other factors. It is then possible to run a virtual clinical trial on a diverse patient population. Specialised PhysioLab platforms include Cardiovascular Disease, Metabolism, Rheumatoid Arthritis, Hypertension, Hypersensitivity, Dermis and Epidermis. And now, the use of these same types of models to offer suggestions directly relevant to patient care is being explored.
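
The PhysioLab platform itself is proprietary, but the core idea of a virtual-population trial can be sketched in a few lines: each virtual patient is a parameter set sampled from plausible physiological distributions, pushed through a mechanistic model, and the trial outcome is read off the simulated population. The one-compartment model, distributions and toxicity threshold below are all illustrative assumptions, not the vendor's method.

```python
"""Minimal sketch of a virtual-population trial (illustrative only)."""
import numpy as np

rng = np.random.default_rng(42)
n_patients = 1000

# Sample per-patient physiology (hypothetical log-normal variability).
clearance = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n_patients)  # L/h
volume = rng.lognormal(mean=np.log(40.0), sigma=0.2, size=n_patients)    # L
dose_mg = 200.0

# One-compartment PK: C(t) = (dose / V) * exp(-(CL / V) * t)
t = np.linspace(0, 24, 97)  # hours
conc = (dose_mg / volume[:, None]) * np.exp(-(clearance / volume)[:, None] * t)

# Count an 'adverse event' when peak concentration exceeds a made-up threshold.
toxic_threshold = 8.0  # mg/L, illustrative
print(f"Simulated AE rate: {np.mean(conc.max(axis=1) > toxic_threshold):.1%}")
```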

How did this predictive technology develop? What are its mechanistic underpinnings? What insights can be gleaned from this?

Drug safety testing has for many years followed the systems approach: studying the comprehensive response to toxic compounds at the DNA, RNA, protein and metabolite levels. This has required substantial bioinformatics resources and sophisticated approaches; fortunately, these have been available to leverage from the field of genome sequencing.

While early studies used basic statistical methods, toxicology studies typically involve dose-response, time-series and other advanced designs, so enhanced statistical analysis tools had to be developed. Even so, the use of statistically derived gene signatures alone can be limiting. Evaluated purely in terms of specificity and sensitivity, these mathematical models have no mechanistic relevance to the functional biology of the endpoint. And while they performed well in small studies, their performance and robustness in larger studies was called into question.
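
To make the idea of a statistically derived signature concrete, the sketch below selects the top discriminating genes from a synthetic expression matrix and reports cross-validated sensitivity and specificity. Every number in it is invented, and the pipeline is a generic illustration rather than any specific published method.

```python
"""Sketch of a statistically derived gene signature on synthetic data."""
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_genes = 60, 2000              # small study, many genes
X = rng.normal(size=(n_samples, n_genes))  # expression matrix
y = np.repeat([0, 1], n_samples // 2)      # 0 = control, 1 = toxicant
X[y == 1, :25] += 0.8                      # 25 truly responsive genes

# Feature selection lives inside the pipeline so it stays within each CV fold.
pipe = make_pipeline(SelectKBest(f_classif, k=25),
                     LogisticRegression(max_iter=1000))
pred = cross_val_predict(pipe, X, y, cv=5)

tp = np.sum((pred == 1) & (y == 1)); fn = np.sum((pred == 0) & (y == 1))
tn = np.sum((pred == 0) & (y == 0)); fp = np.sum((pred == 1) & (y == 0))
print(f"sensitivity={tp / (tp + fn):.2f}, specificity={tn / (tn + fp):.2f}")
```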

And so knowledge-based approaches such as functional classifiers based on pathways and networks have evolved. This work has proven to be enormously complex, considering the underlying biology of tens of thousands of genes and myriad cellular pathways. Many vast data sets have had to be combined and integrated. While full understanding remains elusive, this formative work has been critical to advancing the field.

Sometimes gene expression profiles alone have not been enough to discriminate between toxic compounds, but they become sufficient when accompanied by other biological data. For example, metabolite profiles have been used to predict the classification of blinded compounds. Some groups have focused on functional annotation of genes using pathways, biological processes, regulatory interactions and molecular reactions, and others have created utilities which mine the published literature.

Moreover, the definition of a ‘pathway’ has evolved significantly. While a pathway can be considered a linear chain of metabolic reactions or binary protein interactions, to make sense biologically a pathway has to be experimentally confirmed as a whole set of interactions, rather than as individual links. Pathway maps have been assembled; however, they often differ in scope and scale and do not overlap well. Substantial differences have been found between metabolic and signalling pathways, for instance.
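
One simple way to quantify how poorly two maps of a nominally identical pathway overlap is a Jaccard index on their gene memberships. The gene sets below are placeholders, not real database content:

```python
# Jaccard overlap between two versions of an 'apoptosis' pathway
# (gene sets are invented placeholders, not real database entries).
def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b)

apoptosis_db1 = {"TP53", "BAX", "CASP3", "CASP9", "BCL2", "APAF1"}
apoptosis_db2 = {"TP53", "BAX", "CASP3", "FAS", "FADD", "CASP8", "BID"}
print(f"overlap: {jaccard(apoptosis_db1, apoptosis_db2):.2f}")  # -> 0.30
```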

And yet in predictivity studies, even single-pathway descriptors have demonstrated significant benefit over statistics alone. The field has in fact expanded from individual pathways into sets of pathway maps and multi-pathway classification techniques.

This is because, in quantitative functional analysis, the whole gene content of a given functional entity (pathway, biological process, network or sub-network) is used as a functional classifier. For example, while historically metabolite and gene expression/proteomics data have been separate data types that cannot be correlated statistically, knowledge-based pathways have been used as ‘common ground’ templates for integrating and interpreting mismatched yet complementary data.
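
A minimal sketch of that ‘common ground’ idea: score each pathway as the mean z-score of whichever of its members were measured, so that transcript and metabolite readouts land in the same pathway-level feature space. The pathway memberships and data below are invented for illustration.

```python
"""Pathways as shared feature space for mismatched omics (illustrative)."""
import numpy as np
import pandas as pd

def pathway_scores(data: pd.DataFrame, pathways: dict) -> pd.DataFrame:
    """data: samples x features (z-scored); pathways: name -> member list."""
    scores = {}
    for name, members in pathways.items():
        present = [m for m in members if m in data.columns]
        if present:  # skip pathways with no measured members
            scores[name] = data[present].mean(axis=1)
    return pd.DataFrame(scores)

# Hypothetical z-scored measurements for the same four samples.
genes = pd.DataFrame(np.random.default_rng(1).normal(size=(4, 3)),
                     columns=["CYP1A1", "GSTM1", "NQO1"])
metabolites = pd.DataFrame(np.random.default_rng(2).normal(size=(4, 2)),
                           columns=["glutathione", "bilirubin"])

pathways = {"oxidative_stress": ["CYP1A1", "NQO1", "glutathione"],
            "phase_II_conjugation": ["GSTM1", "glutathione", "bilirubin"]}

combined = pd.concat([genes, metabolites], axis=1)
print(pathway_scores(combined, pathways))  # one column per pathway
```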

So-called ‘pathway synergy’ methods, developed for integrated analysis of multiple types of genetic alterations in cancers, have proven useful for the analysis of parallel pathways in toxicity studies.

Several studies have established relationships between drug side-effects and the protein targets and pathways they affect. This approach of inferring molecular interactions goes beyond the routine methods of predicting compound effect based on chemical similarity or target sequence similarity, and has added considerably to mechanistic understanding. It is important to underscore that researchers have made a significant contribution by including negative controls to avoid bias toward certain activity classes.

Historically, knowledge-based approaches have required large proprietary knowledge bases and powerful modelling tools. More recently, however, several comprehensive yet affordable tools have been developed, along with user-friendly interfaces. These curated knowledge databases include tools which enable, for example, visualisation of pathway maps or networks representing chains of signalling interactions.

Thus, quantitative methods are being used to generate new types of predictive toxicity signatures based on well-characterised biological pathways that can be visualised and intuitively analysed, while retaining the link with the predictive model. The concept of ‘pathway barcoding’ has huge potential in the fast-developing fields of personalised and translational medicine, especially for quick assessment of drug response/resistance and side-effects in clinical trials.
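
The barcoding idea can be sketched very simply: discretise each pathway's activity score into up/neutral/down calls so that every sample carries a compact, comparable signature. Thresholds and scores below are illustrative.

```python
# 'Pathway barcoding' sketch: discretise pathway scores into +1/0/-1 calls.
import numpy as np

def barcode(scores: np.ndarray, cut: float = 1.0) -> np.ndarray:
    return np.where(scores > cut, 1, np.where(scores < -cut, -1, 0))

patient_a = barcode(np.array([2.1, -0.3, -1.7, 0.4]))  # per-pathway scores
patient_b = barcode(np.array([1.8,  0.1, -1.2, 1.9]))
print(patient_a, patient_b)                   # [ 1  0 -1  0] [ 1  0 -1  1]
print("mismatches:", int((patient_a != patient_b).sum()))  # 1
```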

Systems biology in EHR

These developments give us confidence that as physicians continue to adopt electronic health records (EHR), they will have available to them substantive applications of systems biology based on proven disease models. These will likely help prioritise which diagnostic screens and tests are most critical at given points in time, much as such models are used today in drug discovery and safety assessment.

Representative examples of organisations rooted in PredTox helping build robust biomarkers and systems medicine/health IT

It can be difficult to single out key examples among the vast landscape of in vitro and in silico methods – certainly the selection criteria depend on one’s point of view. Because of their criticality for precision and personalised medicine, combined with unmet needs that fit PredTox strengths, we are drawn to examples of companies rooted in PredTox that are expanding beyond research and safety into clinical domains such as biomarkers and systems medicine/health IT. The following highlights are updates from organisations which participated in our recent market/technology study for BCC Research.

Developing safer medicines

Dr Katya Tsaioun, in vitro ADMET and predictive toxicology expert and Scientific Director of US operations for the UK non-profit Safer Medicines Trust, agrees that linking technological breakthroughs in safety assessment to unmet needs in the clinic is a key opportunity whose time has come. She sees the present business and regulatory climate as increasingly receptive to the needed modernisation of regulatory science via greater adoption of in vitro methods and an overall rationalisation of testing and approvals.

The US FDA’s Critical Path Initiative clearly outlined this direction for change. Safer Medicines Trust is contributing to the field by bringing together regulators such as the EPA and FDA, inventors of predictive toxicology assays, and the pharmaceutical and cosmetic industries to define appropriate validation levels for different purposes in chemical safety assessment, and to develop a standard path for acceptance of new human biology-based methods into the regulatory process.

HESI: Facilitating key research and collaboration

Facilitated by the Health and Environmental Sciences Institute (www.HESIglobal.org), the week of March 10, 2013 marked a major milestone in efforts by ICH, FDA, MHRA and other drug regulatory agencies worldwide, as well as key industry and academic leaders, to forge a significant paradigm shift in drug cardiac safety testing. The essence of the change is greater reliance on not only in vitro but also in silico methods, in response to growing recognition of their merit in regulatory decisions as well as in research screening for safety risk.

About 12 years ago, in response to drug-related cardiac arrhythmias (Torsades de Pointes, or TdP, associated with Long QT Syndrome), the cardiac safety field recognised that guidelines were urgently needed for cardiac safety testing of new drug candidates. HESI undertook a programme to evaluate the utility of preclinical safety testing assays for predicting clinical TdP.

The published results of these studies served as supporting experimental data for the subsequent development of the ICH S7B guideline. Since the implementation of this guideline, the occurrence of unanticipated TdP effects from approved drugs has been largely eliminated. This was the first ICH guideline to use safety pharmacology data from a nonclinical safety setting to inform potential clinical risk.

A number of in vitro and in vivo assays were established, most focused on the main ion channels (across cardiomyocyte membranes) known to be involved in maintaining the cardiac action potential. In fact, focus has centred on a dozen or so ion channels, with particular emphasis on the human ether-à-go-go-related gene (hERG) channel implicated in triggering TdP/Long QT, as mandated by the FDA alongside the so-called ‘Thorough QT’ clinical study. However, in recent years this practice has been called into question for a propensity to generate false positives (removing viable candidates from further consideration) in the face of a price tag of around $1 million per screen.

Furthermore, it is estimated that more than 70 types of ion channels are differentially expressed across the repertoire of cardiomyocytes, and thus are potentially involved in the overall cardiac electrical activity and beating of the heart. Drug candidates are also already screened for direct damage to cardiomyocytes at the cellular or sub-cellular level, such as to mitochondria.

For example, anthracyclines, a widely-used class of chemotherapeutic drugs, are known to induce heart damage through direct structural or cellular injury. Clearly what is really needed is an integrated cardiac assay system, but current options mainly consist of whole-organ perfusion assays (the Langendorff heart assay) or telemetry experiments using live animals. These tests are technically challenging, labour-intensive and not amenable to high throughput.

For these reasons, in vitro technology has been gaining significant traction, enabled by key developments in stem cell technology and by a popular measurement technology, the multi-electrode array (MEA). Reportedly, other tests are being reconsidered as part of a model collaboration to enhance the overall process of drug development, testing and approvals, in key part by reducing the false positives that eliminate candidates unnecessarily.

This follows the exciting 2013 news of FDA support for model-based drug development in both DILI (drug-induced liver injury) and Alzheimer’s disease. All of this underscores how drug cardiac safety testing is the epicentre of greater adoption of in vitro and in silico methods for predictive safety and toxicology, demonstrating how effective agency collaboration can lead to clear guidelines and hence to strong innovation and commercialisation.

ACEA Biosciences, Inc

In addition to its active participation in the ToxCast programme with the US EPA, ACEA is riding the cardiac safety wave with its label-free impedance-based platform (xCELLigence RTCA Cardio), which delivers a highly predictive in vitro assay for assessment of drug-induced cardiac liability. Specifically, its technology in combination with human induced pluripotent stem cell (iPSC)-derived cardiomyocytes yields a Predictive Proarrhythmic Score (PPS) that correlates well with clinical arrhythmogenic risk. ACEA sees a future in which clinical applications could be targeted at specific subpopulations, even down to the individual patient, by testing on that patient’s iPSC-derived cardiomyocytes.
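
The PPS itself is proprietary, but the general pattern it represents can be sketched: combine beat-pattern features extracted from an impedance trace into a composite score, then check its rank correlation against known clinical risk. Every feature, weight and number below is invented for illustration.

```python
# Hypothetical composite cardiac-liability score vs known clinical risk.
import numpy as np
from scipy.stats import spearmanr

# Invented per-compound features from iPSC-cardiomyocyte recordings.
beat_rate_change = np.array([0.05, 0.60, 0.30, 0.80, 0.95])
irregularity     = np.array([0.10, 0.50, 0.35, 0.70, 0.90])
composite_score  = 0.5 * beat_rate_change + 0.5 * irregularity

clinical_risk_rank = np.array([1, 2, 3, 4, 5])  # known TdP risk ordering
rho, p = spearmanr(composite_score, clinical_risk_rank)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")  # rho = 0.90 here
```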

ACEA also has a community of users focused on oncology, where its platform is proving valuable for characterisation of cell migration and other predictive drivers of tumour size, metastasis and other critical therapeutic attributes. ACEA is also open to multiplexing different readouts in the future, for example coupling its technology’s predictive capability with the mechanistic strengths of electrophysiology (e-phys) and imaging.

Stemina, Inc

Stemina is developing a series of toxicity assays for DART (developmental and reproductive toxicity), which it describes as the first and only human in vitro tests for assessing the potential of a compound to cause birth defects if a woman is exposed during pregnancy. Stemina recently received a $10.6 million contract from the Environmental Protection Agency (EPA) under the EPA’s ToxCast initiative.

In addition, Stemina is developing a blood test for autism and metabolic subtypes of autism across the spectrum to allow diagnosis and personalised treatment of the individual patient. Stemina is leveraging breakthroughs in metabolomics and stem cells, and is supported by a strong intellectual property portfolio beginning with the work of Dr Gabriela Cezar at the University of Wisconsin and continuing with development of Stemina’s proprietary metabolomics platform and patent filings.

Enzo Life Sciences (Enzo Biochem)

Enzo, a pioneer in nucleic acid labelling and detection, translates its strong IP and technical capabilities into a broad product line of assay kits, reagents and biochemical compound libraries. This blend of products enables Enzo to offer ‘whole biology’ solutions (from genotypic through increasingly rich phenotypic read-outs) to drug developers looking to analyse the toxicological effects of novel therapeutics.

With Enzo products, examination is possible at the nucleic acid, protein or whole-cell level, allowing for the discovery, analysis and quantification of biomarkers. In particular, Enzo’s unique live-cell assays in combination with its biochemical libraries offer a powerful analytical tool to the predictive toxicology market. In partnership with industry leaders, pharma/biotech clients, CROs/CMOs and its own clinical lab, Enzo works to offer validated solutions for detection and treatment of disease.

Agilent Technologies Inc

As an example of Agilent’s commitment to advancing new approaches to drug and chemical safety, Agilent has established a pre-competitive, private-sector partnership whose growing membership sponsors research designed to demonstrate the feasibility of using in vitro-based toxicity testing approaches along with integrated in silico systems biology to conduct human health risk assessments.

Agilent offers the broadest range of innovative measurement solutions in the industry, helping scientists and researchers all over the world apply cutting-edge technologies to toxicology. With leading analytical products across the major omics (ie, genomics, transcriptomics, proteomics and metabolomics), combined with the GeneSpring bioinformatics software suite, Agilent is uniquely positioned to enable integrated multi-omics approaches.

Accelrys (expected to merge with Dassault Systèmes SA)

Accelrys’ leadership across pharma scientific innovation lifecycle management (SILM) stems in large part from its early focus on predictive science, with offerings such as Discovery Studio to provide comprehensive predictive science capabilities for computational chemists, toxicologists and other scientists engaged in drug design. Specific robust solutions have included ADMET, where available filters include human intestinal absorption, aqueous solubility, blood brain barrier penetration, plasma protein binding, CYP2D6 binding, hepatotoxicity and others. Other examples include Quantitative Structure Toxicity Relationship (QSTR) models with a patented Optimal Predictive Space validation method.

While keeping pace with its informatics data management solutions, Accelrys has also evolved its modelling and simulation strengths into development and manufacturing with its lab process and compliance management suite, and its Pipeline Pilot has enabled the creation and management of scientific protocols and the implementation of standard business rules across company boundaries, meeting the demands of today’s highly collaborative environment.

Now Accelrys is launching an innovative new cloud-based collaborative lab research suite called ScienceCloud. In the process it has attracted an acquisition offer from Dassault Systèmes SA, a global leader in simulation and PLM (product lifecycle management). Leveraging Accelrys’ complementary process-based expertise, the combined entity plans to create a scientific innovation platform that dramatically enhances the way pharmaceutical researchers collaborate, achieve visibility upstream and downstream, make key decisions, meet compliance regulations and integrate with other key enterprise systems.

Ingenuity IPA (now part of QIAGEN)

In April 2013 QIAGEN acquired Ingenuity Systems, best known for its IPA (Ingenuity Pathway Analysis) biological and toxicology-oriented knowledge base and analysis tools, with the stated intention of the combined entity delivering on the overall workflow of ‘NGS sample to valuable insight’. Already, Ingenuity is enabling QIAGEN’s new Rx-CDx collaboration platform as well as its new Clinical Lab initiative.

Ingenuity has invested more than 10 years in the innovation of semantic search, ontology and software development to create an all-in-one, web-based software application that enables researchers to model, analyse and understand complex biological and chemical systems – for example, using expression data to make predictions about likely downstream effects on phenotypes and toxicological endpoints, or to predict the activation or inhibition of upstream regulators (transcription factors, kinases, microRNAs, cytokines, etc) that may be responsible for the observed expression changes.
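
The flavour of upstream-regulator scoring can be sketched without the Ingenuity Knowledge Base itself: compare the signs of observed expression changes against a regulator's known activating and inhibiting edges, in the spirit of the published activation z-score idea. The edges and data below are invented, and real implementations weight and correct the statistic in ways omitted here.

```python
# Toy upstream-regulator score: consistency of observed changes with a
# regulator's known edge signs (+1 activates, -1 inhibits). Invented data.
import math

known_edges = {"TNF": {"IL6": +1, "CXCL8": +1, "ICAM1": +1, "KLF2": -1}}
observed = {"IL6": +1, "CXCL8": +1, "ICAM1": -1, "KLF2": -1}  # +1 up, -1 down

def activation_z(regulator: str) -> float:
    edges = known_edges[regulator]
    consistent = sum(sign * observed[t] for t, sign in edges.items()
                     if t in observed)
    return consistent / math.sqrt(len(edges))

print(f"TNF z = {activation_z('TNF'):.2f}")  # 1.00 > 0 suggests activation
```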

Examining expression patterns in the data can lead to actionable hypotheses about mechanism, or help rank candidate molecules. In fact, IPA-Tox® is a data analysis capability within IPA® software that complements more traditional methods by bringing integrated systems toxicology to safety assessment. In contrast to other data analysis solutions targeted at the tox research community, IPA has the strongest track record of proven success and adoption in pharma, due in large part to the breadth, depth and quality of molecular knowledge in the Ingenuity Knowledge Base. Indeed, IPA has been broadly adopted by the life science research community and is cited in more than 10,000 peer-reviewed journal articles.

Ingenuity has also fit well into GeneGlobe, QIAGEN’s web portal providing access to biological assays. Its new version integrates Ingenuity Target Explorer’s biological interpretation and references with GeneGlobe’s extensive library of wet-lab assay solutions. This enables researchers to search and select from more than 31 million PCR assay kits and NGS assay panel products, plus genome-wide assay solutions for 28 species, for any gene or pathway of interest.

The search, selection and interpretation solutions create a user experience that precisely identifies assays and reagents fitting a life science researcher’s experimental design, and add a comprehensive set of interpretation tools that deliver three key functionalities:

1) For genes of interest, information on molecular function, cellular localisation and relevant publications, even those found in very current biomedical literature.
2) Built-in biological filters to display interacting networks of genes or molecules, along with supporting evidence, that match a disease or tissue-specific context selected by the researcher.
3) Dynamically updated pathway maps, also linked to supporting evidence in the biomedical literature.

And with the recent announcement of Ingenuity® Clinical, a new web-based solution for clinical interpretation and reporting of insights from NGS-based tests, QIAGEN is launching what it believes will be the first product specifically designed to address the challenges of scale, speed and decision support that clinical healthcare laboratories face in adopting NGS.

The time required to make accurate clinical assessments – especially as tests move from single genes to multi-gene panels, exomes and whole genomes – is becoming a fundamental bottleneck and is slowing the clinical adoption of NGS. The new solution will provide clinical labs with automated scoring, interpretation and reporting of findings in standardised, HIPAA Safe Harbor-compliant formats. Drawing upon the vast clinical and genomic data in the expert-curated Ingenuity Knowledge Base, the company began collaborating with molecular diagnostics laboratories in November 2013, and announced plans for a 2014 launch of a larger beta programme.

Collaborators providing important input include several commercial and academic testing laboratories – more than 20 clinical testing laboratories are currently participating in the Ingenuity Clinical early access programme. Test indications supported have expanded from hereditary/germline to somatic NGS panels including hereditary cancer, somatic cancer, carrier screening, cardiovascular and neurological test indications.

Entelos PhysioLab (now part of Rosa & Co, LLC)

The 2013 acquisition of Entelos PhysioLab provides Rosa & Co, LLC with a strong extension of its modelling and simulation capabilities, directed at both safety and efficacy. With its classical pharmacokinetic/pharmacodynamic (PK/PD) and, increasingly, mechanistic modelling background, Rosa & Co has built a strong pharma customer base by using open-source software and by involving users in the process of delivering custom models. These have been used largely to explain experimental observations and identify key areas of uncertainty, and increasingly to generate and test hypotheses.

Now with PhysioPD modelling, it sees significant interest in mechanistic models for exploration of complex biological systems, and for support in more key decisions throughout the drug discovery and development lifecycle. Rosa & Co also sees more activity ahead in identifying and interpreting biomarkers to guide patient selection, assess early response, and to assist with development of companion diagnostics.

The Entelos PhysioLab systems biology platforms were developed over 15 years and generate virtual populations, providing highly predictive analyses that also deliver insights into mechanism of action (MoA) that would otherwise be inaccessible, whether because the required procedures are too invasive or because no other type of modelling captures them. They provide comprehensive models of the pathophysiology of different diseases, including diabetes, rheumatoid arthritis, and cardiovascular and hypertensive disease, allowing scientists to explore the dynamics of physiology, and the impact of external influences on it, in silico.

Thousands of virtual patients are combined to create virtual populations representative of the target patient demographic, providing insights into the physiological outcomes of both individuals and populations based on phenotypic and mechanistic distributions, including the principles of consumer variability that are key to precision medicine.

Models can be calibrated to preclinical data as well as to clinical trial and post-approval outcomes. In some cases models are specifically targeted at understanding and predicting toxicity mechanisms, such as the DILI (drug-induced liver injury) model developed in collaboration with The Hamner Institutes. The DILI model was designed to provide a predictive approach to toxicological risk, combining quantitative liver drug metabolism with liver physiology expertise to support in silico explorations that could take months or years to perform in living people, while also greatly reducing the need for animal studies.

This approach is expected to advance understanding of why patients vary widely in severity and susceptibility to liver injury, and help translate results from preclinical animal models across species and to model human response. The ultimate goal is development of predictive clinical biomarkers and pre-clinical assays that will help identify patient types at increased risk for developing liver injury in response to specific drug and/or combination-drug exposure. The approach is envisioned to help guide the development of new diagnostic tests as well as new ways to test drug safety.

Markers of impact

At $5 billion in revenue, the PredTox in vitro and in silico market is now about the same size globally as the in vivo animal preclinical testing industry (CROs), which is highly profitable. Although the PredTox growth outlook is now higher, one key potential constraint is the need to clarify market opportunities over a longer time horizon for investors. Our intention is to help lend this clarity through ongoing market research and open collaboration on market trends and drivers such as those described in this article.

In addition, nascent initiatives are underway to enhance market efficiencies by leveraging synergistic life sciences network effects, as well as those trending in patient care and patient-as-consumer domains. Coincidentally, the Molecular Diagnostics industry is also at about $5 billion in global revenue, and it will be interesting to watch the Companion Diagnostics subset in particular for signs of the strategic trends described in this article.

About the BCC report

Our recent report published by BCC Research provides an overview of the current and future landscape of how new pharmaceutical drugs are developed, from the perspective of how they are tested for safety. This perspective enables use of a manageable level of medical and scientific detail, integrated with regulatory, social and resource drivers, to enable extrapolation to a variety of future forecast situations.

Detailed forecasts are accompanied by ‘open-source’ annotation of drivers and frameworks to better evaluate opportunities across pharmaceuticals, drug development equipment and tools, bioinformatics and services such as contract research organisations (CROs). (Note: biologics are outside the scope of this topic.) These insights will be most beneficial to product and service marketers, business development professionals, C-level decision-makers, investors, healthcare strategists and information/big data technologists seeking a deeper understanding of specifically how the output of the genomic revolution is fuelling exciting new solutions and markets.

These in vitro and in silico technologies are now enabling solutions to span wider segments of the healthcare landscape, from predictive toxicology in drug development, to biomarkers in clinical diagnostics and systems biology in patient care. Along with great promise for healthcare, this is driving rapidly expanding global markets, forecast to grow from $5 billion in 2012 to $10 billion in 2017.

Another feature of the report is a detailed appendix containing custom profiles of representative companies (Figure 4), with examples of specific products and technologies that relate back to the body of the report.

Figure 4 Representative companies in the BCC report

For more information, see http://www.bccresearch.com/market-research/pharmaceuticals/invitro-toxicity-phm017e.html.

DDW

This article originally featured in the DDW Spring 2014 Issue

Robert G. Hunter has more than 20 years’ experience in life sciences and healthcare as an analyst, entrepreneur and management consultant. Bob is the author of the new market/technology report ‘In Vitro Testing: Technologies and Global Markets’ (PHM017E), published in January 2014 by BCC Research. Additional recent work includes commercialisation strategies for multiple stem cell lines for true genetic diversity, and for proteomic biomarkers via label-free imaging. He is an early investor in a company using stem cells for ‘tox in a tube’ testing of new drugs in R&D. Bob is also the founder of Predict-Medicine, drawing on experience in disease management, biosensors and web-based enablers of participative health and wellness, and inspired by the visions of Dr Leroy Hood, Dr Eric Topol and others to help fill in this vision and bring it to reality. For more information, visit www.predict-medicine.com.

References
1 Halim, A-B. Proficiency Testing for Monitoring Global Laboratory Performance and Identifying Discordance. Lab Medicine. Winter 2013;44(1):e19.

2 Halim, A-B. Discrepant results from laboratories impact patients receiving heparin or antithrombin therapy. Biomark Med. 2011;5:211-218.

3 Halim, A-B. Impact of discrepant results from clinical laboratories on patients and pharmaceutical trials: Evidence from proficiency testing results. Biomark Med. 2009;3:231-238.
