Accelerating time to market through digitalisation

""

Sascha Fischer, Business Development Manager for Automation at Genedata, looks at three critical digitalisation components that enable end-to-end lab automation, improve the return on investment (ROI) of instrumentation, and streamline time to market.

Heightened regulatory demands, increasingly competitive markets, more pressing emergency medical needs, and remote working (underscored during Covid) are just a few of the pressures on current drug discovery and development. The R&D community’s response to these pressures is, in part, to invest even more in laboratory automation, such as mobile robots or liquid handlers, enabling fully automated cell culture, protein purification, screening assay execution, and sample logistics systems. All these investments aim to shorten project cycle times and get better-qualified leads into clinical development faster.

However, automation of wet lab experimental execution alone does not fill the funnel faster or with better candidates. To achieve this, you need true end-to-end automation, including the digitalisation of your automation workflows, so that everything from experimental setup to reporting of the analysed results happens with minimal human intervention. Digitalisation is required in three key areas:

  1. automatic data capture and transfer 
  2. automating data processing and analysis (from the simplest to the most complex assays) 
  3. autonomous decision-making 

Data Capture and Transfer 

When it comes to automating data capture and transfer, labs are mired in old-school manual processes. Consider this typical scenario: an R&D lab wants to automate drug discovery assays and purchases costly, state-of-the-art laboratory automation hardware to prepare and execute experiments from start to finish in an unattended fashion. This instrumentation generates data on hundreds of thousands of test molecules without manual intervention. Yet, automation is nowhere in sight when it comes time to process the data. Scientists must use USB drives to transfer data from an instrument to a file server or computer workstation for data analysis. The process is inefficient, error-prone, and dilutes the benefits of your lab automation investments.

To address such inefficiencies and automate data capture and transfer, companies should implement data analysis platforms that, at a predefined time, automatically load the data gathered from plate readers over the previous 24 hours. Platforms that can analyse data without a single click, and report results to the data warehouse whilst notifying project teams that the analysis is complete and results are available, are ideal. Because the notification includes a link to the results, team members can review them, backtrack to the raw data, or adjust the data processing steps as needed.
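To make this concrete, the sketch below outlines what such an automated load-analyse-report-notify cycle might look like. It is illustrative only: the export folder, warehouse link, notification mechanism, and analysis step are placeholders standing in for whichever instruments and systems a given lab has integrated, not the interface of any particular platform.

```python
from pathlib import Path

# Hypothetical locations and endpoints: these stand in for whichever export
# folder, data warehouse, and messaging system a given lab actually uses.
READER_EXPORT_DIR = Path("/instruments/plate_reader/exports")
PROCESSED_DIR = Path("/instruments/plate_reader/processed")

def find_new_result_files():
    """Return raw result files the plate reader has exported since the last pass."""
    return sorted(p for p in READER_EXPORT_DIR.glob("*.csv") if p.is_file())

def analyse(result_file):
    """Placeholder for the analysis step (normalisation, curve fitting, QC)."""
    return {"file": result_file.name, "status": "analysed"}

def publish_to_warehouse(results):
    """Placeholder: write the analysed results to the corporate data warehouse."""

def notify_team(results, link):
    """Placeholder: message the project team with a link to the analysed results."""

def run_scheduled_transfer():
    """One scheduled pass (load, analyse, publish, notify) with no clicks.

    A scheduler (cron, an orchestration tool, or the platform itself) would call
    this once every 24 hours at the predefined time described above.
    """
    for result_file in find_new_result_files():
        results = analyse(result_file)
        publish_to_warehouse(results)
        notify_team(results, link=f"https://warehouse.example/results/{result_file.stem}")
        result_file.rename(PROCESSED_DIR / result_file.name)
```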

Here are some of the business-critical criteria companies should apply in their selection of a data analysis platform:

  • Capabilities for vendor-agnostic integrations with analysis instruments, lab automation hardware, and compound management and automation scheduling software, giving them the flexibility to adopt cutting-edge systems. 
  • Compatibility with the existing data infrastructure for reduced deployment and maintenance overhead, minimising the informatics team’s efforts. 
  • User-friendly interfaces for setting up automated data transfer and analysis. 
  • Central visualisation dashboard to monitor experimental quality in real time when needed.

Automating Data Processing and Analysis 

Automation of data analysis applies not only to the simplest assays. Today, there is demand to move high-information assays with multiparameter readouts earlier in the pipeline to obtain better-qualified hits. While promising, applying such assays at scale during the initial discovery stages creates a practical challenge: processing the data volume automatically, consistently, and efficiently. This is where digitalisation becomes critical.

As you process and analyse complex data, you encounter a series of decisions, such as quality control, model fitting, and result validation. Often these decisions are subjective, and different people, or even the same person over time, will make different choices, leading to inconsistent outcomes. Moreover, mislabelled training data affect the performance of AI-based algorithms.

These challenges have, however, been addressed, as demonstrated by our collaborations with companies such as Amgen, AstraZeneca, and Roche. The following cases involve kinetic assays, surface plasmon resonance (SPR), and high-content, image-based screening. They illustrate that even with the most complex assays and lab data, requiring many layers of quality control, validation, and decision-making, it is possible to automate analysis.

Kinetic assays probe the mechanism by which a candidate drug interacts with its target or its modality of inhibition. Characterising these different mechanisms early on enriches candidate sets with desired features. Therefore, discovery programmes incorporate detailed mechanistic studies during early hit finding. The challenge: scientists must make complex decisions, taking into consideration raw data quality, which models best describe the data, and model fit quality, and they must do so at scale. Collaborating with AstraZeneca, we’ve met this challenge head-on by creating an automated, multistage analysis workflow for high-throughput mechanistic analysis in Genedata Screener. Every step of the workflow relies on user-defined standards or empirically determined criteria, such as choosing the right time window for analysis based on controls or selecting the best mechanistic model from the platform’s built-in models. This automation capability reduced AstraZeneca’s analysis time for a full-deck screen from 30 hours to 30 minutes. Moreover, it made the analysis more objective, consistent, and robust.
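The sketch below illustrates the general principle of replacing a subjective model choice with an objective, codified criterion. The two mechanistic models and the use of the Akaike information criterion (AIC) are illustrative assumptions, not a description of the actual workflow built with AstraZeneca; a production platform would offer a larger model catalogue and richer quality checks.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two illustrative mechanistic models for kinetic (progress-curve) data.
def linear_model(t, v0):
    """Simple case: constant reaction velocity (no time-dependent inhibition)."""
    return v0 * t

def slow_binding_model(t, vs, v0, kobs):
    """Slow, time-dependent inhibition: velocity relaxes from v0 to vs."""
    return vs * t + (v0 - vs) * (1 - np.exp(-kobs * t)) / kobs

def aic(residuals, n_params):
    """Akaike information criterion: an objective, user-independent score."""
    n = residuals.size
    rss = float(np.sum(residuals ** 2))
    return n * np.log(rss / n) + 2 * n_params

def select_model(t, signal):
    """Fit every candidate model and return the one with the lowest AIC."""
    candidates = {
        "no time-dependence": (linear_model, [1.0]),
        "slow binding": (slow_binding_model, [0.5, 1.0, 0.1]),
    }
    scores = {}
    for name, (model, p0) in candidates.items():
        try:
            popt, _ = curve_fit(model, t, signal, p0=p0, maxfev=5000)
            scores[name] = aic(signal - model(t, *popt), len(popt))
        except RuntimeError:
            continue  # fit did not converge; treat the model as not applicable
    return min(scores, key=scores.get) if scores else "inconclusive"
```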

SPR is a biophysical method for assessing molecular interactions in a direct, time-dependent, and label-free manner. Traditionally, SPR assays occurred later, during hit-to-lead or characterisation stages, as they were too low throughput. Akin to the biochemical kinetic studies described above, however, technological innovations now allow SPR to be performed earlier and at higher throughput. Quality control of SPR experiments often involves: 

  • A visual review of raw SPR sensorgrams. 
  • Selecting the most appropriate fit model for each candidate. 
  • Annotating each candidate accordingly.

As with biochemical kinetic studies, this can be subjective and time-consuming. In collaboration with scientists at Amgen, we applied AI to automate SPR data analysis. First, our platform triages raw sensorgrams, analysing only those with sufficient binding. Then, using AI, the platform automatically classifies the data for each tested drug candidate as best fit by either a kinetic or a steady-state binding model. This AI-driven, automated workflow enables the platform to instantaneously choose the correct model more than 90% of the time.
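As an illustration of this triage-then-classify pattern (not the actual Amgen implementation, whose features and thresholds are not described here), the sketch below first filters out sensorgrams without sufficient binding and then uses a classifier trained on historically annotated data to choose between a kinetic and a steady-state model. The feature definitions, threshold, and data layout are all assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

MIN_RESPONSE_RU = 5.0  # hypothetical triage threshold for "sufficient binding"

def sensorgram_features(time_s, response_ru, injection_end_s):
    """Reduce one raw sensorgram to a few shape descriptors for classification."""
    assoc = response_ru[time_s <= injection_end_s]
    dissoc = response_ru[time_s > injection_end_s]
    return np.array([
        response_ru.max(),                                 # overall binding level
        assoc[-1] / max(assoc.max(), 1e-9),                # did association plateau?
        (dissoc[0] - dissoc[-1]) / max(dissoc[0], 1e-9),   # how quickly it dissociates
    ])

def triage(sensorgrams):
    """Keep only sensorgrams that show sufficient binding to be worth fitting."""
    return [s for s in sensorgrams if s["response_ru"].max() >= MIN_RESPONSE_RU]

def train_model_selector(labelled_sensorgrams):
    """Train on historic sensorgrams annotated as 'kinetic' or 'steady_state'."""
    X = np.array([sensorgram_features(s["time_s"], s["response_ru"], s["injection_end_s"])
                  for s in labelled_sensorgrams])
    y = [s["label"] for s in labelled_sensorgrams]
    return RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)

def classify(classifier, sensorgram):
    """Predict which binding model should be fitted to a new candidate."""
    x = sensorgram_features(sensorgram["time_s"], sensorgram["response_ru"],
                            sensorgram["injection_end_s"])
    return classifier.predict(x.reshape(1, -1))[0]
```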

High-content, image-based screening, a phenotypic approach, is target-agnostic, producing a variety of phenotypic endpoints. This enables the characterisation of a candidate by relating its profile to those of compounds with known modes of action (MOA). Due to the volume and multiparametric nature of imaging data, AI is a popular approach for data analysis. Yet AI algorithms must be trained on a large set of pre-classified images, and the creation and annotation of those training datasets is a major bottleneck. Here, a deep learning-based HCS image analysis solution automates training data curation. Unlike traditional AI methods that require scripting skills and long hours of parameter fine-tuning, this solution has a highly intuitive interface. It automates the definition of the phenotype sets used for training deep neural networks and the analysis of production-level screens. This makes AI-based analysis versatile and accessible to cellular biologists without specialist image analysis expertise. These capabilities enable organisations such as AstraZeneca to deploy this solution across multiple therapeutic areas and early stages of drug discovery.
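The sketch below illustrates the underlying idea in generic terms: training labels are derived automatically from control-well annotations on the plate map rather than from manual image-by-image curation, and a small convolutional network is then trained on the resulting phenotype classes. It is a simplified stand-in, not the architecture or interface of the solution described above.

```python
import torch
import torch.nn as nn

class PhenotypeCNN(nn.Module):
    """A deliberately small convolutional classifier for single-cell image crops."""
    def __init__(self, n_phenotypes, n_channels=3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_channels, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_phenotypes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def phenotype_labels_from_controls(well_annotations):
    """Derive training classes from plate-map annotations of control wells
    (e.g. neutral controls vs. wells treated with known MOA reference compounds),
    so that no manual image-by-image labelling is required."""
    return sorted({well["control_type"] for well in well_annotations
                   if well.get("control_type")})

def train(model, loader, epochs=5):
    """Minimal training loop over pre-extracted cell crops and derived labels."""
    optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for images, labels in loader:   # loader yields (B, C, H, W) tensors and class ids
            optimiser.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimiser.step()
    return model
```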

Autonomous Decision-making  

Looking to the future, many pharma R&D organisations have already undertaken large laboratory automation initiatives, setting up interconnected, multi-room laboratory automation facilities designed to operate in a largely unattended fashion. With minimal human intervention, these labs should operate as closed-loop systems, in which a suite of wet lab experiments is automatically conducted, the resulting raw data are automatically analysed, and the outcomes are used to drive subsequent decisions.

To illustrate this approach in a real-world setting: currently, during a high-throughput (hundreds of plates) experiment on a lab automation workstation, the most advanced data analysis platforms can monitor quality and notify scientists in real time, on a per-plate basis, when quality criteria fall too low, so that scientists can stop the run manually. In the near future (and something we are actively working on), it will be possible to automatically pause the workstation from processing further plates when QC criteria are poor. Alternatively, the assay could stop automatically once specified project milestones are reached, such as identifying a defined number of hits. In another example, a data analysis platform could identify outliers in an experimental run and automatically trigger a predefined number of re-runs for these outliers. Finally, hit results could be combined with other information, such as structure-activity relationships or orthogonal assay results, to select hits automatically. Then, through integration with compound registration software and lab automation hardware, the system could automatically set up and trigger further validation assays on the selected hits using laboratory automation devices. Together, the automation of each step forms a fully closed automated testing cycle requiring minimal human intervention.
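A minimal sketch of that closed-loop logic might look as follows. The QC metric (a Z'-factor threshold), the hit milestone, and the scheduler interface with pause, stop, and re-run queueing are hypothetical placeholders for whatever systems a given facility has actually integrated.

```python
# Hypothetical thresholds: in practice these would be project-specific settings.
Z_PRIME_MIN = 0.5   # per-plate quality threshold (Z'-factor)
HIT_TARGET = 500    # milestone: stop the run once this many hits have been found

def process_run(plates, analyse_plate, scheduler):
    """Closed-loop pass over a queued set of plates.

    `analyse_plate` stands in for the analysis platform (returning per-plate QC
    metrics and hits); `scheduler` stands in for the automation scheduling
    software, assumed here to expose pause, stop, and re-run queueing.
    """
    hits = []
    for plate in plates:
        result = analyse_plate(plate)   # e.g. {"z_prime": 0.72, "hits": [...]}
        if result["z_prime"] < Z_PRIME_MIN:
            scheduler.pause(reason=f"QC failure on plate {plate['barcode']}")
            scheduler.queue_rerun(plate)          # re-test the failed plate later
            continue
        hits.extend(result["hits"])
        if len(hits) >= HIT_TARGET:
            scheduler.stop(reason="hit milestone reached")
            break
    # The selected hits could now be cross-checked against SAR or orthogonal
    # assay data and passed on to trigger follow-up validation assays.
    return hits
```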

The ultimate vision for autonomous decision-making combines experimental loops with prediction systems, a model with the potential to become a powerful R&D accelerator and to take lab automation to an even higher level of autonomy. AI-based systems could gather information from in-house experiments, the public domain, design input from scientists, and a mesh of desirability criteria to predict the next molecules to make. Other systems could find optimal synthesis or engineering routes, set up automated synthesis and expression of these molecules, and feed them into the next testing cycle. At this level, decision-making is highly autonomous, with humans involved to supervise and lead overall project direction.

Realising this vision for autonomous decision-making demands the tight interconnection of software and hardware systems. Moreover, it requires that the underlying data be high-quality and FAIR (findable, accessible, interoperable, and reusable), so that prediction systems can use them. This in turn requires high-quality materials, well-designed experiments, consistent and standardised analysis, and rigorous quality controls. The road to autonomous decision-making is paved with a strong digital backbone that prioritises data workflow automation. This heightened level of autonomous decision-making will liberate scientists to focus on innovations that advance novel drug discovery and accelerate the time to market for new life-saving therapies.

About the Author 

Sascha Fischer is a Business Development Manager for Automation at Genedata where he collaborates with the world’s leading pharmaceutical companies and their laboratory automation providers in the digitalisation of automation workflows and data processes. 

DDW Volume 24 – Issue 2, Spring 2023
