Dr Patrick Courtney (SiLA Consortium), Burkhard Schaefer (SiLA Consortium/ASTM AnIML) and Oliver Peter, Senior Director and Group Leader, Biology Technologies and Lead Discovery, Idorsia Pharmaceuticals, look at the importance of automation in the face of global disease.
As we write this, vaccines are being injected into arms at scale and pace in many countries and we see light at the end of the Covid-19 tunnel. Full recovery will take time but we must start to think of how to better prepare for the next serious epidemic.
E-commerce players had already massively invested in process automation, so when shops shut and shoppers went online, they were able to meet the exploding demand for consumer products. Lessons for the world of drug discovery can be learnt from this experience.
Global success scaling up diagnostics
In the UK, the Lighthouse Labs were established, recruiting staff and equipment to ramp up diagnostic testing capacity. In the US, companies such as Curative pivoted to meet the challenge, recruiting technology and developing the software needed to scale up. In France, screening capability from MGI was rolled out, while Korea deployed diagnostic testing booths early on. Sample pooling, a resource-saving technique dating from the 1940s, was employed to good effect for broad population testing in Luxembourg, Switzerland, Germany, the US and elsewhere. But while diagnostic capacities were successfully scaled up, and vaccines were developed at record speed, therapeutic options have remained limited [Courtney/Royall 2021].
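The saving that pooling delivers can be illustrated with Dorfman's classic two-stage scheme from the 1940s. The sketch below is illustrative only; the prevalence and pool size are hypothetical example values, not figures from any of the national programmes mentioned above.

```python
# Illustrative sketch of Dorfman two-stage sample pooling: samples are
# tested in pools, and only positive pools are retested individually.
# Prevalence and pool size below are hypothetical example values.

def expected_tests_per_person(prevalence: float, pool_size: int) -> float:
    """Expected number of tests per person under two-stage pooling.

    One pooled test per pool, plus one individual retest for every
    member of a pool that contains at least one positive sample.
    """
    p_pool_positive = 1 - (1 - prevalence) ** pool_size
    return 1 / pool_size + p_pool_positive

# At 1% prevalence, pools of ten need roughly 0.2 tests per person --
# around a five-fold saving over testing everyone individually.
print(round(expected_tests_per_person(0.01, 10), 2))  # → 0.2
```

The saving shrinks as prevalence rises, which is why pooling suits broad screening of largely healthy populations rather than testing of symptomatic patients.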
New drugs are needed
Back in 2003, SARS surged across the world. But once the threat subsided, interest and funding dried up. Without a market, the incentive to develop and commercialise cures faded. Fast forward nine years: in 2012 a related disease, MERS, emerged in the Middle East. When the worst did not happen, drug development again stalled. When Ebola hit West Africa, however, some efforts were pursued, and remdesivir was ready to be tried out.
Luckily, Covid-19 turned out to be amenable to vaccine protection. However, new and meaner variants are emerging, evading the combined selection pressure of epidemiological counter-measures and first generation vaccines. We might still lose this race to a massively more virulent variant, in sight of the finishing line.
Designing the next drug generation
Given the many years it normally takes to bring a new drug to market, there was much hope that we could repurpose existing drugs to treat Covid-19, and that tools such as artificial intelligence (AI) would help to identify promising candidates quickly. Building on existing clinical experience and with safety profiles already established, this short-cut approach promises quick results and considerable savings. Repurposing drugs has often been successful, the most famous example being Viagra, a substance originally aimed at angina. Thalidomide became a successful treatment for leprosy. And last year a well-established steroid, dexamethasone, was identified as having great value in the treatment of Covid-19.
But some argue it is time to look at new candidates too [Collins 2021]. The unprecedented success of developing multiple SARS-CoV-2 vaccines in only about a year shows what can be achieved if technology platforms are ready. The search for treatments includes all therapeutic modalities, from small molecule drugs and biologics to cell- and nucleic acid-based therapies. All the recently emerging tools can be thrown into this battle, from AI and data science to CRISPR gene editing and organ-on-chip technologies.
Drugs also support vaccine development
The Phase III SARS-CoV-2 vaccine trials conducted to date enrolled some 20,000-40,000 subjects each, split across two study arms. Such large trials represent a big challenge, but these numbers are required to reliably determine vaccine effectiveness when disease incidence is still, or again, relatively low. The BioNTech/Pfizer trial, for example, saw just 170 infections across both arms, allowing a protection effectiveness of 95% to be calculated. It is a general dilemma that vaccine testing depends on an ongoing epidemic.
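That headline figure follows directly from the standard effectiveness formula, applied to the published split of the 170 cases (8 in the vaccine arm, 162 in the placebo arm). A minimal sketch, assuming equally sized arms so that attack rates reduce to case counts:

```python
# Vaccine effectiveness: VE = 1 - (attack rate, vaccine arm) /
#                                 (attack rate, placebo arm).
# With (approximately) equally sized arms, the rates reduce to
# raw case counts.

def vaccine_effectiveness(cases_vaccine: int, cases_placebo: int) -> float:
    """Effectiveness estimate from case counts in equally sized arms."""
    return 1 - cases_vaccine / cases_placebo

# Published BioNTech/Pfizer split of the 170 trial infections: 8 vs 162.
print(f"{vaccine_effectiveness(8, 162):.0%}")  # → 95%
```

The small absolute case numbers also make clear why such enormous enrolments are needed when incidence is low: the statistics rest on a few dozen events among tens of thousands of subjects.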
To validate vaccines more quickly and with far fewer subjects, human challenge studies were tentatively conducted. In these, small groups of volunteers are deliberately infected to generate the required number of cases under laboratory conditions. Yet many have questioned the ethical basis for this approach when the disease is potentially fatal and no specific treatment exists. For other feared diseases where this approach has been taken, such as malaria, prophylaxis and treatment options are available.
Thus, the availability of effective treatments also enables challenge trials, which facilitate vaccine development even once a pandemic is no longer rampant.
We need to get ready to respond rapidly to new diseases
Studies indicate that there are numerous zoonotic viruses with epidemic potential. The next pandemic will undoubtedly hit mankind; it is just a question of time. Climate change is also spreading neglected and emerging communicable diseases, such as dengue, chikungunya, Zika and West Nile fever, into the affluent parts of the world. And the threat of devastating multi-drug-resistant bacterial epidemics is looming.
We must develop funding models for preventative drug and vaccine development. The cost of such global “health insurance” will be manageable compared with economic disasters such as the one we are witnessing in the present pandemic. Importantly, we need to build platforms to make it easier, quicker, and ultimately cheaper to discover drugs and adapt vaccines.
Automating discovery revisited
Various AI tools and approaches promise to accelerate drug discovery and development. These depend on the availability of massive amounts of reliable data. To generate large experimental data sets of good quality, lab automation is required. It ensures rich data is produced reliably and reproducibly. Digital data formats supporting the FAIR principles (findability, accessibility, interoperability, and reusability) facilitate data analysis and sharing with collaborators. Recognising the importance of these factors, the role of standards in enabling science is now widely accepted [CENELEC 2021].
In the past 20-30 years, de novo drug discovery has been substantially based on lab automation in the form of HTS (high-throughput screening) and, more recently, HCS (high-content screening). This successful use of automation has paved the way for high-throughput experimentation [Carson 2020] in adjoining disciplines. Thus discovery and experimentation, AI and automation are becoming increasingly combined [Coley 2020].
Moderna provides a prominent example: embracing digitisation as a core attribute and key enabling element [Damiani 2017], the company succeeded by building on its existing screening platforms and repurposing them for its vaccine business.
Beyond discovery, the importance of drug product formulation has gained wide recognition recently, with new drug delivery mechanisms and materials coming to the fore. The success of mRNA vaccines owes much to the use of lipid nanoparticles. Despite its importance, formulation labs generally are still poorly digitised and automated.
Building the tools to meet the challenge
Leveraging automation to tackle the emergent global health challenges requires agile experimental systems and data integration at scale. Many types of instruments and measurement techniques must be joined flexibly to meet constantly changing requirements (from liquid handlers and plate readers, PCR and sequencing instruments, all the way to bioreactors and purification devices). Rich data and metadata must be captured at every step of the drug discovery and development pipeline, ready for multiple analytical techniques and dynamic process control. The resulting data sets must be easily accessible, enabling downstream usage and data analytics. This serves scientific excellence and also, where required, adherence to strict regulations such as good laboratory practice (GLP) or good manufacturing practice (GMP) [Schiermeier 2018].
Two organisations collaborating to enable all of the above are the non-profit SiLA Consortium and the ASTM AnIML task group. The SiLA Consortium develops connectivity standards to enable rapid integration of laboratory instruments and process control software. The SiLA 2 communication protocol exposes the capabilities of any device as a network-accessible service. The built-in discovery functionality allows client software to recognise new devices in a network and determine how to communicate with them. This greatly reduces the effort to control a new instrument from one or multiple LIMS, ELN or process control systems.
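To illustrate the discovery step, the sketch below shows how a client might turn a DNS-SD announcement into a connection target; SiLA 2 servers advertise under the `_sila._tcp` service type and communicate via gRPC. The instrument record here is fabricated for illustration, and a real client would resolve it over mDNS (e.g. with a zeroconf library) rather than construct it by hand.

```python
# Hedged sketch: turning a SiLA 2 DNS-SD discovery record into a gRPC
# connection target. SiLA 2 servers announce themselves under the
# "_sila._tcp" service type; the instrument below is a fabricated
# example, not a real device.

from dataclasses import dataclass

SILA_SERVICE_TYPE = "_sila._tcp.local."

@dataclass
class DiscoveredService:
    instance_name: str   # server name as announced on the network
    address: str         # resolved IP address
    port: int            # port the gRPC service listens on

def endpoint(service: DiscoveredService) -> str:
    """Build the gRPC target string for a discovered SiLA server."""
    return f"{service.address}:{service.port}"

# Hypothetical plate reader appearing on the lab network:
plate_reader = DiscoveredService("PlateReader-01", "192.168.10.42", 50052)
print(endpoint(plate_reader))  # → 192.168.10.42:50052
```

Because every compliant instrument announces itself the same way, the client code above does not change when a new vendor's device joins the network.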
The AnIML task group at ASTM focuses on scientific data. It has defined an open, XML-based universal data format for laboratory instrument data complete with contextual metadata. Through its generic approach, AnIML supports and combines data from arbitrary analytical and biological techniques. Formats such as AnIML get us closer to the ideal scenario of having a common data format for all our instruments. AnIML is increasingly adopted by software vendors, and is productively used in many pharma labs.
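The shape of such a document can be sketched with nothing but the standard library. In the sketch below, element names follow the AnIML core schema (SampleSet, ExperimentStep, Series), but the namespace declaration and many required attributes are omitted for brevity, and the sample names and readings are fabricated.

```python
# Schematic, simplified AnIML-style document built with the standard
# library. Element names follow the AnIML core schema; namespace and
# most required attributes are omitted here, and values are fabricated.

import xml.etree.ElementTree as ET

root = ET.Element("AnIML")

# Samples under test, with metadata as attributes.
samples = ET.SubElement(root, "SampleSet")
ET.SubElement(samples, "Sample", name="Culture A", sampleID="S1")

# One experiment step holding a result series of three readings.
steps = ET.SubElement(root, "ExperimentStepSet")
step = ET.SubElement(steps, "ExperimentStep",
                     name="UV Absorbance", experimentStepID="E1")
result = ET.SubElement(step, "Result")
series_set = ET.SubElement(result, "SeriesSet", name="Spectrum", length="3")
series = ET.SubElement(series_set, "Series", name="Absorbance",
                       seriesID="abs", seriesType="Float32")
values = ET.SubElement(series, "IndividualValueSet")
for reading in (0.12, 0.34, 0.29):        # fabricated readings
    ET.SubElement(values, "F").text = str(reading)

print(ET.tostring(root, encoding="unicode"))
```

The point of the format is that the same Sample/ExperimentStep/Series skeleton carries a chromatogram, a plate read or a bioreactor trace alike, so downstream tools need to understand only one structure.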
Data and communication are inseparable
Combining a communication protocol such as SiLA 2 with a data standard like AnIML allows for seamless process flows. Files no longer need to be exported from the instrument, placed into a transfer folder, and imported by a target system. No data is lost or modified in transit, ensuring compliance and data integrity. With standardised interfaces, data from all instruments flows through the same mechanism, greatly reducing implementation time.
Such an approach has many tangible benefits. Shortening the time to generate actionable results speeds up product development. Quick and easy integration brings the flexibility to reconfigure labs to meet new challenges. And open formats allow data to be used wherever it is needed.
Building a robust process based on data
As we have all witnessed with Covid-19 vaccines, the development and scale-up of robust drug production processes is a key challenge. Process development combines artful experience and data-driven engineering following Quality by Design (QbD) principles.
Developing a typical bioprocess involves many categories of data: continuous recordings of bioreactor process parameters; in-line cell counts and biomolecular analyses; and recurrent analyses such as size exclusion chromatography (SEC), SDS-PAGE, western blotting and other methods to assess the culture.
Assembling the diverse data into a live digital batch record is a challenging task, often still solved manually with bespoke Excel spreadsheets. Almost a decade ago, AnIML demonstrated its capability of gathering all data from a bioprocess culture in a single XML document. Embracing this approach facilitates process data management for increased process understanding and rapid improvement cycles.
Innovating and manufacturing at scale in partnerships
Partnerships and collaborations with contract research organisations (CROs) and contract manufacturing organisations (CMOs) are essential to reduce time to market and scale up manufacturing. Process transfer between partners can be rate-limiting. Different equipment is often used, so standards such as AnIML help in exchanging the data involved in process transfer and quality control. Efficient transport of, and access to, this data between partners is increasingly cloud-based.
Embracing cloud solutions
Despite initial scepticism, the use of cloud data sharing services and communication and collaboration tools such as Office 365, Google Drive and AWS has become widespread in R&D. This is today supported, and increasingly enforced, by organisational IT functions. The benefits are broadly (if not often explicitly) accepted [Roe 2020]. The advantages of data clouds include secure system access, worry-free backup and archiving, document version control, and scaling of IT services as needed without investment in physical infrastructure.
The benefits of cloud connectivity became strikingly obvious as Covid-19 physical distancing and home-working rules scattered work teams and forced research labs to adopt time-shifted operations.
Artificial intelligence and machine learning methods, as well as collaborative visualisation technologies (augmented/virtual/extended reality, AR/VR/XR), often require massive computing power that is best accessed through the cloud, offered as a service by specialised providers.
Ultimately, cloud systems also allow remote access to physical systems, regardless of location and time zone. Through the lab version of the IoT (Internet of Things), experiments and tasks can be safely planned, executed, and analysed from anywhere.
SiLA enables cloud labs and edge solutions
Connecting physical, local lab systems to the cloud, however, poses a problem. They are often tied into segregated lab networks, isolated by corporate firewalls. Traditionally, LIMS and ELN systems initiate a connection to an instrument, send commands, and receive results. This becomes more complex when the system is cloud-hosted. SiLA enables cloud connectivity by reversing the connection direction: the instrument initiates a connection to the cloud system, then waits for commands from this client. Strict security-by-design, certificate-based authentication and end-to-end encryption allow for a safe environment, even in a cloud setting. For instruments not equipped with the SiLA cloud feature, the SiLA community is creating an Edge Gateway. This software runs on the local network and provides a central point of connectivity to the cloud. In this way, any SiLA-compliant instrument can be safely connected to the cloud without additional development by the vendor or user.
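The reversed connection direction can be sketched in a few lines. In the toy example below, a local socket stands in for the cloud service and the command and measurement values are fabricated; a real SiLA 2 connection uses gRPC with TLS and certificate-based authentication rather than a plain socket.

```python
# Toy sketch of SiLA's reversed connection direction: the instrument
# dials OUT to the cloud endpoint (so no inbound firewall hole is
# needed), then waits for commands on that connection. A local socket
# stands in for the cloud service; values are fabricated.

import socket
import threading

replies = []

def cloud_service(server: socket.socket) -> None:
    """Stand-in for the cloud client: accepts the instrument's outbound
    connection, issues a command, and records the reply."""
    conn, _ = server.accept()
    with conn:
        conn.sendall(b"ReadAbsorbance\n")
        replies.append(conn.recv(1024).decode().strip())

def instrument(port: int) -> None:
    """Instrument side: initiates the connection, then serves the
    commands that arrive over it."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        command = sock.recv(1024).decode().strip()
        if command == "ReadAbsorbance":
            sock.sendall(b"0.42\n")   # fabricated measurement

# Wire the two halves together on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=cloud_service, args=(server,))
t.start()
instrument(port)
t.join()
server.close()

print("cloud received:", replies[0])  # → cloud received: 0.42
```

Once the outbound channel is up, the roles are exactly as in the traditional setup: the cloud system sends commands and receives results, but the firewall only ever saw an outgoing connection.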
Automating R&D processes contributes to a safer future
Despite warnings, the Covid-19 pandemic caught us painfully off guard. By embracing new technologies to rapidly discover and develop treatments, including flexible automation of experimental data generation and production processes, we are hopefully on the way to a safer, healthier future.
About the authors
Dr Patrick Courtney has 20 years' industrial experience in technology development. He has worked as a director for global firms such as PerkinElmer, as well as at Sartorius and Cap Gemini. He leads a European working group on laboratory robotics and is a member of the board of directors of SiLA (Standards in Laboratory Automation). He holds an MBA and a PhD in Robotic Engineering/Molecular Biology, and has 100 publications and ten patents.
Oliver Peter is Senior Group Leader, Biology Technologies and Lead Discovery at Idorsia Pharmaceuticals. He obtained his PhD from the University of Zurich. In 2008, he built up the HTS and compound management facility and the biobank at Actelion, now Idorsia. Peter co-founded the SiLA Consortium in 2008 and has served on its board ever since.
Burkhard Schaefer is a computer scientist turned lab informatics specialist with 20 years of experience in the field. He is co-founder of BSSN Software, now part of Merck. Schaefer previously worked on the standardisation of device interfaces and data formats at the Los Alamos National Laboratory and the National Institute of Standards and Technology. Today his focus is on strategies for consistent data management and integration. He is a member of ASTM, where he pioneered AnIML technology as an architect, and is also a member of SiLA's board of directors.
- Dolgin, E. (2021). The race for antiviral drugs to beat COVID — and the next pandemic. Nature, 592, 340-343. https://doi.org/10.1038/d41586-021-00958-4
- Collins, F. (2021). Quoted in "Finding what works", The Economist, 27 March.
- Courtney, P. & Royall, P.G. (2021). Using Robotics in Laboratories During the COVID-19 Outbreak: A Review. IEEE Robotics and Automation Magazine, 28(1), 28-39.
- CENELEC (2021). Putting Standards in Science workshop – Organ on Chip, 28-29 April.
- Carson, N. (2020). Rise of the Robots. Chemistry–A European Journal, 26(15), 3194-3196.
- Coley, C.W., Eyke, N.S., & Jensen, K.F. (2020). Autonomous discovery in the chemical sciences part I: Progress. Angewandte Chemie International Edition, 59(51), 22858-22893.
- Schiermeier, Q. (2018). For the record: making project data freely available is vital for open science. Nature, 555(7696), 403-405. https://media.nature.com/original/magazine-assets/d41586-018-03071-1/d41586-018-03071-1.pdf
- Damiani, M. (2017). Building The Digital Biotech Company: Why and How Digitization is Mission-Critical for Moderna. White paper.
- Roe, R. (2020). Covid-19 accelerates the adoption of remote working tools. Scientific Computing World, Smartlab informatics supplement, Autumn, 22-25.