Robust digital workflows can reduce errors and the time needed to compile data, leading to greater transparency and faster reporting, and helping biopharmaceutical companies reach regulatory submission sooner. Ken Forman, Senior Director of Product Strategy at IDBS, explains.
For many start-ups operating on limited investment capital, the ticking clock can be overwhelming. Teams must plan for post-approval activities while managing spend and waiting nervously for regulatory milestones such as their Prescription Drug User Fee Act (PDUFA) date to arrive.
Mature organisations must race to deliver key new patient therapeutics while meeting market revenue expectations. But first, there is the urgency of getting the new drug application (NDA) or biologics license application (BLA) filing into the hands of the relevant regulatory authorities. The increasing complexity of therapeutics, including monoclonal antibodies and cell and gene therapies, together with the need to build a manufacturing supply chain, be it captive, fully virtual or hybrid, adds pressure as QC teams scramble to determine which data is critical to capture and which will be relevant to file.
Savvy organisations anticipate these challenges early in the drug approval process. Biopharmaceutical Lifecycle Management (BPLM) is key to delivering novel therapies and lifesaving drugs to the world. Ideally, it encapsulates every stage of drug development including drug candidate identification, clinical trials establishing efficacy, manufacturing processes, and supply chain activities for delivering these therapies to patients. Managing product lifecycle data end-to-end is a huge challenge. Each of these activities typically lives in separate parts of an organisation; specialists, equipment and digital tools are customised for those needs. In addition, many organisations outsource parts of the lifecycle to partners like contract development and manufacturing organisations (CDMOs).
While opportunities to enhance speed to approval can present themselves everywhere, from faster onboarding and training of new personnel to applying lean principles to approval processes, one area often overlooked at the planning stage is a strategic approach to data selection, collection, management and reporting. Without that strategy, data can end up residing in siloed, disparate systems, or worse, on paper. In either case, the lack of common context creates data integrity challenges that can put a successful filing at risk.
The US Food and Drug Administration (FDA) and the European Medicines Agency (EMA) actively promote a quality by design (QbD) approach to ensure product quality. However, the ability to use statistical and risk-management tools is often limited by the lack of data visibility and accessibility. Deploying a digital workflow with a common data backbone can shortcut the path to a deep understanding of data across the product lifecycle—and can clear the way for faster regulatory approvals.
Establish the digital workflow
When racing to file, it can be tempting to push teams towards the solutions that appear to be the easiest up front. An extreme but unfortunately still common example is the dreaded spreadsheet workflow.
Can spreadsheets be quickly set up to capture data by hand? Of course they can. But what happens next? Manually entered data must be reviewed and approved at the point of capture. Then, it must be re-contextualised for comparison and analysis with data in other spreadsheets. Finally, it must be reformatted for regulatory authorities, with yet another review and approval cycle. Only then is it ready for submission. As the amount and complexity of data increase, so do the timelines and the risks to the filing’s accuracy.
A key starting point is to design the digital workflow. The science has already identified the data most critical to the drug manufacturing process, in the form of clearly defined Critical Quality Attributes (CQAs) and Critical Process Parameters (CPPs). But how will that data be gathered, and how will it be managed, monitored, analysed and reported on?
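To make the idea concrete, here is a minimal sketch, using hypothetical attribute names, units and ranges, of what capturing CQAs and CPPs as structured, machine-readable records (rather than free-text spreadsheet cells) might look like:

```python
# Hypothetical CQAs and CPPs captured as structured records instead of
# free-text spreadsheet cells; names, units and ranges are illustrative only.
from dataclasses import dataclass

@dataclass
class Attribute:
    name: str
    unit: str
    low: float
    high: float
    kind: str   # "CQA" (quality attribute) or "CPP" (process parameter)

WORKFLOW_DEFINITION = [
    Attribute("aggregate_content", "%", 0.0, 2.0, "CQA"),
    Attribute("culture_temperature", "degC", 36.0, 37.0, "CPP"),
    Attribute("culture_pH", "pH", 6.9, 7.3, "CPP"),
]

def within_range(attribute: Attribute, value: float) -> bool:
    """Defining limits up front makes capture, monitoring and reporting consistent."""
    return attribute.low <= value <= attribute.high

print(within_range(WORKFLOW_DEFINITION[0], 1.4))   # True: inside the defined range
```

Once the critical attributes and their limits exist as data rather than prose, every downstream step, from monitoring to regulatory reporting, can refer to the same definitions.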
With an outsourced supply chain, eliminating spreadsheets and paper may not be a feasible starting point. This is especially true in the context of long-established partner relationships, where legacy processes often involve passing secure spreadsheets along with PDF documents such as Certificates of Analysis. Still, approaching old agreements with a keen interest in improving speed and data quality can benefit both sides.
Designing a digital workflow is in itself an enlightening exercise. It exposes the risk points that exist every time data is generated, touched, moved or altered. A thoughtful design process can also pinpoint slowdowns; perhaps sponsors are waiting too long for data from partners, or vice versa.
Before making changes, it is important to identify the highest-priority challenges. Then, incorporating digital tools can improve partner collaboration and streamline data management and sharing. A centralised GxP data backbone, for example, can enhance data integrity by creating a simpler architecture for storing and finding data and by minimising error-prone touch points.
A data backbone can also make it easier for all partners to view and access the right data at the right time. Tools such as 21 CFR Part 11-compliant, cloud-based products enhance visibility into the backbone. Whether new tools and procedures are provided by a vendor or developed in-house, however, change takes time, so it is best to get an early start on these activities.
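As a rough sketch of the backbone idea, and not of any particular product, the snippet below pairs each captured result with its context and a time-stamped audit trail of who did what and when, the kind of traceability that electronic-record regulations such as 21 CFR Part 11 expect. All field names here are invented for illustration.

```python
# Illustrative record structure for a shared data backbone: each result
# carries its context plus a time-stamped audit trail. Field names are
# invented for this sketch, not taken from any specific product.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class AuditEntry:
    user: str
    action: str            # e.g. "created", "reviewed", "approved"
    timestamp: str

@dataclass
class BackboneRecord:
    batch_id: str
    parameter: str
    value: float
    unit: str
    source_system: str     # where the value was originally captured
    audit_trail: List[AuditEntry] = field(default_factory=list)

    def log(self, user: str, action: str) -> None:
        """Append a time-stamped entry every time the record is touched."""
        self.audit_trail.append(
            AuditEntry(user, action, datetime.now(timezone.utc).isoformat())
        )

record = BackboneRecord("B-2023-014", "culture_pH", 7.05, "pH", "bioreactor PLC")
record.log("a.analyst", "created")
record.log("q.reviewer", "reviewed")
print(record.audit_trail)
```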

Building process robustness
According to the Product Quality Research Institute: “The ability of a manufacturing process to tolerate the expected variability of raw materials, operating conditions, process equipment, environmental conditions, and human factors is referred to as robustness.”
Many companies see this as a post-approval concern but, in reality, adopting a culture of process robustness even during process performance qualification (PPQ) and clinical trials manufacturing improves the odds of a successful approval and may even accelerate it. The BLA form itself calls out the requirement for meeting “Good manufacturing practice regulations in 21 CFR Parts 210, 211 or applicable regulations, Parts 606, and/or 820.” Adherence to GMP can be demonstrated most quickly and easily through sustained attention to process robustness.
To avoid slowdowns in a process robustness programme, consider where time is best spent on data analysis and understanding. While regulatory guidelines and requirements demand comprehensive data collection, review and reporting, a monitor-by-exception approach can focus resources on the problems that need immediate attention. This is done through alerts that fire when data violates defined specification limits or drifts from target in a way that breaks a defined ruleset, such as the Nelson or Western Electric rules.
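As a minimal sketch of monitor-by-exception, the function below flags only the batch results that violate specification limits, fall outside the 3-sigma control limit (Nelson rule 1) or form a long run on one side of the centre line (Nelson rule 2). The limits, values and default run length are illustrative; a real programme would take them from the validated control plan.

```python
# Minimal monitor-by-exception sketch; limits and thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class Alert:
    index: int
    value: float
    reason: str

def monitor_by_exception(values, lsl, usl, mean, sigma, run_length=9):
    """Return alerts only for points that need attention.

    - Specification check: value outside [lsl, usl].
    - Nelson rule 1: value more than 3 sigma from the centre line.
    - Nelson rule 2: run_length consecutive points on one side of the centre line.
    """
    alerts = []
    same_side_run = 0
    last_side = 0
    for i, v in enumerate(values):
        if v < lsl or v > usl:
            alerts.append(Alert(i, v, "outside specification limits"))
        if abs(v - mean) > 3 * sigma:
            alerts.append(Alert(i, v, "beyond 3-sigma control limit (Nelson rule 1)"))
        side = 1 if v > mean else -1 if v < mean else 0
        same_side_run = same_side_run + 1 if (side == last_side and side != 0) else 1
        last_side = side
        if side != 0 and same_side_run >= run_length:
            alerts.append(Alert(i, v, f"{run_length} points on one side of centre (Nelson rule 2)"))
    return alerts

# Example: only the out-of-trend result is surfaced for review.
print(monitor_by_exception([5.1, 5.0, 5.2, 7.9], lsl=4.0, usl=6.0, mean=5.0, sigma=0.3))
```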
This is again where the digital workflow plays a key role. Sloppy data is a “human factor” that can decrease robustness. Every manual step in capturing manufacturing process and analytical data adds work, increases the probability of missing or inaccurate values and compromises data integrity. Companies do not want scientists painstakingly transcribing data from batch records to spreadsheets for continued process verification (CPV) when they should be monitoring drug material and process performance, but this is still happening. Good data is a basic requirement for determining whether processes are performing as expected.
If teams want to design robust processes quickly, they need to be able to analyse and react to process data efficiently. A monitor-by-exception approach can focus attention on areas of clear concern, but achieving process robustness also requires people who can apply statistical methods to identify the more obscure problematic trends that may be developing; engineers who are well grounded in mathematics and statistics are key to this effort. Additionally, many companies are pursuing artificial intelligence (AI) and machine learning (ML) models to spot trends, but because of the relatively small amounts of data generated in trials, these approaches are still in their infancy.
Advanced process control modelling tools, including bioprocess scale-down models, have long been recognised and valued by regulatory agencies. But moving from reactive monitoring to proactive process control requires justifying the predictive capabilities of these models with real data. Many companies also use statistical software for design of experiments (DOE) and multivariate data analysis (MVDA) to gain better insight into sources of variability. The core challenge of systematically capturing process development and manufacturing data, and making it both contextually relevant and easily accessible for analysis, remains an obstacle to progress, however. The same challenge appears in the technology transfer process.
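For readers less familiar with DOE, the sketch below generates a simple two-level full-factorial design for three hypothetical process parameters. The factor names and ranges are invented; real studies would use validated DOE software and ranges justified by process knowledge.

```python
# Illustrative two-level full-factorial design for three hypothetical
# process parameters; levels and names are placeholders.
from itertools import product

factors = {
    "temperature_C": (30.0, 37.0),    # hypothetical low/high levels
    "pH": (6.8, 7.2),
    "feed_rate_mL_h": (10.0, 20.0),
}

design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for run_no, run in enumerate(design, start=1):
    print(run_no, run)   # 2^3 = 8 runs covering every combination of levels
```

Running all eight combinations lets the team estimate how each factor, alone and in combination, drives variability in the responses of interest.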
Role of tech transfer
Technology transfer at different stages of the biopharmaceutical lifecycle, both between departments within the same organisation and, increasingly, to outsourced external partners, adds another set of challenges. The lack of standardisation in data management and process documentation, combined with limited digital data sharing, means that tech transfer can be problematic for all involved.
In principle, standards such as ISA-88 for batch control should help minimise uncertainty and miscommunication when development, manufacturing, and quality assurance information is transferred between teams.
Two aspects of ISA-88 are particularly relevant for this purpose: the separation of process requirements from equipment capability, and the modular design approach. Bioprocess engineers are already familiar with the concept of scale-up and scale-down assessments and with defining an overall process as a sequence of unit operations. The focus of process development is the general recipe: the processing sequence required to produce a given product regardless of scale. The general recipe is then transformed into a master recipe when the process is run on a particular set of equipment, which is the essence of tech transfer.
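A simplified sketch of that separation is shown below: the general recipe describes scale-independent process requirements as unit operations, and a mapping to a specific site's equipment produces the master recipe. The class names, products and equipment identifiers are invented for illustration and do not follow the ISA-88 object model in detail.

```python
# Sketch of the ISA-88 idea of separating the general recipe (what the
# process requires) from the master recipe (how one plant's equipment
# delivers it). Names and structures are illustrative only.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class UnitOperation:
    name: str
    parameters: Dict[str, float]          # scale-independent process requirements

@dataclass
class GeneralRecipe:
    product: str
    operations: List[UnitOperation] = field(default_factory=list)

@dataclass
class MasterRecipe:
    product: str
    site: str
    steps: List[Dict] = field(default_factory=list)

def to_master_recipe(general: GeneralRecipe, site: str, equipment_map: Dict[str, str]) -> MasterRecipe:
    """Bind each scale-independent unit operation to a specific piece of equipment."""
    master = MasterRecipe(product=general.product, site=site)
    for op in general.operations:
        master.steps.append({
            "operation": op.name,
            "equipment": equipment_map[op.name],   # e.g. a named bioreactor at the site
            "parameters": op.parameters,
        })
    return master

general = GeneralRecipe("mAb-X", [
    UnitOperation("production_culture", {"temperature_C": 36.5, "duration_days": 14}),
    UnitOperation("harvest_clarification", {"target_turbidity_NTU": 20}),
])
master = to_master_recipe(general, site="Site A", equipment_map={
    "production_culture": "BR-201",
    "harvest_clarification": "DF-101",
})
print(master)
```

Because the process requirements are kept separate from the equipment binding, the same general recipe can be transferred to another site simply by supplying a different equipment map.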
With the right digital standards for effective data sharing in place, tech transfer could even support continuous improvement and enable learnings gained at different scales and/or sites to focus development and optimisation efforts on the areas of greatest impact to product quality and yield.
The reality is more complex, however. Many commercial systems, such as manufacturing execution systems (MES), are technically ISA-88-compliant. But ISA-88 provides standard structure rather than standard code. That means the actual implementation can vary significantly even within the same company. In addition, many systems are heavily focused on execution rather than process definition. As a result, they do not easily support a lifecycle approach.
To make the most of tech transfer, organisations need more precise digital standards. A promising approach is to use BatchML (Batch Markup Language) and B2MML (Business to Manufacturing Markup Language), XML schemas designed to facilitate the exchange of process definitions while retaining ISA-88-compliant structures.
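The fragment below sketches what exchanging a recipe definition as structured XML can look like. It is loosely modelled on the BatchML idea of a machine-readable master recipe; the element names are simplified placeholders rather than the exact BatchML schema.

```python
# Minimal sketch of a recipe definition serialised as XML for exchange.
# Element names are simplified placeholders, not the exact BatchML schema.
import xml.etree.ElementTree as ET

recipe = ET.Element("MasterRecipe", attrib={"ID": "mAb-X-001", "Version": "1.0"})
step = ET.SubElement(recipe, "UnitProcedure", attrib={"ID": "production_culture"})
param = ET.SubElement(step, "Parameter", attrib={"ID": "temperature"})
ET.SubElement(param, "Value").text = "36.5"
ET.SubElement(param, "UnitOfMeasure").text = "degC"

print(ET.tostring(recipe, encoding="unicode"))
# Sender and receiver parse the same structure, so the process definition
# survives transfer without manual re-entry or re-interpretation.
```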
Let’s achieve harmony…
One last challenge for effective data management is the lack of common terminology for processes.
Over the last decade, many pharmaceutical companies have undertaken internal ‘harmonisation’ projects, which aim to standardise the terms employees use for procedures and systems. Still, discrepancies arise through organic growth, as new sites are set up worldwide and develop their own internal procedures, especially when manufacturing new products. Discrepancies are also exacerbated as large biopharma companies grow through acquisitions and inherit conflicting vocabularies and schemas from the companies they acquire. The longer they wait to address data exchange, the more disruption it causes.
The lack of common terminology for parameter naming, for example, might lead to confusion among process engineers. But it can also cause more serious discrepancies between in-process control data supplied from two different sites which use different parameters for quality comparison. This can lead to poor product release decisions and even FDA ‘Form 483’ write-ups around data integrity.
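A hypothetical illustration of the fix: before data from two sites is compared, site-specific parameter names are mapped onto one shared vocabulary. The names below are invented for the example.

```python
# Hypothetical mappings from site-specific parameter names to a shared vocabulary.
SITE_A_TO_COMMON = {
    "Temp (C)": "temperature_C",
    "DO%": "dissolved_oxygen_pct",
}
SITE_B_TO_COMMON = {
    "Jacket Temperature": "temperature_C",
    "Dissolved O2": "dissolved_oxygen_pct",
}

def harmonise(record: dict, mapping: dict) -> dict:
    """Rename parameters to the common vocabulary, keeping unmapped keys visible."""
    return {mapping.get(name, name): value for name, value in record.items()}

site_a_batch = {"Temp (C)": 36.4, "DO%": 42.0}
site_b_batch = {"Jacket Temperature": 36.7, "Dissolved O2": 40.5}
print(harmonise(site_a_batch, SITE_A_TO_COMMON))
print(harmonise(site_b_batch, SITE_B_TO_COMMON))
# Both records now describe the same quality parameters in the same terms.
```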
While a fully harmonised and digitised biopharmaceutical lifecycle is still a long way off, there are bright spots. Allotrope, a project for analytical data standardisation, and collaborations between biopharmaceutical companies and software vendors, such as the BioPhorum Operations Group, are showing progress. The outputs of these efforts, such as the BioPhorum position paper on continued process verification, are essential to moving away from highly customised point solutions and towards shared digital data standards.
The drivers for reducing the time to prepare for regulatory filings are clear. Patients are waiting for new biopharmaceutical therapeutics to become available. Start-ups and mature companies alike are trying to manage cash flow and meet financial expectations. While digital strategies may not be the first solutions that come to mind, the ability to manage and securely share data across a network of manufacturing partners and regulatory agencies is fundamental to optimising this process. Greater insight into the entire biopharmaceutical lifecycle can improve the availability of effective therapies worldwide.
DDW Volume 24 – Issue 1, Winter 2022/2023
About the author:
Ken Forman has over 28 years of experience and expertise in IT, Operations, and Product & Project Management, focused on the software and pharmaceutical space. Prior to joining Skyland Analytics, which is now part of IDBS, Forman served as Director, Project Management, NAM at BIOVIA Dassault Systemes and held multiple director positions at Aegis Analytical.