The Winter issue of the Drug Discovery World Magazine is accompanied by an exclusive DDW & SLAS2022 supplement ahead of SLAS2022 in Boston, US on 5-9 February. Sophia Ktori outlines the importance of data integrity in labs at every stage of product development and manufacturing on behalf of SiLA.
The concept of data integrity hinges on the need to safeguard and guarantee the consistency, completeness and accuracy of data throughout its lifecycle. Ultimately, lab-based operations must ensure that raw and derived data cannot be manipulated and are fully traceable. The ALCOA acronym – and its updated ALCOA+ extension – denotes the major assurances that must be in place: data should be attributable, legible, contemporaneous, original and accurate, and, under ALCOA+, also complete, consistent, enduring and available.
For labs in pharma and other regulated sectors, maintaining data integrity is critical at every stage of product development and manufacturing. Accidentally or intentionally changed or omitted data could ultimately affect product and patient safety, even if it is just a simple transcription error. And when data integrity issues are flagged at a regulatory audit, hefty fines, site shutdown and even lawsuits may ensue. It’s a growing concern: a significant portion of the growing number of FDA warning letters1 issued year on year highlights data integrity violations or insufficiencies.
At first sight, data integrity issues might seem primarily a problem for paper-based processes, but laboratory digitalisation and automation are not guaranteed solutions. Digital systems may be just as vulnerable to data integrity breaches if controls are inadequately validated, if users can manipulate data, record manually into the system or transfer unsecured files, and/or if traceability and audit trails are lacking.
In fact, labs will likely present both digital and paper-based vulnerabilities. “The lab will typically have a LIMS, and a range of instruments, but then there is likely to be a lot of paper in between, so much work still has to be done manually,” explains Christophe Girardey, CSV & QA at Wega Informatik, which specialises in IT and informatics solutions for the lab, R&D, and clinical development sectors. Use any paper-based process and there is always the possibility that some of your data gets metaphorically or literally thrown in the waste paper basket, Girardey comments. “And, of course, that represents an immediate issue when considering ALCOA+ principles.”
The situation is compounded by the inevitability that today’s labs will likely house multiple hardware and software systems that need to be integrated to interact and communicate instructions, data and results, potentially in multiple directions, both within and out of the lab. “Historically, instrument integration and data transfer was based on file exchange, orchestrated by a LIMS or ELN, for example,” points out Burkhard Schaefer, head of partner management at the non-profit SiLA (Standardization in Lab Automation) consortium. “However, this mechanism is error prone, can be manipulated, and is incomplete.”
In fact, whenever there is a physical file, there will be potential data integrity issues around folder and file protection, because someone will ultimately need to retrieve and read that file. Ideally, seamless instrument and software connectivity would be based on a fileless communication channel. “Errors, missing data and disruptions can then easily be detected, identified and reported,” Schaefer adds.
SiLA has been developed as a communication standard that supports the assurance of data integrity through fileless communication, “which means that there are no files on the disk, so there is no opportunity for manipulation, loss, or mistakes,” Schaefer states. Set up an integrated hardware and software infrastructure that supports both SiLA as the communication standard and AnIML (Analytical Information Markup Language) as the standard for data, and you then also have “security in the accuracy, traceability and integrity of your key raw and results data, and all metadata”.
Results data derived from raw data is then digitally signed, and assigned to the correct test or experiment protocol and samples in the LIMS system. “The digital signature effectively proves that data has not been manipulated since it left the instrument,” Schaefer says. “So you have this entire chain of trust that extends from the instrument.”
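The tamper-detection idea behind this chain of trust can be sketched in a few lines. The following is a hypothetical illustration only, not SiLA or AnIML code: it uses a shared-secret HMAC as a stand-in for the asymmetric digital signature a real instrument would apply, and the key, field names and sample values are invented for the example.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret standing in for the instrument's private key;
# a real deployment would use an asymmetric signature scheme instead.
INSTRUMENT_KEY = b"instrument-secret"

def sign_result(payload: dict) -> dict:
    """Instrument side: attach a signature over the canonicalised payload."""
    raw = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(INSTRUMENT_KEY, raw, hashlib.sha256).hexdigest()
    return {"payload": payload, "signature": sig}

def verify_result(record: dict) -> bool:
    """LIMS side: has the data changed since it left the instrument?"""
    raw = json.dumps(record["payload"], sort_keys=True).encode()
    expected = hmac.new(INSTRUMENT_KEY, raw, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])

record = sign_result({"sample": "S-042", "absorbance": 0.318})
assert verify_result(record)            # untouched data verifies
record["payload"]["absorbance"] = 0.5   # simulated manipulation
assert not verify_result(record)        # tampering is detected
```

Any change to the payload after signing, however small, invalidates the signature, which is what lets the LIMS prove the result is exactly what the instrument produced.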
On top of supporting data integrity imperatives, the SiLA and AnIML standards support data security. SiLA employs transport-layer security (TLS) out of the box, and supports certificate-based authentication. “An instrument will recognise communication by an authorised client (software), and will not accept commands from unauthorised parties,” Schaefer comments. “Similarly, the control software, which might be a LIMS, can positively identify an instrument, which then prevents ‘man in the middle’ attacks, where someone tries to usurp the place of the validated instrument.”
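The mutual authentication Schaefer describes maps onto standard TLS configuration. As a hedged sketch using Python’s standard `ssl` module (the certificate file paths in the comments are placeholders, not real SiLA artifacts): both sides require a certificate from the other, so an instrument rejects unauthorised clients and a LIMS can positively identify the instrument.

```python
import ssl

def make_instrument_context() -> ssl.SSLContext:
    """Instrument side: present its own certificate and demand one
    from the connecting client (e.g. a LIMS)."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.verify_mode = ssl.CERT_REQUIRED  # reject unauthenticated clients
    # ctx.load_cert_chain("instrument.pem", "instrument.key")  # placeholder paths
    # ctx.load_verify_locations("lab-ca.pem")
    return ctx

def make_lims_context() -> ssl.SSLContext:
    """LIMS side: verify the instrument's certificate and hostname,
    preventing man-in-the-middle substitution of the validated instrument."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    # ctx.load_cert_chain("lims.pem", "lims.key")  # placeholder paths
    # ctx.load_verify_locations("lab-ca.pem")
    return ctx
```

With both contexts requiring certificates issued by a trusted lab CA, a handshake only succeeds between an authorised client and a genuine instrument.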
Importantly for data integrity considerations, SiLA enables a modular validation strategy. “Each instrument, and more precisely, the SiLA interface, can be validated separately,” Schaefer says. That same process is carried out for every instrument separately and independently, and also for every client. “So I can demonstrate for each client, say a LIMS or ELN, that it properly implements the standard and can handle the results that come back … Adding a new instrument system effectively won’t change the validation status of existing systems, which makes the whole environment easily scalable, even in a regulated sector.”
As Girardey adds: “This is a major benefit of using a standard such as SiLA, in that you don’t need to validate each time you interface a new instrument with your LIMS, or even just upgrade your firmware. A typical challenge for labs is that each time they upgrade an instrument they need to consider validation, but using SiLA the process becomes more modular, so you can ensure that if you have several devices of the same type, you only need to test once, and then you can assume that the others would work well. And as soon as you add standardisation, you have on the one hand, reduced programming time and integration effort, but also, automatic support for data integrity obligations.”
In fact, supporting standards represents a win-win situation for every stakeholder in the lab, Girardey continues: “Lab managers can evidence how using standards to facilitate instrument and software integration saves time, by removing the need for manual data input or transfer, and removing the need for a four-eyes check on results transfer, which effectively saves on resources. The scientists are happy because they can then spend more time doing their science, rather than completing documentation. Quality managers are more confident of their processes as data is traceable, and can’t be manipulated.”
Volume 23, Issue 1 – Winter 2021/22 | SLAS2022 supplement
About the author
Sophia Ktori has been a freelance writer and editor for 25 years, with a focus on life science sectors, including pharma, biotech and medtech. She also has a keen interest in emerging ‘omics technologies, and in the continuing evolution of laboratory informatics and automation.