
Data Reporting: Connecting the Islands of Automation



In this article, we explore ways analytical scientists can improve their data reporting and sharing practices, strengthening data integrity and regulatory compliance.


Laboratory process automation in the pharmaceutical and related industries has been a topic of discussion since the 1980s. Supporting it are many technical advances in applications such as instrument data systems (e.g., chromatography data systems (CDS)), electronic laboratory notebooks (ELN), laboratory execution systems (LES) and laboratory information management systems (LIMS). In addition, there has been convergence between applications; a LIMS integrated with LES functionality means only one application must be implemented.


However, the problem is often not the functionality offered by an informatics application but the way a project is implemented in the laboratory. Common problems include:

  • Failure to design a fully electronic process for automation.
  • Not eliminating spreadsheets from a process.
  • Incorrect application configuration by over-interpreting regulations.
  • Not interfacing instruments, so that data must be entered into the application manually.
  • Partial instead of end-to-end process automation.


We will discuss some ways of implementing the informatics applications above to avoid running into these problems.


Although the focus in this article will be on good practice (GXP)-regulated laboratories, the principles outlined here are applicable to all. Even if GXP regulations do not apply, effective laboratory process digitalization will enhance business effectiveness, speed of decision making and collaboration inside and outside of an automated laboratory.


The aim of this article is to connect the islands of automation that exist in an ocean of paper in most laboratories, creating a fully electronic environment.

Process, process, process


The starting point for an effective implementation of any informatics application is to understand the workflow that you are automating. Typically, this is a two-stage process:

  1. Map and understand the current (as-is) process
    This needs to be a logical mapping of the process with a team of subject matter experts (SMEs) from the laboratory. They need to document the activities in the process, including those not in current SOPs, and understand the data inputs and outputs and the records created. Any bottlenecks in the process are identified and the causes listed. Paper printouts and data vulnerabilities that need to be resolved in the second part of the mapping process can also be documented. Lastly, improvement ideas for the process should be listed. Process maps can be drawn after the workshop for review later.

  2. Redesign the process to work electronically (to-be process)
    The first part of the second phase is to review the “as-is” process maps from the first phase. Inevitably there will be changes and this is normal. Once the current process maps are agreed, the improvement ideas can be used to challenge the as-is process. This includes removing bottlenecks, eliminating paper and interfacing analytical instruments to avoid manual data transfer and the associated transcription error checking. The “to-be” process should be used as a blueprint for automating the process with the selected informatics application. 
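One way to make the as-is map reviewable is to capture each activity in a small, structured record that flags paper outputs, manual transcription and bottlenecks. The sketch below is illustrative only and not any particular tool's schema, using a raw materials assay as the example:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ProcessStep:
    """One activity in the as-is laboratory process map."""
    name: str
    inputs: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    paper_record: bool = False          # step produces a printout to eliminate
    manual_transcription: bool = False  # data are retyped by an analyst
    bottleneck: Optional[str] = None    # cause, if this step delays the process

# Fragment of an as-is map for a raw materials assay (illustrative values)
as_is = [
    ProcessStep("Weigh sample", outputs=["weight"],
                paper_record=True, manual_transcription=True),
    ProcessStep("Run HPLC assay", inputs=["weight"], outputs=["peak areas"]),
    ProcessStep("Calculate result", inputs=["peak areas"], paper_record=True,
                bottleneck="spreadsheet printed and checked by a second analyst"),
]

# The data vulnerabilities the to-be design must resolve
for step in as_is:
    if step.paper_record or step.manual_transcription:
        print(f"Resolve in to-be process: {step.name}")
```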


Although redesigning a process takes time, the payoff is immense: the process is simplified and runs faster than before. If an application merely automates the as-is process, it will likely require more effort, as the process remains hybrid and less efficient, with zero return on investment for the parent organization. Therefore, management must resource and support the process redesign effort.

Process mistake one: Partial automation



During training courses that I have conducted, several attendees have stated that the first phase of a LIMS project is to automate sample management alone. This is a mistake, in my view, as a complete process is not being automated and no business benefit will be generated. A lot of effort will go into sample management, complete with pretty barcode labels, but there will be nowhere to use them during analysis.


Instead, ensure sample management is integrated into an end-to-end process that generates reportable results. This is not to say “automate the whole laboratory” but rather to ensure that, say, raw materials testing is automated in its entirety from receipt to release. Additional analytical processes such as in-process, finished product and stability testing can then be added incrementally.


This approach is vital for demonstrating to both users and management that the system works and delivers benefits.

Process mistake two: Failure to interface analytical instruments

If no analytical instruments are interfaced to an informatics application, how will analytical data be input to the software? Manually, obviously! However, it is better to consider now the impact of the updated EU GMP Annex 11, where validated technical controls in software are preferred to procedural controls operated by users:


Clause 2 ... Configuration hardening and integrated controls are expected to support and safeguard data integrity; technical solutions and automation are preferable instead of manual controls.1


The regulation also states that digitalization should be considered:


Clause 3. An update of the document with regulatory expectations to ‘digital transformation’ and similar newer concepts will be considered.1


In addition to the business benefits of interfacing now, there is the probability of regulatory pressure to automate in the future. Therefore, when planning laboratory automation, it is essential, from both business and regulatory perspectives, that key instruments are interfaced in the first phase of any project.
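As a concrete illustration, interfacing can be as simple as a validated transfer routine that reads an instrument's export file and posts each result to the informatics application. The sketch below assumes a hypothetical balance export with sample_id and weight_g columns and a hypothetical LIMS REST endpoint; neither is a real vendor interface.

```python
import csv

import requests  # any HTTP client would do

LIMS_URL = "https://lims.example.com/api/v1/results"  # hypothetical endpoint

def transfer_balance_results(export_path: str, api_token: str) -> None:
    """Post each weight in a balance export file to the LIMS,
    replacing manual transcription and second-person checking."""
    with open(export_path, newline="") as f:
        for row in csv.DictReader(f):  # assumed columns: sample_id, weight_g
            payload = {
                "sample_id": row["sample_id"],
                "weight_g": float(row["weight_g"]),
                "source": "balance-01",  # instrument identity for the audit trail
            }
            response = requests.post(
                LIMS_URL,
                json=payload,
                headers={"Authorization": f"Bearer {api_token}"},
                timeout=10,
            )
            response.raise_for_status()  # fail loudly rather than lose a result
```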

Process mistake three: Failure to eliminate spreadsheets



Spreadsheets are ubiquitous in laboratories: easily available, easy to use and a compliance nightmare.2,3,4 They are also a hybrid system, with signed paper printouts and an electronic file. The biggest problem is that, as a hybrid system, a spreadsheet precludes a fully electronic process. The calculations performed should be incorporated into the informatics application to prevent printouts and manual input of the data.


The exception to this rule is where a spreadsheet can be used within an application such as an ELN, where the application’s audit trail can monitor changes; even then, the data used in the spreadsheet calculation must be loaded automatically into the spreadsheet to avoid printing and transcription error checking. However, it is critical to check whether the application audit trail can track individual actions within the spreadsheet: if interim changes between an open and a save event cannot be tracked, this is a data integrity failure and you have merely exchanged a hybrid problem for an electronic one.
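To illustrate what moving a calculation into the application means, here is the kind of external-standard assay calculation that typically lives in a QC spreadsheet, expressed instead as a single validated function. The function and parameter names are illustrative; the inputs would arrive electronically from the balance and the CDS rather than being retyped.

```python
def external_standard_assay(
    area_sample: float,
    area_standard: float,
    conc_standard_mg_ml: float,  # from standard weight, purity and dilution
    dilution_factor: float,
    sample_weight_mg: float,
    sample_volume_ml: float,
) -> float:
    """Percent content (w/w) by external standardization: the classic
    spreadsheet calculation, moved inside the validated application."""
    conc_sample_mg_ml = (area_sample / area_standard) * conc_standard_mg_ml
    content_mg = conc_sample_mg_ml * sample_volume_ml * dilution_factor
    return 100.0 * content_mg / sample_weight_mg

# Example: areas from the CDS, weight from the balance, all transferred
# electronically (values are invented for the illustration)
result = external_standard_assay(152_340, 150_875, 0.50, 10.0, 250.3, 50.0)
```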

Process mistake four: Failure to eliminate blank forms

The FDA has required quality control (QC) laboratories to control blank forms since 1993,5 and this has been reiterated in regulatory guidance on data integrity from the MHRA, FDA and PIC/S.6,7,8 If uncontrolled blank forms are used, it is impossible to determine how many times the work has been performed before an “acceptable” result was obtained. Even where blank forms are controlled, the administrative measures required to ensure that data cannot be falsified impose a high overhead on the QA department, which must issue and reconcile each form.

Process mistake five: Failure to use electronic signatures

Although a process may be automated, some laboratories do not take the final step of using electronic signatures by the performer and reviewer of the analysis. This not only creates a hybrid system but also leaves the electronic records unlocked, so post-reporting changes could be made. This is unacceptable, and therefore the use of electronic signatures is the natural outcome of an electronic or digitalized process.
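Mechanically, an electronic signature binds the signer's identity, the date and time, and the meaning of the signature to the record content, after which the record is locked. A simplified sketch of the idea follows; it is not any particular application's implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

def sign_record(record: dict, signer: str, meaning: str) -> dict:
    """Bind a signature to a hash of the record content; once stored,
    any later change to the record breaks the hash and is detectable."""
    digest = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return {
        "record_sha256": digest,
        "signer": signer,    # printed name of the performer or reviewer
        "signed_at": datetime.now(timezone.utc).isoformat(),
        "meaning": meaning,  # e.g., "performed" or "reviewed"
    }

result = {"sample_id": "RM-0042", "assay_pct": 99.3}  # illustrative record
signature = sign_record(result, "A. Analyst", "performed")
```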

Turning principles into practice


That’s the theory for laboratory automation. To see how this is put into practice, we interviewed Roger Shaw of Digital Lab Consulting (DLC). His principles for enabling laboratory connectivity, data sharing and regulatory compliance are to:

  • Ensure analytical instruments are connected to instrument data systems or informatics applications.
  • Connect informatics systems together and communicate electronically between them.
  • Ensure each informatics system supports applicable regulatory requirements including data integrity.


From these principles, there are further requirements. Instrument standardization is important: it simplifies interfacing and allows a reduced validation approach once the first connection has been verified.


Unlike pharmaceutical manufacturing, where there are well-established data standards and protocols, there is a lack of data and interface standards for laboratory instruments. For some projects, Roger’s team has used instrument interfacing standards developed by the SiLA Consortium and data standards such as ISA-88 when combining process development and manufacturing data. For other projects, the data system from the instrument supplier provides a simple interface for connection.
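In the absence of a universal standard, one common pattern is to define a thin driver contract per instrument class, so that each new connection only has to implement a small, testable interface. The sketch below illustrates the pattern; it is not the SiLA 2 API.

```python
from abc import ABC, abstractmethod

class InstrumentDriver(ABC):
    """Minimal contract each interfaced instrument must satisfy.
    An illustration of the pattern, not the SiLA 2 interface."""

    @abstractmethod
    def identify(self) -> dict:
        """Return make, model and serial number for the audit trail."""

    @abstractmethod
    def read_result(self) -> dict:
        """Return the latest measurement with its unit."""

class PhMeterDriver(InstrumentDriver):
    """Driver for one instrument class; values below are stand-ins."""

    def identify(self) -> dict:
        return {"model": "pH-700", "serial": "PH-123"}

    def read_result(self) -> dict:
        # A real driver would talk RS-232 or TCP here; stubbed for the sketch.
        return {"value": 6.98, "unit": "pH"}
```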

 

Figure 1: Outline of a laboratory automation project. Credit: Bob McDowall.


Roger described a case study for a QC department in one site of a multinational pharmaceutical company, shown in Figure 1. The components of the project consisted of the following:

  • Gas and liquid chromatographs connected to a networked CDS.
  • pH meters and analytical balances connected to the instrument supplier’s data system, which is equivalent to an LES in that workflows can be defined to link instruments together for assays.
  • LIMS connected to the CDS and instrument data system.

 

The workflows between the three informatics applications are as follows (a sketch of the data exchanged appears after the list):

  • The LIMS is responsible for sample management and reporting of results to production.
  • The pH meters and balances are interfaced to the instrument data system, which acquires data from standards and samples and transfers the results of this work to the LIMS.
  • The sample identities and weights, and the reference standard purities are imported from the LIMS into the CDS for each batch of analysis.
  • Following chromatographic analysis and interpretation, all post-run calculations are incorporated into the CDS workflows and electronic signatures are used by the performer and reviewer of the batch.
  • When a reviewer signs the CDS report, all records are locked and can be viewed by authorized users but not changed unless unlocked.
  • The reportable result is transferred to the LIMS.
  • All sample test results are collated in the LIMS and a certificate of analysis (CoA) is generated electronically and electronically signed.


Business benefits as well as regulatory compliance are obtained with this approach.

Summary



This article has explored ways to improve data reporting and sharing practices while improving data integrity and ensuring regulatory compliance. To achieve these goals, it is imperative that the current process is analyzed to identify bottlenecks, uses of spreadsheets and blank forms, and data vulnerabilities. The redesigned process should eliminate as many of these problems as possible by working electronically with electronic signatures.


To ensure success, the applications selected to automate the new process must have adequate technical controls to ensure data integrity and regulatory compliance. Rather than being standalone, each application should be interfaced to transfer data and information electronically, enabling effective scientific collaboration.