PDA Letter Article

Data Integrity and the Preapproval Inspection

by John Godshalk, Biologics Consulting

An Inspector’s Experience

Data integrity has always been an important part of application review and cGMP inspections. In recent years, however, it has become even more important due to the U.S. FDA's renewed emphasis on the integrity of data in both electronic and paper-based formats. FDA has seen more data integrity issues of late, possibly due to the increased use of computers and automation and the growing complexity of processes. The recent FDA draft guidance, which came out in April (1), is an important step by FDA to clarify its current thinking and policy on data integrity.

When I was an FDA inspector, it was always my job to read the BLA or NDA, then perform a preapproval inspection (PAI) based upon that information, looking for possible changes or inconsistencies along the way. In a similar vein, the recent guidance emphasizes that data must be reliable and accurate, and that companies are expected to have risk mitigations in place to reduce or eliminate possible data integrity issues.

I thought it would be interesting to look at some data integrity audit/inspection observations I have made over my 15 years in the industry. I begin with the regulatory authority application.

Conformance to the Application

Data integrity matters for a regulatory authority application in two ways: the application must agree with the original data, and the process observed during the PAI must agree with the application. Performing a QA check on the application is an important internal control to make sure the data in the application is in complete agreement with the original data.

During a major review of the CMC section for a BLA, I discovered several cases where data tables were either transposed in error or contained wrong data or metadata (usually typos, copying of the wrong line, or incorrect transposition).

During a PAI I conducted as an FDA inspector, another inspector and I discovered that the process was different from the one described in the BLA. We were inspecting a biologic, and the company had two filters on the line prior to filling. Filtration could have an effect on the product, so we were very concerned that the filters were not in the original BLA filing (the BLA contained a drawing of the filling setup with no filters present). This finding went on the 483, and the company had to justify the use of the filters (i.e., show data to support their use), submit the extractables/leachables and filtration validation data, and amend the BLA prior to approval.

Laboratory Records

As a practical example, an inspector can audit a laboratory assay at the notebook level, follow that same data as it is transcribed into a LIMS, and then review a printout or datasheet to ensure the data remains the same across the board. There are several examples from my experience in this category, some involving a LIMS and some not:

  • Illegible printouts from an instrument in a batch record
  • Data reported differently in a LIMS than in the method validation
  • Use of unverified, or unlocked, Excel spreadsheets
  • Method in batch record does not match method used in lab
  • Record not signed or initialed (i.e., cannot determine who completed the work)

Admittedly, illegible printouts seem basic, but you never know what you are going to find in an audit, and inspectors have come across them. Of course, it is always a good idea for a second person to review the batch record; in this case, the QA review somehow missed it. Use of unverified Excel spreadsheets is fairly common, at least in clinical trial phase production. Verifying the calculations in a spreadsheet (and then locking it) will go a long way toward ensuring data integrity. During clinical trial phase production, test methods change, and in the case above the company did not update the method referenced in the batch record. One could argue that this is not strictly a data integrity issue, just a method difference, but I would answer that one cannot rely on the data without knowing the method and whether it is validated.
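The spreadsheet verification step described above can be sketched in a few lines: independently recompute the calculated column and flag any row where the stored result disagrees. This is a minimal sketch using hypothetical assay data and a hypothetical formula (result = peak area / standard area x standard concentration), not any particular company's method; a real verification would read the rows from the spreadsheet file itself.

```python
# Minimal sketch (hypothetical assay data and formula): independently
# recompute a spreadsheet's calculated column and flag mismatches, as a
# verification step before the spreadsheet is locked for GMP use.

TOLERANCE = 1e-6  # acceptable rounding difference

def verify_rows(rows):
    """Return (sample_id, expected, found) for each row whose stored
    'result_mg' does not match an independent recomputation."""
    mismatches = []
    for row in rows:
        # Hypothetical formula: result = peak_area / std_area * std_conc_mg
        expected = row["peak_area"] / row["std_area"] * row["std_conc_mg"]
        if abs(expected - row["result_mg"]) > TOLERANCE:
            mismatches.append((row["sample_id"], expected, row["result_mg"]))
    return mismatches

rows = [
    {"sample_id": "S-001", "peak_area": 1200.0, "std_area": 1000.0,
     "std_conc_mg": 50.0, "result_mg": 60.0},   # agrees with recomputation
    {"sample_id": "S-002", "peak_area": 900.0, "std_area": 1000.0,
     "std_conc_mg": 50.0, "result_mg": 44.0},   # typo: should be 45.0
]

print(verify_rows(rows))  # flags S-002
```

The point of the sketch is the control, not the formula: every calculated value gets an independent check, and only a verified, locked copy goes into GMP use.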

Batch Records

It's also not uncommon for an inspector to review a batch record. If the lab conducts an in-process test, an auditor/inspector could look at the batch record with a list of questions in mind: Who took the sample, and when? Who ran the test? Can the entry be read (e.g., initials, result)? Was the result recorded at the same time the sample was taken? Is the pertinent information complete? Is the information recorded in the lab notebook or LIMS consistent (identical) with the data in the batch record? If there are copies (from an instrument, for example), are they original/true copies? A good, easy-to-remember acronym for this, as mentioned in the guidance, is ALCOA (Attributable, Legible, Contemporaneous, Original, and Accurate). Another way to remember it is to think of the C's: Contemporaneous, Consistent, Complete, and Correct, the same principles found in good documentation practice. Using good document control practices can go a long way toward eliminating data integrity issues.

I have seen the following examples of incomplete data integrity in a batch record:

  • Illegible copies in a GMP record
  • Incomplete records (missing data with no explanation)
  • Batch records not updated for current process
  • Electronic batch record (EBR) data not validated properly (e.g., process controller validation missing a data table communication check between the PLC and the EBR database)
  • Poor/unreadable/illogical notes for a deviation

Illegible copies should not be in a batch record, and ideally, QA should catch this upon review if it does happen. Sometimes a record is simply missing, and if this is the case, an internal audit or QA should ideally discover this omission. It’s important to always update the batch record to reflect the current process; periodic review of the batch record by production and QA is helpful in this regard.

EBRs are not yet widely used, but I think they will be the norm one day. As EBRs gain acceptance, data integrity, as verified through validation, becomes even more important. EBR validation must include validation of data transfers. For example, where there is data table communication between a PLC and the EBR, it is important to verify that the data table elements and values match between the controller and the EBR. This exercise validates the data table integrity/communication between the two systems.
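The data table check described above amounts to a field-by-field comparison between what the controller holds and what the EBR recorded. Here is a minimal sketch with hypothetical tag names; in a real validation exercise the two tables would be read from the PLC (e.g., over OPC) and from the EBR database, not hard-coded.

```python
# Minimal sketch (hypothetical tag names): compare data table values from a
# PLC against the corresponding records in the EBR database, the kind of
# check an EBR validation protocol might script.

def compare_tables(plc_table, ebr_table, tolerance=0.0):
    """Return discrepancies as {tag: (plc_value, ebr_value)}; a tag missing
    on either side is reported with None for the missing value."""
    diffs = {}
    for tag in set(plc_table) | set(ebr_table):
        p, e = plc_table.get(tag), ebr_table.get(tag)
        if p is None or e is None:
            diffs[tag] = (p, e)          # element missing on one side
        elif abs(p - e) > tolerance:
            diffs[tag] = (p, e)          # value mismatch
    return diffs

plc = {"TT-101.PV": 37.2, "PT-201.PV": 1.05, "FT-301.PV": 12.0}
ebr = {"TT-101.PV": 37.2, "PT-201.PV": 1.15}  # one wrong value, one missing tag

print(compare_tables(plc, ebr))  # flags PT-201.PV and FT-301.PV
```

A passing run (an empty result) documents that every data table element transferred intact; any entry in the result is a validation discrepancy to investigate.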

In EBRs, electronic signatures are considered equivalent to handwritten signatures for master batch records (master production and control records) and must conform to Part 11. Appropriate controls must be used to associate the record with the e-signature and with the person who is signing.

Metadata is information that gives meaning and context to data. For example, I have sometimes seen a bare "88" in a batch record as the result of a calculation, and I had to explain to the client that it should read "88 mg" so that anyone reading the batch record could understand it. Metadata is needed to reconstruct cGMP records and therefore must be associated with, and retained in, electronic records (see 211.188 and 211.194). The guidance notes that electronic data used to meet cGMP requirements should include relevant metadata. This is important for cGMP databases (such as electronic deviation systems) and EBRs.
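One way to make this concrete is to store a result as a structured record rather than a bare number, so the metadata travels with the value. The fields below are illustrative (my own hypothetical choices), not a prescribed schema:

```python
# Minimal sketch (hypothetical fields): a result stored with the metadata
# that gives it meaning and context, rather than as a bare number like "88".
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Result:
    value: float
    unit: str              # "88" alone is ambiguous; "88 mg" is not
    analyst: str           # attributable: who produced the value
    method_id: str         # context: which validated method was used
    recorded_at: datetime  # contemporaneous: when it was recorded

def render(r: Result) -> str:
    """Render the value with its unit for human-readable records."""
    return f"{r.value:g} {r.unit}"

r = Result(88.0, "mg", "jdoe", "TM-014 v3",
           datetime(2016, 4, 14, 9, 30, tzinfo=timezone.utc))
print(render(r))  # "88 mg"
```

With the metadata attached, the record can be reconstructed and understood later, which is exactly what 211.188 and 211.194 are after.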

Control of cGMP Computer Systems

cGMP computer systems must have access controls to assure that data entry and changes to records are done by authorized personnel (211.68[b]). The guidance emphasizes this for EBRs (computerized master production control records or MPCR) and other electronic records. As an auditor, I’ve seen sharing of logins and uncontrolled Excel spreadsheets.

Logins should not be shared (data entry and changes must be controlled and attributable), and Excel spreadsheets, if used, must be verified and locked.
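The control behind both points is the same: a change is accepted only from an authorized individual login, and every change is logged against that individual. The sketch below illustrates the idea with a hypothetical user list and audit-trail shape; it is not any vendor's implementation of 211.68(b).

```python
# Minimal sketch (hypothetical users and record shape): accept a record
# change only from an authorized individual login, and append an
# attributable, contemporaneous audit-trail entry for every change.
from datetime import datetime, timezone

AUTHORIZED = {"jdoe", "asmith"}   # individual accounts only; no shared logins
audit_trail = []

def change_record(record, field, new_value, user):
    if user not in AUTHORIZED:
        raise PermissionError(f"{user!r} is not an authorized individual login")
    audit_trail.append({
        "field": field,
        "old": record.get(field),
        "new": new_value,
        "user": user,                                   # attributable
        "at": datetime.now(timezone.utc).isoformat(),   # contemporaneous
    })
    record[field] = new_value

batch = {"yield_pct": 92.1}
change_record(batch, "yield_pct", 92.4, "jdoe")      # allowed and logged
# change_record(batch, "yield_pct", 93.0, "qclab")   # shared login: rejected
```

A shared login like "qclab" defeats the control because the trail can no longer say who made the change, which is why the entry is refused rather than logged anonymously.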

The guidance notes that electronic copies are considered accurate reproductions and can be used as true copies of either paper or electronic records, provided that they have the same content and meaning as the original data, along with the associated metadata.
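One simple way to demonstrate that a copy preserves both content and metadata is to fingerprint each record (content plus serialized metadata) and show the fingerprints match. This is an illustrative sketch of the idea, not a regulatory requirement to use any particular hash:

```python
# Minimal sketch: fingerprint a record's content together with its metadata;
# a true copy yields the same fingerprint as the original.
import hashlib
import json

def fingerprint(content: bytes, metadata: dict) -> str:
    # sort_keys makes the fingerprint independent of metadata key order
    payload = content + json.dumps(metadata, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

original = fingerprint(b"result: 88 mg", {"analyst": "jdoe", "unit": "mg"})
copy = fingerprint(b"result: 88 mg", {"unit": "mg", "analyst": "jdoe"})
print(original == copy)  # True: same content and same metadata
```

Any change to either the content or the metadata produces a different fingerprint, so a mismatch signals that the copy is not a true copy.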

FDA: Respond Fully to DI Issues

FDA recommends that companies remediate known data integrity issues by hiring a third-party auditor to determine the scope of the issue, implementing a corrective action plan, and retraining or removing the individuals responsible. If an official FDA regulatory action was taken, FDA may then reinspect to determine whether the data integrity issues have been resolved.

Conclusion

Keep data integrity in mind when writing an application or supplement to a regulatory authority, and consider performing a QC check on all data tables (against the source data) and on the CMC information to ensure it is consistent with current production. Perform an internal audit, or an audit by a third party, to catch data integrity issues prior to a regulatory authority inspection; this is especially important before a PAI. Consider including data integrity training in either overall GMP training or GMP refresher training. By following these steps, you may be able to reduce the chance of a data integrity observation during your next inspection.

Reference

  1. Data Integrity and Compliance with CGMP: Guidance for Industry (Draft). U.S. FDA, April 2016

About the Author

John Godshalk is a former FDA/CBER reviewer and inspector, and a cGMP, CMC, and quality expert. He is currently a Senior Consultant at Biologics Consulting, based in Alexandria, Va. He can be reached at jgodshalk@biologicsconsulting.com.