PDA Letter Article

Pharma Must Work “Smarter” in New Era: Some Thoughts on the Impact of Cloud Computing on Parenteral Manufacturing

by Toni Manzano, Aison

Industry 4.0

Parenteral manufacturers are just now testing the waters of Industry 4.0. The factories of the future will operate in a state of continual monitoring as manufacturers increasingly rely on analytics to ensure effective processing. Yet the goal of producing quality product remains the same.

In 2011, the term “Industry 4.0” was used for the first time to describe the beginning of the fourth industrial revolution, referring to manufacturing processes powered by interconnected cyber systems. Fast forward to 2018: we are well into this fourth revolution, and so-called “smart manufacturing” is on the rise.

While many sites manage their critical information with electronic systems (e.g., manufacturing execution systems, laboratory information management systems or warehouse management systems), not all can be characterized as smart manufacturing. That standing can only be earned when a factory uses artificial intelligence (AI), machine learning (ML) and deep learning (DL) to make decisions based on reliable knowledge derived from generated data. Factories that already manage significant parts of their regulated tasks with electronic data and IT systems are ready to transition into smart manufacturing.

What is the difference between an electronic factory and a smart factory? The difference lies in the ability to feed existing siloed data into advanced algorithms and then transform the resulting information into knowledge.

Smart manufacturing requires two key components for implementation:

  • Physical elements designed under Industrial Internet of Things (IIoT) principles that acquire and process raw data on-site and export it via the internet
  • Cloud systems where information is transformed into knowledge through massive indexing and powerful analytics

Discussions about smart manufacturing must take into account a fully connected layout in which each individual manufacturing component emits all of its available information in real time, an action repeated top-down across the different production levels. Physical elements, such as IIoT devices and edge computing, must be combined with intangible components like digital twins and cloud computing. Among all these components, the most revolutionary are those related to predictive analytics and artificial intelligence. Managing the large quantity of data generated by a smart factory is no easy task with traditional systems. This has led to reliance on big data systems, which require significant resources to build and maintain a data and AI ecosystem, including specialized staff and expensive technology.
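To make this edge-to-cloud flow concrete, the minimal sketch below shows an IIoT edge node reading process sensors and streaming the values to a cloud ingestion endpoint over HTTPS. It is illustrative only: the endpoint URL, the lyophilizer telemetry fields and the one-second cadence are assumptions, not a reference to any particular vendor's platform.

```python
# Minimal sketch of an IIoT edge node streaming process telemetry to the cloud.
# The ingestion URL and sensor fields are illustrative assumptions only.
import time
import random

import requests  # widely used third-party HTTP client


INGEST_URL = "https://ingest.example-cloud.com/v1/telemetry"  # hypothetical endpoint


def read_sensors() -> dict:
    """Stand-in for real instrument reads on a lyophilizer."""
    return {
        "timestamp": time.time(),
        "shelf_temp_c": round(25.0 + random.gauss(0, 0.2), 3),
        "chamber_pressure_mbar": round(0.12 + random.gauss(0, 0.005), 4),
    }


# Acquire and export one reading per second; indexing and analytics
# happen on the cloud side, as described above.
for _ in range(10):
    reading = read_sensors()
    response = requests.post(INGEST_URL, json=reading, timeout=5)
    response.raise_for_status()
    time.sleep(1)
```

In a real deployment the edge node would also buffer readings locally to survive network outages, since the regulated record must remain complete and traceable end to end.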

The smart alternative is to delegate heavy data processing to already-existing cloud services. Using cloud technologies yields better, high-quality products through efficient processes that are not complex and do not require powerful computation centers. When cloud computing is used, previously hard-to-understand variables within the manufacturing process become calculable solutions that define the future state of the processes. Cloud computing encompasses pattern recognition, automatic outlier identification, anomaly detection, neural networks, and clustering and classifier algorithms. Other common analytic applications are golden batch fitting, root cause identification for CAPA, cleaning process optimization and continuous manufacturing support. The common denominators across all these use cases, however, are complexity, large numbers of involved variables, a huge amount of data to manage, and regulatory requirements.
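As one concrete example of the automatic outlier identification mentioned above, the short sketch below trains an isolation forest on historical in-specification batch data and flags a batch whose temperature has drifted. The variables, values and library choice (the open-source scikit-learn) are illustrative assumptions, not a description of any validated GxP system.

```python
# Minimal sketch of automatic outlier identification on batch process data.
# Variable names, units and values are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Simulated historical batches: [fill volume (mL), temperature (°C), pressure (bar)]
normal_batches = rng.normal(
    loc=[5.0, 22.0, 1.0], scale=[0.05, 0.3, 0.02], size=(200, 3)
)
suspect_batch = np.array([[5.0, 25.5, 1.0]])  # clear temperature drift

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_batches)

# predict() returns +1 for inliers and -1 for outliers
print(model.predict(suspect_batch))  # expected: [-1], flagged for review
```

In a regulated setting, a flag like this would not release or reject product on its own; it would trigger a documented investigation, which is where the data integrity requirements discussed next come into play.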

Elephant in the Room: Regulators

Whether a company invests in big data architecture or relies on cloud computing, the rules of GxP still apply. Some regulatory agencies and pharmacopoeias have already started to address this area. For example, the European Pharmacopoeia has included two frequently used ML and DL algorithms as valid chemometric techniques for processing analytical datasets.

Additionally, when information is processed under a regulated framework, the data, the metadata and the operations that transform them into knowledge must obey data integrity rules. With this in mind, the UK MHRA published its “GxP Data Integrity Guidance and Definitions” this past March. This guidance recognizes cloud systems consumed as services as valid computerized systems for managing regulated information under the principles and recommendations included in the document.

To be fully accepted in biotechnology and pharmaceutical contexts, IIoT devices, together with the ecosystem of infrastructure, platforms and computing required to perform analytics, must be qualified and validated using the same criteria that have been applied for years. The digital transformation also applies to the quality system that wraps the entire process. The new players involved in big data and cloud computing must understand the regulatory requirements when they work in GxP environments. Factories are evolving toward a state where everything is monitored and measured, yet the end goals remain the same: efficient, accurate, secure and traceable data, and product/process quality.

[Editor’s Note: A follow-up article from the author will address the impact of data integrity on big data.]