Steve McCarthy, Vice President of Digital Innovation, Sparta Systems | 02.07.22
Regardless of industry or company size, quality leaders have five key imperatives in common. They care about ensuring product quality and efficacy and, most importantly, patient safety. They also care about continuity of supply, across both the existing product portfolio and the new product pipeline; this has perhaps never been more important than in today’s climate. Compliance is the final imperative. A good compliance record doesn’t guarantee good product quality, safety, or supply, but the converse often holds: a company with good product quality, patient safety, and a robust supply chain tends to have strong compliance performance.
While these five imperatives are top of mind for quality leaders, they are increasingly faced with a bi-modal challenge. Mode 1 is defined by operational stability, predictability, efficiency, and incremental continuous improvement. It is driven by the need to lower the total cost of quality (TCoQ).
There is now a growing and competing tension with Mode 2. Mode 2 is defined by digital transformation and Industry 4.0 and involves innovative, out-of-the-box solutions that enable speed and big-step changes. These forward-thinking trends enable organizations to leverage technology and use data to innovate faster.
Mode 2 pressures can best be described by these seven macro trends:
- Process complexity and variability
- Pace of innovation
- Industry 4.0
- Patient centricity
- Novel products
- Supply chain complexity
- Rising TCoQ
However, industry doesn’t operate in only Mode 1 or Mode 2. It has to operate in both modes at all times.
Improving Business Outcomes with Digital Quality
The life sciences industry faces unprecedented levels of change. The growth in the number of patents, particularly in precision medicines with their complex manufacturing processes, brings issues such as the cost of quality into sharp focus. Digital quality not only drives proactive quality, but also delivers real business benefits that address this and other challenges.
Recent data from LNS Research shows the value of quality data in improving business operations. This includes giving quality staff access to plant data early enough to make timely, appropriate decisions, and including quality data in a holistic data model.
Companies have seen significant business benefits in key metrics such as:
- On-time delivery
- Overall equipment effectiveness (OEE)
- Successful new product introductions
- Capacity utilization
- First pass yield
- Decrease in defects per million opportunities
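The last of these metrics, defects per million opportunities (DPMO), has a simple closed form worth making concrete. A minimal sketch, with illustrative counts:

```python
# DPMO scales observed defects to a million defect opportunities:
# DPMO = defects / (units * opportunities per unit) * 1,000,000

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

# Example: 15 defects across 500 units, each with 10 defect opportunities.
print(dpmo(15, 500, 10))  # 3000.0
```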
The Traditional Approach to Quality by Design (QbD)
The Quality by Design (QbD) methodology has been applied across life sciences for several years. It started as an advanced Six Sigma practice and has since become popular in the medical device and pharmaceutical industries. The concept was conceived by Dr. Joseph M. Juran, based on the principle that quality should be designed into a product, and that the majority of quality problems and crises are directly related to the way a product was designed.
QbD traditionally comprises three key elements. The first is to define the quality target product profile. This is a summary of the product’s quality characteristics critical for ensuring the finished product meets the required standard of quality.
The next element is to define critical quality attributes (CQAs). These are the physical, chemical, biological, or microbiological characteristics that need to stay within a defined limit or range in order to ensure product quality. If we’re more mature in our approach, we execute critical-to-quality (CTQ) flow-down studies, understanding how each CQA affects the next.
Finally, it’s important to identify the critical process parameters (CPPs), the variables that can impact CQAs. These include factors such as equipment type, batch size, and mixing order and time, along with raw material attributes, all of which must be kept within limits to ensure product quality. Examples of CPPs include temperature, pressure, pH, dissolved oxygen, and agitation, among many others.
CPPs must be monitored to ensure early, accurate detection of deviations that fall outside of acceptable limits. Real-time CPP alerts give organizations early detection of quality issues and corrective action guidance, improving batch quality and yield and reducing the risk of rejected product.
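The limit checking involved can be sketched in a few lines. This is an illustrative simplification, with assumed warning and specification limits, not a production monitoring system:

```python
# Classify a single CPP reading against two bands: a tighter warning band
# (early alert) and a wider specification band (deviation requiring
# corrective action). The limit values in the example are assumptions.

def classify_reading(reading: float,
                     warn_low: float, warn_high: float,
                     spec_low: float, spec_high: float) -> str:
    """Return 'ok', 'warning', or 'deviation' for one CPP reading."""
    if not (spec_low <= reading <= spec_high):
        return "deviation"   # outside specification: trigger corrective action
    if not (warn_low <= reading <= warn_high):
        return "warning"     # inside spec but drifting: fire an early alert
    return "ok"

# Example: a pH reading of 7.35 is still within a 6.8-7.6 spec band,
# but already past a 7.0-7.3 warning band.
print(classify_reading(7.35, 7.0, 7.3, 6.8, 7.6))  # warning
```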
Historical QbD identifies which parameters are critical, and we monitor them. Sometimes failures that have occurred yield insights that inform our understanding of the CQAs and CPPs. But the approach tends to be reactive, responsive, and after the fact.
How to Maximize the Power of QbD
The key lies in unlocking the potential of QbD and making it a dynamic methodology we can apply throughout the lifecycle of a product.
Industry rightfully spends millions of dollars on QbD. It’s a cross-functional investment of money and resources across R&D, design quality, and manufacturing. The intent is that we design quality into products and into the processes by which we make them, throughout their lives.
However, the value returned from investment in QbD is suboptimal. This is because continuous process verification programs can’t use real-time, transactional manufacturing data optimally, both during production and, importantly, post-market for triage and investigation of issues that escaped the factory.
Dealing with quality anomalies in manufacturing can delay product release for hundreds of hours. Many quality management systems (QMS) aren’t integrated with batch automation, instead using manual processes for data transfer, which is inefficient. It limits visibility, creates data integrity concerns, and causes lengthy delays.
The solution lies in directly connecting quality management and manufacturing automation systems. This results in:
- Integrated quality and batch operations that give organizations real-time, end-to-end visibility, enable faster and better decision making, and support a proactive response to quality events.
- Real-time CPP alerts and corrective action guidance to provide early detection of manufacturing anomalies, prevent quality violations, and ensure higher product quality.
- Process context for quality investigations to reduce the time needed to investigate quality issues, increase operational efficiencies, and provide AI-enabled auto-categorization and correlation.
- QbD through continuous process verification, which provides real-time automation data for verification, closes the loop on process verification, and increases compliance.
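The quality/automation link described above can be made concrete with a toy sketch: a CPP deviation detected by the automation layer opens a quality event in the QMS automatically, carrying its process context along. Every name and field here is an illustrative assumption, not a real Sparta Systems API:

```python
# Hypothetical sketch of automation-to-QMS integration: when batch
# automation flags a CPP deviation, a quality-event record is opened
# in the QMS with the process context needed for investigation.
from datetime import datetime, timezone

def open_quality_event(qms_events: list, batch_id: str, parameter: str,
                       reading: float, limits: tuple) -> dict:
    """Append a quality-event record, with process context, to the QMS log."""
    event = {
        "batch_id": batch_id,    # which batch the deviation occurred in
        "parameter": parameter,  # which CPP deviated
        "reading": reading,      # the out-of-limit value observed
        "limits": limits,        # the (low, high) acceptable range
        "opened_at": datetime.now(timezone.utc).isoformat(),
        "status": "open",
    }
    qms_events.append(event)
    return event

# Example: a temperature excursion opens an event with full process context,
# removing the manual data transfer step between systems.
qms_log = []
open_quality_event(qms_log, "BATCH-1024", "temperature", 39.2, (35.0, 38.0))
print(qms_log[0]["status"])  # open
```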
AI-Enabled Quality by Design: A Paradigm Shift
While industry has made great strides with the traditional approach to QbD, the real enabler is the ability to deliver continuous process verification (CPV) of those CPPs: real-time monitoring of the process parameters against established standards.
That means real-time, continuous monitoring of data streaming off a connected asset such as a bioreactor, or near-real-time monitoring of data sitting in a data lake.
The solution is a system that delivers that capability and is integrated with an AI-enabled QMS to detect manufacturing anomalies: process anomalies, trends, and drifts in performance, before they become breaches of a warning limit or failures of a specification. It’s a paradigm shift.
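One simple way a system can flag a drift before any single reading breaches a limit is an exponentially weighted moving average (EWMA) of a CPP. This is only a minimal illustration of the idea; the smoothing factor, threshold, and readings below are assumed values:

```python
# A minimal pre-failure drift detector: the EWMA of a CPP flags a
# sustained drift long before any single reading breaches its
# specification limit.

def ewma_drift(readings, target, lam=0.2, threshold=0.5):
    """Return the index at which the EWMA drifts past target +/- threshold,
    or None if no drift is detected."""
    ewma = target
    for i, x in enumerate(readings):
        ewma = lam * x + (1 - lam) * ewma   # exponentially weighted update
        if abs(ewma - target) > threshold:
            return i
    return None

# A slow upward drift of 0.1 per reading around a target of 7.0.
readings = [7.0 + 0.1 * i for i in range(20)]
print(ewma_drift(readings, target=7.0))  # 9
```

Each individual reading above stays well inside a hypothetical specification band of 7.0 ± 2.0, yet the EWMA flags the drift at index 9, which is the pre-failure detection the text describes.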
This, in turn, allows for a proactive quality response: the immediate identification, capture, and documentation of the quality event. This is predictive; it’s pre-failure.
And as the whole system is one of continuous improvement, industry’s understanding of the CQAs improves and we can better characterize and tighten our CPPs.
While historical QbD identifies which parameters are critical, the combination of AI-enabled CPV and quality management helps dynamically narrow down what those ranges should be.
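Dynamically tightening a CPP range can be illustrated with the plainest possible approach: re-deriving a mean ± 3-sigma band from recent in-control batch data. Real CPV systems would draw on far richer models; the values here are illustrative:

```python
# Sketch of re-deriving a CPP operating range from recent in-control
# batch data as a plain mean +/- k-sigma control band.
import statistics

def derived_range(values, k: float = 3.0):
    """Return a (low, high) control band of mean +/- k standard deviations."""
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    return (mu - k * sigma, mu + k * sigma)

# Seven recent in-control pH readings clustered tightly around 7.0 yield
# a band far narrower than a fixed 6.8-7.6 specification would allow.
recent = [7.01, 6.98, 7.02, 7.00, 6.99, 7.03, 7.01]
low, high = derived_range(recent)
print(round(low, 3), round(high, 3))
```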
This means we can now detect anomalies sooner, detect anomalies not previously defined or characterized, and increase the level of analytical sensitivity. In short, we can enable proactive, even preventive quality.
The Future of Digital Quality Management Is Here
Traditional QbD brings enormous value, but at significant cost—and the ROI is limited. QbD should be dynamic, and it should live throughout the product lifecycle. AI-enabled continuous process verification, manufacturing anomaly capture, and the resulting proactive quality response truly optimize the power of QbD for a business.
The result is a proactive, preventive approach to managing quality events that speeds up innovation, reduces the total cost of quality, and ultimately improves the global population’s health and wellness.
With a BSc in medical and industrial biology, Steve McCarthy is vice president of Digital Innovation at Sparta Systems. McCarthy is certified in Six Sigma process excellence, provides domain expertise, and serves as an industry evangelist and customer advocate for Sparta. McCarthy has nearly three decades of experience as a quality and supply chain leader within the healthcare industry.