The end of drug shortages begins with data transparency
For more than 15 years, shortages of critical drugs — from oncology treatments to emergency room and surgical suite essentials — have burdened healthcare systems and compromised patient care and outcomes. The issue became uncomfortably real for me recently, as I searched for a pharmacy that could fill my daughter’s amoxicillin prescription. How would I feel, I wondered, if I needed immediate access — not to a common antibiotic, but to a treatment for a life-threatening illness?
Despite ongoing analysis and guidance, supply gaps continue, especially for generic drugs, which account for most of the medications prescribed in Europe and the United States. For some generics manufacturers, margins in the low single digits can make consistent quality and compliance feel out of reach. The costs of addressing a single warning letter can drive a small supplier out of business, further shrinking supplies.
Lack of data transparency at the macro and micro levels
Lack of data transparency across the supply chain is a core challenge for regulators and suppliers of all sizes, obscuring supply fluctuations and their root causes. Decreased access to global compliance and quality data for the active pharmaceutical ingredients (APIs) and finished drugs manufactured offshore has only compounded supply risks. Between 2020 and 2022, the FDA’s five-year inspection backlog for offshore API facilities grew from 30% to 80%, and the agency is pivoting to remote and other inspection formats.
Regulators, governments, patient advocates, and industry groups are actively working to solve today’s supply problems. By listing the generics most vulnerable to shortages, the European Medicines Agency (EMA) has taken an essential first step, which should help guide future efforts. So far, discussions of next steps have emphasised the need for economic incentives to help generics manufacturers update their manufacturing, and for faster adoption of better approaches to supply chain data collection, monitoring, and analytics.
Meanwhile, at the “micro” level of the individual facility and company, there is a new focus on establishing data transparency and connecting data across functions. Data-driven approaches are helping more companies reduce the risk of shortages by improving the efficiency of compliance and quality operations.
Connecting cross-functional data for more agile change control
One area of focus is improving post-approval change control, the time- and labour-intensive behind-the-scenes work of managing changes to an already-approved product or manufacturing process, which often leads to supply delays. Using traditional approaches, with disconnected data, separate IT systems, and manual processes, a single change can take from six months to two years to complete. Depending on the regulatory agencies and regional requirements involved, the work can delay a drug’s availability by up to five years. Today, a typical large biopharma company manages 40,000 such change applications each year, with up to 200 for a single product.
Imagine that the EMA approved one manufacturer’s new therapy two years ago. Since then, the company has developed a safer manufacturing process that reduces product costs. Its leaders also plan to use more sustainable packaging to shrink the product’s carbon footprint, and to shift from laboratory-based quality control to real-time batch release. Each of these improvements will require separate regulatory agency approval.
Gathering the data required for each change takes months. First, regulatory teams must determine the impact of each change, and which countries and internal documents will be affected. Supply chain teams must then do the same for individual product lots.
The quality department must then respond to these assessments, identify and update the affected documents, and develop new user training programmes that incorporate the changes. The team must also manage and track each change and estimate the risks of making it.
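To make the data problem concrete, the sketch below (in Python) models a single change record that regulatory, supply chain, and quality teams could all update in place; the field names, statuses, and identifiers are hypothetical rather than drawn from any particular system.

```python
from dataclasses import dataclass, field

# Illustrative only: field names and statuses are hypothetical, not taken
# from any specific regulatory information management or quality system.

@dataclass
class ImpactAssessment:
    function: str                          # "regulatory", "supply_chain", or "quality"
    affected_countries: list[str] = field(default_factory=list)
    affected_documents: list[str] = field(default_factory=list)
    affected_lots: list[str] = field(default_factory=list)
    risk_level: str = "unassessed"         # e.g. "low", "medium", "high"

@dataclass
class ChangeControl:
    change_id: str
    description: str
    assessments: list[ImpactAssessment] = field(default_factory=list)

    def open_items(self) -> list[str]:
        """Functions that have not yet completed their impact assessment."""
        return [a.function for a in self.assessments if a.risk_level == "unassessed"]

# One record that all three functions update, instead of three spreadsheets
# reconciled by email and phone.
change = ChangeControl(
    change_id="CC-2024-0042",
    description="Switch to more sustainable secondary packaging",
    assessments=[
        ImpactAssessment("regulatory", affected_countries=["DE", "FR", "IT"]),
        ImpactAssessment("quality", affected_documents=["SOP-PKG-012"], risk_level="low"),
        ImpactAssessment("supply_chain", affected_lots=["LOT-7781", "LOT-7790"]),
    ],
)
print(change.open_items())                 # -> ['regulatory', 'supply_chain']
```

Because all three assessments live on one record, an open item in any function is immediately visible to every other function.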
Currently, at many companies, regulatory and quality teams use different software systems for each of these steps and communicate by email and phone. Errors can result in non-compliance and regulatory warning letters.
But that is only the beginning. After compiling, publishing, and submitting the applications, regulatory teams must manage ongoing communication with each agency. In the end, regulators can still decide to re-inspect the facility before re-approving the new and improved product, triggering additional delays in product availability.
Connecting manual, disconnected processes to gain speed and cut costs
Unified approaches to quality and regulatory data management bring different software together on a single platform, which can help streamline and simplify change control. They make it easier for users to meet regulatory requirements and to spot and address problems faster.
Integrating quality, regulatory, and supply chain data, documents, and processes can enable even greater agility. It is now possible, for example, to connect regulatory and quality data and content, particularly product documentation, with a corporate enterprise resource planning (ERP) system.
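As a simple illustration of what that connection makes possible, the sketch below joins a hypothetical change record with ERP batch data to decide which lots can be released; the systems, identifiers, and approval rules are stand-ins, not any vendor’s actual interface.

```python
# Illustrative sketch with in-memory stand-ins for a regulatory system and an
# ERP; a real integration would go through each vendor's own interfaces.

# Regulatory/quality side: markets where change CC-2024-0042 is approved.
approved_markets = {"CC-2024-0042": {"DE", "FR"}}

# ERP side: batches produced under the changed process, with destination markets.
erp_batches = [
    {"lot": "LOT-7781", "market": "DE", "change_id": "CC-2024-0042"},
    {"lot": "LOT-7790", "market": "IT", "change_id": "CC-2024-0042"},
]

def releasable(batch: dict) -> bool:
    """A batch can be released only where the underlying change is approved."""
    return batch["market"] in approved_markets.get(batch["change_id"], set())

for batch in erp_batches:
    status = "release" if releasable(batch) else "hold: approval pending in market"
    print(batch["lot"], "->", status)
# LOT-7781 -> release
# LOT-7790 -> hold: approval pending in market
```

Logic of this kind lets a batch be held automatically until the relevant market approval arrives, rather than relying on someone remembering to check.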
A growing number of companies of all sizes and types are unifying quality data management, regulatory data management, or both. Some, such as Moderna, are connecting regulatory and quality operations to facilitate cross-functional collaboration, while others are connecting regulatory and quality data with their ERPs, which can reduce batch-release timelines by up to 30%. Functional and cross-functional teams expect greater data transparency to make compliance easier, which would mitigate the risk of drug supply gaps.
Assessing the costs of disconnected systems
Every day that a drug isn’t available costs a manufacturer hundreds of thousands to millions of dollars, whether for highly specialised drugs or everyday over-the-counter medicines. Unified approaches that improve data visibility, centralise access to real-time information, and automate workflows are already proving that they can speed patients’ access to the treatments they need.
Tracking the time and cost of using traditional approaches and technologies can reveal surprising insights into the total cost of ownership and operation. Consider the savings and improvements from the following, with a rough illustrative model sketched after the list:
- Reorienting highly trained and qualified people away from manual, administrative tasks to focus on priority efforts, such as interaction with regulators.
- Reducing the time spent on disconnected, one-off email and telephone communications, measured by the number of hours each employee spends on these efforts each day, within teams and across functions.
- Strengthening patients’ and healthcare providers’ trust in reliable access to critical treatments. Although trust itself cannot be measured directly, quantifying missed product release deadlines over time could offer insight into performance gaps and trends.
- Avoiding the intangible, but potentially significant, reputational costs of having a drug supply problem.
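As a starting point, a back-of-the-envelope model such as the one below can turn those line items into annual figures; every number in it is an assumption, to be replaced with a company’s own measurements.

```python
# A rough, illustrative model; every figure is an assumption to be replaced
# with a company's own measurements.

employees = 40                       # people touching change control across functions
hours_per_day_on_coordination = 1.5  # one-off emails and calls per person per day
loaded_hourly_cost = 90              # fully loaded cost per hour, in dollars
working_days_per_year = 230

coordination_cost = (employees * hours_per_day_on_coordination
                     * loaded_hourly_cost * working_days_per_year)

delayed_release_days = 20            # release days lost per year to change-control delays
revenue_at_risk_per_day = 250_000    # within the "hundreds of thousands to millions" range

delay_cost = delayed_release_days * revenue_at_risk_per_day

print(f"Annual coordination overhead: ${coordination_cost:,.0f}")
print(f"Annual cost of delayed releases: ${delay_cost:,.0f}")
```

With these illustrative inputs, coordination overhead alone exceeds a million dollars a year, before counting a single day of delayed release.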
Data that is disconnected and not visible across systems affects name-brand, generic, and medtech manufacturers alike. Change control is only one of several behind-the-scenes operations that drain time and resources and delay patient access to treatments. Chemistry, manufacturing, and controls (CMC) submissions and submission publishing are two other examples, both of which are undergoing significant change.
As the current drug supply situation has reminded us, we are all patients, and the industry’s supply chain issues affect us all. Solutions already exist to help automate more behind-the-scenes processes and maximise access to connected, real-time data. However, they can only work from a foundation of data transparency, at the individual plant level and beyond.