The GLP-1 drug revolution – and why speed alone won't save pharma
The world of pharmaceuticals has always been fast-moving, but the launch of blockbuster GLP-1 drugs has dramatically dialled up the tempo. Each week of delay to market for these diabetes and obesity treatments costs millions of dollars in lost revenue.
This goes a long way towards explaining why the industry is fundamentally rethinking how it approaches manufacturing. In this environment, time to market has evolved from a humble KPI into the business model itself.
Time to market is existential
Drug patents last 20 years. Roughly half that time is consumed by clinical trials and regulatory approvals before the first commercial batch is ever produced. Once a patent expires, products typically lose around 90% of their value within six months.
Every delay at the manufacturing stage permanently erodes value that can never be recovered.
This is why, when capacity is constrained, companies rush to build plants and instinctively avoid anything that looks like a large, disruptive IT programme. The objective is to produce a compliant product as fast as possible.
‘Fat jabs’ gold rush
In the GLP-1 market, this pressure is amplified. Take Eli Lilly’s Mounjaro and Zepbound: with annual combined revenues of $36.5 billion, the drugs generate roughly $100 million per day. Launching just two weeks earlier is therefore worth on the order of $1.4 billion in revenue. Few capital decisions in pharma can match that return.
In the US alone, around $600 billion worth of new pharmaceutical facilities are expected over the next five years, driven by government incentives, geopolitical pressure to onshore production, and the rise of GLP-1 drugs alongside personalised medicine.
Speed to market dominates every decision. This means compliance responsibility is increasingly pushed onto suppliers and contract manufacturers to accelerate timelines.
The hidden cost of speed
To move quickly, many companies are buying skid-based manufacturing units (modular, pre-assembled systems) that arrive pre-qualified. It is an effective way to compress project schedules, but it fragments data ownership.
Instead of one coherent operational system, companies end up with dozens. Engineering data, process data, quality data, and production records reside in different environments, owned by different vendors. It is not uncommon to see 25 or more systems supporting a single product. In fact, the industry as a whole has been slow to move beyond isolated use cases to more holistic digital transformation.
This fragmentation exists largely because there are no industry-wide standard data models or exchange structures, so each supplier solves for its own immediate project needs. It is also partly due to the challenges of introducing new digital tools in such a heavily regulated industry.
But here’s the rub: fragmented data makes it harder to analyse performance holistically, identify bottlenecks, and improve processes. It also makes regulatory review more manual, more error-prone, and more expensive – precisely when volumes are ramping up.
Where AI actually delivers value
Much of the public conversation around AI in pharma focuses on discovery. However, in practice, the most impactful applications today sit much closer to manufacturing.
Many pharma companies rely on multiple contract manufacturing organisations (CMOs) and contract development and manufacturing organisations (CDMOs) to meet GLP-1 demand. Each batch generates extensive documentation, often PDFs, that must be reviewed for compliance.
In many organisations, this review is still manual. Teams extract KPIs, temperatures, limits, and deviations by hand and re-enter them into spreadsheets. It is slow, labour-intensive, and risky.
AI can change this through rule-based, narrowly scoped applications that extract and validate data automatically. Instead of humans transcribing information, AI systems surface exceptions, flag deviations, and accelerate review cycles.
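To illustrate the kind of narrowly scoped, rule-based application described above, here is a minimal Python sketch. It assumes the batch-record text has already been extracted from a PDF, and the parameter names and limits are purely hypothetical – the point is the pattern of deterministic extraction and exception flagging, not a real batch record format.

```python
import re

# Hypothetical acceptance limits for illustrative process parameters.
LIMITS = {
    "fill_temperature_c": (2.0, 8.0),
    "mixing_time_min": (30.0, 45.0),
    "ph": (6.8, 7.4),
}

def extract_parameters(record_text: str) -> dict[str, float]:
    """Pull 'name: value' pairs out of already-extracted batch-record text."""
    values = {}
    for name in LIMITS:
        match = re.search(rf"{name}\s*[:=]\s*([-+]?\d+(?:\.\d+)?)", record_text)
        if match:
            values[name] = float(match.group(1))
    return values

def flag_deviations(values: dict[str, float]) -> list[str]:
    """Return human-readable exceptions for missing or out-of-limit values."""
    exceptions = []
    for name, (low, high) in LIMITS.items():
        if name not in values:
            exceptions.append(f"{name}: value missing from record")
        elif not low <= values[name] <= high:
            exceptions.append(f"{name}: {values[name]} outside [{low}, {high}]")
    return exceptions

if __name__ == "__main__":
    sample = "fill_temperature_c: 9.1\nmixing_time_min: 38\nph: 7.1"
    for issue in flag_deviations(extract_parameters(sample)):
        print("REVIEW:", issue)  # exceptions are routed to a human reviewer
```

The design choice matters more than the code: explicit limits, deterministic rules, and every exception surfaced to a person rather than acted on automatically.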
The pattern extends to other applications, such as predictive maintenance that reduces unplanned downtime during high-volume GLP-1 campaigns.
The common thread is specificity. The AI that works in pharma is narrow, transparent, and designed to augment human judgment rather than replace it.
Regulation is evolving, not obstructing
In Europe, the revision of Annex 11 and the forthcoming Annex 22 provide early signals of how regulators are thinking about AI in regulated manufacturing environments. The emphasis is not on banning AI, but on ensuring transparency and accountability – particularly for GxP (Good Practice regulations) decisions that affect product quality or patient safety.
What does this mean operationally? Three requirements are emerging as non-negotiable:
Traceability: Every AI-generated output must be traceable to its inputs, training data, and model version.
Explainability: Black-box models are unacceptable for GxP decisions. If an AI recommends a process adjustment or supports batch release, the logic must be interpretable by domain experts and auditable by inspectors.
Human oversight: AI can assist, but it cannot autonomously make decisions that affect product quality or patient safety. A qualified person must review, validate, and take accountability for the outcome.
These are design constraints that separate credible industrial AI from research prototypes. AI that cannot be explained, validated, or audited will struggle to move beyond pilot phases. AI designed with regulation in mind will not.
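A minimal sketch of how those three constraints might be reflected in a system design, using a hypothetical review-record structure (the field names are illustrative and do not come from Annex 11 or Annex 22): every AI-assisted output carries its inputs and model version, the rationale shown to the reviewer, and a named person’s decision.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)
class AIReviewRecord:
    """Illustrative audit record for one AI-assisted GxP decision."""
    batch_id: str
    model_name: str
    model_version: str              # traceability: which model produced the output
    input_document_ids: list[str]   # traceability: which records were read
    extracted_values: dict[str, float]
    flagged_deviations: list[str]
    rationale: str                  # explainability: logic shown to the reviewer
    reviewer: str                   # human oversight: accountable qualified person
    reviewer_decision: str          # e.g. "accepted", "rejected", "escalated"
    reviewed_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

record = AIReviewRecord(
    batch_id="B-2026-0417",
    model_name="batch-record-extractor",
    model_version="1.3.0",
    input_document_ids=["cdmo-report-0417.pdf"],
    extracted_values={"fill_temperature_c": 9.1},
    flagged_deviations=["fill_temperature_c: 9.1 outside [2.0, 8.0]"],
    rationale="Value exceeds the validated range for the fill step",
    reviewer="j.smith",
    reviewer_decision="escalated",
)
```

Note that the decision field belongs to the reviewer, not the model: the AI proposes, the accountable person disposes.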
Ensuring data quality and reliability
Additionally, successful AI integration depends on reliable operational data. Organisations must ensure that data is FAIR:
Findable: Operators and analysts need to easily locate relevant data using standardised metadata and identifiers, without having to dig through multiple folders or systems.
Accessible: Data needs to be securely accessible, enabling data sharing within and across teams, while maintaining compliance with regulations like GxP and ISO standards.
Interoperable: Data should be formatted and structured so that different systems and software can understand and use it. This breaks down silos and enables integration – for example, manufacturing data from sensors can be combined with lab results and quality data for more advanced analysis.
Reusable: For data to have long-term value, it should be well-documented and trustworthy enough to be reused in new experiments or for AI models without needing to regenerate it from scratch.
By leveraging this operational data, companies can get medicines to market faster while still maintaining safety and quality.
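One way to make the FAIR principles concrete at the level of a single measurement is to attach standardised, machine-readable metadata at the point of capture. The field names and identifiers in the sketch below are assumptions for the sake of illustration – as noted earlier, the industry does not yet have a standard data model.

```python
# Illustrative FAIR-style metadata for one process measurement.
# Field names and identifiers are hypothetical, not an industry standard.
measurement = {
    # Findable: a unique, searchable identifier plus descriptive tags
    "id": "site-a/line-3/bioreactor-2/temperature/2026-04-17T10:32:00Z",
    "tags": {"product": "GLP1-X", "batch": "B-2026-0417", "unit_op": "fermentation"},
    # Accessible: explicit access classification and retention policy
    "access": {"classification": "gxp", "retention_years": 10},
    # Interoperable: explicit units, timestamps, and machine-readable structure
    "value": 36.9,
    "unit": "degC",
    "timestamp": "2026-04-17T10:32:00Z",
    # Reusable: provenance so the value can be trusted in later analyses
    "provenance": {"instrument": "TT-2041", "calibration_due": "2026-09-01"},
}
```

With this kind of structure, sensor data, lab results, and quality records can be joined on shared identifiers rather than reconciled by hand.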
Outsourcing does not remove responsibility
CMOs and CDMOs are indispensable partners in scaling production quickly. But outsourcing does not transfer accountability. If a company’s name is on the box, it remains legally responsible for what is inside it. That responsibility extends to data integrity, auditability, and regulatory compliance, regardless of where manufacturing takes place.
Without data continuity across internal operations and external partners, companies risk losing control precisely when scrutiny is greatest.
What sustainable speed requires
The lesson from GLP-1s is that speed must become sustainable.
The companies that come out on top will not simply be those that get to market fastest. They will be those that deploy capital projects intelligently, maintain data continuity across R&D, manufacturing and supply chains, and use AI to make faster, better-informed decisions.
That means treating data interoperability as a capital project requirement and deploying AI where it eliminates manual reconciliation – batch record review and deviation detection – without introducing new compliance risks.
It also means setting contractual requirements for how CDMOs structure and deliver batch data, pushing for real-time access to critical process parameters rather than PDF exports.
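What ‘structured batch data’ might mean in a contract can be as simple as agreeing a machine-readable payload for critical process parameters and checking it for completeness on receipt. The sketch below is one hypothetical shape for such a payload, not a reference to any actual CDMO interface or industry standard.

```python
from dataclasses import dataclass

@dataclass
class CPPSample:
    """One time-stamped reading of a critical process parameter."""
    parameter: str   # e.g. "fill_temperature_c"
    value: float
    unit: str
    timestamp: str   # ISO 8601

@dataclass
class BatchDataDelivery:
    """Hypothetical machine-readable payload a CDMO delivers per batch."""
    batch_id: str
    cdmo_site: str
    samples: list[CPPSample]

# Parameters the contract says must be delivered for every batch (illustrative).
REQUIRED_PARAMETERS = {"fill_temperature_c", "mixing_time_min", "ph"}

def missing_parameters(delivery: BatchDataDelivery) -> set[str]:
    """Completeness check: which agreed CPPs were not delivered?"""
    delivered = {sample.parameter for sample in delivery.samples}
    return REQUIRED_PARAMETERS - delivered
```

A check like this, run automatically when data arrives, turns a contractual clause into something the receiving company can actually enforce.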
Most critically, it means assigning accountability for end-to-end data architecture. In many pharma organisations, IT owns systems, quality owns compliance, and operations owns production – but no one owns the data architecture that allows them to work as one. That needs to change.
In a GLP-1 world, speed is paramount. The challenge now is ensuring it lasts beyond the next quarterly earnings call, and the regulatory inspection that follows.
About the author
Thomas McCarthy, industry principal – life sciences, AVEVA
Thomas McCarthy is the industry principal – life sciences at AVEVA. He has over 30 years of life sciences experience on the customer side, working in both manufacturing and research organisations. McCarthy works with the life science customer community on important initiatives within the industry. He holds a Bachelor of Science in Chemical Engineering from Clarkson University and lives in Connecticut, US.
