5 changes underway as R&D IT shifts to advanced modalities and biologics
Back in 2000, the FDA approved only six biologics versus 27 new molecular entities (NMEs). Fast-forward twenty-plus years, and by the end of 2022 the number of approvals for biologics and advanced modalities had outpaced NMEs, signalling the ascent of this category after decades of research. The recent historic FDA and UK regulatory approval of the first CRISPR-Cas9 gene editing therapy for sickle cell disease shows this momentum.
The shift to new modalities brings many changes, most importantly improved patient outcomes and better coverage for rare diseases; mRNA COVID vaccines were a shining example. These new modalities also upend the way work is done in the lab. Bringing advanced modalities from the lab to patients requires more data, more complex multimodal data, and a greater need for collaboration across specialisations. You can't simply apply legacy tools built for small-molecule R&D.
The industry is racing to invent and deliver better medicines faster, and it needs to develop a new tech strategy in parallel. Companies whose IT teams can align with the science and rapidly incorporate new innovations stand to be the most productive in biopharma, driving success for decades to come.
This is not a quick or easy transition. Here are the five areas where we’re seeing the most progress today:
1. 'Follow the molecule' data and tech strategy is key, and software is catching up
Bringing data together, marrying biological data to small-molecule data, and harmonising it so that scientists can see it regardless of modality or where it came from – this is IT's North Star. This is how we use data to go faster and create better, safer patient outcomes.
Benchling's 2023 State of Tech in Biopharma report illustrated the need: 40% of IT respondents at large biopharma support 20+ unique scientific software applications for their teams alone, as scientists work across a sprawling set of tools. This is not ideal from a collaboration, security, or productivity perspective.
The key is moving away from siloed apps and towards a single interface: a central platform where scientists (and the IT teams supporting them) capture and manage data across research, process development, and into manufacturing – essentially, following the molecule.
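To make the idea concrete, a harmonised, modality-agnostic record might look something like the sketch below: one entity identity that carries consistent fields from research through process development and manufacturing. The field names and structure here are hypothetical, purely for illustration, not any particular platform's data model.

```python
from dataclasses import dataclass, field
from typing import Literal

# Hypothetical "follow the molecule" record: one identity, whatever the
# modality, with results accumulating across stages under a shared schema.
Modality = Literal["small molecule", "antibody", "mRNA", "cell therapy", "gene therapy"]

@dataclass
class Result:
    stage: str   # e.g. "research", "process development", "manufacturing"
    assay: str   # e.g. "binding affinity (KD)", "titer", "purity"
    value: float
    unit: str

@dataclass
class TherapeuticEntity:
    entity_id: str                 # followed from lab bench to plant floor
    modality: Modality
    sequence_or_structure: str     # sequence for biologics, SMILES for small molecules
    results: list[Result] = field(default_factory=list)

# The same query works regardless of modality because the schema is shared.
candidate = TherapeuticEntity(
    entity_id="ENT-0042",
    modality="antibody",
    sequence_or_structure="EVQLVESGGGLVQPGGSLRLSCAAS...",
    results=[Result("research", "binding affinity (KD)", 1.2e-9, "M")],
)
candidate.results.append(Result("process development", "titer", 3.1, "g/L"))
print([r.assay for r in candidate.results if r.stage == "process development"])
```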
2. The industry is tackling instrument connectivity
The challenge of automating and standardising data capture from instruments has long plagued the industry. However, progress is underway, thanks to a movement away from proprietary data formats and vendor lock-in, and towards open industry standards, open-source tools, and automated data integration.
Earlier this year, the Allotrope Foundation achieved an important milestone, launching publicly available data standards for lab instruments using the Allotrope Simple Model (ASM). Recently, Benchling built on this momentum with the launch of Connect, which automates instrument data capture and management using a unique open-source approach: mapping all instrument output to the ASM and making the converter code open source and freely available on GitHub.
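As a rough illustration of what mapping instrument output to a standard model looks like in practice, here is a minimal Python sketch that converts a hypothetical plate-reader CSV export into an ASM-style JSON document. The column names, file layout, and field names are assumptions for illustration, not the official Allotrope schema or any vendor's actual converter code.

```python
import csv
import io
import json

# Hypothetical raw export from a plate reader; real instrument output
# formats vary widely by vendor, which is exactly the problem open
# standards like ASM aim to solve.
RAW_CSV = """well,absorbance,wavelength_nm
A1,0.113,450
A2,0.954,450
A3,1.402,450
"""

def to_asm_style_document(raw_csv: str, instrument_id: str) -> dict:
    """Map a flat CSV export to a nested, ASM-style measurement document.

    This mirrors the general shape of the Allotrope Simple Model (a JSON
    document grouping measurements under a device context), but the field
    names here are illustrative rather than the published schema.
    """
    reader = csv.DictReader(io.StringIO(raw_csv))
    measurements = [
        {
            "sample identifier": row["well"],
            "absorbance": {"value": float(row["absorbance"]), "unit": "(unitless)"},
            "wavelength setting": {"value": float(row["wavelength_nm"]), "unit": "nm"},
        }
        for row in reader
    ]
    return {
        "measurement aggregate document": {
            "device system document": {"device identifier": instrument_id},
            "measurement document": measurements,
        }
    }

if __name__ == "__main__":
    doc = to_asm_style_document(RAW_CSV, instrument_id="plate-reader-01")
    print(json.dumps(doc, indent=2))
```

Once every instrument's output lands in the same shape, downstream storage, search, and analytics no longer need per-vendor parsing logic.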
3. Security used to be a barrier to adopting cloud-based software; now it's the opposite, and security is a benefit
As scientific techniques and R&D processes have become more complex with new modalities, the amount of data generated has increased massively. Biotech knows that managing cybersecurity risk appropriately today requires engineering, automation, real-time analytics, threat intelligence, significant tooling, and more. Modernising a company's cybersecurity architecture also takes tremendous investment and requires changes in team and culture, which has been challenging for biotechs.
But a shift is happening. Biotech companies are now taking advantage of the economies of scale that mature cloud computing providers offer on cybersecurity, resiliency, and disaster response. Cloud providers have both a duty and an incentive to be secure: they invest far more in security than most companies can afford to on their own, and they have an abundance of expertise. More often than not, maintaining an on-prem strategy exposes a biotech to more risk, because 100% of the security responsibility and resourcing falls on you, the company.
4. Built-for-science gains ground
Scientists today are coming up in computational labs and demanding tools purpose-built for science, and the market is listening. Tools that capture, manage, and analyse data – especially biological data – in a user-friendly interface that scientists actually enjoy using are now seen as foundational. 70% of scientists and IT professionals have adopted an R&D data platform, according to Benchling's recent report.
With growing demand, a robust ecosystem of cloud-based, 'built-for-bio' software has emerged, helping scientists process, share, and collaborate on growing biological data sets. Oftentimes, it's scientists who've worked in the industry who create these new tools, building them to the high bar of what they themselves would use and want, to make the science easier. Pluto, PipeBio, Watershed, and Quartzy are examples here.
5. AI is the carrot for better biotech data practices
As much as AI will drive transformational changes in how we discover new drugs, one lesser-discussed but equally beneficial impact of AI on biotech is that it requires the industry to prioritise strong data foundations. For IT, AI is a forcing function to operationalise the flow of data coming from experimental pipelines and to integrate previously siloed data sets end-to-end.
Companies need to build their data strategy and systems before they can benefit from AI and ML. Beyond simply collecting data, the key is to collect it in a standardised way and to anticipate how it will be consumed: companies need to design their data systems for the analytics, AI, and ML they aim to layer on top. In the coming years, the importance of scientific data and AI will only grow, putting additional pressure on biopharma to abandon legacy tech and build a stronger digital foundation.
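One minimal sketch of what "designing for consumption" can mean in practice: validating experimental records against a shared schema at capture time, so downstream analytics and ML see consistent fields and units rather than ad-hoc spreadsheets. The schema, field names, and allowed units below are assumptions for illustration only.

```python
from datetime import datetime, timezone

# Hypothetical capture-time schema: the fields and units every assay record
# must carry before it is accepted into the data platform. Agreeing on this
# up front is what lets analytics and ML consume the data later without
# per-experiment cleanup.
REQUIRED_FIELDS = {"entity_id": str, "assay": str, "value": float, "unit": str}
ALLOWED_UNITS = {"binding affinity (KD)": "M", "titer": "g/L", "viability": "%"}

def validate_record(record: dict) -> dict:
    """Reject malformed records at capture time instead of at analysis time."""
    for name, expected_type in REQUIRED_FIELDS.items():
        if name not in record:
            raise ValueError(f"missing field: {name}")
        if not isinstance(record[name], expected_type):
            raise TypeError(f"{name} should be {expected_type.__name__}")
    expected_unit = ALLOWED_UNITS.get(record["assay"])
    if expected_unit and record["unit"] != expected_unit:
        raise ValueError(f"{record['assay']} must be reported in {expected_unit}")
    # Stamp provenance so downstream models can filter or weight by recency.
    return {**record, "captured_at": datetime.now(timezone.utc).isoformat()}

clean = validate_record(
    {"entity_id": "ENT-0042", "assay": "titer", "value": 3.1, "unit": "g/L"}
)
print(clean)
```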
In this new normal, IT teams in biotech and biopharma will play an increasingly pivotal role in ushering in new data management strategies. With cohesive data strategies and modern tech, therapeutics companies can set themselves up for success for the next two decades of discovery.