Don't hold back lab digital transformation

Mike Tarselli doesn’t like to talk about “the lab of the future”. It’s a phrase, he says, that gives people permission to put off digitalising their lab data and modernising their lab data infrastructure. In fact, there’s no reason that a data-efficient, cloud-based lab shouldn’t be the lab of today.

pharmaphorum caught up with Tarselli, the chief scientific and knowledge officer at TetraScience, to discuss why so many labs are behind the times, and what can be done to bring them up to speed.

“Lab data, and biopharma data specifically, is some of the most important data on Earth,” Tarselli said. “It ranks right up there with nuclear codes and the secrets to human health. And yet, that data tends to be relegated to physical storage, or it tends to be relegated to inefficient data transfer, or worse.”

There’s a better way, Tarselli said, and it lies in the cloud. Though it may be challenging to facilitate the transition, it’s well worth the effort.

The state of lab data today

Working with a variety of biopharma partners, Tarselli sees a number of recurring themes. One is labs that are still using local storage or even paper files.

“It's 2023 and they still use paper binders or paper process sheets or run paper CAPAs if something goes wrong,” Tarselli said. “Many still, despite the security concerns they have inside their companies, put large data files in emails, or they use USB sticks or thumb drives to move files around, either internally or – gasp – sometimes externally.”

Labs are often using Microsoft Excel to process data simply because it’s been their tool of choice for 30 years, he said, even if it’s not optimised for the task.

As the amount of data generated in lab work grows, continuing a local or on-premises paradigm can become costly and cumbersome, as well as create security concerns.

“They have huge storage costs because they have to move it to an Isilon or a rack or something inside their company. And they constantly have to keep buying physical machines or devices so they can store, move, transport, and expand and interpret that data,” he said.

Even when labs are committed to a cloud infrastructure, it can be challenging to make the change because of the lack of interoperability and the many data formats floating around.

“30% or 40% of the [issues we see involve] interchange between common data formats. So, it sounds like me saying ‘common’ means that there's this list of five that every lab knows, but it's not true. There are probably somewhere between two and four hundred data formats that any given large pharma needs or uses on a routine basis,” Tarselli explained.

Poor data management also carries a cost in workforce efficiency – Tarselli says that, at inefficient organisations, highly paid, high-value scientists sometimes spend a whole day each week on data processing.

“That’s people acting as data buses,” he said.

The three blockers to modernisation

If the status quo is so expensive, inefficient, and insecure, what holds labs back from modernising their infrastructure and moving their data systems to the cloud? Tarselli puts those blockers in three categories: operational, strategic, and cultural.

Operational hurdles have to do with systems and infrastructure not being in place, or the existing systems not working well together. They can also involve an organisation’s data policies and governance. 

Those hurdles can be overcome, but not without investment, which is where the strategic and cultural blockers come into play. Strategic blockers can come down to cognitive inertia in management or a lack of strategic vision.

“If you've got systems that are currently running and they're supporting your current lines of therapies or your current workforce, you're going to keep funding those, right?” Tarselli said. “You're going to keep buying hardware or you're going to keep buying USB drives. You're going to keep buying Microsoft Word licences because that's what the budget is apportioned to do, and that's what people who are in your organisation know how to maintain and administer.”

Finally, cultural blockers have to do with the workforce’s unease about the cloud or automation.

“A lot of people believe that automation, cloud, and digital data will steal their jobs,” Tarselli said. “Or that they'll have to do something different, or that they'll have to take on extra training, or that it will be too complex in the short term to both deliver against the goals they have for their therapeutic or their clinical trial and also do this whole digitisation thing […] You came up in school for seven to ten years learning about cell vectors and cloning and sequencing. You didn't come up thinking about file patterns and metadata and structures of schema. That's fair.”

These blockers can be found in all kinds of industries, but pharma has had a particularly hard time staying up to date technologically, possibly because it’s such a highly regulated, risk-averse industry.

“They don't think about what could be because they say this is the thing that the regulators have signed off on and it's the safest way, so let's go this way and that's fine,” he said. “But that tends to mean that they are going to lag behind competitors and even the FDA, which maintains an Office of Digital Transformation.”

Overcoming the hurdles to the digital lab

The first step toward modernising lab data infrastructure is to address the cultural and strategic blockers with a change management approach, Tarselli said.

“The first step is to create a negative case for change,” he said. “Not a positive case, not ‘If we do this, we'll be scions on the hill, and we'll have access to all of our data, and AI will drive everything.’ That's a great vision, but it can't be done without building a solid foundation today. To achieve this, you create the negative case for change. You say, what is hampering you? Once you have that negative case in mind, you start thinking, okay, what would solve this negative case?”

After securing that buy-in from leadership, Tarselli said, the next step is to create a step-by-step plan for change, updating processes one at a time to prove out their value. 

“And then you measure, right? You inspect what you expect, as the old phrase goes. You say, given this plan we're trying to do over the next year or two, is it resolving those problems we identified at the beginning? And do we see a return on our investment? If you can do these things, it's pretty straightforward and that's where a lot of orgs sort of get their mojo going, right?”

When it comes to digitalising labs, some problems, like data interoperability, will need to be addressed as an industry. Groups like the Pistoia Alliance and the Allotrope Foundation are working to build consensus and address them, Tarselli noted. The challenge is that, as the webcomic XKCD points out, it’s hard to solve data standards problems without creating even more standards.

But many challenges can be addressed on an organisation-by-organisation basis. It just takes the courage and leadership to challenge a status quo that’s long overdue for a refresh.

“In the last decade, we've all moved to our mobile phones as a way of living, for most aspects of our life, and we take digitalisation for granted,” Tarselli said. “But one of the most critical industries that we trust with our lives by relying on the safety and efficacy of the therapeutics they develop is still operating with unreliable, inefficient tools. This is like going back in time 10 years.”

About the Interviewee

As chief scientific & knowledge officer, Mike Tarselli accelerates the Tetra Scientific Data Cloud™ through knowledge capture, training curricula, GxP compliance, use case research, and external scientific evangelism. Previously, Mike was the scientific director for SLAS, a global professional society dedicated to lab automation, and an associate director at Novartis, building an external scientific collaboration platform. His pharmaceutical experience includes bench and operational roles at Millennium, ARIAD, and Biomedisyn. Mike received his PhD from UNC Chapel Hill, completed post-doctoral work at Scripps Research, and earned his MBA through Quantic School of Business & Technology. Mike currently serves on the Pistoia Alliance Board of Directors, the UMass Amherst College of Natural Sciences Advisory Board, and the Editorial Board of the NIH/NCATS Assay Guidance Manual. He is an active volunteer with the American Chemical Society, the Regeneron International Science & Engineering Fair, and the National Science Foundation. Mike has been an invited speaker for student groups, professional organisations, and international expositions.

About TetraScience

TetraScience is The Scientific Data Cloud company with a mission to accelerate scientific discovery and development, and improve and extend human life. The Tetra Scientific Data Cloud is the only open, cloud-native platform purpose-built for science that connects lab instruments, informatics software, and data apps across the biopharma value chain. It delivers the foundation of harmonised, actionable scientific data necessary to transform raw data into improved scientific outcomes faster. Through the Tetra Partner Network, market-leading vendors access the power of our cloud to help customers maximise the value of their scientific data. For more information, please visit tetrascience.com.