Getting personal with diagnostics and data

Tim Harper

Cientifica Ltd

This article, in our personalised medicine month, discusses diagnostics and data, looking at the challenges in accuracy, technological advances and the emergence of “Big Data”.

Despite huge advances in medical technology over the last century, the diagnosis and treatment of disease is still “mired in voodoo-like practices”, according to Silicon Valley investor Vinod Khosla. While most doctors would vehemently disagree, there is little doubt that diagnosis and prescription have room for major improvement, and that a variety of emerging technologies are beginning to offer solutions that promise to improve accuracy, efficiency and clinical outcomes.

One of the biggest challenges in accurate diagnosis is the availability of data. This may be information about the patient’s previous medical history, something that is increasingly available. Ideally, it would also include the clinical outcomes of treating patients with a similar medical background, supported by a range of diagnostic tests and even genetic data – a tall order for a 15-minute appointment with a GP. But by finding new ways to use data, and by making new kinds of data available, this may become possible in the near future.


“A major bottleneck with many diagnostic techniques is the time lag between taking a sample and getting a result back.”


First, let’s take a look at making new data available. A major bottleneck with many diagnostic techniques is the time lag between taking a sample and getting a result back. In some cases this is unavoidable, as cells have to be cultured and culturing takes a finite time. More often, however, the delay comes from taking a sample, shipping it to a lab for analysis and waiting for the results to come back – a process that is both time-consuming and expensive, and one that is ripe for disruption.

Many diagnostic tests, from haematology through food allergies to infectious diseases, are based on enzyme-linked immunosorbent assays (ELISA), which detect specific antigens and produce an optical indication via a colour change, bioluminescence or fluorescence. Samples are scanned optically using a combination of a microscope objective lens and a CCD camera, and can give quantitative results. But this model is under threat from a number of point-of-care (PoC) diagnostic tests, which aim to replace all of the current hardware with a simple rapid test that can be performed in a doctor’s surgery – just like a pregnancy test.
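To see how an optical reading becomes a quantitative result, here is a minimal sketch of the usual approach: measured optical density is converted to an analyte concentration by interpolating a standard curve. The standard-curve values and units below are invented for illustration; real assays typically fit a logistic curve rather than interpolating linearly.

```python
# Illustrative sketch: estimating an analyte concentration from an
# ELISA optical density (OD) reading via a standard curve.
# All numbers below are made up for illustration.

def interpolate_concentration(od, standards):
    """Linearly interpolate concentration from optical density.

    standards: iterable of (concentration, od) pairs.
    """
    pts = sorted(standards, key=lambda p: p[1])
    for (c_lo, od_lo), (c_hi, od_hi) in zip(pts, pts[1:]):
        if od_lo <= od <= od_hi:
            frac = (od - od_lo) / (od_hi - od_lo)
            return c_lo + frac * (c_hi - c_lo)
    raise ValueError("OD outside the range of the standard curve")

# Hypothetical standard curve: (concentration in ng/mL, measured OD)
standards = [(0, 0.05), (10, 0.30), (50, 0.90), (100, 1.40)]

# An unknown sample reading of 0.60 falls between the 10 and 50 ng/mL
# standards, so the estimate lands between those two concentrations.
print(interpolate_concentration(0.60, standards))
```

The same conversion applies whether the optical signal is read by a benchtop scanner or by a printed detector – only the hardware changes.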

The approach we see succeeding is to print the diagnostic tests, and the optical detectors, using a variety of nanoparticle-containing inks. A similar approach has been used to produce plastic solar panels, with the major advantage coming from the low cost of printing electronic devices using a roll-to-roll process rather than a highly complex and expensive semiconductor fabrication route. The optical detector – similar to a solar cell, converting photons into electrons and hence into a detectable current – is printed on a plastic substrate, and the lab-on-a-chip device is then printed, layer by layer, on top of it. Not only is the device cheap to manufacture, but placing the detector as close as possible to the ELISA test allows almost 50% of the emitted light to be collected, compared with the few percent that would be seen by a microscope objective lens positioned above an assay well – and that translates into higher sensitivity or a faster result, depending on the test.
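The 50% versus few-percent comparison follows from simple solid-angle geometry: for an isotropic emitter, the fraction of light collected is (1 − cos θ)/2, where θ is the half-angle of the collection cone. The sketch below works this through; the numerical aperture of 0.25 is an assumed, typical low-magnification value, not a figure from the article.

```python
import math

# Fraction of isotropically emitted light collected, from the solid
# angle of the collection cone: f = (1 - cos(theta)) / 2.

def collection_fraction(half_angle_rad):
    return (1 - math.cos(half_angle_rad)) / 2

# A detector printed directly beneath the well sees nearly a full
# hemisphere (half-angle approaching 90 degrees), i.e. close to 50%.
printed = collection_fraction(math.radians(90))

# A microscope objective in air with an assumed numerical aperture of
# NA = 0.25 has half-angle arcsin(NA), collecting only a few percent.
objective = collection_fraction(math.asin(0.25))

print(f"printed detector: {printed:.1%}, objective: {objective:.1%}")
```

In practice refraction at the well and non-isotropic emission complicate the picture, but the order-of-magnitude gap survives.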


“But a major problem is that most doctors have to rely on their own skill and training to make accurate clinical decisions…”


Getting the results in minutes instead of weeks is a major step forward, but combining such diagnostics with techniques like nanopore-based genetic sequencing – which promises to reduce the cost of sequencing to hundreds of dollars – would give doctors all the information they need to diagnose and prescribe in a single appointment.

But a major problem is that most doctors have to rely on their own skill and training to make accurate clinical decisions, despite similar decisions being made by other doctors every day, and the outcomes of those decisions being monitored. In the same way that a combination of GPS mapping and social networking allows us to quickly make decisions about what to eat, where to stay and how to get there, could we apply these same ideas to medicine? That’s where the emerging technology of ‘Big Data’ comes in.
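The “similar patients” idea can be made concrete with a toy sketch: given a new patient’s characteristics, find the most similar past cases and summarise the recorded outcomes by treatment. Every record, feature and treatment name below is invented, and a real system would need far richer features and rigorous validation.

```python
# Hypothetical sketch: nearest-neighbour lookup over past cases to
# summarise outcomes of similar patients. All data here is invented.
import math
from collections import defaultdict

past_cases = [
    # (age, BMI, systolic BP), treatment, outcome (1 = improved)
    ((54, 29.0, 150), "drug_a", 1),
    ((60, 31.5, 160), "drug_a", 0),
    ((57, 30.2, 155), "drug_b", 1),
    ((49, 27.8, 148), "drug_b", 1),
    ((62, 33.0, 165), "drug_a", 1),
]

def outcomes_for_similar(patient, cases, k=3):
    """Rate of good outcomes per treatment among the k nearest cases."""
    nearest = sorted(cases, key=lambda c: math.dist(patient, c[0]))[:k]
    by_treatment = defaultdict(list)
    for _, treatment, outcome in nearest:
        by_treatment[treatment].append(outcome)
    return {t: sum(o) / len(o) for t, o in by_treatment.items()}

print(outcomes_for_similar((56, 30.0, 152), past_cases))
```

Even this crude version shows the shape of the promise: the decision is informed by what actually happened to comparable patients, not by one doctor’s memory alone.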

Big Data refers to the technologies required to take the huge quantities of data that already exist – in healthcare, the estimate is that our current 500 petabytes (a petabyte is a million gigabytes) will increase fiftyfold by 2020 – and turn them into useful information. The challenge is that this data is held in a wide variety of forms, from structured databases to collections of images or PDF archives of medical journals. Applications such as monitoring citywide networks of traffic sensors, water flow and energy usage, or extracting information to allow businesses to make better decisions, are driving the technology forward, and medical applications are already on the radar.
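For a sense of scale, a quick calculation on the figures quoted above:

```python
# Quick arithmetic on the healthcare data estimate quoted above.
current_pb = 500                 # current holdings, in petabytes
projected_pb = current_pb * 50   # fiftyfold increase by 2020

# 1 exabyte = 1000 petabytes; 1 petabyte = a million gigabytes
print(f"{projected_pb} PB = {projected_pb / 1000:.0f} exabytes")
```

That is 25 exabytes – 25 billion gigabytes – of largely unstructured medical data to be made searchable and useful.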


“…applying Big Data to medicine promises to put petabytes of global expertise in the hands of doctors…”


In the same way that the iPhone put the Internet at your fingertips, applying Big Data to medicine promises to put petabytes of global expertise in the hands of doctors. Combined with the results of point-of-care diagnostics, this would allow them to make informed and highly personalised decisions based not only on the results to hand, but also on the outcomes of the proposed treatment in large numbers of similar patients.

While it is unlikely that, as Vinod Khosla claimed, technology will replace 80% of doctors – privacy issues, and the need to deliver results in person rather than by a text message asking you to report immediately to the nearest hospital, will see to that – it is far more certain that technology will improve 80% of clinical decisions, something that benefits both patients and healthcare authorities.




About the author:

Tim has built a twenty-five year career on identifying, understanding and acting on technology trends, from instigating and managing development projects to investment and eventual commercialisation of micro- and nanotechnologies. This has been complemented by work in assessing and addressing the societal challenges created by the development and governance of emerging technologies such as nanotechnology, synthetic biology, regenerative medicine and geoengineering.

Tim has founded companies ranging from scientific instruments to medical diagnostics, and is recognised as both a leading expert in the economics and commercialisation of nano- and other emerging technologies, and as a respected science and technology communicator. His work has been used by governments and public policy organisations, the financial sector and international business in setting, defining and measuring effectiveness of R&D programs, technology policy and innovation strategy.
