'Quantified Self' data versus privacy concerns

As data tracking and collection increase, a balance has to be struck between the health benefits of increased knowledge and the resulting intrusion on individual privacy.

The 'Quantified Self' movement – the use of technology to acquire data on multiple aspects of a person's daily life – has been gathering steam for some time. There has been an influx of affordable, non-intrusive wearable technology, and the latest gadget, the Apple Watch, tracks not only wearers' heart rates and exercise times but also how many calories they burn and how often they take breaks from sitting.

Myriad vendors and consumer and pharmaceutical brands have invested heavily in this space, and the technology is now small and affordable enough to be accessible to the masses. It has been estimated that around 485 million wearable devices will be shipped worldwide annually by 2018.1 Beyond wearable technology, the number of 'things' connected to the Internet is forecast to exceed 50 billion by 2020. These 'things' can, and will, track relevant, health-related activities such as sleep and eating patterns, coffee consumption, movement within a household, toilet usage, and so on.


"Companies will soon be collecting troves of aggregated data that will make what is currently being collected seem minuscule in comparison"

Individuals should, under data protection legislation and human rights acts, be able to opt out of certain types of data tracking and collection, but is it really possible to opt out of the intertwined networks all around us? If the providers of these products and services can tie the health and wellness benefits of the Quantified Self movement to a 'silent or conditional opt-in' and so reach critical mass, then companies will soon be collecting troves of aggregated data that will make what is currently being collected seem minuscule in comparison. The term 'Big Data' will hardly suffice.

Big Data is poised to deliver 'big savings' and 'transform healthcare', and it is certainly full of promise for the future with its 'big benefits'. However, Big Data comes with a big challenge: privacy. Most data subjects – that is, you and I, ordinary people – are unaware of how our personal data is collected (not only when we provide our details while shopping online but also when we simply browse the web), stored (in which country?), transferred (where will it go?) and used (who will do what to it?).

Simply asking data subjects to click on and agree to lengthy pages of 'terms and conditions' before allowing them to use a service is no longer sufficient from the US Federal Trade Commission's (FTC) perspective. Strong advocates and an increasing number of fines for Big Data misuse show that the 'Big Privacy' movement is already building momentum.

There are many questions about Big Data and healthcare. Some argue that the focus of the Big Data phenomenon has already moved from 'should we adopt Big Data in our business?' to 'how can we use Big Data to grow our business?' We have moved on from the data-scarce era to one in which we are flooded with more data than we can comprehend. Undoubtedly, Big Data is helping researchers gain information beyond their wildest dreams: comprehensive medical records covering a wider population, holistic healthcare evaluation from primary to secondary care, and multiple perspectives on a single case are all readily available.

The big question often ignored by Big Data proponents is, where are these data coming from, and do we have the proper informed consent in place from the original data subjects? One common reason this question is swept under the carpet is the answer: We don't know.

Data on your desk right now could already have passed through hundreds of different data brokers; it may be an extract from a larger database, so the original source is untraceable. However, 'unknown' does not mean the data is legal, ethical or even 'totally anonymous'. Indeed, with Big Data analytics, some argue that truly anonymised data no longer exists.

The next question is: what can, or will, Big Data analytics do? If those actions could disadvantage the data subject, even in the future (an increase in insurance premiums, say, or the potential for employment discrimination), or take place without proper consent (or simply against the data subject's wishes), Big Privacy will probably win. Facebook and Google have made headlines over claims of data fraud or privacy violations. And privacy advocates and the FTC have called fitness-tracking apps a 'nightmare' and 'very disturbing'.

In the battle with Big Data, individuals do not need to be 'identified' in order to be placed in a disadvantageous position. For example, if all de-identified medical records were openly available, would health insurance premiums increase simply because a person lives in an area with a high prevalence of smoking and obesity? Even when medical records are available only for public-sector research, a sharp drop in women reporting postnatal depression has been observed, because of the fear that their babies might be taken away by social services. And will we refuse to be treated by an HIV-positive nurse, or avoid a hospital with a higher-than-average rate of hepatitis infections?2 Given enough money, resources and time, all de-identified data can be identified again.
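
To see why, consider a toy linkage sketch (illustrative only: the two small datasets, the column names and the pandas-based join below are assumptions made for the example, not anything described in this article). Joining a 'de-identified' health file to a public list that shares quasi-identifiers such as postcode, birth date and gender can be enough to re-attach names to sensitive records.

import pandas as pd

# Hypothetical 'de-identified' health data: direct identifiers removed,
# but quasi-identifiers (postcode, birth date, gender) retained.
health = pd.DataFrame({
    "postcode": ["SW1A 1AA", "M1 1AE"],
    "birth_date": ["1980-02-14", "1975-07-30"],
    "gender": ["F", "M"],
    "diagnosis": ["postnatal depression", "hepatitis C"],
})

# Hypothetical public dataset (an electoral roll or marketing list, say)
# that carries names alongside the same quasi-identifiers.
public = pd.DataFrame({
    "name": ["Jane Doe", "John Smith"],
    "postcode": ["SW1A 1AA", "M1 1AE"],
    "birth_date": ["1980-02-14", "1975-07-30"],
    "gender": ["F", "M"],
})

# Linking on the shared quasi-identifiers re-attaches names to diagnoses.
reidentified = health.merge(public, on=["postcode", "birth_date", "gender"])
print(reidentified[["name", "diagnosis"]])

The same logic scales: the more attributes two datasets share, the fewer people each combination of values describes, and the easier the match becomes.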

Steps healthcare market researchers can take

Big Data has certainly gained the attention of healthcare market researchers. It is important to engage practitioners, patients and regulatory bodies on the benefits of participating in Big Data research by conducting individual needs assessments. Stakeholders need to learn 'what's in it for them', and an awareness campaign featuring positive stories from data subjects who have benefited from Big Data is a good start.

Security and accuracy are the two main concerns of Big Privacy advocates, especially in the healthcare space. If Big Data practitioners can earn the trust of all key stakeholders by safeguarding a transparent process for collecting accurate, accessible data, we may see a happy ending in which both Big Data and Big Privacy win.

References

1 ABI Research. Wearable Computing Devices, Like Apple's iWatch, Will Exceed 485 Million Annual Shipments by 2018. 21 February, 2013.

2 Anderson R. Why Anonymisation Doesn't Protect Privacy. https://www.privacyassociation.org/media/presentations/14DPI/DPI14_Keynote_RAnderson_PPT.pdf

About the author:

Jessica Santos PhD is global compliance and quality director with Kantar Health. She provides oversight and support for pharmacovigilance, global data privacy, ISO certification, SOX compliance and HIPAA and pharmaceutical client requirements.

Have your say: What should be done to strike a balance between privacy concerns and Big Data?


3 December, 2014