Electronic health records expose privacy fears
This is the second in a series of three articles examining the legal and ethical dilemmas faced when using new media and data sources for healthcare market research and data analytics, following on from a BHBIA meeting. This piece looks at the issues around the use of electronic health records.
On 15 January 2013, the UK Health Secretary, Jeremy Hunt, announced that all hospitals would be required to computerise their patients’ records within 12 months. He stated that hospitals would need to have “digital records that are capable of being shared” for all their patients from 2014.
Electronic health records (EHRs) have been on the cards for years, with various software suppliers, such as Cerner and iSoft, helping to enable electronic systems of data collection.
Over the past 10 years, opinion on EHRs has been split. For some, their adoption is a move towards greater efficiency, improved quality, and reductions in cost and medical errors. For others, particularly patients, this development raises a host of privacy concerns.
The use of EHRs means that significant volumes of health information are exchanged electronically every day. This raises many complex security and privacy questions, particularly regarding the nature of consent.
“91 per cent also expected to be explicitly asked for their consent before their medical records were shared”
A recent study from Imperial College London found that 78.9 per cent of respondents voiced concerns about the security of EHR and 71.3 per cent felt the NHS would not be able to guarantee the safety of personal information. In addition, 91 per cent also expected to be explicitly asked for their consent before their medical records were shared for health provision, research or planning purposes.1
Under the UK Data Protection Act 1998, data subjects need to be informed of the purposes for which their information is being collected, stored and transferred, and consent must be sought both for any transfer of information and for any secondary uses of that information.
EHRs therefore represent a complex area in relation to privacy. When and how should consent be obtained from the patient? Should explicit consent be requested before any personal information is released? How does this work in an emergency situation?
Consent and access raise further issues. Personal data still belongs to the patient, but other people have access to it. There are also privacy risks in relation to potential future uses of EHRs. The government may decide to ‘optimise’ the use of EHR data in the future, mining it for findings, patterns or trends that may not otherwise be evident in localised patient records. It may even sell its database to a contractor to undertake such research.
The government could argue that it has anonymised the patient records and is simply extracting value from the database in the public interest. That means, however, that it is using individuals’ personal data (supposedly anonymised) for a purpose other than that for which it was collected. Consent given at the time of data collection thereby becomes ‘implied consent’ when the data is used for other purposes.
There is also a range of issues in relation to the anonymisation of data. In March 2013, Massachusetts Institute of Technology (MIT) researchers published a paper on their analysis of 1.5 million cell phone traces over 15 months and found that just four points of reference were enough to uniquely identify 95 per cent of them.2
In August 2006, AOL Research released a compressed text file on one of its websites containing 20 million search queries from more than 650,000 users over a three-month period. AOL did not identify the users by name, but it took The New York Times less than two days to locate an individual from the ‘anonymised’ search records.3
“A lack of trust in EHR systems could result in a real problem in terms of people not being willing to provide information on their health”
A lack of trust in EHR systems could result in people being unwilling to provide information on their health. This could cause major issues, both for ensuring that people receive appropriate care and for the future of healthcare research.
The NHS suffers from a poor reputation in relation to data security. According to the UK Information Commissioner’s Office (ICO), at least 1.8 million sensitive papers went missing within the NHS between July 2011 and July 2012. Breaches included records being dumped in bins, electronic records found for sale on an auction site, patient records being stolen and posted online, and unsecured laptops being taken from staff members’ homes. The ICO levied fines of around £1 million on NHS bodies during the first six months of 2012.4
Concerns over the privacy and security of data in EHRs are not unfounded. To ensure the highest possible level of data security and privacy safeguards, the UK government must first develop a robust governance and policy framework around EHRs. To address the range of privacy issues that may arise, it should undertake a comprehensive privacy impact assessment (PIA) on the system. A PIA is a process for assessing the privacy impacts of a project, policy, programme, product or service and, in consultation with stakeholders, for taking remedial action as necessary to correct, avoid or minimise negative impacts.5 Undertaking a PIA could help the government identify the privacy impacts of the proposed EHR system and what must be done to ensure that the project does not become a liability. It could also assure stakeholders that the government takes their privacy seriously and seeks the views of those who could be interested in, or affected by, the project.
1 Imperial College London, ‘Key findings from survey on patient views towards EHRs’, 24 Feb 2014.
2 Hardesty, Larry, ‘How hard is it to “de-anonymize” cellphone data?’, MIT News Office, 27 March 2013.
3 Barbaro, Michael, and Tom Zeller Jr., ‘A Face Is Exposed for AOL Searcher No. 4417749’, The New York Times, 9 August 2006.
4 The Telegraph, ‘NHS lost 1.8 million patient records in a year’, 29 October 2012.
5 Wright, David, ‘The state of the art in privacy impact assessment’, Computer Law & Security Review, Vol. 28, No. 1, Feb 2012, pp. 54-61 [p. 55].
About the authors:
David Wright is managing partner of Trilateral Research, a London-based partnership, which he founded in 2004. He has been a partner in numerous projects funded by the European Commission involving privacy, surveillance, risk, security and ethics. He has published many articles in peer-reviewed journals, as well as writing books on privacy issues.
Inga Kroener, senior research analyst, joined Trilateral in 2013. Her research interests lie in the areas of governance, policy and ethics in relation to surveillance and security technologies. Prior to joining Trilateral, Inga worked as a lecturer in Science and Technology Studies at University College London. She has also worked as a senior research associate at Lancaster University, for the RSA and for DEFRA. Inga has a PhD in Science and Technology Studies from UCL, and an MSc in Science and Technology Policy Research from Manchester University. She has been published in peer-reviewed journals.
About the BHBIA: The core aim of the BHBIA is to promote excellence with integrity in business intelligence in the healthcare industry. Members are drawn from pharmaceutical/healthcare companies and the agencies/consultancies that supply business intelligence services to those companies.