New study calls for more public involvement in the design of big data medical research projects such as DNA databanks and epidemiological studies based on healthcare records, allowing research to continue whilst protecting individual privacy
Public participation should be at the heart of big data projects in health care and biomedical research, according to the findings of a new report by the Nuffield Council on Bioethics, the leading medical ethics body in the UK.
The report calls for greater transparency about how people's data are used, and recommends the introduction of criminal penalties in the UK for the misuse of data.
It also says that anonymising or pseudonymising data is not sufficient to ensure that privacy is not breached.
The report warns that by not taking into account people's preferences and values, projects that could deliver significant public good may continue to be challenged and fail to secure public confidence.
Recent UK health data projects, such as care.data, 100K Genomes, UK Biobank and the Scottish Informatics Programme, have each, in their own way, raised ethical questions surrounding the use of data.
On the other hand, there is also concern that the European Data Protection Regulation, as it currently stands, would make it impracticable to use these important resources, because researchers would not be able to re-use personal information without asking the individuals concerned to consent to each specific project.
The Nuffield report attempts to plot a middle route between protecting individual privacy and ensuring that data protection laws facilitate research. It examines privacy and the public interest, and how developments in data science are putting pressure on conventional means of protecting privacy, including privacy rights, data protection and duties of confidence.
It concludes that good governance that involves public participation and accountability is essential to maintain public trust.
"We now generate more health and biological data than ever before. This includes GP records, laboratory tests, clinical trials and health apps, and it is becoming easier and cheaper to collect, store and analyse this data," said Martin Richards, chair of the Bioethics working party and emeritus professor of Family Research at Cambridge University.
"There is a strong public interest in the responsible use of data to generate knowledge, drive innovation and improve health. However, people understandably have concerns about their privacy. If we don't get this right, we risk losing public trust in research, and ultimately missing out on the benefits this type of research can bring," Richards said.
Possible harms
While the use of data offers significant opportunities to generate knowledge, drive innovation and improve health, the report finds that the possible harms of data misuse are poorly understood and many are not recognised under UK regulation. Examples include the distress caused by a loss of privacy; loss of trust in the medical profession; and the possibility that, if data are not shared appropriately within health services, people might receive poorer care.
Among a number of recommendations, the report says that health authorities should track how data are used and that people should be told if and when there have been breaches of data security.
In addition, the report recommends the introduction of robust penalties, including imprisonment, for the deliberate misuse of data, whether or not it results in harm to individuals.
Re-identification
In addition to the protections offered under UK data protection law and the Human Rights Act, data projects typically use one of two approaches to protect the privacy of individuals: they either seek individuals' consent to use the data, or they de-identify the data to make them anonymous.
The report argues that as data sets are increasingly linked or re-used in different contexts to generate new information, it becomes increasingly difficult to prevent the re-identification of individuals. On its own, consent cannot protect individuals from the potentially harmful consequences of data misuse, nor does it ensure that all their interests are protected.
Therefore, good governance is essential to ensure that systems are designed to meet people's reasonable expectations about how their data will be used, including their expectations about a sufficient level of protection.
Michael Parker, a member of the working party and professor of Bioethics at Oxford University, said compliance with the law is not enough to guarantee that a particular use of data is morally acceptable. "Clearly not everything that can be done should be done. Whilst there can be no one-size-fits-all solution, people should have a say in how their data are used, by whom and for what purposes, so that the terms of any project respect the preferences and expectations of all involved."
Negotiating interests
The use of data involves negotiating a complex range of interests. Everyone has a private interest in protecting their own privacy, and there is also a public interest both in respecting individual privacy and in promoting the public good through research.
The report argues that decisions about the use of data are social choices that will involve a number of different people including regulators, commercial firms, doctors, researchers, patients and the wider public.
Each will bring their own preferences and expectations about how data should be used. An inclusive process of deliberation will help identify the best approach in each data project.
Every data project should produce a clear, public statement about how data will be used, who will have access to it, and should continue to report on how it has, in fact, been used.
"Data is increasingly seen as a commodity to exploit, and there are often strong political, economic or scientific interests that try to set the terms of a data project prior to any wider public debate," said Susan Wallace of Leicester University, another member of the working party. "Any data project should first take steps to find out how people expect their data to be used, and engage with those expectations through a process of continued participation and review," Wallace said.
The Nuffield Council on Bioethics is an independent body funded by the UK Medical Research Council and the medical research charity Wellcome Trust.