New report explores how Europe’s evolving data and AI rules will impact open science
The EU’s world-leading data protection and privacy rules can make life very complicated for the region’s scientists and researchers.
Experts say the General Data Protection Regulation (GDPR), for example, is sometimes used as “an excuse” for not sharing sensitive data, such as personal health data, even where there is a clear public interest. Although the GDPR does allow for the analysis of such data in the context of a public health crisis, the regulation stands accused of hampering the efforts of scientists trying to figure out how COVID-19 spreads and who is susceptible.
With Europe now pulling together a new strategy covering the exchange of data and key applications, such as artificial intelligence, the Science|Business Cloud Consultation Group, an independent group of cloud experts from academia, industry and public-sector institutions, has published a new report. It explores how to enable data to be more widely exploited by scientists, while protecting individuals’ fundamental rights and privacy and upholding European values, such as non-discrimination.
The report’s recommendations, which do not necessarily reflect the views of individual members of the group, include making the intent of legislation and regulations crystal clear, so that researchers fully understand the objectives and the legitimate interests being protected, and can readily identify unintended consequences. There is also a call for the development of more and better software tools that make it easier for researchers to comply with European data rules throughout the scientific process.
Other recommendations include linking public subsidies to data-sharing requirements and limiting the extent to which applications and data/metadata are integrated: the default position for service providers could be that data sets should be kept independent of applications as far as possible, unless the custodian of the data can present a robust case for why they have to be integrated.
New spaces, new rules?
As well as considering the impact of the GDPR, the report explores how the common data spaces proposed by the European Commission will work alongside the European Open Science Cloud (EOSC). One of the objectives of the common data spaces will be to fuel research and innovation in Europe, implying there will be significant overlap between them and the EOSC. However, in many cases, researchers are likely to have to pay to access the data, either through a financial transaction or by agreeing to share the results of their research with the providers of the source data.
These data spaces could provide a common infrastructure, together with an appropriate legal framework, to establish an adequate chain of protection for personal data while making use of the derogations provided for in the GDPR.
The report also considers the Commission’s plans to drive the development of safe and responsible artificial intelligence (AI). The Commission is proposing that AI products deemed to be high-risk could have to undergo testing for safety, fairness and privacy before being released onto the European market. But a sufficiently broad, yet robust, definition of a high-risk application could prove hard to pin down. Experts also note it will be hard for regulation to dictate and enforce minimum levels of quality in a field as complex as software applications.