ITPI Event recap – The EU Data Strategy and the Draft Data Governance Act (19th May 2021)
Authors: Sebastião Barros Vale, Chelsey Colbert, Limor Shmerling Magazanik, Rob van Eijk
On May 19, 2021, the Israel Tech Policy Institute (ITPI), an affiliate of the Future of Privacy Forum (FPF), hosted, together with the Stewart & Judy Colton Law and Innovation Program at Tel Aviv University, an online event on the European Union’s (EU) Data Strategy and the Draft Data Governance Act (DGA).
The draft DGA is one of the legislative measures proposed to implement the European Commission’s 2020 European Strategy for Data (EU Data Strategy), whose declared goal is to give the EU a “competitive advantage” by enabling it to capitalise on its vast quantity of public and private sector-controlled data. The DGA will establish a framework for using more data held by the public sector, for regulating and increasing trust in data intermediaries and similar providers of data sharing services, and for data altruism (i.e., data voluntarily made available by individuals or companies for the “general interest”).
While speakers also addressed other proposals tabled by the European Commission (EC) for regulating players in the data economy – such as the Digital Services Act (DSA) and the Digital Markets Act (DMA) – most of the discussion revolved around the DGA’s expected impact and underlying policy drivers.
Both Prof. Assaf Hamdani, from Tel Aviv University’s Law Faculty, and Limor Shmerling Magazanik, Managing Director at ITPI, moderated the event, which featured speakers from the EC, the Massachusetts Institute of Technology (MIT), Covington & Burling LLP, Mastercard and the Israeli Government’s ICT Authority.
The recording of the event is available here.
The DGA as a tool for trustworthy data sharing
Maria Rosaria Coduti, Policy Officer at the EC’s Directorate-General for Communications Networks, Content and Technology (DG CNECT), started by outlining the factors that drove the EC to put forward the EU Data Strategy. As the EC acknowledges the potential of data usage for the benefit of society, it intends to harness it by bolstering the functioning of the single market for data, with due regard to privacy, data protection and competition rules.
To attain that goal, several questions needed to be addressed to increase trust in the exchange of such data. These included a lack of clarity on the legal and technical requirements applicable to the re-use of data and to the sharing of data by public bodies, as well as the need for European-based data storage solutions. On the other hand, the EC also identified the need to further empower data subjects through voluntary data sharing, as a complement to their data portability right under the General Data Protection Regulation (GDPR).
According to the EC official, the EU Data Strategy rests on four main pillars: 1) a cross-sectoral governance framework for boosting data access and use; 2) significant investments in European federated cloud infrastructures and interoperability; 3) empowering individuals and SMEs in the EU with digital skills and data literacy; 4) the creation of common European Data Spaces in crucial sectors and public interest domains, through data governance and practical arrangements.
The DGA itself, which intends to create trust in and democratise the data sharing ecosystem, also focuses on four main aspects: 1) the re-use of sensitive public sector data, by addressing “obstacles” to data sharing, complementing the EU’s Open Data Directive – together with an upcoming Implementing Act on High-Value Datasets under Article 14 of that Directive – and building on Member States’ access regimes; 2) business-to-business (B2B) and consumer-to-business (C2B) data sharing through dedicated, neutral and duly notified service providers (“data intermediaries”); 3) data altruism, enabling individuals to share their personal data for the common good with registered non-profit dedicated organisations, notably by signing a yet-to-be-developed standard European data altruism broad consent form; 4) the European Data Innovation Board, an expert group with an EC secretariat, focused on technical standardisation and on harmonising practices around data re-use, data intermediaries and data altruism.
Specifically on the topic of data intermediaries, and replying to questions from the audience, Maria Rosaria Coduti mentioned the importance of ensuring they remain neutral. This means that intermediaries should not be allowed to process the data they are entrusted with for their own purposes. In this respect, Recital 22 of the draft DGA excludes cloud service providers and entities that aggregate, enrich or transform the data for subsequent sale (e.g., data brokers) from its scope of application.
It should be noted that the third Compromise Text released by the Portuguese Presidency of the Council of the EU opens the possibility for intermediaries to use the data for service improvement, as well as to offer “specific services to improve the usability of the data and ancillary services that facilitate the sharing of data, such as storage, aggregation, curation, pseudonymisation and anonymisation.”
Coduti underlined the importance of preventing any conflicts of interest for intermediaries. Data sharing services must thus be rendered through a legal entity separate from the other activities of the intermediary, notably when the latter is a commercial company. This would, in principle, mean that legal entities acting as intermediaries under the DGA would not be covered by the proposed DSA as hosting service providers, nor as “gatekeepers” under the DMA proposal. Ultimately, the EC wishes intermediaries to flourish, becoming trusted players and valuable tools for businesses and individuals in the data economy.
The upcoming Data Act: facilitating B2B and B2G data sharing
Lastly, the EC official briefly addressed the upcoming EU Data Act. In parallel with the revision of the 25-year-old Database Directive, this piece of legislation will focus on business-to-business (B2B) and business-to-government (B2G) data sharing. Its aim will be to maximise the use of data for innovation in different sectors and for evidence-based policymaking, without harming the interests of companies that invest in data generation (e.g., companies that produce smart devices and sensors).
While the Data Act will not address the issue of property rights over data, it will seek to remove contractual and technical obstacles to data sharing and usage in industrial ecosystems (e.g., in the mobility, health, energy and agricultural spaces). Coduti stressed that the discussion around data ownership is complex, due to the proliferation of data obtained from IoT devices and the emergence of edge computing, as well as the necessary balance between keeping the utility of datasets and safeguarding data subjects’ rights through anonymisation.
On the latter point, Rachel Ran, Data Policy Manager at the Israeli Government ICT Authority, echoed Coduti’s concerns, stating that data cannot be universally open. According to the Israeli official, there is a tradeoff between data utility and individual privacy that has to be accepted, but questions remain about the level of involvement that governments should have in determining this balance.
On May 28, 2021, the EC released its Inception Impact Assessment on the Data Act. Stakeholders are encouraged to provide their feedback over a four-week period.
An increasingly complex digital regulatory framework in the EU
Henriette Tielemans, IAPP Senior Westin Research Fellow, offered a comprehensive overview of the EC’s data-related legislative proposals, other than the DGA, tabled over the preceding six months. She also mentioned the trilogue negotiations currently underway between the EU institutions on the proposed ePrivacy Regulation.
In the context of its Strategy for Artificial Intelligence (AI), the EC has very recently published a proposal for a Regulation laying down harmonised rules on AI. Tielemans saw this proposal as important and groundbreaking, suggesting that the EC is looking to set standards beyond EU borders as well, as it did with the GDPR. She stressed that the proposal takes a cautious risk-based approach.
Tielemans highlighted the proposal’s dedicated provision (Article 5) on banned AI practices, arguing that the most contentious among them is real-time remote biometric identification for law enforcement purposes in public spaces. However, she noted that the provision contains wide exceptions to the prohibition, allowing law enforcement authorities to use facial recognition in that context, subject to conditions. Tielemans predicted that the provision will be a “hot potato” during the Regulation’s negotiations, notably within the Council of the EU.
Furthermore, Tielemans stressed that “high-risk AI systems”, which are the major focus of the proposal, are not defined in the text itself, as the EU intends to ensure the Regulation is future-proof. However, Annex III puts forward a provisional list of high-risk AI systems, such as systems used for educational and vocational training (e.g., to determine who will be admitted to a given school). Annex II is more complex, due to its interaction with other EU acts: where a provider wishes to integrate an AI component into a product already subject to a third-party conformity assessment under other EU laws, that component would be considered high-risk AI. Tielemans also noted that, once a system qualifies as high-risk, providers become subject to a number of obligations concerning, among others, training models and record keeping.
On the DSA, Tielemans pointed out that the proposal is geared towards providers of online intermediary services. It provides rules on the liability of providers for third-party content and on how they should conduct content moderation. In principle, she stressed, such providers shall not be liable for information conveyed or stored through their services, although they are subject to takedown – but no general monitoring – obligations. The proposal distinguishes between different types of providers, with their respective obligations matching their role and their degree of importance in the online ecosystem. Hosting service providers have fewer obligations than online platforms, and online platforms fewer than very large online platforms: a sort of obligation “cascade”.
Lastly, Tielemans concisely mentioned the DMA and the revised Directive on Security of Network and Information Systems (NIS 2 Directive) as other noteworthy EC initiatives, identifying the former as a competition law toolbox and an “outlier” in the EU Data Strategy. She also pinpointed the latter’s broader scope, increased oversight and heavier penalties as important advances in the EU’s cybersecurity framework.
Reconciling “overarching” with “sectoral” regulation on data sharing
Helena Koning, Assistant General Counsel, Global Privacy Compliance Assurance & Europe Data Protection Officer at Mastercard, provided some thoughts about the draft DGA’s potential impact on the financial industry.
She started by outlining the actors involved in the data sharing ecosystem. These include: (i) individuals, who demand data protection and responsible use of data; (ii) businesses, that wish to innovate through data usage and insights drawn from data, notably by personalising their products and services; (iii) policymakers, who increasingly regulate data usage; and (iv) regulators with bolstered enforcement powers in this space.
Then, Koning stressed that companies in the financial sector are currently subject to a significant regulatory overlap when it comes to data collection and sharing, with the ePrivacy Directive applying to terminal equipment information, the GDPR applying to personal data in general and the Second Payment Services Directive (PSD2) covering payment data sharing. While there is already guidance by the European Data Protection Board (EDPB) on the interplay between the GDPR and PSD2, Koning added that lawmakers tend to regulate in silos, adopting overlapping and sometimes conflicting definitions and obligations. This results in financial sector players being pushed to wear very different hats under each framework (e.g., payment service providers and data controllers). In this regard, Koning said that the EC should invest further effort in ensuring consistency between EU acts before proposing new legislation.
Koning showed concern that instruments such as the DGA and the Data Act will add to this regulatory complexity and that SMEs and citizens will have a hard time complying with and understanding the new laws. On this point, she addressed the fact that the DGA and PSD2 have diverging models for fostering data-based innovation: as an illustration, while PSD2 mandates banks to share customer data with fintechs, free of charge and upon the customer’s contractual consent, the DGA centres around voluntary data sharing, for which public bodies may charge fees and data subjects are called to give GDPR-aligned consent.
Furthermore, Koning expressed doubts about the immediate benefit that data holders and subjects would get from sharing their data with intermediaries, often in exchange for service fees.
Alternatives to data sharing and focus on data insights
Dr. Thomas Hardjono, from MIT Connection Science & Engineering, conducts research on using data to better understand and solve societal issues, including the spread of diseases and inequality. Hardjono started by commending the direction taken by the EC with the DGA, stating that his group at MIT had been studying issues relating to the commoditization of personal data since the publication of a 2011 World Economic Forum report. In Hardjono’s view, public data is a societal asset that should be treated as carefully and comprehensively as personal data.
On that point, Rachel Ran mentioned that governments should seek to encourage data sharing through data governance and to centre their policies around the needs of data subjects. She added that data products – like Application Programming Interfaces (APIs) – should be human-centered. Data should be seen as a product, but not a commodity, especially when it comes to sharing government data.
Ran continued by describing one of the Israeli ICT Authority’s major projects: creating standard APIs for G2G and G2B data sharing. But there are significant challenges to this task, including: (i) unstructured and fragmented data; (ii) duplicated records and gaps; and (iii) inconsistent data formats and definitions. This ultimately leads to suboptimal decision-making by government bodies, as they are not properly informed by accurate and up-to-date data.
On data sharing services, Hardjono stated that data intermediaries regulated under the DGA may face specific hurdles, notably regarding the intelligibility of the data conveyed to data users. There are questions as to whether the draft DGA’s prohibition on intermediaries aggregating and structuring data could prevent them from developing services that are attractive to potential data users. Koning added that a number of data sharing collaborations are already in place and that new EU regulation should facilitate rather than hinder them.
On that topic, Hardjono mentioned that communities would be more interested in accessing insights and statistics about their citizens’ activity (e.g., on transportation, infrastructure usage and spending patterns) than in large sets of raw data. Aggregated data, in turn, could be publicly shared with the wider society.
As a solution, Hardjono proposed developing and making available Open Algorithms, allowing data users (e.g., a municipality) to access specific datasets of interest and to directly ask questions to and obtain insights from data holders about such datasets, through APIs. This would also avoid moving the data around, as it would remain with the data holders. A minimal sketch of this pattern follows below.
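To make the Open Algorithms idea more concrete, the following is a minimal, purely illustrative sketch (not taken from the speakers’ materials or from any DGA text) of how a data holder could answer pre-approved, aggregate-only queries through an API while keeping the raw records in place. All names (DataHolder, run_query, MIN_GROUP_SIZE) and the small-group suppression threshold are hypothetical assumptions.

```python
# Illustrative sketch of the "Open Algorithms" pattern: the data stays with the
# data holder, and data users may only run pre-approved, aggregate-level queries.
# All names and thresholds below are hypothetical, for illustration only.

from collections import Counter

MIN_GROUP_SIZE = 5  # suppress small groups so individuals cannot be singled out


class DataHolder:
    """Keeps raw records locally and answers only vetted, aggregate queries."""

    # Whitelist of queries the holder has reviewed and approved in advance.
    APPROVED_QUERIES = {"trips_per_district"}

    def __init__(self, records):
        self._records = records  # raw data never leaves this object

    def run_query(self, query_name, **params):
        if query_name not in self.APPROVED_QUERIES:
            raise PermissionError(f"Query '{query_name}' has not been approved")
        if query_name == "trips_per_district":
            return self._trips_per_district(**params)

    def _trips_per_district(self, mode=None):
        counts = Counter(
            r["district"]
            for r in self._records
            if mode is None or r["mode"] == mode
        )
        # Return only aggregates, dropping groups too small to be safely released.
        return {district: n for district, n in counts.items() if n >= MIN_GROUP_SIZE}


if __name__ == "__main__":
    # A data user (e.g., a municipality) asks a question and receives insights,
    # never the underlying records.
    holder = DataHolder(
        [{"district": "Center", "mode": "bus"} for _ in range(12)]
        + [{"district": "North", "mode": "bus"} for _ in range(3)]
    )
    print(holder.run_query("trips_per_district", mode="bus"))  # {'Center': 12}
```

In this sketch, the data user receives only aggregate counts; the records never leave the holder, and groups below the threshold are suppressed, echoing Hardjono’s point that insights, rather than raw data, are what most communities actually need.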
Another question then arises, according to Hardjono: given the commercial value of data insights, there should be business incentives, possibly via fair remuneration, to gather, structure and analyze the data. In that context, Hardjono stressed that clarifying the intermediaries’ business model is crucial and should be addressed by the DGA. He also suggested that a joint remuneration model, shared between the public sector and data users, could be devised. Moreover, this raises novel doubts about data ownership, notably about who owns the insights (the holder or the intermediary?) and under what title: could they be considered the provider’s intellectual property?
Upon the observation from Prof. Assaf Hamdani that some cities are now requiring or incentivising companies and citizens to share data through administrative procedures and contracts, Hardjono regretted that the DGA did not devote enough attention to so-called data cooperatives. While Article 9(1)(c) of the DGA does offer a description of the services that data cooperatives should offer to data subjects and SMEs (including assisting them in negotiating data processing terms), there is an extensive academic discussion in the US about other roles these cooperatives could play in defending citizens’ interests, which could feed into the DGA debates. On the issue of data cooperatives, Ran held that such cooperatives should address data subjects’ needs and share data for a specified purpose, praising the DGA model in that regard.
Lastly, Hardjono highlighted the fact that certain datasets may have implicit bias and that algorithms used to analyze such data may thus be implicitly biased. Therefore, he held that ensuring algorithmic auditing and fairness is key to achieving good societal results from the usage of the large volumes of data at the relevant players’ disposal.
Ran added that, besides being trustworthy, data should also be discoverable, interoperable (relying on common technical standards) and self-describing, to facilitate its sharing and ensure its usefulness.
For further reading, you can check out:
ITPI & FPF’s report “Using Health Data for Research: Evolving National Policies”
FPF’s report “Understanding Corporate Data Sharing Decisions: Practices, Challenges, and Opportunities for Sharing Corporate Data with Researchers”
FPF’s Event report: “Brussels Privacy Symposium 2020 – Research and the Protection of Personal Data under the GDPR”