Tomorrow is May 25 and therefore the entry-into-force of the GDPR. The European Commission views the GDPR as one of its significant Digital Single Market (DSM) achievements. The Commission estimates that the DSM could add Euros 415 billion a year to EU GDP and create hundreds of thousands of jobs (see also this document on the economic impact of the DSM). There is no Commission calculation of what contribution the GDPR would make to this overall DSM estimate (it does say that the GDPR will save business some money – see below), but the Commission argues that the GDPR will enhance trust in the digital economy and therefore promote the expansion of Europe’s digital economy.
As somebody who has spent a significant portion of the last year counselling member companies on the GDPR, I find that the immediate compliance burden looms larger than the possible innovation opportunity. Nonetheless, there is still scope for European regulators and policymakers to interpret and implement the GDPR in ways that would help innovation in Europe. This is crucial because entry-into-force is not the end of the process. In many ways, it is the beginning. And EU policymakers will grapple with implementing the law to ensure it does not have unintended consequences.
The scope of the GDPR is vast, but here are ten suggestions that reflect issues that our member companies are concerned about as regulators implement and interpret the new regulation.
- Ensure Harmonized Implementation of the GDPR
- Make the One-Stop Shop Work for Everybody, Including SMEs
- Promote Artificial Intelligence Adoption in the EU
- Focus Regulatory Resources on Areas Where the Potential for Significant Consumer Harm Exists
- Craft Data Breach Reporting Requirements so that Overnotification Does Not Occur
- Embrace Data Portability but Do Not Stop Companies from Offering Tailored Solutions for Customers
- Allow Firms to Use Legitimate Interest when Processing Publicly Available Personal Information
- Implement Right-to-be-Forgotten to Meet Societal Goals and Allow for Technological Innovation
- Work with the DPAs so that the ICANN WHOIS System Can Continue to Be Used
- Consider the Education Technology Sector – Set the Age of Consent at 13 and Review Contracts
Ensure Harmonized Implementation of the GDPR
It is worth recalling that when the GDPR was proposed, the idea was that one uniform law would actually save business money. The European Commission estimated that it would save industry Euros 2.3 billion a year. However, compliance costs for the Global 500 could amount to $7.8 billion. Whatever the true numbers, it is essential that companies be able to rely on information provided by Member State Data Protection Authorities and that this information be valid throughout the Union. So SIIA takes the Commission at its word when it says in this Communication that the GDPR provides for: “A harmonized legal framework leading to uniform application of rules to the benefit of the EU digital single market.” But it is worth noting that, as of January 24, 2018, only two Member States had adopted all the legislation needed for full implementation of the GDPR. And even when all Member States have done so, there remain, despite a single Regulation, important areas where Member State rules could diverge. For example, it is up to Member States to lay down the rules for reconciling freedom of expression with data protection. It will be important to at least strive for as unified a view as possible on this topic, given that the right-to-be-forgotten has to be balanced with freedom of expression.
Make the One-Stop Shop Work for Everybody, Including SMEs
With respect to the Lead Supervisory Authority, the guidance says that companies without an establishment in the EU are obliged to deal with the regulator in every country in which they are active. This is potentially problematic for the thousands of SMEs throughout the world (not just the United States) that do routine business with the EU. Companies that solicit advice from a DPA should be able to rely on that advice throughout the Union. Without this ability, the very notion of a single digital market is compromised.
Promote Artificial Intelligence Adoption in the EU
The European Commission notes in this Communication that private investment in AI in 2016 was Euros 2.4 to 3.2 billion in Europe; Euros 6.5 to 9.7 billion in Asia; and Euros 12.1 to 18.6 billion in North America. The Commission would like to encourage more private AI investment in the EU. As the Communication notes, this involves creating a multi-faceted strategy. It will be important in this context to ensure that the GDPR’s Recital 71 “right to an explanation” is interpreted in such a way that it does not, in effect, inhibit AI innovation in the EU. In particular, as the Working Party 29 guidance on automated individual decision-making and profiling suggests, it should not be interpreted to require the public disclosure of proprietary source code. It should also apply only to those automated decisions that have legal or similarly significant effects. This SIIA Issue Brief on “Algorithmic Fairness” suggests that companies could offer a “rough narrative” for how algorithms arrive at outcomes, as sketched below. SIIA’s comment letter to the Article 29 Working Party articulates this point in this way: “In order to avoid disparate impact, companies should have data and model governance programs; consider when to expend resources to assess the impact of their data practices on vulnerable groups; demonstrate social benefits stemming from their uses of data analytics systems; and, provide transparency and explanations without having to disclose source code of proprietary algorithms.” A recent Center for Data Innovation paper entitled “How Policymakers can Foster Algorithmic Accountability” also offers a useful framework for thinking about these issues. And the bottom line is this: the more data is available for analysis, the more useful the algorithms become. That makes it all the more important that the GDPR (and the EC’s free flow of non-personal data proposal) truly lead to one integrated data market in the Union.
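As an illustration only, the following minimal Python sketch shows one way a company might generate such a “rough narrative”: surfacing the factors that most influenced an automated score without publishing the underlying model or source code. The feature names, weights, and scoring logic are entirely hypothetical and are not drawn from any member company’s system.

```python
# Illustrative sketch only: a "rough narrative" explanation of an automated
# decision, surfacing the top contributing factors without revealing the
# underlying proprietary model. Feature names and weights are hypothetical.

def explain_decision(features, weights, top_n=3):
    """Return a plain-language summary of the factors that most influenced a score."""
    contributions = {name: value * weights.get(name, 0.0)
                     for name, value in features.items()}
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = [f"- {name} ({'raised' if c > 0 else 'lowered'} the score by {abs(c):.2f})"
             for name, c in ranked[:top_n]]
    return "The decision was most influenced by:\n" + "\n".join(lines)

# Hypothetical applicant data and model weights
applicant = {"payment_history_score": 0.9, "credit_utilization": 0.6, "account_age_years": 4}
model_weights = {"payment_history_score": 2.0, "credit_utilization": -1.5, "account_age_years": 0.1}

print(explain_decision(applicant, model_weights))
```

The point of such a narrative is transparency about the main drivers of an outcome, not disclosure of the model itself.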
Focus Regulatory Resources on Areas Where the Potential for Significant Consumer Harm Exists
This is not a plea for Europe to adopt the U.S. sectoral privacy system. However, the GDPR distinguishes between personal data and sensitive personal data – see Recital 51. Even if both are protected, that does not mean regulatory resources must be devoted equally to both kinds of data. European Data Protection Supervisor Giovanni Buttarelli notes that there are about 2,500 people working for all of the EU’s DPAs. So it would be logical for regulators to focus on protecting, for instance, health and financial data, especially in a data breach context. Misuse and/or theft of this data really does have the potential to threaten individuals’ fundamental rights and freedoms.
Craft Data Breach Reporting Requirements so that Overnotification Does Not Occur
Data breaches are unfortunately all too common all over the world. It is therefore understandable that a lot of emphasis has been placed on the GDPR’s data breach notification requirements. The basic requirement in Article 33 is for a data breach to be reported to the DPA “without undue delay and, where feasible, not later than 72 hours after having become aware of it.” SIIA provided comments to the Article 29 Working Party on this issue. It will be important to focus notice requirements on data breaches involving identity theft or possible financial harm. Regulators should work with companies so that there is a culture of resolving, not just reporting, data breaches, which means that the awareness standard for a breach should be certainty, not reasonable certainty. Controllers are supposed to have “arrangements” with processors regarding data breaches – this should be clarified to mean contractual arrangements. The 72-hour clock should not start ticking when a processor or sub-processor becomes aware of a data breach. Processors and sub-processors should notify their customers (the controllers) “without undue delay,” as the GDPR requires. The 72-hour clock should then start ticking from the time the controller becomes aware of the breach, as illustrated below.
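To make the timeline concrete, here is a minimal sketch assuming the reading argued for above, namely that the 72-hour clock runs from the controller’s awareness rather than the processor’s. The timestamps are hypothetical and the code is purely illustrative.

```python
# Minimal sketch of the notification timeline described above, assuming (as
# argued in this post) that the 72-hour clock runs from the moment the
# *controller* becomes aware of the breach, not the processor. Timestamps are
# hypothetical.
from datetime import datetime, timedelta, timezone

NOTIFICATION_WINDOW = timedelta(hours=72)

def dpa_notification_deadline(controller_aware_at: datetime) -> datetime:
    """Deadline for notifying the supervisory authority under Article 33."""
    return controller_aware_at + NOTIFICATION_WINDOW

# Processor detects the breach and notifies the controller "without undue delay".
processor_aware = datetime(2018, 5, 28, 9, 0, tzinfo=timezone.utc)
controller_aware = datetime(2018, 5, 28, 15, 30, tzinfo=timezone.utc)

print("Processor became aware: ", processor_aware.isoformat())
print("Controller became aware:", controller_aware.isoformat())
print("Notify the DPA by:      ", dpa_notification_deadline(controller_aware).isoformat())
```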
Embrace Data Portability but Do Not Stop Companies from Offering Tailored Solutions for Customers
See SIIA’s comment to the Article 29 Working Party on this topic. It is true that data portability can increase competition. But regulators need to be careful that it does not produce the opposite effect. Competition in the technology industry is often driven by consumer demand for competing features and functions. So making application programming interfaces (APIs) mandatory in every case could undermine the very competition policymakers say they wish to see. It would also be helpful if regulators confirmed that the portability right arises in the business-to-consumer, not the business-to-business, context. It is crucial that data processors not be liable for the portability obligations of their controller customers, because it is the controllers that have the relationships with data subjects, not the data processors. The portability right should cover data affirmatively provided by data subjects, not inferred data. Although data should be provided in a machine-readable format, this does not mean that each firm must have systems that are compatible with the systems used by all other companies. There should be no blanket interoperability requirement; instead, vendors should be encouraged to develop APIs from common data formats or to allow SMEs to create a cost-effective “translation and transfer” market. Pseudonymous data should not be covered by the data portability requirements if it is not reasonably likely that an individual will be re-identified. A simple sketch of what a portability export might look like follows.
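A minimal, purely illustrative sketch of such an export: data the user affirmatively provided is serialized to a machine-readable format (JSON here), while inferred data is left out. The record fields and the provided/inferred split are hypothetical, not a description of any particular service.

```python
# Illustrative sketch of a portability export: only data the user affirmatively
# provided is exported, in a machine-readable (JSON) format. The record fields
# and the split between "provided" and "inferred" data are hypothetical.
import json

user_record = {
    "provided": {  # data the data subject supplied directly
        "name": "Jane Example",
        "email": "jane@example.com",
        "saved_searches": ["data protection", "portability"],
    },
    "inferred": {  # derived by the service; not included in the export
        "propensity_score": 0.82,
        "predicted_interests": ["privacy law"],
    },
}

def export_portable_data(record: dict) -> str:
    """Serialize only user-provided data to a machine-readable JSON document."""
    return json.dumps(record["provided"], indent=2)

print(export_portable_data(user_record))
```

A machine-readable export of this kind does not require that every firm’s systems be directly interoperable with every other firm’s systems.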
Allow Firms to Use Legitimate Interest when Processing Publicly Available Personal Information
Many companies access and process publicly available information to provide valuable services such as anti-money-laundering screening, know-your-customer checks, and credit scoring. Companies should be allowed to do so using legitimate interest as the justification for processing that information. That does not mean, however, that firms should be absolved from the duty to provide an explanation, or from otherwise being subject to GDPR requirements, if the result of the processing could affect a data subject’s interests or fundamental rights and freedoms.
In addition, when applying the requirement in Article 5 that information collected for specific purposes “not be further processed in a manner that is incompatible with those purposes,” the analysis of context and the likelihood of harm should be key elements in assessing whether the new use of the data is compatible with the original purpose. This is the approach taken in the Working Party 29’s opinion on purpose limitation, which allows the assessment of “compatibility” to consider “the context in which the personal data have been collected and the reasonable expectations of the data subjects as to their further use,” the “impact of further processing on the data subjects” and “the safeguards applied by the controller to prevent…any undue impact on the data subjects.”
Implement Right-to-be-Forgotten to Meet Societal Goals and Allow for Technological Innovation
SIIA has written extensively and critically about the right-to-be-forgotten, as this blog makes clear. Basically, the new right veers too far in the direction of compromising freedom of expression and outsources what are really value-driven decisions to companies in a way that policymakers might one day regret. Moreover, while the extra-territorial application of the GDPR to companies that do business with EU citizens even if those companies are not established in the EU can be defended, the idea that the right-to-be-forgotten should be implemented even in non-EU Internet domains goes too far. The United States, for instance, simply has a different approach to freedom of expression from the EU. Beyond ideology and constitutional considerations, the right-to-be-forgotten has practical implications, as this Atlantic Council and Thomson Reuters report entitled “Big Data: A Twenty-First Century Arms Race” makes clear. Companies that provide crucial services to, for instance, financial regulators need access to credible and up-to-date data, including personal data.
With respect to technological innovation, the European Commission is promoting blockchain technologies. One way to do that would be, as discussed above, to treat pseudonymization as meeting the anonymization test if it is not reasonably likely that an individual will be re-identified. See this interesting slide presentation called “GDPR Ramifications of Blockchain Technologies” by Kenneth K. Luvai, which discusses this possibility. Pseudonymization would allow blockchains to continue to be constructed so that they are immutable, i.e. unchangeable, which is what provides their powerful record-keeping potential, as this SIIA Issue Brief describes. A sketch of what pseudonymization before writing to a ledger might look like follows.
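As a purely illustrative sketch, assuming a keyed-hash approach to pseudonymization (one of several possible techniques), the following shows how a direct identifier could be replaced with a pseudonym before a record is written to an append-only ledger. Key management and the ledger itself are out of scope, and all names and values are hypothetical.

```python
# Minimal sketch of pseudonymization before writing to an append-only ledger:
# a keyed hash (HMAC) replaces the direct identifier, so the ledger record
# contains no raw personal data. Key management and the ledger itself are
# assumed and out of scope; names and values are hypothetical.
import hashlib
import hmac
import json

SECRET_KEY = b"keep-this-key-off-the-ledger"  # held by the controller, never published

def pseudonymize(identifier: str) -> str:
    """Derive a stable pseudonym from an identifier using a keyed hash."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

ledger_entry = {
    "subject": pseudonymize("jane@example.com"),  # pseudonym, not the email itself
    "event": "consent_recorded",
    "timestamp": "2018-05-25T00:00:00Z",
}

print(json.dumps(ledger_entry, indent=2))
```

Because only the pseudonym is written to the chain, the immutable record can stand even where the underlying personal data must later be deleted or corrected off-chain.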
Work with the DPAs so that the ICANN WHOIS System Can Continue to Be Used
The Internet Corporation for Assigned Names and Numbers (ICANN) has contracts with Registries and Registrars that register millions of domain names every year. The Registries and Registrars are obliged to maintain domain name registration data, i.e. names, email addresses, phone numbers, and other administrative and technical data. This data is known as WHOIS data, i.e. data on “who is” responsible for a domain name. Companies and individuals can search the WHOIS databases to identify domain name registrants. SIIA has used the system for many years in managing its anti-piracy program. Access to WHOIS data is essential to combatting intellectual property infringements online and many other illegal online activities. On November 2, 2017 ICANN announced that it was suspending contractual enforcement against registries and registrars for GDPR-related non-compliance with their WHOIS obligations. ICANN and stakeholders have been working since then to come up with models that would allow for continued real-time, effective access to WHOIS data while still allowing for GDPR compliance. The European Commission has been involved in this effort as well. ICANN and its stakeholders will continue to work on solutions. But practical ideas that can work in real time are needed from EU authorities as well.
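For readers unfamiliar with how these lookups work in practice, here is a minimal sketch of a WHOIS query using the underlying WHOIS protocol (RFC 3912): a TCP connection to port 43, the domain name followed by CRLF, and the raw registration record in response. The server shown handles .com/.net registrations; other TLDs use other servers, and this is illustrative only, not a description of SIIA’s own tooling.

```python
# Minimal sketch of a WHOIS lookup over the protocol itself (RFC 3912):
# open a TCP connection to port 43, send the domain name plus CRLF, and read
# the response. The server shown handles .com/.net; other TLDs use other servers.
import socket

def whois_query(domain: str, server: str = "whois.verisign-grs.com") -> str:
    """Query a WHOIS server and return the raw registration record text."""
    with socket.create_connection((server, 43), timeout=10) as sock:
        sock.sendall((domain + "\r\n").encode("ascii"))
        chunks = []
        while True:
            data = sock.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks).decode("utf-8", errors="replace")

print(whois_query("example.com"))
```

It is this kind of real-time, automated access that anti-piracy and brand-protection programs depend on, and that GDPR-compliant access models need to preserve.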
Consider the Education Technology Sector – Set the Age of Consent at 13 and Review Contracts
SIIA sent this letter to the British ICO in response to the “Consultation: Children and the GDPR Guidance.” The GDPR sets the default age of consent at 16 but allows Member States to lower it to as low as 13 (the UK plans to set it at 13). SIIA encourages Member States to set the age of consent at 13. The higher consent age is an obstacle to the educational development of teenagers; creates a barrier between teenagers and vital support and information services; ignores decades of industry best practices around offering online services to teenagers aged 13 and older; restricts teenage freedom of expression; and, motivates increasingly tech-savvy teenagers to lie about their age.
The education technology marketplace typically involves companies contracting with education authorities. This means that the firms are data processors and the education authorities are controllers. So it would be good for education authorities to review their contracts to ensure they are GDPR-compliant. Moreover, it is worth considering student data privacy best practices. Such practices can include collecting, sharing and using student PII only for education-related purposes; providing transparency in privacy notices regarding the type of PII being collected; appropriately vetting third parties that collect PII; ensuring security to protect student information against risks of unauthorized access or use; and, addressing data retention, student rights, and user requests for access to and deletion of data.
Conclusion
There are probably hundreds of ways that the GDPR can be made to work better, both to protect fundamental rights and to promote the digital economy. These ideas reflect the challenges brought to us by our member companies. We stand ready to work with EU authorities to help make the goals of the GDPR and the Digital Single Market, including promotion of the digital economy in the EU, a reality.

Carl Schonander is Senior Vice President for Global Public Policy.