2015-10-14

A big new law is coming, and a lot of companies doing business online aren’t going to like it.  Neither will many advocates of civil liberties for Internet users. Europe’s pending General Data Protection Regulation (GDPR) updates and overhauls EU data protection law – the law that produced this week’s Schrems case and last year’s “Right to Be Forgotten” ruling in the EU. Data protection has long been a field considered arcane and impenetrable by many US lawyers.

Most businesses and other entities outside Europe have rightly paid little attention in the past because the law seemingly did not apply to them.  But if draft GDPR provisions circulating at this near-final stage in the lawmaking process are enacted into law, that’s about to change.  Companies that previously fell outside data protection jurisdiction, including those with minimal ties to Europe, are being brought within its scope.  For many, compliance will entail meaningful costs in money and engineering time.  And online companies that deal in content – whether as creators and online publishers or as technical intermediaries – may find themselves receiving unprecedented erasure demands from European citizens or regulators.   Going forward, if users around the world find their Facebook reminiscences about European acquaintances disappearing – or can’t find tweets about individuals who settled fraud allegations with the FTC – this law will likely be the reason.

The GDPR is in many other respects a very good law. Europe already provides more robust legal privacy protections than many countries, including the US; this will make those protections even stronger and advance global norms around the privacy rights of Internet users.  And it should surprise no one that European lawmakers, angered by the Snowden revelations and the US government’s lackadaisical response, want more control over what personal data leaves Europe and how it is protected.  But the GDPR also has consequences, intended or unintended, for free expression, innovation, and the cost of doing business on the Internet.  Those deserve much more public discussion than they are currently getting.

Over the coming months, I will be unpacking these elements of the GDPR in a series of blog posts. My focus will mostly be on how the law affects Internet intermediaries – and through them, users’ ability to receive and impart information using the Internet.  Some aspects I discuss, like jurisdiction and the “Right to Be Forgotten,” will be important for other kinds of online entities as well.

The series isn’t about privacy under the GDPR, and it won’t focus on data protection law governing collection and use of user data in logs or other back-end storage systems.  Great coverage of privacy aspects is available from public interest groups, law firms, and other sources.  This post frames the issues and the project; tomorrow’s separate FAQ will preview the subjects of coming “deep-dive” posts, including the GDPR’s free expression and jurisdiction provisions.  Those posts will be available on the Stanford CIS blog.

A major goal of this series is to foster better conversation between data protection experts and practitioners focused on other parts of Internet law — particularly intermediary liability and free expression.  My own background is in Internet law.  I am not a data protection lawyer.  In my previous role as Associate General Counsel for Google, I had an immersive real-world education in data protection, most recently in relation to the CJEU’s “Right to Be Forgotten” ruling in Costeja.  But there are other areas of data protection law where I am a relative novice.  My hope is that data protection practitioners, as well as other Internet law mavens, will leave comments or otherwise reach out with feedback, including criticism.   These posts will later be aggregated in a single publication, which will be greatly improved by your comments.

A brief background on data protection law, intermediary liability, and the GDPR

The law of data protection is generally very foreign to US lawyers.  But some version of it exists in at least 109 countries around the world, and provides important rights to citizens.  Data protection is enshrined in the EU Charter of Fundamental Rights as a right distinct from privacy: a broad right to limit processing of all information relating to oneself, not just information that invades personal privacy.   Where it conflicts with other fundamental rights, including rights to receive and impart information, the rights at issue must be balanced.  The 1995 Data Protection Directive sets out a detailed framework for the data protection right, including specific legal grounds for entities to process personal data.  It also establishes regulatory bodies for enforcement.  National and sub-national Data Protection Authorities (DPAs) are the primary enforcers, and have ongoing relationships with many regulated entities.  For most Internet companies, the foremost data protection issue has been, and will continue to be, the backend processing of data about users – maintaining account information, for example, or tracking behavior on a site.

The law of intermediary liability limits and defines the legal responsibility of technical intermediaries for content posted online by third parties.   It protects companies, but it also protects Internet users by limiting the circumstances in which their speech will be deleted based on suspicions of illegality.  In the US, key intermediary liability laws are the DMCA for copyright and CDA 230 for defamation, invasion of privacy, and most other concerns.  In the EU, intermediary liability is governed by Articles 12-15 of the eCommerce Directive, as implemented in the national laws of Member States. Protected intermediaries generally have no obligations to police, and no liability for unlawful user content until they know about it. To comply with these laws, intermediaries operate notice and takedown systems to remove content when notified that it violates the law.  In theory intermediaries should only remove user content if the notice is correct and the content actually is illegal – but intermediaries often delete content based on inaccurate or bad faith accusations, leading to over-removal of Internet users’ lawful speech.

Historically, many lawyers have not drawn a connection between data protection and the law of intermediary liability.  The two fields use very different vocabularies, and are for the most part interpreted, enforced and litigated by different practitioners.  A lawyer who views an issue through the lens of intermediary liability and one who views the same issue through the lens of data protection may have trouble even understanding each other’s concerns.

But if the two fields were ever really separate, the CJEU’s 2014 “Right to Be Forgotten” ruling in the Costeja case changed that.  The court ruled that Google had to de-list certain search results when users searched for the plaintiff’s name.  It prescribed what is effectively a notice and takedown system to remove search results, but arrived at this remedy through the language and logic of data protection – with no reference to Europe’s intermediary liability rules or the rights they are designed to protect. Costeja follow-on cases will likely force lower courts to grapple more directly with questions about how the two areas of law fit together – including language in the eCommerce Directive that some argue renders it inapplicable to data-protection-based removals.  Even as those cases progress, however, EU legislators are overhauling the governing law by replacing the Data Protection Directive with the pending GDPR.

Legislative Process for the GDPR

The GDPR has been in the works since January 2012, when the European Commission proposed a comprehensive update and reform of the 1995 Data Protection Directive.  A number of drafts from different EU governance bodies have been released since. A good collection of drafts and related documents is here, and an app and chart comparing four drafts is here. This discussion does not distinguish between drafts except where differences are relevant.

The GDPR is now in a final “trilogue” process, in which remaining differences will be resolved.  One announced timeline put finalization as early as December, though such deadlines often slip.  The law will come into force two years after its publication date.  Because it is a Regulation rather than a Directive, it will not have to be implemented as separate legislation in each member state of the EU.  Rather, it will automatically go into effect.  The GDPR covers a lot of ground, with provisions addressing everything from data portability, to coordination between national DPAs, to company codes of conduct and appointment of data protection officers.  A good summary of the process and overall issues as of June is here, and a substantive Q&A from the European Parliament is here.

There is a chance that some of the sound and fury around the GDPR will come to nothing, if provisions of the GDPR are obviated by other sources of law – such as one of the pending trade agreements with the US, or laws arising from the EU’s new Digital Single Market (DSM) initiative.  This possibility of preemption could explain why trade and business groups have been relatively unengaged with the GDPR.  But the DSM process is in its infancy, and trumping the GDPR through a trade agreement seems like a long shot.  European lawmakers do not seem disposed to make major concessions to the US right now on issues of privacy and data protection.  And to the extent that US trade negotiators are seeking such concessions, their priorities may not lie with the issues I identify here.

Final passage of the GDPR will not necessarily answer the questions raised in this series about intermediaries and user access to information.  Practitioners have significant unresolved differences about how certain points in the 1995 Directive should be interpreted; the GDPR probably won’t change that.   Existing drafts are unclear on some key points, and seem likely to remain so – there can be good reasons for negotiators to choose constructive ambiguity, leaving room for DPA or court interpretation after the law is enacted.  The upshot is that we will not necessarily see expert consensus on everything the GDPR means, or on which parts of existing law it has changed, even once its language is finalized.

Ambiguous drafting, intentional or not, will likely leave room for litigation and policy battles about the GDPR’s impact on Internet intermediaries and user free expression.  But it is clear that overall the Regulation moves the needle in a troubling direction for online innovation and civil liberties.  It extends jurisdiction to a vast new group of Internet companies, imposing burdensome regulatory obligations on companies that have never heard of this law.  It extends “Right to Be Forgotten” content erasure requirements, applying European legal standards to require deletion of content that is legal in other places.  By the same token, it puts decisions balancing European users’ speech and privacy rights into the hands of foreign technology companies, instead of national courts.  And it tilts the playing field for the people whose rights are affected: it expands rules and institutions to vindicate privacy rights, but has no countervailing increase in resources or legal channels to protect speech and information rights.  These issues merit much closer consideration before the GDPR is finalized and brought into effect.

Daphne Keller is Director of Intermediary Liability at The Center for Internet and Society at Stanford Law School

Versions of this article are cross-posted at the Stanford Center for Internet and Society blog and the Internet Policy Review News & Comments.  An FAQ going into more depth on these questions will be posted tomorrow.
