2013-07-01

Case C-131/12: Google Spain SL & Google Inc. v Agencia Española de Protección de Datos (AEPD) & Mario Costeja González – read Opinion of AG Jääskinen

This reference to the Court of Justice of the European Union (CJEU) concerned the application of the 1995 Data Protection Directive to the operation of internet search engines. Apart from demonstrating the many complications thrown up by this convoluted and shortsighted piece of regulation, the case raises the fascinating question of the so-called right to be forgotten, and the issue of whether data subjects can request that some or all search results concerning them no longer be accessible through a search engine.

All of these questions are new to the Court.

Factual background

The referral arose out of a complaint made by the data subject about announcements published some 15 years earlier, in the press and on the internet, concerning the forced sale of his property following bankruptcy. Not so long ago, finding such announcements would have required a visit to the newspaper's archives; now the information could be retrieved by typing his name into an internet search engine. Google acted as a simple internet search engine service provider in relation to data, including personal data, published on the internet in third-party source web pages and processed and indexed by Google's search engine. In the main proceedings, the data subject sought to have removed from Google's index the association of his name and surnames with the URL addresses of the newspaper pages displaying the personal data he wished to suppress. Google Inc argued that it was not caught by the Directive, as it could not be said to have "processed" the information; that it could not be a "controller" of the data; and that, as a Californian company, it was outside the jurisdiction, Google Spain acting merely as its commercial representative for advertising purposes. (This argument found no favour with the AG: processing of personal data takes place within the context of a controller's establishment if that establishment acts as the bridge for the referencing service to the advertising market of that Member State, even if the technical data processing operations are situated in other Member States or third countries.)

There are three main situations where personal data is processed on the internet.

1) the publishing of elements of personal data on any web page on the internet (the ‘source web page’).

2) the case where an internet search engine provides search results that direct the internet user to the source web page.

3) the situation that arises when an internet user performs a search using an internet search engine, and some of his personal data, such as the IP address from which the search is made, are automatically transferred to the internet search engine service provider.

The order for reference in this case related to the second situation.

Legal background

At the time the Data Protection Directive was adopted, the internet had barely begun. The first rudimentary search engines had started to appear, but nobody could foresee how profoundly the internet would revolutionise the world. Nowadays almost anyone with a smartphone or a computer could be considered to be engaged in activities on the internet to which the Directive could potentially apply. As the AG drily observes:

it is clear that the development of the internet into a comprehensive global stock of information which is universally accessible and searchable was not foreseen by the Community legislator.

One of the many consequences of this explosion of information available on the internet is that any private individual using an aggregator such as Flipboard, or one of the social media platforms, becomes, in effect, a controller of data for the purposes of the Directive. Such a person appears to be “engaged in processing of personal data with automatic means”, if this processing takes place outside that person’s purely private capacity. In the 2003 case of Lindqvist the Court concluded that ‘the act of referring, on an internet page, to various persons and identifying them by name or by other means’ ‘constitutes “the processing of personal data” wholly or partly by automatic means within the meaning of Article 3(1) of [the Directive]’.

Source web pages on the internet often include names, images, addresses, telephone numbers, descriptions and other indications, with the help of which a natural person can be identified.

The fact that their character as personal data would remain ‘unknown’ to the internet search engine service provider, whose search engine works without any human interaction with the data gathered, indexed and displayed for search purposes, does not change this finding.

But the interpretation of who is the “controller” of such data should not be so wide. At this complex interface of law and new technology, the law must be applied in a proportionate manner so as to avoid “unreasonable and excessive legal consequences”.

The Court should not accept an interpretation which makes virtually everybody owning a smartphone or a tablet or a laptop computer a controller of processing of personal data published on the internet.

The finding that a search engine is too automatic and insufficiently “human” to be considered a “controller” of information chimes with both common law and civil law approaches to liability in defamation: courts have generally found that search engines cannot exercise complete control over search terms and search results, which makes them facilitators rather than publishers for the purposes of defamation (Metropolitan International Schools v Designtechnica Corporation, Google UK Limited and Google Inc).

No “right to be forgotten”

All this will be irrelevant, however, if the CJEU follows the AG’s finding that Google is not generally to be considered a ‘controller’ under Article 2(d) of the Directive: the “right to be forgotten” question will only arise in the substantive judgment if the Court disagrees with the AG on the “controller” point. Even so, AG Jääskinen’s discussion of the issue is important, as it is the first time it has been articulated in such detail.

In essence, the national court asked whether the rights to erasure and blocking of data, provided for by the Directive, extend to enabling the data subject to contact the internet search engine service providers himself in order to prevent indexing of the information relating to him personally that has been published on third parties’ web pages.

By so doing, a data subject seeks to prevent potentially prejudicial information from being known to internet users, or is expressing a desire for the information to be consigned to oblivion, even though the information in question has been lawfully published by third parties. In other words, the national court asks in substance whether a ‘right to be forgotten’ can be founded on Articles 12(b) and 14(a) of the Directive.

Because this situation engages European law, the EU Charter of Fundamental Rights is involved, which means that the right to protection of personal data can be framed in terms of Article 8; the right to respect for private and family life under Article 7; freedom of expression and information as protected by Article 11; and the freedom to conduct a business under Article 16.

The AG was of the view that the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests.

The purpose of processing and the interests served by it, when compared to those of the data subject, are the criteria to be applied when data is processed without the subject’s consent, and not the subjective preferences of the latter.

A subjective preference alone does not amount to a compelling legitimate ground within the meaning of the Directive. The “right to be forgotten”, in other words, would be a significant legal innovation for which there are insufficient grounds in EU or national law. In any event, after taking into account the conflicting rights to information and protection of personal data under the Charter, this was not an area where a right to be forgotten could be fashioned without great difficulty.

The particularly complex and difficult constellation of fundamental rights that this case presents prevents justification for reinforcing the data subjects’ legal position under the Directive and imbuing it with a right to be forgotten. This would entail sacrificing pivotal rights such as freedom of expression and information.

The AG also expressly discouraged the Court from concluding that these conflicting interests could “satisfactorily be balanced in individual cases on a case-by-case basis, with the judgment to be left to the internet search engine service provider”:

Such ‘notice and take down procedures’, if required by the Court, are likely either to lead to the automatic withdrawal of links to any objected contents or to an unmanageable number of requests handled by the most popular and important internet search engine service providers.

The issues underlying this case are presaged in a fascinating and broad-ranging discussion posted on Inforrm about search engines and the general inadequacy of legal instruments to regulate them.

Problems ensue when legal authorities apply legal rules that are not developed to address and fulfill content removal requests. Websites and their content listed among search results are created by and uploaded by third parties; websites are owned by third parties, not the search engine operator, which makes it legally and technically impossible for search engine operators to interfere with the content. The relevant content must be removed from the original website for the content to avoid the search engine’s algorithmic formulae. The impossibility and illegality, however, does not prevent claims from being filed for non-removal of certain content from search engine results. (Gönenç Gürkaynak, ”Understanding Search Engines”).

Instead of trying to bend outdated regulation to the new technology, argues Gürkaynak, we need to come up with legal definitions that clearly define and set the boundaries of the types of providers held responsible for content broadcast on the internet. Only by doing this will we enable the letter of the law to converge with the spirit of the law. The CJEU’s ruling in this case will be an important step in this direction.

