By Milana Knezevic
The European Commission (EC) on Thursday released a “mythbuster” on the controversial Court of Justice of the European Union ruling on the “right to be forgotten”. The document tackles six perceived myths surrounding the decision by the court in May to force all search engines to delink material at the request of internet users — that is, to allow individuals to ask the likes of Google and Yahoo to remove certain links from search results of their names. Many — including Index on Censorship — are worried about the implications of the right to be forgotten for free expression and internet freedom, which is what the EC is trying to address with this document. But after going through the points raised, it is clear they need some of their own mythbusting.
1) Groups like Index on Censorship have not suggested “the judgement does nothing for citizens”. We believe personal privacy on the internet does need greater safeguards. But this poor ruling is a blunt, unaccountable instrument to tackle what could be legitimate grievances about content posted online. As Index stated in May, “the court's ruling fails to offer sufficient checks and balances to ensure that a desire to alter search requests so that they reflect a more 'accurate' profile does not simply become a mechanism for censorship and whitewashing of history.” So while the judgement does indeed do something for some citizens, the fact that it leaves the decisions in the hands of search engines – with no clear or standardised guidance about what content to remove – means this measure fails to protect all citizens.
2) The problem is not that content will be deleted, but that content — none of it deemed so unlawful or inaccurate that it should be taken down altogether — will be much harder, and in some cases almost impossible, to find. As the OSCE Representative on Freedom of the Media has said: “If excessive burdens and restrictions are imposed on intermediaries and content providers the risk of soft or self-censorship immediately appears. Undue restrictions on media and journalistic activities are unacceptable regardless of distribution platforms and technologies.”
3) The EC claims the right to be forgotten “will always need to be balanced against *other* fundamental rights” — despite the fact that as late as 2013, the EU advocate general found that there was no right to be forgotten. The mythbuster document also states that search engines must make decisions on a “case-by-case basis”, and that the judgement does not give an “all clear” to remove search results. The ruling, however, is simply inadequate in addressing these points. Search engines have not been given any guidelines on delinking, and are making the rules up as they go along. Search engines, currently unaccountable to the wider public, are given the power to decide whether something is in the public interest. Not to mention the fact that the EC is also suggesting that sites, including national news outlets, should not be told when one of their articles or pages has been delinked. The ruling pits privacy against free expression, and the former is trumping the latter.
4) By declaring that the right to be forgotten does not allow governments to decide what can and cannot be online, the mythbuster implies that governments are the only ones who engage in censorship. This is not the case — individuals, companies (including internet companies), civil society and more can all act as censors. And while the EC claims that search engines will work under national data protection authorities, these groups have yet to provide guidelines to Google and others. The mythbuster itself states that a group of independent European data protection agencies will “soon provide a comprehensive set of guidelines” — the operative word being “soon”. This group — known as the Article 29 Working Party — is the one suggesting you should not be informed when your page has been delinked. And while it may be true that “national courts have the final say” when someone appeals a decision by a search engine to *decline* a right to be forgotten request, this is not necessarily the case the other way around. How can you appeal something you don't know has taken place? And what would be the mechanism for you to appeal?
As of 1 Sept, Google alone had received 120,000 requests affecting 457,000 internet addresses — and it may remove the information without guidance, at its own discretion and with very little accountability. To argue that this situation doesn't allow for at least some possibility of censorship seems like a naive position to take.
5) All decisions about internet governance will to an extent have an impact on how the internet works, so it is important that we get those decisions right. In its current form, the right to be forgotten is not up to the job of protecting internet freedom, free expression and access to information.
6) It may not render data protection reform redundant, but we certainly hope the reform takes into account concerns raised by free expression groups on the implementation of, and guidelines surrounding, the right to be forgotten ruling.
This article was posted on 22 Sept 2014 at indexoncensorship.org