By: Karin Deutsch Karlekar, Director of Free Expression At Risk Programs, PEN America
What if a Japanese man had standing in court to require Google to delist the digital record of his child prostitution conviction; or if a British man could request that search engines operating in Europe expunge 50 entries detailing a medical procedure he bungled; or if a Swiss financier could use European Union law to make Google consider removing any mention of his felony convictions from its results?
Increasingly, these claims are being considered in courts and legislatures worldwide under a newly developed legal notion that people have an inherent “right to be forgotten” (RTBF). This trend concerns free expression advocates who see the potential for broad censorship of news, historical information, or literary and academic works. Now this concept is set to be tested in the United States by a draft law introduced last month in the New York State Assembly.
So far, the United States has been reluctant to codify “the right to be forgotten” at the federal level through judicial rulings or legislative action, but the concept has gained significant footholds elsewhere. Within the E.U., Google has received a total of 702,699 requests to delist 1,941,026 URLs since May 2014, and has granted 43.2 percent of these requests, according to its transparency reports. In India and China, citizens are invoking this Western legal precedent to buttress cases seeking to force search engines to delist articles from search results. The South Korean government has established a legislative framework for requesting access restrictions on public content, and Canada is considering its own framework.
A turning point in establishing the concept of RTBF came in May 2014, when the European Court of Justice (ECJ) ruled that citizens could ask search engines operating in Europe to delist results that were “inadequate, irrelevant, no longer relevant, or excessive.” The case has its roots in a complaint by a Spaniard, Mario Costeja González, who demanded the delisting of two 1998 articles that linked him to foreclosure notices, arguing that Google search results infringed on his privacy rights. The ECJ found that Costeja’s claim was supported under the E.U.’s existing 1995 Data Protection Directive, and required Google to remove the links. The landmark ruling obliges Google to delete results for certain articles from its in-country platforms or else open itself to legal appeal by the requesting party. Additionally, Google and similar U.S.-based search engines contend with a 2010 French law that protects RTBF, as well as national jurisprudence favoring RTBF in Argentina, Belgium, Germany, and India, among other countries. Some of these legal frameworks also shift the burden onto the search engine to prove that the information is too important to be deleted.
The competing interests of free expression, the public “right to know,” and individual privacy make this legal precedent contentious. Those in favor argue that search engines operate on algorithms and do not amount to media outlets, and hence are not entitled to press freedom protections. They also note that the ECJ ruling does not censor content on a news outlet’s website, only its appearance in search results. Others have suggested that in an era of “big data,” when lengthy dossiers are compiled on individuals and both governments and private corporations conduct mass surveillance, RTBF restores some balance by allowing individuals greater privacy and the ability to move past their mistakes. Indeed, a Google data leak revealed that 95 percent of RTBF requests involve removing personal material rather than information about violent crimes or politically motivated removals. Nearly 90 percent of Americans surveyed in a 2015 study expressed some level of support for “right to be forgotten” legislation in the United States as well.
Still, serious questions arise as to whether the public’s “right to know” should take a back seat to an individual’s desire to erase their past. The law creates obvious tension between open informational access and individual privacy rights, between that access and government authority to regulate it, and between freedom of expression and personal privacy. While countries like Germany and France favor more concrete language in the law that clearly elevates freedom of expression, Belgian courts have gone so far as to require censorship of a newspaper’s archives, arguably infringing on press freedom and public access to information. Similarly alarming is a Paris court’s ruling that Google must censor its search results beyond French borders. The French national data protection authority, CNIL, has since fined Google for not extending French national law to cover all country domain names. This raises questions about a country’s ability to regulate cross-border data flows, and in turn threatens other nations’ free expression protections by forcing RTBF requests made by the citizens of one country onto an entire search engine with global reach.
The misuse of broadly worded laws to restrict free expression is on the rise; every year, PEN America and other advocacy groups document hundreds of cases in which defamation, anti-terrorism, or blasphemy laws are used improperly to silence writers and journalists. With RTBF jurisprudence still developing, and with ambiguous, undefined terms such as “irrelevant” and “excessive” driving determinations that now have substantial implications for information access, free speech, and press freedom in Europe and beyond, free expression advocates should continue to watch the trend toward RTBF laws closely. The use of these rulings to censor news and information, and to authorize excessive and discretionary government regulatory powers, is cause for concern. Particularly in this era of “fake news” and altered historical narratives, the ability to change or erase the record should be applied sparingly, and each request should be subject to review and appeal processes with a public-interest exception built in. More balance and definition are needed to ensure that as privacy rights are strengthened, the core rights to free expression and information are not eroded or overly circumscribed.