Have you ever done something you regretted? I certainly have. Wouldn’t it be great if nobody could find out about those missteps? There was a time when that wasn’t a huge issue. Most of our mistakes were neither recorded nor disseminated widely. In more serious cases, there were provisions for preventing dissemination—for example, closing juvenile proceedings or sealing court records. However, the internet has changed that by giving almost everything we do permanence and global distribution.
The right to be forgotten, a law that allows people who believe that online information about them is misrepresentative to demand that search engines remove links to the material, is an attempt to address this. It originated in Europe, but has spread to some countries in Asia and South America. It reflects how many other countries differ from the United States on the proper balance between freedom of speech and privacy. In the United States, the former is considered far more important than the latter, whereas Europe views both as human rights that must be balanced.
In actuality, the right to be forgotten is a misnomer in its current form. It is a right to have links in search engines removed, so that particular stories don’t appear in the results when a search is done for a person’s name.
On the surface, a right to be forgotten makes a lot of sense. Why should someone’s entire life be haunted by something that happened years or even decades ago? Unfortunately, the devil is in the details. Who decides what should be removed from searches? Is it the person involved, companies like Google and Microsoft, or government officials? What criteria should be used? What is the proper balance between the privacy right of the individual and the public interest in information?
In Europe, the initial decision is made by the company, which is supposed to apply 13 specific criteria, ranging from who the person is to the accuracy, age, and relevance of the data to whether the content was “published in the context of journalistic purposes.” These criteria are characterized as a “flexible working tool,” and there is no specific guidance on how to weigh them in a particular case, making consistent decisions almost impossible.
Further exacerbating the problem is the sheer volume of requests. For example, Google received more than 253,000 removal requests in the first year after the right to be forgotten was upheld by the European Court of Justice. Google rejected roughly 60 percent of these requests, although no one knows the basis for rejection because Google has not made its criteria public. It is also likely that other search engines use different criteria, increasing the arbitrary nature of the entire process.
If the request is denied, it can be appealed to the Data Protection Commission (DPC) in the requester’s country. Eventually, a company can be fined if the DPC finds that the request should have been granted. This creates an incentive for the company to grant requests in close cases, increasing the number of items deleted. Combined with the lack of clear criteria, the system almost guarantees excessive delinking.
However, the harm to the interest in a free flow of information has not meant a great benefit to privacy. Until recently, the delinking applied only to the version of the search engine for the country where the person resided (for example, Google.es), and the results could include a note that some results were delisted. This meant that anyone who saw the note could go to another version of the search engine, say, Google.com, and find the missing links.
Recognizing this, the French Data Protection Commission recently ordered Google to remove the links in all versions of its search engine. In essence, the French commission is asserting the right to censor information worldwide. Google recently lost its appeal of the order.
If the order stands, the French DPC will have transformed the right to be forgotten from a nuisance into a serious threat to the free flow of information. If the French can impose restrictions worldwide, why shouldn’t every country? This would mean that search engines in all countries could be prevented from linking to any story that a public official in even one country in which they do business deems unacceptable.
For anyone who believes in free speech and the free flow of information, this is frightening. Admittedly, it probably wouldn’t be effective in many cases, as people will simply republish stories over and over (as has already happened in some instances), leading to more delinking requests, followed by more republishing in an endless cycle. Still, to the extent it is effective, it will censor information in arbitrary and inconsistent ways. All things considered, it is time to forget the right to be forgotten.
T. Barton Carter (COM’78), a lawyer and a College of Communication professor of communication and associate dean, can be reached at firstname.lastname@example.org.
“POV” is an opinion page that provides timely commentaries from students, faculty, and staff on a variety of issues: on-campus, local, state, national, or international. Anyone interested in submitting a piece, which should be about 700 words long, should contact Rich Barlow at email@example.com. BU Today reserves the right to reject or edit submissions. The views expressed are solely those of the author and are not intended to represent the views of Boston University.