On Tuesday, the European Court of Justice ruled that Google must respect the "right to be forgotten" and remove links to information that individuals might consider personally embarrassing. Google called the ruling, which cannot be appealed, "disappointing." Google was being discreet. The ruling is more than disappointing; it's dangerous.
The case at issue pertained to a Spanish citizen whose home was repossessed and auctioned off by local authorities in 1998 for failure to pay social security taxes. The news of his repossession was published in a local newspaper whose archives remain online. (Unlike every other news outlet that has reported the ECJ decision, I will not name the Spaniard, because, gosh, the ubiquitous reporting on his dire straits 16 years ago really undermines his legal victory.)
The former deadbeat Spaniard now has his financial affairs in order and believes his former travails are no longer relevant to his current circumstances. With some justice, he believes that Google's ability to point interested parties directly to his unsavory past actively harms his current reputation. The court agreed.
The problem here is not that European courts are more serious about protecting privacy than American courts. Stronger privacy protections are important. Google has, in many respects, overstepped the bounds of both law and propriety in its efforts to "organize" the world's information. I don't personally think that pointing to information that has been published previously in a newspaper is an unforgivable invasion of privacy, but if that's where European society decides, through its legal system, to draw the line, that's fine.
The problem is in the remedy. The Court acknowledged that it is not easy to decide what personal information might be in the public interest and what might not. (Emphasis mine.)
However, inasmuch as the removal of links from the list of results could, depending on the information at issue, have effects upon the legitimate interest of Internet users potentially interested in having access to that information, the Court holds that a fair balance should be sought in particular between that interest and the data subject’s fundamental rights, in particular the right to privacy and the right to protection of personal data.
This "balance," continued the Court, may depend "on the nature of the information in question and its sensitivity for the data subject’s private life and on the interest of the public in having that information, an interest which may vary, in particular, according to the role played by the data subject in public life."
That's all very nuanced and sensible, but where the Court goes horribly wrong is in deciding that Google must make the judgment call on what is socially valuable and what is injuriously invasive. The so-called "data subject" who feels harmed by search results is invited to contact the search engine operator directly. The operator "must then duly examine its merits."
We do not want search engine operators deciding, on their own, the "merits" of individual cases. Google employs a great many brilliant programmers, but (unless Stephen Hawking is right) they're never going to come up with an algorithm that can properly balance the public interest against private shame. Never mind the immense logistical and bureaucratic impediments involved in any such endeavor. The more basic problem is that a search engine is in no position to be deciding such questions. Google doesn't represent the people, and it has no standing in the legal system. Google provides an index to what's available online. It's alarming enough that Google hamstrings its autocomplete function for obscure and subtle reasons. If Google is given the responsibility for judging whether a 16-year-old home auction notice in a newspaper should disappear from its index, what's Google's approach going to be when businesses demand that critical news reports or bad reviews be excised? How does Google decide who is a public figure and who isn't? The potential for conflicts of interest is infinite.
The European Court of Justice concluded its ruling by stating that in cases where the search engine "does not grant the request, the data subject may bring the matter before the supervisory authority or the judicial authority so that it carries out the necessary checks and orders the controller to take specific measures accordingly."
But shouldn't contacting the appropriate civil authority be the first step? Google should not be our censor, and it cannot be. That's far too much power for a profit-seeking company to have. There is no question that an index of all available human knowledge is incredibly valuable to society. But what's off limits to that index must be decided by courts and governments, not by private enterprises.