The Court of Justice of the European Union, the EU’s highest court and roughly its equivalent of our Supreme Court, has ruled that Google is responsible for protecting people’s privacy in some cases.
The case began when a Spanish lawyer Googled himself and found damaging information in a Spanish newspaper. He felt the information was out of date and asked the newspaper to remove it. The paper refused, so he asked Google to remove the items from its index, so that people searching for him wouldn’t find that information so easily.
This is a common issue, actually. We have clients come to us for reputation management when a negative review or an unfortunate news story shows up prominently in their search results. Typically, we help them add enough great content that the bad news is no longer on the first page when people search for them.
One such case was that of a hospital that had the misfortune to have a murder take place in its parking lot. While this was clearly not the hospital’s fault, nor under its control, it was harmful to the hospital when that news story showed up all over the SERPs whenever prospective patients Googled it. We have no qualms about helping organizations in this kind of position.
Most review sites will not remove bad reviews on request, and news outlets usually remove a story and publish a retraction only if the story turns out to be false. It had never even occurred to us to ask Google to remove something from its index. That sounds like suppression of information, quite contrary to the ideas of free speech, a free press, and the people’s right to know.
The court specifically stated that personal privacy trumps the people’s right to know, according to Law360. The court’s top legal adviser had argued that Google’s search engine is a machine, not someone intentionally controlling information, but the court disagreed. While the actual source of the information — in this case a Spanish newspaper — was not required to remove the story in question, Google has been told that it must now remove web pages from its index when asked to do so by someone who finds the search results unflattering. Or at least when asked by a European, and unless Google has a good reason to refuse. What counts as a “good reason” has not been clearly defined.
The investment of time and money involved in handling such requests could be considerable. Jeff Deist wrote, at Mises Economics Blog, that “search engines don’t ‘own,’ host, or control the unflattering content in question, they merely direct searchers to a list of results from a particular query. So what, precisely, is the legal theory that justifies compelling a search company not to direct eyeballs to content which exists anyway?”
In the U.S., there is apparently no good answer to that question. In Europe, however, unflattering comments someone makes on a social media site could conceivably, on the basis of this ruling, become the problem of the social media site’s owner.
The result for the lawyer who brought the suit, of course, is that millions of people who would never have Googled him now know all the embarrassing details he sued to hide. Our method would have worked better.