By Alistair Fairweather
If you sued one of the world’s largest companies for defamation and won, you might expect a bit more than €5 000. But in the case of “Mr X” vs Google, which was recently tried by a French court, it is a full €5 000 too many.
The rather sordid tale of Mr X began when he was arrested and tried for allegedly raping a 17-year-old girl. After a very public trial he was instead convicted of the lesser charge of “corruption of a minor” and handed a hefty fine and a three-year suspended sentence.
Yet, after the trial, when Mr X’s name was typed into Google, the engine immediately suggested search queries like “Mr X rapist” and “Mr X satanist” — despite the fact that he had been acquitted of rape and was not a satanist.
How could this happen? It helps to understand how Google’s auto-suggest function works. As you begin typing your query, Google’s complex algorithm compares it with similar queries made by millions of other users and suggests the search terms they most often used.
So if you search for “hotels in…” on our local version of Google (.co.za), it is likely to suggest things like “Hotels in Cape Town” and “Hotels in Durban”. That’s because it can tell you are an SA user and wants to be as useful as possible, so it shows you things other South Africans search for.
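In crude terms, the mechanism works something like the short Python sketch below. Google’s real system is proprietary and vastly more sophisticated, so treat this as a toy model only; the query log and its counts are invented purely for illustration.

```python
# A toy illustration of frequency-based query suggestion.
# Google's real system is far more sophisticated (and proprietary);
# this hypothetical query log is invented for the example.
from collections import Counter

# Hypothetical log of past queries and how often each was searched.
query_log = Counter({
    "hotels in cape town": 9200,
    "hotels in durban": 7400,
    "hotels in johannesburg": 5100,
    "hotels in paris": 800,
})

def suggest(prefix, log, limit=3):
    """Return the most frequently searched queries starting with the prefix."""
    matches = [(q, n) for q, n in log.items() if q.startswith(prefix.lower())]
    matches.sort(key=lambda pair: pair[1], reverse=True)  # most popular first
    return [q for q, _ in matches[:limit]]

print(suggest("hotels in", query_log))
# ['hotels in cape town', 'hotels in durban', 'hotels in johannesburg']
```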
Sometimes this automation can produce rather strange results. A famous example is the query “Why wont my parakeet…” which brings up the suggestion “Why wont my parakeet eat my diarrhea” (sic). While most of the planet finds this unbelievably idiotic and revolting, there are clearly quite a few people who worry enough about this “problem” to Google it.
And once people accidentally discover one of these humorous suggestions, they begin to publicise it, which drives more people to search for the same term and thus reinforces its ranking in the suggestion list. The suggested query “I am extremely terrified of Chinese people” used to be the last suggestion on the list when you typed “I am extremely …” but now it is the first.
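The same toy model shows how that feedback loop plays out. Again, the queries and counts below are invented; the point is simply that every search nudges the rankings, so a burst of publicity can push a fringe suggestion to the top.

```python
# Continuing the sketch above: every search feeds back into the log,
# so publicity around an odd suggestion pushes it up the rankings.
# The queries and counts are invented to mirror the example in the text.
from collections import Counter

log = Counter({
    "i am extremely tired": 5000,
    "i am extremely terrified of chinese people": 4000,
})

def record_query(query, log):
    """Each search makes that query a little more likely to be suggested."""
    log[query.lower()] += 1

# A blog post goes viral and thousands of curious users run the search...
for _ in range(1500):
    record_query("I am extremely terrified of Chinese people", log)

# ...and the once-last suggestion is now ranked first.
print(log.most_common(1)[0][0])
# 'i am extremely terrified of chinese people'
```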
However we feel about these kinds of suggestions, they are simply Google’s algorithm reflecting our own behaviour back at us. Google is essentially capturing people’s collective curiosity. If that curiosity is mistaken, embarrassing or just plain weird then it’s a reflection of society, not of Google. To make it a moral argument is to blame the car manufacturer for the drunk driver’s accident.
Google argued all of this in its case, of course. Yet the French court, in its infinite wisdom, ruled that “algorithms or software begin in the human mind before they are implemented”, which makes Google liable for the defamation. It also ruled that Google had offered “no evidence” that its suggestions were completely automated and involved no human intervention.
I might be willing to concede the first point — Google set the train in motion, so it is responsible for ensuring that train doesn’t crush anyone. But to imply that Google employees are intentionally intervening in auto-suggestions is laughable.
Google Search processes over 3bn searches per day — that’s roughly 35 000 searches per second. It maintains over a million servers. It has hundreds of millions of users around the world. And yet, we’re expected to believe that its employees took the time to intervene in a set of auto-suggest results for an obscure French sex offender?
The very fact that Google had to be compelled by court order to remove the auto-suggestions for queries about Mr X is evidence of how reluctant Google is to manually intervene in its finely balanced systems.
This verdict speaks of a fundamental and dangerous misunderstanding of the way the Internet — and technology in general — works. If Google can be convicted for auto-suggestions, then why not for search results? And why not for defamatory emails sent via Gmail? Its software is responsible in each case — at least as far as the French court is concerned.
And the French judges aren’t the only ones with a wholesale misunderstanding of technology. In February, an Italian court convicted three Google employees of criminal privacy violations and gave them six-month suspended sentences.
They were held responsible for a video posted on the Google Video site that showed an autistic boy being abused. Though they removed the video as soon as it was reported, that wasn’t good enough for the Italian court. Again, Google was deemed to be liable for the actions of its users.
The law has always moved at a stately, unhurried pace. In many ways this is sensible. Law has the deepest implications for justice in our societies. If it were too quick and easy to change there would be no stability or continuity. A law that bends to the weekly whims of fashionable outrage is worse than no law at all.
But stability can sometimes harden into rigidity. Society is evolving at an ever more rapid pace. If our laws — and law enforcers — can’t keep up with those changes, then they will eventually lose both their relevance and their force.
- Alistair Fairweather is digital platforms manager at the Mail & Guardian