If you type "Rizaeddin bin Fakhreddin" into Google, Google will give you a list of links and a small box to the right. The first link will probably be to the English Wikipedia article on bin Fakhreddin, created and written by me; this can easily be checked by going into the page history of the article. But most likely you'll never bother to actually click on the article because of that small box to the right. "Rizaeddin bin Fakhreddin was a Tatar scholar and publicist that lived in the Russian Empire and the Soviet Union", it reads.
I typed that sentence. I also put the birth and death dates onto Wikipedia. I uploaded the picture to Wikimedia Commons and put it into the article – or articles, actually, because I also created the article on the German Wikipedia. But now I find this information directly on Google. There is a link to the Wikipedia article, but that may as well be a result of Father Google's omniscient mercy. Nowhere does the box state that it presents the work of an unpaid volunteer next to Google advertisements. The effect is obvious: In a 2017 study, half of the participants attributed what they found in the Knowledge Graph, which is the name of that small box, not to Wikipedia, but to Google.
The Knowledge Graph has recently been in the news for saying that California Republicans are Nazis. The scandal was reported, discussed, closed, opened again and finally forgotten. Conservatives still think Google is biased against them; Google says the whole thing wasn't its fault.
We regret that vandalism on Wikipedia briefly appeared on our search results. This was not the result of a manual change by Google.
— Google press release
No, obviously it wasn't. None of the content you presented there was. That was all Wikipedia's.
But the interesting thing is that in the public eye, this was still Google's fault. Read through the Twitter thread; none of the enraged commenters there seem to believe that this wasn't an action by a Google employee. "Google: Republicans are Nazis", read the headline on the Drudge Report article exposing the issue, and Wired magazine made a whole story out of making clear that the vandalism itself happened on Wikipedia. And all the while, Wikipedia editors quickly did the dirty work: they hunted down the specific edit that caused the problem, corrected the vandalism and placed the page under semi-protection to prevent copycats. Meanwhile, the Knowledge Graph is still humming along, the ideology section removed, the rest still filled with Wikipedia data, and Google can be happy until the next scandal.
And we are left with a question: Why do we let this happen? Why do we let a multi-billion dollar company exploit us as uncredited mules – at least until it needs someone to shift the blame to? Where is the organization that should be responsible for protecting the rights of its volunteer editors – where is the WMF? Google has traditionally been one of the biggest sponsors of the Foundation; for example, it chucked Jimmy Wales a $2m grant in 2010, more than it donated in all of last year. A few months later, Google acquired the knowledge base Freebase, which was to form the basis for the Knowledge Graph, for an undisclosed sum.
After the recent scandal surfaced, the Foundation took an apologetic stance. "We're sorry", its statement seems to say, "and no, online encyclopedias still aren't a bad thing." But on 15 June, WMF executive director Katherine Maher, writing an opinion piece in Wired, saw the other side: "If Wikipedia is being asked to help hold back the ugliest parts of the internet, from conspiracy theories to propaganda, then the commons needs sustained, long-term support", she says. "The companies which rely on the standards we develop, the libraries we maintain, and the knowledge we curate should invest back. And they should do so with significant, long-term commitments that are commensurate with the value we create."
This is a step in the right direction. At the very least, the platform economies of the world should give something back to the largest source of the information they feed their algorithms with. As Maher concludes, "we shouldn’t be afraid to stand up for our value", but maybe it is time we see Google – and Facebook, and Amazon – not only as partners, but also as the ones making huge profits sustained by our unpaid labor.