Our enthusiastic embrace of the electronic resources that give us information, communication and community, all at lightning speed and with press-button ease, has changed the world. It changes governments too, as the contemporary Middle East shows. The rising curve of developments in this electronic wizardry has grown breathtakingly steeper in the last two decades, showing no sign of leveling out.
At the forefront of some of these vertiginous developments is Google, a name so iconic that it has become a verb. If you google "Google" you get over 6 billion results (Yahoo gets 4 billion, Microsoft's Bing 250 million, Lycos 25 million, AltaVista 16 million; you get the point: Google is way out ahead). The question is: Is the googlization of everything a uniformly good thing, or should we worry? The answer, according to media expert and professor Siva Vaidhyanathan in his "The Googlization of Everything (and Why We Should Worry)," is that yes, we should worry.
His chief reason is that Google is an advertising business, which uses its information services as a way of getting salable information in return. It offers free access to information so that it can profile the people searching for it, and then auction to advertisers space targeted at those profiles.
It is an ingenious business model, but, among other consequences, it raises for searchers the question of whether Google's ranking of information is an accurate reflection of the information's reliability and utility. As Vaidhyanathan points out, Google has convinced us that we should trust it implicitly; most people rarely click past the first three results it offers. "This means," he writes, "that Google, the most flexible yet powerful information filter we use regularly, could come to exercise inordinate influence over our decisions and values." Control knowledge, and you control people: Google would not be the first to realize the significance of that truth.
Moreover, the localization of advertising means that Google passes on information not about the world at large but about what is of local interest, the kind of information nearby advertisers would prefer us to have so that we will shop with them. Thus Google becomes, says Vaidhyanathan, "more about shopping than learning."
Vaidhyanathan was prompted to examine "googlization" -- having our information processed through Google's systems and algorithms before we get it -- by Google's project, begun in 2004, of scanning millions of books to create a vast digitized library. He saw two problems: first, that the aim of Google Books and its eBooks platform was in fact not to create a library but a bookstore; and second, that Google was interested in accumulating a vast body of text it could mine to develop its search techniques.
Vaidhyanathan admires the innovations promised by the latter aim, but he is less sure about the increased hold it gives Google over how its users view the world.
The most recent development in lawsuits initiated by publishers and authors against Google's books digitization plan was a decision this month (March 2011) by a court in New York, applauded by the U.S. Department of Justice, that the settlement Google offered to publishers and authors "would," in the words of Judge Denny Chin, "give Google a significant advantage over competitors, rewarding it for engaging in wholesale copying of copyrighted works without permission." It would also gain Google the profits from "orphan works" whose copyright holders cannot be located. For now, anyway, the googlization of our access to books has been hampered. But although this issue is what got Vaidhyanathan started, it is not his only concern.
Note first that Vaidhyanathan has not written a diatribe against Google itself, a company he admires for many reasons despite also having questions about it. Instead the book is about "googlization," that is, Internet users' growing reliance on major search service providers -- Google being much the biggest -- who, because of their control of the process, wield an influence over information that could be distorting or even, in the wrong hands, malign.
Vaidhyanathan's anxieties are exacerbated by the increasing pressure Google itself feels from competitors in certain sectors, principally Facebook and Apple, as consumer attention switches to mobile devices locked down to their providers. This is very different from the radically open Web model from which Google took its start. Every minute you spend on an iPhone, away from the Web, is a minute in which Google cannot harvest the information you are giving away about yourself. Facebook competes directly with Google for advertising and is making huge strides. Of course, Google is changing in response: going mobile and, in reaction to Facebook, seeking to be more current and relevant by privileging recent information over older information -- and thus again distorting the value of the information it ranks.
Note the concern here: that because information runs through the decision algorithms of the Google system we cannot regard its handling of information as ideal or even merely neutral, but only as a product of the algorithm's workings. And because it is profiling our identities and interests, our prejudices and desires, it is tailoring information to us, just as it is auctioning space to advertisers with something specific to sell us because of our profiles.
One big thing someone could say on Google's behalf is that it stepped into a void and provided an invaluable service to the world. There was no really effective way of finding information on the Internet until Google created it, remedying a market failure and reaping the just reward for doing so. Should we not clap our hands instead of wringing them?
Vaidhyanathan's reply is that Google is filling too much of a gap, a gap that state institutions should fill. It would, he says -- and surely rightly -- be a fabulous investment for the world and its future if governments got together and financed the building of a global digital library open to all. Only imagine if there were a truly public, independent and neutral search engine for all information, ranking it purely on reliability and usefulness. That would be a resource for mankind worthy of the name.
Google has used its "sterling reputation" to argue that it can provide this and associated services faster, cheaper and better than public bodies can.
Vaidhyanathan disagrees, arguing that Google's preeminence in the market has undermined what should be genuinely public responsibilities to provide public goods. This is not, Vaidhyanathan is careful to insist, to say that Google is evil -- far from it; but neither is it the white knight of popular imagination. It is a business, driven by the profit motive; it is not a public service institution. The perception of it as an altruistic and benevolent organization was created by its users, not by Google itself. The motto "Don't be evil" has never been part of its public face, Vaidhyanathan says; it was we the users who wished it to succeed because we applauded those aspects of it that were genuinely applaudable -- the commitment to openness, fraternity, little or no control or censorship, open code, the democratic process of lots of people building amazing facilities for the Web. Google opposed the world domination of Microsoft, which never pretended to be anything but a for-profit institution and which opposed open source and code-making anarchy. Google was the good guy, and it succeeded: David contested Goliath, and became another Goliath by doing so.
Google both spread and benefited from goodwill in this way. The company's founders say that they began with principles and are not going to violate them. But, says Vaidhyanathan, the sheer scope of Google and its proliferation into many new areas has produced frictions and contradictions. He gives the example of the run on United Airlines stock that resulted from a trifling error, the absence of metadata on a Web page giving the date of a report. Someone saw on Google an old report that UA was seeking bankruptcy protection, and thought it was new information. An alert flashed out, holders of UA stock began selling it as quickly as possible, and the share price tanked. So did that of other airlines, because (typically for the stock markets) panic selling spread like an infection before the mistake was detected. When Bloomberg issued a correction the stock prices rallied, but not before a lot of money was lost. Trust in Google, Vaidhyanathan says, prevented people from checking the facts before the damage was done -- and the example is illustrative of the potential for damage.
One of the most interesting points Vaidhyanathan makes concerns the levels of responsibility that Google bears for the content it offers us, from lowest to highest across three types of function: scanning and linking, hosting and serving, and scanning and serving. To understand this, remember that there is content that not everyone wishes to have accessed, such as pornography, copyright infringements and privacy invasions. If Google keeps such content available beyond a certain tolerance limit, it opens itself to backlash or even prosecution.
Its lowest level of responsibility relates to content that other people create, which it simply copies and indexes. It neither solicits nor prohibits such content, and neither buys it nor profits from it directly. It makes no editorial input; it simply indexes the content and makes it available. Google is not responsible for what other people put on the Net.
The next level of Google's responsibility is for what it "hosts and serves." Here Google makes other people's content available on its facilities, obviously enough benefiting from the profiles it harvests and the advertising space it can accordingly sell on its facilities.
In "scanning and serving" Google bears the greatest responsibility. It digitizes things that were not hitherto digitized, and makes them available; examples are Google Maps, Google Street View and Google Books. Here Google itself decides what is worthwhile to scan and serve. People can complain -- for example, about what appears on Street View -- but it can take weeks for content to be removed, and by then it will probably have been copied and disseminated many times over. As Vaidhyanathan puts it, this rests on a bad assumption on Google's part: that everything is up for grabs and can be challenged and removed only after the fact. This wrongly sets the default at the lowest possible standard.
This naturally leads Vaidhyanathan to ask whether there should be regulation. On this he is agnostic, pointing out that some regulation already exists in the form of copyright law, but that matters would be improved if we did not simply have to trust that a company like Google will not abuse its position and will allow users to search in a fair and uncorrupted way. Now, Google does not (though many believe otherwise) take money to place links high in Web search results. Search providers might one day not be so ethical. But in making editorial decisions about what is worthwhile, and in now claiming to prefer "high quality sites" over "low quality sites" (can they always tell which is which?), they are taking sole responsibility for value decisions of considerable moment.
A problem here is that Google's search algorithms are a commercial secret, so the chance of a transparent audit of how they make those decisions is slim. What standards does Google use? If it only follows my previously manifested desires, it is not going to give me the information that would be of most use to me.
I am with Vaidhyanathan in wishing to see a Human Knowledge Project set up, a 50-year goal to create a global digital library and information resource that every child anywhere in the world can tap into. The technology exists, but not the political will, says Vaidhyanathan, even though a nonprofit humanitarian resource of this kind has so much going for it that it is surprising such a project is not already well under way.
Perhaps the lack of political will has something to do with how many governments do not want their people to know too much or be able to find out too much. Here is a chicken-and-egg situation: Do we have to wait for the world to be civilized enough for a Human Knowledge Project to become feasible, or would it only become civilized enough if such a project existed? I suspect the latter, and therefore think that all those who are interested should put their shoulders to the wheel and create it. That would be a true search for knowledge, and the most liberating thing we could ever do.