Barry Diller, the veteran TV executive who has cut a restless swath through the cable and Internet industries, is seen as something of a techno-visionary in the world of corporate media. His 1994 adoption of a Mac PowerBook and discovery of email merited a flattering New Yorker profile that attempted to portray him as a geek. (In his circles, I guess he was.) Diller has spent the last several years assembling and running the Web conglomerate InterActiveCorp. But his worldview remains heavily shaped by his Hollywood background, and it typifies the response of many a complacent media executive to the rise of blogging.
“Self-publishing by someone of average talent is not very interesting,” he told The Economist in 2006. “Talent is the new limited resource.” At a technology conference that year, he declared, “There’s just not that much talent in the world, and talent almost always outs.”
Diller’s view echoes that of avowedly elitist polemics like Andrew Keen’s “The Cult of the Amateur.” According to this perspective, talent is a resource of fixed supply. The existing institutions of the publishing and broadcast world are already doing an efficient and thorough job of finding all that talent and giving it a platform. And all this other stuff that’s spewing forth from the Web’s profusion of blogs and podcasts and videos? It’s just dross that obscures the real talent’s output.
Beyond the obvious arrogance, this view misreads and underestimates the Web in several ways. It’s a mistake to think of human creativity as a kind of limited natural resource, like an ore waiting for society to mine; it is more like a gene that will turn on given the right cues. Diller and his ilk envision the Web simply as a new distribution channel for the same old stuff, and human expression as a static commodity, uninfluenced by the medium that bears it or the social environment in which it emerges. Their view values each bit of expression based on marketplace worth and potential breadth of appeal, but ignores any worth the expression may have to the person who made it. Most narrow-mindedly of all, they assume that yesterday’s filtering methods will remain reliable and sufficient tomorrow, no matter how radically the environment changes around them.
This is a recipe for failure. Yet, despite these flaws, despite its condescension and its inflexibility, Diller’s attitude remains widespread among media company leaders. They are rightly afraid that it will be harder for them to work the same way and maintain the same profits in the new media world; but they are deluded in believing they have any choice in the matter. Already, today’s Web has evolved well beyond the familiar shape of Diller’s picture. It is expanding the opportunity to manifest talent even as it is exploding the agreed-upon structures for rewarding the works that talent creates. These changes are wreaking havoc with the music industry, whose youthful customers have moved into the new world faster than the companies that sell to them. The same crisis is now beginning to engulf television, movies, book publishing — everywhere that physical goods can be replaced by digital files, and anywhere that the old gatekeeping model of talent recognition can be eroded by the demotic currents of the publish-everything Web.
Diller and his species of executive have always excelled at finding rare talents that can, at their best, enchant a mass market. But this very success has blinded them to the different, more diffuse sort of talent present among the Web’s millions of contributors. Of course talent isn’t universal, nor is it evenly distributed. But there is far more of it in the world than Diller’s blinkered vision allows. On the Web it can reveal itself in a far wider range of ways, and far more people will have a chance to cultivate it. It will never be perceived in a uniform way; you and I will recognize it in very different places and judge it in very different ways. But it is surely there — and, fortunately, denigrating it will not make it go away.
Because the Web comes to us on a screen, it has been easy to misapprehend it as the next phase in the evolution of television. The advent of blogging looked to some observers like the latest mutation of reality TV, which dissects the lives of ordinary people on a mass stage. But there’s one defining difference: on a reality show, only a few people get the opportunity to participate, and those who win the chance remain at the mercy of the show’s producers. In articles that explore how blogging can turn lives inside out, rendering the private public, references abound to “The Truman Show,” a 1998 movie about a man who discovers his life is an elaborately staged TV program. Truman Burbank, that movie’s protagonist, is a victim, practically a prisoner, with no choice in his performance. Bloggers, on the other hand, are volunteers; they may have little power over whom they reach, but they have unprecedented control over what they say and whether to keep saying it. No one can vote you off the island of your own blog.
We talk too much about television as an antecedent to the Web, and not enough about the telephone. When the telephone arrived in American homes and businesses in the late nineteenth and early twentieth centuries, there was some uncertainty over how people would use it and how using it would change their lives. Some social critics worried that the telephone’s insistent intrusions would undermine the status of the home as a refuge from the world’s pressures. Others feared that the phone would erode the shared public space of our communities and disengage us from social life. Telephone conversations were neither private nor trusted. Party lines and operators meant conversations were likely to be overheard; con artists took advantage of the new technology to prey on the naive.
Today the telephone has become our most trusted and confidential form of everyday communication. If we don’t want a conversation to leave electronic tracks, we pick up the phone instead of sending an email. Businesses that want to verify our identities will call us and speak to us. At a time when we are just beginning to grapple with how to use the Web, the telephone offers comfort and familiarity. It is the Web, now, that poses threatening new questions about privacy and anonymity, and the telephone that reassures with the warmth of the human voice and the intimacy of real-time connection.
In “America Calling: A Social History of the Telephone to 1940,” the sociologist Claude S. Fischer argues that our customary mode of discussing new technologies leads us astray by casting the technology as the protagonist and the human user as a victim. We should not inquire into the telephone’s “impacts” or “effects,” Fischer writes: “That is the wrong language, a mechanical language that implies that human actions are impelled by external forces when they are really the outcomes of actors making purposeful choices under constraints.”
This is good advice for Web critics, too. Like the telephone before it, the Web will be defined by the choices people make as they use it, constrained by — but not determined by — the nature of the technology. The most significant choice we have been making, collectively, ever since the popularization of Internet access in the mid-1990s, has been to favor two-way interpersonal communication over the passive reception of broadcast-style messages. Big-media efforts to use the Net for the delivery of old-fashioned one-way products have regularly failed or underperformed. Social uses of our time online — email, instant messaging and chat, blogging, Facebook-style networking — far outstrip time spent in passive consumption of commercial media. In other words, businesspeople have consistently overestimated the Web’s similarities to television and underestimated its kinship to the telephone.
One reason blogs have flourished is that they sit comfortably at this divide between communication types: they partake of some of the characteristics of each, in proportions that vary depending on the style of the individual blogger. Some blogs are simply vehicles for conversation among friends. Some are exclusively public discourse. But many take advantage of blogs’ potential to cross back and forth over this line. A post meant originally for a small circle of friends may “go viral” and catch the attention of millions; a broadside post from a public figure may spark a back-and-forth exchange in the comments. This mutability can be breathtakingly powerful; it can also be treacherous. Either way, whenever we observe an instance of it, we sense we are witnessing something that could only occur in this form, via this medium — something uniquely bloggish.
Once we acknowledge that the Web inherits at least as much from the telephone as from the television, complaints about the “problem” of the Web’s abundance appear in a different light. In a 2007 article, James McGrath Morris, a journalism historian, wrote, “There is a point when there are simply too many blogs. With 30 million blogs today, we may well have reached that point.”
He was not the first, nor the last, to raise the “too many blogs!” alarm. In November 2008, Time’s Michael Kinsley wrote, “How many blogs does the world need? There is already blog gridlock.” The question, echoing a legion of similar skeptics, sounds reasonable at first. But what if he’d written, “How many telephone calls does the world need”? Did someone tell Kinsley that he needed to read all those blogs? If they keep multiplying, whose party are they spoiling? Most blogs are read only “by the writer and his mother,” says Sreenath Sreenivasan, a professor at the Columbia School of Journalism. If he is right — and no doubt, in some cases, he is — does that make them worthless? If blogging’s only accomplishment was that it got more people to phone home to mom, Web-style, why would anyone object?
The sheer volume of blogs evokes a peevish resentment among some observers, as if the outpouring represented a personal affront. How dare all these people presume on our attention! Do they really think that anyone is listening to them? It is certainly possible to blog into a void — to post and post and never get a visitor or a comment. But it’s unlikely many of us would persist with such unrewarding labors. Most blogs have some sort of audience, however tiny — moms and beyond. Where do these readers come from? More often than not, they are other bloggers. Observers steeped in the values of the broadcast world identify this as a failure: Look, the only people who care what you’re doing are already in your club! But in fact, as they say in the software industry, this reciprocity is not a bug at all — it’s a feature.
People who have no experience blogging often fail to understand the essentially social nature of the activity. Blogging is convivial. Bloggers commonly blog in groups, whether formally (as with our Salon bloggers) or simply through the haphazard accretion of casual connections. In these groups, what you contribute is obviously important; but so is where you choose to place your attention. Reading is as much a part of blogging as writing; listening is as important as speaking. This is what so many bloggers mean when they claim that “blogging is a conversation”: not that each post sparks a vigorous exchange of comments, but that every post exists in a context of post-and-response that stretches across some patch of the Web, link by link, blog to blog.
In “The First Word,” her book about the origins of language, Christine Kenneally describes the scene when two apes trained in sign language first encountered each other: “What resulted was a sign-shouting match; neither ape was willing to listen.” Anyone who’s ever witnessed a dead-end flame war online knows the feeling.
For communication of any stripe to take place, listening must somehow be involved. Knowing how to listen is no less essential in keeping a blog than in any other encounter with other people. For bloggers, listening takes many forms, including following other bloggers (or subscribing to their feeds), reading comments, and checking the referrers to find inbound links. When all these channels are open, a blogger can feel embedded in a buzzing hive of attention and support and argument. Of course, that means that if the channels close, the author can feel abandoned, betrayed.
Bloggers, most of them solo bootstrappers of their own stream of self-expression, are the most autonomous writers the world has yet seen — the least dependent on others to publish their words. (This is a central difference between the individualist blog and the collective wiki — the other Web-native writing form that achieved popular success this decade. On a blog, you alone can edit your words; on the typical wiki, anyone with an account can change anything.) At the same time, of all the species of writer, bloggers are the least insulated from their audience, most vulnerable to the ebb and flow of attention and response. They are both alone and in a crowd. Their solitude can inspire self-indulgent ranting; their sociability can tempt them into self-serving pandering. But every now and then they manage to hold their balance in this paradoxical position for an extended, exhilarating spell.
“Each blog,” James Wolcott wrote in 2002, “is like a blinking neuron in the circuitry of an emerging, chatterbox superbrain.” This striking image cuts two ways. It’s alluring to think you might be participating in a grand barn-raising for species-wide consciousness via the Web. It’s creepy, too: What if doing so costs you some part of your separate identity — what Nicholas Carr calls a “loss of selfness,” a feeling of “slowly being emptied,” a sense that “we are beginning to blur at the edges”?
That is certainly a possibility. Any act of public expression, of “putting everything out there” — your political arguments or your creative work or your personal story — is a gamble. We offer something to the world; we cross our fingers that our contributions won’t simply be ignored or derided or misappropriated. Sometimes we’re surprised at how much we get back, and sometimes we feel used.
Either way, we are going to keep at it. Whatever the outcome of each of our individual bets, we can now see that collectively they constitute something unprecedented in human history: a new kind of public sphere, at once ephemeral and timeless, sharing the characteristics of conversation and deliberation. Blogging allows us to think out loud together. Now that we have begun, it’s impossible to imagine stopping.