It was a dragon punch, I think, or maybe a fireball move, executed with flashing thumbs by a man playing what looked to be Super Street Fighter II X Revival on his new Game Boy Advance, that made me realize I hadn’t been paying attention.
Exiting a subway train, I glanced down and was transfixed. The graphics, color and resolution of the Game Boy were riveting; the two kung fu masters pirouetting in midair leaped off the three-inch screen. Caught in a spasm of techno-envy, I savored, for a split second, a familiar moment of awareness: Technology was leapfrogging itself, again.
Such moments always inspire equal parts irritation and delight: irritation at the inexorable planned obsolescence of high-tech gadgetry; delight at the fact that some new models really are cooler than last year’s. And it’s an awareness that I used to breathe on a daily, hourly, minute-by-minute basis; a part of the roaring ’90s indistinguishable from the surge of the Internet, the mainstream advent of e-mail and the Web, the debuts of Amazon, Yahoo and eBay, multigigabyte hard drives and weekly updates of Netscape Navigator. There was a time, somewhere around 1996 or 1997, when the gadgets and the software and the Net really did seem to be getting better all the time.
There was a deeper structure, too, that was even more exciting, tied to the communities and collaboration made possible by the Net. The sheer amount of information available to the world in the new online library could only be a good thing. The free software movement suggested that computer technology could usher in a kind of democratic egalitarianism that would be truly liberating. The astoundingly burgeoning powers of networked computers would unleash all kinds of individual creativity. I never subscribed to the out-of-control techno-libertarian thesis that the digital world would wipe out governments and international borders, but I did believe something empowering was going on. Sure, the necessity to purchase a new high-powered computer every six months to run ever-more-bloated software could be annoying, but that was OK. DSL really was better than dial-up, computers always got cheaper and you simply couldn’t live without the Net.
The dot-com boom, initially, was rooted in that self-evident sense of being on an endless escalator of betterment. But the dementia of the stock market bubble of 1999 and early 2000, substituting greed for glee, began to overwhelm the positive excitement. And the bust ended up making it all seem like a dream. After the election madness of late 2000, the collapse of the economy into the recession of 2001 and, of course, Sept. 11, the attention of those of us who had made an infatuation with technology into a career understandably wandered. As companies went bankrupt and layoffs mounted and bombs began to fall, once-pressing questions — like whether Linux-based operating systems would ever become truly user-friendly — lost their urgency.
But seeing that Game Boy made me want to stop and smell the semiconductors. Wasn’t I missing out?
There was my friend who had given me a CD he had burned himself with MP3s downloaded via Napster. And there was my neighbor who enjoyed rewinding live television via his ReplayTV set, even as he downloaded multimegabyte Photoshop files onto his laptop via his wireless broadband home network. And what about that moment just two weeks ago when I unpacked the Lego Mindstorms Robotics Invention System 2.0 that I had bought for my son’s birthday, and realized that the infrared tower would only plug into a USB port, which my aging desktop lacked?
And yet, though my Game Boy sighting made me wonder how I could ever have become deaf to the siren-song beep of technology’s inexorable march forward, it only reminded me of my fondness for following the upgrade path — it didn’t rekindle it. I was remembering what it had been like to be exhilarated by constant change, but I wasn’t actually excited again.
It’s easy enough to guess why. Since Sept. 11, priorities have changed. For months, I’ve been more fascinated with Islamic fundamentalism than free software, with the geopolitics of oil rather than the laws laid down by code — and I’m sure I’m not alone. Techno-driven exuberance begins to fade when you notice that history, culture and religion don’t follow the same rules as personal digital assistants and game consoles. Next year’s model isn’t automatically better. Our bombs may have been smarter in Afghanistan than ever before, and it’s nice, right now, to revel in the euphoria of knocking the Taliban off their repressive perch, but is the world really a better place? What other horror is being birthed elsewhere, while our attention is focused on Tora Bora? Poverty, war, hatred, death — Sept. 11 was just another punctuation mark of horror delimiting the end of a century replete with them.
After the dot-com bust, I could say, well, those venture capitalists and investment bankers never really understood what the Net was really about — they didn’t really “get it.” But after Sept. 11, I began wondering whether I too had been blithely ignorant. The Internet was no longer the story; it was a sideshow. And in such a context, what do good Game Boy graphics really mean? Why should I care about wireless connectivity and flat-screen monitors? Next war, I’ll probably be able to use my PDA to view real-time satellite broadcast video of helicopter gunships blasting villages into rubble — but I won’t be a better person for it. The potential of the Internet to change society or empower individual creativity pales against the threats of bioterror, nuclear weapons proliferation, secret military tribunals and the spread of radical fundamentalisms of all kinds. If Silicon Valley wasn’t convinced that it was no longer the center of the universe after the dot-com bust, then Sept. 11 brought the news home with a sledgehammer.
Sept. 11 refocused American attention on the real world, and for that, maybe, we should be grateful. But I’m finding, as a longtime technophile, that the events of this fall have had a chillingly unexpected aftershock. Even as the gadgets march on, and the resolutions get crisper, and the bandwidth pipes get fatter, I’m wondering, do I even believe, anymore, in progress?
On a recent episode of the sublimely silly “Dave & Steve’s Video Game Explosion,” Dave & Steve cited Diablo 2 as a game that fit the category “Games that take a million hours to play.” Then they took ironic enjoyment in abusing a flunky of theirs whom they had sent to play the game for a week, because he had only gotten his character to the 87th level — instead of maxing out at 99.
I had to chuckle. I’ve done my time on Diablo 2, and before that Starcraft, and before that Total Annihilation and on, backward in time, to that first ecstatic moment of Asteroids on a Radio Shack TRS-80. I know all about that quality that game creators aspire to: replayability. I’ve been a willing pawn — convinced, like everybody in the gaming biz, that the games keep getting better and better, more engrossing, more immersive, more narrative-driven or more gory or more realistic — whatever the player wants. Buy one disc, play it for a million hours — now that’s value.
But after a million hours of Diablo, what have you got? A sore wrist, most likely, not to mention a comprehensive understanding of what armor attributes are most useful for defending against Razor Pit Demons. Oh, and then there’s that hollow, empty feeling in your stomach when you wonder whether it was really worth it to sacrifice so many hours of your short life in a futile search for a complete set of Milabrega’s Regalia.
Ultimately, was the level of engrossment really any different from what it was for, say, Tetris? I would never read a novel or watch a video over and over again for as many hours as I’d sink into a good game, but that doesn’t mean the game is better than the book or movie. In fact, it probably means the opposite: Replayability is insidious — it’s a vortex of escapism qualitatively different from that provided by TV or mystery novels or sports events.
The kind of hardcore gamers who will play Diablo every night for five or six hours are still a fraction of the overall population (though probably a bigger fraction of the teenage population). So maybe it’s no big deal — just an aberration affecting the terminally obsessive. After all, there are still plenty of people out there being creative with their computers, editing home video, making music, instant-messaging with the rest of the world 24/7, 365 days a year. The gamers are by no means the norm — just as Silicon Valley’s view of itself as the bleeding edge of the universe was not shared by everyone outside the borders of California (or even within them).
But the aspect of technological progress that was chafing me was a sense that replayability, or something like it, was more than just a goal for games — it was also an essential part of the whole apparatus of gadgets and Internet infrastructural improvements being shoved at consumers from every direction. Too many of the things we are told are “enhancements” are really just ways of keeping us plugged in for longer and longer times.
So we have to have broadband, to make our streaming audio and video more satisfactory, and we need a PDA with a color screen so we can spend more time looking at our calendars and contacts databases without our eyes crossing, and we must have wireless connectivity because we hate dealing with cords and abhor being out of contact with our e-mail. We want the technology to improve so we can spend more time — pleasurably — with it. We want, in essence, a kind of constant replayability — to be always logged on and never dissatisfied, never uncomfortable, always entranced. And, of course, there is no end to the incremental improvements one can make toward becoming more comfortable.
The idea of incremental improvement took a big hit when the World Trade Center towers fell. And not unrelatedly, after Sept. 11 I haven’t played one minute of Diablo. And it’s not because I suddenly have qualms about hacking Venom Lords or Oblivion Knights with my battleax while real people are being shredded by 15,000-pound daisy cutters in the snow and ice of the Hindu Kush. It’s because replayability suddenly seems like a copout. It’s because staring into my computer screen, for whatever reason, seems vapid. Suddenly, I check my e-mail less often, and no longer feel so sorely the lack of a cellphone. Do I really need to be instant messaging about my sense of horror at thousands of dead in New York and anthrax and mutilated Afghan children at every possible second?
It would be all too easy, here, to segue into a neo-Luddite rant about the evils of technology; to conclude that at the close of the year 2001, the lessons of terrorism and war and recession should teach us to turn away from the evil simulations of pixels and packets. But that’s not the whole picture.
My lack of excitement for ensuring that my browser software is up to date and there’s enough RAM on my system and that I’ve got the right add-on cards for my PDA isn’t because I think technology is bad. It’s because the events of this year have left me craving something more real. There’s a place for technology — even the most seductive consumerist gizmos, which sales figures suggest are unexpectedly flying off the shelves this Christmas — in that search for the real. But it’s not to be found in the comforting idea that progress is inherent in the inevitable advance of chip speed and jaw-dropping graphics.
In the aftermath of Sept. 11, like much of the rest of the United States population, I plunged into an instant course of study on Islam and oil, Middle Eastern politics and Central Asian geography. But while the struggle to get educated seemed worthwhile and certainly appropriate to my job as a journalist, it also felt inadequate. I wasn’t about to learn Arabic or become an oil futures analyst. I wanted to engage, but felt the level of my engagement to be trivial.
I found my thoughts drawn back to a previous life as a China scholar, before I was seduced by the Net and technology. There, I did have some expertise, and some language ability, and some knowledge upon which to build. Sure, China had suddenly slipped out of the news, but it would be back, I was sure of that.
I set up a kind of shrine in my bedroom, populated by exotic jungle plants, Chinese dictionaries and some pens and paper. I started studying, reading an autobiography of a Taiwanese high-tech entrepreneur. Hours once spent crafting Starcraft strategies or hacking on Linux were now devoted to unlocking the mysteries of Chinese characters and wrestling with a language rooted in thousands and thousands of years of intricate culture. It was oddly disconnected from anything happening in Afghanistan, but it was also infinitely more satisfying than checking my e-mail or Web-surfing game sites.
There was also an aesthetic dimension that was pleasing. A long flat table, with just a few dictionaries, a light, a marble Buddha, a notebook and some colored pens. No phone, no Internet access. No hollow feeling at the end of the night for having sent so much energy swirling into a game-interface vortex.
But this refuge was not anti-technology. At one point, it occurred to me that an essay was brewing, and I wanted to jot down a few notes. I pulled out my 2-year-old Vaio laptop and placed it on the table. Slim, sleek, exquisite in that trademark Sony consumer-technology way, it looked appropriate amid the smiling Buddha and the spider plants in a way that my bulky desktop would not have.
The appreciation of the aesthetic dimension of new technology left me somewhat confused. I had talked myself into rejecting the “next year’s model is a must have” mentality — but at the same time, I cherished the power and beauty of good design. Was I succumbing to some Martha Stewart-like version of the good techno-life? Should I get a new Nokia phone because, like a new shirt or pair of patent leather shoes, it made me feel more stylish, more full of self-esteem, just to brandish one?
Probably not. But there is clearly a point where planned obsolescence intersects with an actual improvement in quality — whether that quality is measured in terms of increased productivity, or on a less definable aesthetic or spiritual basis. Once I had reoriented myself away from seeking new technology in order to be engrossed or distracted by it, and put it into the context of a comprehensive attempt to make the minutes go by more meaningfully, my laptop seemed less a symbol of consumerist gratification and more a part of a fully lived life — less an obsessed-over fetish and more a satisfying tool.
The shrine to China is not the only new station in my household. My purchase of the Lego Mindstorms Robotics Invention System — ostensibly for my 4-year-old son, who is obsessed with robots — required its own altar, complete with the Vaio, which did have a USB port, and speakers, so we could listen to the instructions and video clips on the CD-ROM, and buckets of Lego parts.
My son sat on my lap as we installed the software and watched the introductory “tour.” If anything could be described as techno-porn, this was it. Video images of outlandish Lego robots, dancing to a rock ‘n’ roll beat, complete with the promise that with this system, you would have the power to create robots that did what you wanted them to. My son declared that he had never seen anything so cool in his whole life, and I had to agree.
So now I spend some hours not grappling with Chinese but down in the basement, constructing robots and learning to program — with or without my son, who is less interested in the complexities of branching paths and variables than in seeing an actual robot crash into a wall and then bounce away, beeping.
Have I relapsed — is this Lego binge just an immersion in a new computer game even cooler than the latest real-time strategy hit? Perhaps. Maybe the pull of next year’s technology is too strong for me to resist forever, and as the immediate impact of Sept. 11 fades, my willingness to succumb to the Sonys of the world will grow.
But I see a difference. Time spent on the robots, like the time I’d spent with Chinese, is not time lost forever — it’s time spent learning, whether about ancient cultures or modern programming concepts. It’s time spent engaging with the world, rather than spent escaping from it. Time spent growing as a person, rather than fiddling with a mouse. Time spent loving the technology less for its own sake and more for what it can do. The lesson I’ve learned about technology in 2001 is not that it is evil, or that it will change the world, but that we who use it can change ourselves, should we choose to do so. Progress is to be found not in the chips but in our heads.