The prince of polygons

How, with a little help from Microsoft, 3-D chipmaker Nvidia won the hearts and minds of hardcore gamers.

Published May 16, 2002 7:30PM (EDT)

In the world of golf, new designs for clubs and balls can lead to greater accuracy and longer drives. The game of tennis has changed dramatically from the days when Bjorn Borg swung a wooden racket, as the progression to steel and graphite made serves ever more powerful. Likewise, moviemaking and pop music take advantage of every increase in special effects or production technology.

But gaming, which exists in a strange terrain somewhere between competitive sport and pure entertainment, is inseparable from gaming technology to a degree that trumps everything else. New technology makes possible new kinds of games -- and gamers have come to expect that each year's games will be prettier, more realistic, more immersive. Computer gaming has become a standard-bearer for the belief that technology, of any kind, is inherently progressive. So what if the dot-com bubble popped, and the Internet hasn't yet demolished international borders or toppled authoritarian governments? Games are still getting better, technically. Computers keep improving, bandwidth keeps growing, and the picture keeps getting brighter.

3Dfx stumbled in 1997, when its first attempt at an integrated 2-D/3-D card, the Voodoo Rush, cratered. For the first time, 3Dfx showed vulnerability and allowed potential customers to look to alternatives -- a problem accentuated in 1998 when the Voodoo Banshee card was also panned. But the 1998 introduction of the Voodoo 2 chip set, which offered significant speed gains over previous cards -- when used with Glide-compatible games -- won plaudits from gamers.

And that was crucial, because possession of a good 3-D accelerator was the only way to play certain games as they were meant to be played. The shooter Unreal, for example, was little more than a well-executed variant of the Doom run-and-gun formula. But when matched with the newest accelerator card, the game's graphics were astonishing -- so astonishing that one game magazine warned readers that its cover image was in fact a screen shot from the game. The enemies looked and moved more realistically (as realistically as a 7-foot-tall bipedal reptile with razorlike claws could move), thanks to the new card's ability to push even more polygons, which also allowed for larger and more detailed environments. The new card also handled reflective surfaces, multiple light sources and more. Older cards supported some of these features, but though the result might be almost as pretty, frame rates ran in the single digits. Gamers rushed to upgrade their systems.

This symbiotic evolution cycle continues to this day: Developers, desperate to distinguish their product in the marketplace, regularly throw in graphics features they hope will seem new and innovative -- facial expressions, limb-specific damage modeling, sparkly weapons effects -- to elicit a "whoaaa" from the gaming press. Though new graphics are not sufficient to ensure that a game will be a hit, last year's graphics are a sure knock against any game. As most triple-A titles take years to gestate (so much so that one company lists the release date for a long-anticipated game as "When it's done"), most developers aim their products at the upcoming generation of video cards, sometimes working closely with card makers to get a jump on what those features may be. When the collaboration works, it works well for both parties: The desirability of one spurs interest in the other.

However, the cycle can break down when new products -- like 3Dfx's Rush and Banshee -- don't feed the frame-rate or feature beasts. Those failures, most observers agree, were the beginning of the end for 3Dfx -- and gave Nvidia an opening.

"In a nutshell," says Brian Hook, formerly a 3Dfx lead programmer, "3Dfx's management simply couldn't capitalize on the boon handed to them ... and as a result they flailed and turned over management numerous times. The top management at 3Dfx was never, ever in a position to really maximize the opportunity presented to them, and it just happened to take a couple years for reality to catch up. 3Dfx coasted a long time on Glide compatibility and name recognition, but neither of those would last forever -- and they knew it. They couldn't cultivate relationships with OEMs [original equipment manufacturers] -- another key sign of failure on their part."

And Nvidia wasn't sitting still. In 1997 it introduced the Riva 128 chip, which, though it didn't match the existing Voodoo-based cards in terms of performance, was inexpensive and, crucially, supported both 2-D and 3-D. Nvidia won a few OEM contracts, and this base kept it in the game. Meanwhile, ATI released its improved Rage 128 card; however, it was plagued with software driver problems -- a curse that would follow ATI products for years and hinder its aftermarket cachet.

One year after it showed signs of new life with the Riva 128, Nvidia introduced the Riva TNT. From there, the company was in clover -- and it never looked back.

In the fast-paced world of computer products, brand loyalty, such as the kind that Apple engenders, is the exception. Perhaps that is a reflection of the greater influence of marketing -- no longer do you see "Ford families" or "GM loyalists" on the roads -- or a recognition of the turnover rate in the what's-hot category of the tech market. In the 3-D card world there wasn't even much loyalty to certain technologies. Gamers bought what produced the best results: the fastest game play, the best image, the most reliable and stable card.

If you ask 19-year-old Greg Tan in between his attempts to hijack a beautifully rendered cargo ship on his screen at Next Game, he'll tell you that better graphics are a plus but the key factor is speed: "If I'm losing because the game's too slow," he says, "then I'll upgrade." He doesn't pay much attention to brand names but does keep track of what his friends are saying about individual products. "If it's got a reputation as being slow, that's a big negative," he says. "As long as I'm winning and killing people, I'm happy."

Just such happiness was what Nvidia's TNT card promised to deliver. It won industry and press awards for its performance and compatibility with gaming APIs. But even more important, Nvidia began to successfully imitate 3Dfx's strategy of targeting top 3-D software developers, hoping to get them on board with the advanced features the new chip offered.

Those features -- with names like "stencil," "destination alpha" and "full-time multitexture" -- along with 32-bit color (which 3Dfx would take another year to support), represented the advances Nvidia had been planning for a year, almost two product cycles earlier. With support for more colors, the TNT could show subtler gradations, such as those in a sunset, without the "banding," or visible leaps in color, that 16-bit cards suffered from. More colors also meant, again, more realistic scenes. The other features allowed for better depth imaging, better object textures and other improvements to overall visual quality -- which was becoming almost as important a selling point as the ability to churn out high frame rates.
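For the technically curious, here is a rough Python sketch of the banding arithmetic -- illustrative only, with an assumed 1,024-pixel-wide gradient: quantize a smooth ramp of intensities to each format's per-channel precision and count how many distinct shades survive.

```python
# A rough, illustrative sketch of color banding. The 1,024-pixel
# gradient width is an assumption chosen for the example.

def quantize(value, bits):
    """Round a 0.0-1.0 intensity to the nearest level representable
    with the given number of bits per channel."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

# A smooth horizontal gradient, one intensity per pixel column.
gradient = [x / 1023 for x in range(1024)]

# 16-bit color typically gives a channel 5 or 6 bits; 32-bit color
# gives each channel a full 8 bits.
shades_16 = len({quantize(v, 5) for v in gradient})
shades_32 = len({quantize(v, 8) for v in gradient})

print(shades_16)  # 32  -- a staircase of visible bands
print(shades_32)  # 256 -- steps too fine for the eye to pick out
```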

"At this point," Hook says, "they had a strategy that worked: Release products quickly with lots of incremental improvements, build and keep strong OEM partnerships, don't alienate partners by competing with them, and start cultivating strong developer relations."

The "release products quickly" part would become a crucial advantage as 3Dfx continued to delay its next generation of product, the Voodoo 3. This would finally be 3Dfx's one-card solution, and it was promised that the Voodoo 3 would support at least some of the features of the Riva TNT.

Ever since the Riva 128 card, Nvidia had been on a strict six-month schedule of product releases and revisions. If a new card came out in the spring, expect a faster version in the fall. And expect a new card the next spring. As Gordon Ung, a senior editor at MaximumPC, says, Nvidia's management learned to "execute like Germans." That meant that Nvidia could put out two or three product cycles while 3Dfx was delayed on one.

Tony Tamasi, now general manager of graphics processors for Nvidia but then 3Dfx's director of product marketing, says: "Fundamentally, I think the thing that Nvidia brought to the industry was a rhythmic, six-month product pace. This was a philosophical paradigm shift in the industry." Previously, product cycles had been eight months to two years. However, the frantic pace of consumer demand made the old model untenable.

As the computer world knowingly approached Y2K and unwittingly faced the NASDAQ crash, the unsupportable, Dale Carnegie-like optimism that fueled the dot-com world was actually being fulfilled in the games market. Here, it really was true that every year brought closer the fever dream of "Toy Story" on the desktop. Consumers, ready to toss a few hundred dollars here or there to meet the system requirements of, say, Quake III or Unreal Tournament, were primed for regular technology advances. They expected them. And it was crucial for chip makers to deliver.

Delivering was key, agrees Jon Peddie, president of Jon Peddie Research and a longtime industry analyst. Nvidia, he says, "executed flawlessly." He notes that Nvidia didn't succumb to the same disease of "creeping featuritis" that 3Dfx did. "If a product doesn't have all features working by the time it's cut and delivered, they hold the feature for the next revision," he says. They could do this because both the engineers and the management knew that the held feature would show up in the next revision, six months down the line.

"3Dfx was one of the worst sinners in not shedding items and shipping," Peddie adds. "They did such a great job on the first three products, buyers were willing to wait" -- but that patience went only so far.

"3Dfx coasted a long time on Glide compatibility and name recognition," says Hook, "but neither of those would last forever -- and they knew it."

It'd been a while since 3Dfx had a big hit. The Voodoo 2 was yesterday's news, the 2-D/3-D cards had hemorrhaged resources and money from the company, and Nvidia was the new darling of an audience that 3Dfx knew was hungry for the next big thing. All those pressures led to 3Dfx's next, last, desperate move.

Rejecting its previous business model of building chip sets and selling them to manufacturers, in December 1998 3Dfx moved to become vertically integrated, acquiring card manufacturer STB Systems in a stock transaction worth $141 million. It was undeniably a gamble, especially since 3Dfx angered a number of its old business partners by terminating its long-established contracts with third-party card manufacturers. It was also a dot-com-like move, one that expanded the company suddenly and violently and staked the whole business on its ability to expand its fortunes in an undefined future. Crucial to this, the company thought, was elevating the 3Dfx brand instead of selling chips to companies like Diamond, which then made Diamond-branded cards. And 3Dfx needed to shore up the fortunes of Glide, which was under increasing pressure from Microsoft's Direct3D.

What this meant for the expanded company was that it suddenly had to take on the Herculean tasks of retail distribution, promotion, support, inventory and all the other needs of a hardware company that had to reach and deal with end users. And that was in addition to refocusing the remnants of STB as well as learning how to make the cards themselves at the new manufacturing plant in Juarez, Mexico. It was a bold move -- but it turned out to be the company's Waterloo.

Though he'd left the company by then, Hook says it was fear that drove 3Dfx to make this radical move: "They basically freaked out because they didn't trust their board vendors anymore, and they felt they could control the whole package going that route. They were also way deluded into thinking that they could leverage STB's relationships with OEMs like Dell into getting inclusion in Dell's systems. Bzzzt. OEMs don't care about the board vendor; they care about the chip set. Choice of board vendor becomes an issue when it comes down to pricing and availability." And 3Dfx was losing ground on all fronts. For example, even the Voodoo 3, still in prototype, would not support features, such as stencil buffering and destination alpha, that OEMs considered necessary selling points.

Not that that was immediately apparent. At the turn of 1999, 3Dfx had a market capitalization of $200 million; almost one-third of all aftermarket 3-D cards sold carried 3Dfx chips; and various gaming Web sites were labeling the company "king of the 3-D hill." But some of the gaming press also asserted that Nvidia had stolen the show in hardware during 1998. And Nvidia had made deals that put its products into retail systems from Dell, Compaq, Micron and Gateway and was announcing quarter after quarter of record revenue, hitting $65.5 million a quarter in February 1999. A month before, Nvidia had gone public and was listed on the then booming NASDAQ.

When the five-months-late Voodoo 3 finally appeared in mid-1999 (at last, 3Dfx had an integrated 2-D/3-D card!), with actual product for the first time shipping under the 3Dfx name, the brand's reputation carried it to the top of PC Data's sales chart. But by then Nvidia had released, in six-month lockstep, the TNT2 chips and its ultimate market master, the GeForce.

While 3Dfx promoted the Voodoo 3's "T-buffer" technology -- which, Accelenation's Thomas Monk points out, offered little in place of raw fps power and relied on masses of RAM, which was once again expensive -- Nvidia was announcing the GeForce 256. It was the first 3-D accelerator that supported transformation and lighting (T&L) calculations right in the hardware, offloading those tasks from the CPU and making games look even better while preserving the all-important frame rate.
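To give a flavor of what moved onto the chip, here is a minimal Python sketch of the T&L math -- one vertex transformed by a matrix, then lit by a single light. It illustrates the general technique only, not Nvidia's implementation; real hardware ran this for millions of vertices per second.

```python
# A minimal sketch of transformation and lighting (T&L): the vertex
# math the GeForce 256 pulled off the CPU. Illustrative only.

def transform(matrix, vertex):
    """Apply a 4x4 transform (given as four rows) to a 3-D point in
    homogeneous coordinates, with w assumed to be 1."""
    x, y, z = vertex
    return tuple(r[0] * x + r[1] * y + r[2] * z + r[3] for r in matrix)[:3]

def diffuse(normal, light_dir, color):
    """Lambertian shading: brightness scales with the angle between
    the surface normal and the direction to the light."""
    intensity = max(sum(n * l for n, l in zip(normal, light_dir)), 0.0)
    return tuple(c * intensity for c in color)

# One vertex, shifted 2 units along x and lit head-on.
move_right = [(1, 0, 0, 2), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]
print(transform(move_right, (1.0, 1.0, 0.0)))    # (3.0, 1.0, 0.0)
print(diffuse((0, 0, 1), (0, 0, 1), (1, 0, 0)))  # (1.0, 0.0, 0.0) -- full-brightness red
```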

"They hit the market with a double-barreled shotgun," says Monk, "and stunned the competition." As for 3Dfx, developers were increasingly reluctant to churn out Glide versions of their games, which threatened to cut off the company at the knees, if not higher.

Nvidia made the most of this, taking the chance to rebrand the system not just as a 3-D accelerator but as a GPU -- a graphics processing unit. The new capabilities would require some rejiggering by game developers, but Microsoft's support for them in revisions to Direct3D made the change easier. Nvidia was a prized corporate partner now. How prized could be measured by the intrigue roiling around Microsoft's nascent Xbox.

Though practically everyone who was anyone in the field had at one time or another been rumored to be involved in Microsoft's attempt to unseat Sony and Nintendo in the console world, the leading candidate to manufacture the graphics processors had been a small start-up named Gigapixel. In fact, Gigapixel engineers worked in the same office buildings used by Microsoft's WebTV operation.

Behind the scenes in the spring of 2000, 3Dfx had been working to acquire Gigapixel, both for its promising intellectual property and for the much-needed windfall the Xbox deal would represent. However, that would be a $186 million swing and a miss.

Less than a month before the Gigapixel deal closed, and long after it had been set in irrevocable motion, Microsoft announced that it would instead co-develop with Nvidia a new GPU for the game machine. That March, rumors alone had driven Nvidia stock up 71 percent, to $100 a share, in the three days before the official announcement; even after profit-taking, Nvidia's stock remained at all-time highs. In contrast, the bridesmaid 3Dfx had once again devoted resources and time to something that had only postponed the release of its long-awaited products.

In April 2000, Nvidia, not burdened with running its own manufacturing and marketing business, launched the GeForce 2 GTS, which not only ran at higher clock speeds than its predecessor but combined existing technologies, such as full-screen anti-aliasing, which minimizes the "jaggies" that previously plagued the appearance of diagonal lines, with new ones like per-pixel shading. Again, the company went to game developers and charmed them, earning Nvidia-based cards the right to boast of displaying visual qualities no other card could offer. And that month, ATI showed a preliminary version of its Radeon chipset, which would win back some hardcore credibility for the company.
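Roughly speaking, full-screen anti-aliasing samples each pixel at several sub-pixel points and averages the results, so an edge fades smoothly instead of stair-stepping. Here is an illustrative Python sketch of that idea; the 4x4 sample grid and the 45-degree edge are assumptions chosen for the example.

```python
# An illustrative sketch of supersampled anti-aliasing: estimate how
# much of each pixel a shape covers by testing a grid of sub-pixel
# points, so an edge fades smoothly instead of stair-stepping.

def coverage(px, py, inside, samples=4):
    """Fraction of the pixel at (px, py) covered by the region the
    `inside` predicate describes, from a samples-by-samples grid."""
    hits = 0
    for i in range(samples):
        for j in range(samples):
            x = px + (i + 0.5) / samples
            y = py + (j + 0.5) / samples
            hits += inside(x, y)
    return hits / (samples * samples)

below_diagonal = lambda x, y: y < x  # a 45-degree edge

# A pixel the edge passes through gets an in-between shade instead of
# an all-or-nothing "jaggy": about 0.4 here (true coverage is 0.5).
print(coverage(0, 0, below_diagonal))  # 0.375
```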

Months earlier, 3Dfx had announced its Voodoo 5 line, which would add little other than increased clock speed -- but the company's only news was that the card would be delayed until June. Add to that production problems and bitter labor disputes at the Juarez plant, and 3Dfx was reporting fiscal woes in early 2000.

In August, the company pre-announced that its fiscal report for that quarter would show a "large" loss. Revenues had dropped to $67 million, down from $104 million for the previous quarter, while Nvidia's yearly revenues topped $452 million. 3Dfx still had a gross operating profit of about $10 million, but the cost of acquiring Gigapixel resulted in a $100.5 million net loss for the quarter. 3Dfx was burning through its now limited cash supplies; the company looked desperately for new strategies.

One was to develop a new consumer chip using the technology it purchased from Gigapixel. This chip, code-named Rampage, never made it past the larval stage. Part of the reason was what Jon Peddie now calls a "dissipation" of the company's resources. In November 2000, 3Dfx said it was pursuing plans to branch out into the general-purpose graphics chip market (set-top boxes, game consoles, cellphones and other devices), an area long dominated by ATI, Intel and others.

Not one of these products shipped.

That same month, 3Dfx closed the Juarez plant and announced it would get out of the business of manufacturing the cards themselves.

And then it was a short 30 days until 3Dfx let it be known that it had laid off its entire staff and sold a variety of assets (not including the Juarez plant) to ... Nvidia.

The gaming press was stunned. There had been nothing but ominous news from 3Dfx for a while, but this was a shock. Suddenly, the competition was over and the enemy was us: For $112 million, Nvidia picked up 3Dfx's patents, brand names and inventory, Gigapixel's assets, and more than 110 3Dfx employees -- including Gary Tarolli, one of 3Dfx's founding architects.

Game over, man. Nvidia was the winner. And six months later, the company introduced a new product. And six months later, another. Pixel shaders, vertex shaders -- technology marches on.

Does this mean that Nvidia rules every computer running Quake III? Not by half.

The retail aftermarket accounts for "about 10 percent" of the total 3-D card market, points out MaximumPC's Ung; the rest moves through OEM deals made by companies like Dell and IBM. And even in the aftermarket, ATI has retained a strong place; the situation on the Mac side of things, where you can order up a new Power Mac G4 with a GeForce 4 MX, GeForce 4 Ti or Radeon card, is not a bad reflection of the larger picture.

And while Nvidia may currently account for about two-thirds of the standalone chip market, according to Nvidia's Burke, in the larger "integrated" scene, which includes motherboard-based graphics chips and more, Nvidia chips represent significantly less than half of the market.

However, Nvidia may be making a move in that direction with a motherboard design dubbed nForce -- it was enough to impress übergeek site Anandtech into calling it "the biggest step forward in chipset technology we've ever seen." Leveraging technologies and techniques learned from its contributions to the Xbox, the nForce (which will be released as a final product by most major motherboard manufacturers) features an integrated graphics accelerator chipset, though users should be able to swap in a different accelerator card if they wish -- perhaps even a Radeon. The motherboards should be available this summer; there's no word yet on when nForce-based PCs will hit the market. Whether this can cut into ATI's good OEM karma is the big question.

On the other end of the spectrum, Nvidia is targeting the professional 3-D workstation market with its Quadro line of chipsets. These go into the monster boxes that render 3-D scenes more complex and lifelike than is possible on a mere mortal's machine. By most accounts, Nvidia was welcomed into the market, which had been shaken by the implosion of DEC, the hobbling of SGI and other trauma. The fact that professionals can now spend a mere grand and, with a Windows box, match or surpass the performance of last year's specialized workstations has given Nvidia the luster of a minor savior with that crowd.

Of course, none of this is guaranteed to last. Nvidia may be at the top of the heap in most people's heads now, but the same elements that brought the company to prominence could easily be used to unseat it. On one side, 3Dfx-like financial problems loom over Nvidia, from SEC investigations into Enron-like accounting irregularities and allegations of insider trading, to a dispute with Microsoft over the pricing and availability of Xbox components. And then there are rumors, vehemently denied by Nvidia, that Microsoft may drop it from the lineup for the yet-unannounced Xbox 2.

There's also the fact that the maturation of Direct3D (part of a larger collection of APIs called DirectX), which allowed and prompted Nvidia's rise, has lowered the barrier for the next upstart. DirectX is a de facto standard in Windows, which means that those who would design new hardware don't have to face the same turmoil that caused previous chip makers such grief. In a way, the pioneers have made it easy for the kids today.

Will that matter to the true believers at Next Game? Perhaps. There is, as there has always been, a next generation of games on the way, games that will certainly bring today's technology to its knees. Doom III, Dungeon Siege, Duke Nukem Forever -- all of these will sorely tax today's crop of video cards as gamers look to achieve the cinema-quality graphics that previews have hinted at. It's a virtual certainty that many of these gamers will rush out and plop down a few hundred dollars just to shoot at the most lushly rendered monsters yet.

That doesn't mean that the new games will actually be any fun to play. As any gamer could tell you, for every new Doom or Warcraft, there's a Rune or a Blair Witch game -- something that may look good but offers such dull and lifeless gameplay that a few rounds of the original Doom or Marathon would be infinitely more appealing. Or even Pong. Even the most exquisitely rendered Balrog can't make up for a bad story or dumb design.

Despite that, there seems to be no slowing of the cycle of software pushing new hardware and new hardware prompting new content. Most of us have reached our limit in what we need from a word processor or a spreadsheet or an e-mail program. But will we ever reach a limit in what we lust for in entertainment?

The recent history of gamers and game technology suggests not. Just ask the guy purchasing a copy of Jedi Knight II at the local software outlet: Will you upgrade? The question isn't if but when.


By Daniel Drew Turner

Writer and editor Daniel Drew Turner covers technology, design and politics.
