High tech’s missionaries of sloppiness

Computer companies specialize in giving consumers lousy products -- it's the American way of techno-capitalism.

Have you lately had a rat’s nest of computer-related problems take over your life for days, or even months — wrecking your work schedule, your leisure plans and your sleep?

If you have, I’m sure you long, as I do, for the day when the computer and software companies that inflict more pain of these dimensions on consumers than any other industry are flattened by the competitor they so richly deserve — the cyber equivalent of the AK-47. That’s the Avtomat Kalashnikova 1947, the rifle that for roughly half a century has continued to outsell all its rivals — including its chief American competitor, the M16 — around the world.

The AK-47 has every attribute of products I love best. Type the name of this assault weapon into Google and you’ll find site after site listing the same shining virtues: little changed from the original model, the AK-47 and its derivatives are deliberately simple in design, therefore easily and inexpensively manufactured — and above all, reliable, the reason why an estimated 40 million of them have been made so far. For many years, the far more complicated M16 — packed with innovations — was famous mostly for jamming.

It hardly requires a shrink to explain why someone like me — who ordinarily finds the very idea of guns nauseating — should enjoy conjuring up the arms business when reminded that American computer companies are fully aware of the glitchiness of their products and don’t care.

I’m not talking about planned obsolescence, the (dubious) idea that shortened life spans have to be built into industrial products to ensure that industries have enough customers to stay alive.

I am talking, for instance, about the unsurprising message in PC World’s July issue — based on responses from 16,000 subscribers — that computer owners are having more trouble than ever with their machines, and that very few of them are happy with these products or the quality of service from their makers. In analyzing repair histories of 13 kinds of products gathered by Consumer Reports, PC World found that roughly 22 percent of computers break down every year — compared to 9 percent of VCRs, 7 percent of big-screen TVs, 7 percent of clothes dryers and 8 percent of refrigerators.

I am talking about a study of personal-computer failure rates by the Gartner Group, which found an annual failure rate of 25 percent for notebook computers used in large American corporations. “That means one out of four notebooks will fail in a given year,” says Kevin Knox, a technical director at Gartner, who believes that rate has in all likelihood increased since the study was done three years ago.

None of this is accidental. A culture of carelessness seems to have taken over in high-tech America. The personal computer is a shining model of unreliability because the high-tech industry today actually exalts sloppiness as a modus operandi.

Not long ago, Silicon Valley marketing guru and venture capitalist Regis McKenna — for whom I was editing a book — told me that high-tech leaders who had once made pilgrimages to Japan to understand quality circles and other tools of quality control had lost interest in those buzzwords of the 1980s. They had come to see their product reliability problems as an inevitable side effect of what they excelled at — innovation at top speed.

“‘Act fast and fix the problems later’ is how we operate here,” Regis said. He showed me a Stanford Computer Industry Project study whose conclusion was that Japan would always lag behind America in software innovation and sales because of a business culture in which perfectionism is rampant. Unlike Japanese computer companies hobbled by elaborate quality control and testing procedures, the Stanford researchers found, American companies accept “good enough” quality for the sake of speed. Being first to market with new products is exalted as the highest goal here, and companies fall back on huge technical support and customer service staffs to cope with their many errors of commission and omission.

“Don’t worry, be crappy,” was how Silicon Valley veteran and pundit Guy Kawasaki expressed the same idea two years ago, in a speech that won him a standing ovation. He explained to his audience of 1,000 entrepreneurs that revolutionary products don’t have to be fault-free: “Do not believe that the first version has to be perfect. If the software industry were honest, they would tell you the algorithm is: ship, then test.”

But what does the personal computer industry mean when it says “first version”? Seemingly, anything. The new features crammed into virtually every product and every software release could put most of our significant computer-related purchases into that category.

Computer and software companies could improve the reliability of their products. But they simply don’t.

Thirteen years ago, Watts Humphrey — a 27-year veteran of IBM who is now a fellow of the Pentagon-financed Software Engineering Institute at Carnegie Mellon University — developed a methodology for designing quality and reliability into software products. The idea at its core is that high quality has to be designed into software development and manufacture from the start; it cannot just be “tested in” at the end of the process.

So what about those companies that whine that giving consumers bug-free products would mean raising their prices by as much as 50 percent? Quality-focused software development can dramatically shrink overall development costs, says Humphrey. The few American companies that have adopted his techniques show astonishing results. For instance, at Raytheon Electronics Systems, where the cost of quality was almost 60 percent of total production costs in 1990, that tab had fallen to 15 percent by 1996 and has since sunk below 10 percent.

Humphrey believes that there is no excuse for glitchy software. “We should stop talking about software bugs as if they were mere annoyances,” he has said. “They are defects and should be so labeled.” Unlike software companies, he said, “Many other industries produce high quality products and take full responsibility for their defects.” Though commercial aircraft are, like computers, extremely complex hardware and software systems, their makers do not duck responsibility for their flaws.

But Humphrey has been ignored by the American personal computer industry. Many technologists note an eerie parallel to the American automobile industry’s disdain, in the 1950s and 1960s, for the quality-boosting methodologies invented by W. Edwards Deming — on which Humphrey’s technique is closely modeled. And they predict that someday soon, the computer industry of some foreign country that embraces Humphrey’s ideas will do to its American competitors exactly what Japanese car makers did to Detroit.

Bryan Pfaffenberger, on the faculty of the School of Engineering and Applied Science at the University of Virginia, is one of many experts reminding us that the Japanese auto industry thrived by giving Deming’s ideas a home:

“Japanese car makers took Deming’s teachings to heart,” writes Pfaffenberger, “and they started making some exceptionally fine automobiles. What’s more, they were cheap. The result? Japanese auto makers grabbed nearly a third of the U.S. market and most of the international market.”

Already there is one foreign country venerating Humphrey, 73, the way the Japanese did Deming. India has 22 of the 38 software companies around the world that have adopted his methodology and are certified to have met the Software Engineering Institute’s highest — “Level 5” — standards for quality. (Four American companies, including Perot Systems and Citicorp, own Level 5 subsidiaries in India.) Last year, the Indian government and several Indian companies founded the Watts Humphrey Software Quality Institute in Chennai, in South India, where a contract software development firm called Advanced Information Systems is churning out software with just 0.05 defects per 1,000 lines of code — “better than the space shuttle’s software,” Pfaffenberger says — and has, as a result, doubled its profits.
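To put that 0.05 figure in perspective, here is a back-of-the-envelope sketch of what defect density means at shipping time. The 0.05 defects-per-KLOC rate comes from the article; the “typical commercial” rate of 5 defects per KLOC and the million-line product size are assumed figures for illustration only, not claims from the article.

```python
# Rough comparison of expected shipped defects at different defect
# densities. Defect density is conventionally quoted per KLOC
# (thousand lines of code).

def shipped_defects(lines_of_code: int, defects_per_kloc: float) -> float:
    """Expected number of defects remaining in a shipped product."""
    return lines_of_code / 1000 * defects_per_kloc

PRODUCT_SIZE = 1_000_000  # a hypothetical 1-million-line product

level5 = shipped_defects(PRODUCT_SIZE, 0.05)  # the Chennai firm's rate
typical = shipped_defects(PRODUCT_SIZE, 5.0)  # assumed typical rate

print(f"At 0.05 defects/KLOC: {level5:.0f} expected defects")
print(f"At 5 defects/KLOC: {typical:.0f} expected defects")
```

On these assumptions, the same million-line product ships with about 50 latent defects at the Chennai firm’s density versus roughly 5,000 at the assumed typical rate — a hundredfold difference from process alone.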

Critics of Humphrey’s high-quality software regimen — which imposes strict performance measures on programmers — protest that it cramps creativity. “[A] fine expression of 19th-century ideas about scientific management … It’s a good thing for the technology that so few people are disciplined in the way Humphrey proposes,” grumbles a techie reviewer of one book by the quality expert, “A Discipline for Software Engineering,” at Amazon.com.

The unwillingness of programmers to submit to micro-management might be understandable from a psychological perspective. But any victim of defect-riddled personal computers — which is to say, virtually every user of these machines — is unlikely to have much sympathy for their feelings on that score. Speaking out on our behalf is a growing band of respected computer scientists and engineers who argue that the era of playful creativity governing the design and manufacture of PCs is over, and that it has got to give way to one in which computers are seen by their creators as being more like bridges and tunnels than, say, the houses of Frank Lloyd Wright or Le Corbusier.

So what can we do?

“While stories of bad software killing people are still rare, they exist and may portend the future,” James Morris, dean of the Carnegie Mellon School of Computer Science, recently argued in an essay proposing a joint research and education push by universities, government and industry to improve the dependability of computer systems.

He suggested calling this the High Dependability Computing Consortium (HDCC), warning that “as we entrust our lives and livelihoods to computers, many systems will effectively become critical … Even a simple word processor can become mission critical if it crashes a few minutes before the courier pickup deadline for a proposal submission. It is vital that even everyday, seemingly non-critical applications be raised to a higher level of dependability to replace the enormous hidden costs their unreliability levies on businesses and individuals.”

But set against the practices of the personal computer industry, Morris’ perspective might easily belong to another galaxy.

Common sense would seem to suggest that measuring defects is a vital first step toward eliminating them. Incredibly enough, no one collects (un)reliability statistics for personal computers and software. “There’s no independent body today collecting numbers and the companies themselves certainly won’t give out any information about the performance of their products,” says Gartner’s Knox.

Common sense would also seem to hold that knowing the root causes of breakdowns is essential in preventing recurrences. Yet, in a short paper Knox sent me, he wrote, “PC problems are very rarely diagnosed. Rather, ad hoc solutions are immediately applied.” The accepted industry practice, he said, is “fix rather than diagnose … For example, rather than diagnosing and fixing a specific system-level problem, the [computer company] might immediately apply a new motherboard assembly and BIOS, never determining what the root cause of the problem was.”

It’s as if Kawasaki’s “don’t worry, be crappy” advice about the development of products had become a license for slipshod work in every sphere of computer companies’ operations: carelessness that freely wastes these firms’ own resources.

Even a mere computer-user can see this. Let me explain. Twice, between May and October of this year, Dell, America’s biggest direct seller of personal computers to consumers, had to replace the modem in a $3,600 portable machine still less than two years old — a machine that had itself replaced its predecessor, a lemon, within six weeks of my original purchase. (I will leave the reader to imagine what weeks of pointless troubleshooting, assisted by Dell’s telephone tech support — first on behalf of one computer and then each of two modems — did to deadlines for my own work.)

After my second Dell modem failed, the company was supposed to send its on-site service tech to my house with a new motherboard as well as a third modem. But someone at a Dell warehouse somewhere in Texas failed to put the motherboard on the plane, so that the same service tech had to wait for another expensive overnight shipment and pay me another call two days later. When changing both components did no good and I was asked to let Dell fly my machine to its repair shop in Tennessee, my local Airborne Express truck driver also had to make a second trip to my house because the reference number a Dell employee gave the air-freight company did not match the one she had given me. And at the end of all this, when I finally had a working computer again, no one at Dell could tell me what had caused the latest episode of trouble; the root cause remained essentially unknown. (“Fix, don’t diagnose.”)

One more example of monumental inefficiency and waste: In the early autumn, two separate faults in my Hewlett-Packard printer — slightly more than a year old — led to hours-long tech support calls over several days, at the end of which the company’s tech support staff told me I needed the latest version of its software driver, and that I’d have to exchange my printer for a new one they would send out.

The replacement printer arrived the next day, shipped by FedEx even though it was perfectly useless without the new driver, for which I was told I’d have to wait a week. A kind HP tech support lady, Joann Osier, took pity on me when — after three weeks in which I had two printers but could print nothing — I still hadn’t received the promised software CD. She said she would simply FedEx me a sample copy she happened to have sitting on her desk, and she did. I thought of that good Samaritan with special gratitude nearly two months later, when the software promised nearly three months earlier finally appeared on my doorstep: It had been shipped by priority-overnight FedEx.

Perhaps the biggest irony in all this is that the shoddiness of high-tech products means that people don’t use more than a very small fraction of the innovations developed at breakneck speed that are supposed to justify high-tech sloppiness. For a start, many of these have more style than substance — what computer scientists are calling “feature-itis,” “glitzware” or, in a pointed reference to the products of late-1950s Detroit, “tail fins” and “flashy chrome bumpers.”

But that’s not the worst of it. In a syndicated newspaper column published in the Los Angeles Times on Nov. 27, Gary Chapman, director of the 21st Century Project at the University of Texas and an authority on the social implications of new developments in information technology, noted that “repeated experiences with software glitches tend to narrow one’s use of computers to the familiar and routine. Studies have shown that most users rely on less than 10 percent of the features of common programs such as Microsoft Word or Netscape Communicator. It takes a high toleration for frustration and failure to explore beyond the boundaries of one’s own comfort level … It also calls into question how much money and energy we spend on new software features that most people don’t use or even know about.”

In his essay on computer dependability, Carnegie Mellon’s James Morris wrote, “In the 1950s, what was good for the car industry was good for the U.S. … As with car quality in the 1950s, it is widely argued that it is a disservice to stockholders to make software more reliable …” I e-mailed him a few days ago to ask what he thought it would take before this state of affairs ends. In his reply, he mentioned that a number of Silicon Valley companies, as well as NASA and his university, would be launching his brainchild, the High Dependability Computing Consortium, on Monday. But he didn’t promote his consciousness-raising effort as the most likely agent of change (and he gave no answer at all to my supplementary question about whether American computer or software companies — as opposed to universities and government agencies — had committed serious sums of money or resources to the project).

“Not until the consumers demand [quality] and get it from overseas will the reigning companies believe,” he e-mailed me. “American computer and software companies are making too much money in the current environment to care.”
