We have computers. Why aren't we more productive?

Technology doesn't usually save companies time or money -- but in a competitive world, it often keeps them in business.

When project manager Mal Glendinning pitched his bosses at the Washington Water Power Company on a new customer information system, he claimed the software would save the Spokane, Wash., public utility time and money and, by extension, make it more competitive. Like many computer professionals, he never considered the possibility that maybe the new system would save neither time nor money, yet still be worthwhile. Or if he did, he kept his thoughts to himself.

In fact, new research shows that technology rarely saves businesses time or money; innovations often come at considerable expense. But they do help companies do new things that would otherwise be impossible.

This may explain why productivity statistics haven’t increased along with investments in information technology. For years, economists have puzzled over what they call the productivity paradox. U.S. companies have invested billions of dollars in computer technology since the 1970s. Yet government statistics show they reaped no productivity gains until 1997. This is perplexing. We expect computers to make companies more productive by allowing workers to get more done in less time — that is, by increasing production without increasing costs.

In the last two years, productivity has risen by 2 percent. The business press has gushed that investment in computers is finally paying off. Federal Reserve Chairman Alan Greenspan recently told Congress that productivity is up and that information technology is responsible for the current economic boom.

A look at technology projects company by company, however, shows that costs are going up, not down. Companies are spending more to do more. Their technology investments are bringing not cost savings or increased productivity, but increased capabilities. They invest in technology to stay competitive and grow market share.

Consider the evidence: First of all, many large-scale information technology projects fail outright. A 1998 survey by Standish Group International, a Massachusetts research firm, found that only 26 percent of all information technology projects were considered successful by the companies that commissioned them. Some 28 percent were considered failures; 46 percent were considered “challenged” — for taking too much time, going over budget or lacking desired features and functions. Such projects are as risky as junk bonds and oil exploration, says Jack Ross, a consultant with the Chicago Research and Planning Group who studies ways to measure the “intangible” benefits of computer systems.

Typically, the projects that do succeed enhance a company’s output in some way, but at greater expense than the old system. It is rare, if not unheard of, for new technology to allow a company to do exactly what it did before, only faster and less expensively.

The Washington Water Power Company is a typical example. The utility spent four years writing a new client-server application for customer accounts, billing and meter reading. The object-oriented framework alone took a year. The project managers argued that they would save on future development costs by reusing bits of software they’d already written. They also said the system would allow the company to perform new tasks or change to a different brand of hardware.

But both these scenarios were vague and futuristic. In fact, companies that have created object-oriented software have often found that they can’t reuse it down the road. What the new system did do was offer the utility company’s customers better service. Customers phoning for information no longer had to hold while an employee retrieved information from a slow mainframe; they didn’t have to be transferred to a different department to order new service or resolve a bill. What the new system did, in effect, was help the company retain its customers.

The new system also corrected some problems caused by the limitations of the old mainframe and its terminals. The company was able to cut eight jobs through attrition. It estimated that between those jobs and a better bill-collection system, the new software would save it $500,000 a year. But the system cost $16.5 million to complete. Even without factoring in the cost of maintaining both the old and new systems during the transition, it would take 33 years for the savings to justify the cost of the new system.
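The arithmetic behind that 33-year figure is simple payback analysis. A minimal sketch, using only the dollar amounts reported above (the variable names are ours):

```python
# Back-of-the-envelope payback period for the new customer information
# system, using the figures cited in the article. Note this ignores the
# cost of running the old and new systems side by side, which would
# stretch the payback out even further.

project_cost = 16_500_000   # total cost to complete the new system, in dollars
annual_savings = 500_000    # estimated yearly savings from cut jobs and
                            # better bill collection

payback_years = project_cost / annual_savings
print(payback_years)  # 33.0
```

On those numbers alone, the system could never be justified as a money-saver within any normal planning horizon.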

The company says it needed the new software to keep its customers and stay in business in the newly deregulated energy industry. That’s probably true. It might not survive without the new system — but it certainly wasn’t going to save any money.

Consider another typical success story. PMA Insurance of Blue Bell, Pa., replaced what it admits was a sprawling, disorganized and inflexible policy-writing program with a new one that would let it write custom insurance policies. The old system couldn’t accommodate unusual data-entry fields, print non-standard forms or calculate rates for non-standard policies; the new one could do all that. But Mark Clark, PMA’s vice president of information systems, admits that the new system was more expensive than the old. The price of building the new system was less than the cost of maintaining the old one over four years, but maintenance costs and software licensing fees were higher on the new system. So, knowing it would not save money, the company went ahead with the project, hoping the system would pay for itself by helping PMA gain new customers.

Then there’s the not-so-secret history of the automated teller machine. ATMs may be saving the banks money now, but they weren’t for 10 years, says Marc Perl, an MIS manager at a large financial services company and a former Bank of America employee.

“The banks put out ATMs because they thought they’d save them money, save them tellers. But it turned out teller demand didn’t go down much,” he says. “By the late ’80s, they started to realize that the cost of maintaining the system was huge.”

Initially, the banks’ technology managers underestimated the costs, not taking into account such things as the expense of manufacturing and mailing out ATM cards. If there was a payoff, it came so late that the banks would have been better off investing their money in fixed-rate five-year certificates of deposit, Perl says.

Formal studies confirm that companies are using computers to do more, not to cut costs. “Today the focus is on customer service, quality, timeliness and innovation,” says Erik Brynjolfsson, an economist at the Massachusetts Institute of Technology. He draws on company-level data, including interviews with managers, to form a picture of productivity trends across many corporations. In the past, he says, “Everything was focused on lowering costs and increasing efficiency. Band-Aids used to be any color you wanted as long as they were tan. Now you can get them in all sorts of shapes and sizes.”

The truth is, government productivity statistics don’t reflect the contribution that computers make. The numbers don’t measure improvements in quality, innovation, flexibility or timeliness. Nor do most businesses measure these kinds of benefits when they evaluate their own computer projects.

“Often return on investment on information technology projects cannot be measured,” says Stephen J. Winterburn, PeopleSoft implementation manager at Scottish & Newcastle Retail Ltd., commenting on a survey conducted by the Harvard Business School and Cambridge Information Network, an online community for CIOs. “How do you measure improved communications between internal and external customers in hard cash?” he asks. “How do you put a cost on having Y2K compliant systems? We all know Y2K could bring down an organization, but it’s quite hard to justify the cost of ensuring that all systems are OK.”

Companies are often stuck having to justify needed technology projects by claiming they will save money even when they won’t. “The technology manager says, ‘How can we convince the board of directors to make infrastructure investments that we know are right?’” explains Ed Baum, president of Cambridge Information Network. In the CIN-Harvard survey, one-third of 140 respondents said they don’t believe the effects of information technology can be measured. But more than 80 percent of the non-believers said they do formal studies anyway on some or all of their investments.

Several years ago, Federal Express addressed the murky situation head-on with a new set of guidelines for evaluating potential projects. Now the company divides all technology proposals into one of three categories: “strategic,” “required” and “return on investment.”

Required projects are things Federal Express has to do to stay in business, such as payroll, billing and anything related to safety or government regulations. Return on investment projects must either bring in additional revenue, improve a product in a way that’s measurable, or avoid a cost in a way that can be quantified; any project that doesn’t produce a 30 percent return on investment won’t be approved. Strategic projects, however, are ones that can’t be justified by cost savings, but which are essential to Federal Express’ business, marketing, or customer service.
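Federal Express’ 30 percent hurdle for the “return on investment” category amounts to a one-line check. This is only an illustrative sketch — the function and the sample figures are hypothetical; only the 30 percent threshold comes from the company’s stated policy:

```python
# Hypothetical illustration of a 30 percent return-on-investment hurdle
# like the one Federal Express applies to its "return on investment"
# project category. The function and the numbers are invented.

def clears_roi_hurdle(annual_benefit, project_cost, hurdle=0.30):
    """True if the project's simple annual return meets the hurdle rate."""
    return annual_benefit / project_cost >= hurdle

print(clears_roi_hurdle(450_000, 1_000_000))  # True: a 45 percent return
print(clears_roi_hurdle(200_000, 1_000_000))  # False: a 20 percent return
```

A “strategic” project, by contrast, skips this test entirely; its justification is qualitative.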

“We decided to track all our packages, and we had to put a significant investment into technology to do that,” said Winn Stephenson, Federal Express’ vice president of network computing. “It was strategic because there was no way we could figure out the direct return on investment of the project, but it was the way we wanted to market our services. It had to do with how we wanted to grow our business, how we were perceived in the marketplace, and giving our customers 100 percent satisfaction, all things that are the hallmark of our service.”

The kind of technological innovations that FedEx categorizes as “strategic” are the hallmark of competitiveness across industries today. They are new products and services that don’t necessarily improve a company’s bottom line, but often bring it more customers.

“Senior management has bought off on the fact that they’re dependent on computers,” says Perl. “To run a business in the 1990s, they need it. They have no choice.”

The question remains: If computers don’t cut costs, what caused the recent uptick in productivity numbers?

MIT’s Brynjolfsson believes the statistics are based on a narrow, outmoded definition of productivity by which, he concedes, productivity could have increased. Or the government might have tweaked its measurement methods. Or it could just be that the economy is going gangbusters, he offers. Northwestern University economist Robert Gordon attributes the change solely to improvements in computer manufacturing; in a recent study, he argues that productivity elsewhere has actually declined.

It may be too early to say what caused the gain — and 2 percent is not an overwhelming change, say the experts.

But one thing is clear. In the vast majority of cases, information technology projects cost businesses more than they save. What they do instead is help companies do new things that would otherwise be impossible. Databases, networks and speedy computation make it possible for corporations to offer better customer service and new products — be they custom insurance policies or ATMs. The computer revolution of the last three decades may not have led to a massive increase in productivity, but it has made life better for many, and it has certainly helped bring about wild increases in competitiveness.

Once in a while, an innovative investment does allow a company to pull ahead of its competitors for a few years and realize a big payoff. This seems to have been the case with Federal Express’ package-tracking system. Wal-Mart is also frequently credited with beating its competitors and making huge profits for a couple of years because of an innovative computerized stock-replenishment system. “Firms that succeed in implementing these new business processes earn unusually high profits because it’s so difficult for other firms to copy them,” says Brynjolfsson. “One of the iron laws of capitalism is that profits tend to get beaten down to zero as more firms learn to implement new technology. It seems paradoxical, but profits and returns are the highest for the newest innovations and the ones that are the least understood. In chaos lies opportunity.”

True, but these days the main opportunity belongs to technology suppliers and their customers: anyone, that is, who can benefit from the increased appetite for technology or from the new goods and services that technology enables.

And there’s nothing wrong with that. Just don’t mistake it for old-fashioned, industrial-age productivity.

Cate T. Corcoran is a San Francisco freelancer who writes about business, technology, culture and media.
