Why IT Doesn't Matter Anymore

Are we spending too much on technology? This provocative Harvard Business Review excerpt suggests that IT no longer conveys competitive advantage, so invest your capital elsewhere.

In 1968, a young Intel engineer named Ted Hoff found a way to put the circuits necessary for computer processing onto a tiny piece of silicon. His invention of the microprocessor spurred a series of technological breakthroughs—desktop computers, local and wide area networks, enterprise software, and the Internet—that have transformed the business world. Today, no one would dispute that information technology has become the backbone of commerce. It underpins the operations of individual companies, ties together far-flung supply chains, and, increasingly, links businesses to the customers they serve. Hardly a dollar or a euro changes hands anymore without the aid of computer systems.

As IT's power and presence have expanded, companies have come to view it as a resource ever more critical to their success, a fact clearly reflected in their spending habits. In 1965, according to a study by the U.S. Department of Commerce's Bureau of Economic Analysis, less than 5 percent of the capital expenditures of American companies went to information technology. After the introduction of the personal computer in the early 1980s, that percentage rose to 15 percent. By the early 1990s, it had reached more than 30 percent, and by the end of the decade it had hit nearly 50 percent. Even with the recent sluggishness in technology spending, businesses around the world continue to spend well over $2 trillion a year on IT.

But the veneration of IT goes much deeper than dollars. It is evident as well in the shifting attitudes of top managers. Twenty years ago, most executives looked down on computers as proletarian tools—glorified typewriters and calculators—best relegated to low-level employees like secretaries, analysts, and technicians.
It was the rare executive who would let his fingers touch a keyboard, much less incorporate information technology into his strategic thinking. Today, that has changed completely. Chief executives now routinely talk about the strategic value of information technology, about how they can use IT to gain a competitive edge, about the "digitization" of their business models. Most have appointed chief information officers to their senior management teams, and many have hired strategy consulting firms to provide fresh ideas on how to leverage their IT investments for differentiation and advantage.

Behind the change in thinking lies a simple assumption: that as IT's potency and ubiquity have increased, so too has its strategic value. It's a reasonable assumption, even an intuitive one. But it's mistaken. What makes a resource truly strategic—what gives it the capacity to be the basis for a sustained competitive advantage—is not ubiquity but scarcity. You only gain an edge over rivals by having or doing something that they can't have or do. By now, the core functions of IT—data storage, data processing, and data transport—have become available and affordable to all. Their very power and presence have begun to transform them from potentially strategic resources into commodity factors of production. They are becoming costs of doing business that must be paid by all but provide distinction to none.

IT is best seen as the latest in a series of broadly adopted technologies that have reshaped industry over the past two centuries—from the steam engine and the railroad to the telegraph and the telephone to the electric generator and the internal combustion engine. For a brief period, as they were being built into the infrastructure of commerce, all these technologies opened opportunities for forward-looking companies to gain real advantages. But as their availability increased and their cost decreased—as they became ubiquitous—they became commodity inputs.
From a strategic standpoint, they became invisible; they no longer mattered. That is exactly what is happening to information technology today, and the implications for corporate IT management are profound.

From offense to defense

In the long run, though, the greatest IT risk facing most companies is more prosaic than a catastrophe. It is, simply, overspending. IT may be a commodity, and its costs may fall rapidly enough to ensure that any new capabilities are quickly shared, but the very fact that it is entwined with so many business functions means that it will continue to consume a large portion of corporate spending. For most companies, just staying in business will require big outlays for IT. What's important—and this holds true for any commodity input—is to be able to separate essential investments from ones that are discretionary, unnecessary, or even counterproductive.

At a high level, stronger cost management requires more rigor in evaluating expected returns from systems investments, more creativity in exploring simpler and cheaper alternatives, and a greater openness to outsourcing and other partnerships. But most companies can also reap significant savings by simply cutting out waste.

Personal computers are a good example. Every year, businesses purchase more than 100 million PCs, most of which replace older models. Yet the vast majority of workers who use PCs rely on only a few simple applications—word processing, spreadsheets, e-mail, and Web browsing. These applications have been technologically mature for years; they require only a fraction of the computing power provided by today's microprocessors. Nevertheless, companies continue to roll out across-the-board hardware and software upgrades. Much of that spending, if truth be told, is driven by vendors' strategies. Big hardware and software suppliers have become very good at parceling out new features and capabilities in ways that force companies into buying new computers, applications, and networking equipment much more frequently than they need to.
The time has come for IT buyers to throw their weight around, to negotiate contracts that ensure the long-term usefulness of their PC investments and impose hard limits on upgrade costs. And if vendors balk, companies should be willing to explore cheaper solutions, including open-source applications and bare-bones network PCs, even if it means sacrificing features. If a company needs evidence of the kind of money that might be saved, it need only look at Microsoft's profit margin.

In addition to being passive in their purchasing, companies have been sloppy in their use of IT. That's particularly true with data storage, which has come to account for more than half of many companies' IT expenditures. The bulk of what's being stored on corporate networks has little to do with making products or serving customers—it consists of employees' saved e-mails and files, including terabytes of spam, MP3s, and video clips. Computerworld estimates that as much as 70 percent of the storage capacity of a typical Windows network is wasted—an enormous unnecessary expense. Restricting employees' ability to save files indiscriminately and indefinitely may seem distasteful to many managers, but it can have a real impact on the bottom line. Now that IT has become the dominant capital expense for most businesses, there's no excuse for waste and sloppiness.

Given the rapid pace of technology's advance, delaying IT investments can be another powerful way to cut costs—while also reducing a firm's chance of being saddled with buggy or soon-to-be-obsolete technology. Many companies, particularly during the 1990s, rushed their IT investments either because they hoped to capture a first-mover advantage or because they feared being left behind. Except in very rare cases, both the hope and the fear were unwarranted. The smartest users of technology—here again, Dell and Wal-Mart stand out—stay well back from the cutting edge, waiting to make purchases until standards and best practices solidify.
They let their impatient competitors shoulder the high costs of experimentation, and then they sweep past them, spending less and getting more.

Some managers may worry that being stingy with IT dollars will damage their competitive positions. But studies of corporate IT spending consistently show that greater expenditures rarely translate into superior financial results. In fact, the opposite is usually true. In 2002, the consulting firm Alinean compared the IT expenditures and the financial results of 7,500 large U.S. companies and discovered that the top performers tended to be among the most tightfisted. The twenty-five companies that delivered the highest economic returns, for example, spent on average just 0.8 percent of their revenues on IT, while the typical company spent 3.7 percent. A recent study by Forrester Research showed, similarly, that the most lavish spenders on IT rarely post the best results. Even Oracle's Larry Ellison, one of the great technology salesmen, admitted in a recent interview that "most companies spend too much [on IT] and get very little in return."

As the opportunities for IT-based advantage continue to narrow, the penalties for overspending will only grow. IT management should, frankly, become boring. The key to success, for the vast majority of companies, is no longer to seek advantage aggressively but to manage costs and risks meticulously. If, like many executives, you've begun to take a more defensive posture toward IT in the last two years, spending more frugally and thinking more pragmatically, you're already on the right course. The challenge will be to maintain that discipline when the business cycle strengthens and the chorus of hype about IT's strategic value rises anew.