Nick Carr has a long post up pursuing the idea that computer usage is bound to follow the model of electricity usage: moving from an all-DIY approach to a mostly metered model, driven by the overhead and waste of doing it yourself:
The energy-inefficiency of the machines themselves is compounded by the way we’ve come to use them. The reigning client-server model of business computing requires that we have far more computers than we actually need. Servers and other hardware are dedicated to running individual applications, and they’re housed in data centers constructed to serve individual companies. The fragmentation of computing has led, by necessity, to woefully low levels of capacity utilization: 10% to 30% seems to be the norm in modern data centers. Compare that to the 90% capacity utilization rates routinely achieved by mainframes, and you get a good sense of how much waste is built into business computing today. The majority of computing capacity -- and the electricity required to keep it running -- is squandered.
Well, this needs a big "it depends". For large companies with thousands of employees (or millions of users, like Google), there's certainly going to be pressure to wring inefficiency out of the system. However, there's a simple reality: most outfits simply aren't that big. If your business is small and the biggest use of a PC is to keep the books, then you really don't see those costs. The problem Carr outlines is a real one, but it's limited to a certain segment of the business population. With power, it makes a lot of sense to centralize - the mom-and-pop shop can't afford to build a coal-fired plant in the basement. They can afford the 1-2 PCs they might need, though, and selling them on a centralized vision is going to be hard.
Analogies are useful, but they only take you so far.
Technorati Tags: management