to send electricity over long distances, obviating the need for a local power plant and the people to run it.

To Carr, today's corporate data centers are the private power generators of old: inefficient, underutilized and too costly in the face of the network model of delivering IT services.

"As the technology matures and central distribution becomes possible, large-scale utility suppliers arise to displace the private providers. Although companies may take years to abandon their proprietary supply operations and all the sunk costs they represent, the savings offered by utilities eventually become too compelling to resist, even for the largest enterprises. Abandoning the old model becomes a competitive necessity," Carr wrote.

Nick Carr fires back
On Friday, the author of "The End of Corporate Computing" responded generally to IT executives' comments and criticisms.

"What we don't know is the ultimate shape of the IT utility model or the course of its development. That's what makes it so interesting--and so dangerous to current suppliers. What we do know is that the current model of private IT supply, where every company has to build and maintain its own IT power plant, is profoundly inefficient, requiring massively redundant investments in hardware, software and labor. Centralizing IT supply provides much more attractive economics, and as the necessary technologies for utility computing continue their rapid advance, the utility model will also advance. Smaller companies that lack economies of scale in their internal IT operations are currently the early adopters of the utility model, as they were for electric utilities....

"There are certainly tough challenges ahead for utility suppliers. Probably the biggest is establishing ironclad security for each individual client's data as hardware and software assets become shared. The security issue will require technological breakthroughs, and I have faith that the IT industry will achieve them, probably pretty quickly."

If technology and marketing investments are any indicator, many computing companies firmly agree that utility computing will become "too compelling to resist."

Starting in 2002 with the launch of IBM's On-Demand vision of more flexible computing, several vendors have gotten on the utility computing bandwagon. Sun used the name N1 to describe its data-center software, and Hewlett-Packard used the term Adaptive Enterprise.

However, initial efforts by both large and small technology providers--which are still in development--have primarily focused on infrastructure technology, rather than hosted services, to make corporate data centers more efficient.

Yet at the same time, there have been a growing number of Internet-delivered services aimed at corporations.

IBM offers hosted processing power and applications to companies, while Sun earlier this year launched its Sun Grid initiative, in which customers pay a flat rate of $1 per CPU per hour, a fee-for-service structure similar to those used by utility companies. Meanwhile, Salesforce.com and Google, which both deliver services via the Internet, were two of the most high-profile stock market entrants last year.
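The economics of metered pricing like Sun Grid's can be sketched in a few lines. The $1-per-CPU-hour rate comes from Sun's announcement; the fixed in-house cost and workload figures below are purely illustrative assumptions, not numbers from any vendor.

```python
# Hypothetical comparison of metered utility pricing vs. a fixed
# in-house data-center cost. Only the $1/CPU-hour rate is from
# Sun Grid; every other figure is an illustrative assumption.

CPU_HOUR_RATE = 1.00  # Sun Grid's published fee-for-service rate, in dollars


def utility_bill(cpus: int, hours: float, rate: float = CPU_HOUR_RATE) -> float:
    """Metered charge: pay only for the CPU-hours actually consumed."""
    return cpus * hours * rate


def break_even_hours(fixed_monthly_cost: float, cpus: int,
                     rate: float = CPU_HOUR_RATE) -> float:
    """Hours of use per month at which the metered cost equals a fixed cost."""
    return fixed_monthly_cost / (cpus * rate)


# A 10-CPU batch job run 40 hours in a month costs $400 on the meter.
print(utility_bill(10, 40))        # 400.0

# If owning equivalent hardware (hypothetically) costs $3,000/month,
# the meter is cheaper until utilization passes 300 hours per month.
print(break_even_hours(3000, 10))  # 300.0
```

The sketch illustrates why bursty, low-utilization workloads were the early targets for utility pricing: below the break-even point, the customer pays only for what it uses.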

Chief electricity officer?
But while utility computing is an enticing idea, holding up the electricity industry as the model for how computing should evolve doesn't sit right with all IT executives.

Peter Lee, CEO of grid software company DataSynapse, said that Carr's conclusion that the combination of virtualization, grid computing and Web services will result in utility computing is "100% spot-on." But he said the electricity industry analogy doesn't hold up entirely.

"We do not think the computing industry will eventually resemble the electricity industry as an exact parallel, because unlike electricity, there are many more variables in terms of computing power that would need to be standardized," Lee said. "Computing will, however, become much more utility-like, both in terms of pricing and in terms of on-demand power."

In his piece, Carr theorizes how the shift to utility computing could reshape the competitive forces in today's computing industry. He argues that leading "utility suppliers" of the future will either be today's large hardware providers, specialized hosting companies such as Digex, Internet outfits such as Google and Amazon, or as-yet-undiscovered start-ups.

Longtime computing industry executive Kim Polese, who is now CEO of open-source start-up SpikeSource, said that Carr's competitive analysis should factor in the effect of open source and offshore development from emerging markets, both of which are causing "huge disruptions."

"This means to me that we can't assume that competition will come from the usual places," Polese said. "The leaders of tomorrow may not even exist today, but they could grow offshore from start-up into sizable companies quickly given the strong demand for their services. The computing utility services may be arbitraged across a network of service providers, of various sizes, with pricing developed via dynamic price discovery."

Microsoft, meanwhile, is well positioned to take advantage of any move to hosted services, said Bob Muglia, senior vice president of Microsoft's Windows Server division.

"I think there will be a split. Companies will outsource things that can be very effectively run for an inexpensive price by others...On the other hand, I do think there will always be areas where people are putting in investment to drive business advantage that will either remain in-sourced or under very tight control of outsourcing--not purely hosted. There's a mixture of all these things," Muglia said. "We'll work well in both environments."

IBM's Ambuj Goyal, the general manager of IBM's Lotus division and former strategy executive in Big Blue's software group, fully buys into the notion of utility computing: He wrote a paper for IBM on the subject 10 years ago and offers hosted services for some Lotus products.

However, as with many discussions about the future, the reality will likely lie somewhere between extreme positions.

"Rather than take a 50,000-foot view...you need to get down to earth and look at individual cases," Goyal said. "A standardized utility model has a role, but what a business should do depends on each particular case."

Utility computing will only be as reliable as the network upon which it is built. What will happen when the network goes down? Will there be a disaster recovery plan? What about security? It is for these reasons that an IT staff is still needed. Until someone creates a self-sustaining network, utility computing will be a pipe dream.

The author also seems to have ignored things like security, redundancy, competitive advantage and half a dozen other things.

If your idea of computing is two clerks punching data into an Excel spreadsheet, then utility computing will probably work well for you.

I suspect that covers about 20% of the business world. The rest are using IT much more strategically. It not only doesn't make sense for the other 80% to move to online utility computing, they would be giving away market edge by doing so!
It's funny that centralized power would be the chosen analogy given the fact that many energy experts believe that future energy will be generated right where it's needed and centralized power will be obsolete.
1. Everybody wants their own custom application. There are many reasons for this, but it boils down to nobody liking to make do with a "general" application, because the people in charge of making the decision and those in charge of actually using the applications don't understand them. I've worked in several major IT departments, and in each case there was much duplication between systems due to lack of knowledge of the existing infrastructure, internal politics and general resistance to changing procedures to accommodate a general solution.

2. Inertia - Corporate data centers are large organizations within a larger organization. They will be resistant to change because it would mean their jobs. Plus, there is the cost of changeover to utility computing. That means data conversion, QA, end-user retraining and then there is the entire physical apparatus of the data center. What happens to that? Nobody is going to be quick to jump on this bandwagon.

3. Security - I think this has been covered in other comments quite well. Businesses do not want to risk their data and are willing to pay more to store it themselves.

4. Outsourcing - Another name for utility computing is outsourcing. Basically, instead of maintaining your own data infrastructure, you'd be outsourcing your IT department. Then instead of X number of companies with varying-sized IT departments, we'd have X number of companies soliciting Y number of IT providers.

One comment stated something about fungibility and that hardware was pretty much hardware. Having worked in IT departments and for a software vendor, I'd have to disagree. Sure, you can get hardware to be generally identical, but hardware is a collection of paperweights without software, be it system or application software. The hardware/software combo is complex enough that each installation is pretty much a unique entity. A great deal of hard work goes on in IT departments to standardize hardware and software throughout an enterprise environment, and there are always those machines that are an exception. Same hardware, same software, same settings, but somehow two different machines will perform differently.
I think one of the biggest hurdles to utility computing is security. The need for security will manifest itself in two ways:

1. A delay in transitioning over to utility computing: Until customers are convinced that their information will not be compromised, they will be reluctant to transfer critical/high-volume processing. One aspect of this could be the need for closed networks; electricity companies typically own their distribution networks.

2. A different breed of players: Security companies could get a foot in the door (a small foot, though) by offering security solutions for utility computing over the Internet.
Electricity is a simple, physical entity. Corporate data is complex, somewhat arbitrary (in that the corporation defines it) and needs to be consistent across the enterprise. Utility computing can only be of use where the data managed by the utility does not need to be integrated into, and consistent with, corporate data. How often does this happen? Not often.

In most medium to large corporations, it is the data and business processes embodied in the applications that matter, not the technology. That's why Carr is correct when he says that "IT Doesn't Matter" but incorrect when he claims that utility computing is the answer. It may sometimes be the answer, but not often enough to empty the corporate IT department.

What is data when it is stored on a computer? Magnetic marks when in storage and electricity when being processed. When it is being processed, it doesn't matter what it is, text, numbers, graphics, AI, sound, whatever, those are abstractions. Minor point, but I am bored.

I don't think that utility computing is ready for prime time, and I doubt it ever will be. You are still using the same amount of computing resources, so what was saved? Nothing really, except higher costs from markup and slower access and processing time.
I've been following utility computing efforts for some time. It seems that Sun's Grid is closest, with simple CPU/hour and GB/month prices of $1 each. The challenge is whether my chosen software will run on the Sun Grid. Today the answer is no, but maybe tomorrow?
Utilities deliver homogeneous products, such as electricity or gas, that are absolutely identical at each site with no customization at all.

Utility computing is just another name for Web hosting, app hosting and other centralized computing attempts. Just like those, there are many promises, but it always boils down to one thing: the costs are cheaper only if everyone uses the same app or service with little to no customization. If the thought is "if it isn't MS Office, you don't need it," then utility computing has a chance to be big. Otherwise it will become a respectable component of the IT future. It will not be the only "IT."
Companies large and small are going to move across to this model so they can forget about IT and concentrate on their core business. I think in the future companies will run on thin clients connecting to trusted online desktop services like http://www.cosmopod.com for all their computing needs.