Without a doubt, the cloud will be the platform of choice for many new applications. But some existing programs are likely to remain in data centers for a while, said Greg Papadopoulos, chief technology officer at Sun Microsystems.

“In general, it will be really expensive and hard to move legacy pieces over,” he said. “It’s a much better strategy figuring out: What are the new pieces that I want to move to the cloud?”

The panel also discussed how cloud computing can speed the deployment of applications and reduce the cost of managing them.

That is one of the difficult lessons Ars Technica’s readers learn when they have their first experience with the technology, according to writer Jon Stokes.

One reader explained it this way: “Much like you have to make a lifestyle change instead of simply ‘going on a diet’ if you want to lose weight and keep it off, you can’t just ‘implement virtualization.’”

Readers also stressed that technology staffers must be well trained in the specific product being deployed, or problems are certain to follow. Managers, for their part, do not need product training, but they do need to know enough about the technology to make informed decisions.

If only the virtualization challenges ended with deployment. They do not, according to a recent survey.

Network Instruments found that although 75 percent of respondents were deploying virtualization solutions, many said they lacked the technology to deal with problems that arise in the new environment.

In a strange way, the issue makes sense. According to the survey, 50 percent of the organizations involved had turned to virtualization to save money. Another way to cut costs is to not buy performance management tools.

Consequently, respondents’ biggest problem was troubleshooting, with 78 percent saying they had difficulty identifying the source of problems with their virtualization systems.

One of the biggest security concerns associated with virtualization is that no one in particular is responsible for security, according to CIO magazine’s Kevin Fogarty.

In a traditional setup, a server is the responsibility of the data center where it resides. But virtualization, by design, blurs those lines, which is a good thing when it comes to managing capacity but not so good when dealing with security.

“Should the business unit that requested it be able to configure and secure it?” Fogarty asked. “Should it be the IT manager closest to the physical host? A centralized master sysadmin tasked with management and security for all the virtualized assets in an enterprise?”

Furthermore, patching and maintenance, which are essential to security, can be troublesome in a virtualized setting, he said.