Got a plan for your move to the cloud? If you do, you’re in the minority. According to a recent study, a mere 31 percent of those planning a move to the cloud actually have a strategy for migrating data and applications. The rest don’t.

To avoid unnecessary complexity and cost in your own migration, we recommend heeding the following advice. It’s a fraction of the many recommendations we received from cloud experts, who offered well-seasoned tips on what to do (and what not to do) when you begin planning to move workloads to the cloud.

1. Have a real reason for the migration

“Cloud strategy is often not planned, but rather executed from impulses and desires of certain key individuals,” noted Bob Green (@BobGreenCPACITP), lead partner, business risk and technology services for SingerLewak.

“You need to have a good reason to migrate existing or develop new workloads to the cloud,” said Terence Ngai (@TerenceCNgai), Head of Cloud Delivery Management for HP. “Don’t be fluffy on the metrics. Be clear on what and how you’re measuring progress and success. You need concrete metrics to show success and credibility of your cloud initiatives.”

“A good CIO serves his or her organization well by taking smart, calculated risks that hold the best potential to move the organization forward,” explained Reed Sheard, vice president for college advancement and CIO of Westmont College.

“Illustrate that the move to the cloud is not simply a lateral shifting of data from bucket A to bucket B, but is instead a tangible demonstration of a company’s progress,” said Steve Prentice (@stevenprentice), senior writer for CloudTweaks. “It’s not just data that’s moving forward; it’s leadership too.”

2. Vet the cloud provider

“A cloud is not an outsourced data center. It’s a collection of compute, network and storage infrastructure that is provisioned and managed via APIs,” noted Sravish Sridhar (@sravish), CEO of Kinvey. “Make sure your cloud provider has a robust set of APIs to control the infrastructure so that you can be truly elastic and cost efficient.”
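Sridhar’s point about API-driven elasticity can be sketched in a few lines. The `CloudClient` class below is a hypothetical stand-in for a provider SDK, not a real API; the point is that when infrastructure is provisioned via APIs, capacity can track demand automatically.

```python
# Sketch of API-driven elasticity. CloudClient is an illustrative stand-in
# for a provider SDK, not a real API.

class CloudClient:
    """In-memory stand-in for infrastructure provisioned and managed via APIs."""
    def __init__(self):
        self.instances = []

    def list_instances(self):
        return list(self.instances)

    def launch_instance(self, size):
        self.instances.append({"size": size, "state": "running"})

    def terminate_instance(self):
        if self.instances:
            self.instances.pop()

def scale_to_load(client, queue_depth, per_instance_capacity=100):
    """Add or remove instances so capacity tracks demand (this is elasticity)."""
    needed = max(1, -(-queue_depth // per_instance_capacity))  # ceiling division
    while len(client.list_instances()) < needed:
        client.launch_instance(size="medium")
    while len(client.list_instances()) > needed:
        client.terminate_instance()
    return len(client.list_instances())

client = CloudClient()
print(scale_to_load(client, queue_depth=350))  # 4 instances for 350 queued jobs
print(scale_to_load(client, queue_depth=50))   # scales back down to 1
```

Without that API surface, every scale-up is a ticket and a wait; with it, the loop above can run on a schedule.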

“You want a secure, integrated, centrally managed and easy-to-use environment, with SLAs around availability and performance, especially at peak demand,” said Azmi Jafarey, CIO at Ipswitch.

“Evaluate providers carefully using a comprehensive framework such as the one at CSMIC,” advised Scott Feuless (@ISG_News), principal consultant with Information Services Group (ISG). “Comparing them is not for the faint of heart. Get help if you need it.”
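A comparison framework of the kind Feuless describes usually boils down to a weighted scorecard. The criteria, weights and ratings below are invented for illustration; a framework like CSMIC’s would define its own measures.

```python
# Sketch of a weighted provider scorecard. Criteria, weights and ratings
# are hypothetical placeholders; substitute your own evaluation framework.

WEIGHTS = {"security": 0.3, "performance": 0.25, "cost": 0.25, "support": 0.2}

def score(ratings):
    """Weighted sum of 1-10 ratings per criterion; higher is better."""
    return sum(WEIGHTS[c] * r for c, r in ratings.items())

providers = {
    "Provider A": {"security": 8, "performance": 7, "cost": 5, "support": 9},
    "Provider B": {"security": 6, "performance": 9, "cost": 8, "support": 6},
}
for name, ratings in providers.items():
    print(name, round(score(ratings), 2))
```

The value is less in the final number than in forcing every provider to be rated against the same criteria.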

3. Conduct a cloud impact analysis

“Almost every application migrated to a cloud service has connections with various other applications and systems,” said Pierluigi Paganini (@SecurityAffairs), security researcher for the InfoSec Institute. “It is crucial to preventively evaluate the impact of the migration on these connections.”

“Ensure that workloads are cloud ready,” said Mike Matchett (@smworldbigdata), senior analyst and consultant for Taneja Group. “A thorough cloud review can help identify where complex applications need to be updated to make them both safe and efficient during cloud execution.”

4. Test requirements first

“Running a pilot with an initial provider of interest is a great way to get a deeper understanding of both the provider and the process,” said Ben Trowbridge (@Ben_Trowbridge), founder of Alsbridge.

“Each application has different security requirements, workload profiles, performance requirements, availability requirements, transition requirements, elasticity requirements, technology compatibility requirements, bandwidth requirements and more,” said ISG’s Feuless. “The opportunities to generate revenue and drive efficiency into the business by leveraging the unique capabilities of the cloud also vary by application.”

Ian Apperley (@ianapperley), writer and ICT consultant at whatisitwellington, recommends you “create a ‘Cloud Beachhead’ and then target specific workloads that can be migrated into that environment. The beachhead is the environment that will include security, identity management and other basic ICT functions, which the workloads will need later on.”

“Cloud users can capture the cloud’s economies of scale, save time and limit capital spending by using cloud as an application design center, rather than rearchitecting an existing application to fit cloud,” said Kerpan, who recommends you refine the cloud application once the basic elements of your cloud structure are built.

“With the ubiquity of operations in the cloud, there is a really good chance there is an existing solution that your implementation team can copy and iterate off of to achieve the desired result,” said Scott Teger (@scottteger), vice president of operations for 36 Labs.

6. Get a firm grasp on pricing

“The move to the cloud is about money,” said Bob Plankers (@plankers), a blogger at The Lone Sysadmin. “Know your costs. The hard costs of infrastructure are easy to compute, adding up the money spent on licenses and servers and vendor support. But where is your team’s terrifically expensive staff time spent? How much time is spent keeping legacy systems alive? Or working around problems, double-entering payroll data, manually deleting HR records, restoring files from backup, fixing the HR app icon on 50 desktops and so on? An organization that has visibility into these types of costs can do apples-to-apples comparisons with the costs of hosted, SaaS-type solutions, making the path to the cloud much clearer.”
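Plankers’ apples-to-apples comparison is straightforward arithmetic once the soft costs are visible. Every figure below is a hypothetical placeholder; the point is that the staff-time line often dwarfs the hardware line.

```python
# Sketch of an annual cost comparison. All dollar figures are hypothetical
# placeholders; substitute your own licenses, rates and seat counts.

def on_prem_annual_cost(licenses, servers, vendor_support,
                        staff_hours_per_week, loaded_hourly_rate):
    """Hard costs plus the staff time spent keeping legacy systems alive."""
    hard = licenses + servers + vendor_support
    soft = staff_hours_per_week * 52 * loaded_hourly_rate
    return hard + soft

legacy = on_prem_annual_cost(licenses=20_000, servers=35_000,
                             vendor_support=10_000,
                             staff_hours_per_week=15,
                             loaded_hourly_rate=85)
saas = 40 * 90 * 12  # 40 seats at a hypothetical $90 per user per month
print(legacy, saas)  # 131300 vs 43200 per year in this made-up scenario
```

Note that the soft cost (15 hours a week of staff time) contributes more here than all the hard costs combined, which is exactly the visibility Plankers is arguing for.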

“If you are already multisite with a hosted infrastructure then cloud vs. hosting is mainly a financial move, so make sure to bring finance in to figure out how to structure the deal,” said Jeffrey Bolden (@jbolden1517), managing partner for Blue Lotus SIDC. “Your finance people may want to structure the deal where you own the servers (with you purchasing in advance or the cloud provider financing) and you sell them back to the cloud provider at the end of the term so you can take depreciation.”

In addition, as you’re vetting cloud providers, Robert Moulton (@Seven10Software), CEO of Seven10 Storage Software, recommends you “choose cloud storage offerings that offer multiple layers of security and trust services with the ability to enforce and audit policy on the workloads and data they are storing within a cloud storage environment.”

Sanabria tells the sad tale of Code Spaces, whose unprotected AWS console was held ransom by attackers. When Code Spaces wouldn’t pay up, the attackers deleted all of their data, thus forcing Code Spaces to close shop.

9. Be wary of cloud lock-in

“There is no point making a big effort to move a workload into a cloud only to have it locked in to a specific cloud provider,” said Taneja Group’s Matchett. “Make sure that you can migrate both data and workloads, using virtualization, fluid migration or container-like solutions if your goal is to take advantage of cloud brokerage opportunities.”

Other experts suggested you simply accept that some lock-in is inevitable: the databases you choose and the level of services each provider offers will differ from one cloud to another. Understand that and make decisions accordingly.

“While the CentOS at Rackspace is the same as CentOS at AWS, infrastructure services will differ,” said Mark Herschberg (@madisonlogic), CTO of Madison Logic.

Ask yourself, “Are you making a set of decisions that will cause your code base to only ever be happy on one vendor’s offering?” said Todd Graham (@bluenoseinc), co-founder and CTO of Bluenose. “There are plenty of good reasons to embrace a specific vendor and their tools, but make those decisions actively and ensure your organization is aware of the debt you’re willing to take on vs. avoid.”
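One common way to keep a code base from being happy only on one vendor’s offering is a thin portability layer: application code talks to an interface, and vendor-specific calls live behind it. The backends below are in-memory stand-ins, not real SDKs.

```python
# Sketch of isolating provider-specific calls behind a thin interface.
# VendorAStore and VendorBStore are in-memory stand-ins, not real SDKs.

from abc import ABC, abstractmethod

class ObjectStore(ABC):
    @abstractmethod
    def put(self, key, data): ...
    @abstractmethod
    def get(self, key): ...

class VendorAStore(ObjectStore):
    def __init__(self): self._data = {}
    def put(self, key, data): self._data[key] = data  # would call vendor A's SDK
    def get(self, key): return self._data[key]

class VendorBStore(ObjectStore):
    def __init__(self): self._data = {}
    def put(self, key, data): self._data[key] = data  # would call vendor B's SDK
    def get(self, key): return self._data[key]

def archive_report(store: ObjectStore, name, body):
    """Application code never names a vendor, only the interface."""
    store.put(f"reports/{name}", body)
    return store.get(f"reports/{name}")

print(archive_report(VendorAStore(), "q1", b"totals"))
print(archive_report(VendorBStore(), "q1", b"totals"))  # same code, new vendor
```

The abstraction has a cost, which is Graham’s point: take on the coupling deliberately where a vendor’s unique tools are worth it, and fence it off where they are not.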

“Consider using this situation of ‘cloud lock-in’ to your advantage,” suggested Bruno Scap (@MaseratiGTSport), president of Galeas Consulting. “Explain to the vendor that you understand that your organization will get locked in, and that you are willing to allow it if you receive a discount in return.”

10. Train your staff before and after

“Outsourcing is tempting, just make sure you don’t give up institutional knowledge in the process,” advised Mike Vitale (@TalkPointDotCom), CTO of TalkPoint. “You or someone on your team needs to know how the service works. If the cloud provider offers training sessions, take them up on it.”

“The skills required to migrate an application to the cloud are very different from the skills required to keep it running once there,” said Orchestratus’ Swidler. “Invest in bringing your operations teams up to speed well in advance of the switchover, so they can support the service quality levels your customers expect.”

“Interlace the actual work to move workloads with incremental training as you progress,” suggested Paul Martine (@citrix), CIO for Citrix. “It’s the best way to allow your team to be most effective during the transition.”

11. Don’t go it alone

“There are few things that will cost you more than hiring or dedicating an inexpert developer or sysadmin to perform the move and later having to undo the damage,” said Ryan O’Hara, lead tech, professional services at Linode. “Getting things done right the first time is key.”

“Find an expert consultant who can decipher the tidal wave of cloud options and help you decide which one is going to be best for your company,” advised Scott Maurice (@scottjmaurice), managing partner for Avail Partners. “Look for someone who shops the marketplace daily for cloud service providers.”

“There are a lot of smart people out there who have great experience the last few years moving workloads to the cloud,” said Jonathan Alexander (@vocalocity), CTO for Vonage Business Solutions. “You need these people, you want these people. Have them come train your people on how to do it, then have them come audit progress and provide additional guidance at intervals along the way.”

12. Rethink backup and business continuity

“Cloud infrastructure has downtime too,” noted Kinvey’s Sridhar. “Ensure you have a backup and disaster recovery strategy.”

“Get intimately familiar with the shared responsibility model,” advised Tal Klein (@VirtualTal), vice president of strategy at Adallom. “Cloud adoption is a collaboration between vendor and customer. Design a business continuity plan for the cascading effects of inevitable events like outages and breaches—because recovery from such events in cloud services is vastly different than on-premises.”

“Your chosen IaaS provider may offer what you consider a significant number of POPs, but no matter how large that number, they’ll never be as distributed as your end users,” noted Gary Ballabio (@ballabio), product line director, Enterprise Cloud Solutions for Akamai. “Downtime is a reality. Architect applications to allow for failover to alternate regions, or even alternative cloud providers if possible to ensure availability.”
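The failover pattern Ballabio describes can be sketched as an ordered list of regions tried until one answers. The region names and the simulated outage below are hypothetical; a real implementation would probe actual endpoints.

```python
# Sketch of failing over to an alternate region when the primary is down.
# Region names and the fake health check are hypothetical stand-ins.

def fetch_with_failover(regions, fetch):
    """Try each region in order; return the first successful response."""
    last_error = None
    for region in regions:
        try:
            return region, fetch(region)
        except ConnectionError as exc:
            last_error = exc  # record the failure and try the next region
    raise RuntimeError("all regions unavailable") from last_error

def fake_fetch(region):
    """Simulate an outage in the primary region."""
    if region == "us-east":
        raise ConnectionError("us-east unreachable")
    return f"200 OK from {region}"

print(fetch_with_failover(["us-east", "eu-west"], fake_fetch))
# ('eu-west', '200 OK from eu-west')
```

The application has to be architected for this from the start: state must be replicated to the alternate region, or the failover only moves the outage.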

13. Start small

“Low- to medium-security workloads, those without stringent latency requirements, and where the workload is elastic with variable traffic, will work well,” said Ipswitch’s Jafarey.

“These small applications give business and technology employees confidence in their ability to use and move business apps to a cloud environment,” said Bill Schrier (@BillSchrier), senior policy advisor for the State of Washington Office of the CIO.

14. Specify parameters of ownership

“Choose a cloud provider that allows you to select the jurisdiction in which your data is stored,” advised Ajay Patel (@ajayhighq), co-founder and CEO of HighQ.

“When you use the cloud you are ‘outsourcing’ your compute and your data to a provider’s infrastructure, but you must not ‘outsource’ the ownership of your data,” said Gilad Parann-Nissany (@Porticor), founder and CEO of Porticor.

15. Your migration can be delayed by months due to bandwidth

“If you’re moving terabytes or petabytes of data, remember that will take time. Probably on the order of days,” warned Madison Logic’s Herschberg.
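Herschberg’s warning is easy to quantify with back-of-the-envelope arithmetic. The data size, link speed and efficiency factor below are example figures; plug in your own.

```python
# Sketch of a bulk-transfer time estimate. The 10 TB / 100 Mbps figures
# are examples; the 0.8 efficiency factor is a rough allowance for overhead.

def transfer_days(data_terabytes, link_mbps, efficiency=0.8):
    """Days to move data over a link, allowing for protocol overhead."""
    bits = data_terabytes * 1e12 * 8
    seconds = bits / (link_mbps * 1e6 * efficiency)
    return seconds / 86400

print(round(transfer_days(10, 100), 1))  # 10 TB over 100 Mbps: about 11.6 days
```

At petabyte scale the same arithmetic yields years, not days, which is why providers offer physical shipment options and why Clyde (below) suggests hybrid approaches for large data sets.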

“Have a contingency or business continuity plan in place in case the ‘move’ to cloud takes longer than expected or doesn’t go according to plan,” advised Alex Rayter, principal for Phoenix 2.0.

“Look hard at data migration time and costs. It may not be practical to move large data sets (terabytes to petabytes) to the public cloud, making it difficult to move those workloads,” said Rob Clyde (@AdaptiveMoab), international vice president of ISACA and CEO of Adaptive Computing. “A private cloud or hybrid cloud approach is probably more appropriate in such cases.”

16. Automate, automate, automate

“Companies should strive to automate as many of the essential migration project steps as possible—including sales, planning, migration and on-site phases,” said Todd Schwartz (@GetSkyKick), co-founder and co-CEO of SkyKick. “Doing so will help reduce project risk and complexity, and simplify the management of the migration project.”

“The primary benefit of the cloud is the ability for your infrastructure to be mapped into code,” said Adam Duro (@ZehnerGroup), CIO at ZehnerGroup. “Without a coded configuration, every buildup and breakdown will require redundant and manual work by your engineers.”

“Automate everything you can, as you may have to do this several times over the lifecycle of your service,” advised Christian Verstraete (@christianve), Chief Technologist, Cloud, for HP.
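Duro’s phrase “infrastructure mapped into code” usually means a declarative configuration that a reconciliation loop applies, so a buildup or breakdown is a repeatable run rather than manual work. The sketch below uses an in-memory dict as the “provider”; a real run would call a provisioning API.

```python
# Sketch of infrastructure-as-code reconciliation. The environment is an
# in-memory dict; a real implementation would call a provisioning API.

DESIRED = {                      # declarative config, kept in version control
    "web": {"count": 2, "size": "small"},
    "db":  {"count": 1, "size": "large"},
}

def reconcile(desired, actual):
    """Make the actual environment match the coded configuration."""
    actions = []
    for name, spec in desired.items():
        have = actual.get(name, {"count": 0})["count"]
        if have < spec["count"]:
            actions.append(f"create {spec['count'] - have} x {name}")
        elif have > spec["count"]:
            actions.append(f"destroy {have - spec['count']} x {name}")
        actual[name] = dict(spec)   # apply the desired state
    return actions

env = {"web": {"count": 1, "size": "small"}}
print(reconcile(DESIRED, env))  # ['create 1 x web', 'create 1 x db']
print(reconcile(DESIRED, env))  # [] -- a second run is a no-op (idempotent)
```

Idempotence is what makes Verstraete’s “you may have to do this several times” painless: rerunning the same configuration changes nothing if the environment already matches.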

Conclusion: Understand the opportunities of the cloud

The one piece of advice we heard from every expert is that cloud adoption is a journey and you should not expect to fully understand it on day one, day 23 or day 223. It’s an evolving process, and sharing knowledge with others who are ahead of you in the journey will be to your great benefit.

If you’re already a few steps into the cloud migration path, please share your experience in the comments below. Thanks.

Gilad Parann-Nissany, founder and CEO of Porticor, is a pioneer of cloud computing. He built SaaS clouds for small and medium enterprises at SAP (as CTO, Small Business), contributing to several SAP products that reached more than 8 million users. More recently he created a consumer cloud at G.ho.st, a cloud operating system that delighted hundreds of thousands of users while providing browser-based and mobile access to data, people and a variety of cloud-based applications. He is now CEO of Porticor, a leader in virtual privacy and cloud security.
