We are living in a world of unprecedented growth in the digital universe. In just over a decade, the Internet and the creation and utilization of digital data have gone from nascent to a vital part of everyday life. As a result, electronic data is growing at rates that make Moore's Law look tame: IDC projects a forty-four-fold increase in data between now and 2020, which would put the total amount of electronic data somewhere in excess of 35 zettabytes.

To manage and store all this data, we are also seeing an unprecedented global build-out of massive data centers. From co-location providers like Savvis, to Internet giants Google and Facebook, to enterprises across industries, billions of dollars are being spent to increase the data center footprint and house this tsunami of data. In fact, industry leaders estimate that nearly 450 billion dollars is spent annually on new data center facilities.

Ironically, the Internet and today's "cloud" solutions were supposed to save us from physical infrastructure build-out. While cloud computing or "as a service" products do help minimize a company's or person's need to buy hardware or software, those same cloud solutions are behind the majority of this data center explosion.

Rather than saving us, the cloud is actually a key driver of increased energy consumption, carbon footprint and other ecological problems, along with large diseconomies of scale in cost and capacity utilization, not to mention data security risks.

As Greenpeace so aptly put it in its recent report, "How Green is Your Cloud?": "There have been increasing attempts by some companies to portray the cloud as inherently 'green' despite a continued lack of transparency and very poor metrics on performance or environmental impact."

To put some statistics behind this: The Environmental Protection Agency estimates that 2% of North American energy consumption goes to servers and data centers. In addition, data centers account for approximately 2% of the global carbon footprint.

Five years ago, cloud computing consumed some 623 billion kilowatt-hours, according to Greenpeace. Current estimates put global data center electricity demand at 31 gigawatts, marking a 19% increase in 2012 alone. This compares to essentially flat demand for electricity in the general market.

The reality is that a few large companies are driving a huge percentage of this. A recent estimate by industry analysts notes that Google alone uses some 260 million watts continuously across the globe - equivalent to the power used by all the homes and businesses in a city of about 200,000 people. And that figure is nearly a year old.

It's not only the amount of energy being consumed but how these large data centers affect the flow of electricity. There is a trend of concentrating data centers in centralized areas where energy, land and bandwidth are cheaper. This concentration places significant stress on the electricity grid, which must manage how power is distributed across a region.

Looking beyond energy consumption, costs and environmental impact, there are also impacts on you, the consumer of the cloud.

For one, this data center build-out is driving a continued diseconomy of scale for cloud solutions. Why? Because these data centers are very expensive to build. And the greatest single contributor to that expense? You guessed it . . . power!

This is also ironic, since the cloud was supposed to usher in an era of low-cost IT. But as long as your cloud provider is pursuing an expensive, centralized data center model, those costs have to be passed on to the customer at some point. Sure, companies like Google can leverage their large revenue base from advertising to offset cloud prices (hence the recent launch of Google Drive at just five cents a gigabyte).

But most other vendors will have to cover their expenses through their services. This is why we see such a disparity between the price of on-premise storage devices and cloud storage: you can buy a two-terabyte hard drive for 100 dollars, but the same two terabytes in the cloud could cost you hundreds if not thousands of dollars per month.
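That price gap is easy to make concrete with a little arithmetic. The figures below are illustrative, taken from the example above rather than from any specific vendor's price list:

```python
# Rough cost comparison: owned hard drive vs. cloud storage.
# Prices are illustrative (the article's $100 / 2 TB drive example,
# and an assumed $100 per terabyte per month cloud rate).
DRIVE_PRICE_USD = 100.0            # one-time cost of a 2 TB hard drive
DRIVE_CAPACITY_TB = 2.0
CLOUD_PRICE_PER_TB_MONTH = 100.0   # assumed cloud storage price

drive_cost_per_tb = DRIVE_PRICE_USD / DRIVE_CAPACITY_TB        # paid once
cloud_cost_per_year = CLOUD_PRICE_PER_TB_MONTH * DRIVE_CAPACITY_TB * 12

print(f"Drive: ${drive_cost_per_tb:.0f} per TB, one-time")
print(f"Cloud: ${cloud_cost_per_year:.0f} per year for the same 2 TB")
```

Even at this assumed mid-range cloud price, a single year of cloud storage costs many times the one-time price of the drive - the diseconomy of scale in action.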

Data centers are also woefully underutilized. Analysts such as Gartner estimate that most servers sitting in data centers operate at 20 to 30 percent of capacity. That means up to 80 percent of storage or computing capacity sits idle. This is because companies have to build data centers for peak usage, just in case they need all that capacity. Even at 50 percent capacity, you are wasting a huge amount of server space.
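The waste compounds in the price: every idle unit of capacity still had to be built and powered, so the effective cost per useful unit scales with the inverse of utilization. A minimal sketch, using the utilization estimates cited above as assumed inputs:

```python
# How low utilization inflates the effective cost of data center capacity.
# Utilization figures are the 20-50 percent estimates cited above.
def effective_cost_multiplier(utilization: float) -> float:
    """Cost multiplier per *useful* unit of capacity at a given utilization.

    A server running at 25% capacity still incurs 100% of its build and
    power cost, so each useful unit effectively costs 1/0.25 = 4x.
    """
    if not 0.0 < utilization <= 1.0:
        raise ValueError("utilization must be in (0, 1]")
    return 1.0 / utilization

for u in (0.25, 0.5, 1.0):
    print(f"{u:.0%} utilized -> paying {effective_cost_multiplier(u):.1f}x per useful unit")
```

At the 20-30 percent utilization Gartner describes, providers are effectively paying three to five times the nominal cost for every unit of capacity actually used.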

There are also security considerations for you to think about. With a centralized data center model, your data is moving through a single pipe and stored in a single, centralized location. This makes data compromise easier. At the least, make sure you know how your provider is encrypting your data and that it remains encrypted both in motion and at rest. Also, look for key compliance mandates, such as SSAE 16 (formerly SAS 70 type 2) or PCI DSS compliance.

Finally, if we continue growing data at the current pace, we literally cannot build enough data centers to house it. We need to find an alternative course of action. One answer is to pursue a more decentralized architecture. The Internet, after all, is the largest peer-to-peer network in the world and completely decentralized. Many research institutions and universities have been working on distributed systems for years, and we are starting to see commercially viable options. Take Skype, for example, which has built the largest phone and video company using a peer-to-peer approach, with almost no physical infrastructure to support it.

With a decentralized, distributed cloud model, you are the cloud, because you help power the solution. To some, this may seem like a security risk, or simply uncomfortable, but it is the way of the future. Just as the Internet was once an unknown environment, peer-to-peer networks and decentralized systems will one day be as common and comfortable as our Gmail.

About Margaret Dawson
Margaret Dawson has more than 20 years of experience in global leadership for both start-ups and Fortune 500 technology companies, including Microsoft and Amazon.com. An avid speaker and technologist, she is a frequent author and presenter on cloud computing, data management, network security, integration and other business and technology themes, and is on the Cloud Connect Advisory Council. She is currently Vice President of Marketing for Symform, a cloud storage and backup provider. Prior to Symform, she ran product management and marketing for Hubspan, a B2B cloud integration provider. While at Microsoft, Dawson led a product management team for two network security products. She is also an active member of CloudNOW, a non-profit consortium of the leading women in cloud computing.

About Praerit Garg
Praerit Garg is the President and Co-founder of Symform. Often regarded as the "father" of distributed cloud storage, he is a proven technical leader for distributed modeling systems. Prior to Symform, he was a Senior Director in Microsoft's Server and Tools division, where he built and managed the Dynamic Systems Platform & Tools team. Under Garg's leadership, the team delivered technology and solutions across several Microsoft products, including the distributed modeling system in Visual Studio 2007. He is a co-inventor on 14 U.S. and international patents.
