Electricity consumption in data centers worldwide doubled between 2000 and 2005, but the pace of growth slowed between 2005 and 2010. This slowdown resulted from the 2008 economic crisis, the increasing use of virtualization in data centers, and the industry's efforts to improve energy efficiency. Even so, the electricity consumed by data centers globally in 2010 amounted to 1.3% of worldwide electricity use. Power consumption is now a major concern in the design and implementation of modern infrastructures because energy-related costs have become an important component of the total cost of ownership of this class of systems.

Thus, energy management is now a central issue for server and data center operations, aimed at reducing all energy-related costs: investment, operating expenses, and environmental impact. Improving energy efficiency is a major problem in cloud computing because the cost of powering and cooling a data center has been calculated to account for 53% of its total operational expenditure. At the same time, the pressure to provide services without any failure leads to continued scaling of systems at every level of the power hierarchy, from the primary feeds to the supporting equipment. To cover worst-case situations, it is normal to over-provision Power Distribution Units (PDUs), Uninterruptible Power Supply (UPS) units, and so on. For example, power over-provisioning in Google data centers has been estimated at about 40%.
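To make the over-provisioning figure concrete, here is a small illustrative calculation. The wattage numbers are hypothetical, not measurements from any real facility: the point is that power infrastructure sized for the sum of server nameplate peaks sits largely idle at typical draw.

```python
# Illustrative sketch (hypothetical numbers): why sizing the power chain
# for worst-case draw leaves a large unused margin in normal operation.

def overprovisioning_ratio(nameplate_peak_w: float, typical_draw_w: float) -> float:
    """Fraction of provisioned power capacity that sits idle at typical load."""
    return 1.0 - typical_draw_w / nameplate_peak_w

# Suppose each server's nameplate peak is 500 W, but its measured typical
# draw is only 300 W, and PDUs/UPS units are sized for the sum of nameplates:
servers = 10_000
provisioned_w = servers * 500   # 5.0 MW of power infrastructure installed
typical_w = servers * 300       # 3.0 MW actually drawn in normal operation
print(f"{overprovisioning_ratio(500, 300):.0%} of capacity unused")  # 40%
```

Under these assumed numbers, 40% of the installed power capacity is headroom that exists only to absorb a worst-case peak, which is consistent with the estimate quoted above.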

Furthermore, in an attempt to ensure the redundancy of power systems, banks of diesel generators are kept running permanently so that the system does not fail even during the moments these backup systems would need to start up. These giant generators run continuously to guarantee high availability in the event of a failure of any critical system, emitting large quantities of diesel exhaust, i.e., pollution. It is thus estimated that only about 9% of the energy consumed by data centers is actually used for computing operations; everything else is essentially spent keeping the servers ready to respond to any unforeseen power failure.

When we connect to the Internet, cyberspace can seem a lot like outer space: apparently infinite and ethereal, with information simply "out there." But once we consider the real-world energy and physical space the Internet occupies, we begin to understand that things are not so simple. Cyberspace has a very real expression in physical space, and the longer we take to change our behavior toward the Internet and to see its physical footprint clearly, the closer we come to putting our planet on a path of destruction.

Although the term is fashionable and widely used, few people seem to know what the "cloud" really is. A study by Wakefield Research for Citrix shows that there is a huge gap between what U.S. citizens do and what they say when it comes to cloud computing. The survey of more than 1,000 American adults, conducted in August 2012, showed that few average Americans know what cloud computing is.

For example, when asked what "the cloud" is, the most common answer (29%) was that it is an actual cloud, the sky, or something related to the weather. 51% of respondents believed stormy weather could interfere with cloud computing, and only 16% were able to link the term to the notion of a computer network used to store, access, and share data from Internet-connected devices. Moreover, 54% of respondents claimed never to have used the cloud, when in fact 95% of them were already using cloud services through online shopping, banking, social networking, and file sharing.

What these results suggest is that the cloud is indeed transparent to users, fulfilling one of its main functions: to provide content and services easily and immediately. However, this lack of knowledge about the computing model that supports so many everyday activities leads to growing disengagement, with a consequent erosion of concern for content security and privacy.
In reality, cyberspace is not an aseptic place filled only with accurate and useful information. Its great interest lies precisely in the social vitality it enables, built on an ever-growing range of multimedia services. Its fascination comes from acting as a technological booster for all forms of sociability, an instrument of connectivity. Cyberspace is therefore not a purely cybernetic thing but a living, chaotic, uncontrolled entity.

Beyond these concerns, others equally serious are emerging. Analyzing our daily use of these new technological tools, we must conclude that the growth of the Internet is suffocating the planet. We have to treat the CO2 emissions produced by our online activities as real costs to the planet.
We can start by showing some awareness of the problem, restricting our uploads and even removing some. Why not? What about reducing the number of photos we keep on Facebook and Instagram? Keeping them permanently available consumes energy! If no one cares about our videos on YouTube, why not delete them, or at least move them somewhere they do not need to consume energy?

We may have to go further still: if awareness and self-discipline are not enough, we must consider the possibility of charging for the sharing of large volumes of personal information. It is perhaps the only way to get most people to stop making unconscious use of the cloud, clogging it by dumping huge amounts of useless information into cyberspace. The goal is not to limit access to information, which should always remain open, but rather to encourage its proper and conscientious use.

Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. It is attractive to business owners because it eliminates the need to plan ahead for provisioning and allows enterprises to start small and increase resources only when service demand rises. Cloud computing is first and foremost a model of distributed resource management and utilization. It aims to provide convenient access from any endpoint without requiring the purchase of software, platforms, or physical network infrastructure, which are instead outsourced from third parties.

This arrangement can improve competitive advantage and flexibility, but it also brings various challenges, notably privacy and security. In cloud computing, applications, computing, and storage resources live somewhere in the network, or cloud. Users do not worry about the location and can rapidly access as much or as little of the computing, storage, and networking capacity as they wish, paying for it by how much they use, just as they would for water or electricity from utility companies. The cloud is currently based on independently operated data centers, but the idea of a unifying platform, not unlike the Internet itself, has already been proposed.

In a cloud computing environment, the traditional role of service provider is divided in two: infrastructure providers, who manage cloud platforms and lease resources according to a usage-based pricing model, and service providers, who rent resources from one or more infrastructure providers to serve end users. Cloud providers offer their services according to several fundamental models: software as a service, infrastructure as a service, platform as a service, desktop as a service, and, more recently, backend as a service.

The backend as a service model, also known as "mobile backend as a service," is a relatively recent development in cloud computing, with most commercial services dating from 2011. It gives web and mobile application developers a way to link their applications to backend cloud storage while also providing features such as user management, push notifications, and integration with social networking services. These services are delivered through custom software development kits (SDKs) and application programming interfaces (APIs). Although similar to other cloud-computing developer tools, this model is distinct in that it specifically addresses the needs of web and mobile application developers by providing a unified means of connecting their apps to cloud services. The global market for these services is estimated to be worth hundreds of millions of dollars in the coming years.
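To illustrate the call pattern such an SDK typically exposes, here is a minimal sketch. All names (`BackendClient`, `save`, `query`) are hypothetical, not any real vendor's API, and the backend is simulated in memory rather than over the network; the point is that the SDK hides the REST and authentication plumbing behind a few high-level calls.

```python
# Hypothetical sketch of a BaaS-style SDK surface. A real SDK would send
# authenticated HTTP requests to the provider; here the "backend" is a dict.
import json


class BackendClient:
    """Toy in-memory stand-in for a cloud backend (no network I/O)."""

    def __init__(self, app_id: str, api_key: str):
        # Real SDKs use these credentials to sign every request.
        self.app_id, self.api_key = app_id, api_key
        self._store: dict = {}

    def save(self, collection: str, obj: dict) -> dict:
        """Persist an object; a real SDK would POST JSON to the backend."""
        records = self._store.setdefault(collection, [])
        obj = dict(obj, id=len(records) + 1)  # backend assigns the id
        records.append(obj)
        return obj

    def query(self, collection: str, **filters) -> list:
        """Fetch objects matching simple equality filters."""
        return [o for o in self._store.get(collection, [])
                if all(o.get(k) == v for k, v in filters.items())]


client = BackendClient(app_id="demo-app", api_key="secret")
client.save("scores", {"player": "ana", "points": 120})
client.save("scores", {"player": "ben", "points": 95})
print(json.dumps(client.query("scores", player="ana")))
```

An app developer writes only calls like `save` and `query`; user management and push notifications follow the same shape, which is what makes the model attractive for mobile teams without backend expertise.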

Clearly, public cloud computing is at an early stage in its evolution. Cloud computing over the Internet is commonly called "public cloud computing"; when used within an organization's own data center, it is commonly called "private cloud computing." The difference lies in who maintains control of and responsibility for the servers, storage, and networking infrastructure, and who ensures that application service levels are met. In public cloud computing, some or all aspects of operations and management are handled by a third party "as a service," and users can access an application or computing and storage capacity over the Internet via the HTTP address of the service. All of the companies offering public cloud services have data centers; in fact, they are building some of the largest data centers in the world. They all have network architectures that demand flexibility, scalability, low operating cost, and high availability, built on products and technologies supplied by Brocade and other network vendors. These public cloud companies are building their businesses on data center designs that virtualize computing, storage, and network equipment, which is the foundation of their IT investment.