Archive for the ‘Public Cloud’ Category

Even though cloud computing has enabled many organizations to improve their operations, that doesn’t mean they’re using the technology in the same way. The needs and desires of different businesses and industries require different deployment approaches, whether through public, hybrid, or private models. In addition, how users interact with the applications that run on these architectures varies considerably.

Digital information flows through different avenues.

Helping aircraft take flight
Brandon Butler, a contributor to CIO, noted that commercial aircraft manufacturer Boeing is merging the capabilities of on-premise virtualized workloads with a public cloud solution to create a hybrid environment. David Nelson, the company’s chief cloud strategist, stated that the applications the organization uses run more efficiently and serve the needs of Boeing much better than an in-house data center.

Hosted on a cloud server, one of the tools Boeing uses monitors the flight patterns of planes around the world. It incorporates both real-time and historical data, which translates to a huge amount of traffic running through the system on a consistent basis. Previously, the application ran on five laptops synced together, which required diligent cooling. Nelson stated that there was so much detail and analysis within the digital information that the machines couldn’t efficiently host the program.

One of the most interesting applications Nelson uses takes on-premise Boeing resources and merges them with a public cloud storage environment. To deliver better assistance to remote mechanics working with their machines, Boeing launched a tool that allows technicians to research materials as well as conduct and verify maintenance and repairs. In addition, Boeing aircraft specialists can contribute to the system.

“It’s seamless to the end user,” said Nelson, as quoted by the news source. “But it provides all the functionality they need.”

Although the prospect may sound far-fetched, some professionals have speculated that increased investment in cloud computing may put major software developers out of business. If you look at the situation from the perspective of an open-source developer, it makes a fair amount of sense.

Cloud servers encompass an entire computing network.

Enhanced sharing capabilities and cloud implementations go hand in hand, delivering an easier way for IT professionals to collaborate to create customer relationship management (CRM) solutions, desktop operating systems, and other deployments.

Creating a wide wake
As adoption rates increase, the market for technologies designed to complement the cloud is growing as well. For example, potential customers initially expressed concerns regarding security, primarily because they didn’t know what they were dealing with. In response, hosting companies made it a priority to enhance defense measures, and some organizations are even building entire business models around providing cloud security.

This level of response is characteristic of the cloud industry and explains why adoption rates have increased significantly. Skyhigh Networks published a Q1 2014 report based on data collected from more than 8.3 million cloud users and found that 3,571 cloud services were in use, 1,320 more than in the previous quarter.

As far as what services were being used, everything from data storage to application usage was considered. Interestingly, the study found that the average organization leverages 24 different file sharing solutions and 91 disparate collaboration programs. This statistic supports the cloud’s status as a popular service, but it also signals a call to action.

At first glance, it seems as if business leaders would perceive investing in a single, comprehensive model to be preferable to using multiple cloud hosting providers. Perhaps not enough vendors offer holistic solutions, however, or these decision-makers believe that leveraging numerous services gives them some kind of advantage.

These days, businesses are aggregating an incredible amount of data from a lot of different silos. Whether they’re using the information to create enhanced marketing campaigns, conduct research for product development, or look for a competitive edge in the market, these companies are taking whatever steps are necessary to protect that data. Between data breaches and natural occurrences like severe weather that can cause companies to lose their data, many are moving their disaster recovery initiatives to cloud servers.

A broken disk.

A practical solution
One of the most popular deployment options, public cloud models offer companies the opportunity to back up their data in encrypted, secure environments that can be accessed whenever it’s convenient. However, businesses are looking to take this capability to the next level. Redmond Channel Partner referenced a study sponsored by Microsoft titled “Cloud Backup and Disaster Recovery Meets Next-Generation Database Demands,” which was conducted between December 2013 and February 2014 by Forrester Consulting.

The research firm polled 209 organizations based in Asia, Europe, and North America, with 62 percent of survey participants consisting of large-scale enterprise IT managers. Many of the businesses reported having mission-critical databases larger than 10 terabytes. Respondents claimed that some of the top reasons for using public cloud computing models for backups included saving money on storage (61 percent) and reducing administration expenses (50 percent).

Forrester noted that a fair number of enterprises skip encrypting their database backups due to the complexity involved and the risk of data corruption. A number of participants also acknowledged that they neglect to test their disaster recovery capabilities.
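Neglected restore testing is cheap to fix in code. The sketch below is illustrative only and not drawn from the study; the function names are hypothetical. It shows a minimal automated DR check in Python: archive a directory alongside a SHA-256 checksum, then confirm the backup is both intact and actually restorable into a scratch location.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path


def archive_with_checksum(src_dir: Path, backup_path: Path) -> str:
    """Create a tar.gz backup of src_dir and return its SHA-256 checksum."""
    with tarfile.open(backup_path, "w:gz") as tar:
        tar.add(src_dir, arcname=src_dir.name)
    return hashlib.sha256(backup_path.read_bytes()).hexdigest()


def verify_restore(backup_path: Path, expected_checksum: str) -> bool:
    """A minimal DR test: the archive matches its checksum and restores."""
    actual = hashlib.sha256(backup_path.read_bytes()).hexdigest()
    if actual != expected_checksum:
        return False
    with tempfile.TemporaryDirectory() as tmp:
        with tarfile.open(backup_path, "r:gz") as tar:
            tar.extractall(tmp)  # restore into a scratch directory
        return any(Path(tmp).rglob("*"))  # something actually came back
```

A production routine would also encrypt the archive before it leaves the premises and restore into a real staging database, but even this much catches corrupt or missing backups before a disaster does.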

The available opportunities
Despite these drawbacks, Forrester’s study showed that cloud-based backup and disaster recovery (DR) models have matured over the past four years. In addition, there’s the option of using a hybrid approach that involves combining on-premise DR solutions with public cloud storage. For example, an enterprise could keep all its data in in-house databases and orchestrate a system that would either duplicate or transfer all data into a cloud storage environment in the event of a problem.
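As a rough illustration of the duplication half of that hybrid approach, a sync job only needs to copy files whose checksums differ from what the cloud side already holds. This is a sketch under assumptions: a local directory stands in for the cloud bucket (a real deployment would call a provider SDK), and the function names are hypothetical.

```python
import hashlib
import shutil
from pathlib import Path


def _digest(path: Path) -> str:
    """SHA-256 checksum used to detect changed files."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def replicate_to_cloud(local_dir: Path, cloud_dir: Path) -> list[str]:
    """Copy new or changed files from on-premise storage to the cloud
    target, returning the names of the files that were synced.
    Here cloud_dir is a plain directory standing in for a cloud bucket."""
    cloud_dir.mkdir(parents=True, exist_ok=True)
    synced = []
    for src in sorted(local_dir.iterdir()):
        if not src.is_file():
            continue
        dst = cloud_dir / src.name
        if not dst.exists() or _digest(src) != _digest(dst):
            shutil.copy2(src, dst)  # copy2 preserves timestamps
            synced.append(src.name)
    return synced
```

Run on a schedule, a job like this keeps the cloud copy current, so failing over after a problem is a matter of pointing applications at the replicated data.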

The world of cloud computing is undergoing a monumental shift. Competition among private, hybrid, and public cloud solution providers has been heating up thanks to new innovations and falling prices. As the technology becomes more affordable, small and midsize businesses are looking to capitalize on scalable storage space and flexible communications.

Employees access files and applications stored on public cloud architectures.

Anything you can do, I can do cheaper
Pedro Hernandez, a contributor to TechWeek, noted that Microsoft is making good on its promise to match the public cloud prices set by Amazon, reducing computing expenses by 35 percent and storage expenses by about 65 percent. Upon taking over as Microsoft’s CEO, Satya Nadella stated that he would spearhead a “cloud first, mobile first” business plan to integrate all of Microsoft’s products so that they work together more seamlessly.

Steven Martin, general manager for Microsoft Azure, noted that pricing is certainly a major factor in the cloud storage market, but it doesn’t necessarily guarantee a profitable result. The executive claimed Microsoft plans on investing heavily in research and development, looking for new approaches and infrastructure designs that will deliver a more secure, operable public cloud framework. In addition, the company expressed interest in searching for new partnerships in an effort to gain outside insight into an increasingly competitive market.

Getting down to specifics, Microsoft’s cost reductions will be organized around two models. “Standard” covers general-purpose virtual machines and won’t offer load balancing or auto-scaling. Although this tier comes with a 35 percent price cut, industry critics have speculated about whether the company may be sacrificing quality. The second model focuses on storage expenses, scaling down the cost of locally redundant storage by 65 percent.

IT departments feeling the heat
Amid the fluctuating marketplace, IT teams for large and midsize companies are left wondering where they should start. Many organizations encountering a high volume of data traffic often require a public environment capable of handling it all. The requirements don’t stop there, either. Employees are continuing to use mobile devices to access company documents and information so they can work on the go and out of the office more frequently than before.

GoGrid has just released its 1-Button Deploy™ of HBase, available to all customers in the US-West-1 data center. This technology makes it easy to deploy either a development or production HBase cluster on GoGrid’s high-performance infrastructure. GoGrid’s 1-Button Deploy™ technology combines the capabilities of one of the leading NoSQL databases with our expertise in building high-performance Cloud Servers.

HBase is a scalable, high-performance, open-source database. HBase is often called the Hadoop distributed database: it leverages the Hadoop framework but adds several capabilities such as real-time queries and the ability to organize data into a table-like structure. GoGrid’s 1-Button Deploy™ of HBase takes advantage of our SSD and Raw Disk Cloud Servers while making it easy to deploy a fully configured cluster. GoGrid deploys the latest Hortonworks distribution of HBase on Hadoop 2.0. If you’ve ever tried to deploy HBase or Hadoop yourself, you know it can be challenging. GoGrid’s 1-Button Deploy™ does all the heavy lifting and applies all the recommended configurations to ensure a smooth path to deployment.

Why GoGrid Cloud Servers?

SSD Cloud Servers have several high-performance characteristics. They all come with attached SSD storage and large available RAM for the I/O-intensive workloads common to HBase. The Name Nodes benefit from the large RAM options available on SSD Cloud Servers, and the Data Nodes use our Raw Disk Cloud Servers, which are configured as JBOD (Just a Bunch of Disks). This is the recommended disk configuration for Data Nodes, and GoGrid is one of the first providers to offer this configuration in a Cloud Server. Both SSD and Raw Disk Cloud Servers use a redundant 10-Gbps public and private network to ensure you have the maximum bandwidth to transfer your data. Plus, the cloud makes it easy to add more Data Nodes to your cluster as needed. You can use GoGrid’s 1-Button Deploy™ to provision either a 5-server development cluster or an 11-server production cluster with Firewall Service enabled.

Development Environments

The smallest recommended size for a development cluster is 5 servers. Although it’s possible to run HBase on a single server, you won’t be able to test failover or how data is replicated across nodes. You’ll most likely have a small database, so you won’t need as much RAM, but you’ll still benefit from SSD storage and a fast network. The Data Nodes use Raw Disk Cloud Servers and are configured with a replication factor of 3.
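For reference, a replication factor of 3 corresponds to the standard `dfs.replication` property in HDFS’s `hdfs-site.xml`. The fragment below is shown only for illustration; the 1-Button Deploy™ applies the recommended configuration for you.

```xml
<!-- hdfs-site.xml fragment: each block is stored on 3 Data Nodes -->
<property>
  <name>dfs.replication</name>
  <value>3</value>
</property>
```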