Four new trends shaping data center and cloud technologies

The emergence of use cases such as connected cars, smart homes, patient care, and the IoT spurs new trends in data center and cloud technologies.

But how will these use cases shape data center and cloud technologies in the near future?

This blog looks at the questions these use cases raise, the answers emerging from the data center and cloud domain, and the optimizations being applied by various vendors. It also analyzes new hybrid cloud trends.

Intelligent edge

Connected cars generate data indicating wear and tear on their components, on the order of 25 gigabytes per hour per car. Network bandwidth would be wasted if all this data were uploaded to the cloud for analysis. Moreover, the data in the cloud would grow into petabytes, making gaining any insight from it about as difficult as finding a needle in a haystack. Smart homes and IoT devices also generate huge amounts of data, creating a similar problem.

The solution lies in a new class of data center equipment called the intelligent edge. Intelligent edge devices collect intermediate data from endpoints such as cars, home devices, and IoT sensors. They analyze it, filter out unwanted data, and upload only the relevant information to the cloud for further analysis. The net effect is reduced network bandwidth and cloud storage consumption, which translates into substantial cost savings for vendors.
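The analyze-filter-upload flow described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's implementation: it keeps only sensor readings that deviate sharply from the rest of the stream and forwards a compact summary in place of the raw data. All names and values are invented for the example.

```python
# Edge-side filtering sketch: upload only anomalous readings plus a
# compact summary, instead of the full raw stream.
from statistics import mean, stdev

def filter_readings(readings, threshold=2.0):
    """Return readings more than `threshold` standard deviations from
    the mean -- the 'interesting' data worth uploading to the cloud."""
    if len(readings) < 2:
        return list(readings)
    mu, sigma = mean(readings), stdev(readings)
    if sigma == 0:
        return []
    return [r for r in readings if abs(r - mu) / sigma > threshold]

def summarize(readings):
    """Compact summary sent to the cloud alongside any anomalies."""
    return {"count": len(readings), "mean": mean(readings),
            "min": min(readings), "max": max(readings)}

# A mostly steady brake-temperature stream with one spike:
stream = [80.1, 79.8, 80.3, 80.0, 79.9, 80.2, 150.0, 80.1]
anomalies = filter_readings(stream)
payload = {"anomalies": anomalies, "summary": summarize(stream)}
```

Here the edge device would transmit `payload` (a handful of bytes) rather than the full hourly stream, which is the bandwidth and storage saving the trend is built on.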

An IDC report forecasts that edge intelligence and the connectivity to drive intelligent systems will attain a CAGR of 7.2% and a market size of $2.2 trillion by 2020. HPE Edgeline Systems, the Dell Edge Gateway 3001, 3002, 3003, 5000, and 5100 series, Cisco Edge Fog Fabric products, and the Calix AXOS E9-2 Intelligent Edge System are some examples of intelligent edge products.

Automated support using artificial intelligence and machine learning

Infrastructure vendors deploy thousands of devices at enterprise locations. In case of device outage, the support process moves from L1 to L2 to L3 and finally to L4, depending on the root cause. Typically, separate teams manage L1/L2 support while dedicated engineering teams manage L3/L4 support. L1/L2 issues are typically configuration and environment-related, while L3/L4 issues are more likely defects in the product. Product teams have large budgets for supporting such activities.

The new trend is to build solutions that can fully automate L1 and L2 support. Vendors use artificial intelligence and machine learning systems that consume historical device logs for initial training. In the event of a device outage, logs can be fed into the trained system, which reports the likely root cause with a certain degree of accuracy. In some cases, these solutions are based on analytics algorithms and can resolve first-level problems reported by enterprise customers without human intervention.
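To make the train-on-logs, classify-on-outage idea concrete, here is a deliberately minimal sketch of log-based root-cause triage. Real products use far richer models; the labels, log lines, and scoring scheme below are all invented for illustration.

```python
# Toy root-cause triage: build per-cause word-frequency tables from
# labeled historical logs, then score a new outage log against them.
from collections import Counter, defaultdict

def train(labeled_logs):
    """labeled_logs: list of (log_text, root_cause) pairs.
    Returns a word-frequency table per known root cause."""
    model = defaultdict(Counter)
    for text, cause in labeled_logs:
        model[cause].update(text.lower().split())
    return model

def classify(model, log_text):
    """Score each root cause by word overlap with the new log and
    return the best match plus a crude confidence value."""
    words = log_text.lower().split()
    scores = {cause: sum(freq[w] for w in words)
              for cause, freq in model.items()}
    best = max(scores, key=scores.get)
    total = sum(scores.values()) or 1
    return best, scores[best] / total

history = [
    ("link down on port eth0 cable unplugged", "cabling"),
    ("disk smart error sector reallocation failed", "disk-failure"),
    ("config mismatch vlan id not found", "misconfiguration"),
]
model = train(history)
cause, confidence = classify(model, "eth0 link down cable fault")
```

A production system would replace the word counts with a proper statistical or deep-learning model and close the loop by triggering remediation, but the shape of the workflow is the same: historical logs in, root-cause hypothesis out.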

HPE Nimble Storage’s InfoSight is a classic solution that implements L1/L2 support automation. Pure Storage’s Pure1 Meta global predictive intelligence is another example. According to MarketsandMarkets, the machine learning and artificial intelligence as-a-service market will reach $3.7 billion by 2021 at a CAGR of 43.7%, representing substantial scope for automated L1/L2 solutions.

DevOps and automation

DevOps maturity continues to improve across engineering organizations. According to Transparency Market Research, the DevOps and automation market will grow at a CAGR of 19.4% until 2020. Thanks to the increasing adoption of Continuous Integration (CI), Continuous Testing (CT), and Continuous Deployment (CD) through Build-as-a-Service, Test-as-a-Service, and Infrastructure-as-a-Service respectively, release cycles are getting shorter. Further optimizations include advisor tools that use analytics to provide insights that shorten development and test cycles. For example, when a patch is to be released, the test cases to be executed can be derived analytically instead of relying on an SME's experience. In addition, the effectiveness of test cycles and nightly tests can be gauged by whether they continue to yield defects.
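The analytics-driven test selection mentioned above can be sketched as a simple lookup over coverage data. This is a hypothetical example, assuming a mapping from source files to the tests that exercise them (built, say, from historical coverage runs); the file and test names are invented.

```python
# Test-impact analysis sketch: derive the test set for a patch from
# coverage data instead of an SME's judgment.
def select_tests(coverage_map, changed_files):
    """Return the sorted set of tests touching any changed file."""
    selected = set()
    for path in changed_files:
        selected.update(coverage_map.get(path, []))
    return sorted(selected)

# Hypothetical coverage data: source file -> tests that exercise it.
coverage_map = {
    "raid/rebuild.c":  ["test_rebuild", "test_degraded_io"],
    "net/iscsi.c":     ["test_iscsi_login", "test_degraded_io"],
    "ui/dashboard.js": ["test_dashboard_render"],
}
patch = ["raid/rebuild.c", "net/iscsi.c"]
tests = select_tests(coverage_map, patch)
```

Running only the selected tests for the patch, rather than the full suite, is where the shortened cycle comes from; the unrelated `test_dashboard_render` is skipped entirely.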

Capgemini’s QTRON is a good example of a tool that further optimizes the product life cycle and can provide 20–30% additional life-cycle gains.

Evolving Hybrid Cloud

Hybrid cloud remains in focus for enterprises and infrastructure technology vendors, with its market poised to reach $92 billion by 2021 at a CAGR of 22.5%, according to Gartner. Public clouds such as AWS and Azure have matured in terms of scalability, security, and performance. However, most enterprises are still experimenting to find the best hybrid cloud (integrated private and public cloud) option and to determine how private clouds perform against the same parameters. Many enterprises end up reinventing the wheel and have had limited success implementing hybrid cloud. Many of these implementations use open source technologies such as OpenStack, Docker, and Kubernetes, while others prefer VMware products. Microsoft and Google will inevitably play a part in the private cloud with on-premises platforms. In due course, enterprises will deploy a validated private cloud platform from a vendor along with a corresponding public cloud to form their hybrid cloud.

Summary

For the last few years, the data center and cloud market has been disrupted by large-scale adoption of the public cloud and competition from startups. The industry is in upheaval as large incumbents pursue mergers and acquisitions to leapfrog technology trends and match the speed of more nimble start-ups. Though challenging, the new use cases, the optimizations enabled by artificial intelligence and machine learning, and the continued growth of hybrid cloud will help stabilize the market and forge new business frontiers.