Using Cloud and Virtualization to Deliver Next-Generation Workloads

As the days of the PC wind down and more users embrace mobility and device agnosticism, organizations will need to utilize advanced cloud and virtualization methods to deliver new types of workloads to the user. This means distributing data center resources more widely, utilizing more cloud resources, and taking your virtual infrastructure to the next level.

So what are cloud and virtualization helping deliver? What are some of these new applications and workloads that require additional resources? How will all of this impact the user, today and in the future? To better understand the cloud, data center and virtualization infrastructure, it's important to look at what the end-user is actually using and the implications for your environment.

Delivering Rich Content and Media. It's come down to much more than just application delivery. We are now sharing massive amounts of data, video and other types of rich content. This is all being pushed to mobile devices with an ever-present connection to the cloud. By having a back-end virtual platform capable of handling vast numbers of users, organizations are able to deliver such content. The great part is that supportive infrastructure has really come a long way. WAN optimization can now be physical or virtual, located at various points in the cloud architecture. Couple this with adaptive user experience orchestration and you create a dynamic platform for rich-media delivery. Adaptive orchestration is able to analyze the inbound connecting device, understand the hardware, check the latency, and optimize the experience based on numerous metrics to ensure the best possible experience. This is all done seamlessly by intelligent cloud systems.
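The decision logic behind adaptive orchestration can be sketched roughly as follows. This is a minimal, hypothetical illustration: the device attributes, latency thresholds, and delivery profiles are illustrative assumptions, not any vendor's actual implementation.

```python
# Hypothetical sketch of adaptive user-experience orchestration.
# All names, thresholds, and profile values here are assumptions
# chosen to illustrate the idea, not a real orchestration API.
from dataclasses import dataclass


@dataclass
class DeviceContext:
    device_type: str   # e.g. "tablet", "zero-client", "desktop"
    screen_width: int  # display width in pixels
    latency_ms: float  # measured round-trip latency to the edge


def choose_delivery_profile(ctx: DeviceContext) -> dict:
    """Pick a delivery profile based on the connecting device and link quality."""
    if ctx.latency_ms > 150:
        # High-latency link: favor a lower bitrate to keep the session responsive.
        return {"codec": "h264", "bitrate_kbps": 800, "resolution": "720p"}
    if ctx.device_type == "zero-client" or ctx.screen_width < 1280:
        # Small or thin end-point: a mid-tier stream is sufficient.
        return {"codec": "h264", "bitrate_kbps": 1500, "resolution": "720p"}
    # Low latency and a capable display: deliver the richest stream.
    return {"codec": "h264", "bitrate_kbps": 4000, "resolution": "1080p"}


profile = choose_delivery_profile(DeviceContext("tablet", 1024, 40.0))
print(profile)
```

In a real platform these checks would run continuously and feed into the WAN optimization layer, but the shape of the decision is the same: inspect the device, measure the link, and tailor the stream.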

Utilizing Converged Infrastructure. Delivering next-generation workloads requires the power of a next-generation data center. As it becomes the modern home of everything, the data center platform had to evolve to handle new types of cloud designs and workload delivery models. Part of this evolution revolved around the server infrastructure that supported these workloads. Now, there are platforms that completely unify storage, networking and compute in one intelligent hardware plane. These platforms incorporate software-defined technologies and advanced levels of integration to deliver new types of workloads. These converged systems are capable of higher density and greater user multi-tenancy. All of this translates to better resource utilization while still delivering complex workloads.

Evolving the end-point. There's a reason that mobile devices and tablets have been doing so well. The user has become more mobile and has created a data-on-demand generation. This evolution of how we compute forced a change in the end-user device model. Devices are smarter, more compact and a lot more versatile than a standard desktop. Zero-clients built around system-on-chip (SoC) technologies are allowing corporations to completely re-define their end-point. These zero-client devices take minimal power, allow for a rip/replace methodology, and drastically reduce maintenance time and costs. Plus, greater resource availability allows administrators to stream a rich experience to the user even with these tiny hardware footprints. Also, take a look at what Chromebooks are doing with the concept of a completely web-enabled OS and device. This is an open architecture where both corporate and personal applications can be delivered. Ultimately, the device only supplies the hardware resources, but can still provide a very rich experience for the user. The really interesting part is that more applications are being delivered to these types of end-points, but without the use of any clients. New technologies around HTML5 are breaking the barriers in how applications (regardless of the app's DNA) can be delivered to web-ready devices. The future will see device and application agnosticism as users will be able to utilize intelligent web-ready resources to access their applications.

Next-generation workloads, data, and applications. The infrastructure that supports new applications and data delivery methods will continue to evolve. With that, applications and how they interconnect with their respective resources will evolve as well. Take, for example, the Kinetic Open Cloud Storage Platform from Seagate. This intelligent platform eliminates the storage server tier of traditional data center architectures by enabling applications to speak directly to the storage device, thereby reducing expenses associated with the acquisition, deployment, and support of hyperscale storage infrastructures. We’re seeing this happen with other resources too. APIs have come a long way in helping applications better integrate with the cloud. Cloud-ready APIs can help applications and other resources eliminate entire layers to increase communication efficiency. The ability to cross-connect between various resources, applications, and even entire data centers is a critical piece in delivering new types of workloads to the end-user.
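The idea of applications speaking directly to a storage device can be sketched as a simple key/value interface. The `KeyValueDrive` class below is a stand-in (backed by an in-memory dictionary), not Seagate's actual client library; it only illustrates the shape of the interaction, where the application addresses the drive itself rather than going through a storage server tier.

```python
# Conceptual sketch of an application talking to a key/value storage device
# directly, in the spirit of Kinetic-style drives. This class is a hypothetical
# stand-in backed by an in-memory dict, not a real drive client.
from typing import Optional


class KeyValueDrive:
    def __init__(self, address: str):
        # In a real deployment this would be the drive's network address;
        # here it is stored only for illustration.
        self.address = address
        self._store: dict = {}

    def put(self, key: bytes, value: bytes) -> None:
        """Write a value directly to the device under the given key."""
        self._store[key] = value

    def get(self, key: bytes) -> Optional[bytes]:
        """Read a value back from the device, or None if the key is absent."""
        return self._store.get(key)


# The application addresses the drive itself -- no file system or
# intermediate storage server mediates the request.
drive = KeyValueDrive("10.0.0.42:8123")
drive.put(b"user:1001:profile", b'{"name": "example"}')
print(drive.get(b"user:1001:profile"))
```

Collapsing the storage server tier into the drive's own interface is what removes a layer of hardware and software from the data path, which is where the cost and efficiency gains come from.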

More cloud services are becoming available as the modern data center becomes the home of all evolving technologies. The way the end-user accesses personal and corporate resources has completely changed over the past few years. With that change, organizations must also evolve in how they support their users.

By enabling mobility and cloud utilization, administrators are able to empower their users to be more productive and utilize the end-points they are most comfortable with. Ultimately, it'll all come down to the applications and data being delivered. The hardware at the end-point will only be utilized for the resources it can provide to allow the application to run. This centralized, web-enabled delivery model helps centralize critical data and allows the user to be truly on-the-go. Already we are seeing rich content being optimized with intelligent WANOP technologies. As more resources and bandwidth become available, organizations will be able to do even more with their data center platforms.


About the Author

Bill Kleyman is a veteran, enthusiastic technologist with experience in data center design, management and deployment. His architecture work includes virtualization and cloud deployments as well as business network design and implementation. Currently, Bill works as the National Director of Strategy and Innovation at MTM Technologies, a Stamford, CT based consulting firm.
