Microsoft's Push Toward IT Automation

At this year’s Microsoft Management Summit (MMS) 2011 in Las Vegas, Microsoft announced a host of major enhancements to its System Center Management Suite. Not surprisingly, there was a heavy emphasis on the cloud, as well as service-oriented IT and the continued evolution of the dynamic data center.

However, it’s clear that most businesses, particularly small-to-midsized businesses (SMBs), aren’t ready for and might not need this type of IT automation. I know that in my labs, there isn’t dynamic anything. But it’s also clear that IT automation could be very useful for larger businesses and enterprises with a lot of servers to manage and hundreds of users to serve.

In some ways, Microsoft is trying to catch up to technology that VMware already offers. The upcoming Microsoft System Center Virtual Machine Manager (VMM) 2012 release takes a page out of VMware's playbook by adding Dynamic Optimization and Power Optimization. These features load-balance virtual machine (VM) placement across clusters and, when utilization drops, consolidate VMs onto fewer hosts and power off idle physical servers to save energy, all with no end-user downtime or interruption of services.
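To make the idea concrete, here is a minimal sketch of the kind of heuristic these features embody. This is not VMM 2012's actual algorithm; the watermark values, host/VM names, and greedy strategy are all invented for illustration.

```python
HIGH_WATERMARK = 0.80  # move load away from hosts above this CPU fraction
LOW_WATERMARK = 0.20   # try to drain and power off hosts at or below this

def rebalance(hosts):
    """hosts: dict of host name -> list of (vm name, cpu fraction).
    Mutates hosts in place; returns (migrations, powered_off)."""
    migrations = []
    load = {h: sum(cpu for _, cpu in vms) for h, vms in hosts.items()}

    def least_loaded(exclude):
        return min((h for h in hosts if h != exclude), key=lambda h: load[h])

    # Dynamic Optimization: relieve overloaded hosts. In the real product
    # each move would be a zero-downtime Live Migration.
    for host in list(hosts):
        while load[host] > HIGH_WATERMARK and hosts[host]:
            vm, cpu = min(hosts[host], key=lambda v: v[1])  # cheapest to move
            target = least_loaded(host)
            if load[target] + cpu > HIGH_WATERMARK:
                break  # nowhere to put it without overloading the target
            hosts[host].remove((vm, cpu))
            hosts[target].append((vm, cpu))
            load[host] -= cpu
            load[target] += cpu
            migrations.append((vm, host, target))

    # Power Optimization: drain lightly loaded hosts, then power them off.
    powered_off = []
    for host in list(hosts):
        if 0 < load[host] <= LOW_WATERMARK:
            for vm, cpu in list(hosts[host]):
                target = least_loaded(host)
                if load[target] + cpu > HIGH_WATERMARK:
                    break
                hosts[host].remove((vm, cpu))
                hosts[target].append((vm, cpu))
                load[host] -= cpu
                load[target] += cpu
                migrations.append((vm, host, target))
        if not hosts[host]:
            powered_off.append(host)
    return migrations, powered_off
```

In the real product, the thresholds are administrator-configurable and the optimization runs on a schedule rather than once, but the shape of the decision (migrate away from hot hosts, consolidate off cold ones) is the same.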

VMware’s Distributed Resource Scheduler (DRS) has had these capabilities for some time now. However, it’s not just about playing catch-up. Microsoft is also working to fulfill its vision of the dynamic data center that the company began pushing at TechEd in 2008. Of course, today that vision has evolved to include the cloud.

At MMS 2011, Microsoft clarified its position on cloud computing, and the private cloud in particular. For Microsoft, the cloud has become a collection of computing resources that can be assigned to users or groups through Active Directory (AD). In the public cloud, those resources sit outside the organization; in the private cloud, they sit inside it. VMM 2012 enables the creation of private clouds through one or more fabrics that essentially represent the underlying infrastructure, on top of which collections of VMs and services are deployed. In many ways, the new generation of System Center can be seen as a way to add a cloud computing layer onto the private resources that already exist in your organization.
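A toy model helps show the relationship between a fabric, a private cloud, and an AD group. The class and field names below are invented for illustration and are not VMM 2012's object model.

```python
from dataclasses import dataclass

@dataclass
class Fabric:
    """The underlying infrastructure: host names and their memory in GB."""
    hosts: dict

@dataclass
class PrivateCloud:
    """A named slice of fabric capacity assigned to an AD group."""
    name: str
    ad_group: str   # e.g. "CONTOSO\\FinanceAdmins" (invented example)
    quota_gb: int   # capacity carved out of the fabric for this group
    used_gb: int = 0

    def can_deploy(self, vm_gb):
        # Members of ad_group may deploy VMs until the quota is exhausted.
        return self.used_gb + vm_gb <= self.quota_gb
```

The point of the abstraction is that users see only the cloud and its quota, never the individual hosts behind it.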

The combination of dynamic memory management and Dynamic Optimization and Power Optimization creates a very fluid and automatic IT infrastructure. The dynamic memory feature introduced in Windows Server 2008 R2 SP1 allows VM memory to be automatically increased and decreased as the VM’s workload requires it. Dynamic memory can help you achieve higher levels of server consolidation and can automatically ensure better performance for applications such as SQL Server by allocating memory when they need it. Dynamic Optimization takes this automation a step further by monitoring the virtualization host for CPU, memory, disk space consumption, disk I/O, and network I/O levels. Dynamic Optimization can then automatically initiate a Live Migration, moving one or more VMs to new hosts if performance falls outside the predefined boundaries. Within a VM, Hyper-V dynamically adjusts the VM memory; outside a VM, VMM 2012 dynamically moves VMs between virtualization hosts in response to changing workloads and power requirements.
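The dynamic memory side of this can be sketched as a simple rule: assign a VM its current demand plus a safety buffer, clamped between its configured minimum and maximum. Hyper-V's real behavior is more sophisticated; this function and its numbers are an invented illustration of the idea only.

```python
def target_memory_mb(demand_mb, minimum_mb, maximum_mb, buffer_pct=20):
    """Memory to assign a VM: current demand plus a buffer percentage,
    clamped to the VM's configured minimum and maximum."""
    desired = demand_mb * (100 + buffer_pct) // 100  # demand + headroom
    return max(minimum_mb, min(maximum_mb, desired))
```

A SQL Server VM whose demand spikes gets memory added up to its maximum; when the workload subsides, the assignment shrinks back toward the minimum, freeing memory for other VMs on the host.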

Another important product that facilitates the dynamic data center is Microsoft Server Application Virtualization (Server App-V). This product allows server applications, such as SQL Server, Microsoft Exchange Server, and Microsoft IIS, to run as virtual applications. Server App-V decouples the server application from the underlying OS and lets you change the way server applications such as SQL Server are deployed and used throughout the organization. As with Microsoft Application Virtualization (i.e., desktop App-V), you first run the application to be virtualized through a sequencer, which creates a virtual application package that can be streamed or Xcopy-deployed to a different server. You don't need to install the server applications; you simply copy or stream the virtual application package to the new server. Server application virtualization is particularly important for enabling cloud scenarios because it frees the application from the physical server, thus enabling easy movement of the server application into an off-premises cloud infrastructure such as Windows Azure. Server App-V is included in VMM 2012, and not surprisingly, it will also be added to Windows Azure.
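The deployment model described above (copy rather than install) can be sketched as follows. This is not the real Server App-V tooling; the function, paths, and package names are invented to illustrate why Xcopy-style deployment is attractive.

```python
import shutil
from pathlib import Path

def deploy_package(package_dir: Path, target_server_share: Path) -> Path:
    """Xcopy-style deployment: replicate a sequenced package directory
    to another server's share as-is. No installer runs on the target."""
    destination = target_server_share / package_dir.name
    shutil.copytree(package_dir, destination)  # a plain recursive copy
    return destination
```

Because the package is self-contained, the same copy operation works whether the target is another server in your data center or a cloud-hosted instance.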

Few companies that I know of are actually using this level of dynamic infrastructure today. However, larger businesses should pay attention to this vision as it becomes a reality. As a side note, Microsoft didn't completely leave out smaller companies: the cloud-based (what else?) Windows Intune service provides a subset of System Center's monitoring and management functionality. (For more information about Windows Intune, see "Windows Intune Brings PC Management Into the Cloud.")
