Nowadays, CIOs need to both cut costs and increase performance. Energy management has never been more important in achieving this productivity advantage.

It's now time for IT leaders to gain control over energy use -- and misuse -- in enterprise data centers. More often than not, very little energy capacity analysis and planning is being done on data centers that are five years old or older. Even newer data centers don’t always gather and analyze the available energy data being created amid all of the components.

Automation software for capacity planning and monitoring has therefore been redesigned and improved to match long-term energy needs and resources in ways that cut total costs, while reclaiming available capacity from old and new data centers alike.

Such data gathering, analysis, and planning can break the inefficiency cycle that plagues many data centers, where hotspots are mismatched with cooling capacity, and underused or unneeded servers burn energy needlessly. These so-called Smart Grid solutions jointly cut data center energy costs, reduce carbon emissions, and can dramatically free up capacity from overburdened or inefficient infrastructure.

This podcast features two executives from HP who delve more deeply into the notion of Smart Grid for Data Center: Doug Oathout, Vice President of Green IT Energy Servers and Storage at HP, and John Bennett, Worldwide Director of Data Center Transformation Solutions at HP. The discussion is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

The result is a realignment of traditional technology silos into adaptive pools that can be shared by any application, as well as optimized and managed as ongoing services. Under this model, resources are provisioned dynamically, efficiently, and automatically, yielding greater business productivity. This also helps rebalance IT spending away from a majority of spend on operations and toward investment, innovation, and business improvement.

The next BriefingsDirect Analyst Insights Edition, Volume 49, homes in on predictions for IT industry growth and impact, now that the recession appears to have bottomed out. We're going to ask our distinguished panel of analysts and experts for their top five predictions for IT growth through 2010 and beyond.

This periodic discussion and dissection of IT infrastructure related news and events with a panel of industry analysts and guests, comes to you with the help of our charter sponsor Active Endpoints, maker of the ActiveVOS business process management system.

Rigorously applying data and metrics to security can dramatically improve IT results and reduce overall risk to the business. By applying more metrics and standards to security, the protection of IT improves, and known threats can be evaluated uniformly.

With standards and greater reliance on data, security practitioners can understand better what they are up against, perhaps gaining close to real-time responses. They can know what's working -- or is not working -- both inside and outside of their organization.

The security metrics panel and sponsored podcast discussion are coming to you from The Open Group’s Enterprise Architecture Practitioners Conference in Seattle on Feb. 2, 2010. The goal is to determine the strategic imperatives for security metrics, and to discuss how to use them to change the outcomes in terms of IT’s value to the business.

ArchiMate provides ways to develop visualizations and extend control beyond some of the confines of IT architecture, in order to more swiftly obtain business benefits. To learn more, we interview an expert on this, Dr. Harmen van den Berg, partner and co-founder at BiZZdesign.

Enterprise business architecture is a set of artifacts and methods that helps business leaders make decisions about direction and communicate the changes that are required in order to achieve that vision, says Westbrock. Learn more from the podcast.

The notion of enterprise architecture (EA) has been in the works for 30 years. But now the evolving maturity of IT -- and the importance of IT in modern business -- makes the concept of enterprise architecture especially important.

We therefore examine the newer definitions and role of the IT architect and how that might be shifting with an expert from the Open Group, Len Fehskens, Vice President of Skills and Capabilities. The interview is moderated by Dana Gardner, principal analyst at Interarbor Solutions.

Welcome to a special BriefingsDirect dual webinar and podcast presentation, Real-Time Web Data Services in Action at Deutsche Börse.

As the culmination of a four-part series on web data services (WDS), we're here to examine a fascinating use-case for data services with Deutsche Börse Group in Frankfurt, Germany. An innovative information service recently created there highlights how real-time content and data assembled from various online sources scattered across the Web provides a valuable analysis service.

The offering supports energy traders seeking to track global fluctuations and micro trends in oil and other related markets. But the need for real-time and precise data affects more than energy traders and financial professionals. More than ever, all sorts of businesses need to know what's going on in -- and what's being said about -- their respective markets, products, and services.

In this series with Kapow Technologies, we've examined the need for WDS and ways that WDS and related tools can be used broadly to solve these problems. Now, we are going to learn the full story of how Deutsche Börse took web data resources and not only efficiently assembled knowledge from automated robots, cleansing tools, and analytics management, but from these capabilities also created a high-value, focused WDS offering in its own right.

What are the likely directions for cloud computing? Based on the exploration of expected cloud benefits at a cutting edge global IT organization, the future looks extremely productive.

In this podcast we focus on how cloud computing -- in both its private and public varieties -- might be used at CERN, the European Organization for Nuclear Research in Geneva.

CERN has long been an influential bellwether on how extreme IT problems can be solved. Indeed, the World Wide Web owes a lot of its usefulness to early work done at CERN. Now the focus is on cloud computing. How real is it, and how might an organization like CERN approach cloud?

In many ways CERN is quite possibly the New York of cloud computing. If cloud can make it there, it can probably make it anywhere. That's because CERN deals with fantastically large data sets, massive throughput requirements, a global workforce, finite budgets, and an emphasis on standards and openness.

So please join us, as we track the evolution of high-performance computing (HPC) from clusters to grid to cloud models through the eyes of CERN, and with analysis and perspective from IDC, as well as technical thought leadership from Platform Computing.