At SAP TechEd 2010, held in Bangalore, India, last week, SAP AG announced the availability of its new High-Performance Analytic Appliance (HANA).

According to the company, the SAP in-memory computing engine that resides at the heart of SAP HANA is an integrated database and calculation layer that allows the processing of massive quantities of real-time data in main memory to provide immediate results from analyses and transactions. The engine supports industry standards such as SQL and MDX but also incorporates a high-performance calculation engine that embeds procedural language support directly into the database kernel. This approach, SAP explains, eliminates the need to read data from the database, process it, and then write data back to the database.
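The read-process-write pattern the article describes can be sketched in a few lines. This is an illustrative stand-in only: it uses Python's built-in `sqlite3` in place of SAP's in-memory engine (HANA uses its own kernel and SQLScript, not SQLite), and the table and column names are invented for the example.

```python
import sqlite3

# Toy dataset standing in for transactional data; names are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("EMEA", 100.0), ("EMEA", 250.0), ("APJ", 75.0)])

# The pattern SAP says its engine avoids: read every row out of the
# database, process it in application code, then (if needed) write
# results back.
rows = conn.execute("SELECT region, amount FROM sales").fetchall()
totals = {}
for region, amount in rows:
    totals[region] = totals.get(region, 0.0) + amount

# The in-engine alternative: the calculation runs inside the database
# kernel, and only the result crosses the boundary.
in_db = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))

assert totals == in_db  # same answer, far less data movement
```

The benefit of the second form grows with data volume: the application never touches the raw rows, so the cost of moving millions of records out of the database disappears.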

"We genuinely believe that HANA and the in-memory computing technology inside HANA represents a once in a generation kind of technological shift that happens in the industry," said SAP CTO Vishal Sikka, speaking during a teleconference from India.

Over the last decade, several dramatic advances in hardware have forced a rethinking of the software layers above it, Sikka observed. The first and foremost of these is multicore processing; the other is the evolution of main memory, which is now "dramatically faster, bigger in size and cheaper" than it used to be. As a result, SAP has spent the last several years rethinking its software layers, in particular the data management layers, to take advantage of the massive power of multicore processors, he said.

HANA is a non-disruptive attachment to an existing ERP system; it is also a non-disruptive attachment to an existing Business Warehouse system, as well as to existing BusinessObjects systems, says Sikka. "The promise of HANA is being able to offer what we call a new dimension of real-time analytics and being able to take the raw data out of the ERP system, the fine-grained transactional data, directly replicate that into a HANA system and to enable end users to directly analyze that without aggregation, without summarizations, without transformation - as is, as it comes in."

On the basis of this platform, Sikka says the company will transform the entire SAP product portfolio over time, including building new applications in the areas of planning, forecasting and simulation. With the launch of HANA, SAP is announcing the Strategic Workforce Planning application that will be released this month.

The SAP in-memory computing engine delivers technical breakthroughs such as full utilization of CPU cores and massively parallel processing across nodes. Through work with customers during the SAP HANA pilot phase, SAP has been able to demonstrate what it describes as game-changing innovation in speed, scalability and compression.
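The shape of a parallel scan across cores can be sketched with Python's standard `multiprocessing` module. This is a toy model under stated assumptions: a real engine scans columnar in-memory data with cache-aware algorithms, not Python lists, but the structure is the same - partition the data, scan each partition on its own core, then merge the cheap partial aggregates.

```python
from multiprocessing import Pool

def partial_sum(chunk):
    # Each worker scans its own partition independently; no
    # coordination is needed until the final merge.
    return sum(chunk)

def parallel_total(values, workers=4):
    # Partition the data evenly, fan out one scan per worker process,
    # then combine the partial sums. The merge touches only one value
    # per worker, so it is negligible next to the scan itself.
    step = (len(values) + workers - 1) // workers
    chunks = [values[i:i + step] for i in range(0, len(values), step)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_total(data))
```

Because the workers share no state during the scan, throughput scales with the number of cores until memory bandwidth, not computation, becomes the limit - which is why cache-aware data layout matters as much as parallelism.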

In terms of speed, the in-memory computing engine has the ability to scan 2 million records per millisecond per core, with over 10 million complex aggregations calculated on the fly per second per core. The core engine of SAP HANA has been designed from the ground up around a multi-core architecture and implements adaptive, cache-aware algorithms. As a result, performance scales linearly across cores, CPUs and servers. Current analyses indicate full parallelization at 1,000 cores and beyond. Lastly, the SAP in-memory computing engine employs advanced compression algorithms and data structures that minimize the memory footprint required to run the system while still maintaining full support for OLTP workloads. A test system holding 450 billion records, for example, was implemented on less than three terabytes of physical memory.
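A quick back-of-envelope check makes the compression claim concrete. Assuming "three terabytes" means binary terabytes (the article does not say), 450 billion records in that footprint leaves only a handful of bytes per record - far less than even a narrow uncompressed row would need, which is the point of the compression algorithms the article mentions.

```python
# Back-of-envelope: bytes of physical memory available per record.
records = 450e9
memory_bytes = 3 * 1024**4  # 3 TiB; assumption - the article says "three terabytes"

bytes_per_record = memory_bytes / records
print(f"{bytes_per_record:.1f} bytes per record")  # roughly 7.3
```

Even a record with just a few integer and text columns would typically occupy tens of bytes uncompressed, so a budget of roughly 7 bytes per record implies aggressive compression of the stored data.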