HP SAP HANA In-Memory Computing CoE & Big Data Plans

Big Data is as big as any other shift in the tech industry, said Colin Mahony, Vice President and General Manager of Vertica, at this week’s HP technology briefing. “So, Big Data to us at HP is really that tectonic shift that represents on the one hand a new computing paradigm and on the other hand an ability to incorporate a lot of different data types.”

The company plans to spend around $1 billion this year on R&D and marketing for its big data software portfolio. Approximately $800 million is earmarked for Vertica and Autonomy, with the remainder going to joint projects between HP Software and the storage team in HP’s Enterprise Group.

While Vertica uses a combination of memory and disk to address Big Data, especially sensor, telco network and gaming data, one of HP’s partners, SAP, has taken an all-in-memory approach with its HANA product, he said. Just over a month ago, HP announced the launch of a Center of Excellence for in-memory computing that will initially focus on offerings for the SAP HANA platform and the HANA-based SAP Business Suite, and so far so good, said HP’s Paul Miller, VP Converged Application Systems, Enterprise Group, in an email exchange. However, the real focus is on addressing clients’ Big Data needs, and over time, as the technology evolves, the CoE will extend its focus beyond SAP HANA, he said.

SAP announced availability of the HANA database for its Business Suite in January, believing it has a legitimate option to displace its number-one applications competitor, Oracle, in the database tier. “Speed” and a return to real-time processing seem to be the primary benefits promised for SAP HANA powering SAP Business Suite, noted Evan Quinn, Senior Principal Analyst, Enterprise Strategy Group. “20x data loading performance improvements, processing complex queries sub-second, and not having to choose between read and write optimization are examples of the speed improvements.”
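The “read versus write optimization” tradeoff Quinn mentions is the classic row-store versus column-store split: row-oriented layouts make inserts cheap, while column-oriented layouts make analytic scans cheap. A minimal sketch of the two layouts (illustrative only, not SAP code):

```python
# Toy illustration of the row-store vs column-store tradeoff that
# in-memory databases such as SAP HANA claim to remove.

# Row store: each record kept together -- cheap to append (write-optimized).
row_store = []

def insert_row(order_id, amount):
    row_store.append({"order_id": order_id, "amount": amount})

# Column store: each attribute kept as its own array -- an analytic
# query touches only the columns it needs (read-optimized).
col_store = {"order_id": [], "amount": []}

def insert_columnar(order_id, amount):
    col_store["order_id"].append(order_id)
    col_store["amount"].append(amount)

# Load the same 1,000 orders into both layouts.
for i in range(1000):
    insert_row(i, i * 2)
    insert_columnar(i, i * 2)

# Analytic query: total order amount.
row_total = sum(r["amount"] for r in row_store)  # must walk whole rows
col_total = sum(col_store["amount"])             # scans one contiguous column
assert row_total == col_total
```

HANA’s claim, as reported here, is that with the working set entirely in RAM a columnar layout can serve both transactional writes and analytic reads without maintaining two separately optimized systems.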

HP said that compared to traditional methods of separating data storage from the processing elements, in-memory computing allows for faster computation over big data sets. The CoE’s first roadmap milestone is an 8-terabyte, single image development platform tuned for the SAP ERP and SAP Customer Relationship Management (CRM) applications on SAP HANA. Codenamed ‘Project Kraken’, this development prototype platform, provided to SAP by HP, is the forerunner to the large-memory systems scheduled for future availability.
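The speed argument HP is making can be seen even in a toy setting: answering repeated queries by re-reading data from disk pays I/O and parsing costs every time, while keeping the data RAM-resident pays them once. A minimal sketch (my illustration, not HP or SAP code):

```python
import os
import tempfile
import time

# Write a small "table" of 100,000 integers to disk, one value per line.
rows = list(range(100_000))
path = os.path.join(tempfile.mkdtemp(), "table.txt")
with open(path, "w") as f:
    f.write("\n".join(map(str, rows)))

def query_from_disk():
    # Disk-based approach: every query re-reads and re-parses the file.
    with open(path) as f:
        return sum(int(line) for line in f)

# In-memory approach: load and parse once, then queries scan RAM.
with open(path) as f:
    in_memory = [int(line) for line in f]

def query_in_memory():
    return sum(in_memory)

# Both approaches return the same answer.
assert query_from_disk() == query_in_memory() == sum(rows)

# Time five repeated queries under each approach.
t0 = time.perf_counter()
for _ in range(5):
    query_from_disk()
disk_time = time.perf_counter() - t0

t0 = time.perf_counter()
for _ in range(5):
    query_in_memory()
mem_time = time.perf_counter() - t0

print(f"disk: {disk_time:.4f}s  memory: {mem_time:.4f}s")
```

At HANA’s scale the same principle applies to terabytes of columnar data rather than a text file, which is why the CoE’s milestone platform is specified by memory size (8 TB in a single image).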

“Project Kraken is an initiative sponsored by the In-memory CoE that drives close collaboration between HP and SAP for in-memory computing and the HANA platform,” said Miller. “It is an 8-terabyte, single image development platform tuned for the SAP ERP and SAP Customer Relationship Management (CRM) applications on SAP HANA. The features and validation testing conducted on the Kraken platform will be extended to the future x86-based HP AppSystems portfolio supporting SAP Business Suite on HANA. The ‘Kraken’ platform is the forerunner to the large memory systems scheduled for future availability from HP.”

Also as part of Project Kraken, HP is investing in pilot and proof-of-concept services to help clients position their investments in future in-memory technologies and understand the functional and technical issues involved in migrating to the new Business Suite applications on an in-memory database, he said. “These proofs of concept are developed in collaboration with the clients, and SAP will assist with quantifying the operational impact of implementing these solutions.”

Miller said HP Labs is conducting data-centric research based on innovations in how we store, compute on and move data, all of which have a direct impact on what we understand today as in-memory computing. For storage, the company is introducing a new kind of memory technology called the memristor. “Memristors are one of the four basic circuit elements in electrical engineering and offer the ability to store large amounts of data permanently like hard disks, but are close to a million times faster than current disks and consume much lower energy. This technology has the ability to change how future computers store data.”
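For context on the “four basic circuit elements” remark: each fundamental two-terminal element links a pair of the four circuit variables (voltage v, current i, charge q and flux linkage φ), and the memristor, postulated by Leon Chua in 1971 and first demonstrated at HP Labs in 2008, supplies the missing flux–charge link:

```latex
% Defining relations of the four fundamental circuit elements
\begin{aligned}
\text{resistor:}  \quad dv       &= R\,di \\
\text{capacitor:} \quad dq       &= C\,dv \\
\text{inductor:}  \quad d\varphi &= L\,di \\
\text{memristor:} \quad d\varphi &= M\,dq
\end{aligned}
```

Because M has the units of resistance but depends on the history of charge that has flowed through the device, a memristor retains its state without power, which is what makes it a candidate for persistent, disk-replacing storage.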

For compute, HP thinks that future data-centric data centers will need a new system building block – an architecture called nanostores. “Traditional systems have deep hierarchies. Our nanostore design pairs three-dimensionally stacked Memristor-based data storage close to energy-efficient computing. Each nanostore can operate as a full computer and can be optically connected to other nanostores to form even bigger systems that operate on larger volumes of data. And given that we have now gone to a data-centric model, with compute surrounding the data, such nanostore based data-centers can be incredibly more powerful than current computers for lower energy.”

Finally, said Miller, there is the ‘move’ aspect. New optical communications using cost-effective photonics technologies developed at HP Labs can replace traditional copper-based communication for higher performance and lower energy. “Recently, HP Labs has demonstrated an all-optically-connected data center switch where all backplane communications to and from the card are optical. This demonstration showed that future bandwidth can scale 30-fold compared to traditional copper based communication, while consuming one-tenth the energy.”