Teradata takes on SAP’s HANA with ‘intelligent’ in-memory

Teradata has stepped into the in-memory database market to take on the likes of SAP’s HANA with a solution that aims to reduce the cost of placing data in RAM by ‘intelligently’ identifying the most used data.

Derek du Preez
May 9, 2013

Teradata argues that pure in-memory databases such as HANA are not cost-effective for enterprises: although the price of RAM has fallen in recent years, it remains roughly 80 times more expensive than disk.

The tool, Teradata Intelligent Memory, predictively places the hottest, most frequently used data into memory, then automatically updates and synchronises it without any human intervention.

It complements Teradata's other multi-temperature features, including Teradata Virtual Storage, which continuously migrates data to the appropriate storage tier across solid state drive (SSD) and hard disk drive (HDD) technology.

Teradata Virtual Storage provides an automated lifecycle management process in which "cold", or least used, data is migrated to less expensive drives.
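The temperature-based placement described above can be pictured with a short sketch. This is a minimal illustration, not Teradata's actual implementation: the function name, tier capacities, and block names are all hypothetical, and real systems track temperature continuously rather than in one pass.

```python
from collections import Counter

def assign_tiers(access_counts, memory_capacity, ssd_capacity):
    """Rank data blocks by access frequency ("temperature") and fill
    the fastest tiers first, spilling the remainder to spinning disk.
    Capacities are counted in blocks for simplicity."""
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    placement = {}
    for i, block in enumerate(ranked):
        if i < memory_capacity:
            placement[block] = "memory"
        elif i < memory_capacity + ssd_capacity:
            placement[block] = "ssd"
        else:
            placement[block] = "hdd"
    return placement

# Hypothetical access counts for four data blocks.
counts = Counter({"orders": 900, "clicks": 500, "archive": 3, "logs": 40})
print(assign_tiers(counts, memory_capacity=1, ssd_capacity=2))
# The hottest block lands in memory; the coldest falls through to HDD.
```

Re-running the ranking as counts change gives the automated migration effect the article describes: a block that cools off drops out of the memory tier on the next pass.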

Computerworld UK spoke to Chris Twogood, VP of products and services marketing at Teradata, who explained why Teradata believes an intelligent in-memory database is more sensible for enterprises that are still concerned about limiting costs.

“Accessing data off memory, versus going to disk, is about 3,000 times faster. One of the challenges is that even though the price of memory is going down, memory is still the most expensive component in an appliance stack. It’s about 80 times the cost of disk; it doesn’t make sense to take all of your data and fit it into memory, because not all data warrants an 80-times premium,” said Twogood.

“What we have done within Teradata is that we have created this new extended memory space, which addresses the most frequently accessed data based upon temperature. This enables us to increase the query performance but not have to have as much memory in a solution as in-memory databases do, because you are dynamically monitoring and measuring.”

He added: “We think this is a smarter approach than in-memory databases, because it enables you to store data at a cheaper overall price point.”
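The cost argument can be checked with back-of-envelope arithmetic using the article's two rough figures: memory at about 80 times the cost of disk, and about 20 percent of the data being hot. The numbers below (100 TB, normalised unit costs) are illustrative assumptions, not Teradata pricing.

```python
DISK_COST_PER_TB = 1.0     # normalised unit cost
MEMORY_COST_PER_TB = 80.0  # ~80x disk, per the article

def storage_cost(total_tb, hot_fraction):
    """Cost of keeping only the hot fraction in memory, the rest on disk."""
    hot_tb = total_tb * hot_fraction
    return hot_tb * MEMORY_COST_PER_TB + (total_tb - hot_tb) * DISK_COST_PER_TB

all_in_memory = 100 * MEMORY_COST_PER_TB       # 100 TB entirely in RAM: 8000 units
tiered = storage_cost(100, hot_fraction=0.20)  # 20 TB RAM + 80 TB disk: 1680 units
print(all_in_memory / tiered)                  # tiered is roughly 4.8x cheaper
```

Under these assumptions, keeping only the hot 20 percent in memory cuts the storage bill by nearly a factor of five compared with an all-in-memory design, which is the "cheaper overall price point" Twogood is pointing at.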

Twogood said that Teradata analysed its customers' use of data and found that in-memory capabilities aren't necessary for all data.

“We have gone out and we have measured temperature and data usage in hundreds of our customers and it’s clearly the 80/20 rule – 80 percent of the users are accessing 20 percent of the data,” he said.

“So if you can determine what elements they are accessing and place it in the highest tier, you are going to intelligently drive overall performance.”
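Twogood's 80/20 observation amounts to finding the smallest set of data that accounts for most of the accesses, and placing that set in the top tier. The sketch below is a hypothetical illustration of that selection; the function name and sample counts are invented for the example.

```python
def hot_set(access_counts, coverage=0.80):
    """Return the smallest set of blocks, hottest first, that accounts
    for at least `coverage` of all recorded accesses."""
    total = sum(access_counts.values())
    ranked = sorted(access_counts, key=access_counts.get, reverse=True)
    chosen, seen = [], 0
    for block in ranked:
        if seen / total >= coverage:
            break
        chosen.append(block)
        seen += access_counts[block]
    return chosen

# Skewed access pattern: two blocks draw 80% of the traffic.
print(hot_set({"a": 50, "b": 30, "c": 15, "d": 5}))  # ['a', 'b']
```

With a skew like the one Twogood describes, the hot set is a small fraction of the data, so the memory tier can be sized well below the full warehouse while still serving most queries from RAM.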

Twogood didn’t have information on the cost of Teradata Intelligent Memory, but said that prices would be made available from July.