HP CTO Lays Out HP's Vision For Future Computer Architecture

The memristor could be the most exciting technology yet introduced, Nth's Baldwin said.

"The power requirement for memristor is almost nothing, yet it will scale to a couple petabytes of capacity in a few years," he said. "It will replace disk and solid-state storage."

Combining Project Moonshot servers, new massively scalable memory technologies and new photonic technology for moving massive amounts of data creates a new kind of computer architecture, HP's Fink said to applause from the audience.

"And now you know why I have one of the coolest jobs on Earth, because we get to solve these problems."

Even that is not yet enough.

Given the growing importance of big data and business analytics, HP is also looking at how to redesign applications to take the data scientist out of the solution and make it possible to access data immediately without an intermediary, Fink said.

Eliminating that intermediary will be a big advance in taking advantage of new computer architectures, Baldwin said.

"I use HP Autonomy and IDOL for big data without understanding the math behind it," he said. "If we can get to the point where we can just ask the question and get the answer, imagine what we can do."

The next problem will be how to manage a million nodes, Fink said.

"That's not an exaggeration," he said. "When you think about that cloud infrastructure of the future, it's not a rack of 64 nodes. It's not a couple of racks of 1,000 nodes. It's a million cores. And how are you going to manage that?"

HP has a team looking at how to manage systems at that scale.

"The end goal is, we now have all the places to put the data, we can move the data, we can process the data," Fink said. "We now need the capabilities to access that data very, very easily in order to turn that data into value."