Hewlett Packard Enterprise (HPE) is an industry-leading technology company that enables customers to go further, faster.

Gen-Z aims to solve the growing challenges of processing and analyzing huge amounts of data in real time while avoiding today’s system bottlenecks.

Open standards drive the IT industry’s innovation engine. Today, a consortium called Gen-Z was unveiled, with the aim of developing a new universal interconnect that will enable simpler and more powerful computer architectures. The consortium counts industry leaders such as Dell, HPE, Huawei and Samsung among its founding members.

Gen-Z reminds me of the 1990s push for the Universal Serial Bus, which is now a regular feature of nearly every device imaginable: phones, PCs, televisions, even cars and IoT devices. USB was also an open standard to replace dedicated keyboard, mouse, serial and printer ports with a single, plug-and-play solution. We hope to see Gen-Z take the same path, be widely adopted and ignite a broad wave of innovation across the IT industry.

What is Gen-Z?

Today, each computer component is connected using a different type of interconnect: memory via DDR, hard drives via SATA, flash drives and graphics processing units via PCIe, and so on. At the simplest level, Gen-Z is focused on developing a new interconnect and protocol that can replace all of those with a single, open, high-performance interconnect.

The vision behind Gen-Z is to solve the growing challenges associated with processing and analyzing huge amounts of data in real time while avoiding today’s system bottlenecks. The idea is to get back to the most basic functions that move data from one location to another. Gen-Z aims to let any device communicate with any other device as if it were communicating with its own local memory, using simple commands. Thus, we refer to it as a “memory semantic protocol.” You can connect vast numbers of components together in the same fabric, and each can get equal access to the data.

Today, computers spend around 90 percent of their time and energy moving and translating data between different layers of memory and storage. For example, when you open a spreadsheet, the data is translated from one format on a hard drive into a different format in memory. If you can access data where it lives, without copying or converting, then data access becomes much more reliable and far more cost- and power-efficient.

In practice, this simplicity enables system designers to make more powerful, more flexible and more power-efficient computer systems. One open, non-proprietary standard, with one memory semantic protocol, all the way from within one computer to data center scale.

Why Now?

The explosion of data, and the demand for real-time analysis of that data, is common across industries from health care to media to manufacturing. At the root, what you see is a supply and demand problem: the increasing supply of data seems limitless, and the demand for increased computing performance and data access is also limitless. Today’s computer architecture is starting to fall behind, and it’s only going to get worse.

The search for new approaches to ingest, store and access data is on. There are some exciting new technologies emerging such as fast, non-volatile memory and task-specific “accelerator” processors. The problem is, we don’t have a good way to connect those new technologies using our existing interconnects.

Simply put, the hardware is holding us back and we need new solutions. But it’s clear to me and to all the consortium companies that we need to work together on those new solutions. Open standards mean that everything will work together and we can rapidly move forward to the benefit of all.

And that’s why Gen-Z, and the consortium, are so important.

Moving the Industry Forward

We don’t expect the industry to switch to a new interconnect and protocol overnight. One of the design tenets of Gen-Z is to offer a smooth transition: Gen-Z can exist alongside existing interconnects, and traditional interconnects can be emulated over Gen-Z, so current software will still run. That makes it cost-effective and easy to implement, from basic systems to supercomputers.

We expect to see market solutions incorporating Gen-Z by 2018. This technology will also set the stage for true memory-centric computing, allowing the manipulation of huge data sets residing in large pools of fast, persistent memory. We also see Gen-Z as a critical component of HPE’s Memory-Driven Computing architecture. We’re using Gen-Z to form the fabric of The Machine and future High-Performance Computing platforms, as well as extending our composable infrastructure technology.

I’m also excited to see what our colleagues at the consortium member companies will do with Gen-Z, and how we can work together to advance the state of the art.

For more information on this consortium and HPE’s specific role, click here.