Everybody in IT knows by now that flash memory is redefining the enterprise storage industry, mostly by decoupling performance from capacity. Most storage vendors are happy to just add flash to their existing product lines, often using it as cache, or as a storage tier handled transparently within the array. Few vendors take the opportunity to rethink the way storage works, though, from the basics of performance to how it meshes with the idea of public & private clouds. Coho Data, coming out of stealth mode with their first product, the DataStream, does just that.

The Coho Data DataStream is built around the notion that the old way of storing data on-premise is broken in the face of new technologies. Because traditional spinning disk is slow, it doesn’t take much CPU or network/interconnect capacity to keep up with it. Many midrange arrays have only two controllers, with modest CPU and RAM, no matter how many disks are attached. If you change the I/O balance from spinning disk to flash you run into problems, as the sheer performance of that flash memory will likely overwhelm the limited CPU and connectivity available on those controllers. High-end arrays such as those from Hitachi, IBM, and EMC often allow you to increase controller capacity, but that comes with costs: the capital expenditure of the additional controllers, the staff time involved in installation, rebalancing I/O, and so on, and the burden of upgrading an old architecture. For most enterprises the idea of adding flash memory to their storage means a forklift upgrade to a whole new array better suited to flash.
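A back-of-envelope comparison shows why those controllers get overwhelmed. The figures below are ballpark numbers of my own choosing for illustration, not specifications from any vendor:

```python
# Ballpark, illustrative figures -- not vendor specifications.
HDD_IOPS = 150        # roughly what a 7200 RPM SATA drive delivers
FLASH_IOPS = 100_000  # roughly what a PCIe flash card delivers

shelf_of_disks = 24 * HDD_IOPS   # a full 24-drive shelf of spinning disk
ratio = FLASH_IOPS / shelf_of_disks

print(f"24 spinning disks: ~{shelf_of_disks:,} IOPS")
print(f"one PCIe flash card: ~{FLASH_IOPS:,} IOPS")
print(f"one card outruns the whole shelf by ~{ratio:.0f}x")
```

A controller and interconnect sized for the disk shelf simply has no headroom for the flash card, which is the heart of the bottleneck argument.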

As you’d expect, the DataStream tiers data between the PCIe flash and the SATA disks. Internally, it doesn’t use RAID but instead maintains multiple copies of data blocks across the available MicroArrays. This defends against the loss of a drive, flash card, or whole MicroArray, and it also defends against hot spots: the more MicroArrays available, the more resources there are to spread busy volumes across. Because the SATA disks are slow, the MicroArray’s battery-backed RAM and PCIe flash can be used to buffer writes, and any necessary destaging from flash to SATA can be done in a way that is optimal for the SATA disks. This is an approach we’ve seen on arrays from Nimble Storage and others, and it works well to balance performance with cost. By traditional array standards there is an absolutely epic amount of CPU time available on each MicroArray (24 GHz), so deduplication and compression are performed to maximize capacity, especially in the 1.6 TB of flash memory available. The DataStream can also use that CPU time to analyze workloads, rebalance itself, make better choices about steering I/O between all the resources, and offer projections for capacity and performance so that steps can be taken proactively to deal with workload changes.
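The idea of keeping multiple copies of each block on distinct MicroArrays, rather than striping with RAID, can be sketched in a few lines. The hash-ranking scheme and names below are purely illustrative assumptions on my part, not Coho Data’s actual placement algorithm:

```python
import hashlib

def place_replicas(block_id: str, microarrays: list[str], copies: int = 2) -> list[str]:
    """Pick `copies` distinct MicroArrays for a block by ranking each
    node on a hash of (block, node). Illustrative sketch only -- not
    Coho Data's actual placement logic."""
    ranked = sorted(
        microarrays,
        key=lambda node: hashlib.sha256(f"{block_id}:{node}".encode()).hexdigest(),
    )
    return ranked[:copies]

# Four MicroArrays in the cluster; every block lands on two of them,
# so losing any one drive, card, or MicroArray leaves a surviving copy.
nodes = ["ma-0", "ma-1", "ma-2", "ma-3"]
targets = place_replicas("block-42", nodes)
```

Because different blocks hash to different pairs of nodes, busy volumes naturally spread across the cluster, which is where the hot-spot protection comes from.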

Beyond the raw technology, though, the Coho Data DataStream offers a number of features designed to help businesses step into cloud models of operation. First, there is integration with VMware vCenter, offering VM-level snapshots, replication, and performance management based on Storage Profiles. While we’ve seen VM-level management in products from Tintri and others, performance management integrated with VMware Storage Profiles is unique and very welcome. Also unique is integrated chargeback, where costs are assigned to customers and made easily visible. Chargeback is an important feature for most businesses, and one of the main selling points of the cloud. At $2.50 per GB list price (before deduplication and compression) the arrays are competitively priced, but they’re still not free, and the CIO still wants to see where the money is going. Chargeback is configurable via Coho Data’s well-designed UI, taking into account on-disk utilization, replication, snapshots, and the performance guarantees that come through from the storage profiles. It’s a nice touch from a company that wants to be seen more as on-premise cloud storage than as a traditional disk array vendor, which in the current market is a good thing, especially given recent high-profile cloud storage failures like Nirvanix and recent developments around the USA’s National Security Agency. By using commodity hardware, modern networking techniques, intelligent software, deep automation, a focus on reducing staff time, and a scale-out capacity and performance model, Coho Data is bringing many of the benefits of cloud storage into the data center itself, safe from the NSA and the mistakes of cloud-vendor CEOs.
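To make the chargeback idea concrete, here is a rough sketch of how such a model might compose, using the $2.50/GB list price from above as a stand-in rate. The cost factors, multipliers, and function names are my assumptions for illustration, not Coho Data’s actual billing model:

```python
# Hypothetical chargeback sketch. Uses the $2.50/GB list price as a
# stand-in rate; all factors below are illustrative assumptions.
LIST_PRICE_PER_GB = 2.50

def chargeback(used_gb: float, snapshot_gb: float = 0.0,
               replicated: bool = False, perf_multiplier: float = 1.0) -> float:
    """Charge for on-disk utilization plus snapshot space; double it
    when replicated (a second full copy elsewhere), then scale by a
    performance-tier multiplier from the VM's storage profile."""
    cost = (used_gb + snapshot_gb) * LIST_PRICE_PER_GB
    if replicated:
        cost *= 2
    return round(cost * perf_multiplier, 2)

# A tenant with 500 GB on disk, 50 GB of snapshots, replication on,
# and a hypothetical "gold" performance tier at 1.5x:
bill = chargeback(500, snapshot_gb=50, replicated=True, perf_multiplier=1.5)
```

The point is less the arithmetic than the inputs: utilization, replication, snapshots, and performance guarantees are exactly the dimensions the article says the UI exposes.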

Bob Plankers is an IT generalist with direct knowledge in many areas, such as storage, networking, virtualization, security, system administration, and data center operations. He has 17 years of experience in IT, is a three-time VMware vExpert and blogger, and serves as the virtualization & cloud architect for a major Midwestern university.