
When the Hitachi Vantara storage array used by Frost Bank approached its five-year mark, Dan King, vice president of IT operations for the San Antonio-based chain of banks, started looking at alternatives. Simply replacing the existing system with a newer version of itself would take six months of planning, cleaning out a rack and migrating the data over. King wanted to determine whether newer technology might save money, boost performance or simplify operations.

He considered cloud options, including Amazon, but decided these were too slow and couldn’t give him the kinds of IT controls he was used to (see sidebar). He ended up going with a new all-flash storage array, delivered as a service by Pure Storage. Data migration promised to be relatively painless. Pure Storage said it could also replace the six-rack array with a single rack that was faster — and do so in a weekend rather than six months.

But what really moved him to make the choice, he said, was what other customers told him about the data storage platform’s management capabilities. Pure Storage’s core offering, Pure1, included cloud-based management and predictive support. The company was also rolling out Pure1 Meta, which included predictive intelligence that automatically optimizes the storage array.

“The way Pure designed the storage software makes using and managing the systems much simpler,” said King, an attendee at the Pure Accelerate 2018 conference in San Francisco. “The other systems are designed to sell storage.”

More than faster hard drives

Since the dawn of IT, storage has been perceived as a hardware commodity. Enterprises purchased storage as a capital expense and provisioned it across enterprise users through a complex system of rules-based software.

But enterprise storage vendors are starting to adopt the playbook of public cloud providers, selling private storage infrastructure as a service.

This in part explains the rapid rise of Pure Storage, a data storage platform provider founded in 2009. The company, headquartered in Mountain View, Calif., sells against entrenched competitors like Dell EMC and Hitachi. Its focus on selling private storage as a service seems to involve more than just semantic juggling around recategorizing Capex as Opex. It has driven the company to explore better practices around upgrading, provisioning and optimizing storage at the hardware level. These are all functions that have typically been done outside of the hardware at the application level, which limits the speed at which the enterprise can make good on big initiatives like digital transformation.

“Storage has been an afterthought for most CIOs. It is not a place an executive or IT person thinks of to drive innovation for their architecture,” said Matt Burr, general manager of FlashBlade at Pure. But he said that attitude is changing; companies are being forced to digitalize their business models.

Baked-in AI, improved management

As companies look to repurpose their data for new applications, they are starting to connect legacy applications that store the system of record to more nimble applications with different patterns of reading and writing data, Burr explained. Not only can new data access patterns slow performance, but the interfaces between apps can also break things in ways that are hard for IT organizations to diagnose.

To address this challenge, Pure’s data storage platform uses AI analytics to capture and analyze petabytes of data, with the aim of automating many of the traditional management challenges associated with keeping servers running — and doing that automation and optimization at the hardware level. This optimization at the hardware level leaves IT to innovate at the software level, with less fear of having to tweak the data management software to get good performance.

“We want to invest in technology that allows IT managers to anticipate problems before they happen,” Burr said.
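The idea of anticipating problems before they happen can be illustrated with a generic statistical sketch: flag a telemetry reading that drifts far above its recent baseline before the condition cascades. This is a minimal, hypothetical example of the general technique — not Pure1 Meta's actual algorithm, which the article does not describe:

```python
import statistics

def flag_anomalies(samples, window=20, threshold=3.0):
    """Flag readings more than `threshold` standard deviations above
    the mean of the preceding `window` readings."""
    alerts = []
    for i in range(window, len(samples)):
        history = samples[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.pstdev(history)
        if stdev and samples[i] > mean + threshold * stdev:
            alerts.append((i, samples[i]))
    return alerts

# Steady ~1 ms latency readings with one spike an operator would want
# surfaced early, before it cascades into dependent systems.
latencies = [1.0 + 0.01 * (i % 5) for i in range(40)]
latencies[30] = 9.0
print(flag_anomalies(latencies))  # the spike at index 30 is flagged
```

A production system would of course train on petabytes of fleet-wide telemetry rather than a 20-sample window, but the principle — compare current behavior against a learned baseline and alert on divergence — is the same.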

Data dedupe at the hardware level

Frost Bank’s King has found Pure’s predictive approach helps identify signs of a problem early, before a failure cascades into other systems.

He said the vendor’s data storage platform has also made it much easier to migrate data. Before, Frost Bank would have to secure six rack spaces and develop an installation plan, a process that took six months. Afterward, it would have to deinstall the older storage array to make room for the next round of storage migrations.

Modern data stores tend to be highly redundant and reuse many of the same data structures across records. Traditional approaches to compression happen at the operating system level, which adds a lot of overhead. It also adds a bit of brittleness, making it harder to move to a better app that is more aligned with a digital transformation strategy.

But better compression and deduplication at the hardware level can dramatically shrink IT infrastructure with no overhead on the application itself.

For example, King observed that he was able to shrink a 150 TB data storage infrastructure onto a 15 TB Pure system. Same applications, same data — Pure’s data storage platform compressed and deduplicated the data on the fly, without any additional burden on his IT team or new software to install.
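The 10-to-1 reduction King describes can be illustrated with a toy content-addressed block store: fixed-size blocks are hashed, and blocks with the same digest are stored only once. This is a minimal sketch of deduplication in general, under assumed block sizes and data — not Pure's implementation:

```python
import hashlib

BLOCK_SIZE = 4096  # assumed fixed block size for this sketch

class DedupStore:
    """Toy content-addressed store: identical blocks are kept once."""

    def __init__(self):
        self.blocks = {}   # digest -> block bytes, stored once
        self.logical = 0   # bytes written by applications
        self.physical = 0  # bytes actually stored

    def write(self, data: bytes):
        for i in range(0, len(data), BLOCK_SIZE):
            block = data[i:i + BLOCK_SIZE]
            digest = hashlib.sha256(block).hexdigest()
            self.logical += len(block)
            if digest not in self.blocks:  # only new content costs space
                self.blocks[digest] = block
                self.physical += len(block)

    def reduction_ratio(self):
        return self.logical / self.physical if self.physical else 1.0

# Ten identical "records", standing in for the highly redundant data
# structures the article describes.
store = DedupStore()
record = b"".join(bytes([i]) * BLOCK_SIZE for i in range(10))
for _ in range(10):
    store.write(record)
print(store.reduction_ratio())  # 10.0 — one physical copy backs ten logical ones
```

Doing this in the array rather than in the operating system is what keeps the overhead off the application: the host writes logical bytes and never sees the hashing or the shared physical blocks.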

Of course, vendors have experimented with deduplication at the application level, King pointed out, but he said that just added another layer of complexity to manage in software, when the aim is to make things simpler.