Gartner just published its 2017 Data Virtualization Market Guide, and Primary Data is proud to be featured in the
report. DataSphere virtualizes
data by abstracting the data plane from the control plane, making it possible
to finally overcome storage silos and intelligently manage data across
enterprise infrastructure and into the cloud.

Citing the need for
enterprises to deliver “faster data access through flexibility and agility,”
the report notes that “familiar data integration patterns centered on
physical data movement … alone are no longer a sufficient solution for enabling
a digital business.” To overcome the storage sprawl that adds to enterprise
complexity today, Gartner recommends that IT leaders focused on data management
“identify data silos that are candidates for consolidation.” With cloud adoption
on the rise, it’s important to ensure that adding cloud resources doesn’t
simply create a new island of storage disconnected from the rest of the
enterprise’s infrastructure.

This echoes advice Gartner has offered IT in other
research aimed at helping enterprises stay on top of the tsunami of data and
infrastructure they manage today. In its 2016 Strategic Roadmap for Data Center Infrastructure report, Gartner advised that, “the data center
infrastructure ‘supernova’ is here, with applications and data forcefully
spread across on-premises data centers, colocation, hosting and the cloud. IT
leaders must apply a future-looking, enterprise-wide steady hand to IT strategy
and planning, or lose control and enterprise agility.” Gartner recommended that
“the IT organization must become leaner, faster and more agile, and take on the
critical role of coordinating services, not controlling assets.”

DataSphere enables IT to do just what the Gartner doctors
ordered. By virtualizing data, DataSphere creates a global namespace that gives applications access to all
storage resources. Each storage resource can be characterized by its
performance, price, and protection attributes. DataSphere monitors metadata
to determine whether data is hot, cold, or somewhere in the middle, then
automatically aligns data with the ideal storage resource to meet
IT-defined Objectives for applications and their data. As time passes,
DataSphere can begin to detect patterns – say, if reporting data becomes hot at
the end of each quarter or year – and use machine learning capabilities to keep matching data to the
right resource before performance becomes a problem. Because DataSphere operates out of band, metadata management does not impact
application I/O requests.
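To make the idea concrete, here is a minimal sketch of temperature-based data placement. The tier names, access-age thresholds, and function names are illustrative assumptions, not DataSphere’s actual API or policy engine:

```python
from datetime import datetime, timedelta

# Hypothetical tier map: temperature -> storage resource.
# Names and thresholds below are assumptions for illustration only.
TIERS = {"hot": "nvme-flash", "warm": "hybrid-nas", "cold": "cloud-object"}

def classify(last_access: datetime, now: datetime) -> str:
    """Bucket a file by how recently its metadata shows it was accessed."""
    age = now - last_access
    if age < timedelta(days=1):
        return "hot"
    if age < timedelta(days=30):
        return "warm"
    return "cold"

def placement(last_access: datetime, now: datetime) -> str:
    """Map a file's temperature to the tier that meets its objective."""
    return TIERS[classify(last_access, now)]

now = datetime(2017, 8, 1)
print(placement(datetime(2017, 7, 31, 12), now))  # accessed 12h ago -> "nvme-flash"
print(placement(datetime(2017, 5, 1), now))       # untouched for months -> "cloud-object"
```

A real policy engine would also weigh IT-defined Objectives (performance, price, protection) and learned seasonal patterns, rather than access age alone.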

Subscribe and Save

Gartner cautions that there are some hurdles IT needs to be
aware of when looking at a data virtualization solution for data management.
One is the cost, as the report notes that “many data virtualization tools do
not provide flexible subscription-based pricing options….” Not so with
DataSphere, as we offer an enterprise subscription option and another for smaller Lines of Business (LoB) serving remote or branch offices
(ROBO).

Enterprises do not need to buy new storage to adopt DataSphere.
In fact, it is designed to put the storage you already have (and want to add)
to work serving the right data, across your enterprise and into the cloud. Even
when the subscription cost is factored in, enterprises adopting DataSphere
estimate savings of up to 49% through the efficiency improvements made possible
by overcoming the storage silo problem and automatically aligning data to the
best resource to meet current business needs.

Manage Your Most Valuable Asset

In addition,
Gartner advises IT to keep a keen eye on their metadata, noting that metadata
management and architecting a metadata layer across systems is a “vital”
consideration when looking into data virtualization. Indeed, we believe
metadata is one of your most valuable assets. As a metadata engine, DataSphere is designed to
decouple the architecturally rigid relationship between applications
and where their data is stored, and to offload metadata operations from the data path.

Offloading metadata access with DataSphere delivers predictable, low-latency
metadata operations by guaranteeing that metadata operations do not get “stuck”
in the queue behind other data requests. Rather than having to wait for
sequential operations to complete, DataSphere can leverage parallel access with
the latest optimizations of the standard NFS v4.2 protocol. Leveraging NFS v4.2
significantly speeds up metadata and small file operations by requiring less
than half of the protocol-specific network round trips compared to NFS v3.
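The round-trip savings come from batching: NFS v4.x can carry several operations in a single compound request, where NFS v3 issues one RPC per operation. A toy sketch of the arithmetic, using a simplified operation list rather than an exact protocol trace:

```python
# Hedged illustration of why compound requests cut network round trips.
# The operation names below are simplified, not a literal NFS trace.

def round_trips_sequential(ops):
    """NFS v3 style: each operation is its own request/response pair."""
    return len(ops)

def round_trips_compound(ops):
    """NFS v4 style: all operations travel in one COMPOUND request."""
    return 1 if ops else 0

# Opening and reading a small file might involve several metadata ops:
open_small_file = ["LOOKUP", "GETATTR", "READ"]
print(round_trips_sequential(open_small_file))  # 3 round trips
print(round_trips_compound(open_small_file))    # 1 round trip
```

On a network with nontrivial latency, each eliminated round trip saves a full request/response delay, which is why small-file and metadata-heavy workloads benefit most.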

If you’re
heading to VMworld 2017 in Las Vegas later this month, we’d love to connect
with you to discuss how data virtualization can quickly solve many common
problems in data management. Drop us a line at deepdive@primarydata.com to arrange a meeting, or come by booth 831 to speak with us on
site. From overcoming the pain of data migrations, to making it easy to adopt
and integrate the cloud to serve the right data, to scaling out NAS performance,
DataSphere can help you make many of today’s challenges obsolete for your
enterprise.