Why Data Protection-as-a-Service is Unpredictable

Economic unpredictability is the most significant challenge in data protection. The pool of data that must be protected continues to grow exponentially, while recoveries are expected to be nearly instantaneous.

Our previous blog discussed the challenges of meeting these unpredictable data protection requirements with a traditional, fully on-premises IT infrastructure. Unpredictable spend cycles force IT to scramble to allocate budget, often cobbling together workarounds that may not provide sufficient levels of protection. The business may be forced to rely on legacy technologies, leaving pieces of the environment exposed to data loss or slow recovery times. It may also incur the cost and headache of running multiple point solutions during the recovery process.

The pain points associated with investing in and managing a fully on-premises disaster recovery environment incentivize many IT planners to look to cloud-based data protection-as-a-service (DPaaS), including disaster recovery-as-a-service (DRaaS). One goal of DPaaS is to smooth out the costs associated with data protection. However, DPaaS brings its own set of unpredictability and challenges.

DPaaS enables the business to avoid large upfront costs. However, stricter compliance regulations and the growing use of data and analytics to fuel new business outcomes require more data to be retained for longer periods of time. As capacity requirements increase and data is stored over months and years, this cost model can become prohibitive, far surpassing the cost of storing that data on premises. Meanwhile, egress fees incurred when the business goes to recover data are unpredictable and can be substantial, since cloud service providers charge a per-gigabyte fee to move data out of their cloud storage and onto another infrastructure. Adding to this cost structure, any on-premises hardware required to run primary storage workloads must still be purchased up front, as opposed to when it is consumed. Analyzing cloud utilization and costs remains a developing art; most organizations have fuzzy visibility at best into the cost structure of their cloud services over time.
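To see how these charges can compound, here is a back-of-the-envelope sketch. All figures (per-TB storage and egress rates, growth rate, recovery frequency) are invented for illustration and are not any provider's actual pricing:

```python
# Illustrative only: hypothetical rates, not any cloud provider's real prices.
def cumulative_cloud_cost(initial_tb, monthly_growth, months,
                          storage_per_tb_month=20.0, egress_per_tb=90.0,
                          tb_recovered_per_year=5.0):
    """Rough cumulative cost of cloud backup storage plus egress fees."""
    capacity = initial_tb
    total = 0.0
    for month in range(months):
        total += capacity * storage_per_tb_month   # monthly storage bill
        if (month + 1) % 12 == 0:                  # assume one recovery event per year
            total += tb_recovered_per_year * egress_per_tb
        capacity *= (1 + monthly_growth)           # protected data keeps growing
    return total

# 50 TB of protected data growing 3% per month, tracked over five years
five_year_cost = cumulative_cloud_cost(50, 0.03, 60)
```

Even this toy model shows the key dynamic: because capacity compounds month over month and egress is billed on top, the cumulative bill accelerates rather than staying flat.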

DPaaS typically does not coexist with on-premises data protection infrastructure, so the organization usually goes a cloud-only route. As a result, these costs can scale quickly and significantly. This also puts the organization at risk: DPaaS solutions tend to focus heavily on software innovation and, as a result, might not take advantage of hardware-level innovations. Additionally, many vendors in this space are young startups, which means the customer is relying for critical business continuity on a vendor that might not be in business in five, ten or fifteen years.

In addition to unpredictable costs, the recovery time itself is also unpredictable in a DPaaS model. Recovery time will vary depending on the compute cycles that the cloud service provider has available to apply to the recovery task (many other organizations might also be recovering data at the same time), how much compute is required to complete the recovery, and the cloud provider’s network latency.
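Those three variables — available compute, the size of the recovery, and network latency — can be sketched in a simple estimate. All numbers here are assumptions for illustration, not provider SLAs:

```python
# Back-of-the-envelope only: throughput, contention, and overhead figures
# are assumptions, not measurements or provider commitments.
def estimated_recovery_hours(data_tb, base_throughput_gbps=2.0,
                             compute_share=1.0, latency_overhead=1.1):
    """Estimate cloud restore time in hours.

    compute_share: fraction of provider compute available to this tenant
                   (shrinks when many tenants recover at the same time).
    latency_overhead: multiplier for network latency and protocol overhead.
    """
    effective_gbps = base_throughput_gbps * compute_share
    seconds = (data_tb * 8_000) / effective_gbps * latency_overhead  # 1 TB ~ 8,000 Gb
    return seconds / 3600

# The same 20 TB restore takes 4x longer if only a quarter of the
# provider's compute is available during a busy recovery window.
quiet = estimated_recovery_hours(20, compute_share=1.0)
busy = estimated_recovery_hours(20, compute_share=0.25)
```

The point of the sketch is that two of the three inputs (compute contention and latency) are outside the customer's control, which is exactly why the recovery time is unpredictable.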

Bringing consumption-based payment models to on-premises infrastructure is a compelling answer to the economic unpredictability of data protection. This approach enables customers to avoid heavy upfront capex investment in on-premises infrastructure, as in a DPaaS model, while introducing granular usage metering and scalability (both up and down) so that customers pay only for the resources actually in use.
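The metering idea can be sketched simply: bill each period on measured usage, perhaps above a reserved baseline. This is a hypothetical model for illustration, not any vendor's actual pricing scheme:

```python
# Hypothetical consumption-based billing sketch; rate and reserve
# values are invented, not a real vendor's pricing.
def consumption_bill(usage_tb_by_month, rate_per_tb=15.0, reserved_tb=10.0):
    """Bill each month for measured usage, with a reserved-capacity floor."""
    return [max(usage, reserved_tb) * rate_per_tb for usage in usage_tb_by_month]

# Usage scales up and back down; the bill follows it.
bills = consumption_bill([12, 30, 18, 8])
```

Because the bill tracks metered usage rather than provisioned peak capacity, spend rises and falls with the business instead of being locked in up front.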

Our next blog will dig further into how a consumption-based data protection approach enhances predictability. In the meantime, to learn more about consumption-based IT and how it enables a more predictable data protection infrastructure, watch our on-demand webinar, “Consumption-Based Data Management Providing Peace of Mind.”


Senior Analyst, Krista Macomber produces analyst commentary and contributes to a range of client deliverables including white papers, webinars and videos for Storage Switzerland. She has a decade of experience covering all things storage, data center and cloud infrastructure, including: technology and vendor portfolio developments; customer buying behavior trends; and vendor ecosystems, go-to-market positioning, and business models. Her previous experience includes leading the IT infrastructure practice of analyst firm Technology Business Research, and leading market intelligence initiatives for media company TechTarget.