An IT industry insider's perspective on information, technology and customer challenges.

April 23, 2014

My blog has been off the air for the last six days or so. Thousands of daily visitors would get either an error message or, perhaps, a mangled text-only version of the site.

My blog service provider -- TypePad -- successfully fended off a massive DDoS (distributed denial of service) attack, but it took them many days to do so. Thousands of their customers could only watch helplessly, day after day, and hope for the best.

First, I realized just how dependent I had personally become on certain online services: the blog, my calendaring, sync-and-share, financial, travel, etc.

Take any one of them away for a few days, and I'm in a world of hurt -- with no Plan B. Worse, I don't have any good ideas on how to mitigate the risk going forward without serious complexity and cost.

Second, I had to wonder -- how many online services are prepared to respond to a similar extortion attempt? I talk to many folks who are involved in disaster recovery and business continuity -- I'd be curious just how many recognize this relatively new threat, have successfully prepared themselves -- or have even tested their capabilities.

When information is the new wealth, we're all potential victims of digital extortion.

In the last post, we introduced the notions of applications, their containers -- and policy. We also discussed how policy is interpreted by the control plane, while mediating access to services and providing the required perspective to multiple stakeholders.

In this post, we’ll extend our SDS model to discuss data services (snaps, dedupe, etc.) as well as the data plane where data is physically stored and persisted.
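To make the moving parts concrete, here is a minimal sketch of the model described above -- a policy object declares what an application needs, and a control plane maps that policy onto data services (snapshots, dedupe, etc.) exposed by the data plane. All class and method names here are illustrative assumptions for discussion, not any vendor's actual API.

```python
# Hypothetical sketch of the SDS conceptual model: policy is declared per
# application, interpreted by the control plane, and satisfied by data
# services that live in the data plane. Names are illustrative only.

from dataclasses import dataclass, field


@dataclass
class Policy:
    """Per-application storage intent, declared by the consumer."""
    snapshots: bool = False
    dedupe: bool = False
    replicas: int = 1


@dataclass
class DataPlane:
    """Where data is physically stored and persisted."""
    services: set = field(default_factory=lambda: {"snapshots", "dedupe"})


class ControlPlane:
    """Interprets policy and mediates access to data services."""

    def __init__(self, data_plane: DataPlane):
        self.data_plane = data_plane

    def provision(self, app: str, policy: Policy) -> list:
        # Translate declarative policy into a list of required services.
        requested = []
        if policy.snapshots:
            requested.append("snapshots")
        if policy.dedupe:
            requested.append("dedupe")
        # The control plane refuses requests the data plane can't satisfy.
        missing = [s for s in requested if s not in self.data_plane.services]
        if missing:
            raise ValueError(f"{app}: data plane lacks {missing}")
        return requested


cp = ControlPlane(DataPlane())
print(cp.provision("web-tier", Policy(snapshots=True, dedupe=True)))
```

The point of the sketch is the separation of concerns: the application states *what* it needs, and only the control plane knows *how* (and whether) the data plane can deliver it.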

April 15, 2014

If we’re going to dig into software-defined storage, we’re going to need a conceptual model — just so we can keep the discussion organized.

The particular model I’ll be using for this discussion is the one VMware currently uses.

Vendor bias aside, I’ve personally found it the most useful model out there for explaining not only software-defined storage, but exposing important differences as compared to the way things are done today.

The model itself is not bound to any specific technology — but you will find many aspects already implemented in VMware’s current product set.

And an open invitation: if someone has a better model, please share it!

I think of a conceptual model as a precursor to an architectural model. The conceptual model details the functions and how they’d ideally interact; the architectural model instantiates them into a specific set of technologies and use cases.

As with any model, you’ll certainly find familiar functions and concepts — but here they are grouped and abstracted in different ways than you might expect.

If you’re new to this series — and are willing to do some prep — you might want to read this post and this post.

All those zettabytes of ones and zeros need to live somewhere. If they are to be of any value, they must be stored, protected and managed. The more information we produce, consume — and depend on — the more storage matters.

This was true twenty years ago; it will be true twenty years hence.

At the same time, it appears that software is eating our world: extending the power of human intellect in ways that continually surprise us — now often powered by the avalanche of information we are creating about ourselves and the world around us.

In particular, software is transforming how we think about data centers: the technologies and operating principles that enable us to produce, consume and act on information quickly and efficiently.

Software is inevitably changing core data center technologies — compute, network and storage — both individually and how they work together.

I believe this is what makes software-defined storage an interesting and relevant question for IT architects: how can we use software to become far better at storing, protecting and managing information?

If you’re a regular reader of this blog, you know my story: I’m a storage geek at heart, I work for VMware, and I think the next big phase of this industry is software-defined storage.

While I’ve written more than a few posts on the topic, I think I haven’t done justice to an important (yet complex!) set of concepts that promise to forever change the way we store, protect and manage information.

As far as I can tell, there’s a noticeable void.

I have yet to find a suitably weighty or thorough discussion of software-defined storage that satisfies. I never complain without proposing a solution: my goal is to create a series of discussions that go much deeper than what I've seen out there. My hope is that you will find it suitably satisfying.

As I work through the outline, it’s clear I’ll be speaking to those of an architectural bent: enterprise architects, cloud architects, and the like. These are the people who weigh the impact of big technology ideas, and find ways to pragmatically introduce the best of them into their environments.

My purpose is not only to share and educate, but also to discuss and debate — as peers and colleagues.

My game plan is simple: begin with motivations and rationales, move through a handful of key concepts, recap how SDS changes architectural thinking, examine how familiar processes are changed and improved, acknowledge the many obstacles and limitations, and — finally — suggest ways forward for those willing to move ahead.

The usual disclaimer: I am an employee of VMware, and spent a long time at EMC. Content that appears here is neither reviewed nor approved by VMware — although I’ve been collaborating with my colleagues on this.

Despite my best efforts, I will not be able to completely overcome these inherent biases. Then again, I can’t imagine a better set of experiences to discuss the topic at hand :)

For those of you not interested in this particular topic, my apologies in advance. I’ll do my best not to bore you.

When I am done, there will be a considerable amount of content that’s hopefully relevant in other forms. Who knows? Maybe it’ll be turned into a book.

If that happens, I've got a catchy title picked out: Software-Defined Storage For Smarties.

One reason is that I work in the same group that built the product, so of course I want to share some love.

But there’s more: having been in the DR and BC space for a l-o-o-o-n-g time, I think the new vCHS RaaS (recovery as a service) brings something comparatively new to the table.

And, more broadly, this new service clearly reflects the distinctive vCHS “works the way you do” philosophy.

All good.

I can see three use cases for vCHS DR. The obvious one is as a net-new solution for important applications that should be protected, but aren't. There's another market out there of folks who aren't happy with what they've already got for remote recovery (costs, complexity, etc.). And there's a third great use case around tertiary protection, layered on top of an existing remote recovery solution.

April 08, 2014

One aspect of our industry that I find especially annoying is the "pay-to-say" analyst model. The usual scenario is that one vendor wants to discredit one or more other vendors to make themselves look better. They contract with a freelance analyst, who hopefully brings more expertise and the appearance of independence to the table.

The few analysts who use this model fiercely brand themselves "independent", perhaps in the sense that they are not affiliated with one of the big name industry analyst firms.

I guess by the same standard my lawyer is "independent", but I certainly pay for results!

Despite taking substantial criticism from many IT practitioners in a number of forums, George has stubbornly defended his statements, encouraging those who object to offer up a "professional response".

I find myself doing so reluctantly: weighing the need to correct many of George's and Colm's erroneous statements vs. giving unwarranted attention where none is deserved. All of my responses are based on widely published information; a simple Google search can disprove many of the assertions.

And, to be fair, I know many independent analysts who do very good work on behalf of their vendor clients.

April 01, 2014

The recent price cutting from the big public cloud vendors (Google, Amazon, Microsoft) has re-energized the familiar debate about which is more cost-effective: using a public cloud service, or doing it yourself?

While any intelligent answer involves a healthy amount of “it depends”, I found myself thinking about another time and place — where we as an industry were debating the merits of outsourcing vs. keeping IT operations in-house.

Then as now, the primary argument for outsourcing was economic. A lot of people took the bait — but it didn’t always work out well.

Of course, things are certainly different in 2014 — but are they all that different?

March 27, 2014

The positive response to one of my recent blog posts ("A Look In The Mirror: Are You Creative?") made me suddenly aware that there are quite a few folks out there who feel challenged in bridging the gap between their personal creativity and the normal corporate ethos.

If this is you, I know it can be unpleasant at times. You feel frustrated. You doubt yourself. You wonder if it's you, or them.

We invest heavily in our jobs and careers, and we would like more from the investment than our paycheck. We want very much to make meaningful contributions, and hopefully be recognized for those contributions. I meet these people across many professions: technical, legal, healthcare, education, etc. I've learned to recognize them quickly.

But how to go about bridging that gap between creativity and consistency?

I've been thinking about it, and I've come up with some guidelines that might help if you're so afflicted.