The Pivotal framework will allow businesses to respond more quickly to opportunities by arming their software developers with the proper software engineering tenets and tools for monetizing data.

How did we get here? Or rather, how did EMC get into the application construction business?

The evolution can be traced using the simple picture of "the way life used to be" between applications and data.

Applications used to live right next to the application data.

Starting in the early 1990s and extending right up to this very day, EMC has had a strong focus on increasing the performance, scale, and reliability of the application data store. As these storage systems grew, the IT infrastructure connecting the applications to the storage system grew in complexity and scale.

Continued innovation in the storage of application data drove new innovation in managing application frameworks.

VMware's creation of an application container model revolutionized how applications were "wired" to the application data store. One of the most important benefits of VMware's work has been the reduction of IT provisioning time (the connection of applications to data) from months to days (as displayed below).

This brings us to the current decade, in which EMC (with VMware's help) has moved squarely into the application development space. Pivotal's goal for enterprise software development is similar to what VMware did for IT provisioning: bring the software development time of applications down from months to weeks or days.

This is not EMC's first foray into the application development space. However, Pivotal is focused on helping to write applications that participate in an analytics life cycle (as depicted below).

The picture above has three axes, with the following messaging:

Apps power businesses, and those apps generate data.

Analytic insights from that data drive new app functionality, which in turn drives new data.

The faster you can move around that cycle, the faster you learn, innovate, and pull away from the competition.

When I was thinking about these concepts, my colleague Nikhil Sharma pointed out that Pivotal also brings application development abstraction, much as VMware brought compute abstraction: applications developed using Pivotal tools can run on any cloud platform.

So what are the "software tenets and tools" that facilitate software development speed? And how do these tools sit on top of an architectural stack that connects the applications to the storage?

I also pointed out in a recent post that traditional data centers are now trying to do three things at once as they add analytic capabilities to their existing infrastructure. In fact, the chart below describes the growth in analytic workloads that is projected on top of the X- and Y-axis infrastructure.

If it's true that application workloads drive innovation, then the six architectural areas described above will exhibit interesting new functionality as customers begin piecing them together for next-generation data centers supporting analytic architectures.

Disclaimer

The opinions expressed here are my personal opinions. Content published here is not read or approved in advance by Dell Technologies, does not necessarily reflect the views and opinions of Dell Technologies, and does not constitute any official communication of Dell Technologies.