The IT Imperatives

Before organizations can begin to turn the promise of the digital universe into reality, the following fundamental conditions in the management and deployment of IT will need to be met:

Information security will need to be tightened.

Information privacy will need to be ensured, particularly as more third-party consumer data is incorporated into the operational and analytical frameworks.

Data warehouses will need to be upgraded or swapped out for more flexible data repositories that can handle various data types, automatic tagging, autonomous data “check-in,” and many terabytes. These warehouses must be able to store the vast amount of data on the most efficient infrastructure, bowing to the reality that only a fraction of stored data is actually engaged at any given moment.

Data analytic output will need to be driven to more parts of the organization, including real-time input to operational decision making.

The datacenter will have to be virtualized and cloud computing incorporated into enterprise IT architectures.

IT managers will have to enable and manage the explosive growth in mobile devices that increasingly request and send data from all over the world.

IT managers will need to provide the appropriate security permissions to allow for data to be queried regardless of where it is stored, giving rise to virtual “data lakes.”

While the digital universe is doubling in size every two years, and the number of information “containers” is doubling every 20 months, the number of IT professionals on the planet may never double again – or at least not for 20 years. In the context of the digital universe, the number of gigabytes per IT professional will grow by a factor of eight between now and 2020, while the number of devices on the IoT grows by a factor of two (not counting virtual devices).
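The ratios above follow from simple exponential doubling. A minimal sketch, assuming a six-year horizon (consistent with the stated factor of eight) — the function and baseline year are illustrative, not from the report:

```python
# Exponential doubling arithmetic behind the digital-universe projections.
# Assumption: roughly a six-year horizon to 2020, matching the stated 8x.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """Factor by which a quantity grows if it doubles every
    `doubling_period_years` years."""
    return 2 ** (years / doubling_period_years)

# Digital universe doubles every 2 years -> 2^(6/2) = 8x over 6 years.
data_growth = growth_factor(6, 2)

# Information "containers" double every 20 months -> 2^(72/20) ~ 12x.
container_growth = growth_factor(72, 20)

# If the IT workforce stays roughly flat, gigabytes per IT professional
# grow at the same rate as the data itself.
print(f"data: {data_growth:.1f}x, containers: {container_growth:.1f}x")
```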

None of these challenges is insurmountable, and most companies are already somewhere along each of these vectors. There are now more virtual servers installed than physical servers. Data analytics software is already a $40 billion market growing 10% a year, and security products and services are a $50 billion market. Cloud computing now accounts for 5% of total IT spending and will grow to 10% by 2020. But real transformation to a data-driven or software-defined enterprise is an all-hands-on-deck imperative; IT alone can’t make the transition.

Perhaps the first order of business is information security and privacy. The irony of the digital universe is that, while most of it (almost 70%) is created by the actions of individuals – taking pictures, watching digital TV, being captured on surveillance cameras in airports – enterprises have contact with, and therefore liability and responsibility for, even more of it (85%): account information, email addresses, location stamps, and so on.

Not all of the digital universe needs tight security, of course – old photos still on camera phones, digital TV shows already watched, and so on. But IDC estimates that at least 40% of it requires some level of security, from privacy protection to full-encryption “lockdown.” Unfortunately, less than half of the data that needs protection actually has it. Worse, the share needing protection will grow as more of the IoT comes online.

On an enterprise level, the need for information security will reach outside the boundaries of the organization as third-party and public sector data become folded into ongoing analytics. Is the outside data clean? Have users opted in? Is it accurate? What are the supplier’s privacy policies and assurances to the people the data came from?

The next order of business is probably organization around data. Traditionally, data stored for analysis is kept in data warehouses and must be cleaned and prepped carefully for later retrieval. Data is cataloged in a way that makes it easy to analyze predefined variables according to generally known data types.

But the data needed to support business transformation in the era of the Third Platform is much messier. It tends to be unstructured, diversely formatted, of uncertain accuracy (and sometimes uncertain origin), of unpredictable value, and often flowing into repositories and demanding attention in real time.

The role and importance of the datacenter (whether private cloud, public cloud, or otherwise) is expanding dramatically in the era of the Third Platform. The exponential growth in mobile (and wearable) devices – with little to no embedded storage yet considerable processing power – will require data anytime, anywhere, on any device, and of all types. Datacenters must be prepared to feed this voracious appetite for data.

Together, these technologies and processes can gradually change the hidebound structure of today’s data stores into the more egalitarian and flexible stores of tomorrow, which some vendors call “data lakes.” Data can flow in without prior classification, and tools such as Hadoop, parallel-processing search programs, and complex event processing engines can process and classify it later. Software is augmenting our limited capacity and bandwidth to make sense of the data deluge.
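The “classify later” pattern described above is often called schema-on-read: records land in the lake untouched, and structure is imposed at processing time. A minimal illustrative sketch in Python – the tagging rules, record shapes, and in-memory “lake” are hypothetical stand-ins, not any particular product’s API:

```python
import json
from typing import Any

# Schema-on-read sketch: raw records are checked in without cleaning or
# classification; tagging happens later, at read/processing time.

lake: list[str] = []  # stand-in for object storage or HDFS

def ingest(raw: str) -> None:
    """Check data in as-is, with no prior classification."""
    lake.append(raw)

def classify(raw: str) -> dict[str, Any]:
    """Tag a record at read time; these rules are purely illustrative."""
    try:
        record = json.loads(raw)
        kind = "structured"
    except json.JSONDecodeError:
        record, kind = {"text": raw}, "unstructured"
    return {"kind": kind, "record": record}

ingest('{"sensor": "t-101", "temp_c": 21.5}')   # structured telemetry
ingest("free-form log line: disk nearly full")  # unstructured text

tagged = [classify(r) for r in lake]
print([t["kind"] for t in tagged])  # -> ['structured', 'unstructured']
```

The design point is that `ingest` never rejects or reshapes data – the warehouse-style prep work moves into `classify`, which can be rewritten at any time without re-ingesting anything.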

But these new software tools are not useful unless they are put to work in the service of specific business objectives. In a 2013 survey of 700 large North American enterprises, IDC found that fewer than 1% had achieved the highest level of Big Data usage – where big data is operationalized and continuously delivering measurable process improvement. And these were the companies most likely to be furthest along.