Big-Data Analytics & Cloud: The Perfect Storm

Most signs are pointing to a big increase in investment in big-data analytics and cloud in the coming year.

If you look at the field of business intelligence, it’s as though a storm is brewing, made up of big data and cloud computing, that can give more people scalable, affordable access to more information than ever before. How your organization weathers this storm will determine how well it's able to face the future. Big data has been around for a while, and it potentially allows a company to know more about its customers than ever before -- what their habits are, whether they’re likely to shop around for different products or services, and a range of other things that may not seem relevant but could hold the keys to doing business better.

I've written previously about how this creates an analytical picture that helps people do their jobs more effectively, creating a differential advantage for the savvy organization ready to put more data to work. Just think about TrackX, an asset-tracking and supply chain solution that uses RFID tags, barcodes, and sensor technology to help manage physical assets. If there’s a bottleneck, TrackX helps its users identify where it is and take remedial action. Anyone in the company has access to that information and can act on it. The value of such broad access can’t be overstated. In fact, as cited by GE Software, some $10 trillion to $15 trillion could be added to global GDP over the next 15 years based on the transformational advantages of this industrial Internet (see footnote 1 in the linked PDF).

A report by IDG, the International Data Group, shows that more than 42% of respondents ranked big-data applications among their top five areas of big-data investment. Just as important, 70% of enterprise organizations have already begun, or will deploy, big-data-related projects, and they plan to spend about $8 million each, on average. So we are likely on the cusp of a major boom.

Right now, cloud technology is propelling big-data projects forward, but not without reservations. Recent events, not least the Snowden revelations, have made people wary of allowing their information to be held outside their personal control, let alone by American providers. German law places strict limits on holding certain data outside the country, and French users are displaying a typically Gallic mistrust of anything that doesn’t have a French flag on it. Experts have estimated that billions of dollars have been invested in cloud computing by companies such as Deutsche Telekom and France Telecom, suggesting an important regional divide. For these reasons and others, the near-term cloud architecture bias for many organizations will surely be private clouds, with hybrid and public clouds following in a later enterprise adoption pattern.

Despite any reservations, revenues from public cloud computing are rising all the time. I’ve seen one estimate that has them reaching nearly $100 billion by 2017. Amazon Web Services (AWS) is one of the key providers, and its customers include organizations like NASA, the Obama campaign, Netflix, and the CIA, so this is far more than a momentary fad. Solutions now exist that give organizations access to simple, powerful reporting and analytics starting at about 48 cents an hour. This type of easy access and affordability will drive an entirely new analytics consumption pattern.

And just as with big data, nobody knows exactly what applications will be developed to capitalize on the cloud, or at what level. There is now a high-performance handheld computer in nearly every pocket, and people are using it to reach the cloud for stored music, streaming video, and every type of web-based information. In time, the idea that an organization once maintained its own significant local storage and computing power for every business application will seem bizarre, as shared cloud environments manage more computing tasks end-to-end.

Finally, there is the information itself. Any suite of analytic applications will only be as accurate as the data it uses, and it can only be useful if the products of that data are in the hands of people who can interpret them and apply their findings. This is why the mobile device has gone from something people use to play Angry Birds into one of the most useful tools available to any person or business. A phone or tablet, allied to a cloud application, can give you access to reports and other data wherever you are and allow you to make accurate, relevant, real-time decisions.

I’ve seen estimates that more than half of businesses will rely on mobile devices for insight delivery by 2015, but there is a yawning gap between intention and implementation before the handheld revolution realizes its potential. A Gartner report also revealed that 85% of business users believe remote access to business intelligence tools is important, yet only 8% are actually deploying them.

For me, and for other industry insiders, there is a clear way forward for businesses with the foresight and pragmatism to pursue it. Big data will be applied to solve business problems, using cloud software to open up access and allowing individual users to read and input data on their mobile devices. Implicit in that template for the immediate future is the fact that almost every employee will need to be able to work with data to make the work environment as efficient as it can be. The risks of being caught out in the storm are great, but the benefits of capitalizing on it are huge.

Brian Gentile is Senior Vice President & General Manager of the TIBCO Analytics business unit of TIBCO Software, having joined the company through its acquisition of Jaspersoft Corporation, where he had served as CEO since 2007. Brian has worked deeply in Big Data and Cloud ...