The big theme of the last year has been big data. There was a lot of innovation in many areas, but big data has had a huge impact both on how organizations plan their overall technology strategy and on more specific strategies such as analytics, cloud, mobile, social, and collaboration.

Steve kicked off by addressing the confusion (and cynicism) about the definition of “big data” — noting that people had supplied at least twenty different definitions in response to his question on Twitter. The popularity of the term has been driven by the rise of new open-source technology such as Hadoop, but it is now typically used to refer to what Gartner calls “extreme data”.

Extreme data is on the high end of one or more of the ‘3Vs’: Volume, Velocity, and Variety (and some note that there’s a fourth V, Validity, that must be taken into account: data quality remains the #1 struggle for organizations trying to implement successful analytics projects).

To address all of these effectively, any “big data solution” has to encompass a wide range of different technologies. SAP is proposing a new “Big Data Processing Framework” that includes integration with new tools such as Hadoop, but also addresses the other ‘V’s through an end-to-end approach to ingesting, storing, processing, and presenting data from both structured and less-structured sources. Many more details about this framework will be available in the coming months.

I am an Innovation Evangelist for SAP. This blog contains only my personal views, thoughts, and opinions. It is not endorsed by SAP, nor does it constitute any official communication of SAP. You should follow me on Twitter and subscribe by email.