3 Big Data Challenges: Expert Advice

The "big" part of big data doesn't tell the whole story. Let's talk volume, variety, and velocity of data--and how you can help your business make sense of all three.

The velocity aspect of big data is tied to growing demand for fast insights. That's relative, of course, but according to The Data Warehousing Institute's Big Data Analytics survey, released in September, 13% of analyses are now rerun or rescored "every few hours" or in real time, compared with 35% monthly, 14% weekly, and 24% daily.

There are many examples of data that might demand analysis in real time or near real time, or at least in less than a day. RFID sensor data and GPS spatial data show up in time-sensitive transportation logistics. Fast-moving financial trading data feeds fraud-detection and risk assessments. Marketing analyses, too, are increasingly time sensitive, with companies trying to cross-sell and up-sell while they have a customer's attention.

Combine marketing with mobile delivery, and you've entered the fast-moving domain of Bango Analytics. Bango started out in the mobile payment business, but it discovered some companies were setting the price to zero on its tools, simply to track access to mobile content. So two-and-a-half years ago it started a separate Bango Analytics service promising near-real-time insight.

Bango measures traffic to mobile Websites and ads and the use of mobile apps. The content might be articles in the case of media sites or storefront pages in the case of online retailers. Bango's custom tracking app runs on Microsoft SQL Server, so the company uses SQL Server Integration Services (SSIS) as the workflow engine for overall integration, starting with extraction. It applies transformations, rules, and precalculations using queries and scripts written in SQL, a step that minimizes processing demands in the data warehouse environment. SSIS puts the final results into CSV text files and loads the data into an Infobright-powered data mart.
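The idea of pushing transformations and precalculations upstream of the warehouse can be sketched in miniature. This is a hypothetical illustration, not Bango's actual code: Python and an in-memory sqlite3 database stand in for SSIS and SQL Server, and the table and column names are invented. The aggregation happens in SQL before anything reaches the analytic store, and the result is emitted as CSV text ready for a bulk load.

```python
import csv
import io
import sqlite3

def extract_transform_to_csv(conn):
    """Pull raw click events, pre-aggregate them in SQL (so the
    downstream warehouse does less work), and emit the result as
    CSV text ready for a bulk load into an analytic data mart."""
    rows = conn.execute(
        # Precalculation done in SQL, mirroring the idea of applying
        # transformations and rules before the warehouse load.
        "SELECT site, COUNT(*) AS hits FROM clicks GROUP BY site ORDER BY site"
    ).fetchall()
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["site", "hits"])
    writer.writerows(rows)
    return buf.getvalue()

# Demo: an in-memory database stands in for the operational store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE clicks (site TEXT, ts TEXT)")
conn.executemany("INSERT INTO clicks VALUES (?, ?)",
                 [("news", "t1"), ("news", "t2"), ("shop", "t3")])
print(extract_transform_to_csv(conn))
```

The design point is the same one Bango relies on: aggregating three raw clicks down to two summary rows before the load keeps processing demands out of the warehouse environment.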

To keep transactions running smoothly, Bango copies transactional information into an operational data store on a separate tier of servers, and SSIS extracts the data from there. To get up-to-date information into the company's Infobright analytic data store as quickly as possible, it keeps batches small: only five to six minutes elapse between a click on a mobile device and that interaction becoming available for analysis. Reports and dashboards are delivered to customers through a Web interface. Near-real-time data is "a key selling point of our product, so the faster we can load, the better," says Tim Moss, Bango's chief data officer.
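The small-batch approach can be sketched as a simple generator. This is a minimal illustration, assuming a generic event stream and an arbitrary batch size of 500; the trade-off it captures is that many small loads, rather than one big nightly load, keep end-to-end latency down to minutes.

```python
from itertools import islice

def micro_batches(events, batch_size=500):
    """Yield small batches from an event stream. Loading each small
    batch promptly, instead of accumulating a large nightly load,
    is what lets new interactions reach the analytic store quickly."""
    it = iter(events)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Example: 1,250 events become three loads of 500, 500, and 250.
loads = [len(b) for b in micro_batches(range(1250), batch_size=500)]
print(loads)  # [500, 500, 250]
```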

Bango processes billions of records each month, says Moss, but the company's entire 13-month data store has yet to break into the double-digit terabyte range. Here again, the compression supported by Infobright's column-oriented database helps keep the storage footprint small. Nonetheless, Moss says Bango is preparing for high-scale and high-velocity demands, knowing that the amount of mobile content and the number of mobile campaigns and apps will grow, and with it, pressure to show what's catching on.
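One reason column-oriented stores compress analytic data so well is that values within a single column tend to repeat. A run-length-encoding sketch makes the effect concrete (this illustrates the general principle, not Infobright's specific algorithms; the sample column is invented):

```python
from itertools import groupby

def rle_column(values):
    """Run-length encode one column: consecutive repeated values
    collapse to (value, run_length) pairs. Storing columns rather
    than rows groups repeats together, which is one reason column
    stores compress repetitive analytic data so effectively."""
    return [(v, len(list(g))) for v, g in groupby(values)]

# A country-code column with long runs of identical values.
country = ["GB"] * 4 + ["US"] * 3 + ["GB"] * 2
encoded = rle_column(country)
print(encoded)  # [('GB', 4), ('US', 3), ('GB', 2)]
print(len(country), "values stored as", len(encoded), "runs")
```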

Marketers have spoken on this point. The weekly reports that were good enough two years ago must now be generated daily, and that's led to the development of near-real-time dashboards. This trend has made once-exotic information management techniques such as micro-batch loading and change-data-capture much more common.
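Change-data-capture is usually implemented inside the database (via transaction logs or triggers), but the essential idea is easy to show with a snapshot diff. This is a conceptual sketch with invented table data, not a production CDC mechanism: only the rows that changed move downstream, instead of a full reload.

```python
def capture_changes(old, new):
    """Diff two snapshots keyed by primary key and emit only the
    changed rows -- the essence of change-data-capture, which ships
    deltas instead of reloading whole tables."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    for key in old:
        if key not in new:
            changes.append(("delete", key, old[key]))
    return changes

yesterday = {1: {"clicks": 10}, 2: {"clicks": 5}}
today     = {1: {"clicks": 12}, 3: {"clicks": 7}}
print(capture_changes(yesterday, today))
```

Here two tables of two rows each yield just three change records, which is the whole appeal when daily or hourly refreshes replace weekly reports.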

Taken together, volume, variety, and velocity are emerging as the three-headed beast that must be tamed as IT teams look to turn big data from a challenge into an opportunity. But old demons haven't gone away. Complex data such as supply chain records or geospatial information can prove to be more of a bottleneck than varied data. And a large number of users (1,000 plus), many queries, or complex queries that call on multiple attributes or need complex calculations--all can lead to performance problems. Fail to anticipate demands along any one of these dimensions, and you may outgrow your data warehousing platform much sooner than expected.

The technology for handling big data will get better. In terms of the information management that must be done before the data gets into the warehouse, we're still in the early days. Expect to see new tools, services, and best practices emerge to address the thorniest problems. Of course, your business partners want results now, so draw on your experience and exploit the tools you know. But it's also time to begin experimenting with new approaches. Big data isn't getting any smaller.
