6 Big Data Advances: Some Might Be Giants

In a week full of big data news, announcements from EMC, Intel and Revolution Analytics stand out.

Cloudera makes Hadoop safer.

Here we're entering the realm of incremental announcements of the kind we're seeing from lots of big data platform vendors. The theme is improved reliability, manageability, compliance control and so on, with upstart big data vendors more or less matching functionality already available on more mature platforms. Case in point: Cloudera announced Tuesday that it's introducing Cloudera Navigator, Cloudera Enterprise BDR and version 4.5 of its Cloudera Manager software.

Cloudera Navigator is an all-new data-management tool for Hadoop that provides data access control, provisioning and auditing capabilities. This is essential stuff for security- and privacy-sensitive healthcare, financial services and government organizations, so it's more like checking off an overdue requirement than bringing a breakthrough to Hadoop. The same goes for BDR, which offers enhanced backup and disaster recovery capabilities. Cloudera Manager 4.5 supports rolling updates, so you can now stage upgrades of Hadoop software without bringing down an entire cluster. The update also enhances monitoring capabilities.

Again, this is all stuff that big organizations will demand from Hadoop as they move applications out of pilot and into production. Cloudera is executing on obvious requirements here rather than pioneering breakthrough capabilities.

The key appeal for Microsoft shops, beyond running on Windows, is that Hortonworks' distribution is integrated with Microsoft System Center for administrative control and with Active Directory for access control and security. It also works with Microsoft's virtualization platform, including Hyper-V and the System Center virtualization fabric, making it the first distribution to run Hadoop on virtualized infrastructure. Hortonworks would underscore that it's 100% open source, but there's not much more to say than "it's finally here."

MapR and Google rev their engines.

Performance-oriented Hadoop software supplier MapR and Google announced Tuesday that they have set a new world record for the MinuteSort benchmark, sorting 1.5 terabytes in 60 seconds using Google Compute Engine and the MapR Distribution of Apache Hadoop. The previous record, held by Yahoo with open source Apache software, was about 0.5 terabytes, and that was done with 3,452 physical servers, versus the 2,103 virtual servers employed on Google Compute Engine.
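
To put those figures in perspective, here's a back-of-envelope calculation of the throughput the record implies. The input numbers come from the announcement; the aggregate and per-node rates are our own derivation, not figures published by MapR or Google:

```python
# Back-of-envelope throughput implied by the reported MinuteSort record.
# Input figures are from the announcement; derived rates are our own.

TB = 1_000_000_000_000  # decimal terabyte, as sort benchmarks use

data_sorted = 1.5 * TB  # bytes sorted
elapsed = 60            # seconds (it's MinuteSort, after all)
servers = 2103          # virtual servers on Google Compute Engine

aggregate_bps = data_sorted / elapsed   # cluster-wide bytes per second
per_node_bps = aggregate_bps / servers  # average per virtual server

print(f"aggregate: {aggregate_bps / 1e9:.1f} GB/s")  # 25.0 GB/s
print(f"per node:  {per_node_bps / 1e6:.1f} MB/s")   # about 11.9 MB/s
```

The per-node average looks modest, which underscores that the feat here is coordination at scale, not raw single-machine speed.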

Does that say more about Google's cloud service or MapR's software? "This shows off the ability to use Google Compute Engine to get very high-performance computing very quickly and in a very consistent manner, and you're seeing incredible IO performance," Google Developer Programs Engineer Marc Cohen told InformationWeek.

MapR's Jack Norris, VP of marketing, hastened to add that MapR's architecture -- which uses the Network File System in place of HDFS and implements other proprietary performance tweaks -- takes better advantage of Google Compute Engine power than would rival Hadoop architectures. For now, MapR has an exclusive partnership with Google. Unfortunately, the Google Compute Engine and the MapR Hadoop service running on that engine are still in limited preview release, and Cohen declined to comment on when it might become generally available.

Does a benchmark test have anything to do with real-world use of Hadoop? "A sort is a special case of a large MapReduce job, but any MapReduce work involves a lot of parallelism," said Cohen. "It requires that you bring up and coordinate thousands of machines, so we're proud of the fact that we can do it reliably and very fast."
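
Cohen's point -- that a sort is just a large MapReduce job -- can be sketched in a few lines. This is a toy, single-process illustration of the map/shuffle/reduce decomposition, not MapR's or Google's implementation; all function names are our own:

```python
# Toy sketch of a distributed sort as a MapReduce job: map routes records
# to range partitions, the "shuffle" hands partitions to reducers, and
# each reducer sorts locally. Single-process and illustrative only.

def map_phase(records, num_partitions):
    """Map + partition: route each (key, value) record by key range."""
    partitions = [[] for _ in range(num_partitions)]
    for key, value in records:
        # Range-partition on the key's first byte (0-255) so every key in
        # partition i sorts before every key in partition i + 1.
        idx = min(key[0] * num_partitions // 256, num_partitions - 1)
        partitions[idx].append((key, value))
    return partitions

def reduce_phase(partition):
    """Reduce: each reducer sorts its own partition locally."""
    return sorted(partition)

def distributed_sort(records, num_partitions=8):
    """In a real cluster, partitions would be shuffled over the network to
    thousands of reducer machines; here the shuffle is a list handoff."""
    partitions = map_phase(records, num_partitions)
    result = []
    for p in partitions:
        # Concatenating locally sorted, non-overlapping key ranges yields
        # a globally sorted output -- the crux of benchmark-style sorts.
        result.extend(reduce_phase(p))
    return result
```

The coordination Cohen describes lives in the parts this sketch elides: provisioning thousands of machines, moving partitions across the network, and surviving stragglers and failures mid-job.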

So there you have it. Six announcements from Strata ranging from potentially huge -- if they live up to their billing -- to nice-to-have, expected and, in the last example, a gee-whiz lab stat. We're still at a stage in big data where everybody sees huge potential, but real-world proof that the breakthroughs are as tantalizing as the announcements suggest always seems months, if not years, away.
