Analytics is often described as one of the biggest challenges associated with big data, but even before that step can happen, data has to be ingested and made available to enterprise users. That's where Apache Kafka comes in.

Originally developed at LinkedIn, Kafka is an open-source system for managing real-time streams of data from websites, applications and sensors.

Essentially, it acts as a sort of enterprise "central nervous system," collecting high-volume data such as user activity, logs, application metrics, stock tickers and device instrumentation, and making it available as a real-time stream for consumption by enterprise users.
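The stream model behind that "nervous system" is straightforward: a topic is an append-only log that producers write to, while any number of consumers read from it at their own pace, each tracking its own offset. The following toy in-memory sketch illustrates that idea only; it is not the Kafka API, and the class and field names are invented for illustration:

```python
# Toy model of Kafka's core abstraction: a topic as an append-only log.
# Producers append messages; each consumer keeps its own read offset,
# so many consumers can read the same stream independently.

class TopicLog:
    def __init__(self):
        self._log = []  # append-only list of messages

    def append(self, message):
        """Producer side: append a message, return its offset."""
        self._log.append(message)
        return len(self._log) - 1

    def read(self, offset, max_messages=10):
        """Consumer side: read up to max_messages starting at offset."""
        return self._log[offset:offset + max_messages]


class Consumer:
    def __init__(self, topic):
        self.topic = topic
        self.offset = 0  # each consumer tracks its own position

    def poll(self):
        batch = self.topic.read(self.offset)
        self.offset += len(batch)
        return batch


# Two independent consumers see the same stream of events.
activity = TopicLog()
activity.append({"user": "alice", "action": "page_view"})
activity.append({"user": "bob", "action": "click"})

analytics = Consumer(activity)
audit = Consumer(activity)
print(analytics.poll())  # both consumers receive both events
print(audit.poll())
```

Because consumers pull from the log rather than having messages pushed and deleted, a slow consumer never blocks a fast one, which is a key reason the model scales to many downstream users of the same data.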

Kafka is often compared to technologies like ActiveMQ or RabbitMQ for on-premises implementations, or with Amazon Web Services' Kinesis for cloud customers, said Stephen O'Grady, a co-founder and principal analyst with RedMonk.

"It's becoming more visible because it's a high-quality open-source project, but also because its ability to handle high-velocity streams of information is increasingly in demand for usage in servicing workloads like IoT, among others," O'Grady added.

Since being conceived at LinkedIn, Kafka has gained high-profile support from companies such as Netflix, Uber, Cisco and Goldman Sachs. On Friday, it got a fresh boost from IBM, which announced the availability of two new Kafka-based services through its Bluemix platform.

IBM's new Streaming Analytics service aims to analyze millions of events per second with sub-millisecond response times, enabling instant decision-making. IBM Message Hub, now in beta, provides scalable, distributed, high-throughput, asynchronous messaging for cloud applications, with the option of using a REST or Apache Kafka API (application programming interface) to communicate with other applications.

Kafka was open-sourced in 2011. Last year, three of Kafka's creators launched Confluent, a startup dedicated to helping enterprises use it in production at scale.

"During our explosive growth phase at LinkedIn, we could not keep up with the growing user base and the data that could be used to help us improve the user experience," said Neha Narkhede, one of Kafka's creators and Confluent's co-founders.

"What Kafka allows you to do is move data across the company and make it available as a continuously free-flowing stream within seconds to people who need to make use of it," Narkhede explained. "And it does that at scale."

The impact at LinkedIn was "transformational," she said. Today, LinkedIn remains the largest Kafka deployment in production; it exceeds 1.1 trillion messages per day.
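For a sense of scale, a quick back-of-the-envelope conversion (my arithmetic, not a figure from the article) puts that daily total at an average of roughly 12.7 million messages per second:

```python
# Convert LinkedIn's reported 1.1 trillion messages/day to a per-second average.
messages_per_day = 1.1e12
seconds_per_day = 24 * 60 * 60  # 86,400
messages_per_second = messages_per_day / seconds_per_day
print(f"~{messages_per_second / 1e6:.1f} million messages per second")  # ~12.7
```

Peak rates would of course run higher than this flat daily average.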

Confluent, meanwhile, offers advanced management software by subscription to help large companies run Kafka for production systems. Among its customers are a major big-box retailer and "one of the largest credit-card issuers in the United States," Narkhede said.

The latter is using the technology for real-time fraud protection, she said.

Kafka is "an incredibly fast messaging bus" that's good at helping to integrate lots of different types of data quickly, said Jason Stamper, an analyst with 451 Research. "That's why it's emerging as one of the most popular choices."

Besides ActiveMQ and RabbitMQ, another product offering similar functionality is Apache Flume, he noted; Storm and Spark Streaming are similar in many ways as well.

Up until 2013 or so, "big data was all about massive quantities of data stuffed into Hadoop," said Brian Hopkins, an analyst with Forrester. "Now, if you're not doing that, you're already behind the power curve."

Today, data from smartphones and other sources is giving enterprises the opportunity to engage with consumers in real time and provide contextual experiences, he said. That, in turn, rests on the ability to understand data faster.

"The Internet of Things is like a second wave of mobile," Hopkins explained. "Every vendor is positioning for an avalanche of data."

Technology is adapting as a result.

"Up to 2014 it was all about Hadoop, then it was Spark," he said. "Now, it's Hadoop, Spark and Kafka. These are three equal peers in the data-ingestion pipeline in this modern analytic architecture."


Copyright 2017 IDG Communications. ABN 14 001 592 650. All rights reserved. Reproduction in whole or in part in any form or medium without express written permission of IDG Communications is prohibited.