The need for speed: How fast data could make or break industrial IoT

The industrial internet of things can greatly improve the connectivity, efficiency, and scalability of an organization, while saving time and money

Companies everywhere are on the hunt for new ways to compete in their rapidly evolving markets, and nowhere is this truer than in traditional industries like energy, manufacturing, and transportation. The opportunities range from lowering costs through greater efficiency to more fundamental shifts, such as evolving beyond selling capital goods to providing services and effectively turning expertise into a revenue generator. In every case, these companies are working hard to use technology as an agent of change.

In almost every instance, data is the key asset. Not just data from internal systems, but a flood of data from sensors and related equipment throughout the supply, production, and logistics chains, including sensors monitoring in-field customer deployments. Often referred to under the general umbrella of the industrial internet of things (IIoT), this connection of equipment, suppliers, distribution networks, and more generates a constant stream of data.

The ability to make use of that data can be critical to the success of current and future business operations. Used correctly, the data can significantly increase reliability through proactive maintenance and improved design and operating procedures. It can be used to adapt supply and optimize distribution networks based on demand, cost, price, and more, effectively optimizing in real time to minimize oversupply or undersupply and lost revenue. It can even provide customers with better insight and understanding through direct access to up-to-date information on usage, cost, status, and more.
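To make the oversupply/undersupply trade-off concrete, here is a toy sketch in Python. The function name, prices, and penalty are illustrative assumptions, not figures from any company discussed here:

```python
def imbalance_cost(supplied_mwh, demand_mwh, unit_cost, shortfall_penalty):
    """Toy model: oversupply wastes the cost of generating unneeded energy;
    undersupply incurs a contractual penalty per missing unit.
    All parameters are illustrative, not real tariff figures."""
    if supplied_mwh >= demand_mwh:
        return (supplied_mwh - demand_mwh) * unit_cost      # wasted generation
    return (demand_mwh - supplied_mwh) * shortfall_penalty  # SLA-style penalty

# Oversupplying 110 MWh against 100 MWh of actual demand at $40/MWh
print(imbalance_cost(110, 100, unit_cost=40, shortfall_penalty=120))  # 400
# Undersupplying by 10 MWh at a $120/MWh penalty
print(imbalance_cost(90, 100, unit_cost=40, shortfall_penalty=120))   # 1200
```

Real-time data matters here because the sooner a forecast error is detected, the smaller the gap (and the cost) by the time supply is adjusted.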

Why IIoT projects fail

Even though the promise of IIoT is tantalizing, it has often proved out of reach as many companies stumble in their IIoT projects. In fact, a 2017 study conducted by Cisco showed that almost 75 percent of IIoT projects were failing, with 60 percent not even making it out of the proof-of-concept stage.

For one thing, these IIoT projects can significantly increase IT complexity, and too many have failed in trying to piece together multiple platforms that quickly became too unwieldy to deploy. Similarly, these companies suddenly find themselves struggling with fundamental implications of shifting to a software and services model, such as contractual service level agreements, even as the scale and volume of internal and external connections grow.

Often the culprit is that these companies are attempting to simply retrofit and apply existing batch-data paradigms and processes to the fundamentally new requirements of IIoT. What these companies are missing is that IIoT demands a new approach, one that’s built around streaming and real-time processing, rather than cobbling together solutions that are inadequate for the immediacy needed in an IIoT ecosystem. In that sense, success lies in starting with a clean sheet of paper, simplifying the technology stack, and consolidating systems and technologies to ensure that data processing can keep pace with the velocity of rapidly arriving data.

A successful IIoT project

To help illustrate this point, let’s look at a successful IIoT project involving a large energy company. The company’s objective was to analyze sensor data in real time, assess changes in supply and demand, understand equipment and distribution availability, and alert engineers to potential problems before they escalate.
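As a rough sketch of what "alert engineers to potential problems before they escalate" can look like in code, here is a minimal moving-average threshold detector over a stream of readings. The window size, tolerance, and names are illustrative assumptions, not the company's actual logic:

```python
from collections import deque

def alert_stream(readings, window=5, tolerance=0.2):
    """Yield (index, value, baseline) for readings that deviate from the
    recent moving average by more than `tolerance` (as a fraction).
    A toy sketch of streaming anomaly alerting, not a production detector."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) > tolerance * baseline:
                yield (i, value, baseline)  # flag a potential problem
        recent.append(value)

# A sudden spike at index 5 against a steady ~50 baseline gets flagged
print(list(alert_stream([50, 51, 49, 50, 50, 80])))  # [(5, 80, 50.0)]
```

The hard part at IIoT scale is not this per-sensor logic but running it against hundreds of millions of readings per second with sub-second end-to-end latency, which is where the choice of streaming platform becomes decisive.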

The energy sector comprises a complex supply and distribution network built from a combination of highly sophisticated capital equipment and highly fragile components. Small failures trigger a cascade of events with immediate negative impacts, ranging from local outages to large-scale grid instability. Even without failures there is significant variability, with constant fluctuations in supply, demand, and price. Any inability to adjust to these fluctuations—for example, by inadvertently oversupplying against unexpectedly low demand—wastes energy and thus incurs cost for no benefit. Compounding matters, service level agreements can add stiff penalties for failure to properly predict and meet requirements.

The energy provider in question sought to improve operational efficiency by moving to a data-driven architecture for its internal monitoring environment. But the moment-to-moment changes in the power grid meant that historical batch processing would be woefully insufficient for the task, instead requiring a technology stack capable of processing streaming data as it arrived.

Now we get to the heart of the challenge with IIoT, because few streaming technologies are up to the scale of this task. In the case of the energy provider, the company wanted to alert engineers of any problems using real-time sensor data from throughout the power generation network. Special adapters were used to collect and aggregate the data from tens of thousands of sensors, and each sensor in turn measured tens of thousands of operational data points per second. The result is hundreds of millions of messages per second that must be gathered, processed, analyzed for context, and forwarded to front-end monitoring systems, all in less than one second.
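A quick back-of-the-envelope check shows how those figures compound. The specific counts below are assumptions chosen from within the ranges the text gives ("tens of thousands" of each):

```python
# Illustrative scale check; the exact counts are assumed, not reported figures.
sensors = 20_000                   # "tens of thousands of sensors"
points_per_sensor_per_s = 10_000   # "tens of thousands of data points per second"

messages_per_s = sensors * points_per_sensor_per_s
print(f"{messages_per_s:,} messages/second")  # 200,000,000 messages/second

# At an assumed ~100 bytes per message, that is roughly 20 GB/s of raw
# ingest, all of which must traverse the pipeline in under one second.
bytes_per_s = messages_per_s * 100
print(f"~{bytes_per_s / 1e9:.0f} GB/s")  # ~20 GB/s
```

Multiplying two "tens of thousands" figures lands squarely in the hundreds of millions of messages per second, which is why architectures designed for batch-era throughput fall short here.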

Unfortunately, most streaming technologies struggle at this scale, requiring complex deployments that lead to performance challenges. In fact, the energy company initially attempted to process the flood of data using the tried-and-true Apache Kafka and Apache Spark streaming solutions, only to experience substantial latencies.

At that point, the company made an essential pivot to a solution based on Apache Pulsar, a newer stream-processing platform with a scalable architecture that delivers substantially greater performance. Latencies dropped to an acceptable level, and Pulsar's simplicity and ease of deployment and use allowed the company to go from proof of concept to production, even for this large-scale project, in a little over six weeks.

This is where the magic starts for IIoT, because once the infrastructure to gather and assess data in real time is available, the opportunities start to multiply. For example, the energy company is planning to use the aggregated data to feed a mobile app that allows consumers to better see and understand their energy usage, ultimately allowing them to operate in a more eco-friendly manner. The company is also working to connect additional third-party adapters, allowing it to stream even more varied forms of sensor data for additional insight.

This is but one example of the transformation that’s promised by IIoT. While complex and insufficient IT infrastructure clearly caused stumbling blocks for many initial IIoT forays, a new wave of simpler and more powerful streaming data solutions has hit the market to address these issues. Because, as we’ve seen, the secret to making or breaking IIoT success lies in fast data.

This article is published as part of the IDG Contributor Network.

Karthik Ramasamy has over two decades of experience in real-time data processing, parallel databases, big data infrastructure, and networking. He is co-founder and chief product officer at Streamlio, and was engineering manager and technical lead for Real Time Analytics at Twitter, where he co-created Apache Heron, the company’s real-time processing engine.