As industries such as energy, manufacturing, transportation, and healthcare embark on digital transformation efforts, IoT device and software adoption has been part of the growing global trend to digitize industries and processes. One consequence of having more data is that IoT customers now need real-time processing of the streams coming from all the new sensors and smart devices they are installing. To answer that need, experts are predicting the rapid adoption of edge computing in conjunction with IoT adoption.

In the initial stages of IoT adoption, companies were focused on the information they could collect and analyze. We’re now witnessing the rise of Industrial IoT, where companies take data, integrate it with their core applications, and use it to continually improve (or sometimes reinvent) business processes to save costs and enhance productivity and efficiency. As companies increasingly look to apply AI and ML to their data streams, the potential for continually self-tuning industrial processes arises. Doing so will in many cases require architecting data flows to incorporate some data processing on-premises as well as in other locations, based on latency requirements and bandwidth costs. Edge computing is vital for IoT applications because it reduces latency and the need for complete reliance on the cloud.

IoT Adoption Trends and their relation to the edge

Digital transformation efforts aren’t limited to the factory; there are serious efforts underway to implement smart city initiatives. Whether it’s environmental monitoring, water and waste management, traffic control, or streetlights, IoT helps streamline productivity and workload by allowing devices to communicate with each other. Video cameras are versatile sensors that have been used in many smart city use cases. For example, for video surveillance in public spaces in smart cities, edge computing proved cheaper than cloud over a five-year period, explained Christian Renaud, Research Vice President, Internet of Things at 451 Research, in a recent webinar.

Some popular IoT use cases currently implemented in smart city projects are video surveillance (70%), which generates an average of 16.6 TB of data per month; environmental monitoring (48%), at 13.6 TB per month; and water monitoring and conservation (36%), at roughly 18.3 TB per month, according to 451 Research.

As shown by the video surveillance use case, there is a tremendous volume of data being generated. For actions such as facial or object recognition to be taken, data processing will usually need to take place at the device or nearby. Long-term storage for later analysis could take place on systems further away from the camera network. Industrial and retail uses of video, including product inspection and inventory, would follow a similar pattern of processing and storage, while robotics control systems would typically be placed on the factory floor.
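The pattern described above can be sketched in code: analyze frames near the camera and ship only compact detection events upstream, rather than streaming raw video to the cloud. The frame sizes and the `detect_objects` stub below are illustrative stand-ins, not a real vision pipeline.

```python
# Sketch of the edge video pattern: heavy frame analysis happens near the
# camera, and only small detection events cross the network for long-term
# storage. All sizes and detection logic are assumptions for illustration.

FRAME_BYTES = 2_000_000   # assumed size of one raw video frame
EVENT_BYTES = 200         # assumed size of one detection event (metadata only)

def detect_objects(frame_id: int) -> list[str]:
    """Hypothetical on-device detector: flags every 100th frame."""
    return ["person"] if frame_id % 100 == 0 else []

def process_at_edge(num_frames: int) -> tuple[list[dict], int, int]:
    """Analyze frames locally; return detection events plus the bytes that
    would cross the network with and without edge processing."""
    events = []
    for frame_id in range(num_frames):
        for label in detect_objects(frame_id):
            events.append({"frame": frame_id, "label": label})
    bytes_without_edge = num_frames * FRAME_BYTES  # ship every raw frame
    bytes_with_edge = len(events) * EVENT_BYTES    # ship only events
    return events, bytes_with_edge, bytes_without_edge

events, with_edge, without_edge = process_at_edge(1_000)
print(f"{len(events)} events; {with_edge} B upstream vs {without_edge} B")
```

Even with these toy numbers, the point holds: moving the analysis to the edge shrinks upstream traffic by orders of magnitude, which is why the bandwidth economics favor edge for high-volume sensors like cameras.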

Where (and how) to place IoT workloads?

IoT workload venues vary depending on latency, bandwidth availability, and regulatory requirements, including country-specific rules on data collection. Whether data is processed at the device edge, infrastructure edge, or core cloud, 60 percent of respondents said security is a key factor when choosing the best location, followed by cost (58%), networking connections (53%), infrastructure resiliency (45%), availability of staff and expertise (39%), and latency (30%), according to 451 Research survey data.

Key factors for IoT workload locations

Source: 451 Research

While the use of edge computing can increase application and service efficiency and promises to significantly cut back operational costs, there are challenges as well. Device connectivity alongside IoT security is a common roadblock in the way of digital transformation, and by extension, edge computing as well. Edge compute environments will play a key role in real-time data processing and storage near the source, and in certain applications, 5G mobile networks can really make a difference in transporting data for low latency workloads.

Chart: Typical latency requirements for applications

Source: State of the Edge

Edge workloads require on-premises data analysis and action near the data generation source due to very low latency requirements, while near-edge workloads need on-premises action for low-to-medium latency (see chart above for examples). Workloads that land in the core are usually enterprise applications that can tolerate latency and will leverage high-capacity fiber links for long-term data analysis and archiving.
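The tiering logic above can be sketched as a simple placement rule keyed to a workload's latency budget. The millisecond cutoffs here are assumptions chosen for illustration, not figures from State of the Edge or 451 Research.

```python
# Illustrative mapping from a workload's latency tolerance to a venue.
# The thresholds (10 ms, 60 ms) are assumed for this sketch.

def placement_tier(max_latency_ms: float) -> str:
    """Map a workload's end-to-end latency budget to a placement venue."""
    if max_latency_ms < 10:    # very low latency: act at the source
        return "device edge"
    if max_latency_ms < 60:    # low-to-medium latency: nearby facility
        return "infrastructure edge"
    return "core cloud"        # latency-tolerant: centralize for capacity

# Hypothetical workloads and latency budgets (ms), for illustration only.
workloads = {
    "robotics control": 5,
    "video analytics": 40,
    "long-term archiving": 500,
}
for name, budget in workloads.items():
    print(f"{name}: {placement_tier(budget)}")
```

In practice the decision also weighs the survey factors cited earlier (security, cost, connectivity, resiliency, staffing), but latency is the constraint that most directly rules venues in or out.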

The shift away from the enterprise-owned data center

By 2025, as many as 80 percent of companies will have stopped using traditional data centers, according to Gartner.

“With the recent increase in business-driven IT initiatives, often outside of the traditional IT budget, there has been a rapid growth in implementations of IoT solutions, edge compute environments and ‘non-traditional’ IT,” Gartner says. “There has also been an increased focus on customer experience with outward-facing applications, and on the direct impact of poor customer experience on corporate reputation. This outward focus is causing many organizations to rethink the placement of certain applications based on network latency, customer population clusters and geopolitical limitations (for example, the EU’s General Data Protection Regulation [GDPR] or regulatory restrictions).”

Addressing the need for security to enable growth

Manufacturers have frequently overlooked built-in security for IoT devices, shipping vulnerable products in a rush to gain market share and avoid being left behind. While the US Congress and the UK have been looking into establishing security guidelines for IoT, the regulatory environment, much like the technology, is still in a grey area. That doesn’t mean, however, that companies shouldn’t do more to mitigate the risk of vulnerabilities. Experts are increasingly warning that the prevalence of billions of IoT sensors and devices communicating with each other has created a massive new attack surface that could be compromised in an attack against vulnerable infrastructure.

A lack of industry standards and interoperability problems may have an impact on edge computing. Still, MarketsandMarkets forecasts in its August 2019 report that the inability of cloud infrastructure to process growing workloads, combined with the proliferation of IoT devices and 5G networks, will drive the edge computing market to $6.72 billion by 2022, up from just $1.47 billion in 2017, a CAGR of 35.4 percent. Telecom and IT, two of the verticals with the highest network load and bandwidth demand, are predicted to have the highest market share for edge computing, while retail is predicted to be the most thriving segment due to its extensive use of sensors and cameras for data collection and processing at the edge, rather than in a data center or cloud infrastructure.

Grand View Research, on the other hand, is more conservative in its June 2019 report, anticipating a market worth $3.24 billion by 2025, with North America being the main growth driver. Grand View Research also states that healthcare and life sciences will be the segments with the highest growth rate.

The key takeaway from all of these market forecasts: edge computing is widely seen as an area of immense growth potential and will soon be part of most industries’ digital transformation efforts.