Reality Check

The fallacy of 'fog computing'

Recently, a Wall Street Journal article declared that cloud computing was passé, and the future was in a new “type” of computing being marketed by Cisco called “fog computing.”

This is why reporters should be trained not to parrot the marketing hype of large companies. The mere fact that the two companies mentioned in the article, Cisco and IBM, are bit players in the cloud computing space desperately trying to break into the lead pack should raise a red flag about the article's judgment.

First let’s examine the meaning of fog computing and then explain why government IT managers should avoid using this term.

Given that most of the search results on “fog computing” are from Cisco, we need to go to the source of this term to get any clarity. Here is Cisco’s definition: “Fog computing is a paradigm that extends cloud computing and services to the edge of the network. Similar to cloud, fog provides data, compute, storage and application services to end-users.” Clear as mud, right?

Let me help Cisco out and explain the concept. Basically, the company is trying to make the case that the Internet of Things, which is millions and possibly billions of Internet-connected sensors that are all around us and connected to the cloud, equals “fog computing” in the same way that actual fog is loosely equivalent to “clouds all around us.”

So, Cisco took the physical notion of a “cloud on the ground” and tried to carve out a new computing category by combining two existing categories (Internet of Things + cloud). Now that we understand this sleight of hand, let’s examine why the concept should be avoided.

The concept of fog computing is tied to a physical location, in this case the ground or “clouds all around us.” Unfortunately, this location-dependent concept is the antithesis of the notion of cloud computing, which is based upon non-locational computing. In other words, a significant benefit of cloud computing is the fact that you don’t know where the servers reside, in the same way that you don’t know where your electricity comes from.

Besides aligning the concept of cloud computing with other public utilities, this also lets cloud computing scale up or down as needed precisely because it is not location dependent. Therefore, computing capacity can be delivered from many, many locations.

As a metaphor, fog is generally considered to be a bad thing. The “fog of war” refers to the confusion on the battlefield that can potentially lead to fratricide. So, here again we see that this marketing-driven concept fails to describe this subset of cloud computing in a meaningful way. People generally don’t want to be “in the fog.” Sorry, Cisco.

Fog computing only serves to increase confusion about the cloud. I recently saw a trailer for a new movie called “Sex Tape” where a private, intimate video is accidentally uploaded to the cloud and a frantic couple tries to retrieve all the iPads that synced to the cloud and thus received the video. My favorite dialogue in the trailer is when Jason Segel says, “It went up, it went up to the cloud.” Cameron Diaz sarcastically responds, “And you can’t get it down from the cloud?” To which he angrily blurts, “Nobody understands the cloud! It’s a [F-word] mystery!”

It is this “cloud confusion” that Cisco increases with this unnecessary distraction.

The reality is that the Internet of Things is just one application of cloud computing. Adding more client devices, even sensors, to the cloud is not a new type of computing. In fact, the cloud is a necessary condition for the Internet of Things to work as envisioned. Muddying the waters here helps no one except a company that is trying to insert itself into a crowded cloud conversation by trying to make its Internet routers into edge computing devices.

One of the hardest jobs of government IT managers is to separate the technical wheat from the chaff. In this instance, don’t be fooled by the marketing hype which tries to lead you away from the path of sensible and secure cloud computing. In other words, don’t get lost in the fog!

Reader Comments

Thu, Oct 16, 2014
Harshit Gupta
India

I feel the views expressed in this blog are based on mere definitions. An empirical comparison of the two computing paradigms would be more beneficial for the reader.

Sun, Oct 12, 2014
Unesco Telemedicine

The need for and importance of the fog can be seen in addressing small data in healthcare.
http://www.slideshare.net/OFRoca/small-data-the-fog-iot-iiki2014

Wed, Oct 1, 2014
Jake
Texas

Michael, honest question: in what part of ICT do you work (or more specifically, do you work in telecom)? Your quibbles with the marketing terminology used by Cisco don't really interest me one way or another, but your gross misrepresentation of fog computing is concerning. Was the WSJ article incorrect to say 'forget cloud computing' as though fog computing would replace it? Yes, as the two are complementary concepts; fog computing is an extension of cloud computing for specific use cases. However, the primary draw of fog computing is that it will be needed for ultra-low-latency applications, with response times that even under theoretical lab conditions would be difficult or impossible to achieve through cloud-based processing. One such potential application where fog computing may become necessary is the deployment of centralized RAN, where RRHs can be pooled to a single localized BBU 'hotel.' The latency and jitter requirements for this concept would be much more easily met through fog computing than cloud computing.

Thu, Aug 21, 2014
Adam Drobot
Dallas, TX

The wonder of blogs is that anyone can have an opinion - not always correct, but still an opinion. Cisco has thrown a lot of effort into marketing fog computing - now let's see what it means, because the write-up in this blog is way off the mark. Flavio Bonomi, who actually came up with the term "Fog," is a techie and not a marketeer. What Flavio observed is that there is an architectural trade-off in the Internet of Things as to where the resources are located (computing, storage, communications, ...). They can be in massive data centers, or they can be at the edge but centrally managed. He simply argues that there are many applications where placing primary computing and storage assets close to the edge will result in better performance in terms of latency and jitter, and eventually cost. This is specifically true of real-time applications with tight response requirements. As an example, the AllSeen Alliance effort is defining methods and protocols which allow proximate "Things" to interact locally peer-to-peer without having to ping-pong back and forth to the cloud, and to rely on and share local resources - Fog-style computing and storage. The Industrial Internet Consortium is also on the same track.
