The Rise (Again and Again and Again) of Edge Computing

Where computing power resides is like a sine wave, ebbing and flowing between centralized and on the edge. When I first started working with computers, we all programmed on mainframes via dumb terminals. Computing resources were housed entirely in the mainframe. (BTW – back then we had a kind of early instant-messaging capability, and that's how we'd coordinate what time to go to lunch.)

Then tower personal computers came to the fore and computing resources moved to the edge. That was the end of the company mainframe. Then PCs became "movable" in the form of laptops. But laptops were expensive, so "thin clients" became the rage as companies attempted to control CAPEX, and resources moved back to being centralized. Not sure that really worked, though, since road warriors on planes couldn't get anything done (no in-flight WiFi back then).

Smartphones are really full-fledged computers now. And between smartphones, tablets, and laptops of all forms, the power has moved back to the edge.

But has it? With the rise of cloud computing, everything is being centralized again. Or was, because now we've seen the (inevitable, I have to say) latency issues with the cloud, especially as it relates to real-time communications. With virtualization, you can now move "cloud" resources to the edge, closer to the customer, maybe even on premises, since location and function can be separated. An enterprise can still have a cloud, but it can choose what to keep on premises (edge computing) versus what truly belongs in the cloud. It might make that choice because of security concerns, latency concerns, or the cost of backhauling data.
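The placement trade-off above (latency, security, backhaul cost) can be made concrete with a toy decision rule. This is only an illustrative sketch: the function name, thresholds, and the 80 ms cloud round trip are all hypothetical, not anyone's real deployment numbers.

```python
# Toy sketch of the edge-vs-cloud placement choice: keep a workload on
# premises when latency, security, or backhaul cost argues for it,
# otherwise centralize it in the cloud. All thresholds are hypothetical.

def place_workload(latency_budget_ms: float,
                   sensitive_data: bool,
                   daily_backhaul_gb: float,
                   cloud_round_trip_ms: float = 80.0,
                   backhaul_limit_gb: float = 500.0) -> str:
    """Return 'edge' or 'cloud' for a single workload."""
    if cloud_round_trip_ms > latency_budget_ms:
        return "edge"   # real-time traffic can't absorb the cloud round trip
    if sensitive_data:
        return "edge"   # security concerns keep the data on premises
    if daily_backhaul_gb > backhaul_limit_gb:
        return "edge"   # backhauling this much data costs more than local compute
    return "cloud"

# Real-time voice: a 50 ms budget can't tolerate an 80 ms cloud round trip.
print(place_workload(latency_budget_ms=50, sensitive_data=False,
                     daily_backhaul_gb=10))    # edge
# Nightly batch reporting: latency-tolerant, non-sensitive, modest volume.
print(place_workload(latency_budget_ms=5000, sensitive_data=False,
                     daily_backhaul_gb=10))    # cloud
```

In practice the decision is messier than three if-statements, but the point stands: it is a per-workload choice, not an all-or-nothing move to the cloud.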

There are more choices now, and it's really a hybrid world where you can pick and choose what you want. But in the end, it's all one big sine wave.