The Internet of Autonomous Things: Strong Devices, Weakly Connected

A look around the Internet of Things space will have you convinced that the future belongs to small, weak devices, sensors and actuators in constant communication with powerful intelligence in the cloud, which does all the interesting work. You can see this in the fact that almost every "Internet of Things Platform" is a data collection and command pushing platform. Each has a twist: sometimes its own hardware, sometimes an innovation in the networking stack or the business model. But the underlying assumptions are surprisingly stable: weak devices, strongly connected.

Have a look at the trends, though. Processing power and memory keep getting faster, smaller, and cheaper. Just as we got used to the Raspberry Pi and the MK808, we got Intel's Edison, a dual-core chip with half a GB of RAM and top-tier connectivity. Then, just today, the Raspberry Pi Foundation released its own Compute Module, aimed at production use cases.

Mobile spectrum, on the other hand, isn't a given. Spectrum is a common resource divided between devices, and subject to physics and the telcos' willingness to invest, not to Moore's law. Despite the recent introduction of 4G, the UK is already fretting about the coming mobile spectrum crunch, and is investing heavily in R&D to avoid it. Perhaps you'd think this only applies to rural areas, until you check your mobile phone's reception in the City of London during rush hour. When everyone's phone is pinging for data at maximum power, there's not enough to go around. Now think of a world with 6.5 devices per inhabitant. And this isn't to say rural areas aren't an issue: broadband and a decent mobile signal can only be hoped for in well-covered urban centres, and even that only goes for the developed world.

Trends are good, but use is king. The bottom line is that you don't go far in the embedded/connected world before you see the need for autonomous behaviour, beyond simply streaming data to and receiving commands from a cloud.

Filtering: For bandwidth, storage, cost, power, or legal reasons, there is often a need to intelligently filter the information that gets broadcast to the cloud.

Reaction speed: Sometimes immediate action is needed, and there is no time to wait for the cloud to opine on a situation. Think of the ABS system in your car waiting to be told the correct action to take.

Quality of service: You don't want your device becoming inoperative when the network goes down, nor do you want it ignoring data input until it can get back online.
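Filtering and quality of service can be sketched together in a few lines of device-side code. This is a minimal illustration, not a real API: `handle_reading`, `network_up`, `send_to_cloud`, and `THRESHOLD` are all hypothetical names assumed for the example.

```python
import collections

THRESHOLD = 0.5  # filtering: only report changes larger than this
buffer = collections.deque(maxlen=10_000)  # bounded offline backlog
last_sent = None

def handle_reading(value, network_up, send_to_cloud):
    """Filter a sensor reading and queue it locally while offline."""
    global last_sent
    # Filtering: drop readings too close to the last reported value,
    # saving bandwidth before anything ever leaves the device.
    if last_sent is not None and abs(value - last_sent) < THRESHOLD:
        return False
    last_sent = value
    buffer.append(value)
    # Quality of service: keep accepting input regardless of the
    # network, and flush the backlog only when we are connected.
    if network_up():
        while buffer:
            send_to_cloud(buffer.popleft())
    return True
```

The device keeps reacting to input whether or not the cloud is reachable; connectivity only affects when the filtered backlog is delivered.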

Remember, this is a trend we've already seen on the Web. The browser started as a weak outpost that could basically render some text. Now it runs entire applications (including, apparently, high-end games). We expect our web apps to keep working when we're offline, and the UI to react without the lag of a server round-trip.

Putting it all together, this is a dimension of the future Internet of Things we see coming: strong devices, weakly connected to their cloud masters. Devices running complex, high-level code, making real decisions, and continuing to work with or without network connectivity. This is not to say the weak-device scenarios are invalid, but the realities of network availability and real use cases will often push them towards more autonomous behaviour.

Team Resin.io

Resin.io allows you to 'git push' to your hardware devices, like you can today with servers in the data centre. We cross-compile your code in the cloud, turn it into a Docker container, and ship it!