AI, Big Data and the cloud – how do they impact IoT?

IoB Insiders: Professor William Webb, CEO of the Weightless SIG, discusses what is needed before IoT can benefit from AI, Big Data and the cloud.

There is much discussion at present of artificial intelligence (AI), Big Data, mobile edge computing (MEC) and other concepts for bringing processing power to bear on the Internet of Things (IoT). What difference will these really make?

To understand this, it is worth remembering that almost always the biggest problem in IoT is getting the data from the sensor or device to the cloud. For the most part sensors are cheap and simple, and developing central processing algorithms is not overly difficult. But deploying a wireless solution that is cheap, ubiquitous, enables ten-year battery life and is globally harmonized is very tough. The closest we have got to date is GPRS, but that is now moving towards end of life. Challengers such as Sigfox are slowly growing, but all have flaws.

This means it is often premature to talk about the benefits of Big Data. The concept of learning from large datasets is excellent, and it seems highly likely that many new insights will emerge. Big Data has already been applied to data derived from people, for example to examine the links between commuting and illness, and, given a mass of new data from the IoT, it could doubtless deliver many more insights. But this cannot happen until the data exists, and with perhaps only one to two percent of predicted IoT devices deployed it is still very much early days.

AI is somewhat different, in that it enables devices to act appropriately. This is most immediately obvious in the responses of a device like the Amazon Echo, or in a smartphone's ability to display appropriate reminders and contextual information. These are not really IoT devices so much as points of interaction with the Internet. Most IoT devices have limited interaction and just relay information, such as whether the trash is full or what the temperature is. This information is then sent to the cloud, where it could be an input to an AI algorithm which might, for example, activate the home heating earlier, knowing that a cold spell is expected and the home owner is due to get up early. Indeed, adding AI to a device like a fridge is probably wasted effort, since the fridge would need other contextual information to operate intelligently. Broadly, then, IoT devices are unlikely to have AI themselves but will instead be part of a system under the control of an AI entity.
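To make the architecture concrete, here is a minimal sketch of the kind of cloud-side decision described above. Everything in it is an illustrative assumption (the function name, the thresholds, the inputs) rather than any real product's API; the point is simply that the device only reports data, while the decision combines contextual inputs the device itself does not have.

```python
# Hypothetical cloud-side rule combining contextual inputs that no
# single IoT device holds: a weather forecast and the owner's alarm.
# All names and thresholds here are illustrative assumptions.

def heating_start_offset_minutes(forecast_low_c: float,
                                 wakeup_hour: int,
                                 usual_wakeup_hour: int = 7) -> int:
    """Return how many minutes earlier than normal to start the heating."""
    offset = 0
    if forecast_low_c < 0:                    # a cold spell is expected
        offset += 30
    if wakeup_hour < usual_wakeup_hour:       # the owner is due up early
        offset += (usual_wakeup_hour - wakeup_hour) * 60
    return offset

# Cold night (-3 C) and the owner is getting up an hour early:
print(heating_start_offset_minutes(-3.0, 6))  # → 90
```

A real system would replace these hand-written rules with a learned model, but the division of labour is the same: the thermostat relays a temperature, and the intelligence sits elsewhere.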

Moving computing power towards the edge of the network is most often discussed for cellular networks, where it can reduce latency by making content available more quickly. Latency is rarely important for machines, but the concept of more localized processing can be useful in some aspects of IoT, particularly where information can be compressed before transmission. A simple example is a car number-plate camera. The simplest solution would send back a pixel image of each car for the network to resolve; an alternative would recognize the number plate in the device and send back just the characters. The latter would use less than one percent of the bandwidth but would require some local processing. In almost all cases such local compression is cheaper and uses less power than sending raw information, and it also reduces the load on the core network.
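The bandwidth argument is easy to quantify. The sketch below compares an assumed raw camera frame against the plate text alone; the frame size and plate string are illustrative assumptions, not measurements from any real camera.

```python
# Illustrative comparison of payload sizes for the number-plate camera
# example: sending a raw frame versus recognizing the plate on-device.
# The figures below are assumptions chosen only to show the scale.

RAW_FRAME_BYTES = 640 * 480          # assumed 8-bit greyscale frame
PLATE_TEXT = "AB12 CDE"              # example UK-style plate
plate_bytes = len(PLATE_TEXT.encode("ascii"))

ratio = plate_bytes / RAW_FRAME_BYTES
print(f"Raw frame:  {RAW_FRAME_BYTES} bytes")
print(f"Plate text: {plate_bytes} bytes")
print(f"Edge processing uses {ratio:.4%} of the raw bandwidth")
```

Even with these rough numbers, the recognized text is a tiny fraction of one percent of the raw frame, which is why local processing pays for itself almost immediately on a constrained link.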

The underlying message is that all of these computing paradigms hold great promise for the IoT, but only once we have widespread connectivity of devices and availability of data. We are probably still two to five years from an effective connectivity model, during which time we can evolve AI and Big Data in the world of the Internet and connected people, ready for the advent of connected devices.