Interoperability Issues Cloud IoT Vision

The industry needs to overcome interoperability issues if it is to fully tap the promise of the Internet of Things.

Just as microscopes let humans discover bacteria we could not see with the naked eye, the technologies behind the Internet of Things and big data could help us "see" new realities. But to enable such vision we need ideas for how to overcome some interoperability issues.

The assumption that "things" are connected via sensors to networks is common these days. Indeed, factories have been highly instrumented for many years, but if we examine supply chains more closely we often lose visibility of products and processes precisely because of challenges in connecting to networks, which increasingly are wireless.

The Auto-ID Center at MIT developed a set of specifications for passive RFID wireless data collection systems at UHF frequencies, working with sister labs at Cambridge University, St. Gallen, Fudan, Keio, the University of Adelaide, and KAIST. The specs were licensed to GS1/EPCglobal, the global supply-chain standards organization.

The specs include a serialization scheme for uniquely identifying objects that has been widely adopted in the electronics industry and is spreading to other industries. Other wireless techniques for connecting things include LF and HF RFID, Wi-Fi, Bluetooth, ZigBee, and UWB, each with its own frequencies and protocols.
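To make the serialization idea concrete, here is a minimal sketch of parsing an EPC "pure identity" URI of the SGTIN form (urn:epc:id:sgtin:CompanyPrefix.ItemReference.Serial), one of the identifier schemes in the GS1 Tag Data Standard. The field names and the example value are illustrative, not taken from the article.

```python
def parse_sgtin(uri: str) -> dict:
    """Split an SGTIN pure-identity URI into its three dot-separated fields."""
    prefix = "urn:epc:id:sgtin:"
    if not uri.startswith(prefix):
        raise ValueError("not an SGTIN URI")
    company, item_ref, serial = uri[len(prefix):].split(".")
    return {"company_prefix": company, "item_reference": item_ref, "serial": serial}

# Every party in the supply chain can recover the same three fields
# from the same identifier, which is what makes the scheme interoperable.
reading = parse_sgtin("urn:epc:id:sgtin:0614141.112345.400")
print(reading["serial"])  # 400
```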

In addition to being able to "see" things remotely, it would be nice to add telemetry data to these transport protocols. Such data could tell us for instance how hot or cold, or how wet or dry objects are.

We could use diverse sensing techniques to measure physical, chemical, and biological qualities. For example, we could monitor the condition of bacteria, using various mechanical, optical, semiconductor, and biological sensors.
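One way telemetry could ride along with an identity read, as suggested above, is a small record pairing an object's identifier with sensed values and explicit units. This is a hedged sketch; the field names, units, and values are invented for illustration and do not come from any standard mentioned in the article.

```python
from dataclasses import dataclass, field
import time

@dataclass
class TelemetryReading:
    object_id: str   # e.g. an EPC URI identifying the object that was read
    values: dict     # measurement name -> (value, unit); units kept explicit
    timestamp: float = field(default_factory=time.time)

# A cold-chain style reading: how cold and how wet the object is right now.
r = TelemetryReading(
    object_id="urn:epc:id:sgtin:0614141.112345.400",
    values={"temperature": (4.2, "degC"), "humidity": (61.0, "%RH")},
)
print(r.values["temperature"])  # (4.2, 'degC')
```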

Unfortunately, there are very few common sensor specifications, especially for embedded systems that aim to minimize cost and power consumption. As a result, one hospital is running more than 90 network monitoring systems to validate, calibrate, and monitor medical devices and still has poor visibility on network interference, a hospital executive told me recently.

In my opinion, if all sensors, irrespective of their make, were required to communicate through a standard set of high-level commands, such as the AT command set used by modems, the world of IoT would become more manageable. These commands could be formulated around the parameters the sensors measure: temperature, pressure, humidity, weight, location coordinates, and so on. The AT-like command set would then become a high-level interface for all the interconnected sensors.
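A minimal sketch of what that AT-style sensor interface might look like. The command names (AT+TEMP?, AT+HUM?) and the readings are hypothetical, chosen only to mirror the modem convention of "AT+<PARAM>?" queries answered with the value and "OK".

```python
# Fake current readings standing in for real sensor hardware.
SENSOR_STATE = {"TEMP": 22.5, "HUM": 48.0, "PRES": 1013.2}

def handle_at(command: str) -> str:
    """Respond to AT-style queries: "AT" pings, "AT+<PARAM>?" reads a value."""
    cmd = command.strip().upper()
    if cmd == "AT":
        return "OK"
    if cmd.startswith("AT+") and cmd.endswith("?"):
        param = cmd[3:-1]
        if param in SENSOR_STATE:
            return f"+{param}: {SENSOR_STATE[param]}\r\nOK"
    return "ERROR"

print(handle_at("AT+TEMP?"))  # prints "+TEMP: 22.5" and "OK" on separate lines
```

Because the query grammar is the same for every parameter, a new sensor type only has to register a new parameter name, not a new protocol.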

The IoT protocols I have looked at read like wire-level protocols defined by electrical engineers. I think that for better interoperability, IoT protocols need to be formulated so that they carry the required information but afford flexibility. IoT protocols may need to be managed by the IETF for this to happen.
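To illustrate the wire-level versus flexible contrast above, here is the same reading packed two ways. The frame layout (1-byte type code, 2-byte value) and the JSON keys are invented for the example, not drawn from any real IoT protocol.

```python
import json
import struct

# Wire-level: every endpoint must hard-code that type 0x01 means
# "temperature in hundredths of a degree C" to decode these 3 bytes.
frame = struct.pack(">BH", 0x01, 2250)

# Self-describing: any endpoint can read the fields it understands and
# ignore the rest, at the cost of a larger message on the wire.
payload = json.dumps({"measurement": "temperature", "value": 22.5, "unit": "degC"})

type_id, raw = struct.unpack(">BH", frame)
print(type_id, raw / 100)             # 1 22.5
print(json.loads(payload)["value"])   # 22.5
```

The trade-off is the familiar one: the binary frame is compact but brittle across vendors, while the self-describing payload is verbose but lets independently built systems interoperate without a shared type table.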

This year's White House Fellows Sokwoo Rhee, founder of Millennial Networks, an early IoT startup spun out of MIT, and Geoff Mulligan, head of the IPSO Alliance, certainly bring unique capabilities to this general problem area, although it is hard to imagine a single group tackling such diverse issues.

It will be interesting to see how this US Industrial Internet consortium is formed, what role there will be for international partners, and how individuals and companies can contribute.

In the interim, as some of the commenters have suggested, there is a multiplicity of forums, including MIT consortia, for convening suppliers, customers, and designers around a common set of requirements.

And that's only scratching the surface. Imagine all the software and data integration challenges that have grown exponentially with the new healthcare law. EMR will be pretty ineffective if one proprietary system can't read what another proprietary system is saying -- even if the hardware technology makes them compatible.

Sometimes, if you really have to get something accomplished, you simply make it happen. The idea of IoT, if it is assumed to be something brand new (which I don't think is the case), can make people think that, for instance, you sprinkle dozens or hundreds of sensors throughout a facility, they auto-identify themselves, they automatically determine their location, and magically any computer on that network will be running monitoring and diagnostic software to give any operator a clear picture of ... something or other.

But really, this stuff has been happening for decades. The practical solution is that the designer of the system knows ahead of time where these sensors (or other devices, by the way; we're not just talking about sensors) are located, each sensor or other device is uniquely identified to the network, and software provided in the applicable computers monitors and controls these devices as needed, for that particular system in that particular venue.

It's always good to be striving for more plug-and-play standardization. And I'm sure it's going to happen, via an untold number of industry groups and conferences. However, the problem with this idea that "you just turn it on and it works" is that these IoT systems are solving oodles of different problems. Not just one or a few.

For example, since hospitals were mentioned, all hospitals are not the same. The machinery in them is not the same. What has to be monitored and controlled remotely is not going to be the same. And each new medical device, say a new MRI machine design, will naturally come with new features and new parameters to be measured or controlled. So I tackle these problems much more on a case-by-case basis.