IoT Essentials: The Top 30 Design Criteria

Recently, a client asked what features, capabilities, and attributes their Internet of Things (IoT) development project needed in order to make their offering attractive to the North American marketplace. So, I offered the following key points for them to consider, based upon the issues that we have repeatedly faced in deploying other solutions from existing vendor partners.

I am sharing these points here with the idea of helping others create next-generation IoT solutions that deliver comprehensive, ready-for-market deployments. Comments and feedback on improving this list are welcome. Have I missed any pressing issues from your perspective?

Lineman installing a power-line IoT sensor

The following points are not written in any specific order, and their priority will change based upon the developer, the project, and the maturity of the development.

Develop an end-to-end ecosystem. Providing only parts of a solution is largely problematic and shifts the responsibility for systems integration onto the end users. Often, they have no ability to tweak software for incompatibility issues, which are common at this stage of the IoT lifecycle. So, no science projects!

Adhere to an open architecture, standards-based approach. Avoid customized or modified solutions.

Ensure that all radio solutions come with full compliance to the domestic regulatory conditions; for North America these can include, but are not limited to, FCC, ISED, UL, and CSA. Other standards for RFI/EMI and safety/explosion-proofing may be required in specific industries – mining, oil & gas, nuclear, utilities, etc.

IoT has been earning a poor reputation over security concerns, so incorporate a robust security solution into the offering that uses NIST-compliant data encryption wherever required.
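As a minimal sketch of NIST-aligned message protection, the snippet below authenticates sensor payloads with HMAC-SHA-256 (NIST FIPS 198-1) using only the Python standard library. A full solution would also encrypt with an authenticated cipher such as AES-GCM (NIST SP 800-38D) via a vetted crypto library; the payload format and key handling here are purely illustrative.

```python
import hashlib
import hmac
import secrets

TAG_LEN = 32  # HMAC-SHA-256 produces a 32-byte tag

def sign_payload(key: bytes, payload: bytes) -> bytes:
    """Append an HMAC-SHA-256 tag so the headend can detect tampering."""
    tag = hmac.new(key, payload, hashlib.sha256).digest()
    return payload + tag

def verify_payload(key: bytes, message: bytes):
    """Return the payload if the tag verifies, else None."""
    payload, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # compare_digest avoids timing side channels on the comparison
    return payload if hmac.compare_digest(tag, expected) else None

key = secrets.token_bytes(32)  # per-device key, provisioned at on-boarding
msg = sign_payload(key, b'{"temp_c": 21.5}')
tampered = msg[:-1] + bytes([msg[-1] ^ 1])  # flip one bit of the tag
```

In practice the shared key would be provisioned during device on-boarding and rotated by the headend platform.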

Use blockchain wherever applicable.

Security needs to be harmonized and built in from the start, not bolted on later.

Offer edge computing as an option in your solution, since the trend is towards “pushing the intelligence to the edge”. Experts predict that by 2025 perhaps 50% of all raw IoT data will live only on the network fabric and will never go to the centre (cloud or data centre).

IoT networks are constrained networks, so using the right constrained-network protocols is critical. Today, CoAP and MQTT are dominant. Backend systems often can only support one or the other, so ensure that your solution can use both, or provide real-time protocol conversion while the data is in flight over the network.
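One piece of such a conversion is mapping between CoAP resource paths and MQTT topic names. The sketch below shows that mapping as pure functions; the `tenant/...` topic layout is a hypothetical convention, and a real bridge would sit on top of actual protocol stacks (e.g. libraries such as aiocoap and paho-mqtt) rather than strings alone.

```python
def coap_to_mqtt_topic(uri_path: str, tenant: str = "acme") -> str:
    """Map a CoAP resource path like '/sensors/42/temp' to an MQTT
    topic under a tenant prefix (illustrative naming scheme)."""
    segments = [s for s in uri_path.split("/") if s]
    return "/".join([tenant] + segments)

def mqtt_to_coap_path(topic: str) -> str:
    """Inverse mapping: strip the tenant prefix to recover the CoAP path."""
    _, _, path = topic.partition("/")
    return "/" + path
```

Keeping the mapping bidirectional and lossless is what lets one backend serve devices speaking either protocol.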

Artificial Intelligence is dominant today, and most buyers expect to leverage it in their solutions. IoT churns out Big Data, and the only practical means to make meaningful sense of that data is AI. Therefore, plan on including AI, or a path to add AI, in your solution.

IoT is an IPv6 solution. So, respect that aspect.
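Respecting IPv6 starts with handling its addresses correctly. A small sketch using Python's standard `ipaddress` module, which can classify the address scopes an IoT deployment will encounter (link-local mesh traffic versus globally routable addresses):

```python
import ipaddress

def classify_v6(addr: str) -> str:
    """Classify an address string for IoT addressing checks."""
    ip = ipaddress.ip_address(addr)
    if ip.version != 6:
        return "not-ipv6"       # legacy IPv4 device
    if ip.is_link_local:
        return "link-local"     # e.g. fe80::/10, mesh-local traffic only
    return "global" if ip.is_global else "other"
```

A headend could use a check like this to reject devices that cannot present a routable IPv6 address.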

Seamless integration with other technologies like NB-IoT, LTE-M1, wired, and private wireless networks needs to be transparent and devoid of excessive latency at connection points.

While cloud-based headend platforms are desired, some customers will also want to host their own platforms.

Integration with third-party platforms is valuable too.

The headend platform must manage all IoT services, regardless of whether you provide your own or use a third-party provider. These services include: on-boarding, authentication, security, data management, monitoring and alarming, software updates, trouble ticketing, published APIs to other platforms, remote configuration and management, re-imaging of devices, traffic grooming, etc.

If the headend platform is hosted in the cloud, be cognizant of round-trip latency for on-boarding and security when traffic flows globally. Latency of over two minutes has been seen from Canada to France and back for a LoRa end node's on-boarding and authentication process, which timed out the process and rendered the solution useless. Hosting cloud platforms domestically may be necessary as a result.

Data residency is rising to become a critical issue. Global hosting can be impossible due to privacy legislation, and not all countries share equal levels of privacy protection. For some applications, such as medical or cannabis use, the data cannot transit certain borders due to privacy implications.

Multi-tenancy, with many users sharing common IoT infrastructure, is critical and necessary. So, isolating traffic flows and protecting the rights of each user on the shared infrastructure is essential.
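At its simplest, that isolation can be enforced as a namespace check at the message broker. The sketch below shows a minimal authorization rule assuming a hypothetical `tenants/<id>/...` topic layout; a production broker would layer this under authenticated identities and per-tenant ACLs.

```python
def may_access(tenant_id: str, topic: str) -> bool:
    """A tenant may publish or subscribe only under its own prefix.
    The 'tenants/<id>/...' layout is an illustrative convention."""
    return topic.startswith(f"tenants/{tenant_id}/")

def filter_subscriptions(tenant_id: str, requested: list) -> list:
    """Keep only the topics this tenant is entitled to see."""
    return [t for t in requested if may_access(tenant_id, t)]
```

The point is that isolation is decided by the shared infrastructure, never left to well-behaved clients.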

Data storage at the extreme edge, the edge, the cloudlet, and the cloud, as well as logging for archival and legislative purposes, is important for robustness, redundancy, collaboration between apps, compliance, trends, patterns, computational reasons, and more. The ‘time to live’ issue is becoming critical for data storage, and protecting the data wherever it is stored is critical too.

External sources of data that stream to an edge computing device dictate bidirectional traffic flows. An example might be weather data feeding a farm irrigation system so that it holds off watering on rainy days.

Node-to-node communication, whereby end nodes collaborate with one another, is also expected to emerge as a significant feature. So, permitting data flows to move organically as required is key; shipping data to the centre and then back to the edge is problematic.

Exception data is expected to be central to any data model. Why ship data to the cloud if nothing has changed at the edge? Just transmit the exceptions.
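Report-by-exception is often implemented as a dead-band filter: a reading is transmitted only when it moves meaningfully away from the last value that was sent. A minimal sketch, with the band width chosen arbitrarily for illustration:

```python
class DeadBandReporter:
    """Emit a reading only when it drifts beyond `band` from the last
    reported value (report-by-exception)."""

    def __init__(self, band: float):
        self.band = band
        self.last = None  # no reading reported yet

    def update(self, value: float):
        """Return the value if it is worth transmitting, else None."""
        if self.last is None or abs(value - self.last) > self.band:
            self.last = value
            return value   # exception: transmit to the cloud
        return None        # suppress: no meaningful change at the edge
```

On a slowly changing signal this can suppress the vast majority of uplink messages, which matters on constrained networks.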

Derived data is a summary of the raw edge data; it shares a condensed version, or a proxy, with the cloud when there is little value in capturing the full raw dataset in the cloud.
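Concretely, an edge device might condense a window of raw readings into a small derived record before uplink. A sketch of such a summary, with the field names chosen only for illustration:

```python
from statistics import mean

def summarize(window: list) -> dict:
    """Condense a window of raw edge readings into one derived record
    sent to the cloud in place of the raw stream."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": round(mean(window), 2),
    }
```

Shipping one record per window instead of every sample cuts uplink volume roughly in proportion to the window size, while the raw data can still be retained at the edge.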

Data on-demand is the ability to flow raw datagrams to the cloud when necessary. This would not be a normal scenario, but one that is often event triggered.
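Data on-demand pairs naturally with the previous two patterns: the edge keeps a rolling buffer of raw datagrams and ships them only when an event (an alarm, an operator request) asks for them. A minimal sketch, with the capacity chosen arbitrarily:

```python
from collections import deque

class EdgeBuffer:
    """Keep the most recent raw datagrams at the edge; ship them to
    the cloud only when an event requests them."""

    def __init__(self, capacity: int = 1000):
        # deque with maxlen silently discards the oldest entries
        self.buf = deque(maxlen=capacity)

    def record(self, datagram: bytes):
        self.buf.append(datagram)

    def flush_on_demand(self) -> list:
        """Return the buffered raw data and clear the buffer."""
        raw = list(self.buf)
        self.buf.clear()
        return raw
```

Normal operation sends only exceptions and summaries; the raw history travels upstream only on these triggered flushes.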

Users want solutions that are interchangeable and vendor agnostic. They want to tender for devices to add to the IoT network and attach these devices seamlessly. We have this capability with Wi-Fi today and they want the same approach used in IoT.

Develop your solution incrementally and with an agile approach. It is okay not to have every feature on day one, but a roadmap and a vision of where the solution will evolve are required. Begin with the end in mind, and develop your solution as a journey in order to get to market quickly and efficiently and realize revenue. Continual improvement and feature-set enhancement are expected over the lifecycle of the offering.

Listen to your customers. Ask them what they want and which features and capabilities are important to them. Too often, we see solutions developed in isolation from practical application, and a mismatch of fit results. ‘Fit’ is the most important idea for a customer.

Virtual Reality (VR), Augmented Reality (AR), and data visualization are critical. With Big Data, how will users make sense of, and comprehend the meaning of, the data? So, devise a way for customers to understand the results easily and quickly.

With Big Data, never forget the five “Vs” (though some would argue there are even more):

Volume – though “big data” doesn’t need to be of any specific size, we can safely say that you will not be able to load big data sets into Microsoft Excel

Velocity – just how fast data is being received, as well as how quickly the data needs to be analyzed so it can be used to make meaningful decisions

Variety – the number of data sources that make up your datasets, including sensor data, plain text, rich documents, video, social analytics, etc.

Veracity – how reliable your datasets are, which is especially important because if you cannot trust the data in the first place, no amount of analysis will yield good results.

Visualization – how do we achieve meaning and understanding from the data

Data cleanliness – a Data Scientist once told me that she spends 85% of her time preparing and cleaning the data before use. Clean data is an imperative. So, how can we reduce this data preparation process and have clean data from the outset?
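Part of reducing that preparation burden is validating data as close to the sensor as possible. A minimal sketch of edge-side cleaning; the plausibility bounds below assume a hypothetical outdoor temperature sensor in degrees Celsius and are purely illustrative:

```python
def clean(readings: list, low: float = -50.0, high: float = 60.0) -> list:
    """Drop missing values and physically implausible outliers before
    analysis, so downstream consumers receive clean data by default."""
    return [r for r in readings if r is not None and low <= r <= high]
```

Rejecting bad samples at the point of capture is far cheaper than having a data scientist scrub them out of the lake later.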

Understand exactly what IoT is and is not. The definition is often blurry, as everyone is jumping on the proverbial bandwagon and calling just about every solution an IoT solution. This myriad of descriptions often amounts to misdirection and marketing ploys. So, beware of false claims and confusing inclusions.

Data visualization is essential to comprehending the big data derived from IoT flows

This list is meant as a starting point for design and issue considerations for new IoT solution offerings. It is not complete, and you may have issues that need to be added. Hopefully it is a good starting point and saves you the hassles and challenges that I endured to reach this point in understanding.

It is not mandatory to resolve every feature and capability within your stand-alone offering. You can solve these issues by collaborating and partnering. For example, maybe you make a great sensor and gateway solution, but do not have a headend solution. If you make your IoT technology compatible with the Cisco LoRa or Mesh 2.0 offerings, that will get you a great headend. In that case, you can leverage Cisco’s comprehensive headend to augment your IoT end node devices and make it a complete offering. The same can be said of leveraging IBM’s Cloud, Analytics, Blockchain, and Artificial Intelligence – Watson IoT. Through collaboration, teaming, partnering, and applying open standards you can still get to the desired outcomes. So, you do not have to make every aspect of the IoT ecosystem yourself, but can achieve it all in other creative ways through collaboration.

About the Author:

Michael Martin has more than 35 years of experience in systems design for broadband networks, optical fibre, wireless and digital communications technologies.

He is a Senior Executive with IBM Canada’s GTS Network Services Group. Over the past 13 years with IBM, he has worked in the GBS Global Center of Competency for Energy and Utilities and the GTS Global Center of Excellence for Energy and Utilities. He was previously a founding partner and President of MICAN Communications and before that was President of Comlink Systems Limited and Ensat Broadcast Services, Inc., both divisions of Cygnal Technologies Corporation (CYN: TSX).

Martin currently serves on the Board of Directors for TeraGo Inc (TGO: TSX) and previously served on the Board of Directors for Avante Logixx Inc. (XX: TSX.V).

He serves as a Member, SCC ISO-IEC JTC 1/SC-41 – Internet of Things and related technologies, ISO – International Organization for Standardization, and as a member of the NIST SP 500-325 Fog Computing Conceptual Model, National Institute of Standards and Technology.

He served on the Board of Governors of the University of Ontario Institute of Technology (UOIT) and on the Board of Advisers of five different Colleges in Ontario. For 16 years he served on the Board of the Society of Motion Picture and Television Engineers (SMPTE), Toronto Section.

He holds three master’s degrees, in business (MBA), communication (MA), and education (MEd). As well, he has diplomas and certifications in business, computer programming, internetworking, project management, media, photography, and communication technology.

