Smart Cities

A Smart City blends current and emerging technologies to allow a city to better manage its assets and deliver value to its residents. It is an emerging concept and still very much in exploration. The 2 core technology areas being investigated as the primary value creators are ICT (Information and Communications Technology) and the IoT (Internet of Things).

Smart City

What isn’t fully understood yet are the relationships between any or all of the items in the list below:

what is worth measuring?

how to measure it (what sensor, what platform)?

how often?

in what detail?

to learn what from?

how quickly to transport the reading?

how much will it cost to transport the data?

via what technologies?

stored how?

accessed how?

analysed how?

Quite a big list.
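The list above is really a specification for a sensor data pipeline. As a rough illustration of how those questions become concrete, here is a minimal Python sketch of a reading record that carries several of them as metadata. All of the field names and example values are hypothetical, not a real schema.

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class SensorReading:
    """One field reading plus the metadata the questions above ask about."""
    sensor_id: str    # which sensor took the reading (how to measure it)
    quantity: str     # what is worth measuring
    value: float
    unit: str         # in what detail
    timestamp: float  # how often -> spacing between successive timestamps
    transport: str    # via what technology the reading travelled

# Example reading, serialised as JSON (stored how / accessed how)
reading = SensorReading("node-17", "air_temperature", 21.4, "degC",
                        time.time(), "LoRa")
payload = json.dumps(asdict(reading))
```

Even a toy record like this forces answers to half the list, which is rather the point.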

Did you know there is a Smart Cities Plan for Australia? I only recently found out. And if you read through it there are more questions than answers. Which I think is the right balance given where we are positioned in trying to understand what is possible versus what is useful.

Smart Cities Plan

There are some obvious areas already being tackled by ICT systems. These include:

transport logistics (road, rail, freight, air, sea)

public transport

utility services (gas, water, electricity, waste)

weather prediction

environmental monitoring

And there are a range of trials underway to understand what a broader mix of more widely deployed sensors might do to improve amenity, even if they aren’t all very high quality sensors. Again the questions come back to:

what sensors?

how many and where?

how accurate?

how much do they and their platform cost?

measured how often?

at what latency?

what to do with the data?

Smart Cities Segments

IoT Challenges

Although the Internet of Things (IoT) has a huge promise to live up to, there is still a lot of confusion over how to go about it. This breaks up into 3 distinct areas.

IoT Hardware

The first is the IoT Hardware device that is deployed to the field. These come in a wide range of shapes, sizes, power profiles and capabilities. So we are seeing everything from full computing platform devices (Windows, Linux, Other) deployed as well as tiny resource constrained platforms such as Sensor Node devices. Examples of the latter are Wimoto Motes and our own FLEXIO Telemetry devices, which are OS-less Sensor Nodes.

The trade offs are between:

power consumption

power supply

always online versus post on a schedule or by exception

cost (device, data, installation, maintenance)

size

open standard versus proprietary

upgrade capable (over the air OTA firmware or software capability)

security
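One of those trade-offs, always online versus post on a schedule or by exception, can be sketched as a simple decision rule. This is an illustrative sketch only, not the logic of any particular device; the parameter names and thresholds are assumptions.

```python
def should_transmit(now, last_post, interval_s, value, last_value, threshold):
    """Post on a schedule OR by exception: transmit when the reporting
    interval has elapsed, or when the value has moved by more than the
    exception threshold since the last report."""
    if now - last_post >= interval_s:
        return True                       # scheduled report is due
    if abs(value - last_value) > threshold:
        return True                       # exception: significant change
    return False

# Example: 15-minute schedule with a 0.5 degC exception threshold
due = should_transmit(now=900, last_post=0, interval_s=900,
                      value=21.0, last_value=21.0, threshold=0.5)  # True
```

The battery cost of a device is dominated by how often this function returns True, which is why it sits near the top of the trade-off list.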

As of a month ago, the KPMG IoT Innovation Network reported there are 450 different IoT platforms available. And most don’t talk to each other. Many lock you in. Many only work with their specific hardware. So picking a hardware platform is only part of the challenge. And new products appear every week.

IoT Innovation Network

IoT Communications

The second area of challenge is the communications. Everyone is trying to get away from Cellular IoT Communications because the Telecommunications Companies’ pricing model has traditionally been higher than customers want to pay, and because the transmit power required demands a much higher power budget. So there has been a push to find other options, which has opened the way for players like LoRa and sigfox.

However the CAT-M1 and NB-IoT Telecommunications Standards mean that the pendulum could easily go back the other way. CAT-M1 reduces the data rate (no streaming video needed for most IoT devices) and changes the modulation scheme so you get a better range at a much lower power consumption. And unlike sigfox, you aren’t severely constrained on how much data you can move or how often. CAT-M1 has just gone live in Australia on the Telstra network and we are about to do our first trials.

Quectel BG96 CAT-M1 Module

NB-IoT doesn’t yet have an official availability date but we aren’t too concerned about that. NB-IoT is really aimed at the smart meter market and similar devices which produce small amounts of data and upload them infrequently. So a water meter running off a battery for 10+ years is an example of what it is targeting. We will find CAT-M1 a lot more useful. And the modules that support CAT-M1 currently also support NB-IoT, so we are designing now and can make the decision later.

IoT Back End

The third area of challenge is the back end. Pick the wrong data service and storage provider and you could find you don’t own your own data, you have to pay every time you want a report on it, and you can’t get at it to port it to another system. And if the volume of data grows, the cost can grow even faster: many providers offer a low entry point but the pricing gets expensive quickly once you exceed the first threshold.

Because of this there is a strongly emerging preference for open systems, or for systems that do allow you to push and pull data as it suits you.

So our strategy to date has been to provide our own intermediate web service and then republish the data in the required format to suit the end user / client. The result is the best of both worlds. We can deploy resource constrained field devices which are low power and low cost, then communicate with high security and high cost platforms using the intermediate service to do the heavy lifting. And we don’t try to imprison the data and trap the client.
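As a rough sketch of that intermediate-service idea, here is a minimal Python function that accepts a terse payload from a resource constrained field device and republishes it in a richer client format. The field names and the client schema are purely illustrative assumptions, not our actual formats.

```python
import json

def republish(device_payload: bytes) -> str:
    """Intermediate-service sketch: accept a compact field-device report
    and re-emit it in the (hypothetical) format a client system expects.
    Terse keys save airtime on the constrained device side; the service
    does the heavy lifting of translation."""
    raw = json.loads(device_payload)      # e.g. {"i": "n7", "t": 21.4}
    client_record = {
        "deviceId": raw["i"],
        "temperatureC": raw["t"],
        "source": "intermediate-service",
    }
    return json.dumps(client_record)

out = republish(b'{"i": "n7", "t": 21.4}')
```

Because the service owns the translation step, swapping the client format, or adding a second subscriber, doesn’t touch the field devices at all.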

This isn’t the only approach and so we also create devices and incorporate protocols that allow them to directly connect to other systems. This includes porting our core IP to other URLs which are then owned by our clients. So far we haven’t found that one single approach suits every scenario.

Smart City

You can’t be smart if you don’t know anything. And this is certainly true for Smart Cities. To be a Smart City requires Sensors and Telemetry. But the jury is still out on how much and what kind.

5G for IoT

Thanks to the team at VDC Research who compile some very useful information on Embedded and IoT (Internet of Things) trends. It is free to join and the deal is that you contribute to their surveys in order to get access to some reports for free. They also do detailed reports for business purposes which are available for purchase.

VDC Research

The following 5G IoT Infographic was put together by them to show the progression of 5G cellular or Mobile Communications in terms of its impact in the Embedded Systems and IoT space. If you click on it you will get a cleaner version to look at and you’ll probably want to zoom in a bit.

5G IoT Infographic

I was interested to see that there are still no fully confirmed standards for 5G. And my previous post on Cellular IoT Communications shows this to be a trend where NB-IoT is still being ratified even though there are chip sets on the market. It is also sobering to think about where all the data will get stored as devices running Gb/sec data streams will have to be sending it somewhere. Big Data keeps getting bigger.

LPWAN = Low Power Wide Area Network

LPWAN is typically thought about as cellular data networks but that involves a contradiction since cellular and low power are inherently in conflict with each other. For instance, a standard 3G or 4G cellular modem will have a peak current draw of up to 2A during transmission and needs to be carefully power managed if running from batteries. This has meant that a 10 year operating life from a primary cell battery either needs a huge primary cell or very infrequent communications. So what are the alternatives?
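A back-of-envelope Python calculation shows why. The figures below (cell capacity, sleep current, transmit burst) are illustrative assumptions, but they make the point that a 2A-class cellular modem empties a battery quickly unless communications are very infrequent.

```python
def battery_life_years(capacity_mah, sleep_ma, tx_ma, tx_seconds, posts_per_day):
    """Rough battery-life estimate for a duty-cycled cellular device.
    Average current is the sleep current plus the transmit current
    weighted by the fraction of each day spent transmitting."""
    tx_fraction = (tx_seconds * posts_per_day) / 86400.0
    avg_ma = sleep_ma * (1 - tx_fraction) + tx_ma * tx_fraction
    return capacity_mah / avg_ma / 24.0 / 365.0

# Illustrative: a 19 Ah primary cell, 10 uA sleep current, and a
# 3G modem drawing roughly 2000 mA for a 30 second session
years_daily = battery_life_years(19000, 0.01, 2000, 30, 1)    # ~3 years
years_15min = battery_life_years(19000, 0.01, 2000, 30, 96)   # under 2 weeks
```

So even posting once a day, a large primary cell doesn’t get near 10 years, and a 15 minute reporting interval is hopeless. Hence the interest in the alternatives below.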

In IoT Versus M2M we looked at how the real benefit of IoT (Internet of Things) is that rather than a single Machine to Machine link being established, there are now multiple devices connected via shared web services and their combined data is being used to create extra value, and particularly if Big Data analytics is added to the mix.

SigFox Logo

LoRa Alliance

There is also a lot of potential disruption in this. LoRa and SigFox are both looking to provide lower cost networks to replace dependency on cellular network operators for coverage, and also to address the power consumption problem. There is an excellent comparison of these 2 systems in SigFox versus LoRa. And both are trying to disrupt existing cellular network providers. An overall view is available at NB-IoT versus LoRa versus SigFox.

NB-IoT

Which introduces Narrow Band IoT, or NB-IoT as it is now commonly abbreviated. Just to continue the confusion of acronyms, it is also called CAT-NB and CAT-NB1. There is a detailed view of this technology and its likely long term adoption at NB-IoT is dead – Long live NB-IoT.

The summary is that NB-IoT is too late to market and requires too much equipment changeover to win the early adopter market, especially in the USA, but will win in the long term. In the interim there is a host of other options also being developed. The cellular network operators have realised, at least 5 years too late, that their business and technology models were both under attack simultaneously. This is a particularly dangerous form of disruption.

Low Power Cellular

So if up until now, low power and cellular were not usually compatible concepts, what is changing to address that?

To reduce power consumption, you have to have one or more of the following:

reduce transmit power

increase receiver sensitivity

reduce transmit duration

increase transmit interval

reduce network registration time

reduce data rate

Some of these can be mutually exclusive. However, the key elements that work together are reducing the data rate and using a modulation scheme that allows the transmitter power to be reduced. LoRa does this very well and NB-IoT is looking to achieve a similar thing. There are trade-offs: the lower data rate for NB-IoT means it is best suited to very small packets, while CAT-M1 will require less power for larger packets because the faster data rate means the transmit time is a lot shorter.
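That transmit-time trade-off is simple arithmetic: energy per message is transmit power multiplied by transmit time, and transmit time falls as the data rate rises. The currents and data rates below are illustrative assumptions, not measured module figures.

```python
def energy_per_message_mj(payload_bytes, data_rate_bps, tx_current_ma, volts=3.6):
    """Energy to send one message, in millijoules: transmit time
    (payload / data rate) multiplied by transmit power. A slower radio
    transmits for longer, so a lower transmit current does not always
    mean lower energy per message."""
    tx_seconds = payload_bytes * 8 / data_rate_bps
    power_mw = tx_current_ma * volts
    return power_mw * tx_seconds

# Illustrative comparison for a larger 5000 byte payload:
large = 5000
nb = energy_per_message_mj(large, 25_000, 120)    # slow narrowband link
m1 = energy_per_message_mj(large, 300_000, 200)   # faster CAT-M1-class link
```

With these assumed figures the faster link wins for the larger payload despite its higher transmit current, which is the CAT-M1 argument in miniature. For tiny packets, protocol overhead and registration time dominate instead, which is NB-IoT’s home ground.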

Low Cost Cellular

So we have looked at the power consumption angle. How about cost and business model? There are 2 aspects to cost: the hardware cost and the network operations cost. To reduce cost you have to do one or more of the following:

reduce silicon and software protocol stack complexity

high volume production allows economies of scale for hardware

increase the number of channels available in the network

increase the number of simultaneous connections in the network

reduce margins

Both SigFox and NB-IoT aim to make the end device hardware cost as low as possible. In the case of NB-IoT and CAT-M1 the channel bandwidth can be reduced and so the same bandwidth can support multiple devices instead of just one. The power level in the device transmitter is reduced by reducing the bandwidth and data rate. As an example, a CAT-M1 module has a peak transmitter current draw of 500mA which is a factor of 4 lower than CAT-1. So low cost and low power can go together very well.

The graph below shows how the various cellular standards relate to each other.

Cellular IoT standards and how they relate

IoT Deployment Options

We have been using standard 3G/4G Cellular modems for our broadly distributed IoT offerings. As of the end of this month, we ship our first CAT-1 based offerings. These have the advantage of supporting 4G with fall back to 3G. Although NB-IoT hardware is available now from both Quectel and u-blox, the networks in Australia don’t yet support it. And while NB-IoT is ideal for fixed location assets, we also do mobile systems, so these need to be CAT-M1 once it is available.

CAT-M1 is expected to be available in Australia on the Telstra network around September 2017. I am also taking this to mean that NB-IoT is 2018 or possibly even later. So we plan to move to CAT-M1 as soon as it is available. The modules are expected to be available about the same time as the network upgrades.

Industry 4.0 and Bosch Australia

This is the first of a 2 part post covering the SEBN (South East Business Networks) business breakfast just before Christmas 2016. The first speaker was Gavin Smith of Bosch Australia. His talk was titled “Life After Auto” and here is my summary.

Gavin Smith – Bosch Australia

In the 1960s you could make anything in Australia because the import tariffs were high and we were a long way away from the rest of the world. But by 2008 all that had changed. Although Robert Bosch is the largest tier 1 automotive supplier in the world, and the largest automotive company that doesn’t assemble vehicles, the original Bosch Australia factory is no longer there, a new one has been built, and they are about to expand again.

So there is a lot of change. He also quoted Jack Welch of GE fame: “If the rate of change on the outside exceeds the rate of change on the inside, the end is near”!

High volume no longer has to be a lot of the same thing. They are now doing high mix electronics manufacture and are about to expand that as they have run out of capacity. This follows the Industry 4.0 model rather than traditional manufacturing. The design team is also expanding as they now do bespoke product design with the intention of making the products locally.

Bosch are also keeping track of the following Megatrends:

Demography

Urbanisation

Energy and climate

Connectivity

Bosch – Megatrends

And all of this relies heavily on IoT (Internet of Things) devices and Big Data. To be a global supply chain player or to have a modern product you will have to have connectivity and visibility of every part of your process and your supply chain as well. And for Industry 4.0 you will especially need it for inside the factory. This is already happening.

Robert Bosch are also looking at incubation for new ideas internally and also externally. This is a great idea and something we are also doing with both clients and prospects.

They are also looking to attract more women into STEM (Science, Technology, Engineering and Maths). Something I am also keen to see happen.

Industry 4.0 example

Gavin finished with a video that showed just how streamlined the Design to Manufacture path could become. Something essential to the realisation of a true Industry 4.0 mass customisation.

While it is worth remembering that some of the above is a view of how the Industry 4.0 future could be, rather than what today looks like, Europe has been pursuing this trend for 15 years. So we have quite a bit of conceptual catching up to do, as well as implementation capability. And we need to start early, which is why the Casey Tech School project and Schools of the Future are so important.

Data and Analytics

Today, data is available for nearly everything you can think of. Or if it isn’t, then it isn’t hard to add new data sources, both internal and external to a business. But data alone isn’t the answer. It is what you do with it, learn from it and decide based on it that really makes the difference.

Data Analytics

So how do you know what everyone else is doing?

Or how you compare?

AIIA, the Australian Information Industry Association, is running a survey on Data and Analytics in Australia and sharing the results with anyone who contributes. So I’m writing this to encourage you to contribute. I did.

Predicting the Future

How hard can it be? Surely everything follows on from everything else?

This is what was behind Sir Isaac Newton’s proposition that if we work out the equations of the universe and plug in the initial conditions, we can predict everything. And so science became the new religion of western society.

Until quantum mechanics came along.

So there are 3 ways the future can prove unpredictable. We can have unexpected discoveries (breakthroughs), we can have existing ideas that meld together in unexpected ways (convergence), and we can have false ideas eradicated (proof). The last is the hardest, and the first is the easiest, to understand the implications of. So I am going to focus on convergence.

Convergence

These comments below are taken from Peter Diamandis and you can join his mailing list too if you want to get access to thinking like this.

Peter Diamandis

Unexpected convergent consequences… this is what happens when eight different exponential technologies all explode onto the scene at once.

An expert might be reasonably good at predicting the growth of a single exponential technology (e.g. the Internet of Things), but try to predict the future when the following eight technologies are all doubling, morphing and recombining… You have a very exciting (read: unpredictable) future.

Computation

Internet of Things (Sensors & Networks)

Robotics/Drones

Artificial Intelligence

3D Printing

Materials Science

Virtual/Augmented Reality

Synthetic Biology

This year at my Abundance 360 Summit I decided to explore this concept in sessions I called Convergence Catalyzers.

For each technology, I brought in an industry expert to identify their Top 5 Recent Breakthroughs (2012-2015) and their Top 5 Anticipated Breakthroughs (2016-2018). Then, we explored the patterns that emerged.

This blog (the first of seven) is a look at Networks and Sensors (i.e. the Internet of Everything). Future blogs will look at the remaining tech areas.

Networks and Sensors

At A360 my first guest was Raj Talluri, the Senior VP of Product Management at Qualcomm, who oversees their Internet of Things (IoT) and mobile computing businesses. Here’s some context before we dive in.

The Earth is being covered by an ever-expanding mesh of networks and sensors that form the Internet of Things (or the Internet of Everything). Think of the IoT as the network of all digitally accessible objects, estimated at 15 billion in number today, and expected to grow to more than 50 billion by 2020.

But what makes this even more powerful is that each of these connected devices is itself made up of a dozen sensors measuring everything from vibration, position and light, to blood chemistries and heart rate.

Imagine a world rapidly approaching a trillion sensor economy where the IoT enables a data-driven future in which you can know anything you want, anytime you want, anywhere you want. A world of instant, high-bandwidth, communications and near perfect information.

The implications of this are staggering, and I asked Raj to share his top five breakthroughs from the past three years to illustrate some of them.

Recent Top 5 Breakthroughs (2013 – 2015)

Here are the breakthroughs Raj identified in Networks and Sensor technology from 2013-2015.

Emergence of Continuous Low-Power Always-On Sensors

One of the major advances from the past three years has been the proliferation of “always on” sensors.

As Raj explains, “You’ll be amazed how many of your phone sensors are always on. If you look at your phone, there were times when you had to press the button to say “hello Google” or “hi Siri”. Now, you don’t. You just talk to it and it figures it out.”

“This has been made possible because you’re now able to make very low power sensors that listen to you all the time, keyword detect and do the data processing.”

Smartphones Drive Sensor Volume at Low Cost

The number of sensors in your smartphone today has exploded. Raj continues, “We are now seeing 10, 20 and even 30 sensors embedded in our smartphones. Things like proximity sensors when you pick your phone up, gyros, cameras, depth sensors and so on. This has really driven down cost and driven the discovery of new sensors, because there are a billion smartphones [sold] every year. It’s a huge opportunity.”

“Systems” Fuse Continuous Sensor Data & Cloud Processing

Seamless integration of processing is happening in the cloud and on your device. Raj explains, “When you say, ‘Okay, Google,’ a part of what happens next is on the phone and a part is on the cloud. You don’t really know where the processing is being done, on your device or on the cloud, the hand off is seamless.”

4K Video Format Goes Mainstream

4K screen resolution is close to the point where the brain is unable to notice pixels. As such, somewhere between 4K and 8K, virtual reality becomes visually equal to visual reality.

Raj explains how this technology is exploding: “If you buy a 4K TV and watch 4K content, it’s very hard to go back to 1080p. It almost feels like you were watching a VHS tape when DVDs came out. Today, if you look at what we’ve done at Qualcomm in the high-end processors space, we shipped over 200 to 250 million processors that actually record in 4K.”

Opening of Sensor APIs to 3rd Party Apps Development Community

The reality is that the majority of phone apps now come from third party developers. This explosion in apps (perhaps 50 to 100 per phone) is only possible because of (i) the opening of the APIs for the sensors in the devices and (ii) the community of developers that has emerged as a result.

So what’s in store for the near future?

Anticipated Top 5 Breakthroughs (2016 – 2018)

Here are Raj’s predictions for the most exciting, disruptive developments coming in Networks and Sensors in the next three years.

As entrepreneurs and investors, these are the areas you should be focusing on, as the business opportunities are tremendous.

Wireless Network Densification (4G/5G): Cost / Megabit Plummets

The cost per megabit of connection is going to plummet – essentially nearing “free” in the very near future.

Raj expands, “Already in places like Indonesia, we find that people are actually getting data plans at a price of $5 a month. In most of the world, the cost per megabit is extremely low as the cost of launching networks is plummeting.”

Emergent Peer-To-Peer Tech Drives Automotive Communication & Safety

Soon all of your devices at home and work (screens, thermostats, DVRs, computers, even cars) will automatically connect seamlessly. You won’t have to make conscious decisions about how to connect your washing machine. When it finishes washing the clothes, you will get a notification on your phone.

Global Internet Connectivity via Satellite Plummets in Cost

Qualcomm, in partnership with Richard Branson, is working to deploy a 648 satellite constellation called OneWeb. Raj explains, “Global Internet connectivity through satellites is finally going to happen… Just think about three billion new people coming online at a megabit per second. It is going to be a completely different kind of experience.”

Exponential Growth in Connections to Internet from Various Devices – Personal/Home/Cities

Raj says, “I often ask people: how many IP addresses do you think you have at your house?” Most people have no clue. They say, “Maybe two or three…”

For Raj (and most of us) it’s more like 50… your TVs, your set top boxes, phone, iPads, Nest, cameras, light bulbs…

“In the next few years, the number of things that will be connected to the Internet at any given point of time in your life is going to be so huge that the way they work is going to be very different. You won’t need to reach for your phone to do something. Coupled with sensor networks, you’ll just be able to speak and ask for what you want.”

Major Improvements of Head-Mounted User Interfaces with Rich Bandwidth and Onboard Sensors

Over the next three years, we’ll see rapid uptake of VR and AR headsets, each with 4K displays and cameras, and packed with a suite of sensors connected by high bandwidth communications to the cloud. The result is that each of us is wearing an incredible User Interface with high-speed communications that will make our virtual experiences so good that you won’t need to travel to experience something.

There is a lot to think about there. We are heavily involved in the Internet of Things (IoT) space and particularly see the opportunities that come from low cost communications with low power electronics and always on monitoring. Suddenly you can have the flood monitoring system you never thought was possible. Or bush fire front monitoring. Or pretty much anything else you can think of that has a sensor option already developed. And Big Data adds another dimension to this, where multiple different sensing technologies combine their data to provide information and insights not previously possible. If I was to add another category to Peter Diamandis’ insights, it is that Big Data will outweigh them all.

My thanks go to Luke McIndoe of Nebo Engineering for passing this on to me. They are a group of highly skilled engineers who are Piping and Pressure Vessel Designers, among other things.

IoT versus M2M

M2M or Machine to Machine communications have been around for a long time. So how is the IoT or the Internet of Things different?

The picture below shows the basic paradigm shift between the 2 concepts.

IoT versus M2M

The easiest way to think about this is that Machine to Machine communications is a subset of the Internet of Things. IoT can make use of M2M but M2M on its own is not sufficient to create IoT.

On the left you see that you can have specific devices connected to each other via some form of communications. So I can send a Fax or check my fitness level. Even get back data from the Pioneer and Voyager spacecraft as they explore the solar system and eventually head beyond it.

On the right, we add ubiquitous communications, cloud services and Big Data correlation, and we have a much more powerful ecosystem that also creates a lot more value. And that is what is driving IoT Growth: the extra value it creates.

How to change the world

To make a significant difference in our world, you either have to do something outstanding or have a lot of influence. Or both. And it will need many people to contribute over the course of the project.

So I keep an eye on some of the forums that help people to collaborate in order to facilitate this.

And I spotted this piece of news that shows a radically different way to deliver remote Internet services.

This is a classic example of using Big Data to solve a big problem. The project is of course Project Loon, which is intended to connect remote or difficult sites to the Internet, including disaster recovery scenarios. The challenge is how to do it without wiring or expensive satellite links, and the name suggests it is one of those ideas out of left field. The solution being trialled is a series of balloons floating in the upper atmosphere which provide an intermediate link between ground based transceivers. It uses standard ISM frequencies at 2.4GHz and 5.8GHz, so it can operate with standard equipment, and delivers GSM levels of Internet speed.

The truly brilliant part is taking the wind direction data for the upper stratosphere and using it to control the altitude of the balloons so they stay where they need to be. A fantastic example of gathering and using data on a scale that most of us can barely appreciate.
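As a toy illustration of that idea, here is a Python sketch that picks the altitude layer whose wind direction best matches the drift we want. The wind table is invented for the example; real stratospheric forecast data is vastly richer than this.

```python
def pick_altitude(target_bearing_deg, winds):
    """Choose the altitude layer whose wind blows closest to the bearing
    we want the balloon to travel. `winds` maps altitude (metres) to the
    wind bearing (degrees) at that layer."""
    def angle_diff(a, b):
        # Smallest angle between two bearings, handling the 360 wrap
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(winds, key=lambda alt: angle_diff(winds[alt], target_bearing_deg))

# Invented forecast: altitude -> wind bearing at that layer
winds = {18000: 250.0, 20000: 90.0, 22000: 10.0}
best = pick_altitude(95.0, winds)   # want to drift roughly east -> 20000 m
```

The real system is of course solving this continuously for a whole fleet against forecast data, but the principle, steer vertically and let the wind do the horizontal work, is exactly this.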

I learned a few more things about Big Data that I hadn’t considered up until now:

storing the data is a major issue

moving the data between storage and processing is an even bigger issue

processing capacity is increasing faster than storage or transport capacity

for simulations, the results matrices are so huge that reducing them before storage is the only way they can be handled
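The last point, reducing results matrices before storage, can be as simple as block averaging. A minimal pure-Python sketch, with invented data:

```python
def reduce_matrix(matrix, block):
    """Reduce a results matrix before storage by averaging block x block
    tiles, a simple form of the summarisation described above. Assumes
    the matrix dimensions divide evenly by `block`."""
    reduced = []
    for i in range(0, len(matrix), block):
        row = []
        for j in range(0, len(matrix[0]), block):
            tile = [matrix[r][c]
                    for r in range(i, i + block)
                    for c in range(j, j + block)]
            row.append(sum(tile) / len(tile))
        reduced.append(row)
    return reduced

big = [[float(r * 4 + c) for c in range(4)] for r in range(4)]
small = reduce_matrix(big, 2)   # 4x4 -> 2x2: a 4x reduction in storage
```

Real simulation pipelines use far more sophisticated reductions, but the pay-off is the same: what you store and move shrinks by the square of the block size.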

An example where all these points converge is climate modelling where the exponential growth in sensors and the complexity of the models mean that there is too much data too widely dispersed to get it to one place, process it and get the results back out efficiently. A new methodology is required for problems like this.

IO Bottleneck

So we are back to the old information IO Bottleneck problem. The graph that really got my attention tracked the growth in data access rates versus the growth in data processing rates.

Data Storage Versus Data Processing

The rate of performance improvement in disks (red line) is much lower than that in computing systems (blue line), driving the need for larger disk counts in each generation of Supercomputer. This approach isn’t sustainable regardless of whether you look at cost, power or reliability. Richard Freitas of IBM Almaden Research provided some of this data for IEEE.

So we have reached the point where the storage and movement of data is now the limiting factor in computing analysis. 40 years ago Seymour Cray had to overcome this at the individual computing system level to build the Supercomputers he is famous for. Today we have hit it at the system level.

Areas being looked at for innovative solutions are:

continue looking for higher density and faster storage systems

data compression or subsetting algorithms to reduce the amount of data to be moved or stored

parallel processing techniques with parallel storage to reduce the bottleneck

results summarisation so less storage is required for the analysis results

And all this while trying to maintain data integrity and traceability for proof of scientific rigour. Answers will be found, that much we can be sure of from history.

And there is a lot of money to be made from doing this well. Forbes put $50 Billion as the value of the Big Data Market.

How Much Data?

According to IBM, 90% of the data created in the history of the world was created in the past 2 years. The article was looking at Social Media Information but the claim was generic. Talk about Information Overload. How do we keep up with this?

There are sceptics who believe this Data Deluge is overstated, but even if they are out by a factor of 10, it seems we are in danger of moving from the Information Age to drowning in data.

I worked with a very fast thinker once. Working with him was like trying to see ahead underwater while travelling in the wake of an outboard motor engine. The trick was to decide what to ignore so you could just address the important things. He used it as a tactic to get his own way during meetings. I was reminded of this while thinking about this topic. It seems the whole human race is about to face the same dilemma. How to sort the important information from the huge volume of total information being produced.

Information Overload

Information Relevance

Not all information produced is of the same quality, usefulness or relevance. Assessing Information Relevance will become increasingly important. A post on Facebook letting us all know that someone’s dog just farted is not as valuable for most of us as the passing of a new law that puts a carbon tax on high carbon emitters.

The CERN Large Hadron Collider (LHC) is expected to produce data equal to 1% of the world’s production rate when it is running. This required a new approach to data storage. For those who aren’t familiar with it, the Large Hadron Collider is a higher energy version of the Australian Synchrotron which has specialised detectors that examine the fine details of how the matter of the universe is constructed. The intent is to look for evidence that the Higgs Boson exists as predicted by the Standard Model of particle physics.

Test Everything

I mention it here because they have to record the experimental data knowing that it may be some time before they can fully interpret it. They have planned for the Information Overload as well as the long term Information Storage.

In fact it is a great example of long term planning, with the original proposal in 1985, construction beginning in 1994 and completion in 2008. You can see the steps involved in LHC Milestones.

Information Storage

If we used DVDs it would produce a stack that goes to the Moon and back. That’s too big to store as DVDs.
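The arithmetic behind stack-of-DVDs comparisons is straightforward. The sketch below uses nominal single-layer DVD figures (4.7 GB capacity, 1.2 mm thickness); as a worked example, one zettabyte comes out at a stack a couple of hundred thousand kilometres tall, which is genuinely Moon-distance territory.

```python
def dvd_stack_km(total_bytes, dvd_bytes=4.7e9, dvd_thickness_mm=1.2):
    """Back-of-envelope: height of a stack of single-layer DVDs holding
    a given data volume. Capacity and thickness are nominal figures."""
    discs = total_bytes / dvd_bytes
    return discs * dvd_thickness_mm / 1e6   # mm -> km

# Worked example: one zettabyte (1e21 bytes)
height_km = dvd_stack_km(1e21)   # roughly 255,000 km
```

For reference, the Moon averages about 384,400 km away, so the “to the Moon and back” versions of this claim imply a few zettabytes.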

The increase in data comes from 3 sources:

new data sources such as ubiquitous sensors, LHC, business metrics, research…

increased data creation from existing sources such as social media, blogs, web publishing…

unprecedented processing power

So far the storage solution is the growth of server farms, and while many higher density storage technologies are being investigated, most data is stored on conventional hard disks. Redundancy and data security are of course hot topics.

Even in the much smaller world of Successful Endeavours, where we develop new products and have to do the innovation, research, prototypes and testing associated with them, managing all the data requires both discipline and planning.