A new report from Navigant Research examines the 5G technology standard and its use cases, focusing on smart grid applications and possible business model opportunities. 5G communications technology is expected to underpin the Fourth Industrial Revolution, where the confluence of ubiquitous mobile broadband, pervasive sensing, and artificial intelligence promises to drive massive change across industry and society.

This coming generation of wireless communications promises to make the Internet of Things (IoT) and the Internet of Energy (IoE) a reality, while delivering added benefits for power utilities. According to the report, 5G’s flexible, multi-spectrum, multi-function architecture will provide a platform able to support critical latency-sensitive applications as well as low-power, low-cost applications such as ubiquitous sensing throughout the distribution grid.

“5G will deliver a step-change in public wireless networking that has the potential to dramatically enhance the way utilities network their grid assets and systems,” says Richelle Elberg, principal research analyst with Navigant Research. “Many of the concerns utilities have historically had with public networks should be alleviated, and the opportunity for a truly ubiquitous, future proof network is one that utilities should not overlook as 5G networks come to fruition.”

In addition to enabling smart fleet management as well as edge computing and cloud technologies for distributed automation and intelligent control, 5G networks may also provide new business model opportunities to power utilities, according to the report. Because the technology depends on dense small cell architecture, opportunities for utilities to partner with carriers could arise, turning dispersed holdings such as poles, wires, and rights of way into valuable assets.

The report, 5G and the Internet of Energy, provides an overview of the 5G technology standard and its use cases. It discusses the many smart grid applications these networks will support and proposes certain business model opportunities it may create. The report also provides a discussion of the global carriers and infrastructure vendors that are leading the charge toward the 5G future.

Utilities—and their vendors—planning network upgrades over the next 5-10 years must fully understand the longer-term implications of the 5G evolution as they consider their need for future-proof and holistic connectivity.

It’s been more than a decade since we first heard the phrase “data is the new oil.” But while this idea may well define the next generation of business, there’s important context surrounding it that often gets overlooked.

Since the phrase was first coined, many others have echoed the same idea. What we have now come to realise is that data is a commodity, a raw material, and it’s only valuable when it can be turned into intelligence. Without the right tools to refine it, it’s just a bunch of ones and zeros. Study after study shows that while most enterprises understand the importance of data, they continue to struggle to draw real value from it, says Matt Mills, CEO and Board Member at MapR Technologies.

Amongst the largest and most powerful tech titans in the world, the idea of turning data into intelligence is gospel. It’s why companies like Apple, Amazon, Google, Facebook, and Microsoft dominate the charts these days. They not only know the value of data, but also how to transform it from raw material into a competitive advantage.

Unlike these companies, many organisations today are using 30-year-old technologies to “refine” their data and are frustrated with the little progress they are making. The simple fact is that older technologies are often too fragile and simply aren’t built for the diversity or the sheer volume of data today. And this is just the beginning: many believe that from now on, data and the digital universe will double in size every two years.

Successful data-driven companies in the early stages of their digital transformation journeys are choosing a modern data platform that is both optimised for performance today and provides the speed, scale and reliability that are required for next-generation intelligent solutions.

The modern data platform has 10 key characteristics:

A single platform that performs analytics and applications at once
Manages all data from big to small, structured and unstructured, tables, streams, or files – all data types from any source
A database that runs rich data-intensive applications and in-place analytics
Global cloud data fabric that brings together all data from across every cloud to ingest, store, manage, process, apply, and analyse data as it happens
Diverse compute engines to take advantage of analytics, machine learning and artificial intelligence
Delivers cloud economics by operating on any and every cloud of your choice, public or private
No lock-in, supporting open APIs
DataOps ready to champion the new process to create and deploy new, intelligent modern applications, products and services
Trusted with security built from the ground up
Streaming- and edge-first for all data-in-motion from any source, processing data as it happens and natively enabling microservices

Many software products today can handle some aspects of modern-day data platforms, but few if any can actually deliver on all of these requirements. This is where most companies get into trouble: they try to extend the software to do things it was never intended to do. These limitations are a key reason why many companies never reach their goals and objectives with data.

Here at MapR we have built the premier data platform for today — and tomorrow’s — leading enterprises. […]

IoTA Wales (Internet of Things Accelerator Wales) is a new accelerator program launched in Wales, UK, to provide IoT startups with financial backing, access to a business network, and knowledge sharing.

The program is co-launched by Innovation Point, The Accelerator Network, Barclays Eagle Labs, Inspire Wales and the Development Bank of Wales. Its main aim is to help fund and grow small IoT startups looking to bring their products to market or scale their businesses.

The program will select 10 companies per cycle and invest £50,000 in exchange for 10% equity in each startup.

Each startup also gets access to paid mentors who will guide them towards product/service commercialization and other important milestones. Each of the admitted startups also receives engineering support from Sony Manufacturing and Cardiff University, test-bed and gateways from Pinacl, and membership of Tech UK.

Apart from the initial support in kind and cash, the startups will also get a chance to raise follow-on funding in April 2018.

With major partners from the building sector, NodOn® is growing fast with two complete ranges of smart devices using the well-known Z-Wave® and EnOcean® technologies. This year at CES, the company will showcase its spirit of innovation and present the first complete range of Bluetooth® Mesh devices for the smart home.

With wireless and interoperable smart home and building devices, NodOn® is a major player in the European smart home industry and works with major companies in the building sector. NodOn® will be the first to bring a complete range of Bluetooth® Mesh products to life for smart homes and buildings.

A range of products for professionals to cover all smart home uses

An established player in the smart home & building industry, NodOn® designs sensors, actuators and controllers: smart plugs, wall switches, remotes, relay switches (lights, heating, home access), temperature and motion sensors, and more. NodOn® products use Z-Wave® and EnOcean®, two of the best-known technologies in the smart home sector, and are compatible with most home automation gateways on the market.

NodOn® products are designed for professionals and can be set up for new homes and buildings or to retrofit any interior. “Our products are created to keep the home flexible and to make it evolve over the years. It fits the home perfectly to ensure comfort and energy savings to the user.” says Thomas GAUTHIER, CEO of NodOn®.

The startup has quickly built its reputation on the market thanks to a complete offer combining technical performance, responsiveness and good customer service, as well as real expertise in the smart home. In 2018, more than 10,000 flats in France will be equipped with NodOn® products. Meet NodOn® at CES to learn more about its customer references.

Bluetooth® Mesh devices: a world premiere by NodOn®

What if smart home sensors or actuators could be set up and controlled without any hub? What if users needed nothing more than their smartphones? This idea is now coming to life thanks to NodOn®, which has gained exclusive entry to the Bluetooth® Mesh program and the Bluetooth® SIG consortium to make it real. Compatibility will no longer be an issue.

NodOn® plugs, wall switches, sensors, relay switches and more will be secure products, easy to set up through a smartphone app and with low energy consumption. These first plug-and-play Bluetooth® Mesh products can be discovered at the NodOn® booth.

New NodOn® Z-Wave S2 security door/window opening sensor

Combining the S2 security layer, as secure as Apple® HomeKit, and soon to be UL-634 certified, this brand new Z-Wave® S2 door/window sensor will protect homes, valuables and loved ones with security as its number one concern.

When cellphones were first released in the mid-1980s, a handset would set you back $4000 (€3378.98) — the equivalent of almost $10,000 (€8447.45) today. Here, Markus Brettschneider, group senior vice president and general manager for global food & beverage applications for ABB, explains how smaller food manufacturers can take advantage of low-cost digital technology.

When the first cellphone was released in 1983, this breakthrough technology was reserved for high-ranking business people and the social elite. Yet, decreasing technological costs have led to cellphones becoming arguably the most common technology available today. In fact, the UN’s 2014 telecommunications figures revealed that there are almost as many cellphone subscriptions as there are people on Earth.

This is indicative of a technological trend known as quality-adjusted price, which ties closely into the concept of Moore’s law. As technology rapidly develops at a pace that leads to significant performance increases year on year, the cost of that technology decreases at a similar pace.

This presents an important opportunity for smaller businesses to take advantage of newer technologies that were previously only accessible to large companies. In the food production industry, for example, there has been a significant increase in the adoption of digital technologies and software among larger businesses. Yet, food manufacturing companies of all sizes can tap into the productivity and efficiency benefits offered by digitalisation.

The digital food plant

While equipment and robotics have been the key drivers of plant improvement in past decades, the rise of the industrial internet of things (IIoT) has placed greater importance on software and insight. In particular, many plant managers now use digital solutions to monitor the status of equipment to mitigate performance problems.

For example, most food processing plants will have automated at least one part of the production line with a conveyor system. As with any piece of equipment, parts of this system will gradually wear down from repeated use over time. For critical components such as the motor, this leads to a slow decline in performance and risks downtime due to breakage.

Plant managers must therefore undertake predictive maintenance to address any issues before they lead to failures. To do this effectively, plant managers need accurate performance data from the conveyor’s low-voltage motors.

Rather than invest in new systems that feature IIoT functionality, businesses can install multi-function sensors to collect and analyse performance data.

For example, the ABB Ability Smart Sensor for motors allows engineers to digitalise food production plants with minimal expenditure. These sensors fit directly onto the motor’s frame and monitor key performance factors such as temperature and vibration. This data is transmitted to the cloud, where it is analysed and reports are generated for plant engineers.

What the data provides is new insight into the health of motors used in production, enabling a shift from reactive maintenance to predictive maintenance. With a simple “stop light” system of green, yellow and red lights, fleet motor status is simple to assess.
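The stop-light logic described above can be sketched as a simple threshold classifier. The threshold values and field names below are illustrative assumptions, not ABB specifications:

```python
# Hypothetical "stop light" classifier for motor sensor readings.
# The threshold values are illustrative assumptions, not ABB limits.

def motor_status(temperature_c: float, vibration_mm_s: float) -> str:
    """Map temperature and vibration readings to green/yellow/red."""
    if temperature_c > 105 or vibration_mm_s > 7.1:
        return "red"     # intervene now: high risk of breakage
    if temperature_c > 90 or vibration_mm_s > 4.5:
        return "yellow"  # watch closely: performance declining
    return "green"       # healthy: no action needed

print(motor_status(72.0, 2.1))   # green
print(motor_status(95.0, 3.0))   # yellow
print(motor_status(110.0, 8.0))  # red
```

In a real deployment the thresholds would come from the motor's rated operating envelope and the analysis would run in the cloud against the fleet's historical data, but the decision surfaced to the engineer reduces to this kind of simple status.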

Just as cellphones are no longer exclusive to the likes of CEOs of listed companies, digitalisation is not reserved for large food […]

The fundamental shift of the enterprise toward the cloud has posed a conundrum for many. The largest issue is the state of most enterprise networks. These networks were designed for an era gone by: their original designs could not foresee technologies such as SDN, SD-WAN, Segment Routing and the cloud, or the exponential increase in bandwidth, all of which have arrived over the past 10 years.

The IPv4 Internet BGP routing table alone has experienced roughly 10% year-over-year growth between 2009 and 2017. In 2009 the table eclipsed 286,000 routes; here in 2017 we are at approximately 650,000. These figures account only for IPv4 routes, not the full IPv4 and IPv6 tables. During that same period we have gone from Token Ring and 10Base-T to 100GbE.
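Those route counts imply a compound annual growth rate that can be checked with a few lines of arithmetic (a quick sketch using the approximate figures quoted above):

```python
# Compound annual growth of the IPv4 BGP table, 2009-2017,
# using the approximate route counts quoted above.
start_routes = 286_000  # circa 2009
end_routes = 650_000    # circa 2017
years = 2017 - 2009

cagr = (end_routes / start_routes) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # about 10.8% per year, consistent with ~10% YoY
```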

The possibilities offered by the Internet of Things (IoT) are growing strongly in Belgium and throughout the world. Orange Belgium decided last year to invest in new IoT technologies based on international standards, and has now announced the availability of Narrowband IoT (NB-IoT) and LTE-M technologies, the so-called “Mobile IoT”, across the whole Belgian territory, reaching nationwide coverage both indoors and outdoors.

The NB-IoT and LTE-M technologies are Low Power Wide Area (LPWA) cellular network layers that will allow millions of everyday objects to be connected to the Internet of Things (IoT). LPWA technologies offer many advantages when connecting objects to the Internet of Things, such as extending the battery life of the connected equipment by up to 10 years thanks to low energy consumption, or drastically reducing the cost of the radio modules inside the devices that need to be connected. Smart furniture, smart parking and asset tracking for instance become possible.

Orange turns Belgium into a mobile IoT test lab

The Mobile IoT technologies take a different approach from other LPWA technologies. Apart from extended battery life and the minimal cost of radio modules, NB-IoT and LTE-M are also said to enable:

Full bidirectional communication between the object and the network allowing firmware software updates over the air;

Gabriel Flichy, chief technology officer of Orange Belgium, explains: “The choice for Mobile IoT technologies is future-proof as it is fully consistent with the future evolution towards 5G. It will then be possible to connect objects that require very high reliability e.g. for the remote control of critical devices and automation processes.”

“Today we are already using our Mobile IoT network together with the Flemish government for a project with connected bikes in collaboration with our partners Huawei and Sensinxs; IMEC is testing a smart plug project as part of Antwerp’s City of Things program and we have teamed up with CommuniThings for smart parking solutions. We invite all application developers, system integrators and early adopters to test the advantages of Orange Mobile IoT network.”

Overall Mobile IoT will act as the catalyst for companies to establish connections that would not have been viable with existing technologies. Consumers will see a huge variety of new products, services and applications enabled by Mobile IoT.

Gemalto, the digital security company, is supplying the eSIM (embedded SIM) solution for Microsoft’s Surface Pro with LTE Advanced, the most connected laptop in its class, which will begin shipping to business customers in December 2017. Gemalto’s partnership with Microsoft enabled Surface to become the first fully integrated embedded SIM PC in the Windows ecosystem.

Gemalto’s advanced technology supports seamless activation of mobile subscriptions for users of the innovative Surface Pro with LTE Advanced. This smooth experience leverages Gemalto’s remote subscription management solution in conjunction with Windows 10. Surface customers expect their products to deliver advanced technology and with Gemalto’s eSIM solution, all possible connectivity options are available out-of-box, including the purchase of cellular data from the device itself.

Compliant with the GSMA Remote SIM Provisioning specifications, Gemalto’s eSIM solution is fully integrated with Windows 10. This integration enables the Gemalto solution to have a complete servicing model so that patching and lifecycle management features are available as the technology and standards evolve over time. This capability extends the value promise of Surface as new experiences and capabilities will be available to today’s purchasers of the Surface Pro with LTE Advanced.

“The Surface Pro has redefined the laptop category,” said Paul Bischof, director, Devices Program Management at Microsoft. “Gemalto’s eSIM solution is helping us to materialise our vision of an uncompromised customer experience.”

“Adoption of eSIM technology is growing rapidly. Mobile operators recognise the potential of seamless connectivity and increased convenience as a way of expanding their customer reach to additional devices,” said Frédéric Vasnier, executive vice president Mobile Service and IoT for Gemalto. “We are at the beginning of a significant technology transformation and the Surface Pro with LTE Advanced represents the start.”

There is growing recognition that businesses today need to be increasingly ‘data driven’ in order to succeed. Those businesses that can best utilize data are the ones that can better serve their customers, outcompete their rivals and increase their operational efficiency. However, to be data driven, you need to be able to access, manage, distribute and analyze all of your available data while it is still valuable, and to understand and harness new potential data sources.

Key to this is Data Modernization. Data Modernization starts with the recognition that existing systems, architectures and processes may not be sufficient to handle the requirements of a data-driven enterprise, and that new, innovative technologies need to be adopted to succeed. While the replacement of legacy technology is not a new phenomenon, the particular set of pressures driving the current wave of modernization is.

In this article we will delve into the very real pressures pushing enterprises down the path of data modernization, and approaches to achieving this goal in realistic time frames.

Under Pressure

Business leaders world-wide have to balance a number of competing pressures to identify the most appropriate technologies, architectures and processes for their business. While cost is always an issue, this has to be measured against the rewards of innovation, and risks of failure versus the status quo.

This leads to technology cycles, with early adopters potentially leap-frogging their more conservative counterparts, who may not then be able to catch up if they wait for full technological maturity. In recent years, the length of these cycles has been dramatically reduced, and formerly solid business models have been disrupted by insightful competitors, or outright newcomers.

Data Management and Analytics are not immune to this trend, and the increasing importance of data has added to the risk of maintaining the status quo. Businesses are looking at Data Modernization to solve problems such as:

How do we move to scalable, cost-efficient infrastructures such as the cloud without disrupting our business processes?

How do we manage the expected or actual increase in data volume and velocity?

How do we work in an environment with changing regulatory requirements?

What will be the impact and use cases for potentially disruptive technologies like AI, Blockchain, Digital Labor, and IoT, and how do we incorporate them?

How can we reduce the latency of our analytics to provide business insights faster and drive real-time decision making?

It is clear to many that the prevalent and legacy Data Management technologies may not be up to the task of solving these problems, and a new direction is needed to move businesses forward. But the reality is that many existing systems cannot be just ripped out and replaced with shiny new things, without severely impacting operations.

How We Got Here

From the 1980s to the 2000s, databases were the predominant source of enterprise data. The majority of this data came from human entry within applications, web pages, etc. with some automation. Data from many applications was collected and analyzed in Data Warehouses, providing the business with analytics. However, in the last 10 years or so, it was recognized that machine data, logs produced by web servers, networking equipment and other systems, could also provide value. This new unstructured data, with a great amount of variety, needed newer Big Data systems to handle it, and different technologies for analytics.

Both of these waves were driven by the notion that storage was cheap and, with Big Data, almost infinite, whereas CPU and memory were expensive. Outside of specific industries that required real-time actions – such as equipment automation and algorithmic trading – the notion of truly real-time processing was seen to be out of reach.

However, in the past few years, the industry has been driven to rethink this paradigm. IoT has arrived very rapidly. Connected devices have been around for some time, and industries like manufacturing have been utilizing sensors and automation for years. But it is the consumerization of devices, coupled with the promise of cloud processing, that have really driven the new wave of IoT. And with IoT comes the realization that storage is not infinite, and another processing paradigm is required.

As I outlined in a previous article, the predicted rate of future data generation – primarily, but not solely, driven by IoT – will massively outpace our ability to store it. And if we can’t store all the data, yet need to extract value from it, we are left to conclude it must be processed in-memory in a streaming fashion. Fortunately, CPU and memory have become much more affordable, and what was unthinkable 10 years ago is now possible.

A Streaming Future

Real-time in-memory stream processing of all data, not just IoT, can now be a reality, and should be part of any Data Modernization plans. This does not have to happen overnight, but can be applied use-case-by-use-case without necessitating a rip and replace of existing systems.

The most important step enterprises can take today is to move toward a ‘streaming first’ architecture: one in which at least the collection of all data is performed in a real-time, continuous fashion. A company can’t modernize overnight, but achieving the capability of continuous, real-time data collection enables organizations to integrate with legacy technologies while reaping the benefits of a modern data infrastructure that can meet the ever-growing business and technology demands within the enterprise.

In practical terms, this means:

Using Change Data Capture to turn databases into streams of inserts, updates and deletes;

Reading from files as they are written to instead of shipping complete logs; and

Harnessing data from devices and message queues without storing it first.
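The second pattern, reading files as they are written, can be sketched as a tail-style generator. This is a minimal illustration of the idea, not any particular product's implementation:

```python
import time

def follow(path, poll_seconds=1.0):
    """Yield lines from a log file as they are appended,
    rather than shipping the complete log later."""
    with open(path, "r") as f:
        f.seek(0, 2)  # jump to end of file: stream only new data
        while True:
            line = f.readline()
            if not line:
                time.sleep(poll_seconds)  # wait for the writer to catch up
                continue
            yield line.rstrip("\n")

# Each new log line becomes a stream event the moment it is written:
# for event in follow("/var/log/app.log"):
#     process(event)
```

A production collector would also handle log rotation and use filesystem notifications instead of polling, but the principle is the same: data enters the pipeline continuously, as it happens.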

Once data is being streamed, the solutions to the problems stated previously become more manageable. Database change streams can help keep cloud databases synchronized with on-premise while moving to a hybrid cloud architecture. In-memory edge-processing and analytics can scale to huge data volumes, and be used to extract the information content from data, massively reducing its volume prior to storage. Streaming systems with self-service analytics can be instrumental in remaining nimble, and continuously monitoring systems to ensure regulatory compliance. And new technologies become much easier to integrate if, instead of separate silos and data stores, you have a flexible streaming data distribution mechanism that provides low latency capabilities for real-time insights.
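The in-memory volume reduction mentioned above can be illustrated with a simple windowed aggregation: raw readings collapse into one summary record per sensor per time window before anything is stored. A minimal sketch, not a production stream processor:

```python
from collections import defaultdict

def aggregate_windows(events, window_seconds=60):
    """Collapse (sensor_id, timestamp, value) readings into one
    min/max/mean/count summary per sensor per time window,
    reducing data volume before storage."""
    buckets = defaultdict(list)
    for sensor_id, timestamp, value in events:
        window = int(timestamp // window_seconds)
        buckets[(sensor_id, window)].append(value)
    return {
        key: {"min": min(vals), "max": max(vals),
              "mean": sum(vals) / len(vals), "count": len(vals)}
        for key, vals in buckets.items()
    }

# Ten raw readings in one minute collapse into a single stored summary.
raw = [("s1", t, 20.0 + 0.1 * t) for t in range(0, 60, 6)]
summary = aggregate_windows(raw)
print(summary[("s1", 0)]["count"])  # 10
```

A real streaming engine would compute these summaries incrementally as events arrive rather than batching them in a dict, but the payoff is the same: only the information content reaches storage.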

Data Modernization is becoming essential for businesses focused on operational efficiency, customer experience, and gaining a competitive edge. And a ‘streaming first’ architecture is a necessary component of Data Modernization. Collecting and analyzing data in a streaming fashion enables organizations to act on data while it has operational value, and to store only the most relevant data. With data volumes predicted to grow exponentially, a streaming-first architecture is truly the next evolution in Data Management.

The world operates in real-time; shouldn’t your business as well?

About the author: Steve Wilkes is co-founder and CTO of Striim. Prior to founding Striim, Steve was the senior director of the Advanced Technology Group at GoldenGate Software, focused on data integration. He continued in this role following the acquisition by Oracle, where he also took the lead for Oracle’s cloud data integration strategy. Earlier in his career, Steve served in senior technology and product roles at The Middleware Company, AltoWeb and Cap Gemini’s Advanced Technology Group. Steve holds a Master of Engineering degree in microelectronics and software engineering from the University of Newcastle-upon-Tyne in the UK.

As 2017 is almost over, we’d like to take a moment to reflect on our engagement journey with the youth and academic communities in Africa.

One of our fundamental objectives has been to create awareness that will lead to meaningful participation of the African community in ICANN and in the wider Internet governance ecosystem. According to the African Union, 65 percent of Africans are below the age of 35. The World Economic Forum states that the 10 youngest populations of the world are all in Africa. This reality makes it imperative that we put youth at the center of our regional engagement agenda.

Africa, home to hundreds of higher education institutions in 54 countries, is also diverse, which makes effective engagement a challenge. So far, we have focused on regional platforms such as national research and education networks (NRENs) and the Association of African Universities (AAU), to bring together administrators, students, and faculty. Additionally, we have organized multiple workshops and public lectures directly with universities whenever our resources were sufficient. This blog post describes some of our outreach efforts during the year.

Two years ago, we piloted the youth community workshops, an initiative targeted at introducing young people under thirty to ICANN and the Internet ecosystem. This program complemented our global ICANN NextGen and Fellowship programs. The first workshop of this series took place in April 2016 in Ouagadougou, Burkina Faso, followed by two others – in Kenya in May and in Benin in December. In 2017, we have continued our academic outreach efforts with a number of events.

14th AAU General Conference

In June, ICANN hosted a workshop and participated in thematic panels geared toward academics and students at the AAU General Conference in Accra, Ghana. We also collaborated with Ghana ICANN Fellowship alumni, who organized an ICANN exhibition booth at the conference.

In November, we participated in UbuntuNet-Connect, the annual meeting of the UbuntuNet Alliance, held in Addis Ababa, Ethiopia. We presented Domain Name System Security Extensions (DNSSEC) and the key signing key (KSK) rollover to attending academics.

Direct University Outreach

In late May, on the periphery of the Africa Internet Summit 2017 (AIS2017) held in Nairobi, Kenya, we participated in a public lecture and panel session at the Multimedia University of Kenya. Over 50 engineering and computer science students attended the session, which they later called “eye-opening.” Interest in ICANN was evident, and students were encouraged to reach out to collaborate on research ideas and efforts.

Matogoro Jabera, an ICANN Fellowship alumnus, was inspired by his experience attending an ICANN Public Meeting as a Fellow. He shared his experience by organizing a workshop on Internet governance for his students. Held in September, this third successful school of Internet governance was supported by the leadership of the University of Dodoma in Tanzania and other Fellowship alumni like Bonface Witaba. ICANN, ICANNwiki, and the .za Central Registry (ZACR) also supported and participated in this workshop.

These efforts have yielded encouraging outcomes. A good number of ICANN alumni, whether from Namibia, Malawi, Benin or the DRC, are now active in their communities. They work regularly with our team to deliver in-class public lectures, put together workshops and set up inaugural national IGFs to further inclusive national dialogues on Internet issues. In addition, multiple universities and institutions of higher learning have expressed their willingness to establish structured memoranda with ICANN to streamline and institutionalize our partnerships.

Through these developments, it is possible to envision a future in which the youth of Africa fully participate in the Internet ecosystem. This progress will only happen if we all work together. It is our sincere hope that our alumni, from both the NextGen and Fellowship programs, will actively seek to localize the multistakeholder Internet dialogue at the national level!

We want to thank all our partners and participants in these events. As ICANN, we remain committed to organizing programs that tap into the overwhelming potential of the youth and academic communities in Africa.