Freescale hosted another successful Freescale Technology Forum (FTF) last week in Austin, TX at the brand new JW Marriott, and the company kept everyone buzzing with its new product announcements, updates on the NXP merger, and product showcases. In addition, we were treated to expansive and interactive keynotes from CEO Gregg Lowe, who demonstrated a variety of ways in which Freescale is fostering an ecosystem dedicated to accelerating the development of next-generation systems, and from Apple co-founder Steve Wozniak.

The merger with NXP was an extremely hot topic, with everyone clamoring for answers as to the implications of such a large acquisition for both sides. Back when the merger was first announced earlier this year, I found the combination to be synergistic, with little overlap. The area with the greatest overlap between the two companies, RF products, was resolved in May with the sale of NXP’s RF Power business to Chinese private equity firm JAC Capital. Both Freescale and NXP have teams of 30-40 personnel aggressively working toward the integration of the companies. It was apparent at the event, though, that much remains to be done in terms of planning for the alignment of specific product families and integrating solutions between the two organizations.

The motivation behind the acquisition of NXP is multifaceted. First, security is paramount to the development of IoT products and solutions. The addition of NXP’s security IP will amplify Freescale’s penetration of such connected applications and facilitate the development of more-comprehensive solutions. Not to be forgotten is that both companies have seen strong growth in terms of revenues and profit margin, and investor requirements are pushing for consolidation within the maturing embedded processor landscape. The merger is expected to save the combined entity nearly $400M in OPEX and $100M in COGS. No plant closures are expected, and the company is focused on making the transition as seamless as possible for customers (so no re-qualifying, no products being moved geographically, etc.).

Freescale took a unique approach to entering the Bluetooth Low Energy (BLE) realm (or Bluetooth Smart, as the company prefers to call it) by also integrating an IEEE 802.15.4 radio (the foundation for ZigBee and Thread) on its newly announced Kinetis KW40Z wireless MCUs for a variety of IoT applications. In addition, Freescale’s royalty-free BLE host stack is planned to support the upcoming Bluetooth Smart Mesh networking protocol. The heterogeneous Kinetis KW40Z MCU (there will be SKUs with just Bluetooth or just 802.15.4 support as well) is ripe for adding further capability to many of the emerging HAN and IoT applications leveraging mesh networks, including smart lighting, smart door locks, building automation, and asset or fitness monitoring.

Other embedded processor announcements included:

S32K MCUs – A new automotive MCU line aimed at significantly simplifying software development

i.MX 6Dual SCM (single chip module) Family – First of its new SCM portfolio of small integrated systems; fits an application processor, PMIC, flash memory, embedded software/firmware, and security features on a board the size of a dime

04/09/2015

The rising wave of embedded market opportunities is being carried by the Internet of Things. Technology leaders are quickly coming together to ensure their organizations (and hardware solutions) are compatible with new standards and third-party vendors. Open industry groups and alliances will be instrumental in accelerating the development, deployment, and support of end-to-end IoT products and solutions through the next several years - though this will not be the only approach.

The Open Interconnect Consortium (OIC) and AllSeen Alliance are examples of similar yet different IoT groups. While both of their Linux-based open source projects, IoTivity and AllJoyn respectively, promote interoperability across a wide variety of vertical markets and use cases, they differ in technical implementation, licensing policies, and overall progress. For instance, the OIC, with founders including Dell, Intel, and Wind River, calls for companies to “license the technology contributed to the group”. The consortium plans to publish a standard this year and has just released a preview of its IoTivity project. Meanwhile, the AllSeen Alliance recently (January 2015) modified its IP Policy so users must comply with a patent pledge, does not plan to publish an open standard, and is preparing to release the third version of its AllJoyn protocol. With dozens of members joining every month, founding organizations such as Microsoft and Qualcomm, and an upcoming third project release, AllSeen is currently ahead of the OIC in terms of development and membership base. However, with a soon-to-be finalized standard and its newly formed liaison with the Industrial Internet Consortium (IIC), we expect the OIC to gain much more momentum this year.

Another group gaining traction is the Thread Group, founded by ARM, Nest, and Samsung, which has a goal similar to that of the OIC and AllSeen but focuses solely on connected home appliances. Unlike the OIC and AllSeen Alliance, which consistently promote the openness of their IoT projects, Thread Group is a closed ecosystem of various organizations. Thread’s lack of openness will ultimately hinder its membership growth and potential expansion into new verticals, but it will allow more central control of the roadmap and supporting protocols and technologies. Regardless, membership is currently growing, and Thread plans on releasing a product certification program later this year.

The Industrial Internet Consortium, founded by AT&T, Cisco, GE, IBM, and Intel, serves a different purpose and recently formed a liaison with the OIC, exchanging “its use cases and architectural requirements focused on industrial markets” for the OIC’s promise to meet those requirements in its specification and IoTivity. This liaison will “help to accelerate the delivery of an industrial-grade IoT architectural framework.” With almost 150 members, the IIC identifies “the requirements for open interoperability standards and [defines] common architectures.” Within a year, the IIC has already released its first energy-focused testbed and will release its reference architecture in the coming months. The IIC’s new releases and partnership with the OIC are a solid formula for an increase in members and development support this year. Despite its partnership with the extremely open OIC, however, the IIC allows only its members to view its content.

Founded in 2008 by more than 10 companies including Cisco and Atmel, the IPSO Alliance promotes IP as a solution by documenting its use in technologies defined at standards organizations like the Internet Engineering Task Force. IPSO is currently working on IoT architecture guidelines across multiple vertical markets, with an emphasis on facilitating the usage and sharing of IP. Despite IPSO’s age and recent publications, the group’s membership growth seems to have plateaued at 44 companies, and it is hard to see much further development from the alliance despite its longevity. IPSO publishes content on its website, but membership is required to view its technical guidelines and use cases.

Focused solely on building a “strong and sustainable market advantage” through solutions based on Intel architectures, the Intel IoT Solutions Alliance is driven by “creating hardware, software, firmware, tools, and systems integration”. The group was formerly named the Intel Intelligent Systems Alliance. Intel’s alliance is steadily pumping out new solutions with a total of more than 2,500 currently available from its 250+ members. The Intel IoT Solutions Alliance primarily competes with ARM’s mbed initiative and supporting architecture partners.

Hoping to bring together different industries, sectors, and companies in Europe, the Alliance for Internet of Things Innovation (AIOTI) was launched less than a month ago on March 24th, 2015. With support from the European Commission, the AIOTI is gaining members quickly even in its early stages.

As the IoT industry booms, so does the number of IoT-driven alliances being formed and looking to steer the [embedded] market. While the growing number of IoT consortia, specifications, and projects will help improve a variety of connected solutions, all of these groups aim to unify organizations by creating universal technologies, standards, and frameworks. But with each tech giant partaking in different alliances and creating standards in the hope that others will adopt theirs, the IoT is running into early fragmentation, with clashing alliances fighting for their technology to be the one used by all. As most groups are in their early stages, much of the IoT has yet to be defined (and controlled).

03/03/2015

NXP’s acquisition of Freescale to form a $40B company is much more than two organizations unifying under a common banner – it is the wedding of leading embedded technology suppliers with similar, yet different, market focus and goals. Both companies provide a rich mix of embedded processors, analog and mixed signal solutions, wireless ICs, and other hardware. While there is some overlap in the companies’ microcontroller, RF, and sensor products, the rest of Freescale and NXP’s offerings are hugely complementary, with cross-selling opportunities in a variety of markets including automotive, consumer electronics, and industrial automation.

While the acquisition is not a revolutionary change in direction for NXP and its target verticals, the new combined company will feature a much broader software portfolio, new products, and much greater corporate size that will enable it to compete more effectively with other semiconductor juggernauts and align with growing user requirements for facilitating development of the software stack.

Freescale has carried strong corporate momentum over the last few years as a result of its aggressive push into the Internet of Things (IoT) and investments made facilitating software development for embedded engineers. Freescale supports a variety of software development tools and environments in addition to providing the free MQX real-time operating system (RTOS) and its commercial CodeWarrior OSEK RTOS. Freescale also provides a variety of software libraries, frameworks, protocol stacks, and more. NXP, on the other hand, has done very little to facilitate software development on its MCUs and other hardware products or to supply crucial components of the software stack – instead relying heavily on its software partners for support. Embedded software has been vital to Freescale’s embedded processor market share growth and differentiation.

The acquisition of Freescale propelled NXP to become the second-largest supplier of embedded MCUs in addition to now being a leading vendor of SoCs as well. The combined assets and offerings of Freescale and NXP will also enable the company to better compete with its rival in the embedded processors space, Renesas. While there are a lot of similarities between Freescale and NXP, the general disruption of a massive merger such as this will certainly challenge the steady revenue growth enjoyed by both companies during the past three years. Nonetheless, the combined company expects to be able to shed hundreds of millions of dollars each year as a result of expanded buying power and annual cost synergies.

One of the new vertical opportunities for NXP produced by the merger is in the communications and networking space for applications such as gateways, small cell base stations, SDN switches/routers, and network-attached storage with the QorIQ platforms and PowerQUICC communications processors. While the company could divest this particular business to maintain its focus on other industries, VDC believes NXP should maintain Freescale’s communications and networking processor lines to take advantage of the rampant growth expected in that space through 2018 (5-year revenue CAGR of 10.8%), driven by the growing strain on network operators and service providers due to pervasive mobile computing and the IoT.
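For readers unfamiliar with the metric, a compound annual growth rate like the 10.8% figure cited above is the constant yearly growth rate that carries a starting value to an ending value over a period. A quick sketch (the revenue figures here are hypothetical placeholders, not VDC data):

```python
def cagr(begin_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate: the constant yearly rate that takes
    begin_value to end_value over the given number of years."""
    return (end_value / begin_value) ** (1.0 / years) - 1.0

# Hypothetical example: revenue growing from $1.00B to ~$1.67B over 5 years
rate = cagr(1.00, 1.67, 5)
print(f"{rate:.1%}")  # 10.8%
```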

The proposed merger (expected to close in 2H 2015) is hugely beneficial for both companies. For Freescale, the acquisition means being able to shed some of the lingering restrictions from its long-term corporate debt issues. For NXP, it’s an opportunity to become a much broader provider of embedded technology with higher-margin products. The move also greatly bolsters NXP’s software support, which is becoming more important to embedded hardware value every year. It will be imperative that NXP continues to develop and build its investments in enabling software and tooling support to ensure a lasting, beneficial marriage of the company’s traditional and new businesses, products, and solutions.

10/16/2014

The value of an embedded processor is increasingly defined by its supporting software development tools and platforms, according to a recently published study by VDC Research. The most important selection criterion for embedded processors, according to VDC’s findings, is the availability of programming tools and software (see exhibit). The Internet of Things (IoT) will accelerate this trend as design teams wrestle with implementing often newfound low-power connectivity on systems that are generally more complex. Mitigating software development efforts is therefore an increasingly vital trait of embedded and IoT processor solutions.

Software development solutions are available from a variety of ecosystem players including processor vendors, core IP licensors, ISVs, and more. In fact, software enablement is a major component to the success of embedded processor market share leaders like Freescale, Intel, and Renesas. Freescale, for example, provides a variety of development tools tailored for its processor product families in addition to specific applications and functionality such as automotive and wireless connectivity. Software development capabilities are a major factor for third-party core architectures as well, as each has their own homegrown solutions supported by ecosystem partners. As a result, ARM, Imagination Technologies, and Intel have all made dramatic investments into their respective ecosystems for tooling, programming environments, OSs, middleware, and other software over the past several years.

Use and support of integrated development environments (IDEs) and more sophisticated software/systems engineering tools has grown in parallel with the increasing end-user requirements for more robust software stacks. Currently offered/supported by the majority of embedded processor vendors, IDEs help aggregate and centralize vendors’ development tools and other resources such as SDKs, application notes, sample programs, and more. "We expect it will become more critical for these vendors to extend the breadth of their offerings to incorporate other tools such as system configuration and automated testing tools," says VDC analyst Dan Mandell. "Embedded software requires the greatest distribution of development costs and resources for today’s projects and is a major opportunity for which processor vendors can add value/differentiation, attract new customers, and/or pursue new markets."

Embedded hardware suppliers have been forced to evolve from pushing devices to supporting comprehensive solutions. Beyond the core metrics of price and performance, software takes precedence among processor selection criteria. IDEs in particular have become a common battleground for swaying influence on purchasing decisions. Processor vendors will need to continue building their software expertise and support to encapsulate more end-user requirements and, ultimately, sell more hardware.

VDC’s recently published IoT & Embedded Processors market research report forecasts and analyzes the markets for commercially available CPUs, GPUs, MCUs, and SoCs and their role in powering future embedded systems. Click Here for more information about this study and our various other coverage areas.

10/09/2014

As embedded auxiliary technologies such as geofencing and gyroscopes gain popularity, the need for sensor fusion across device classes has become increasingly important. This importance has led to several mergers and innovative technology strategies from the likes of Fairchild Semiconductor and Kionix that will become more noticeable in the coming months. We expect sensor fusion to continue gaining traction in mobile/wearable device form factors and translate to the greater embedded market over the next several years.

Sensor fusion is the process by which data from several different sensors is combined to compute algorithms too complex and resource-intensive for a single sensory system to execute on its own. In other words, sensor fusion combines data from multiple sensor types to enable greater functionality in a given device.
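A textbook illustration of the idea (not any specific vendor's implementation) is the complementary filter, which fuses a gyroscope's fast-but-drifting angular rate with an accelerometer's noisy-but-drift-free tilt estimate to track device orientation:

```python
import math

def complementary_filter(angle: float, gyro_rate: float,
                         accel_x: float, accel_z: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Fuse gyroscope and accelerometer readings into one tilt estimate.

    Integrating the gyro tracks fast motion but drifts over time; the
    accelerometer-derived angle is noisy but anchored to gravity. Blending
    the two with weight alpha keeps the best properties of each sensor.
    """
    gyro_angle = angle + gyro_rate * dt            # integrate angular rate
    accel_angle = math.atan2(accel_x, accel_z)     # gravity-based tilt
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# Device at rest, tilted 0.1 rad: gyro reads ~0, gravity confirms the tilt
angle = 0.0
for _ in range(200):
    angle = complementary_filter(angle, 0.0, math.sin(0.1), math.cos(0.1), 0.01)
print(round(angle, 3))  # converges toward the true 0.1 rad tilt
```

More elaborate fusion stacks (e.g. 9-axis orientation from accelerometer, gyroscope, and magnetometer) typically replace this blend with a Kalman filter, but the principle of weighting complementary error characteristics is the same.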

The market for sensor fusion in embedded hardware is growing at an extremely fast rate. Rich Collins, marketing manager at Synopsys, believes that the growth in sensor fusion is due to the need for additional processing capability and software in embedded systems, which have generally grown increasingly complex. Some of the most compelling solutions have come from companies such as Synopsys, with its DesignWare Sensor Subsystem, as the current desire in the embedded marketplace is for more integrated solutions. Another interesting innovation comes from Kionix, whose MEMS sensors are scalable across multiple operating systems and support third-party software. Kionix is becoming increasingly notable for its developments in exercise technology and 3D gaming.

As a result of the increasing desire for embedded sensory integration, several major players in the sensor fusion market have seen mergers and acquisitions in the past 12 months. For instance, in May 2014, Fairchild Semiconductor, a provider of high performance semiconductors, announced its acquisition of Xsens, a Dutch company known for its motion tracking software. In June 2014, Audience announced it was acquiring Sensor Platforms. The fusion of these companies is leading to creative new ways of writing the algorithms required for sensor fusion. The escalating activity within the sensor fusion market will also drive growth within the MEMS market, to which several of these players have close ties.

Sensor fusion is already translating well from consumer devices to the embedded market as a result of increasing connectivity, decreasing prices for sensor devices, and the Internet of Things. The nearly limitless applications for sensor fusion will have major implications for future embedded systems.

10/04/2014

VDC Research congratulates STMicroelectronics for being selected as the winner of our embedded hardware ‘Best of Show’ award at ARM TechCon 2014!

STMicroelectronics announced the STM32 F7 line of microcontrollers based on the new ARM Cortex-M7 architecture last week, and it had a functioning unit at its booth running various benchmarks side-by-side with an STM32 processor based on the Cortex-M4. The company compared the time needed to process three different types of data in separate tests based on ray tracing, fractal imaging, and 3D-vectorial computation. The Cortex-M7 device delivered almost twice the general-purpose and digital signal processing performance of its predecessor. The STM32 F7 MCU series operates at frequencies up to 200 MHz, using the core’s 6-stage superscalar pipeline to produce up to 1,000 CoreMarks.

The new 32-bit Cortex-M7 processor architecture fills the midpoint between ARM’s current Cortex-M4 and low-end Cortex-A IP. It is targeted towards high-end embedded and IoT applications such as motor control, industrial automation, image processing, connected car, smart home, wearables, and more. Other leading ARM partners for the Cortex-M7 processor include Atmel and Freescale.

Once more, congratulations to STMicroelectronics!

We have another blog available from our time at ARM TechCon 2014 and JavaOne highlighting notable embedded security demonstrations and solutions.

For more information on our recently published Embedded Processors report, which analyzes the market for commercially available CPUs, GPUs, MCUs, and SoCs and their role in powering future embedded systems, click here.

09/19/2014

What if, no matter where you are or how many people are connected to a network, you were able to perform every function you wanted with lower latency and at faster speeds than what is possible today? Think limitless connectivity: 5G mobile communications technology. While the very concept of 5G technology is still in its infancy, leaps and bounds are being made each year to establish its underlying technologies via research efforts across the globe. South Korea’s 5G Creative Mobile Strategy forum projects this next-generation wireless system to debut as early as 2018. However, most engineers do not believe 5G technologies will truly make their way into the hands of end users until sometime after 2020. VDC believes the latter, as several market elements spanning standardization and communications technology must advance and align to fulfill the prospect of the next mobile network generation.

Although the definition of 5G has yet to be solidified, common visions for the technology include broader spectrum availability, multiple antennas, new waveforms, and heterogeneous networks. In order to make efficient use of available spectrum, 5G researchers are investigating millimeter wavebands, which offer wider bandwidths and thus increase data capacity. By implementing multiple antennas, also known as multiple-input, multiple-output (MIMO) technology, the spectrum will be able to sustain multiple data streams seamlessly. To support greater capacity, new and more-proficient signal structures—such as non-orthogonal multiple access (NOMA) and generalized frequency division multiplexing (GFDM)—must be examined. Heterogeneous networks, or “HetNets,” will augment, possibly even replace, traditional large-tower base stations. HetNets employ small cells (e.g. femtocells, picocells, etc.) to strengthen connectivity when placed close together on rooftops or buildings. The utilization of HetNets will benefit high-traffic areas and increase backhaul capacity. If these visions of 5G are brought to fruition, wireless communication systems will never be the same. Although the aforementioned components of 5G will undoubtedly create an overall better network, many factors will determine the direction and success of this next generation system.

While embedded technology will eventually be capable of creating a 5G network, that does not mean there is currently a real business driver behind the switch to 5G. The companies that supply a 5G network will have to create not only a rich user experience in a technical sense, but also incentives and new business models encouraging adoption. If the transition is not seamless, 5G could go down the same path as WiMAX, which fizzled out of popular use. Even 4G LTE has not been adopted as quickly as vendors and network operators would have liked.

So what makes 5G different from these historical “break-through” technologies? Potential drawbacks of this next generation network include battery life and high product costs. If devices will always be connected, embedded engineers will need to implement new strategies to reduce device power consumption—which will come in the form of new advanced hardware architectures, low-power (yet higher-compute) processing cores, and improved battery technology. Another factor associated with always being connected is cost; how much does endless connectivity cost, exactly? Additional costs for establishing and maintaining connectivity will be levied by mobile network operators to justify the capital expenditure required for 5G deployments. A 5G network requires several technological advances that will consequently increase the cost of associated devices. With these potential drawbacks in mind, several countries, companies, and academic institutions are investing billions of dollars in research toward the creation of a 5th generation wireless system.

Only a few countries are in the race to create and implement a true 5G wireless system. South Korea, Japan, and the United States are the front runners, with China and the European Union trailing behind. South Korea has begun to invest a $1.5 billion budget in its 5G Creative Mobile Strategy. This forum is led by SK Telecom and includes steering committee members Ericsson-LG, LG Electronics, and Samsung, as well as Intel and Qualcomm. Japan’s NTT DoCoMo is collaborating with six vendors—Alcatel-Lucent, Ericsson, Fujitsu, NEC, Nokia, and Samsung—in research and experimental trial efforts. Like Japan, the United States has no government-driven 5G research and development efforts; thus, academia and private enterprises are leading these countries’ 5G research. U.S. companies such as National Instruments and Agilent Technologies are top researchers in 5G technology. WICAT, a multi-university academic research and design center, conducts extensive research surrounding 5G; its members include Polytechnic Institute of NYU, Virginia Tech, University of Texas at Austin, Auburn University, and the University of Virginia. Intel has formed a research partnership with several universities to investigate 5G technologies. Other companies involved in 5G research include Broadcom and Huawei. These institutions are only a few of the thousands researching next-gen networks.

Possibly the largest determinant in developing and deploying 5G wireless systems is infrastructure capability. Even if there is a high demand for this new network service, whether the cellular industry has the funds to build up the new required infrastructure is arguably the most important factor of 5G’s future. The technology may be there, but installing new wire lines and base stations requires significant investments.

5G will have massive implications for nearly every industry. In a world with 5G-based networks, M2M connectivity will be the norm and expected. Whether you are connected to your fridge, microwave, car, or office equipment, these machines will be able to monitor themselves and advise users of their status. Connectivity will be pivotal to business processes; thus, value chains will be shaken, facilitating new opportunities within industry and new business models. Pervasive M2M connectivity will help automate basic tasks for companies, creating a more agile workforce. M2M services are a powerful driver in bringing 5G to the forefront; however, the technology and user experience must be seamless in order to make this limitless connectivity a standard in our world.

08/04/2014

As the hype and potential for the Internet of Things (IoT) continue to grow, we wanted to address some of the key issues concerning device interoperability and developer collaboration. Specifically, we will be looking at some of the most important communications protocols for the IoT today and how industry players are preparing for the future.

A communications protocol is an established system of rules that spells out the specific details for a type of communication between devices. This is distinct from a communications standard, upon which protocols are built and which serves as a common format to enable interoperability.

Four such communications protocols currently gaining the most traction for the development of IoT applications are MQTT, DDS, XMPP, and AMQP.

MQTT is a protocol that collects data from IoT-enabled devices and transmits it to back-end servers. MQTT is primarily implemented in remote monitoring applications such as energy use and equipment maintenance. Facebook, for example, is a well-known user of MQTT for its Facebook Messenger application, as the protocol is able to function with limited battery power and data bandwidth.
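Under the hood, MQTT is a publish/subscribe protocol: devices publish to slash-delimited topics, and subscribers register topic filters in which `+` matches one level and `#` matches all remaining levels. A minimal sketch of that topic-matching rule (illustrative only; it omits broker details and spec edge cases such as a `#` filter matching its parent topic):

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Return True if an MQTT-style topic filter matches a concrete topic.

    '+' matches exactly one topic level; '#' matches all remaining levels
    and is only valid as the final level of a filter.
    """
    f_levels = filter_.split("/")
    t_levels = topic.split("/")
    for i, f in enumerate(f_levels):
        if f == "#":                       # multi-level wildcard: match rest
            return True
        if i >= len(t_levels):             # filter is longer than the topic
            return False
        if f != "+" and f != t_levels[i]:  # '+' matches any single level
            return False
    return len(f_levels) == len(t_levels)

# A remote-monitoring subscriber might filter on all meters' energy readings
print(topic_matches("site/+/energy", "site/meter42/energy"))   # True
print(topic_matches("site/#", "site/meter42/status/battery"))  # True
print(topic_matches("site/+/energy", "site/meter42/status"))   # False
```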

Next, the DDS protocol enables device-to-device communication by transmitting collected data directly between devices. This type of communication is typically used for high-performance systems—such as medical devices, transportation, smart cities, and military devices—that require instant connectivity. NASA, for example, has used DDS middleware to support human-to-robot communications from Earth to space.

XMPP serves for person-to-person communication, enabling personal control of IoT-enabled devices through users’ smartphones. It is generally used for consumer devices, and applications such as Google Talk have used the XMPP protocol.

Finally, AMQP is a protocol that facilitates server-to-server communication and enables a secure and reliable connection for control or analysis of data collected. AMQP was developed in the banking industry and is used most often in business messaging to send messages between servers using a tracking mechanism that ensures secure delivery.
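The delivery-tracking idea described above — a message stays in flight until the receiver acknowledges it, and can be redelivered otherwise — can be sketched as follows. This is an illustrative in-memory model of at-least-once delivery, not AMQP's actual wire protocol; all names here are hypothetical:

```python
from collections import deque

class ReliableQueue:
    """Illustrative at-least-once delivery: each delivered message stays
    tracked as 'unacked' until the consumer acknowledges it, so an
    unacknowledged message can be redelivered rather than silently lost."""

    def __init__(self):
        self._pending = deque()   # messages awaiting delivery
        self._unacked = {}        # delivery tag -> message in flight
        self._next_tag = 0

    def publish(self, message):
        self._pending.append(message)

    def deliver(self):
        """Hand the next message to a consumer along with a delivery tag."""
        message = self._pending.popleft()
        self._next_tag += 1
        self._unacked[self._next_tag] = message
        return self._next_tag, message

    def ack(self, tag):
        """Consumer confirms processing; the broker can now forget it."""
        del self._unacked[tag]

    def requeue_unacked(self):
        """E.g. on consumer disconnect: put in-flight messages back."""
        for tag in sorted(self._unacked):
            self._pending.append(self._unacked.pop(tag))

q = ReliableQueue()
q.publish("payment.settled")
tag, msg = q.deliver()
# Consumer crashes before acking -> broker redelivers the message
q.requeue_unacked()
tag2, msg2 = q.deliver()
q.ack(tag2)
print(msg2, len(q._unacked))  # payment.settled 0
```

Real brokers layer persistence, prefetch limits, and consumer bookkeeping on top of this, but the tag-and-acknowledge loop is the core of the "secure delivery" tracking described above.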

A number of technology leaders are already in the midst of developing standards specifically for the IoT market. In 2011, Qualcomm announced the development of the open source AllJoyn protocol, later forming the AllSeen Alliance to steward it alongside Cisco, Microsoft, LG, HTC, and others. The AllJoyn protocol is intended to enable connectivity and maintenance across connected devices.

Another consortium that is focused on developing standards for industrial IoT use is the Industrial Internet Consortium (IIC). The IIC was founded by Cisco, AT&T, GE, IBM, and Intel, but it has not yet released any specifications since it was first announced in March. Further, the Open Interconnect Consortium (OIC) was first announced earlier this summer and includes Atmel, Samsung, Wind River, Dell and Broadcom among its members. The OIC has stated that it intends to collaborate with the open source community in order to help foster innovation.

The most recent news came from Google last month, when it announced the development of a new networking protocol, Thread, with the aim of establishing a communications standard for IoT-enabled household devices. Nest, a connected device manufacturer acquired by Google earlier this year, already uses Thread in its line of products, which includes the Nest Learning Thermostat and Nest Protect.

This rise in industry consortia shows an active effort to increase standardization for the IoT sector, which has until recently seen mostly fragmented and varying growth without intercompany cooperation. In order to truly increase connectivity between different devices, developers will need to work together to ensure compatibility among protocols as the IoT continues to spread.

06/26/2014

The indisputable rise of connectivity prompted by the Internet of Things across industries will spur strong demand for gateway devices bridging potentially thousands of sensors, machines, or other products per device to the internet or cloud for years to come. Consequently, the gateway has access to a trove of potentially sensitive and valuable data. An M2M/intelligent gateway is therefore a major security asset or liability, depending on OEMs’ efforts to prevent current and emerging threats from afflicting host networks or the device itself. Embedded security will be vital to the success of current and future M2M gateway solutions.

M2M Gateway: A compact, flexible class of hardware platform enabling communication between end devices in the field (e.g. actuators, sensors, etc.) and the cloud/internet, with the added functionality and/or flexibility (e.g. multi-standard support) and/or ruggedness of an embedded industrial computer. M2M gateways reside on the edge (i.e. defined boundary) of wired and wireless networks.

Nearly all OEMs we interviewed in support of our upcoming M2M and Intelligent Gateways Market Analysis Report cited data security as a major point of concern. In fact, preliminary findings from our 2014 Annual Embedded Engineer Survey echo this issue; security issues are cited as the most challenging aspect of introducing embedded cloud services through M2M gateways.

Gateway OEMs are implementing a number of technologies and standards within their products to ensure end-to-end data security for end users. For example, Eurotech’s ReliaGATE gateways come standard with support for encryption protocols like 3DES, AES, IPsec, and SSH. ILS Technology’s M2M Asset Gateway platform enables role-based security controls that allow users to segregate asset access in addition to the resources within each asset. Sufficiently addressing the vast array of security threats will require a multi-pronged approach by OEMs at various levels of the gateway solution stack, not only digitally but, in some deployments, physically as well.
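To make the end-to-end data-integrity idea concrete, here is a minimal, illustrative sketch, not drawn from any of the vendors above, of how a gateway might verify that a sensor payload has not been tampered with in transit. All names, field layouts, and the shared key are hypothetical; in a production deployment this would sit underneath transport encryption such as IPsec or TLS and proper key provisioning.

```python
import hashlib
import hmac
import json

# Hypothetical shared secret provisioned to both the sensor node and the gateway.
SECRET_KEY = b"example-provisioned-device-key"

def sign_payload(payload: dict) -> dict:
    """Sensor side: attach an HMAC-SHA256 tag over the serialized payload."""
    body = json.dumps(payload, sort_keys=True).encode("utf-8")
    tag = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return {"body": payload, "tag": tag}

def verify_payload(message: dict) -> bool:
    """Gateway side: recompute the tag and compare in constant time."""
    body = json.dumps(message["body"], sort_keys=True).encode("utf-8")
    expected = hmac.new(SECRET_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_payload({"sensor": "temp-01", "reading_c": 21.5})
assert verify_payload(msg)

# A tampered reading fails verification at the gateway.
msg["body"]["reading_c"] = 99.9
assert not verify_payload(msg)
```

The constant-time comparison (`hmac.compare_digest`) matters here: a naive string comparison can leak timing information that helps an attacker forge tags.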

As the IoT continues to spread, so too will the number and variety of attacks and malicious software. OEMs of M2M gateways will need to ensure proper security of their devices pre- and post-deployment. Over-the-air updates are a fundamental requirement of today’s M2M gateways to ensure protection throughout the lifetime of the device. Embedded security requirements are greater in some vertical applications than others, but all must meet basic protocols to ensure network uptime, if nothing else. Security is a major concern for the IoT, and OEMs will be held accountable for breaches and other intrusions on their devices.
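An over-the-air update mechanism typically refuses to install any image that fails a cryptographic check. The sketch below, with hypothetical names and data, shows the simplest form of that gate, a SHA-256 digest comparison against a manifest value, using only Python's standard library; real gateways generally also verify an asymmetric signature over the image so that a compromised update server cannot forge the manifest.

```python
import hashlib

def verify_firmware(image: bytes, expected_sha256: str) -> bool:
    """Reject any OTA firmware image whose digest does not match the manifest."""
    return hashlib.sha256(image).hexdigest() == expected_sha256

# Hypothetical firmware image and the digest published in its update manifest.
firmware = b"\x7fELF...example firmware image bytes"
manifest_digest = hashlib.sha256(firmware).hexdigest()

assert verify_firmware(firmware, manifest_digest)

# Even a single corrupted or injected byte causes the update to be rejected.
assert not verify_firmware(firmware + b"\x00", manifest_digest)
```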


05/09/2014

This year’s Axeda Connexion conference in Boston was full of interesting and insightful ideas about the future of M2M communications and the Internet of Things. A variety of global enterprises sponsored the event and contributed to various keynotes and sessions, including AT&T, Broadcom, Deutsche Telekom, Intel, Oracle, Salesforce, Wipro, and many more. The Internet of Things will have a major impact on traditional business models for embedded hardware players, software vendors, service providers, network operators – everyone throughout the value chain. The change for end users will be as much cultural as it will be technological.

The following are five key takeaways from our time at Axeda Connexion 2014:

The bulk of the M2M opportunity is with assets already deployed today. The leading industries for M2M solutions, such as Automotive, Industrial Automation, and Oil & Gas, all have equipment and devices featuring long lifecycles and high costs. End users are looking to connect their “dumb” devices and sensors to enable new applications in a very cost-sensitive manner. The need for connectivity in legacy equipment will drive demand for embedded hardware form factors that can easily augment a system with connectivity, whether via modules, new chipsets, or peripheral systems.

Security is a paramount concern of IoT development. Nearly every presentation or keynote highlighted security as a major concern for the development of connected systems. OEMs will need to regularly adjust to new threats in order to preserve the integrity of the device, platform, network, and last but not least, the data. The example frequently cited was the alarming lack of security found in many connected medical devices produced today. Ecosystem players will need to adopt new methods and lines of thinking to preserve trust (brand integrity) with end users when it comes to security in the IoT.

Those who adopt big data applications now will have a tremendous competitive advantage for years to come. Big data applications such as asset monitoring, analytics, and remote services can enable several internal and external benefits, and these competitive advantages are only increasing due to lagging adoption from competitors. For instance, analytics can provide new insights for future product designs and help facilitate iterative development in an embedded world where time-to-market pressures are constantly growing. Big data applications can also enable newfound efficiencies for internal processes such as field service and operations. Markus Breitbach, Vice President of Global Sales & Marketing of Deutsche Telekom, claims that the ROI of connected solutions is less than a year for most end users.

Strategic partnerships are a requirement for scalability in the IoT. Integration is the biggest strength of an IoT solution. As the value proposition of an embedded device shifts from hardware toward software, as a result of the proliferation of more-powerful processors, memory, and systems, traditional manufacturers will be required to become more IT-centric to supply more of the solution stack (and services). Doing so will require strategic partnerships with a variety of ecosystem players, particularly when supporting cellular connectivity or international deployments. Rob Sobie, Vice President, Healthcare at Emerson, said at his session (in so many words) that Emerson could never have pursued the IoT market on its own. To put that into perspective, Emerson generated about $24.6B in revenue last year and has approximately 132,000 employees worldwide.

Growth of M2M connectivity will be different for each industry. Each industry will have its own timeline for adoption of connectivity and big data technology. One concern in areas such as the Medical and Financial industries will be tight regulations surrounding patient/personal data, which are exacerbated by international laws. Consumer applications, though, are already proliferating in developed countries with the spread of mobile and wearable devices. The Industrial Automation and Energy industries, by contrast, have very clear use cases and tangible benefits from enabling a connected solution, and will continue to be among the driving vertical markets for the IoT.