We are in an era of rapid, exponential change that is reshaping how we work and live. A few years back, the McKinsey Global Institute published an informative analysis of the economic impact of global technology trends, "Disruptive Technologies: Advances that will transform life, business, and the global economy." Many of its predictions have proven accurate. As we move into another year of unprecedented technological advancement, it is useful to examine the trends, technologies, and applied verticals already shaping 2018. By 2019, 40% of digital transformation initiatives will use artificial intelligence; 75% of commercial enterprise apps will use AI; over 90% of consumers will interact with customer support bots; and over 50% of new industrial robots will leverage AI, Big Data, and digital transformation.

Below is a compilation of categories, lists, and short analyses that should be useful as heuristic tools for tracking and navigating the rapid, transformational changes in our path.

TRENDS & TECHNOLOGIES:

Artificial Intelligence: Gartner describes artificial intelligence as a "technology that appears to emulate human performance typically by learning, coming to its own conclusions, appearing to understand complex content, engaging in natural dialogs with people, enhancing human cognitive performance or replacing people on execution of non-routine tasks." The promise of these technologies is very exciting. Microsoft UK's chief envisioning officer Dave Coplin claimed that AI is "the most important technology that anybody on the planet is working on today." Breakthroughs in human/computer interfaces will extend human brain capacity and memory. There are new developments in "neuromorphic" technology that can incorporate nano-chips, modeled on the human brain, into wearables. Eventually these nano-chips may be implanted into our brains, artificially augmenting human thought and reasoning capabilities. Companies are already developing technology to distribute artificial intelligence software to millions of graphics and computer processors around the world. McKinsey predicts a $5 trillion to $7 trillion potential economic impact by 2025 from the automation of knowledge work by intelligent software systems that can perform knowledge-work tasks from unstructured commands. We may also have artificially intelligent personal assistants, perhaps even in holographic form in some sort of augmented reality. Google, Facebook, Microsoft, and Twitter have formed dedicated artificial intelligence teams and are prioritizing them throughout their companies.

Machine Learning: According to Gartner, machine-learning technology combines the "information of everything" with smart machine algorithms to make an algorithmic business possible. Machine learning draws on a set of technologies, including deep learning, neural networks, and natural-language processing, used in both supervised and unsupervised ways to understand information, activities, and the world. Business applications of machine learning include chatbots, avatars, and digital assistants.

3D Printing and 4D Self-Assembling Printing: 3D printing is trailblazing future manufacturing. It builds a three-dimensional object layer by layer from computer-aided design (CAD) programs: to print the object, the computer divides it into flat layers that are printed one by one. Printing with advanced pliable materials such as plastics, ceramics, metals, and graphene has already produced breakthroughs in medical prosthetics and wearable sensors.
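The layer-by-layer division at the heart of 3D printing can be sketched in a few lines. This is an illustrative simplification (a real slicer intersects the CAD mesh geometry with each cutting plane); the function name and parameters are hypothetical:

```python
def slice_heights(z_min, z_max, layer_height):
    """Compute the z-coordinate of each flat layer a printer deposits,
    bottom to top, given the part's height range and layer resolution."""
    n_layers = int(round((z_max - z_min) / layer_height))
    return [z_min + i * layer_height for i in range(n_layers)]

# A hypothetical 10 mm tall part printed at 0.2 mm resolution yields 50 layers.
layers = slice_heights(0.0, 10.0, 0.2)
```

Finer layer heights trade print time for surface quality, which is why slicer settings matter as much as the printer itself.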

High Performance Computing (Super & Quantum): The world of computing has witnessed seismic advancements since the invention of the electronic calculator in the 1960s. The past few years in information processing have been especially transformational in our hyper-connected world. What were once science fiction fantasies are now technological realities. Classical computing has become exponentially faster and more capable, and our enabling devices smaller and more adaptable. Computing now underpins almost all that we do, and much of it is already stored in the cloud. The exponential upsurge of data and its uses directly impacts the critical infrastructure of society, including health care, security, transportation, communications, and energy. We are starting to evolve beyond classical computing into a new data era called quantum computing. It is envisioned that quantum computing will accelerate us into the future by reshaping the landscape of artificial intelligence and data analytics. Quantum computing's power and speed will help us solve some of the biggest and most complex challenges we face as humans. Futurist Ray Kurzweil has said that mankind will be able to "expand the scope of our intelligence a billion-fold" and that "the power of computing doubles, on average, every two years." Seymour Cray is commonly referred to as the "father of supercomputing," and his company, Cray Research, remains a driving force in the industry. Supercomputers are differentiated from mainframe computers by their vast data storage capacities and expansive computational powers. The website Techtarget.com provides a strong working definition of HPC: "the use of parallel processing for running advanced application programs efficiently, reliably and quickly. The most common users of HPC systems are scientific researchers, engineers and academic institutions. Some government agencies, particularly the military, also rely on HPC for complex applications."
HPC works hand-in-hand with supercomputing, as it requires the aggregation of computing power to address problems and find solutions. The National Academy of Sciences, in its study "The Future of Supercomputing," envisions investments in supercomputing as highly beneficial, playing an essential role in national security and scientific discovery.
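The parallel-processing idea behind that definition can be illustrated at toy scale: split a large computation into chunks, compute the chunks concurrently, and aggregate the partial results. This sketch uses a thread pool for brevity (real HPC systems distribute work across many nodes, e.g. with MPI); all names here are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(bounds):
    """Compute one chunk's contribution: the sum of squares over [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    """Split range(n) into chunks, process them in parallel, and combine."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

total = parallel_sum_of_squares(1000)  # matches the serial computation
```

The same divide-and-aggregate pattern, scaled to thousands of processors, is what lets supercomputers run simulations no single machine could handle.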

Quantum Computing: "Quantum information science has the potential to revolutionize all manner of industries, open up new fields of discovery and accelerate scientific breakthroughs," said Michael Kratsios, Deputy Assistant to the President for Technology Policy. Gartner describes quantum computing as "[t]he use of atomic quantum states to effect computation. Data is held in qubits (quantum bits), which have the ability to hold all possible states simultaneously. Data held in qubits is affected by data held in other qubits, even when physically separated. This effect is known as entanglement." In simplified terms, quantum computers use quantum bits, or qubits, instead of the traditional binary bits of ones and zeros used for digital communications. The Chairman of the House Homeland Security Committee minced no words about the ongoing global arms race to develop quantum computing, saying the country that harnesses it first would have a remarkable advantage in the 21st century. "I think whatever superpower gets that first, it would be like the equivalent of the first digital nuclear bomb," Rep. Mike McCaul said at the American Enterprise Institute, speaking about U.S. competition with global adversaries such as China, Russia, North Korea, and Iran. "Whatever country that gets that first is going to be an extraordinary superpower. It will blow computing as we know it out of the water."
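The qubit behavior Gartner describes, superposition and entanglement, can be demonstrated with a tiny state-vector simulation on a classical machine. This is a pedagogical sketch, not how quantum hardware works: a Hadamard gate puts the first qubit into superposition, then a CNOT gate entangles the pair into a Bell state, so measuring one qubit fixes the other:

```python
import math

# Amplitudes over the two-qubit basis states |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]  # start in |00>

def hadamard_on_first(s):
    """Apply a Hadamard gate to the first qubit: creates superposition."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """Controlled-NOT, control = first qubit: swaps |10> and |11>."""
    return [s[0], s[1], s[3], s[2]]

bell = cnot(hadamard_on_first(state))
probs = [abs(a) ** 2 for a in bell]
# probs concentrates on |00> and |11>: the qubits' outcomes are correlated
# (entangled) even though neither outcome is determined in advance.
```

Note the cost of this classical trick: the state vector doubles with every qubit added, which is precisely why simulating even ~50 qubits exhausts classical machines and why quantum hardware matters.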

Big Data & Analytics: According to the Gartner IT Glossary, Big Data is high-volume, high-velocity, and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision making. Eric Schmidt, then CEO of Google, estimated that we now produce as much data every two days as we did from the inception of early civilization until the year 2003 combined. Organizing, managing, and analyzing data is therefore more important than ever. It is estimated that by 2020 there will be 35 zettabytes of digital data. Big Data comprises the data governance of everything, including geospatial data, 3D data, audio and video, unstructured text, and social media. A major focus of R&D investment is how to handle high-speed streams of both "structured data" (residing in predetermined fields) and "unstructured data" (not organized in a pre-defined manner). Roughly eighty percent of data is unstructured. That means specialized optic technologies, software algorithms, and innovative processes are necessary to de-clutter data and allow for distillation and sophisticated assessment. The goal of this type of technology is a deployable, fully automated, real-time, secure way to collect and analyze complex streams of data. Digital transformation of data includes digitizing the user experience, data flow, supply chain management, governance, engagement, e-government, and virtual government. In its most basic description, it is turning paper into electronic records: going from paper-based to electronically based systems of documentation requires data collection, processing, and analysis. Last year at the annual World Economic Forum meeting in Davos, it was announced that the combined value of digital transformation, for society and industry, could exceed $100 trillion by 2025. That transformation includes the immersive inclusion of digital technologies and cloud-based platforms.
It also includes analytics, sensors, mobility, and a new era of automation impacting all industries and verticals, including financial, energy, security, communications, and health.

Data Analytics: With the rise of big data, data-driven decision making and probabilistic thinking are growing. Data is everywhere, flowing from the sensor networks that surround us and from the transactional roots of our activities. What, why, and how we make choices in our lives are reflected in, and can be discerned through, the collection, organization, and taxonomy of that data. When extracted data is systematically combined with multi-layered analytics, it yields forensic and predictive meaning that can be transformed into actionable insights in reporting systems. The future of applied data analytics looms bright, and the data sets of disparate information are seemingly endless. Technological advances such as "machine thinking," which will allow connected devices on the Internet of Things to talk to and learn from each other, will contribute immensely to the use of data analytics. Open data sharing is also catalyzing the development of new analytical capabilities. In the private sector, information mined from transactions can be used for demographic analysis, to calculate consumer purchasing habits and credit risks, and to predict consumer trends. Financial institutions can use predictive algorithms on market and transactional data to create the best financial management options. Combined with social media analytics, optimizing economic forecasting has become a new data-analytic art.
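A minimal example of the predictive algorithms mentioned above: an ordinary least-squares trend fitted to monthly transaction volumes, then extrapolated one month ahead. The data and names are invented for illustration; production systems use far richer models, but the principle of turning historical records into a forward-looking estimate is the same:

```python
def linear_trend(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical monthly transaction volumes for the first half of the year
months = [1, 2, 3, 4, 5, 6]
volume = [100, 110, 119, 131, 140, 152]
a, b = linear_trend(months, volume)
forecast_month_7 = a + b * 7  # the fitted line, extrapolated forward
```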

Internet of Things: The Internet of Things (IoT) refers to the general idea of things that are readable, recognizable, locatable, addressable, and/or controllable via the Internet. Almost everything nowadays is connected to the internet by sensors. Cisco, which prefers the term "Internet of Everything" to "Internet of Things," predicts that 50 billion devices (including our smartphones, appliances, and office equipment) will be wirelessly connected via a network of sensors to the internet by 2020. The analyst firm IDC predicts that spending on IoT will reach nearly $1.4 trillion in 2021. Dr. Janusz Bryzek, vice president of MEMS and Sensing Solutions at Fairchild Semiconductor, predicts there will be 45 trillion networked sensors 20 years from now, driven by smart systems, including IoT, mobile and wearable market growth, digital health, context computing, global environmental monitoring, artificial intelligence (AI), hyperimaging, macroscopes, medical "labs on a chip," and silicon photonics. IoT is conjoined with the Internet of Everything (IoE). Cisco defines IoE as the networked connection of people, process, data, and things; its benefit derives from the compound impact of connecting all four and the value this increased connectedness creates as "everything" comes online. Gartner lists the pillars of IoE as People, Data, Process, and Things. Security is and will continue to be a major factor in both IoT and IoE.

5G and IoT: Fifth-generation wireless, or 5G, is the latest iteration of cellular technology, engineered to greatly increase the speed and responsiveness of wireless networks. 5G will be the backbone of most future networks, and new 5G connections will be used in IoT connectivity, critical infrastructure, and all industry verticals.
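At the device level, IoT connectivity often amounts to sensors publishing small structured messages to a gateway over the network. A sketch of one such telemetry payload, with an invented schema and device name (real deployments typically use protocols such as MQTT on top of messages like this):

```python
import json
import time

def sensor_reading(device_id, temperature_c, humidity_pct):
    """Package one sensor sample as a small JSON message a device
    might publish to an IoT gateway. The field names are illustrative."""
    return json.dumps({
        "device_id": device_id,
        "ts": int(time.time()),        # Unix timestamp of the sample
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
    })

msg = sensor_reading("thermostat-42", 21.5, 40.0)
decoded = json.loads(msg)  # the gateway parses it back into structured data
```

Multiply a message like this by billions of devices reporting every few seconds and the Big Data volumes discussed earlier follow directly.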

Smart Cities: Smart cities integrate transportation, energy, water resources, waste collection, smart-building technologies, and security technologies and services. The term "smart city" connotes creating a public/private infrastructure to conduct activities that protect and secure citizens. This includes shared situational awareness and integrated operational actions to prevent, mitigate, respond to, and recover from cyber incidents as well as crime, terrorism, and natural disasters. In the past few years, cities have migrated from analog to digital and have become increasingly "smart." A smart city uses digital information and communication technologies to enhance the quality and performance of urban services, reduce costs and resource consumption, and engage more effectively and actively with its citizens; it is, in effect, a laboratory for applied innovation. A smart city and its accompanying ecosystem can influence industrial verticals including transportation, energy, power generation, and agriculture. Frost & Sullivan estimates the combined global market potential of smart city segments (transportation, healthcare, building, infrastructure, energy, governance) at $1.5 trillion ($20 billion by 2050 on sensors alone, according to Navigant Technology). Experts at the SmartAmerica Challenge predict that $41 trillion will be spent on smart cities over the next 20 years to upgrade infrastructure to benefit from IoT advances.

Cybersecurity: 5,207 breaches and 7.89 billion information records were compromised globally in 2017. Information assurance and resilience are the glue that will keep our world of converged sensors and algorithms operational. Technology development continues to evolve with new innovations addressing the cybersecurity framework of networks, payloads, endpoints, firewalls, anti-virus software, and encryption; this framework will provide for better resiliency as well as forensic analysis capabilities. Newer areas of cybersecurity spending include cloud, authentication, biometrics, mobility, and automation, including self-encrypting drives, along with supercomputing and quantum computing. In the U.S., most (approximately 85 percent) of the cybersecurity critical infrastructure, including defense, oil and gas, electric power grids, healthcare, utilities, communications, transportation, banking, and finance, is owned by the private sector and regulated by the public sector. With larger attack surfaces and greater connectivity, companies and agencies have to be better at incident detection, response, and recovery. There is a growing need in government and the private sector for: 1) better encryption, authentication, and biometrics (quantum encryption, keyless authentication, etc.); 2) automated network security and adaptive self-encrypting drives (artificial intelligence and machine learning) to protect critical infrastructure in all categories; 3) the protection of critical infrastructure through technologies and public-private cooperation; 4) technologies for "real time" horizon scanning and monitoring of networks; 5) advanced defense for framework layers (network, payload, endpoint, firewalls, and anti-virus); and 6) diagnostic and forensic analysis.
Cybersecurity automation is certainly key. "There are too many things happening - too much data, too many attackers, too much of an attack surface to defend - that without those automated capabilities that you get with artificial intelligence and machine learning, you don't have a prayer of being able to defend yourself," said Art Coviello, a partner at Rally Ventures and the former chairman of RSA. Many other emerging technologies are part of the future cybersecurity toolkit, including edge computing, encryption, virtualization, photonics, hypervisors, hardware-based trust anchors, anti-malware detection systems, and converged software-defined environments.
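One concrete building block behind the authentication needs listed above is message authentication: a shared-secret HMAC lets a receiver verify that a command really came from the key holder and was not tampered with in transit. A minimal sketch using Python's standard library (the messages and key handling are illustrative; real systems also need key distribution and replay protection):

```python
import hashlib
import hmac
import secrets

key = secrets.token_bytes(32)  # shared secret between, e.g., device and server

def sign(message: bytes, key: bytes) -> str:
    """Produce an authentication tag for the message under the shared key."""
    return hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag and compare in constant time to resist timing attacks."""
    return hmac.compare_digest(sign(message, key), tag)

tag = sign(b"open valve 7", key)
ok = verify(b"open valve 7", tag, key)        # genuine message verifies
tampered = verify(b"open valve 8", tag, key)  # any alteration is detected
```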

Blockchain also offers promise as a cybersecurity application. It is a peer-to-peer network with a shared, distributed ledger. Blockchain's decentralized technology offers defenses against many types of attacks because it removes the single points of failure that hackers often prey upon. It is already being used in the financial sector and offers selective transparency and privacy.
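The ledger's tamper-resistance comes from each block carrying a cryptographic hash of its predecessor, so altering any historical record invalidates every block after it. A minimal sketch of that chaining (omitting the peer-to-peer consensus and proof-of-work that real blockchains add on top):

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's canonical JSON form with SHA-256."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, data):
    """Append a block that commits to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def chain_is_valid(chain):
    """Every block's stored prev_hash must match its predecessor's real hash."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger = []
add_block(ledger, "alice pays bob 5")
add_block(ledger, "bob pays carol 2")
valid_before = chain_is_valid(ledger)       # intact chain verifies
ledger[0]["data"] = "alice pays bob 500"    # rewrite history...
valid_after = chain_is_valid(ledger)        # ...and the chain breaks
```

In a real deployment the ledger is replicated across many peers, so an attacker would have to rewrite the chain on a majority of nodes simultaneously, which is what removes the single point of failure.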

Robotics: Robots are no longer Jetsons-like fantasies; they exist. They are physical machines used to automate tasks, usually directed by computer programs. They have applications in manufacturing and construction, and in the exploration of terrain, oceans, and space. They are now also the subject of policy questions and moral dilemmas: Will they take jobs away from workers? And in a future incarnation, combined with artificial intelligence, will robots pose a threat to mankind?

Materials Science: Exciting research in materials science is creating stronger, more durable, lighter, and even "self-healing" materials. Nanomaterials, synthetic composites artificially engineered at the molecular scale, are now being designed at the inter-atomic level. The capability to design and manufacture infrastructure such as bridges, roads, and buildings with stronger, adaptable, intelligent, and seemingly eternal materials will revolutionize the construction and transportation industries.

UAS and drones can also be included in this category. The rapid proliferation of UAS in both military and civilian markets requires solutions for identification, interdiction, and monitoring. UAS and drones also have many potential applications in transportation, commerce, emergency response, and security.

Virtualization and Augmented Reality: The world is going virtual, supported by a myriad of new and exciting technologies including artificial intelligence, augmented reality, and exponential connectivity to both people and objects. Augmented reality intertwines the physical and digital worlds via computer-generated sensory input such as sound, video, graphics, and sometimes even smell. Google Glass and Oculus Rift are already good examples of these emerging technologies. Virtual communications combined with virtual reality will become integrated into business applications; they can also serve as entertainment outlets, as gaming and attractions such as "Soarin'" at Disney World already demonstrate. The analytical firm Digi-Capital forecasts augmented/virtual reality revenue to hit $120 billion by 2020.

Wearables: These include flexible electronics: wristbands, rings, glasses, ear pods, contact lenses, and other attachable, wearable, and embedded devices. There may be upwards of 604 million users of wearable biometrics by 2019, according to Goode Intelligence. The trend of wearables is an emerging one with seemingly limitless possibilities. The question is no longer when wearable tech will be available, but how fast these technologies will extend human/computer interface capabilities and how ingrained in our daily lives they will ultimately become.

These are just some of the emerging tech areas and verticals already transforming our world. Many more were left out, as new discoveries in materials science, chemistry, physics, and nanotechnology offer endless possibilities. Wait for 2019!