This article assumes a certain familiarity with Digital Twin concepts; if you don't have it yet, start with this shorter article.

The Digital Twin & Machines Journey to the Digital World

I started to focus on Digital Twins and the potential for Systems of Asset about 18 months ago, and at that time the topic felt like a rare, well-kept secret. Nowadays, Google sends me a daily alert on the "digital twin" keyword, with 4 hits on average. I used to curate all these mentions on a Pinterest board, but the volume has become too much to handle and I am starting to give up.

Much has been said on the Internet about the idea of modeling physical things - read machines - and all the potential outcomes that businesses can derive. Fundamentally, it is a fairly simple idea. Machines have started their journey to the Digital World with control systems: computers that were designed to operate the system within required specifications and also guarantee safety. In the Industrial space of complex machinery, safety is paramount. The next step has been to use some of the data from these control systems to do something else.

An interesting example can be found in aviation: the flight data recorder. It captures a select number of parameters and records them in order to support audit and forensics in case of a catastrophic event. The various control systems of an airplane's subsystems were not designed for that; the capability was added afterward. Interestingly enough, in the 90s British Airways started to do what most industries are doing nowadays under the broad term "Industrial Internet": they looked at the flight recorder data and used some of the derived insights to better maintain their airplanes. As instrumentation has grown, with more sensors and better actuators, we can now fully digitalize machines.

So, the Digital Twin idea can be simply explained: as physical systems are now designed to be instrumented, you can collect data all along their lifecycle: from invention, to design, to manufacturing, to operation and maintenance, through decommissioning. Once all this data is persisted and understood through proper metadata management, you can start to develop intelligence using analytics and machine learning. The accumulated data gives you the picture of past and present conditions and performance. The intelligence gives you early warnings and predictions; in short, the future. Learn from the past and present, predict the future. The Digital Twin is the proxy of a physical system in the digital space: it can tell you everything it knows about that system and offer predictions. Over time, the Digital Twin will also be used to control the physical system it is paired to. Every physical system will have its unique Digital Twin. Note that a complex system might not have a single Digital Twin of everything, but rather a composite of the Digital Twins of its different parts; that is beyond the scope of this discussion.
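The idea above can be sketched in a few lines of code. The following minimal Python sketch is purely illustrative and not any vendor's API: the asset name, the metric, and the linear-extrapolation "prediction" are my own assumptions, chosen only to show a twin accumulating the history of one physical asset, reporting its present condition, and offering a naive forecast.

```python
from dataclasses import dataclass


@dataclass
class Reading:
    """One timestamped measurement coming from the physical asset."""
    timestamp: float
    metric: str
    value: float


class DigitalTwin:
    """Hypothetical digital proxy of a single physical asset."""

    def __init__(self, asset_id):
        self.asset_id = asset_id
        self.history = []  # the accumulated past: every reading ever ingested

    def ingest(self, timestamp, metric, value):
        """Record one reading from anywhere in the asset's lifecycle."""
        self.history.append(Reading(timestamp, metric, value))

    def current(self, metric):
        """The present: the most recent value of a metric, if any."""
        readings = [r for r in self.history if r.metric == metric]
        return readings[-1].value if readings else None

    def predict(self, metric, at_time):
        """The future: a deliberately naive linear extrapolation
        from the last two readings of the metric."""
        readings = [r for r in self.history if r.metric == metric]
        if len(readings) < 2:
            return self.current(metric)
        a, b = readings[-2], readings[-1]
        slope = (b.value - a.value) / (b.timestamp - a.timestamp)
        return b.value + slope * (at_time - b.timestamp)


# Illustrative usage for an imaginary turbine:
twin = DigitalTwin("turbine-042")
twin.ingest(0.0, "temperature", 100.0)
twin.ingest(10.0, "temperature", 110.0)
```

A real twin would of course replace the extrapolation with proper analytics or machine learning models; the point is only the shape of the thing: one proxy object per asset, holding past, present, and a view of the future.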

This technology serves important use cases in the industrial world: monitoring and diagnostics, maintenance optimization, individual and collective operations optimization, closing the loop from equipment performance back to equipment design, etc.

Now that we have set the context of the idea of the Digital Twin and its potential business value, I would like to reflect on the technological aspects and the possibilities.

The RDBMS Analogy

To better understand the technology vision and how it should materialize to open up lots of possibilities, there is a very interesting analogy or parallel that we can draw with a technology that dominated the 80s: Relational Database Management Systems, aka RDBMS.

RDBMS technology also has three pillars: Database Structures (metadata), Records (data) and Database Engine (that became a platform in the long term).

Today's approach to Digital Twins is still not as mature as RDBMS technology. There is a language to create relational database structures, called DDL: the Data Definition Language. Once you have the structure, records are created and manipulated through standard statements: in RDBMS, these are INSERT/UPDATE/DELETE/SELECT. Database engines optimize the indexing and storage of the records, as well as the performance of these operations. None of these artifacts have been standardized for Digital Twins yet, and I personally think such standardization is going to be a critical step toward mass adoption of the concept.
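As a concrete reminder of what those RDBMS artifacts look like (and what Digital Twins still lack a standard equivalent of), here is a minimal, self-contained sketch using Python's built-in sqlite3 module as the engine; the `asset` table and its columns are invented for illustration only.

```python
import sqlite3

# Pillar 3: the database engine (here, an in-memory SQLite database).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Pillar 1: the structure (metadata), declared in DDL.
cur.execute("""
    CREATE TABLE asset (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        model TEXT
    )
""")

# Pillar 2: the records (data), manipulated through INSERT/UPDATE/DELETE/SELECT.
cur.execute("INSERT INTO asset (name, model) VALUES (?, ?)",
            ("Turbine 42", "GT-9X"))
conn.commit()

rows = cur.execute("SELECT name, model FROM asset").fetchall()
```

Every RDBMS, whatever the vendor, understands these three layers in roughly the same way; that shared contract is exactly what is still missing for Digital Twins.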

If you drill down into the analogy, there is an even more interesting insight: it has to do with Master Data and a single source of truth.

Master Data and Single Source of Truth

When we started to deploy RDBMS, an explosion of Apps happened. Almost every single Enterprise started to build database structures and began loading records about its customers, products, transactions, etc. Long story short, this led to a Master Data Management nightmare.

Think about two examples. First, the definition of a customer would be deeply integrated into each App: the Ordering system would have its own, Invoicing would have a second one, and Customer Service yet another. The second example comes from the Banking world. Today, most Banking Systems are still based on the initial assumption that you are not a customer, but an account number. Bank portals do a good job of hiding that in most of their operations, but not all. If you call your bank to inquire about your checking account, you will first be authenticated. At the end of the discussion, the bank representative will likely ask if you need something else. If you answer "yes, I would also like to discuss my investment account", the clerk will be very happy to transfer you, and guess what the investment-account representative will do: authenticate you again.

Let's also consider Salesforce. Salesforce is well known for its success as a SaaS vendor, riding the Cloud wave like no one else (except maybe Amazon AWS and nowadays Microsoft). But, much more important in my judgment, Salesforce had the vision of a common, extensible object model for the definition of a customer. It started with CRM, extended to Customer Servicing and Marketing, and opened it up as a platform play. The rest is history.

I can clearly see the same path here for Digital Twin technology and the Industrial IoT.

Today, a lot of Industrial companies are focused on delivering outcomes, and build or buy apps that serve those outcomes. Most of these apps implement a portion of the Digital Twin vision or approach. Yet their Digital Twins are imprisoned in the apps. Integration will play a key part in ensuring that these apps speak to each other and function properly within an IoT framework.

It took decades for Master Data Management to become a hot issue. We had that time because we started by building single-purpose apps; there was no internet and no electronic ecosystems immediately forcing us to think in terms of a single source of truth, reuse, and sharing. The infrastructure is here now, and digital business platforms are a requirement for pretty much any business to stay competitive.

As we digitalize the physical world, I predict a lot of chaos in the short run because of a lack of "single source of digital truth" for the physical world that we want to integrate into our digital economies.

The Digital Twin approach will be the solution, but we need vision, platforms, integration and discipline to succeed. RDBMS revolutionized the way we handle transactional data and created the "Systems of Record" category. Lots of business value was created in the race to dominate that category. Digital Twin technology offers the same potential for "Systems of Asset" to emerge as a category.