Author: Jerry Overton and JC Brigham

A key challenge for today’s manufacturing industry is not the lack of new ideas and products, but rather the ability to design and build new products efficiently. With the rise of digital technology, manufacturers can now tap into data-driven solutions that make use of computing — the cheapest and most abundant resource available to the industry today.

At the same time, IT is becoming an integral part of many products, a trend enabled by inexpensive sensors, processors and storage, purpose-built software and purpose-built clouds that allow data storage and ubiquitous connectivity.

This environment sets the stage for a digital-twin approach to product development. The process uses stochastic simulation to generate what-if scenarios that can help manufacturers avoid costly product quality issues, while speeding time to market and increasing throughput.

The scientific method goes digital

The digital-twin approach may sound exotic, but it is really just a modern twist on a very old idea: the scientific method. In the same way scientists test through experimentation, manufacturers can build stochastic simulations, generate experiments and use the findings to minimize risk and innovate in the process.

Tesla provides an excellent example of the concept. The electric car manufacturer has a digital twin for every car it builds, tied to the car’s vehicle identification number (VIN). Data is constantly transmitted between the car and the factory.

For instance, if a driver has a rattle in a door, it can be fixed by downloading software that tweaks the hydraulics of that particular door. Tesla regularly pushes out software updates to customers’ cars based on the data received from them.

How to create a digital twin

Creating a digital twin starts with establishing new pipelines of manufacturing data. We can automate the collection of data from sources such as materials and design. When that data is integrated with historical operations performance data, we have what we need to support a digital twin.

The next step is to take the manufacturing process and model it using rules. But instead of using retrospective models, we use prescriptive models.

Retrospective models, like those commonly used in predictive modeling, try to calculate the future based on past trends. Models such as these have been successful in some areas of manufacturing prediction, but they focus on optimization rather than on breakthrough innovation.

With the digital-twin approach, we build stochastic simulations, or prescriptive models. We do this by creating rules that map design to performance and by adding randomness to simulate risk.

The prescriptive data from the simulations shows how new products will work. By analyzing it, we can detect design flaws early. We can predict and minimize cost. And we can use this mountain of intelligence to build improved products in the future.

Because randomness is inherent in the model, we can simulate the kind of uncertainty encountered in the real world. And since computing power is cheap, we can afford to run millions of scenarios, anticipating an entire spectrum of possible outcomes, rather than just a single expected result. In fact, we can learn as much from the digital twin as we can from the real-world original.
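As an illustration (not DXC’s actual model), a stochastic simulation can be sketched in a few lines of Python: a hypothetical rule maps engine displacement and drivetrain to city mpg, Gaussian noise stands in for real-world risk, and 100,000 runs yield a spectrum of outcomes rather than a single estimate.

```python
import random
import statistics

# Hypothetical rule mapping design options to predicted city mpg.
# The coefficients are illustrative assumptions, not real data.
def simulate_mpg(engine_displacement_l: float, is_hybrid: bool,
                 rng: random.Random) -> float:
    base = 30.0 - 3.0 * engine_displacement_l  # bigger engine, lower mpg
    if is_hybrid:
        base += 12.0                           # hybrid drivetrain bonus
    return base + rng.gauss(0.0, 2.0)          # randomness simulates risk

rng = random.Random(42)
runs = sorted(simulate_mpg(1.8, True, rng) for _ in range(100_000))

mean = statistics.fmean(runs)
low, high = runs[5_000], runs[95_000]          # 5th and 95th percentiles
print(f"expected mpg: {mean:.1f} "
      f"(90% of runs between {low:.1f} and {high:.1f})")
```

The point of the exercise is the interval, not the point estimate: the spread of outcomes tells us how much manufacturing risk a given design carries.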

The Internet of Things (IoT) adds another layer of insight. We can augment the manufacturing process with sensors and automatically generate data about operations, performance and maintenance. By using industrial machine learning — a scalable solution for ingesting data, building algorithms and deploying them into production — we can turn the streaming variant of the digital twin into a continuous source of manufacturing insight.

A real-world example

As a demonstration, DXC built a digital twin for a hybrid-car manufacturing process using a limited number of variables. The goal was to help companies predict how a car would perform before committing to expensive changes in the manufacturing process.

We looked at hybrid cars, which can be built with different options for transmission, vehicle class, engine displacement and fuel type. These factors affect the car’s performance in such metrics as miles per gallon, and they change the retail price.

We took data that related manufacturing options to car performance in the real world and cross-validated the performance of our simulations using a different set of test data.
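Cross-validation can be as simple as holding back part of the real-world data and scoring the twin’s predictions against it. The records and prediction rule below are illustrative placeholders, not the data we actually used:

```python
import random

# Illustrative stand-in for real records pairing manufacturing options
# with observed performance: (engine_displacement_l, is_hybrid, actual_mpg).
records = [(2.0, True, 38.1), (1.6, True, 41.0), (2.5, False, 27.2),
           (3.0, False, 23.5), (1.8, True, 39.4), (2.2, False, 28.8),
           (1.5, True, 42.3), (2.8, False, 24.9)]

def predict_mpg(displacement: float, is_hybrid: bool) -> float:
    # The same hypothetical rule the twin simulates, without the noise term.
    return 30.0 - 3.0 * displacement + (12.0 if is_hybrid else 0.0)

# Hold out part of the data so the model is scored on records it never saw.
rng = random.Random(0)
rng.shuffle(records)
test = records[:3]

errors = [abs(predict_mpg(d, h) - mpg) for d, h, mpg in test]
mae = sum(errors) / len(errors)
print(f"mean absolute error on held-out data: {mae:.1f} mpg")
```

If the error on held-out data is small, we can trust the twin’s what-if scenarios; if it is large, the rules need recalibration before any manufacturing decision rests on them.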

DXC has partnered with Microsoft to build machine-learning solutions on an industrial scale, and we used the Microsoft Cortana Intelligence Suite to run the digital twin, continuously simulating new ways of creating hybrid cars.

In one scenario, we supposed a manufacturer wanted a line of cars that would appeal to young professionals living in large cities. These customers often care about fuel efficiency and want an optimal city mpg rating. Although they can afford the additional expense of a hybrid car, it’s important to keep the price lower for this market segment.

With the cognitive computing native to Microsoft Cortana, we built a digital twin, deployed it into production and asked natural-language questions such as:

“What are the simulation runs with the best predicted city mpg and predicted five-year savings?”

The result was a design dashboard showing simulations that best target the market for young urban professionals. With this method, we can predict the best fuel type, engine displacement, transmission and vehicle class options to optimize both long-term affordability and city miles per gallon.
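We cannot reproduce Cortana’s natural-language layer here, but underneath, a question like the one above reduces to ranking simulation runs on two predicted metrics. A minimal sketch, with made-up run data and a simple min-max normalized score:

```python
# Each simulation run records the options tried and the predicted outcomes.
# Values are illustrative placeholders, not DXC's actual results.
runs = [
    {"fuel": "regular", "displacement": 1.5, "transmission": "CVT",
     "city_mpg": 44.0, "five_year_savings": 4200.0},
    {"fuel": "premium", "displacement": 2.0, "transmission": "automatic",
     "city_mpg": 38.0, "five_year_savings": 2900.0},
    {"fuel": "regular", "displacement": 1.8, "transmission": "CVT",
     "city_mpg": 41.0, "five_year_savings": 3800.0},
]

# "Best predicted city mpg and predicted five-year savings": score runs on
# both metrics together, normalizing each so neither dominates.
def normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

mpg_scores = normalize([r["city_mpg"] for r in runs])
savings_scores = normalize([r["five_year_savings"] for r in runs])
ranked = sorted(zip(runs, mpg_scores, savings_scores),
                key=lambda t: t[1] + t[2], reverse=True)

best = ranked[0][0]
print(f"best run: {best['fuel']}, {best['displacement']} L, "
      f"{best['transmission']}")
```

The ranked list is, in essence, the design dashboard: each row is a buildable configuration with its predicted trade-off between fuel economy and long-term cost.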

Digital twin and IoT

The digital twin sits on a continuum with the IoT. If we agree that the foundation of IoT consists of connectivity, sensors and analytics, then predictive maintenance is an established IoT application.

Predictive maintenance is case-based reasoning enabled by data. The digital-twin approach handles this by incorporating product data, including maintenance history, from design to operation and beyond.
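A minimal sketch of that idea: retrieve the most similar past case and reuse its recorded outcome. The sensor readings and maintenance actions below are hypothetical.

```python
import math

# Hypothetical case base: past sensor readings (vibration g, temperature C)
# paired with the maintenance action that turned out to be needed.
cases = [
    ((0.2, 60.0), "no action"),
    ((0.9, 85.0), "replace bearing"),
    ((0.5, 72.0), "lubricate"),
]

def nearest_case(reading, cases):
    # Case-based reasoning in its simplest form: find the past case
    # closest to the current reading by Euclidean distance.
    # (In practice, features should be normalized to comparable scales.)
    return min(cases, key=lambda c: math.dist(c[0], reading))

_, action = nearest_case((0.85, 83.0), cases)
print(f"suggested maintenance: {action}")
```

A production system would retrieve from thousands of cases fed by the digital twin’s maintenance history, but the retrieve-and-reuse loop is the same.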

GE is piloting a “digital wind farm” concept used to inform the configuration of individual wind turbines prior to procurement and construction. Once the farm is built, each virtual turbine is fed data from its physical equivalent, and software adjusts turbine-specific parameters, such as torque of the generator and speed of the blades, to optimize power production at the plant level. The hope is to generate 20 percent gains in efficiency.

Dassault Systèmes has built an aerospace- and defense-specific manufacturing operations management product called “Build to Operate.” The solution can monitor, control and validate all aspects of manufacturing operations, ranging from replicable processes and production sequences to the flow of deliverables throughout their supply chain — and on a global scale. Airbus Helicopters has deployed this system for current and future helicopter manufacturing.

Although the application of digital-twin technology is still in its early stages, the possibilities for the manufacturing industry are tremendous. The ability to design, produce and repair products with the guiding intelligence of data-driven simulations will be a game-changer that leads to greater efficiency and bigger innovations in the field.

Jerry Overton is a data scientist and Distinguished Engineer at DXC, a global leader in next-generation IT solutions. Jerry is head of advanced analytics research and a principal creator of DXC’s Industrial Machine Learning (IML) offering. In his blog, Doing Data Science, Jerry shares his experiences leading open research in data science.

Joan-Carol (JC) Brigham has been an analyst in DXC’s ResearchNetwork for eight years. She has led strategy work and managed much of the launch of industry research in the ResearchNetwork. Currently, she is a principal and business manager analyzing the manufacturing industry.