Building a Digital Twin

Technology visionary Monica Schnitger reviews how digitalization, in particular the concept of the Digital Twin, is transforming project execution. In this, the first of two blogs on the theme, she shares insights on how companies are rising to the challenge.

Since the oil price crash of 2014 and the sustained price pressures of the "lower for longer" market, upstream oil and gas, marine, power and related heavy industries have been under unprecedented pressure to operate more efficiently. With time and budget at a premium, digitalization can appear to be a silver bullet. The vision is well established in the form of the Digital Twin: the opportunity to replicate each physical asset, whether a cargo vessel or a refinery, as integrated data and information. This "twin" is a faithful version of reality. As such, it also monitors performance in real time, using sensors to measure variables and transmit information to control rooms.

While a Digital Twin can display current performance, keeping the engineering data up to date is an ongoing task, and implementation can be a challenge for complex plant operators, shipyards and engineering, procurement and construction companies (EPCs). If you're a 100-year-old engineering firm serving a conservative customer base, a shipyard whose workers' fingers are too scarred from years of metalworking to use a tablet, or an oil major trying to eke out a bit more production in a financially unforgiving environment, digitalizing your operations is rarely straightforward. But you need to move forward, and the benefits in time and cost savings and more efficient operating procedures are clear. So how do you start? Where is it best to begin? What pitfalls can others help you avoid? What benefits can be gained by creating a Digital Twin? After all, change is painful, so is it worth it?

The leaders grappling with these questions are a diverse group: EPCs; asset owners in oil and gas, chemical, power, pharma, shipbuilding and other industries; consultants; and technologists.

First, it’s important to recognize that every enterprise is applying digitalization in its own way, commensurate with its level of technological readiness. Some are still 2D shops, so for them it means moving from 2D to 3D. Others are implementing technology that is new to them, such as laser scanning for analysis (against as-designed data, over the life of an asset, or to prove milestone completion) after first using it to augment as-designed computer-aided design (CAD) models. Still others are investigating augmented and virtual reality platforms as a way of leveraging their existing 3D models and underlying or auxiliary data. The point: each starts where it is today, looks at the technology available, determines the benefit it believes it can gain and jumps in, piloting something new and, if successful, rolling it out to other projects, teams or locations.

In my experience, many larger enterprises use homegrown solutions to manage some parts of their engineering data and processes. But this is proving less and less sustainable as technology change accelerates, and newer concepts such as augmented and virtual reality are far easier to plug into commercial, off-the-shelf (COTS) platforms. The consensus seems to be that commercial tools and platforms are capable enough, that they shift the burden of maintenance and extensibility to companies that specialize in software development (after all, power companies make power, not software), and that they ensure cross-system compatibility via standards.

Security is another important factor. The industry isn't yet wholly comfortable with the Cloud as a compute resource or as a place to store data, even though it is accustomed to the Cloud as a collaboration platform. But consensus is growing that putting security in the hands of a Cloud provider isn't such a bad idea when the alternative is the possibility of a bad actor with a plug-in device bringing down an entire operating facility.

Another challenge the industry is still grappling with is who should bear the cost, and who captures the value, of creating a Digital Twin. The current industry norm presumes that the 3D model and its attributes are created for clash detection and construction planning, activities that take place before handover and so are part of the EPC's routine work. Maintaining the Digital Twin after handover costs money, too; should that sit with the operator? If the benefit is seen in maintenance, repair and operations, or in health and safety, does maintaining the Digital Twin become their cost center? This mirrors a challenge in the manufacturing sector, where companies at various points of the supply chain ask: why should I do more work that makes me look less efficient so that someone downstream can use this data? These are gnarly issues with no easy answers.

Of course, we need to consider the human scale. A company can’t select a technology solution that its employees cannot use; people need to buy in to changes in work processes; and no change is resistance-free. This is nothing earth-shaking or new, but it’s important to remember in such a techie context.

Monica is a leading technology analyst, based in the US. She has developed industry forecasts, market models and market statistics for the CAD/CAM, CAE, PLM, GIS, infrastructure, AEC and plant design software markets since 1999. She writes and speaks on these topics for technology buyers, investors and developers, drawing on 30 years of experience in engineering, CAD/CAM and market analysis. She holds a B.S. in Naval Architecture and Marine Engineering from MIT and an honors MBA from the F.W. Olin School of Management at Babson College.