Asset Management using the Digital Twin concept

 

David Wagg, Professor of Nonlinear Dynamics at The University of Sheffield, discusses the application of the digital twin concept in asset management.

The digital twin concept has been widely promoted as having major benefits for industrial applications. At the top of the list of these potential benefits is the development of improved asset management techniques. However, for many engineers, it’s not clear how this might be realised in practice. Why is this?

Well, at its heart, the digital twin concept is about creating systems that are highly bespoke - so much so that you could tell the difference between two individuals in a nominally identical population of structures or assets. As a result, everything to do with digital twins is highly context specific, and definitions are either very generic or widely divergent. So, context is key to understanding how a digital twin might be useful. This means context both in terms of desired functionality and in terms of the specific properties of the structure or asset - which we call the physical twin.

Leaving aside the specific properties of the structure or asset for now, let’s consider functionality. In terms of asset management, the main aim of creating a digital twin is to assess the current status and future behaviour of the physical twin, and therefore inform key decisions. Of course, there are already long-established methods in areas such as process control and structural health monitoring for tackling these issues, and it’s helpful to categorise the functionalities into broad levels of sophistication.

At the most basic level, the first level of functionality is supervision of the physical twin: monitoring and recording its current status. Next is an operational function, whereby command and control inputs can be given in order to change its behaviour. Of course, many industrial plant and asset management systems already have highly developed supervision and operational capabilities, and so we define these systems as pre-digital twins.

It’s important to note that each level of sophistication incorporates the previous level of functionality as well. Next in the hierarchy is the simulation digital twin. As well as supervising and operating the physical twin, this type of digital twin can carry out simulations of the behaviour of the physical twin. Again, this would be context specific, but a typical example would be to use the simulations to help manage activities such as maintenance or plant-life extension. Beyond this is the intelligent digital twin, which adds the ability to learn from data, via techniques such as machine learning, together with increased levels of decision support and scenario planning. The final level of sophistication is the autonomous digital twin, which includes all previous capabilities in addition to the ability to manage the asset concerned with minimal human intervention.
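As a rough illustration, this hierarchy can be sketched as an ordered enumeration in which a higher level implies all the capabilities below it. The names and the capability check here are illustrative only, not an established scheme:

```python
from enum import IntEnum

class TwinLevel(IntEnum):
    """Ordered levels of digital twin sophistication; each level
    is assumed to include all capabilities of the levels below it."""
    PRE_DIGITAL_SUPERVISION = 1   # monitor and record current status
    PRE_DIGITAL_OPERATIONAL = 2   # issue command and control inputs
    SIMULATION = 3                # simulate physical twin behaviour
    INTELLIGENT = 4               # learn from data, decision support
    AUTONOMOUS = 5                # manage asset with minimal human input

def has_capability(twin: TwinLevel, required: TwinLevel) -> bool:
    # Because the levels are cumulative, an ordered comparison suffices.
    return twin >= required
```

So, for instance, an intelligent digital twin can also run simulations, while a pre-digital system cannot.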

As mentioned before, it’s very important to emphasise context here. For some applications, it may be possible to realise an autonomous digital twin, whereas for others this would be purely aspirational. Generally, the more complex the physical twin is, the harder it becomes to move up the levels of sophistication.

Generic definitions of the digital twin also give the impression that somehow the final outcome will be something “all-inclusive”. So, in order to move to something specific within the context of interest, the next step is to define the required elements and the required processes in the digital twin. Required elements could include things such as: finite element model(s); data sets; a control system; computer-aided design. Required processes could include things such as: data collection; physics-based modelling; uncertainty quantification; model updating; data-augmented modelling; output visualisation. Again, all depending on your chosen asset management context.

One required element that is essential to all digital twins is the workflow. In fact, it could be argued that this is the digital twin, as all the other elements and processes typically already exist. The workflow is the overall coordination algorithm that connects all the required processes and elements to the user - ideally via a user interface that can show visual representations of the physical twin and/or its key data.
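As a minimal sketch of this coordination idea, a workflow can be modelled as an ordered list of processes that pass a shared state from one step to the next. Everything here - the class, the process names, the invented readings - is hypothetical, a sketch rather than a prescribed implementation:

```python
from typing import Any, Callable, Dict, List, Tuple

# A process takes the shared state and returns an updated state, so the
# workflow can hand intermediate results from one step to the next.
Process = Callable[[Dict[str, Any]], Dict[str, Any]]

class Workflow:
    """Minimal coordination layer: runs registered processes in order
    and logs which steps ran (e.g. for display in a user interface)."""

    def __init__(self) -> None:
        self.steps: List[Tuple[str, Process]] = []
        self.log: List[str] = []

    def register(self, name: str, process: Process) -> None:
        self.steps.append((name, process))

    def run(self, state: Dict[str, Any]) -> Dict[str, Any]:
        for name, process in self.steps:
            state = process(state)
            self.log.append(name)
        return state

# Two hypothetical processes wired into the workflow:
def collect_data(state):
    state["measurements"] = [5.1, 4.9, 5.0]   # invented sensor readings
    return state

def update_model(state):
    data = state["measurements"]
    state["model_estimate"] = sum(data) / len(data)
    return state

wf = Workflow()
wf.register("data collection", collect_data)
wf.register("model updating", update_model)
result = wf.run({})
```

The point is not the code itself but the shape: the individual processes stay independent, and the workflow is the one place where their ordering and interconnection is defined.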

Let us consider a simple asset management example. We have a structure with embedded sensors from which we can make hourly measurements. We also have a series of finite element and multi-body models of the key structural components, in addition to a lower-fidelity finite element representation of the overall structure. Despite having multiple models, there is still significant uncertainty in using them to make predictions about the structure, and so the intention is to create a digital twin that can reduce this uncertainty by augmenting the models with information from measured data.

So, in this example, the required elements would be each of the models plus the data sets, and the required processes could be: data collection; signal processing; uncertainty quantification; model updating; data-augmented modelling. Each of these processes is developed as an independent method that can be incorporated into the overall workflow. This coordination by a bespoke workflow may already be enough to gain benefit, but further benefits may come from designing the required processes to interact within the workflow.
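To make the example concrete, here is one hedged sketch of how such a pipeline might fit together for a single scalar model output. All numbers and function names are invented, and the root-mean-square residual is a deliberately crude stand-in for proper uncertainty quantification:

```python
# Toy version of the processes listed above. A biased model prediction
# of one quantity (say, a natural frequency in Hz) is updated using
# hourly measurements, and the RMS residual is tracked as a crude
# uncertainty measure.

def collect_data():
    # Stand-in for hourly sensor readings; values are invented.
    return [4.9, 5.1, 5.0, 4.8, 5.2]

def signal_processing(raw):
    # Placeholder cleaning step: drop physically implausible readings.
    return [x for x in raw if 0.0 < x < 100.0]

def rms_residual(data, prediction):
    # Crude uncertainty measure: RMS mismatch between model and data.
    return (sum((x - prediction) ** 2 for x in data) / len(data)) ** 0.5

def model_updating(prediction, data, gain=0.5):
    # Data-augmented step: nudge the model toward the measured mean.
    target = sum(data) / len(data)
    return prediction + gain * (target - prediction)

prediction = 5.5    # prior model output, deliberately biased high
data = signal_processing(collect_data())
before = rms_residual(data, prediction)
prediction = model_updating(prediction, data)
after = rms_residual(data, prediction)
# after < before: the updated model sits closer to the measurements
```

In a real digital twin each of these steps would be a substantial method in its own right; the sketch only shows how the workflow threads them together so that measured data reduces model-based uncertainty.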

For example: processes can be designed to operate over different time scales - hourly, daily, weekly - and then cross-referenced; different numerical models can be used for computational verification; meta-models can be created and used to run very fast simulations; model-updating and data-augmented modelling methods can be compared and used to reduce model-based uncertainties. The outcome should be reduced levels of uncertainty, and a simulation digital twin that evolves with time and supports the management decisions required for the life of the structure.
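One way to picture the different time scales is a single workflow loop that triggers processes at different cadences. The hour counter below is simulated rather than tied to real time, and which process runs at which cadence is purely illustrative:

```python
# Counts how often each (hypothetical) process would run over one
# simulated week of hourly workflow ticks.
runs = {"hourly": 0, "daily": 0, "weekly": 0}

def tick(hour: int) -> None:
    runs["hourly"] += 1          # e.g. data collection every hour
    if hour % 24 == 0:
        runs["daily"] += 1       # e.g. model updating once a day
    if hour % 168 == 0:
        runs["weekly"] += 1      # e.g. uncertainty review once a week

for hour in range(1, 169):       # one week = 168 hourly ticks
    tick(hour)
```

The fast loop keeps the data fresh, while the slower loops give the heavier processes time to run and provide results that can be cross-referenced against the faster ones.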

Each of the required processes mentioned above already has a large technical literature associated with it, but only a few are used in a wider context such as the digital twin. Building these processes into a context-specific workflow is really the key underlying process. It should be mentioned that it is possible to build a bad workflow, and properties such as soundness should be considered carefully in order to have a robust digital twin.

There is a natural link between workflows and models of business processes, which should enable users to embed business-specific functionality into the digital twin as required. So, for effective use of this idea, focus on the context so that you can build something bespoke to your requirements. This should enable the benefits of the digital twin to be applied to your asset management processes.

 
Daniel Camara