Digital twins are becoming ubiquitous in many areas of engineering [see ‘Can you trust your digital twin?’ on November 23rd, 2016]. At the same time, however, the terminology is becoming blurred as digital shadows and digital models are treated as if they were synonymous with digital twins. A digital model is a digitised replica of a physical entity that lacks any automatic data exchange between the entity and its replica. A digital shadow is the digital representation of a physical object with a one-way flow of information from the object to its representation. A digital twin, by contrast, is a functional representation with a live feedback loop to its counterpart in the real world. The feedback loop consists of continuous updates to the digital twin about the condition and performance of the physical entity, based on data from sensors, and analysis from the digital twin about the performance of the physical entity. This enables a digital twin to provide a service to many stakeholders. For example, the users of a digital twin of an aircraft engine could include the manufacturer, the operator, the maintenance providers and the insurers. These capabilities imply that digital twins are themselves becoming products, which exist in a digital context that might connect many digital products, thus forming an integrated digital environment. I wrote about integrated digital environments when they were a concept and the primary challenges were technical in nature [see ‘Enabling or disruptive technology for nuclear engineering?’ on January 28th, 2015]. Many of these technical challenges have been resolved, and the next set of challenges are economic and commercial ones associated with launching digital twins into global markets that lack adequate understanding, legislation, security, regulation or governance for digital products.
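The distinction between these three levels of digital representation comes down to the direction of the data flow. As a minimal sketch (the class and method names here are hypothetical, invented for illustration rather than taken from any digital-twin standard), the difference can be expressed as:

```python
# Hypothetical sketch: the three levels of digital representation
# distinguished above, expressed as data flows. Names are illustrative only.

class DigitalModel:
    """Digitised replica: no automatic data exchange with the physical entity."""
    def __init__(self, state):
        self.state = dict(state)  # updated only by hand

class DigitalShadow(DigitalModel):
    """One-way flow: sensor data update the representation automatically."""
    def ingest(self, sensor_data):
        self.state.update(sensor_data)

class DigitalTwin(DigitalShadow):
    """Two-way loop: the twin also feeds analysis back about the entity."""
    def feedback(self):
        # Toy analysis rule standing in for real condition monitoring
        return {"reduce_load": self.state.get("temperature", 0) > 100}

twin = DigitalTwin({"temperature": 80})
twin.ingest({"temperature": 120})  # sensor update (shadow behaviour)
action = twin.feedback()           # analysis returned to the asset (twin behaviour)
```

A digital model is updated only manually, a digital shadow ingests sensor data automatically, and only the digital twin closes the loop by returning analysis about the physical entity.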
In collaboration with my colleagues at the Virtual Engineering Centre, we have recently published a white paper, entitled ‘Transforming digital twins into digital products that thrive in the real world’, which reviews these issues and identifies the need to establish digital contexts that embrace the social, economic and technical requirements for the appropriate use of digital twins [see ‘Digital twins could put at risk what it means to be human’ on November 18th, 2020].
While pandemic lockdowns and travel bans are having a severe impact on spontaneity and creativity in research [see ‘Lacking creativity’ on October 28th, 2020], they have induced a high level of ingenuity in achieving the final objective of the DIMES project, which is to conduct prototype demonstrations and evaluation tests of the DIMES integrated measurement system. We have gone beyond the project brief by developing a remote installation system that allows local engineers at a test site to successfully set up and run our measurement system. This has saved thousands of air miles and several tonnes of CO2 emissions, as well as hours waiting in airport terminals and sitting in planes. These savings were made by members of our project team working remotely from their bases in Chesterfield, Liverpool, Ulm and Zurich, instead of flying to the test site in Toulouse to perform the installation in a section of a fuselage and then visiting a second time to conduct the evaluation tests. For this first remote installation, we were fortunate to have our collaborator from Airbus available to support us [see ‘Most valued player performs remote installation’ on December 2nd, 2020]. We are about to stretch our capabilities further by conducting a remote installation and evaluation test during a full-scale aircraft test at the Aerospace Research Centre of the National Research Council Canada in Ottawa, with a team who had never seen the DIMES system and knew nothing about it until about a month ago. I could claim that this remote installation and test will save another couple of tonnes of CO2; but, in practice, we would probably not be performing a demonstration in Canada if we had not developed the remote installation capability.
The DIMES project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 820951. The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.
One of the exciting aspects of leading a university research group is that you can never be quite sure where the research is going next. We published a nice example of this unpredictability last week in Royal Society Open Science in a paper called ‘Transformation of measurement uncertainties into low-dimensional feature vector space’ [1]. While the title is an accurate description of the contents, it does not give much away and certainly does not reveal that we proposed a new method for assessing the occurrence of El Niño events. For some time we have been working with massive datasets of measurements from arrays of sensors and representing them by fitting polynomials in a process known as image decomposition [see ‘Recognising strain’ on October 28th, 2015]. The relatively small number of coefficients from these polynomials can be collated into a feature vector, which facilitates comparison with other datasets [see, for example, ‘Out of the valley of death into a hype cycle’ on February 24th, 2021]. Our recent paper provides a solution to the issue of representing the measurement uncertainty in the same space as the feature vector, which is roughly what we set out to do. We demonstrated our new method for representing the measurement uncertainty by calibrating and validating a computational model of a simple beam in bending, using data from an earlier study in an EU-funded project called VANESSA [2], so no surprises there. However, then my co-author and PhD student, Antonis Alexiadis, went looking for other interesting datasets with which to demonstrate the new method. He found a set of spatially-varying uncertainties associated with a metamodel of soil moisture in a river basin in China [3] and global oceanographic temperature fields collected monthly over 11 years from 2002 to 2012 [4]. We used the latter set of data to develop a new technique for assessing the occurrence of El Niño events in the Pacific Ocean.
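The image-decomposition step described above can be sketched as follows. This is a simplified illustration, assuming a two-dimensional Chebyshev polynomial basis and a synthetic measurement field; it is not the implementation used in the paper:

```python
# Minimal sketch of image decomposition: a field of measurements is fitted
# with low-order Chebyshev polynomials and the fitted coefficients are
# collated into a compact feature vector. Field and order are invented.
import numpy as np

def feature_vector(field, order=3):
    """Fit a 2-D Chebyshev polynomial to `field`; return its coefficients."""
    ny, nx = field.shape
    x = np.linspace(-1, 1, nx)
    y = np.linspace(-1, 1, ny)
    X, Y = np.meshgrid(x, y)
    # Design matrix of 2-D Chebyshev basis terms up to `order` in each direction
    V = np.polynomial.chebyshev.chebvander2d(X.ravel(), Y.ravel(), [order, order])
    coeffs, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)
    return coeffs  # (order + 1)**2 numbers summarising the whole field

# Toy example: a smooth synthetic "measurement" field of 2500 points
x = np.linspace(-1, 1, 50)
X, Y = np.meshgrid(x, x)
field = X**2 + 0.5 * Y
fv = feature_vector(field)
print(fv.shape)  # 16 coefficients represent 2500 measurement points
```

Two datasets can then be compared through the distance between their feature vectors instead of point-by-point across thousands of measurements.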
Our technique is based on global ocean dynamics rather than on the small region in the Pacific Ocean which is usually used and has the added advantages of providing a confidence level on the assessment as well as enabling straightforward comparisons of predictions and measurements. The comparison of predictions and measurements is a recurring theme in our current research but I did not expect it to lead into ocean dynamics.
The image shows Figure 11 from [1]: convex hulls fitted to the clouds of points representing the uncertainty intervals for the ocean temperature measurements for each month in 2002, using only the three most significant principal components. The lack of overlap between hulls can be interpreted as implying a significant difference in temperature between months.
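The overlap test implied by the caption can be sketched as below. The point clouds are synthetic stand-ins for the monthly principal-component scores, not the paper's data, and the containment-based overlap check is a simplification (two hulls can intersect without either containing a vertex of the other):

```python
# Hedged sketch: fit a convex hull to each month's cloud of points in the
# space of the three most significant principal components, then flag a
# significant difference when the hulls do not overlap. Data are synthetic.
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

def hulls_overlap(points_a, points_b):
    """Approximate test: does either hull contain a vertex of the other?"""
    da, db = Delaunay(points_a), Delaunay(points_b)
    a_in_b = (db.find_simplex(points_a) >= 0).any()
    b_in_a = (da.find_simplex(points_b) >= 0).any()
    return bool(a_in_b or b_in_a)

rng = np.random.default_rng(0)
january = rng.normal(loc=0.0, scale=1.0, size=(50, 3))  # synthetic PC scores
july = rng.normal(loc=10.0, scale=1.0, size=(50, 3))    # well-separated cloud

print(ConvexHull(january).volume)  # size of January's hull in PC space
print(hulls_overlap(january, july))  # non-overlap suggests a real difference
```

The same comparison works between a predicted cloud and a measured one, which is what makes the representation convenient for comparing predictions with measurements.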
[1] Alexiadis A, Ferson S, Patterson EA. 2021. Transformation of measurement uncertainties into low-dimensional feature vector space. Royal Society Open Science, 8(3): 201086.
[2] Lampeas G, Pasialis V, Lin X, Patterson EA. 2015. On the validation of solid mechanics models using optical measurements and data decomposition. Simulation Modelling Practice and Theory, 52: 92-107.
[3] Kang J, Jin R, Li X, Zhang Y. 2017. Block Kriging with measurement errors: a case study of the spatial prediction of soil moisture in the middle reaches of Heihe River Basin. IEEE Geoscience and Remote Sensing Letters, 14: 87-91.
[4] Gaillard F, Reynaud T, Thierry V, Kolodziejczyk N, von Schuckmann K. 2016. In situ-based reanalysis of the global ocean temperature and salinity with ISAS: variability of the heat content and steric height. Journal of Climate, 29: 1305-1323.
Last week brought excitement and disappointment in approximately equal measures for my research on tracking nanoparticles [see ‘Slow moving nanoparticles’ on December 13th, 2017 and ‘Going against the flow’ on February 3rd, 2021]. The disappointment was that our grant proposal on ‘Optical tracking of virus-cell interaction’ was not ranked highly enough to receive funding from the Engineering and Physical Sciences Research Council. Rejection is an occupational hazard for academics seeking to win grants: you learn to accept it, to take on board the constructive criticism and to look for ways of reworking the ideas into a new proposal. If you don’t compete then you can’t win. The excitement was that we have moved our apparatus for tracking nanoparticles into a new laboratory, which has been set up for it, so that we can start work on a pilot study looking at the ‘Interaction of bacteria and viruses with cellular and hard surfaces’. We are also advertising for a PhD student to start in September 2021 to work on ‘Developing pre-clinical models to optimise nanoparticle-based drug delivery for the treatment of diabetic retinopathy’. This is an exciting development because it represents our first step from fundamental research on tracking nanoparticles in biological media towards clinical applications of the technology. Diabetic retinopathy is a sight-threatening complication of diabetes that is currently managed by delivering drugs to the inside of the eye, which requires frequent visits to a clinic for injections into the vitreous fluid. There is potential to use nanoparticles to deliver drugs more efficiently, and to support these developments we plan that the PhD student will use our real-time, non-invasive, label-free tracking technology to quantify the motion of nanoparticles through the vitreous fluid and their interaction with the cells of the retina.
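For illustration only, one standard way to quantify particle motion from a measured trajectory is the mean-squared displacement (MSD), whose slope under free diffusion gives a diffusion coefficient (MSD = 4Dt in two dimensions). The simulated Brownian track below is a hypothetical stand-in; it does not represent our tracking technology or any measured data:

```python
# Illustrative sketch: estimate a diffusion coefficient from a 2-D particle
# trajectory via the mean-squared displacement. The trajectory is simulated
# Brownian motion, not real nanoparticle-tracking data.
import numpy as np

def mean_squared_displacement(track, max_lag):
    """MSD of an (N, 2) trajectory for lags 1..max_lag (in frames)."""
    msd = np.empty(max_lag)
    for lag in range(1, max_lag + 1):
        disp = track[lag:] - track[:-lag]
        msd[lag - 1] = (disp ** 2).sum(axis=1).mean()
    return msd

# Simulate 2-D Brownian motion: D = 0.5 um^2/s, dt = 0.01 s per frame
rng = np.random.default_rng(1)
D, dt, n = 0.5, 0.01, 20000
steps = rng.normal(scale=np.sqrt(2 * D * dt), size=(n, 2))
track = np.cumsum(steps, axis=0)

lags = np.arange(1, 21)
msd = mean_squared_displacement(track, 20)
D_est = np.polyfit(lags * dt, msd, 1)[0] / 4  # slope / 4 recovers D in 2-D
print(D_est)  # should be close to the simulated D of 0.5
```

In a viscoelastic medium like vitreous fluid, the MSD typically grows sub-linearly with lag time rather than linearly, which is one reason measured trajectories carry more information than a single diffusion coefficient.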