
Deep uncertainty and meta-ignorance

The term ‘unknown unknowns’ was made famous by Donald Rumsfeld almost 20 years ago when, as US Secretary of Defense, he used it to describe the lack of evidence about terrorist groups being supplied with weapons of mass destruction by the Iraqi government. However, the term was probably coined almost 50 years earlier by Joseph Luft and Harrington Ingham when they developed the Johari window as a heuristic tool to help people better understand their relationships.  In engineering, and in other fields in which predictive models are important tools, it is used to describe situations of deep uncertainty.  Deep uncertainty refers to situations where experts do not know, or cannot agree about, what models to use, how to describe the uncertainties present, or how to interpret the outcomes from predictive models.

Rumsfeld talked about known knowns, known unknowns, and unknown unknowns; an alternative, simpler but perhaps less catchy, classification is ‘the known, the unknown, and the unknowable’, which Diebold, Doherty and Herring used as part of the title of their book on financial risk management.  David Spiegelhalter suggests ‘risk, uncertainty and ignorance’ before providing a more sophisticated classification: aleatory uncertainty, epistemic uncertainty and ontological uncertainty.  Aleatory uncertainty is the inevitable unpredictability of the future that can be fully described using probability.  Epistemic uncertainty is a lack of knowledge about the structure and parameters of the models used to predict the future.  Ontological uncertainty is a complete lack of knowledge and understanding of the entire modelling process, i.e. deep uncertainty.  When we fail to recognise that ontological uncertainty is present, we have meta-ignorance: not even considering the possibility of being wrong.

For a number of years, part of my research effort has been focussed on predictive models that are unprincipled and untestable; in other words, they are not built on widely-accepted principles or scientific laws and it is not feasible to conduct physical tests to acquire data to demonstrate their validity [see editorial ‘On the credibility of engineering models and meta-models’, JSA 50(4):2015].  Some people would say that untestability implies a model is not scientific, following Popper’s requirement that a scientific theory be refutable.  In reality, however, unprincipled and untestable models are encountered in a range of fields, including space engineering, fusion energy and toxicology.  We have developed a set of credibility factors designed as a heuristic tool that allows the relevance of such models and their predictions to be evaluated systematically [see ‘Credible predictions for regulatory decision-making‘ on December 9th, 2020].  One outcome is to allow experts to agree on their disagreements and ignorance, i.e., to define the extent of our ontological uncertainty, which is an important step towards making rational decisions about the future when there is deep uncertainty.
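To make Spiegelhalter’s distinction concrete, here is a minimal sketch in Python (a hypothetical illustration with made-up numbers, not taken from any of the cited works): the scatter in the simulated measurements is aleatory and fully described by a probability distribution, while the uncertainty in the estimated mean is epistemic and shrinks as data accumulate. Ontological uncertainty, by construction, cannot appear in the script at all.

```python
# A minimal sketch (hypothetical values, not from the post) contrasting
# aleatory and epistemic uncertainty in a simple measurement scenario.
import numpy as np

rng = np.random.default_rng(42)

def measure_load(n):
    """Simulated load measurements in kN; the scatter is aleatory --
    inherent to the process and fully described by a distribution."""
    return rng.normal(loc=100.0, scale=5.0, size=n)

# Epistemic uncertainty: our ignorance of the true mean load. It is
# reducible -- the confidence interval shrinks as data accumulate.
for n in (5, 500):
    sample = measure_load(n)
    sem = sample.std(ddof=1) / np.sqrt(n)  # standard error of the mean
    print(f"n={n:3d}: mean = {sample.mean():6.2f} +/- {1.96 * sem:.2f} kN")

# Ontological uncertainty never appears in the script: nothing here can
# reveal whether the Gaussian model of the load is itself wrong.
```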

References

Diebold FX, Doherty NA, Herring RJ, eds. The Known, the Unknown, and the Unknowable in Financial Risk Management: Measurement and Theory Advancing Practice. Princeton, NJ: Princeton University Press, 2010.

Spiegelhalter D. Risk and uncertainty communication. Annual Review of Statistics and Its Application, 4, pp. 31-60, 2017.

Patterson EA, Whelan MP. On the validation of variable fidelity multi-physics simulations. J. Sound and Vibration. 448:247-58, 2019.

Patterson EA, Whelan MP, Worth AP. The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application. Computational Toxicology. 100144, 2020.

Noisy progressive failure of a composite panel

Image: close-up of progressive failure in a composite material.

Composite materials have revolutionized many fields of engineering by providing lightweight, strong components whose internal structure can be tailored to optimise their load-bearing capabilities. Engineering composites consist of high-strength fibres embedded in a lightweight matrix that keeps the fibres in position and provides the shape of the component.  While many composite materials have an impressive structural performance, some also exhibit spectacular failure modes, with noises like guitar strings snapping as fibres start to fail and with jagged eruptions of material appearing on the surface, as shown in the image.  A year ago, I reported on our work in the DIMES project to test the capabilities of our integrated measurement system to detect and track damage in real-time in a metallic section from an aircraft wing [see ‘Condition monitoring using infrared imaging‘ on June 17th, 2020].  Last month, we completed a further round of tests at Empa to demonstrate the system’s capabilities on composite structures, which were tested almost to destruction.  One of the advantages of composite structures is their capability to function and bear load despite quite high levels of damage, which meant we were able to record the progressive rupture of one of our test panels during cyclic fatigue loading.  Watch and listen to this short video to see and hear the material being torn apart; ignore the loud creaking and groaning from the test rig, as the failure is the quieter sound, like dead leaves being swept up.

The University of Liverpool is the coordinator of the DIMES project and the other partners are Empa, Dantec Dynamics GmbH and Strain Solutions Ltd. Airbus is the topic manager on behalf of the Clean Sky 2 Joint Undertaking.

The DIMES project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 820951.


The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.

An upside to lockdown

While pandemic lockdowns and travel bans are having a severe impact on spontaneity and creativity in research [see ‘Lacking creativity‘ on October 28th, 2020], they have induced a high level of ingenuity in achieving the final objective of the DIMES project, which is to conduct prototype demonstrations and evaluation tests of the DIMES integrated measurement system.  We have gone beyond the project brief by developing a remote installation system that allows local engineers at a test site to set up and run our measurement system. This has saved thousands of air miles and several tonnes of CO2 emissions, as well as hours waiting in airport terminals and sitting in planes.  These savings were made by members of our project team working remotely from their bases in Chesterfield, Liverpool, Ulm and Zurich, instead of flying to the test site in Toulouse to perform the installation in a section of a fuselage and then visiting a second time to conduct the evaluation tests.  For this first remote installation, we were fortunate to have our collaborator from Airbus available to support us [see ‘Most valued player performs remote installation‘ on December 2nd, 2020].  We are about to stretch our capabilities further by conducting a remote installation and evaluation test during a full-scale aircraft test at the Aerospace Research Centre of the National Research Council Canada in Ottawa, with a team who had never seen the DIMES system and knew nothing about it until about a month ago.  I could claim that this remote installation and test will save another couple of tonnes of CO2; but, in practice, we would probably not be performing a demonstration in Canada at all if we had not developed the remote installation capability.

The University of Liverpool is the coordinator of the DIMES project and the other partners are Empa, Dantec Dynamics GmbH and Strain Solutions Ltd. Airbus is our topic manager.

The DIMES project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 820951.  The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.


From strain measurements to assessing El Niño events

One of the exciting aspects of leading a university research group is that you can never be quite sure where the research is going next.  We published a nice example of this unpredictability last week in Royal Society Open Science in a paper called ‘Transformation of measurement uncertainties into low-dimensional feature vector space‘ [1].  While the title is an accurate description of the contents, it does not give much away, and it certainly does not reveal that we proposed a new method for assessing the occurrence of El Niño events.  For some time, we have been working with massive datasets of measurements from arrays of sensors, representing them by fitting polynomials in a process known as image decomposition [see ‘Recognising strain‘ on October 28th, 2015]. The relatively small number of coefficients from these polynomials can be collated into a feature vector, which facilitates comparison with other datasets [see, for example, ‘Out of the valley of death into a hype cycle‘ on February 24th, 2021].  Our recent paper provides a solution to the issue of representing the measurement uncertainty in the same space as the feature vector, which is roughly what we set out to do.  We demonstrated our new method for representing the measurement uncertainty by calibrating and validating a computational model of a simple beam in bending, using data from an earlier study in an EU-funded project called VANESSA [2], so no surprises there.

However, my co-author and PhD student, Antonis Alexiadis, then went looking for other interesting datasets with which to demonstrate the new method.  He found a set of spatially-varying uncertainties associated with a metamodel of soil moisture in a river basin in China [3], and global oceanographic temperature fields collected monthly over 11 years from 2002 to 2012 [4].  We used the latter dataset to develop a new technique for assessing the occurrence of El Niño events in the Pacific Ocean.  Our technique is based on global ocean dynamics, rather than on the small region of the Pacific Ocean that is usually used, and it has the added advantages of providing a confidence level on the assessment and enabling straightforward comparisons of predictions and measurements.  The comparison of predictions and measurements is a recurring theme in our current research, but I did not expect it to lead into ocean dynamics.
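As a rough illustration of the image-decomposition idea, the sketch below fits a two-dimensional Chebyshev polynomial to a synthetic field of measurements and collates the coefficients into a feature vector. The field, the choice of basis and the degree are all assumptions made for illustration; the papers cited describe the actual procedure.

```python
# A rough sketch of image decomposition (assumed basis and degree; see
# the cited papers for the actual procedure): a large 2-D measurement
# field is reduced to a short feature vector of polynomial coefficients.
import numpy as np
from numpy.polynomial import chebyshev as C

# Hypothetical 100 x 100 'measurement' field on [-1, 1] x [-1, 1]
x = np.linspace(-1.0, 1.0, 100)
X, Y = np.meshgrid(x, x)
rng = np.random.default_rng(0)
field = np.sin(2.0 * X) * np.cos(Y) + 0.01 * rng.normal(size=X.shape)

# 2-D Chebyshev design matrix up to degree 4 in each direction
deg = 4
V = C.chebvander2d(X.ravel(), Y.ravel(), [deg, deg])  # (10000, 25)

# Least-squares fit: 10,000 data points -> 25 coefficients
feature_vector, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)

# The residual indicates how faithfully the feature vector represents
# the field; comparisons between datasets happen in coefficient space.
residual = (V @ feature_vector).reshape(field.shape) - field
print(f"{field.size} points -> {feature_vector.size} coefficients; "
      f"RMS residual {np.sqrt(np.mean(residual**2)):.4f}")
```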

Image is Figure 11 from [1] showing convex hulls fitted to the clouds of points representing the uncertainty intervals for the ocean temperature measurements for each month in 2002, using only the three most significant principal components. The lack of overlap between hulls can be interpreted as implying a significant difference in the temperature between months.
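The overlap test implied by the figure can be made precise: two convex hulls intersect if and only if some point is a convex combination of both point sets, which is a linear-programming feasibility problem. Here is a minimal sketch, with hypothetical point clouds standing in for the monthly principal-component scores; the helper function and the data are assumptions for illustration, not code from the paper.

```python
# A minimal sketch (hypothetical data; 'hulls_overlap' is an assumed
# helper, not from the paper) of testing whether two convex hulls in the
# reduced feature space intersect, posed as an LP feasibility problem.
import numpy as np
from scipy.optimize import linprog

def hulls_overlap(P, Q):
    """True if conv(P) and conv(Q) share a point, i.e. there exist convex
    weights l, m >= 0 with sum(l) = sum(m) = 1 and P.T @ l == Q.T @ m."""
    (nP, d), (nQ, _) = P.shape, Q.shape
    A_eq = np.zeros((d + 2, nP + nQ))
    A_eq[:d, :nP], A_eq[:d, nP:] = P.T, -Q.T   # common-point condition
    A_eq[d, :nP] = 1.0                         # weights on P sum to 1
    A_eq[d + 1, nP:] = 1.0                     # weights on Q sum to 1
    b_eq = np.concatenate([np.zeros(d), [1.0, 1.0]])
    res = linprog(np.zeros(nP + nQ), A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (nP + nQ))
    return res.success  # feasible <=> the hulls overlap

rng = np.random.default_rng(1)
month_a = rng.normal(0.0, 1.0, size=(50, 3))  # stand-in PC scores
month_b = rng.normal(5.0, 1.0, size=(50, 3))  # well-separated cloud
print(hulls_overlap(month_a, month_b))        # False: distinct months
print(hulls_overlap(month_a, month_a + 0.5))  # True: hulls overlap
```

In the terms used in the caption, a result of False for a pair of months corresponds to non-overlapping hulls and hence a significant difference in temperature.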

References:

[1] Alexiadis A, Ferson S, Patterson EA. 2021. Transformation of measurement uncertainties into low-dimensional feature vector space. Royal Society Open Science, 8(3): 201086.

[2] Lampeas G, Pasialis V, Lin X, Patterson EA. 2015. On the validation of solid mechanics models using optical measurements and data decomposition. Simulation Modelling Practice and Theory, 52, 92-107.

[3] Kang J, Jin R, Li X, Zhang Y. 2017. Block Kriging with measurement errors: a case study of the spatial prediction of soil moisture in the middle reaches of Heihe River Basin. IEEE Geoscience and Remote Sensing Letters, 14, 87-91.

[4] Gaillard F, Reynaud T, Thierry V, Kolodziejczyk N, von Schuckmann K. 2016. In situ-based reanalysis of the global ocean temperature and salinity with ISAS: variability of the heat content and steric height. J. Climate. 29, 1305-1323.