Deep uncertainty and meta-ignorance

The term ‘unknown unknowns’ was made famous by Donald Rumsfeld almost 20 years ago when, as US Secretary of Defense, he used it to describe the lack of evidence about terrorist groups being supplied with weapons of mass destruction by the Iraqi government. However, the term was probably coined almost 50 years earlier by Joseph Luft and Harrington Ingham when they developed the Johari window as a heuristic tool to help people better understand their relationships. In engineering, and other fields in which predictive models are important tools, it is used to describe situations about which there is deep uncertainty. Deep uncertainty refers to situations where experts do not know or cannot agree about what models to use, how to describe the uncertainties present, or how to interpret the outcomes from predictive models. Rumsfeld talked about known knowns, known unknowns and unknown unknowns; an alternative, simpler but perhaps less catchy, classification is ‘the known, the unknown and the unknowable’, which Diebold, Doherty and Herring used as part of the title of their book on financial risk management. David Spiegelhalter suggests ‘risk, uncertainty and ignorance’ before providing a more sophisticated classification: aleatory uncertainty, epistemic uncertainty and ontological uncertainty. Aleatory uncertainty is the inevitable unpredictability of the future that can be fully described using probability. Epistemic uncertainty is a lack of knowledge about the structure and parameters of models used to predict the future. Ontological uncertainty is a complete lack of knowledge and understanding about the entire modelling process, i.e. deep uncertainty. When it is not recognised that ontological uncertainty is present, we have meta-ignorance, which means failing to even consider the possibility of being wrong. For a number of years, part of my research effort has been focussed on predictive models that are unprincipled and untestable; in other words, they are not built on widely-accepted principles or scientific laws and it is not feasible to conduct physical tests to acquire data to demonstrate their validity [see editorial ‘On the credibility of engineering models and meta-models‘, JSA 50(4):2015]. Some people would say that untestability makes a model unscientific, following Popper’s requirement that a scientific theory must be refutable. However, in reality, unprincipled and untestable models are encountered in a range of fields, including space engineering, fusion energy and toxicology. We have developed a set of credibility factors designed as a heuristic tool to allow the relevance of such models and their predictions to be evaluated systematically [see ‘Credible predictions for regulatory decision-making‘ on December 9th, 2020]. One outcome is to allow experts to agree on their disagreements and ignorance, i.e., to define the extent of our ontological uncertainty, which is an important step towards making rational decisions about the future under deep uncertainty.
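To make the distinction between aleatory and epistemic uncertainty concrete, here is a minimal Monte Carlo sketch in Python; the distributions and parameter values are invented purely for illustration, and note that ontological uncertainty, by definition, has no representation in such a simulation:

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 100_000

# Aleatory uncertainty: irreducible randomness in the process itself,
# fully describable by a probability distribution (invented scatter here).
def simulate(strength_mean):
    scatter = rng.normal(0.0, 5.0, n_samples)
    return strength_mean + scatter

# Epistemic uncertainty: lack of knowledge about the model's parameters,
# represented by sampling over a range of plausible parameter values.
plausible_means = rng.uniform(90.0, 110.0, size=50)

predictions = np.concatenate([simulate(m) for m in plausible_means])
print(f"mean = {predictions.mean():.1f}, spread = {predictions.std():.1f}")

# Ontological uncertainty cannot appear in this script: if the model
# structure itself is wrong, no amount of sampling will reveal it.
```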

References

Diebold FX, Doherty NA, Herring RJ, eds. The Known, the Unknown, and the Unknowable in Financial Risk Management: Measurement and Theory Advancing Practice. Princeton, NJ: Princeton University Press, 2010.

Spiegelhalter D. Risk and uncertainty communication. Annual Review of Statistics and Its Application, 4:31-60, 2017.

Patterson EA, Whelan MP. On the validation of variable fidelity multi-physics simulations. J. Sound and Vibration. 448:247-58, 2019.

Patterson EA, Whelan MP, Worth AP. The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application. Computational Toxicology. 100144, 2020.

Negative capability and optimal ambiguity

How is your negative capability? The very term ‘negative capability’ conveys confusion and ambiguity. It means our ability to accept uncertainty, a lack of knowledge or control. It was coined by John Keats to describe the skill of appreciating something without fully understanding it. It implies suspending judgment about something in order to learn more about it. This is difficult because we have to move out of a low-entropy mindset and consider how it fits in a range of possible mindsets or neuronal assemblies, which raises our psychological entropy and with it our anxiety and mental stress [see ’Psychological entropy increased by effectual leaders‘ on February 10th, 2021]. If we are able to tolerate an optimal level of ambiguity and uncertainty then we might be able to develop an appreciation of a complex system and even an ability to anticipate its behaviour without a full knowledge or understanding of it. Our sub-conscious brain has excellent negative capability; for example, most of us can catch a ball without understanding, or even knowing, anything about the mechanics of its flight towards us, and we accept a ride home from a friend with no knowledge of their driving skills and no control over the vehicle. However, if our conscious brain knows that the friend crashed their car last week, it might override the sub-conscious and cause us to think again before accepting the offer of a ride home. Perhaps this is because our conscious brain tends to have less negative capability and likes to be in control. Engineers like to talk about their intuition, which is probably synonymous with their negative capability because it is their ability to appreciate and anticipate the behaviour of an engineering system without a full knowledge and understanding of it. This intuition is usually based on experience and perhaps resides in the subconscious mind, because if you ask engineers to explain a decision or prediction based on their intuition, they will probably struggle to provide a complete and rational explanation. They are comfortable with an optimal level of ambiguity, although of course you might not be so comfortable.

Sources:

Richard Gunderman, ‘John Keats’ concept of ‘negative capability’ – or sitting in uncertainty – is needed now more than ever’. The Conversation, February 21st, 2021.

David Jeffery, Letter: Keats was uneasy about the pursuit of perfection.  FT Weekend, April 2nd, 2021.

Caputo JD. Truth: philosophy in transit. London: Penguin, 2013.

Digital twins that thrive in the real-world

Windows of the Soul II [3D video art installation: http://www.haigallery.com/sonia-falcone/]

Digital twins are becoming ubiquitous in many areas of engineering [see ‘Can you trust your digital twin?‘ on November 23rd, 2016]. However, at the same time, the terminology is becoming blurred as digital shadows and digital models are treated as if they were synonymous with digital twins. A digital model is a digitised replica of a physical entity that lacks any automatic data exchange between the entity and its replica. A digital shadow is the digital representation of a physical object with a one-way flow of information from the object to its representation. A digital twin, by contrast, is a functional representation with a live feedback loop to its counterpart in the real world. The feedback loop is based on continuous updates to the digital twin about the condition and performance of the physical entity, using data from sensors, and on analysis from the digital twin about the performance of the physical entity. This enables a digital twin to provide a service to many stakeholders. For example, the users of a digital twin of an aircraft engine could include the manufacturer, the operator, the maintenance providers and the insurers. These capabilities imply that digital twins are themselves becoming products which exist in a digital context that might connect many digital products, thus forming an integrated digital environment. I wrote about integrated digital environments when they were a concept and the primary challenges were technical in nature [see ‘Enabling or disruptive technology for nuclear engineering?‘ on January 28th, 2015]. Many of these technical challenges have been resolved, and the next set of challenges are economic and commercial ones associated with launching digital twins into global markets that lack adequate understanding, legislation, security, regulation or governance for digital products. In collaboration with my colleagues at the Virtual Engineering Centre, we have recently published a white paper, entitled ‘Transforming digital twins into digital products that thrive in the real world‘, that reviews these issues and identifies the need to establish digital contexts that embrace the social, economic and technical requirements for the appropriate use of digital twins [see ‘Digital twins could put at risk what it means to be human‘ on November 18th, 2020].
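The distinction between the three terms is essentially one of data flow, which the following minimal Python sketch tries to capture; all class names, attributes and thresholds are hypothetical, chosen only to illustrate the direction of the information flows described above:

```python
from dataclasses import dataclass

@dataclass
class PhysicalAsset:
    """Hypothetical stand-in for a physical entity, e.g. an engine."""
    temperature: float = 300.0  # kelvin, illustrative value

class DigitalModel:
    """Digitised replica: no automatic data exchange in either direction."""
    def __init__(self, temperature: float):
        self.temperature = temperature  # set once, by hand

class DigitalShadow(DigitalModel):
    """One-way flow: the representation is updated from the object."""
    def sync_from(self, asset: PhysicalAsset) -> None:
        self.temperature = asset.temperature  # object -> representation only

class DigitalTwin(DigitalShadow):
    """Two-way flow: analysis feeds back to the physical counterpart."""
    def feedback_to(self, asset: PhysicalAsset) -> None:
        if self.temperature > 350.0:   # invented operating limit
            asset.temperature -= 10.0  # e.g. request extra cooling
```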

Going against the flow

Last week I wrote about research we have been carrying out over the last decade that is being applied to large-scale structures in the aerospace industry (see ‘Slowly crossing the valley of death‘ on January 27th, 2021). I also work on very much smaller ‘structures’ that are only tens of nanometres in diameter, or about a billion times smaller than the test samples in last week’s post (see ‘Toxic nanoparticles?‘ on November 13th, 2013). The connection is the use of light to measure shape, deformation and motion, and then the use of those measurements to validate predictions from theoretical or computational models. About three years ago, we published research which demonstrated that the motion of very small particles (less than about 300 nanometres) at low concentrations (less than about a billion per millilitre) in a fluid is dominated by the molecules of the fluid rather than by interactions between the particles (see Coglitore et al, 2017 and ‘Slow moving nanoparticles‘ on December 13th, 2017). These data confirmed results from earlier molecular dynamics simulations that contradicted predictions using the Stokes-Einstein equation, which Einstein derived in his PhD thesis for a ‘Stokes’ particle undergoing Brownian motion. The Stokes-Einstein equation works well for large particles, but the physics of motion changes when the particles are very small and far apart, so that van der Waals forces and electrostatic forces play a dominant role, as we have shown in a more recent paper (see Giorgi et al, 2019). This becomes relevant when evaluating nanoparticles as potential drug-delivery systems or assessing their toxicological impact. We have shown recently that instruments based on dynamic scattering of light from nanoparticles are likely to be inaccurate because they are based on fitting measurement data to the Stokes-Einstein equation. In a paper published last month, we found that asymmetric flow field-flow fractionation (AF4) in combination with dynamic light scattering, when used to measure the size of nanoparticles in suspension, tended to over-estimate the diameter of particles smaller than 60 nanometres at low concentrations by up to a factor of two (see Giorgi et al, 2021). Someone commented recently that our work in this area is not highly cited, but perhaps this is unsurprising when it undermines a current paradigm. We have certainly learnt to handle rejection letters, to redouble our efforts to demonstrate the rigour of our research and to present conclusions in a manner that appears to build on existing knowledge rather than demolishing it.
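For context, the Stokes-Einstein equation relates the diffusion coefficient D of a spherical particle to its hydrodynamic diameter d via D = k_BT/(3πμd), where k_B is the Boltzmann constant, T the absolute temperature and μ the dynamic viscosity of the fluid. The short Python sketch below shows the inversion that sizing instruments effectively perform; the measured diffusion coefficient is an invented, illustrative value, and the point of the post is precisely that this inversion becomes unreliable for very small particles at low concentrations:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diameter(D, T=293.15, mu=1.0e-3):
    """Invert the Stokes-Einstein relation D = k_B*T / (3*pi*mu*d)
    to recover the hydrodynamic diameter d (m) from a measured
    diffusion coefficient D (m^2/s), as DLS instruments do.
    mu is the fluid's dynamic viscosity (Pa.s; water is ~1e-3)."""
    return k_B * T / (3.0 * math.pi * mu * D)

# Illustrative diffusion coefficient, typical of a ~60 nm sphere in water
D_measured = 7.2e-12  # m^2/s
d = stokes_einstein_diameter(D_measured)
print(f"inferred diameter = {d * 1e9:.1f} nm")
```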

Sources:

Coglitore, D., Edwardson, S.P., Macko, P., Patterson, E.A. and Whelan, M., 2017. Transition from fractional to classical Stokes-Einstein behaviour in simple fluids. Royal Society Open Science, 4(12), p.170507.

Giorgi, F., Coglitore, D., Curran, J.M., Gilliland, D., Macko, P., Whelan, M., Worth, A. and Patterson, E.A., 2019. The influence of inter-particle forces on diffusion at the nanoscale. Scientific Reports, 9(1), pp.1-6.

Giorgi, F., Curran, J.M., Gilliland, D., La Spina, R., Whelan, M.P. and Patterson, E.A., 2021. Limitations of nanoparticles size characterization by asymmetric flow field-fractionation coupled with online dynamic light scattering. Chromatographia, doi.org/10.1007/s10337-020-03997-7.

Image is a photograph of a fast-flowing mountain river taken in Yellowstone National Park during a road trip across the USA in 2006.