Tag Archives: uncertainty

Some things will always be unknown

The philosophy of science has oscillated between believing that everything is knowable and that some things will always be unknowable. In 1872, the German physiologist Emil du Bois-Reymond declared ‘we do not know and will not know’, implying that there would always be limits to our scientific knowledge. Thirty years later, the German mathematician David Hilbert stated that nothing is unknowable in the natural sciences; he believed that by considering some things to be unknowable we limited our ability to know. However, Kurt Gödel, a Viennese mathematician who moved to Princeton in 1940, demonstrated in his incompleteness theorems that any consistent formal system capable of expressing basic arithmetic will contain statements that are true but unprovable within it, and that such a system cannot demonstrate its own consistency. I think this implies that some things will remain unknowable, or at least uncertain. Gödel believed that his theorems implied the human mind is infinitely more powerful than any finite machine, and Roger Penrose has deployed the incompleteness theorems to argue that consciousness transcends the formal logic of computers, which perhaps implies that artificial intelligence will never replace human intelligence [see ‘Four requirements for consciousness‘ on January 22nd, 2020].  At a more mundane level, Gödel’s theorems imply that engineers will always have to deal with the unknowable when using mathematical models to predict the behaviour of complex systems and, of course, to avoid meta-ignorance, we have to assume that there are always unknown unknowns [see ‘Deep uncertainty and meta-ignorance‘ on July 21st, 2021].

Source: Book review by Nick Stephen, ‘Journey to the Edge of Reason by Stephen Budiansky – ruthless logic‘ FT Weekend, 1st June 2021.

Deep uncertainty and meta-ignorance

The term ‘unknown unknowns’ was made famous by Donald Rumsfeld almost 20 years ago when, as US Secretary of Defense, he used it to describe the lack of evidence about terrorist groups being supplied with weapons of mass destruction by the Iraqi government. However, the term was probably coined almost 50 years earlier by Joseph Luft and Harrington Ingham when they developed the Johari window as a heuristic tool to help people better understand their relationships.  In engineering, and other fields in which predictive models are important tools, it is used to describe situations of deep uncertainty.  Deep uncertainty refers to situations where experts do not know, or cannot agree about, what models to use, how to describe the uncertainties present, or how to interpret the outcomes from predictive models.  Rumsfeld talked about known knowns, known unknowns and unknown unknowns; an alternative, simpler but perhaps less catchy, classification is ‘the known, the unknown, and the unknowable‘, which Diebold, Doherty and Herring used as part of the title of their book on financial risk management.  David Spiegelhalter suggests ‘risk, uncertainty and ignorance’ before providing a more sophisticated classification: aleatory, epistemic and ontological uncertainty.  Aleatory uncertainty is the inevitable unpredictability of the future, which can be fully described using probability.  Epistemic uncertainty is a lack of knowledge about the structure and parameters of the models used to predict the future.  Ontological uncertainty is a complete lack of knowledge and understanding of the entire modelling process, i.e. deep uncertainty.  When the presence of ontological uncertainty goes unrecognised, we have meta-ignorance: failing to even consider the possibility of being wrong.
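Spiegelhalter’s distinction between aleatory and epistemic uncertainty can be made concrete with a toy sketch (the dice example and the candidate models below are my own illustration, not taken from Spiegelhalter or the other sources):

```python
import random

random.seed(1)

# Aleatory uncertainty: the outcome of a fair die is irreducibly random,
# yet fully described by probability -- more rolls sharpen our estimate
# of the mean, but can never remove the randomness of any single roll.
rolls = [random.randint(1, 6) for _ in range(10_000)]
mean_roll = sum(rolls) / len(rolls)
print(f"estimated mean roll: {mean_roll:.2f}")  # close to 3.5

# Epistemic uncertainty: we do not know the die's bias towards a six,
# so we entertain several candidate models; gathering data could, in
# principle, reduce this uncertainty by ruling candidate models out.
candidate_p_six = [1/6, 1/4, 1/3]   # hypothetical candidate models
weights = [1/3, 1/3, 1/3]           # equal belief in each candidate
p_six = sum(w * p for w, p in zip(weights, candidate_p_six))
print(f"probability of a six, averaged over models: {p_six:.3f}")
```

Ontological uncertainty has no counterpart in this sketch, which is precisely the point: it is the possibility that the whole framing, dice, candidates and all, is the wrong model of the situation.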
For a number of years, part of my research effort has been focussed on predictive models that are unprincipled and untestable; in other words, they are not built on widely-accepted principles or scientific laws, and it is not feasible to conduct physical tests to acquire data that would demonstrate their validity [see editorial ‘On the credibility of engineering models and meta-models‘, JSA 50(4):2015].  Some would say that untestability makes a model unscientific, following Popper’s requirement that a scientific theory must be refutable.  In reality, however, unprincipled and untestable models are encountered in a range of fields, including space engineering, fusion energy and toxicology.  We have developed a set of credibility factors designed as a heuristic tool for systematically evaluating the relevance of such models and their predictions [see ‘Credible predictions for regulatory decision-making‘ on December 9th, 2020].  One outcome is to allow experts to agree on their disagreements and ignorance, i.e. to define the extent of our ontological uncertainty, which is an important step towards making rational decisions about the future under deep uncertainty.

References

Diebold FX, Doherty NA, Herring RJ, eds. The Known, the Unknown, and the Unknowable in Financial Risk Management: Measurement and Theory Advancing Practice. Princeton, NJ: Princeton University Press, 2010.

Spiegelhalter D. Risk and uncertainty communication. Annual Review of Statistics and Its Application. 4:31-60, 2017.

Patterson EA, Whelan MP. On the validation of variable fidelity multi-physics simulations. J. Sound and Vibration. 448:247-58, 2019.

Patterson EA, Whelan MP, Worth AP. The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application. Computational Toxicology. 100144, 2020.

Negative capability and optimal ambiguity

How is your negative capability?  The very term ‘negative capability’ conveys confusion and ambiguity.  It means our ability to accept uncertainty, or a lack of knowledge or control.  It was coined by John Keats to describe the skill of appreciating something without fully understanding it, and it implies suspending judgment about something in order to learn more about it.  This is difficult because we have to move out of a low-entropy mindset and consider how it fits within a range of possible mindsets, or neuronal assemblies, which raises our psychological entropy and, with it, our anxiety and mental stress [see ’Psychological entropy increased by effectual leaders‘ on February 10th, 2021].  If we can tolerate an optimal level of ambiguity and uncertainty then we might be able to develop an appreciation of a complex system, and even an ability to anticipate its behaviour, without full knowledge or understanding of it.  Our subconscious brain has excellent negative capability; for example, most of us can catch a ball without understanding, or even knowing, anything about the mechanics of its flight towards us, and we will accept a ride home from a friend with no knowledge of their driving skills and no control over the vehicle.  However, if our conscious brain knows that the friend crashed their car last week then it might override the subconscious and cause us to think again, and perhaps decline the offer of a ride home.  Perhaps this is because our conscious brain tends to have less negative capability and likes to be in control.  Engineers like to talk about their intuition, which is probably synonymous with their negative capability because it is their ability to appreciate and anticipate the behaviour of an engineering system without full knowledge and understanding of it.
This intuition is usually based on experience and probably resides in the subconscious mind: ask an engineer to explain a decision or prediction based on their intuition and they will likely struggle to provide a complete and rational explanation.  They are comfortable with an optimal level of ambiguity, although of course you might not be so comfortable.

Sources:

Richard Gunderman, ‘John Keats’ concept of ‘negative capability’ – or sitting in uncertainty –  is needed now more than ever’.  The Conversation, February 21st, 2021.

David Jeffery, Letter: Keats was uneasy about the pursuit of perfection.  FT Weekend, April 2nd, 2021.

Caputo JD. Truth: philosophy in transit. London: Penguin, 2013.

Near earth objects make tomorrow a little less than certain

A couple of weeks ago, I wrote about unattainable certainty [see ‘Certainty is unattainable and near-certainty unaffordable’ on May 12th, 2021] and you might have thought that some things are certain, such as that tomorrow will follow today.  However, even that is not certain: it has been estimated that there is a 1 in 300,000 chance of an asteroid impact on earth in the next one hundred years resulting in more than one million fatalities.  It might seem a very small probability that an asteroid impact will stop you being around tomorrow; however, as Sir David Spiegelhalter has pointed out, if that probability of fatalities were associated with an industrial installation then the UK Health and Safety Executive would consider it an intolerable risk.  By the way, if you want a more accurate estimate of the probability that an asteroid impact will prevent you seeing tomorrow, NASA provides information about the next Near Earth Object (NEO) to pass within 10 lunar distances (one lunar distance is the distance between the moon and the earth, about 384,000 km) at https://cneos.jpl.nasa.gov/.  In the last twelve months, 121 NEOs came within one lunar distance.  The largest had a diameter of between 88 m and 200 m, about the size of an Olympic stadium, and came within 310,000 km; the closest came within 8,000 km, less than the Earth’s diameter of 12,742 km, and was between 4.8 m and 11 m in diameter, about the size of two double-decker buses.  Spiegelhalter reassures us that there is no record of anyone, except a cow, being killed by an asteroid, whereas tragically the same cannot be said of double-decker buses!
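The quoted risk figure can be put on a more familiar footing with some back-of-the-envelope arithmetic (the 1-in-300,000 chance and one-million-fatality threshold are from the text above; spreading the risk uniformly over the century is my own simplifying assumption):

```python
# A 1-in-300,000 chance per century of an impact causing more than
# one million fatalities, assumed uniform over the hundred years.
p_per_century = 1 / 300_000
p_per_year = p_per_century / 100
expected_fatalities_per_year = p_per_year * 1_000_000

print(f"annual probability of such an impact: {p_per_year:.2e}")
print(f"expected fatalities per year: {expected_fatalities_per_year:.3f}")

# The NEO distances quoted above, expressed in lunar distances (LD).
LD_KM = 384_000
print(f"10 LD = {10 * LD_KM:,} km")
print(f"closest approach of 8,000 km = {8_000 / LD_KM:.3f} LD")
```

So even a once-in-300-millennia event carries an expected toll of a few fatalities per century once the consequence is large enough, which is why regulators judge such risks by their consequences as well as their probabilities.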

Sources:

Reinhardt JC, Chen X, Liu W, Manchev P, Pate-Cornell ME. Asteroid risk assessment: a probabilistic approach. Risk Anal. 36:244–61, 2016.

Spiegelhalter D. Risk and uncertainty communication. Annual Review of Statistics and Its Application. 4:31-60, 2017.