The economists John Kay and Mervyn King assert in their book ‘Radical Uncertainty – decision-making beyond numbers’ that ‘economic forecasting is necessarily harder than weather forecasting’ because the world of economics is non-stationary whereas the weather is governed by unchanging laws of nature. Kay and King observe that both central banks and meteorological offices have ‘to convey inescapable uncertainty to people who crave unavailable certainty’. In other words, the necessary assumptions and idealisations, combined with the inaccuracies of the input data, of both economic and meteorological models produce inevitable uncertainty in the predictions. However, people seeking to make decisions based on the predictions want certainty because it is very difficult to make choices when faced with uncertainty – it raises our psychological entropy [see ‘Psychological entropy increased by ineffective leaders’ on February 10th, 2021]. Engineers face a similar difficulty: providing systems with inescapable uncertainties in their reliability to people who desire unavailable certainty. The second law of thermodynamics ensures that perfection is unattainable [see ‘Impossible perfection’ on June 5th, 2013] and that there will always be flaws of some description present in a system [see ‘Scattering electrons reveal dislocations in material structure’ on November 11th, 2020]. Of course, we can expend more resources to eliminate flaws and increase the reliability of a system, but the second law will always limit our success. Consequently, to finish where I started with a quote from Kay and King, ‘certainty is unattainable and the price of near-certainty unaffordable’ in both economics and engineering.
One of the exciting aspects of leading a university research group is that you can never be quite sure where the research is going next. We published a nice example of this unpredictability last week in Royal Society Open Science in a paper called ‘Transformation of measurement uncertainties into low-dimensional feature vector space’. While the title is an accurate description of the contents, it does not give much away and certainly does not reveal that we proposed a new method for assessing the occurrence of El Niño events. For some time we have been working with massive datasets of measurements from arrays of sensors and representing them by fitting polynomials in a process known as image decomposition [see ‘Recognising strain’ on October 28th, 2015]. The relatively small number of coefficients from these polynomials can be collated into a feature vector which facilitates comparison with other datasets [see, for example, ‘Out of the valley of death into a hype cycle’ on February 24th, 2021]. Our recent paper provides a solution to the issue of representing the measurement uncertainty in the same space as the feature vector, which is roughly what we set out to do. We demonstrated our new method for representing the measurement uncertainty by calibrating and validating a computational model of a simple beam in bending, using data from an earlier study in an EU-funded project called VANESSA — so no surprises there. However, my co-author and PhD student, Antonis Alexiadis, then went looking for other interesting datasets with which to demonstrate the new method. He found a set of spatially-varying uncertainties associated with a metamodel of soil moisture in a river basin in China, and global oceanographic temperature fields collected monthly over 11 years from 2002 to 2012. We used the latter set of data to develop a new technique for assessing the occurrence of El Niño events in the Pacific Ocean.
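The decomposition step can be sketched in code. The polynomial basis and fitting procedure used in our papers differ in detail, so treat the Chebyshev basis and least-squares fit below as illustrative assumptions rather than the published method:

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def feature_vector(field, degree=4):
    """Fit a 2D Chebyshev polynomial to a measurement field and return
    its coefficients as a low-dimensional feature vector.
    (Illustrative sketch: the basis used in the published work differs.)"""
    ny, nx = field.shape
    # map pixel coordinates onto [-1, 1], the natural Chebyshev domain
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)
    # design matrix: one column per basis polynomial T_i(y) * T_j(x)
    A = C.chebvander2d(Y.ravel(), X.ravel(), [degree, degree])
    coeffs, *_ = np.linalg.lstsq(A, field.ravel(), rcond=None)
    return coeffs  # (degree + 1) ** 2 coefficients

# two synthetic 'measurement' fields, compared via their feature vectors
f1 = feature_vector(np.random.default_rng(0).normal(size=(64, 64)))
f2 = feature_vector(np.random.default_rng(1).normal(size=(64, 64)))
distance = np.linalg.norm(f1 - f2)  # small distance means similar datasets
```

The point of the representation is the compression: a 64 × 64 field (4096 values) is reduced to 25 coefficients, and datasets can then be compared by simple distances in that 25-dimensional space.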
Our technique is based on global ocean dynamics rather than on the small region of the Pacific Ocean that is usually used, and it has the added advantages of providing a confidence level on the assessment as well as enabling straightforward comparison of predictions and measurements. The comparison of predictions and measurements is a recurring theme in our current research, but I did not expect it to lead into ocean dynamics.
Image is Figure 11 from our recent paper, showing convex hulls fitted to the cloud of points representing the uncertainty intervals for the ocean temperature measurements for each month in 2002, using only the three most significant principal components. The lack of overlap between hulls can be interpreted as implying a significant difference in temperature between months.
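A minimal sketch of the overlap test described in the caption, assuming SciPy is available. It only checks whether points of one cloud fall inside the convex hull of the other, so it is a simplification of the full geometric comparison in the paper (it would miss hulls that cross without containing each other's points):

```python
import numpy as np
from scipy.spatial import Delaunay

def hulls_overlap(pts_a, pts_b):
    """Crude overlap test for the convex hulls of two point clouds in
    principal-component space: True if any point of one cloud lies
    inside the hull of the other."""
    in_b = Delaunay(pts_b).find_simplex(pts_a) >= 0  # -1 means outside
    in_a = Delaunay(pts_a).find_simplex(pts_b) >= 0
    return bool(in_b.any() or in_a.any())

# synthetic 'uncertainty clouds' for two months, using 3 principal components
rng = np.random.default_rng(0)
jan = rng.normal(0.0, 1.0, size=(50, 3))
jul = rng.normal(10.0, 1.0, size=(50, 3))  # well separated from jan

same = hulls_overlap(jan, jan + 0.1)  # nearly identical clouds overlap
diff = hulls_overlap(jan, jul)        # disjoint hulls: a significant difference
```

Under the interpretation in the caption, `diff` being `False` (no overlap) is what would be read as a significant temperature difference between the two months.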
Alexiadis A, Ferson S, Patterson EA. 2021. Transformation of measurement uncertainties into low-dimensional feature vector space. Royal Society Open Science, 8(3): 201086.
 Lampeas G, Pasialis V, Lin X, Patterson EA. 2015. On the validation of solid mechanics models using optical measurements and data decomposition. Simulation Modelling Practice and Theory 52, 92-107.
 Kang J, Jin R, Li X, Zhang Y. 2017, Block Kriging with measurement errors: a case study of the spatial prediction of soil moisture in the middle reaches of Heihe River Basin. IEEE Geoscience and Remote Sensing Letters, 14, 87-91.
 Gaillard F, Reynaud T, Thierry V, Kolodziejczyk N, von Schuckmann K. 2016. In situ-based reanalysis of the global ocean temperature and salinity with ISAS: variability of the heat content and steric height. J. Climate. 29, 1305-1323.
A couple of months ago I wrote about a set of credibility factors for computational models [see ‘Credible predictions for regulatory decision-making’ on December 9th, 2020] that we designed to inform interactions between researchers, model builders and decision-makers, and thereby establish trust in the predictions from computational models. This is important because computational modelling is becoming ubiquitous in the development of everything from automobiles and power stations to drugs and vaccines, which inevitably leads to its use in supporting regulatory applications. However, there is another motivation underpinning our work: the systems being modelled are becoming increasingly complex, with the likelihood that they will exhibit emergent behaviour [see ‘Emergent properties’ on September 16th, 2015], and this makes it increasingly unlikely that a reductionist approach to establishing model credibility will be successful. The reductionist approach to science, which was pioneered by Descartes and Newton, has served science well for hundreds of years and is based on the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts. It is the method of analysis that underpins almost everything you learn as an undergraduate engineer or physicist. However, reductionism loses its power when a system is more than the sum of its parts, i.e., when it exhibits emergent behaviour. Our approach to establishing model credibility is more holistic than traditional methods. This seems appropriate when modelling complex systems for which a complete knowledge of the relationships and patterns of behaviour may not be attainable, e.g., when unexpected or unexplainable emergent behaviour occurs. The hegemony of reductionism in science made us nervous about writing about its shortcomings four years ago when we first published our ideas about model credibility.
So, I was pleased to see a paper published last year that identified five fundamental properties of biology that weaken the power of reductionism, namely: (1) biological variation is widespread and persistent; (2) biological systems are relentlessly nonlinear; (3) biological systems contain redundancy; (4) biology consists of multiple systems interacting across different time and spatial scales; and (5) biological properties are emergent. Many engineered systems possess all five of these fundamental properties – you just need to look at them from the appropriate perspective, for example, through a microscope to see the variation in the microstructure of a mass-produced part. Hence, in the future, there will need to be an increasing emphasis on holistic approaches and systems thinking in the education and practices of engineers as well as biologists.
For more on emergence in computational modelling, see Manuel DeLanda, Philosophy and Simulation: The Emergence of Synthetic Reason, Continuum, London, 2011. And, for more on systems thinking, see Fritjof Capra and Pier Luigi Luisi, The Systems View of Life: A Unifying Vision, Cambridge University Press, 2014.
 Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.
Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology. Progress in Biophysics and Molecular Biology, 129: 13-19, 2017.
 Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.
 Pruett WA, Clemmer JS & Hester RL, Physiological Modeling and Simulation—Validation, Credibility, and Application. Annual Review of Biomedical Engineering, 22:185-206, 2020.
Last week I wrote about research we have been carrying out over the last decade that is being applied to large-scale structures in the aerospace industry (see ‘Slowly crossing the valley of death’ on January 27th, 2021). I also work on very much smaller ‘structures’ that are only tens of nanometres in diameter, or about a billion times smaller than the test samples in last week’s post (see ‘Toxic nanoparticles?’ on November 13th, 2013). The connection is the use of light to measure shape, deformation and motion, and then utilising the measurements to validate predictions from theoretical or computational models. About three years ago, we published research which demonstrated that the motion of very small particles (less than about 300 nanometres) at low concentrations (less than about a billion per millilitre) in a fluid was dominated by the molecules of the fluid rather than by interactions between the particles (see Coglitore et al, 2017 and ‘Slow moving nanoparticles’ on December 13th, 2017). These data confirmed results from earlier molecular dynamics simulations that contradicted predictions using the Stokes-Einstein equation, which was derived by Einstein in his PhD thesis for a ‘Stokes’ particle undergoing Brownian motion. The Stokes-Einstein equation works well for large particles, but the physics of motion changes when the particles are very small and far apart, so that van der Waals forces and electrostatic forces play a dominant role, as we have shown in a more recent paper (see Giorgi et al, 2019). This becomes relevant when evaluating nanoparticles as potential drug delivery systems or assessing the toxicological impact of nanoparticles. We have shown recently that instruments based on dynamic scattering of light from nanoparticles are likely to be inaccurate because they are based on fitting measurement data to the Stokes-Einstein equation.
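For reference, the Stokes-Einstein relation and its inversion, the step that dynamic light scattering instruments rely on, can be written out directly. The temperature and viscosity values below are illustrative defaults for water at room temperature, not values from our papers:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diffusivity(diameter_m, temperature_K=293.15,
                                viscosity_Pa_s=1.0e-3):
    """Diffusion coefficient of a spherical 'Stokes' particle:
    D = k_B * T / (3 * pi * mu * d)."""
    return K_B * temperature_K / (3.0 * math.pi * viscosity_Pa_s * diameter_m)

def dls_diameter_from_D(D, temperature_K=293.15, viscosity_Pa_s=1.0e-3):
    """Invert Stokes-Einstein to get a hydrodynamic diameter from a
    measured diffusion coefficient -- the fitting step that makes DLS
    instruments inaccurate when the equation itself no longer holds."""
    return K_B * temperature_K / (3.0 * math.pi * viscosity_Pa_s * D)

# a 100 nm particle in water at 20 degrees C
d = 100e-9
D = stokes_einstein_diffusivity(d)  # ~4.3e-12 m^2/s
d_back = dls_diameter_from_D(D)     # round trip recovers the diameter
```

The inversion is exact when the equation holds; the point of our findings is that for very small, dilute particles the measured diffusion no longer follows this equation, so the inferred diameter is systematically wrong.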
In a paper published last month, we found that asymmetric flow field flow fractionation (or AF4) in combination with dynamic light scattering, when used to measure the size of nanoparticles in suspension, tended to over-estimate the diameter of particles smaller than 60 nanometres at low concentrations by up to a factor of two (see Giorgi et al, 2021). Someone commented recently that our work in this area was not highly cited, but perhaps this is unsurprising when it undermines a current paradigm. We have certainly learnt to handle rejection letters, to redouble our efforts to demonstrate the rigour of our research, and to present conclusions in a manner that appears to build on existing knowledge rather than demolishing it.
Coglitore, D., Edwardson, S.P., Macko, P., Patterson, E.A. and Whelan, M., 2017. Transition from fractional to classical Stokes–Einstein behaviour in simple fluids. Royal Society Open Science, 4(12), p.170507.
Giorgi, F., Coglitore, D., Curran, J.M., Gilliland, D., Macko, P., Whelan, M., Worth, A. and Patterson, E.A., 2019. The influence of inter-particle forces on diffusion at the nanoscale. Scientific reports, 9(1), pp.1-6.
Giorgi, F., Curran, J.M., Gilliland, D., La Spina, R., Whelan, M.P. & Patterson, E.A. 2021, Limitations of nanoparticle size characterization by asymmetric flow field-flow fractionation coupled with online dynamic light scattering, Chromatographia, doi.org/10.1007/s10337-020-03997-7.
Image is a photograph of a fast-flowing mountain river taken in Yellowstone National Park during a road trip across the USA in 2006.