
Certainty is unattainable and near-certainty unaffordable

The economists John Kay and Mervyn King assert in their book ‘Radical Uncertainty: Decision-Making Beyond the Numbers’ that ‘economic forecasting is necessarily harder than weather forecasting’ because the world of economics is non-stationary whereas the weather is governed by unchanging laws of nature. Kay and King observe that both central banks and meteorological offices have ‘to convey inescapable uncertainty to people who crave unavailable certainty’. In other words, the necessary assumptions and idealisations, combined with the inaccuracies of the input data, of both economic and meteorological models produce inevitable uncertainty in the predictions. However, people seeking to make decisions based on the predictions want certainty because it is very difficult to make choices when faced with uncertainty – it raises our psychological entropy [see ‘Psychological entropy increased by ineffective leaders’ on February 10th, 2021].  Engineers face similar difficulties in providing systems with inescapable uncertainties to people desiring unavailable certainty in terms of reliability.  The second law of thermodynamics ensures that perfection is unattainable [see ‘Impossible perfection’ on June 5th, 2013] and there will always be flaws of some description present in a system [see ‘Scattering electrons reveal dislocations in material structure’ on November 11th, 2020].  Of course, we can expend more resources to eliminate flaws and increase the reliability of a system, but the second law will always limit our success. Consequently, to finish where I started with a quote from Kay and King, ‘certainty is unattainable and the price of near-certainty unaffordable’ in both economics and engineering.

Everything is flux but it’s not always been recognised

I am teaching thermodynamics to first year undergraduate students at the moment and in most previous years this experience has stimulated me to blog about thermodynamics [for example: ‘Isolated systems in nature?’ on February 12th, 2020].  However, this year I am more than half-way through the module and this is the first post on the topic.  Perhaps that is an impact of teaching on-line via live broadcasts rather than the performance involved in lecturing to hundreds of students in a lecture theatre.  Last week I introduced the second law of thermodynamics and explained its origins in efforts to improve the efficiency of steam engines by 19th century engineers and physicists, including Rudolf Clausius (1822 – 1888), William Thomson (1827 – 1907) and Ludwig Boltzmann (1844 – 1906).  The second law of thermodynamics states that the entropy of the universe increases during all real processes, where entropy can be described as the degree of disorder. The traditional narrative is that thermodynamics was developed by the Victorians; however, I think that the ancient Greeks had a pretty good understanding of it without calling it thermodynamics.  Heraclitus (c. 535 BCE – c. 475 BCE) understood that everything is in flux and nothing is at rest so that the world is one colossal process.  This concept comes close to the modern interpretation of the second law of thermodynamics in which the entropy in the universe is constantly increasing leading to continuous change.  Heraclitus just did not state the direction of flux.  Unfortunately, Plato (c. 429 BCE – c. 347 BCE) did not agree with Heraclitus, but thought that some divine intervention had imposed order on pre-existing chaos to create an ordered universe, which precludes a constant flux and probably set back Western thought for a couple of millennia.
However, it seems likely that in the 17th century, Newton (1643 – 1727) and Leibniz (1646 – 1716), when they independently invented calculus, had more than an inkling about everything being in flux.  In the 18th century, the pioneering geologist James Hutton (1726 – 1797), while examining the tilted layers of the cliff at Siccar Point in Berwickshire, realised that the Earth was not simply created but instead is in a state of constant flux.  His ideas were spurned at the time and he was accused of atheism.  Boltzmann also had to vigorously defend his ideas to such an extent that his mental health deteriorated and he committed suicide while on vacation with his wife and daughter.  Today, it is widely accepted that the second law of thermodynamics governs all natural and synthetic processes, and many people have heard of entropy [see ‘Entropy on the brain’ on November 29th, 2017] but far fewer understand it [see ‘Two cultures’ on March 5th, 2013].  It is perhaps still controversial to talk about the theoretical long-term consequence of the second law, which is cosmic heat death corresponding to an equilibrium state of maximum entropy and uniform temperature across the universe such that nothing happens and life cannot exist [see ‘Will it all be over soon?’ on November 2nd, 2016].  This concept caused problems to 19th century thinkers, particularly James Clerk Maxwell (1831 – 1879), and even perhaps to Plato who theorised two worlds in his theory of forms, one unchanging and the other in constant change, maybe in an effort to dodge the potential implications of degeneration of the universe into chaos.
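Boltzmann's statistical view makes the ‘degree of disorder’ description of entropy concrete: S = k ln W, where W counts the microstates consistent with a macroscopic state. A minimal toy calculation (my own illustration, not part of the historical account) shows why spreading out corresponds to higher entropy:

```python
import math

# Boltzmann's entropy, S = k_B ln W, where W is the number of microstates
# consistent with a macroscopic state.
def boltzmann_entropy(W, k_B=1.380649e-23):
    return k_B * math.log(W)

# Toy model: N distinguishable particles in a box divided into two halves.
# The number of microstates with n particles in the left half is C(N, n).
N = 100
ordered = math.comb(N, 0)     # all particles in one half: a single microstate
mixed = math.comb(N, N // 2)  # particles spread evenly: vastly more microstates

# Spreading out (increasing disorder) raises the entropy, in line with
# the direction of change imposed by the second law.
print(boltzmann_entropy(mixed) > boltzmann_entropy(ordered))  # True
```

The perfectly ordered arrangement has exactly one microstate, so its entropy is zero, while the evenly mixed arrangement has an astronomically large W; left to itself, the system overwhelmingly tends towards the mixed state.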

Image: decaying ruins of Fountains Abbey beside the River Skell.  Heraclitus is reported to have said ‘no man ever steps twice into the same river; for it’s not the same river and he’s not the same man’.

An upside to lockdown

While pandemic lockdowns and travel bans are having a severe impact on spontaneity and creativity in research [see ‘Lacking creativity‘ on October 28th, 2020], they have induced a high level of ingenuity to achieve the final objective of the DIMES project, which is to conduct prototype demonstrations and evaluation tests of the DIMES integrated measurement system.  We have gone beyond the project brief by developing a remote installation system that allows local engineers at a test site to successfully set up and run our measurement system. This has saved thousands of airmiles and several tonnes of CO2 emissions as well as hours waiting in airport terminals and sitting in planes.  These savings were made by members of our project team working remotely from their bases in Chesterfield, Liverpool, Ulm and Zurich instead of flying to the test site in Toulouse to perform the installation in a section of a fuselage, and then visiting a second time to conduct the evaluation tests.  For this first remote installation, we were fortunate to have our collaborator from Airbus available to support us [see ‘Most valued player performs remote installation‘ on December 2nd, 2020].  We are about to stretch our capabilities further by conducting a remote installation and evaluation test during a full-scale aircraft test at the Aerospace Research Centre of the National Research Council Canada in Ottawa, Canada with a team who had never seen the DIMES system and knew nothing about it until about a month ago.  I could claim that this remote installation and test will save another couple of tonnes of CO2; but, in practice, we would probably not be performing a demonstration in Canada if we had not developed the remote installation capability.

The University of Liverpool is the coordinator of the DIMES project and the other partners are Empa, Dantec Dynamics GmbH and Strain Solutions Ltd.  Airbus is our topic manager.

The DIMES project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 820951.  The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.

 

From strain measurements to assessing El Niño events

One of the exciting aspects of leading a university research group is that you can never be quite sure where the research is going next.  We published a nice example of this unpredictability last week in Royal Society Open Science in a paper called ‘Transformation of measurement uncertainties into low-dimensional feature vector space‘ [1].  While the title is an accurate description of the contents, it does not give much away and certainly does not reveal that we proposed a new method for assessing the occurrence of El Niño events.  For some time we have been working with massive datasets of measurements from arrays of sensors and representing them by fitting polynomials in a process known as image decomposition [see ‘Recognising strain‘ on October 28th, 2015]. The relatively small number of coefficients from these polynomials can be collated into a feature vector which facilitates comparison with other datasets [see for example, ‘Out of the valley of death into a hype cycle‘ on February 24th, 2021].  Our recent paper provides a solution to the issue of representing the measurement uncertainty in the same space as the feature vector, which is roughly what we set out to do.  We demonstrated our new method for representing the measurement uncertainty by calibrating and validating a computational model of a simple beam in bending using data from an earlier study in an EU-funded project called VANESSA [2] — so no surprises there.  However, then my co-author and PhD student, Antonis Alexiadis, went looking for other interesting datasets with which to demonstrate the new method.  He found a set of spatially-varying uncertainties associated with a metamodel of soil moisture in a river basin in China [3] and global oceanographic temperature fields collected monthly over 11 years from 2002 to 2012 [4].  We used the latter set of data to develop a new technique for assessing the occurrence of El Niño events in the Pacific Ocean.
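For readers curious how image decomposition works in practice, here is a minimal sketch, using a 2D Chebyshev polynomial fit whose coefficients become the feature vector. It is my own illustration rather than the implementation used in the paper, and the function name and parameters are chosen for this example only:

```python
import numpy as np

# Illustrative sketch of image decomposition: fit a full-field measurement
# with a 2D Chebyshev polynomial and keep the coefficients as a compact
# feature vector describing the whole field.
def feature_vector(field, degree=4):
    ny, nx = field.shape
    x = np.linspace(-1, 1, nx)
    y = np.linspace(-1, 1, ny)
    X, Y = np.meshgrid(x, y)
    # Vandermonde-style basis matrix for the 2D Chebyshev series
    A = np.polynomial.chebyshev.chebvander2d(X.ravel(), Y.ravel(), [degree, degree])
    coeffs, *_ = np.linalg.lstsq(A, field.ravel(), rcond=None)
    return coeffs  # (degree + 1)**2 numbers summarising 2500 data points

# Two simulated 'measurements' of the same underlying field, differing only
# by noise, yield nearby feature vectors and so are easy to compare.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
base = np.outer(np.sin(2 * x), np.cos(3 * x))
v1 = feature_vector(base + 0.01 * rng.standard_normal((50, 50)))
v2 = feature_vector(base + 0.01 * rng.standard_normal((50, 50)))
print(np.linalg.norm(v1 - v2))  # small distance between feature vectors
```

The appeal of the approach is the compression: a 50 × 50 field of measurements is reduced to 25 coefficients, and comparing two datasets becomes a comparison of two short vectors.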
Our technique is based on global ocean dynamics rather than on the small region in the Pacific Ocean which is usually used and has the added advantages of providing a confidence level on the assessment as well as enabling straightforward comparisons of predictions and measurements.  The comparison of predictions and measurements is a recurring theme in our current research but I did not expect it to lead into ocean dynamics.
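The hull-based comparison in Figure 11 can be sketched as follows. This is a hypothetical illustration using SciPy, not the code from the paper, and the overlap test is deliberately simplified: each month's uncertainty cloud in the reduced principal-component space is wrapped in a convex hull, and a lack of overlap is read as a significant difference between months.

```python
import numpy as np
from scipy.spatial import Delaunay

# Illustrative sketch: each month's measurement-uncertainty cloud, reduced
# to its three most significant principal components, is wrapped in a
# convex hull; non-overlapping hulls suggest a significant difference.
def hulls_overlap(points_a, points_b):
    """Approximate test: True if any vertex of one cloud lies inside the
    other's convex hull.  (Hulls can intersect without containing each
    other's vertices, so this simplifies a full intersection test.)"""
    hull_a, hull_b = Delaunay(points_a), Delaunay(points_b)
    return bool((hull_a.find_simplex(points_b) >= 0).any()
                or (hull_b.find_simplex(points_a) >= 0).any())

rng = np.random.default_rng(1)
january = rng.normal(loc=0.0, scale=0.1, size=(30, 3))  # synthetic clouds
july = rng.normal(loc=1.0, scale=0.1, size=(30, 3))
print(hulls_overlap(january, july))  # well-separated clouds do not overlap
```

Working in the reduced three-component space keeps the hulls cheap to construct and easy to visualise, which is what makes the month-by-month comparison in the figure practical.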

Image is Figure 11 from [1] showing convex hulls fitted to the cloud of points representing the uncertainty intervals for the ocean temperature measurements for each month in 2002 using only the three most significant principal components. The lack of overlap between hulls can be interpreted as implying a significant difference in the temperature between months.

References:

[1] Alexiadis A, Ferson S, Patterson EA. 2021. Transformation of measurement uncertainties into low-dimensional feature vector space. Royal Society Open Science, 8(3): 201086.

[2] Lampeas G, Pasialis V, Lin X, Patterson EA. 2015.  On the validation of solid mechanics models using optical measurements and data decomposition. Simulation Modelling Practice and Theory 52, 92-107.

[3] Kang J, Jin R, Li X, Zhang Y. 2017. Block Kriging with measurement errors: a case study of the spatial prediction of soil moisture in the middle reaches of Heihe River Basin. IEEE Geoscience and Remote Sensing Letters, 14, 87-91.

[4] Gaillard F, Reynaud T, Thierry V, Kolodziejczyk N, von Schuckmann K. 2016. In situ-based reanalysis of the global ocean temperature and salinity with ISAS: variability of the heat content and steric height. J. Climate. 29, 1305-1323.