
Industrial uncertainty

Last month I spent almost a week in Zurich.  It is one of our favourite European cities [see ‘A reflection of existentialism’ on December 20th, 2017]; however, on this occasion there was no time for sight-seeing because I was there for the mid-term meeting of the MOTIVATE project and to conduct some tests and demonstrations in the laboratories of our host, EMPA, the Swiss Federal Laboratories for Materials Science and Technology.  Two of our project partners, Dantec Dynamics GmbH based in Ulm, Germany, and the Athena Research Centre in Patras, Greece, have developed methods for quantifying the uncertainty present in measurements of deformation made in an industrial environment using digital image correlation (DIC) [see ‘256 shades of grey’ on January 22nd, 2014].  Digital image correlation is a technique in which we usually apply a random speckle pattern to the object, which allows us to track the movement of its surface over time by searching for the new positions of the speckles in successive photographs.  If we use a pair of cameras in a stereoscopic arrangement, then we can measure both in-plane and out-of-plane displacements.  Digital image correlation is a well-established measurement technique that has become ubiquitous in mechanics laboratories.  In previous EU projects, we developed technology for quantifying uncertainty in in-plane [SPOTS project] and out-of-plane [ADVISE project] measurements in a laboratory environment.  However, when you take digital image correlation equipment into an industrial environment, for instance an aircraft hangar to make measurements during a full-scale test, additional sources of uncertainty and error appear.  The new technology demonstrated last month allows these additional uncertainties to be quantified.  As part of the MOTIVATE project, we will be involved in a full-scale test on a large section of an Airbus aircraft next year, and so we will be able to use the new technology for the first time.
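
For readers curious about what happens inside a DIC analysis, here is a minimal sketch of the subset-tracking idea in Python (using only numpy): a small subset of the speckle pattern from the reference image is compared, via zero-normalised cross-correlation, with candidate subsets in the deformed image, and the offset that gives the best match is taken as the local displacement.  The function names, subset size and search range below are purely illustrative assumptions on my part, not the algorithm implemented in the commercial or project software mentioned above; real DIC codes add sub-pixel interpolation, shape functions and, in the MOTIVATE context, estimates of measurement uncertainty.

```python
import numpy as np

def zncc(a, b):
    """Zero-normalised cross-correlation between two equally-sized subsets."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_subset(ref_image, def_image, centre, half_size=15, search=10):
    """Find the integer-pixel displacement of a speckle subset between two images.

    ref_image, def_image : 2-D greyscale arrays (reference and deformed states)
    centre               : (row, col) of the subset centre in the reference image
    half_size            : subset half-width in pixels
    search               : maximum displacement searched in each direction
    """
    r, c = centre
    subset = ref_image[r - half_size:r + half_size + 1,
                       c - half_size:c + half_size + 1]
    best_score, best_disp = -np.inf, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            candidate = def_image[r + dr - half_size:r + dr + half_size + 1,
                                  c + dc - half_size:c + dc + half_size + 1]
            if candidate.shape != subset.shape:
                continue  # candidate subset runs off the edge of the image
            score = zncc(subset, candidate)
            if score > best_score:
                best_score, best_disp = score, (dr, dc)
    return best_disp, best_score

# Synthetic check: shift a random speckle pattern by (3, -2) pixels and recover it
rng = np.random.default_rng(0)
reference = rng.random((200, 200))
deformed = np.roll(reference, shift=(3, -2), axis=(0, 1))
print(track_subset(reference, deformed, centre=(100, 100)))  # ((3, -2), ~1.0)
```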

The photograph shows preparations for the demonstrations in EMPA’s laboratories.  In the foreground is a stereoscopic digital image correlation system set up to measure the deformation of a section of aircraft skin, supplied by Airbus, which has a speckle pattern on its surface and is about to be loaded in compression by the large servo-hydraulic test machine.

References:

From SPOTS project:

Patterson EA, Hack E, Brailly P, Burguete RL, Saleem Q, Seibert T, Tomlinson RA & Whelan M, Calibration and evaluation of optical systems for full-field strain measurement, Optics and Lasers in Engineering, 45(5):550-564, 2007.

Whelan MP, Albrecht D, Hack E & Patterson EA, Calibration of a speckle interferometry full-field strain measurement system, Strain, 44(2):180-190, 2008.

From ADVISE project:

Hack E, Lin X, Patterson EA & Sebastian CM, A reference material for establishing uncertainties in full-field displacement measurements, Measurement Science and Technology, 26:075004, 2015.

Epistemic triage

A couple of weeks ago I wrote about epistemic dependence and the idea that we need to trust experts because we are unable to verify everything ourselves: life is too short and there are too many things to think about.  However, this approach exposes us to the risk of being misled, and Julian Baggini has suggested that this risk is increasing with the growth of psychology, which has allowed more people to master methods of manipulating us and has led to ‘a kind of arms race of deception in which truth is the main casualty’.  He suggests that when we are presented with new information we should perform an epistemic triage by asking:

  • Is this a domain in which anyone can speak the truth?
  • What kind of expert is a trustworthy source of truth in that domain?
  • Is a particular expert to be trusted?

The deluge of information, which streams in front of our eyes when we look at the screens of our phones, computers and televisions, seems to leave most of us grasping for a hold on reality.  Perhaps we should treat it all as fiction until we have performed Baggini’s triage, at least on the sources of the information streams, if not also on the individual items of information.

Source:

Julian Baggini, A short history of truth: consolations for a post-truth world, London: Quercus Editions Ltd, 2017.

Establishing fidelity and credibility in tests & simulations (FACTS)

A month or so ago I gave a lecture entitled ‘Establishing FACTS (Fidelity And Credibility in Tests & Simulations)’ to the local branch of the Institution of Engineering and Technology (IET).  Of course, my title was a play on words because the Oxford English Dictionary defines a ‘fact’ as ‘a thing that is known or proved to be true’ or ‘information used as evidence or as part of a report’.  One of my current research interests is how we establish predictions from simulations as evidence that can be used reliably in decision-making.  This is important because simulations based on computational models have become ubiquitous in engineering for, amongst other things, design optimisation and evaluation of structural integrity.  These models need to possess the appropriate level of fidelity and to be credible in the eyes of decision-makers, not just their creators.  Model credibility is usually provided through validation processes using a small number of physical tests that must yield a large quantity of reliable and relevant data [see ‘Getting smarter’ on June 21st, 2017].  Reliable and relevant data means making measurements with low levels of uncertainty under real-world conditions, which is usually challenging.
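
As a toy illustration of that last point, the sketch below (in Python; the function name, the gauge values and the uncertainty are invented for illustration and are not taken from any of the projects or standards referred to here) shows perhaps the simplest possible credibility check: asking whether the differences between predicted and measured values lie within the expanded uncertainty of the measurements.  Formal validation metrics are considerably more sophisticated, but they rest on the same principle that a comparison is only as meaningful as the measurement uncertainty behind it.

```python
import numpy as np

def within_uncertainty(predicted, measured, expanded_uncertainty):
    """Flag locations where the prediction lies inside the measurement uncertainty band.

    predicted, measured  : values at matched locations (same units)
    expanded_uncertainty : expanded uncertainty of the measurements (same units)
    """
    residuals = np.abs(np.asarray(predicted) - np.asarray(measured))
    return residuals <= expanded_uncertainty

# Invented example: strains in microstrain at three locations on a test article
agreement = within_uncertainty(predicted=[510.0, 742.0, 380.0],
                               measured=[498.0, 760.0, 377.0],
                               expanded_uncertainty=25.0)
print(agreement)  # [ True  True  True]
```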

These topics recur through much of my research and have found applications in aerospace engineering, nuclear engineering and biology. My lecture to the IET gave an overview of these ideas using applications from each of these fields, some of which I have described in past posts.  So, I have now created a new page on this blog with a catalogue of these past posts on the theme of ‘FACTS‘.  Feel free to have a browse!

Tyranny of quantification

There is a growing feeling that our use of metrics is doing more harm than good.  My title today is a mis-quote from Rebecca Solnit; she actually said ‘tyranny of the quantifiable’, or perhaps it is a combination of her quote and the title of a new book by Jerry Muller, ‘The Tyranny of Metrics’, which was reviewed in the FT Weekend on 27/28 January 2018 by Tim Harford, who recently published a book called ‘Messy’ that dealt with, amongst other things, similar issues.

I wrote ‘growing feeling’ and then almost fell into the trap of attempting to quantify the feeling by providing you with some evidence; but I stopped short of trying to assign any numbers to the feeling and its growth, which would have been illogical since the definition of a feeling is ‘an emotional state or reaction, an idea or belief, especially a vague or irrational one’.

Harford puts it slightly differently: that ‘many of us have a vague sense that metrics are leading us astray, stripping away context, devaluing subtle human judgment‘.  Advances in sensors and the ubiquity of computing power allow vast amounts of data to be acquired and processed into metrics that can be ranked and used to make and justify decisions.  Data, and consequently empiricism, is king.  Rationalism has been cast out into the wilderness.  Like Muller, I am not suggesting that metrics are useless, but that they are only one tool in decision-making and that they need to be used by those with relevant expertise and experience in order to avoid unexpected consequences.

To quote Muller: ‘measurement is not an alternative to judgement: measurement demands judgement – judgement about whether to measure, what to measure, how to evaluate the significance of what’s been measured, whether rewards and penalties will be attached to the results, and to whom to make the measurements available‘.

Sources:

Lunch with the FT – Rebecca Solnit by Rana Foroohar in FT Weekend 10/11 February 2018

Desperate measures by Tim Harford in FT Weekend 27/28 January 2018

Muller JZ, The Tyranny of Metrics, Princeton NJ: Princeton University Press, 2018.

Image: http://maxpixel.freegreatpicture.com/Measurement-Stopwatch-Timer-Clock-Symbol-Icon-2624277