Tag Archives: uncertainty

On the trustworthiness of multi-physics models

I stayed in Sheffield city centre a few weeks ago and walked past the standard measures in the photograph on my way to speak at a workshop.  In the past, when the cutlery and tool-making industry in Sheffield was focussed around small workshops, or little mesters, as they were known, these standards would have been used to check the tools being manufactured.  A few hundred years later, the range of standards in existence has extended far beyond the weights and measures where it started, and now includes standards for processes and artefacts as well as for measurements.  The process of validating computational models of engineering infrastructure is moving slowly towards establishing an internationally recognised standard [see two of my earliest posts: ‘Model validation‘ on September 18th, 2012 and ‘Setting standards‘ on January 29th, 2014].  We have guidelines that recommend approaches for different parts of the validation process [see ‘Setting standards‘ on January 29th, 2014]; however, many types of computational model present significant challenges when establishing their reliability [see ‘Spatial-temporal models of protein structures‘ on March 27th, 2019].  Under the auspices of the MOTIVATE project, we are gathering experts in Zurich on November 5th, 2019 to discuss the challenges of validating multi-physics models, establishing credibility and the future use of data from experiments.  It is the fourth in a series of workshops held previously in Shanghai, London and Munich.  For more information and to register follow this link. Come and join our discussions in one of my favourite cities where we will be following ‘In Einstein’s footprints‘ [posted on February 27th, 2019].

The MOTIVATE project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 754660.

Nuclear winter school

I spent the first full week of January 2019 at a Winter School for a pair of Centres for Doctoral Training focussed on Nuclear Energy (see NGN CDT & ICO CDT).  Together, the two centres involve eight UK universities and most of the key players in the UK industry.  So, the Winter School offers an opportunity for researchers in nuclear science and engineering, from academia and industry, to gather together for a week and share their knowledge and experience with more than 80 PhD students.  Each student gives a report on the progress of their research to the whole gathering, as either a short oral presentation or a poster.  It’s an exhausting but stimulating week for everyone due to both the packed programme and the range of subjects covered, from fundamental science through to large-scale engineering and socio-economic issues.

Here are a few things that caught my eye:

First, the images in the thumbnail above, which Paul Cosgrove from the University of Cambridge used to introduce his talk on modelling thermal and neutron fluxes.  They could be from an art gallery, but actually they are from the VTT Technical Research Centre of Finland and show the geometry of an advanced test reactor [ATR] (top); the rate of collisions in the ATR (middle); and the neutron density distribution (bottom).

Second, a great app for your phone called electricityMap that shows you a live map of global carbon emissions; when you click on a country, it reveals the sources of its electricity by type, i.e. nuclear, gas, wind, etc., as well as its imports and exports of electricity.  Dame Sue Ion told us about it during her keynote lecture.  I think all politicians and journalists need it installed on their phones so that they can check their facts before they start talking about energy policy.

Third, the scale of the concrete infrastructure required in current designs of nuclear power stations compared to the reactor vessel where the energy is generated.  The pictures show the construction site for the Vogtle nuclear power station in Georgia, USA (left) and the reactor pressure vessel being lowered into position (right).  The scale of nuclear power stations was one of the reasons highlighted by Steve Smith from Algometrics for why investors are not showing much interest in them (see ‘Small is beautiful and affordable in nuclear power-stations‘ on January 14th, 2015).  Amongst the other reasons are: too expensive (about £25 billion), too long to build (often decades), too back-end loaded (i.e. no revenue until complete), too complicated (legally, economically & socially), too uncertain politically, too toxic due to poor track record of returns to investors, too opaque in terms of management of the industry.  That’s quite a few challenges for the next generation of nuclear scientists and engineers to tackle.  We are making a start by creating design tools that will enable mass-production of nuclear power stations (see ‘Enabling or disruptive technology for nuclear engineering?‘ on January 28th, 2015) following the processes used to produce other massive engineering structures, such as the Airbus A380 (see Integrated Digital Nuclear Design Programme); but the nuclear industry has to move fast to catch up with other sectors of the energy business, such as gas-fired power stations or wind turbines.  If it were to succeed then the energy market would be massively transformed.


Industrial uncertainty

Last month I spent almost a week in Zurich.  It is one of our favourite European cities [see ‘A reflection of existentialism‘ on December 20th, 2017]; however, on this occasion there was no time for sight-seeing because I was there for the mid-term meeting of the MOTIVATE project and to conduct some tests and demonstrations in the laboratories of our host, EMPA, the Swiss Federal Laboratories for Materials Science and Technology.  Two of our project partners, Dantec Dynamics GmbH based in Ulm, Germany, and the Athena Research Centre in Patras, Greece, have developed methods for quantifying the uncertainty present in measurements of deformation made in an industrial environment using digital image correlation (DIC) [see ‘256 shades of grey‘ on January 22nd, 2014].  Digital image correlation is a technique in which we usually apply a random speckle pattern to the surface of an object and then track the movement of that surface over time by searching for the new positions of the speckles in successive photographs.  If we use a pair of cameras in a stereoscopic arrangement, then we can measure both in-plane and out-of-plane displacements.  Digital image correlation is a well-established measurement technique that has become ubiquitous in mechanics laboratories.  In previous EU projects, we have developed technology for quantifying uncertainty in in-plane [SPOTS project] and out-of-plane [ADVISE project] measurements in a laboratory environment.  However, when you take digital image correlation equipment into an industrial environment, for instance an aircraft hangar to make measurements during a full-scale test, additional sources of uncertainty and error appear.  The new technology demonstrated last month allows these additional uncertainties to be quantified.  As part of the MOTIVATE project, we will be involved in a full-scale test on a large section of an Airbus aircraft next year, so we will be able to use the new technology for the first time.
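
To make the principle concrete, here is a minimal, illustrative sketch in Python of the subset-tracking step at the heart of two-dimensional digital image correlation: a small speckle subset from a reference image is located in the deformed image by normalised cross-correlation, and the shift of the correlation peak gives the displacement.  This is not the software used in MOTIVATE or by our partners; the function name, subset size and synthetic images are assumptions chosen only for illustration.

```python
# Minimal sketch of 2D DIC subset tracking (illustrative only).
import numpy as np
from skimage.feature import match_template

def track_subset(ref_image, def_image, centre, half_size=15):
    """Return the (dy, dx) displacement of a speckle subset centred at
    `centre` (row, col) in `ref_image`, located in `def_image`."""
    r, c = centre
    # Extract the reference subset (template) around the chosen point
    template = ref_image[r - half_size : r + half_size + 1,
                         c - half_size : c + half_size + 1]
    # Correlate the template with the deformed image; the peak marks the match
    corr = match_template(def_image, template, pad_input=True)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Displacement is the shift of the subset centre between the two images
    return peak[0] - r, peak[1] - c

# Example usage with a synthetic speckle pattern shifted by a known amount
rng = np.random.default_rng(0)
ref = rng.random((200, 200))
deformed = np.roll(ref, shift=(3, -2), axis=(0, 1))  # rigid shift of (3, -2)
print(track_subset(ref, deformed, centre=(100, 100)))  # expect (3, -2)
```

A practical DIC system would, of course, refine the peak to sub-pixel resolution, repeat the search over a grid of subsets to build a full displacement field and, in a stereoscopic arrangement, triangulate between the two camera views to recover out-of-plane motion.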

The photograph shows preparations for the demonstrations in EMPA’s laboratories.  In the foreground is a stereoscopic digital image correlation system set up to measure the deformation of a section of aircraft skin, supplied by Airbus, which has a speckle pattern on its surface and is about to be loaded in compression by the large servo-hydraulic test machine.

References:

From SPOTS project:

Patterson EA, Hack E, Brailly P, Burguete RL, Saleem Q, Seibert T, Tomlinson RA & Whelan M, Calibration and evaluation of optical systems for full-field strain measurement, Optics and Lasers in Engineering, 45(5):550-564, 2007.

Whelan MP, Albrecht D, Hack E & Patterson EA, Calibration of a speckle interferometry full-field strain measurement system, Strain, 44(2):180-190, 2008.

From ADVISE project:

Hack E, Lin X, Patterson EA & Sebastian CM, A reference material for establishing uncertainties in full-field displacement measurements, Measurement Science and Technology, 26:075004, 2015.

Epistemic triage

A couple of weeks ago I wrote about epistemic dependence and the idea that we need to trust experts because we are unable to verify everything ourselves; life is too short and there are too many things to think about.  However, this approach exposes us to the risk of being misled.  Julian Baggini has suggested that this risk is increasing with the growth of psychology, which has allowed more people to master methods of manipulating us, leading to ‘a kind of arms race of deception in which truth is the main casualty.’  He suggests that when we are presented with new information we should perform an epistemic triage by asking:

  • Is this a domain in which anyone can speak the truth?
  • What kind of expert is a trustworthy source of truth in that domain?
  • Is a particular expert to be trusted?

The deluge of information that streams in front of our eyes when we look at the screens of our phones, computers and televisions seems to leave most of us grasping for a hold on reality.  Perhaps we should treat it all as fiction until we have performed Baggini’s triage, at least on the sources of the information streams, if not also on the individual items of information.

Source:

Julian Baggini, A short history of truth: consolations for a post-truth world, London: Quercus Editions Ltd, 2017.