Tag Archives: uncertainty

Nuclear winter school

I spent the first full week of January 2019 at a Winter School for a pair of Centres for Doctoral Training focussed on Nuclear Energy (see NGN CDT & ICO CDT).  Together the two centres involve eight UK universities and most of the key players in the UK industry.  So, the Winter School offers an opportunity for researchers in nuclear science and engineering, from academia and industry, to gather for a week and share their knowledge and experience with more than 80 PhD students.  Each student gives a report on the progress of their research to the whole gathering, as either a short oral presentation or a poster.  It’s an exhausting but stimulating week for everyone due to both the packed programme and the range of subjects covered, from fundamental science through to large-scale engineering and socio-economic issues.

Here are a few things that caught my eye:

First, the images in the thumbnail above, which Paul Cosgrove from the University of Cambridge used to introduce his talk on modelling thermal and neutron fluxes.  They could be from an art gallery, but actually they are from the VTT Technical Research Centre of Finland and show the geometry of an advanced test reactor [ATR] (top); the rate of collisions in the ATR (middle); and the neutron density distribution (bottom).

Second, a great app for your phone called electricityMap that shows you a live map of global carbon emissions; when you click on a country, it reveals the sources of electricity by type, i.e. nuclear, gas, wind, etc., as well as imports and exports of electricity.  Dame Sue Ion told us about it during her keynote lecture.  I think all politicians and journalists need it installed on their phones to check their facts before they start talking about energy policy.

Third, the scale of the concrete infrastructure required in current designs of nuclear power stations compared to the reactor vessel where the energy is generated.  The pictures show the construction site for the Vogtle nuclear power station in Georgia, USA (left) and the reactor pressure vessel being lowered into position (right).  The scale of nuclear power stations was one of the reasons highlighted by Steve Smith from Algometrics for why investors are not showing much interest in them (see ‘Small is beautiful and affordable in nuclear power-stations‘ on January 14th, 2015).  Amongst the other reasons are: too expensive (about £25 billion), too long to build (often decades), too back-end loaded (i.e. no revenue until complete), too complicated (legally, economically & socially), too uncertain politically, too toxic due to a poor track record of returns to investors, and too opaque in terms of management of the industry.  That’s quite a few challenges for the next generation of nuclear scientists and engineers to tackle.  We are making a start by creating design tools that will enable mass-production of nuclear power stations (see ‘Enabling or disruptive technology for nuclear engineering?‘ on January 28th, 2015), following the processes used to produce other massive engineering structures, such as the Airbus A380 (see Integrated Digital Nuclear Design Programme); but the nuclear industry has to move fast to catch up with other sectors of the energy business, such as gas-fired power stations or wind turbines.  If this approach were to succeed, the energy market would be massively transformed.

 

Industrial uncertainty

Last month I spent almost a week in Zurich.  It is one of our favourite European cities [see ‘A reflection of existentialism‘ on December 20th, 2017]; however, on this occasion there was no time for sight-seeing because I was there for the mid-term meeting of the MOTIVATE project and to conduct some tests and demonstrations in the laboratories of our host, EMPA, the Swiss Federal Laboratories for Materials Science and Technology.  Two of our project partners, Dantec Dynamics GmbH based in Ulm, Germany, and the Athena Research Centre in Patras, Greece, have developed methods for quantifying the uncertainty present in measurements of deformation made in an industrial environment using digital image correlation (DIC) [see ‘256 shades of grey‘ on January 22nd, 2014].  Digital image correlation is a technique in which we usually apply a random speckle pattern to the surface of the object, which allows us to track the movement of that surface over time by searching for the new positions of the speckles in successive photographs.  If we use a pair of cameras in a stereoscopic arrangement, then we can measure both in-plane and out-of-plane displacements.  Digital image correlation is a well-established measurement technique that has become ubiquitous in mechanics laboratories.  In previous EU projects, we have developed technology for quantifying uncertainty in in-plane [SPOTS project] and out-of-plane [ADVISE project] measurements in a laboratory environment.  However, when you take digital image correlation equipment into an industrial environment, for instance an aircraft hangar to make measurements during a full-scale test, additional sources of uncertainty and error appear.  The new technology demonstrated last month allows these additional uncertainties to be quantified.  As part of the MOTIVATE project, we will be involved in a full-scale test on a large section of an Airbus aircraft next year and so we will be able to utilise the new technology for the first time.
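For readers unfamiliar with how speckle tracking works, here is a minimal sketch of the subset-matching idea at the heart of digital image correlation, written in Python with NumPy only.  It is purely illustrative and is not the software used in the MOTIVATE, SPOTS or ADVISE projects: real DIC codes add sub-pixel interpolation, subset shape functions and, for stereoscopic systems, camera calibration, none of which appear here, and the function name and parameters are invented for the example.

```python
# Illustrative sketch only: integer-pixel subset tracking by zero-normalised
# cross-correlation, the basic operation behind digital image correlation (DIC).
import numpy as np

def track_subset(ref_img, def_img, centre, subset=21, search=10):
    """Find the integer-pixel displacement of a square speckle subset.

    ref_img, def_img : 2D arrays of grey levels (reference and deformed images)
    centre           : (row, col) of the subset centre in the reference image
    subset           : subset size in pixels (odd number)
    search           : half-width of the search window in the deformed image
    """
    h = subset // 2
    r, c = centre
    template = ref_img[r - h:r + h + 1, c - h:c + h + 1].astype(float)
    t = template - template.mean()

    best_score, best_dv, best_du = -np.inf, 0, 0
    # Exhaustive search over integer displacements within the search window
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            patch = def_img[r + dv - h:r + dv + h + 1,
                            c + du - h:c + du + h + 1].astype(float)
            p = patch - patch.mean()
            denom = np.sqrt((t ** 2).sum() * (p ** 2).sum())
            if denom == 0:
                continue
            score = (t * p).sum() / denom  # zero-normalised cross-correlation
            if score > best_score:
                best_score, best_dv, best_du = score, dv, du
    return best_dv, best_du, best_score

# Synthetic example: shift a random speckle pattern by a known amount and recover it
rng = np.random.default_rng(0)
ref = rng.random((200, 200))
deformed = np.roll(ref, shift=(3, -2), axis=(0, 1))  # rigid shift of (3, -2) pixels
print(track_subset(ref, deformed, centre=(100, 100)))  # expect (3, -2, ~1.0)
```

The synthetic example at the end shifts a random speckle image by a known amount and recovers that shift, which is the integer-pixel core of what a full DIC system repeats many thousands of times per image pair; quantifying the uncertainty in those recovered displacements, especially outside the laboratory, is where the work described above comes in.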

The photograph shows preparations for the demonstrations in EMPA’s laboratories.  In the foreground is a stereoscopic digital image correlation system with which we are about to measure the deformation of a section of aircraft skin, supplied by Airbus, that has a speckle pattern on its surface and is about to be loaded in compression by the large servo-hydraulic test machine.

References:

From SPOTS project:

Patterson EA, Hack E, Brailly P, Burguete RL, Saleem Q, Seibert T, Tomlinson RA & Whelan M, Calibration and evaluation of optical systems for full-field strain measurement, Optics and Lasers in Engineering, 45(5):550-564, 2007.

Whelan MP, Albrecht D, Hack E & Patterson EA, Calibration of a speckle interferometry full-field strain measurement system, Strain, 44(2):180-190, 2008.

From ADVISE project:

Hack E, Lin X, Patterson EA & Sebastian CM, A reference material for establishing uncertainties in full-field displacement measurements, Measurement Science and Technology, 26:075004, 2015.

Epistemic triage

A couple of weeks ago I wrote about epistemic dependence and the idea that we need to trust experts because we are unable to verify everything ourselves: life is too short and there are too many things to think about.  However, this approach exposes us to the risk of being misled, and Julian Baggini has suggested that this risk is increasing with the growth of psychology, which has allowed more people to master methods of manipulating us and has led to ‘a kind of arms race of deception in which truth is the main casualty.’  He suggests that when we are presented with new information we should perform an epistemic triage by asking:

  • Is this a domain in which anyone can speak the truth?
  • What kind of expert is a trustworthy source of truth in that domain?
  • Is a particular expert to be trusted?

The deluge of information that streams in front of our eyes when we look at the screens of our phones, computers and televisions seems to leave most of us grasping for a hold on reality.  Perhaps we should treat it all as fiction until we have performed Baggini’s triage, at least on the sources of the information streams, if not also on the individual items of information.

Source:

Julian Baggini, A short history of truth: consolations for a post-truth world, London: Quercus Editions Ltd, 2017.

Establishing fidelity and credibility in tests & simulations (FACTS)

A month or so ago I gave a lecture entitled ‘Establishing FACTS (Fidelity And Credibility in Tests & Simulations)’ to the local branch of the Institution of Engineering and Technology (IET).  Of course, my title was a play on words because the Oxford English Dictionary defines a ‘fact’ as ‘a thing that is known or proved to be true’ or ‘information used as evidence or as part of a report’.  One of my current research interests is how we establish predictions from simulations as evidence that can be used reliably in decision-making.  This is important because simulations based on computational models have become ubiquitous in engineering for, amongst other things, design optimisation and evaluation of structural integrity.  These models need to possess the appropriate level of fidelity and to be credible in the eyes of decision-makers, not just their creators.  Model credibility is usually established through validation processes using a small number of physical tests that must yield a large quantity of reliable and relevant data [see ‘Getting smarter‘ on June 21st, 2017].  Reliable and relevant data means making measurements with low levels of uncertainty under real-world conditions, which is usually challenging.
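To make the idea of credibility a little more concrete, here is a minimal, hypothetical sketch of the simplest possible validation check: asking whether the difference between a predicted and a measured quantity lies within the expanded uncertainty of the measurement.  The function, its name and the numbers in the example are all invented for illustration; the validation metrics used in practice, and in my research, are considerably more sophisticated.

```python
# A minimal, hypothetical sketch of a simple credibility check for a simulation:
# is the discrepancy between prediction and measurement smaller than the
# expanded measurement uncertainty?  Real validation metrics are richer than
# this; the function and the numbers below are invented for illustration only.

def within_expanded_uncertainty(predicted, measured, standard_uncertainty,
                                coverage_factor=2.0):
    """Return True if |prediction - measurement| <= k * u, where k is the
    coverage factor and u the standard uncertainty of the measurement."""
    return abs(predicted - measured) <= coverage_factor * standard_uncertainty

# Invented example: a predicted strain of 1020 microstrain compared with a
# measured 1000 microstrain whose standard uncertainty is 15 microstrain.
print(within_expanded_uncertainty(1020.0, 1000.0, 15.0))  # True: within 2*u = 30
```

The point of even this toy check is that the answer depends as much on the quality of the measurement as on the quality of the model, which is why low-uncertainty, real-world measurements matter so much for establishing credibility.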

These topics recur through much of my research and have found applications in aerospace engineering, nuclear engineering and biology. My lecture to the IET gave an overview of these ideas using applications from each of these fields, some of which I have described in past posts.  So, I have now created a new page on this blog with a catalogue of these past posts on the theme of ‘FACTS‘.  Feel free to have a browse!