Tag Archives: nuclear energy

Planetary Emergency

Global energy budget schematic from Trenberth et al. (2009)

This week’s lecture in my thermodynamics course for first-year undergraduate students was about thermodynamic systems and the energy flows in and out of them. I concluded the lecture by talking about our planet as a thermodynamic system using the classic schematic in the thumbnail [see ‘Ample sufficiency of solar energy‘ on October 25th, 2017 for more discussion on this schematic].  This is usually a popular lecture but this year it had particular resonance because of the widely publicised strikes by students for action on climate change.  I have called before for individuals to take responsibility given the intransigence of governments [see ‘Are we all free riders‘ on June 6th, 2016 or ‘New Year Resolution‘ on December 31st, 2014]; so, it is good to see young people making their views and feelings known.
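
As a simple illustration of treating the planet as a thermodynamic system, the global-mean fluxes in the Trenberth et al. schematic reduce to a one-line energy balance.  A minimal Python sketch is below; the flux values are the approximate global means from that schematic, quoted from memory, so treat them as indicative rather than definitive.

    # Back-of-envelope planetary energy balance using approximate
    # global-mean fluxes (W/m^2) from the Trenberth et al. (2009) schematic.
    incoming_solar    = 341.3   # solar radiation arriving at the top of the atmosphere
    reflected_solar   = 101.9   # reflected to space by clouds, aerosols and the surface
    outgoing_longwave = 238.5   # thermal radiation emitted to space

    absorbed = incoming_solar - reflected_solar    # energy flowing into the system
    imbalance = absorbed - outgoing_longwave       # energy retained by the planet
    print(f"Net energy retained: {imbalance:.1f} W/m^2")   # about 0.9 W/m^2

That small residual, accumulated over the whole surface of the Earth year after year, is what drives the warming discussed below.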

Weather-related events, such as widespread flooding and fires, are reported so frequently in the media that perhaps we have started to ignore them as portents of climate change.  For me, three headline events have reinforced the gravity of the situation:

  1. The publication earlier this month of a joint report by UNICEF and the Royal College of Paediatrics and Child Health finding that air pollution in the UK is so high that it infringes the fundamental rights of children to grow up in a clean and safe environment; and, under the Government’s current plans, air pollution in the UK is expected to remain at dangerous levels for at least another 10 years.
  2. The warning earlier this month from the Meteorological Office in London that global warming could exceed 1.5°C above pre-industrial levels within five years.  In my lecture, I highlighted that a 2°C rise would match conditions 3 million years ago, when sea levels were 25 to 35 m higher than today; and that a 1 m rise in sea level would displace 145 million people globally [according to Blockstein & Weigmann, 2010].
  3. The suspension by Hitachi of construction of the new nuclear power station on Anglesey, which leaves the UK Government’s energy strategy in disarray with only one of the six planned new power stations under construction.  It also leaves the UK unable to switch from fossil-fuelled to electric vehicles and dependent on fossil fuels to meet current electricity demand.

I apologise for my UK focus this week, but wherever you are reading this blog you could probably find similar headlines in your region.  For instance, the 2016 UNICEF report states that one in seven children worldwide lives in an area with toxic air and that air pollution is a major contributing factor in the deaths of around 600,000 children under five every year.  These three headlines illustrate that there is a planetary emergency because climate change is rapidly and radically altering the ecosystem, with likely dire consequences for all living things; that despite a near-existential threat to the next generation as a consequence of air pollution most governments are effectively doing nothing; and that in the UK we are locked into a fossil-fuel dependency for the foreseeable future due to a lack of competent planning and commitment from the government, which will compound the air pollution and climate change problems.

Our politicians need to stop arguing about borders and start worrying about the whole planet.  We are all in this together and no man-made border will protect us from the impact of making the planet a hostile environment for life.

Nuclear winter school

I spent the first full week of January 2019 at a Winter School for a pair of Centres for Doctoral Training focussed on Nuclear Energy (see NGN CDT & ICO CDT).  Together, the two centres involve eight UK universities and most of the key players in the UK industry.  So, the Winter School offers an opportunity for researchers in nuclear science and engineering, from academia and industry, to gather for a week and share their knowledge and experience with more than 80 PhD students.  Each student reports on the progress of their research to the whole gathering, as either a short oral presentation or a poster.  It is an exhausting but stimulating week for everyone, due to both the packed programme and the range of subjects covered, from fundamental science through to large-scale engineering and socio-economic issues.

Here are a few things that caught my eye:

First, the images in the thumbnail above, which Paul Cosgrove from the University of Cambridge used to introduce his talk on modelling thermal and neutron fluxes.  They could be from an art gallery but they are actually from the VTT Technical Research Centre of Finland and show the geometry of an advanced test reactor [ATR] (top); the rate of collisions in the ATR (middle); and the neutron density distribution (bottom).

Second, a great app for your phone called electricityMap that shows a live map of global carbon emissions; when you click on a country, it reveals the sources of its electricity by type, i.e. nuclear, gas, wind etc., as well as its imports and exports of electricity.  Dame Sue Ion told us about it during her keynote lecture.  I think all politicians and journalists need it installed on their phones so that they can check their facts before they start talking about energy policy.

Third, the scale of the concrete infrastructure required in current designs of nuclear power stations compared to the reactor vessel where the energy is generated.  The pictures show the construction site for the Vogtle nuclear power station in Georgia, USA (left) and the reactor pressure vessel being lowered into position (right).  The scale of nuclear power stations was one of the reasons highlighted by Steve Smith from Algometrics for why investors are not showing much interest in them (see ‘Small is beautiful and affordable in nuclear power-stations‘ on January 14th, 2015).  Amongst the other reasons are: too expensive (about £25 billion), too long to build (often decades), too back-end loaded (i.e. no revenue until complete), too complicated (legally, economically & socially), too uncertain politically, too toxic due to a poor track record of returns to investors, and too opaque in terms of the management of the industry.  That’s quite a few challenges for the next generation of nuclear scientists and engineers to tackle.  We are making a start by creating design tools that will enable mass-production of nuclear power stations (see ‘Enabling or disruptive technology for nuclear engineering?‘ on January 28th, 2015) following the processes used to produce other massive engineering structures, such as the Airbus A380 (see Integrated Digital Nuclear Design Programme); but the nuclear industry has to move fast to catch up with other sectors of the energy business, such as gas-fired power stations or wind turbines.  If it were to succeed, the energy market would be massively transformed.


Million to one

‘All models are wrong, but some are useful’ is a quote, usually attributed to George Box, that is often cited in the context of computer models and simulations.  Working out which models are useful can be difficult, and it is essential to get it right when a model is to be used to design an aircraft, support the safety case for a nuclear power station or inform regulatory risk assessment on a new chemical.  One way to identify a useful model is to assess its predictions against measurements made in the real world [see ‘Model validation’ on September 18th, 2012].  Many people have worked on validation metrics that allow predicted and measured signals to be compared; and some result in a statement of the probability that the predicted and measured signals belong to the same population.  This works well if the predictions and measurements are, for example, the temperature measured at a single weather station over a period of time; however, these validation metrics cannot handle fields of data, for instance a map of temperature, measured with an infrared camera, in a power station during start-up.  We have been working on resolving this issue and we have recently published a paper on ‘A probabilistic metric for the validation of computational models’.  We reduce the dimensionality of a field of data, represented by values in a matrix, to a vector using orthogonal decomposition [see ‘Recognizing strain’ on October 28th, 2015].  The data field could be a map of temperature, the strain field in an aircraft wing or the topography of a landscape – it does not matter.  The decomposition is performed separately and identically on the predicted and measured data fields to create two vectors – one for the predictions and one for the measurements.  We look at the differences between these two vectors and compare them against the uncertainty in the measurements to arrive at a probability that the predictions belong to the same population as the measurements.  There are subtleties in the process that I have omitted but, essentially, we can take two data fields composed of millions of values and arrive at a single number describing the usefulness of the model’s predictions.
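
For readers who want to see the shape of the pipeline, here is a minimal Python sketch of the idea: decompose both fields identically, standardise the differences between the two coefficient vectors by the measurement uncertainty, and combine them into a single probability.  The choice of a Chebyshev basis, the polynomial degree and the use of Fisher’s method to combine the standardised differences are my simplifying assumptions for illustration; they are not the exact formulation in the paper.

    import numpy as np
    from numpy.polynomial import chebyshev as cheb
    from scipy import stats

    def design_matrix(shape, deg=(6, 6)):
        # 2-D Chebyshev basis evaluated on the pixel grid of a field
        ny, nx = shape
        y, x = np.meshgrid(np.linspace(-1, 1, ny), np.linspace(-1, 1, nx), indexing='ij')
        return cheb.chebvander2d(y.ravel(), x.ravel(), deg)

    def decompose(field, A):
        # Orthogonal decomposition: least-squares Chebyshev coefficients
        coeffs, *_ = np.linalg.lstsq(A, field.ravel(), rcond=None)
        return coeffs

    def validation_probability(predicted, measured, u_meas, deg=(6, 6)):
        # Decompose both fields identically, standardise the coefficient
        # differences by the measurement uncertainty propagated into
        # coefficient space, and combine them into one probability
        # (Fisher's method - a stand-in for the metric in the paper)
        A = design_matrix(measured.shape, deg)
        e = decompose(predicted, A) - decompose(measured, A)
        sigma = u_meas * np.sqrt(np.diag(np.linalg.inv(A.T @ A)))
        p = 2.0 * stats.norm.sf(np.abs(e) / sigma)   # per-coefficient p-values
        chi2 = -2.0 * np.sum(np.log(np.clip(p, 1e-300, 1.0)))
        return stats.chi2.sf(chi2, df=2 * len(p))

    # Example: a 'perfect' model compared with a noisy infrared image of a
    # hot spot; a well-matched model should return a probability that is
    # not small
    rng = np.random.default_rng(1)
    y, x = np.meshgrid(np.linspace(-1, 1, 80), np.linspace(-1, 1, 80), indexing='ij')
    truth = 350.0 + 40.0 * np.exp(-(x**2 + y**2))          # temperature field, K
    measured = truth + rng.normal(0.0, 0.5, truth.shape)   # camera noise, 0.5 K
    print(validation_probability(truth, measured, u_meas=0.5))

In the paper, the decomposition is applied to maps containing millions of values and the treatment of measurement uncertainty is more careful; the sketch simply shows how a pair of matrices becomes a pair of vectors and then a single number.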

Our paper was published by the Royal Society with a press release, but in the same week as the proposed Brexit agreement; so I would like to think that it was ignored because of the overwhelming interest in the political storm around Brexit rather than because of its esoteric nature.

Source:

Dvurecenska K, Graham S, Patelli E & Patterson EA, A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.

Establishing fidelity and credibility in tests & simulations (FACTS)

A month or so ago I gave a lecture entitled ‘Establishing FACTS (Fidelity And Credibility in Tests & Simulations)’ to the local branch of the Institution of Engineering and Technology (IET).  Of course, my title was a play on words, because the Oxford English Dictionary defines a ‘fact’ as ‘a thing that is known or proved to be true’ or ‘information used as evidence or as part of a report’.  One of my current research interests is how we establish predictions from simulations as evidence that can be used reliably in decision-making.  This is important because simulations based on computational models have become ubiquitous in engineering for, amongst other things, design optimisation and evaluation of structural integrity.  These models need to possess an appropriate level of fidelity and to be credible in the eyes of decision-makers, not just their creators.  Model credibility is usually established through validation processes using a small number of physical tests that must yield a large quantity of reliable and relevant data [see ‘Getting smarter‘ on June 21st, 2017].  Reliable and relevant data means measurements with low levels of uncertainty made under real-world conditions, which is usually challenging.

These topics recur through much of my research and have found applications in aerospace engineering, nuclear engineering and biology. My lecture to the IET gave an overview of these ideas using applications from each of these fields, some of which I have described in past posts.  So, I have now created a new page on this blog with a catalogue of these past posts on the theme of ‘FACTS‘.  Feel free to have a browse!