
A school trip to Japan

Teachers, students and parents gather outside their high school one Saturday at the beginning of August.  They chatter anxiously as they wait for everyone to arrive and while bags are loaded into the school mini-bus.  Four teachers and eight students are wearing specially-made name badges with a small silicon chip in one corner.  There are lots of hugs and kisses as these twelve people climb into the mini-bus for the journey to Charles de Gaulle airport.  At Charles de Gaulle airport they go through the usual security procedures, taking off their jackets and coats, which then go through the scanner, before boarding the 12-hour flight to Tokyo.  They arrive tired and bedraggled early on Sunday afternoon.  The following day they visit the French embassy in Tokyo and are given a guided tour after passing through a security scanner at the entrance.  On Tuesday they are driven from Tokyo, northwards along the Pacific coast, through Iwaki City to the railway station at Tomioka, which was completely swept away by the tsunami in March 2011.  They have all seen the pictures of the wave overwhelming everything in its path, but it is difficult to imagine it as they are shown around.  The next stop is the Miyakoji district of Tamura City, whose residents were the first to be allowed to return, in April 2014, after being evacuated following the accident at the Fukushima Daiichi nuclear power plant.  The students and teachers stay for two nights in the homes of students from Fukushima high school.  Their hosts are wearing matching name badges with little silicon chips on them.  On Wednesday they visit Aizu, and on Thursday a peach farm in northern Fukushima Prefecture, before starting their journey home on Friday.

As they leave Fukushima Prefecture, their name badges are collected and the silicon chips are sent off for analysis.  The chips are sensors that detect gamma rays with a sensitivity of 0.1 µSv/hr [microsieverts per hour] and record hourly dose rates with a date stamp.  The results for the French school party are shown in the graphic – my account above describes an actual visit made in August 2015.  The name badges with an on-board sensor are known as D-shuttles and the students were participating in a study published recently by Professor Hayano of the University of Tokyo.  The events described above are highlighted in the D-shuttle data in the figure online here.  The highest reading from the D-shuttle, on August 2nd, is due to cosmic radiation received during the 12-hour flight from Paris to Tokyo.
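To give a sense of scale for that flight reading, multiplying a typical dose rate at cruising altitude by the flight time gives the cumulative dose (the 5 µSv/hr figure below is an illustrative value for long-haul flights, not a number taken from the study):

cumulative dose ≈ dose rate × time ≈ 5 µSv/hr × 12 hr ≈ 60 µSv

By comparison, natural background at ground level is typically around 0.1 µSv/hr, which is why the flight stands out so clearly in the D-shuttle record.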

There has been extensive monitoring of Fukushima residents.  In 2012, more than 30,000 people were given full-body scans at Hirata Central Hospital and 100% of children and 99% of adults were below the scanner’s detection limit of 100 Bq per body, which compares with the average body burden of an adult male in Japan of 535 Bq per body found in 1964.  For more on types of radioactivity see my post ‘Hiding in the basement’ on December 18th, 2013.

Sources:

Hayano R, Measurement and communication: what worked and what did not in Fukushima, Annals of the ICRP, 45:14-22, 2016.

Hayano RS, Tsubokura M, Miyazaki M et al, Internal radiocesium contamination of adults and children in Fukushima 7 to 20 months after the Fukushima NPP accident as measured by extensive whole-body-counter survey, Proc. Japan Acad. Ser. B, 89:157-163, 2013.

Uchiyama M, Nakamura Y, Kobayashi S, Analysis of body-burden measurements of 137Cs and 40K in a Japanese group over a period of 5 years following the Chernobyl accident, Health Phys., 71:320-325, 1996.

Footnotes:

A Sievert is a measure of radiation dose, corresponding to the effect of 1 Joule of ionising energy absorbed by 1 kilogram of biological tissue.

A Becquerel is a measure of radioactivity equivalent to the quantity of radioactive material in which one nucleus decays per second.

Image: http://www.fukushima-dialogues.com/wp-content/uploads/2016/02/schema-D-shuttle-porte.png

More uncertainty about matter and energy


When I wrote about wave-particle duality and an electron possessing the characteristics of both matter and energy [see my post entitled ‘Electron uncertainty’ on July 27th, 2016], I dodged the issue of what matter and energy actually are.  As an engineer, I think of matter as the solids, liquids and gases that are both manufactured and occur in nature.  We should probably add plasmas to this list, as they are created in an increasing number of engineering processes, including power generation using nuclear fusion.  But maybe plasmas should be classified as energy, since they are clouds of unbound charged particles, often electrons.  Matter is constructed from atoms, and atoms from sub-atomic particles, such as electrons that can behave as particles or as waves of energy.  So the boundary between matter and energy is blurred or fuzzy.  And Einstein’s famous equation describes how energy and matter can be equated, i.e. energy is equal to mass times the speed of light squared.
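Written out, with some illustrative arithmetic that is not in the original post, Einstein’s equation is

E = mc², so a mass of 1 gram corresponds to E = 0.001 kg × (3 × 10⁸ m/s)² ≈ 9 × 10¹³ J,

roughly 20 kilotons of TNT equivalent, which shows why converting even a tiny amount of matter into energy is so significant.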

Engineers tend to define energy as the capacity to do work, which is fine for manufactured or generated energy, but is inadequate when thinking about the energy of sub-atomic particles, which is probably why Feynman said we don’t really know what energy is.  Most of us think about energy as the stuff that comes down an electricity cable or that we get from eating a banana.  However, Evelyn Pielou points out in her book, The Energy of Nature, that energy in nature surrounds us all of the time, not just in the atmosphere or in water flowing in rivers and oceans, but locked into the structure of plants and rocks.

Matter and energy are human constructs and nature does not do rigid classifications, so perhaps we should think about a plant as a highly-organised localised zone of high density energy [see my post entitled ‘Fields of flowers‘ on July 8th, 2015].  We will always be uncertain about some things and as our ability to probe the world around us improves we will find that we are no longer certain about things we thought we understood.  For instance, research has shown that Bucky balls, which are spherical fullerene molecules containing sixty carbon atoms with a mass of 720 atomic mass units, and so seem to be quite substantial bits of matter, exhibit wave-particle duality in certain conditions.
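To see why wave-like behaviour is so hard to observe for something as massive as a buckyball, consider its de Broglie wavelength, assuming a beam velocity of roughly 200 m/s (similar to that used by Arndt and colleagues):

λ = h / (mv) ≈ 6.6 × 10⁻³⁴ J s / (720 × 1.66 × 10⁻²⁷ kg × 200 m/s) ≈ 2.8 × 10⁻¹² m

That is a few picometres, a few hundred times smaller than the molecule itself, which is why the duality only reveals itself in carefully designed diffraction experiments.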

We need to learn to accept uncertainty and appreciate the opportunities it presents to us rather than seek unattainable certainty.

Note: an atomic mass unit is also known as a Dalton and is equivalent to 1.66×10⁻²⁷ kg.

Sources:

Pielou EC, The Energy of Nature, Chicago: The University of Chicago Press, 2001.

Arndt M, Nairz O, Vos-Andreae J, Keller C, van der Zouw G & Zeilinger A, Wave-particle duality of C60 molecules, Nature, 401:680-682, 1999.


Credibility is in the eye of the beholder

Last month I described how computational models were used as more than fables in many areas of applied science, including engineering and precision medicine [‘Models as fables’ on March 16th, 2016].  When people need to make decisions with socioeconomic and/or personal costs based on the predictions from these models, then the models need to be credible.  Credibility, like beauty, is in the eye of the beholder.  It is a challenging problem to convince decision-makers, who are often not expert in the technology or the modelling techniques, that the predictions are reliable and accurate.  After all, a model that is reliable and accurate, but in which decision-makers have no confidence, is almost useless.  In my research we are interested in the credibility of computational mechanics models that are used to optimise the design of load-bearing structures, whether it is the frame of a building, the wing of an aircraft or a hip prosthesis.  We have techniques that allow us to characterise maps of strain using feature vectors [see my post entitled ‘Recognising strain‘ on October 28th, 2015] and then to compare the ‘distances’ between the vectors representing the predictions and the measurements.  If the predicted map of strain is a perfect representation of the map measured in a physical prototype, then this ‘distance’ will be zero.  Of course, this never happens, because there is noise in the measured data and our models are never perfect, since they contain simplifying assumptions that make the modelling viable.  The difficult question is how much difference is acceptable between the predictions and measurements.  The public expect certainty with respect to the performance of an engineering structure, whereas engineers know that there is always some uncertainty – we can reduce it, but that costs money: money for more sophisticated models, for more computational resources to execute the models, and for more and better quality measurements.
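As a minimal sketch of the idea (illustrative only, and not the specific decomposition or metric used in our research), imagine each strain map has already been reduced to a short feature vector; the Euclidean distance between the predicted and measured vectors then gives a single scalar measure of the discrepancy, which is zero only for a perfect match:

import numpy as np

def feature_distance(predicted, measured):
    # Euclidean distance between two feature vectors representing
    # a predicted and a measured map of strain
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    if predicted.shape != measured.shape:
        raise ValueError("feature vectors must be the same length")
    return float(np.linalg.norm(predicted - measured))

# hypothetical feature vectors - in practice these would come from
# decomposing full-field strain maps
predicted = [0.82, 0.11, -0.05, 0.31]
measured = [0.79, 0.14, -0.02, 0.33]
print(feature_distance(predicted, measured))  # small value = close agreement

The open question raised in the post is how large this distance can be before the predictions are no longer acceptable, given the noise in the measurements and the simplifications in the model.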

Models as fables

In his book, ‘Economics Rules – Why economics works, when it fails and how to tell the difference‘, Dani Rodrik describes models as fables – short stories that revolve around a few principal characters who live in an unnamed generic place and whose behaviour and interaction produce an outcome that serves as a lesson of sorts.  This seems to me a healthy perspective compared with the almost slavish belief in computational models that is common today in many quarters.  However, in engineering, and increasingly in precision medicine, we use computational models as reliable and detailed predictors of the performance of specific systems.  Quantifying this reliability in a way that is useful to non-expert decision-makers is a current area of my research.  This work originated in aerospace engineering, where it is possible, though expensive, to acquire comprehensive and information-rich data from experiments and then to validate models by comparing their predictions to measurements.  We have progressed to nuclear power engineering, in which the extreme conditions and time-scales lead to sparse or incomplete data that make it more challenging to assess the reliability of computational models.  Now, we are just starting to consider models in computational biology, where the inherent variability of biological data and our inability to control the real world present even bigger challenges to establishing model reliability.

Sources:

Rodrik, D., Economics Rules: Why economics works, when it fails and how to tell the difference, Oxford University Press, 2015.

Patterson, E.A., Taylor, R.J. & Bankhead, M., A framework for an integrated nuclear digital environment, Progress in Nuclear Energy, 87:97-103, 2016

Hack, E., Lampeas, G. & Patterson, E.A., An evaluation of a protocol for the validation of computational solid mechanics models, J. Strain Analysis, 51(1):5-13, 2016.

Patterson, E.A., Challenges in experimental strain analysis: interfaces and temperature extremes, J. Strain Analysis, 50(5): 282-3, 2015

Patterson, E.A., On the credibility of engineering models and meta-models, J. Strain Analysis, 50(4):218-220, 2015