
Bringing an end to thermodynamic whoopee

Two weeks ago I used two infographics to illustrate the dominant role of energy use in generating greenhouse gas emissions and the disproportionate production of greenhouse gas emissions by the rich [see ‘Where we are and what we have‘ on November 24th, 2021].  Energy use is responsible for 73% of global greenhouse gas emissions and 16% of the world’s population are responsible for 38% of global CO2 emissions.  Today’s infographics illustrate the energy flows from source to consumption for the USA (above), UK and Europe (thumbnails below).  In the USA fossil fuels (coal, natural gas and petroleum) are the source of nearly 80% of their energy, in the UK it is a little more than 80%, and the chart for Europe is less detailed but the proportion looks similar.  COP26 committed countries to ending ‘support for the international unabated fossil fuel energy sector by the end of 2022’ and recognised that ‘investing in unabated fossil-related energy projects increasingly entails both social and economic risks, especially through the form of stranded assets, and has ensuing negative impacts on government revenue, local employment, taxpayers, utility ratepayers and public health.’  However, to reduce our dependency on fossil fuels we need a strategy, a plan of action for a fundamental change in how we power industry, heat our homes and propel our vehicles.  A hydrogen economy requires the production of hydrogen without using fossil fuels, while electric cars and electric domestic heating require our electricity generating capacity to be at least trebled by 2050 in order to hit the net zero target.  The scale and speed of this transition to zero-carbon sources is such that it will have to be achieved using an integrated blend of green energy sources, including solar, wind and nuclear energy.  For example, in the UK our current electricity generating capacity is about 76 GW and 1 GW is equivalent to 3.1 million photovoltaic (PV) panels, or 364 utility-scale wind turbines [www.energy.gov/eere/articles/how-much-power-1-gigawatt], so trebling capacity from one of these sources alone would imply more than 700 million PV panels, or one wind turbine every square mile.  It is easy to write policies but it is much harder to implement them and make things happen, especially when transformational change is required.  We cannot expect things to happen simply because our leaders have signed agreements and made statements.  Now, national plans are required to wean us from our addiction to fossil fuels – it will be difficult, but the alternative is that global warming might cause the planet to become uninhabitable for us.  It is time to stop ‘making thermodynamic whoopee with fossil fuels’, to quote Kurt Vonnegut [see ‘And then we discovered thermodynamics‘ on February 3rd, 2016].
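For the curious, the arithmetic behind these figures can be sketched in a few lines of Python.  The conversion factors are those quoted above from energy.gov; the UK land area is an assumed round figure, so this is a back-of-the-envelope illustration rather than a planning calculation:

# Back-of-the-envelope arithmetic for trebling UK electricity generating capacity,
# using the equivalences quoted above (1 GW ~ 3.1 million PV panels or 364
# utility-scale wind turbines). Figures are illustrative only.

current_capacity_gw = 76            # approximate current UK generating capacity
target_capacity_gw = 3 * current_capacity_gw

pv_panels_per_gw = 3.1e6            # photovoltaic panels per gigawatt
turbines_per_gw = 364               # utility-scale wind turbines per gigawatt
uk_area_sq_miles = 94_000           # approximate UK land area (assumed round figure)

pv_panels_needed = target_capacity_gw * pv_panels_per_gw
turbines_needed = target_capacity_gw * turbines_per_gw

print(f"Target capacity: {target_capacity_gw} GW")
print(f"PV panels if met by solar alone: {pv_panels_needed / 1e6:.0f} million")
print(f"Wind turbines if met by wind alone: {turbines_needed:,.0f}")
print(f"...or roughly one turbine per {uk_area_sq_miles / turbines_needed:.1f} square miles")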

Sources:

Kurt Vonnegut, A Man without a Country, New York: Seven Stories Press, 2005.  He wrote ‘we have now all but destroyed this once salubrious planet as a life-support system in fewer than two hundred years, mainly by making thermodynamic whoopee with fossil fuels’.

US Energy flow chart: https://flowcharts.llnl.gov/commodities/energy

EU Energy flow chart: https://ec.europa.eu/eurostat/web/energy/energy-flow-diagrams

UK Energy flow chart: https://www.gov.uk/government/collections/energy-flow-charts#2020

If you don’t succeed, try and try again…

Photograph of S-shaped plate

You would not think it was difficult to build a thin flat metallic plate using a digital description of the plate and a Laser Powder Bed Fusion (L-PBF) machine which can build complex components, such as hip prostheses.  But it is, as we have discovered since we started our research project on the thermoacoustic response of additively manufactured parts [see ‘Slow start to an exciting new project on thermoacoustic response of AM metals‘ on September 9th, 2020].  L-PBF involves using a laser beam to melt selected regions of a thin layer of metal powder spread over a flat bed.  The selected regions represent a cross-section of the desired three-dimensional component, and repeating the process for each successive cross-section results in the additive building of the component as each layer solidifies.  And in those last four words lies the problem, because ‘as each layer solidifies’ the temperature distribution between the layers causes different levels of thermal expansion that result in strains being locked into our thin plates.  Our plates are too thin to build with their plane surfaces horizontal, i.e. perpendicular to the laser beam, so instead we build them with their plane surfaces parallel to the laser beam, or vertical like a street sign.  In our early attempts, the residual stresses induced by the locked-in strains caused the plate to buckle into an S-shape before it was complete (see image).  We solved this problem by building buttresses at the edges of the plate.  However, when we remove the buttresses and detach the plate from the build platform, it buckles into a dome shape.  In fact, you can press the centre of the plate and make it snap back and forth noisily.  While we are making progress in understanding the mechanisms at work, we have some way to go before we can confidently produce flat plates using additive manufacturing that we can use in comparisons with our earlier work on the performance of conventionally, or subtractively, manufactured plates subject to the thermoacoustic loading experienced by the skin of a hypersonic vehicle [see ‘Potential dynamic buckling in hypersonic vehicle skin‘ on July 1st, 2020] or the containment walls in a fusion reactor.  Sometimes research is painfully slow, but no one ever talks about it – perhaps because we quickly forget the painful parts once we have a successful outcome to brag about.  Yet it is often precisely the painful repetitions of “try and try again” that allow us to reach the bragging stage of a successful outcome.
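As a rough illustration of why locked-in strains arise, the strain mismatch between a cooling layer and the solidified layer beneath it can be estimated from the product of the coefficient of thermal expansion and the temperature difference.  The short sketch below uses representative values for a titanium alloy, which are assumed for illustration and are not our actual process parameters:

# Illustrative estimate of the strain mismatch locked in when a newly melted layer
# cools onto an already-solidified layer beneath it. Values are representative
# assumptions (roughly a titanium alloy), not the parameters used in the builds.

alpha = 9e-6      # coefficient of thermal expansion, 1/K (assumed)
E = 110e9         # Young's modulus, Pa (assumed)
delta_T = 600.0   # temperature difference between layers during cooling, K (assumed)

# Free thermal contraction the new layer would undergo as it cools; because the
# layer below constrains it, much of this becomes a locked-in (residual) strain.
thermal_strain = alpha * delta_T

# Upper-bound stress estimate assuming full constraint and elastic behaviour.
residual_stress = E * thermal_strain

print(f"Mismatch strain: {thermal_strain:.4%}")
print(f"Fully constrained elastic stress estimate: {residual_stress / 1e6:.0f} MPa")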

The research is funded jointly by the National Science Foundation (NSF) in the USA and the Engineering and Physical Sciences Research Council (EPSRC) in the UK (see Grants on the Web).

References

Silva AS, Sebastian CM, Lambros J and Patterson EA, 2019. High temperature modal analysis of a non-uniformly heated rectangular plate: Experiments and simulations. J. Sound & Vibration, 443, pp.397-410.

Magana-Carranza R, Sutcliffe CJ, Patterson EA, 2021. The effect of processing parameters and material properties on residual forces induced in Laser Powder Bed Fusion (L-PBF). Additive Manufacturing, 46:102192.

Deep uncertainty and meta-ignorance

The term ‘unknown unknowns’ was made famous by Donald Rumsfeld almost 20 years ago when, as US Secretary of Defense, he used it in describing the lack of evidence about terrorist groups being supplied with weapons of mass destruction by the Iraqi government.  However, the term was probably coined almost 50 years earlier by Joseph Luft and Harrington Ingham when they developed the Johari window as a heuristic tool to help people better understand their relationships.  In engineering, and other fields in which predictive models are important tools, it is used to describe situations about which there is deep uncertainty.  Deep uncertainty refers to situations where experts do not know or cannot agree about what models to use, how to describe the uncertainties present, or how to interpret the outcomes from predictive models.  Rumsfeld talked about known knowns, known unknowns, and unknown unknowns; an alternative, simpler but perhaps less catchy, classification is ‘the known, the unknown, and the unknowable‘, which was used by Diebold, Doherty and Herring as part of the title of their book on financial risk management.  David Spiegelhalter suggests ‘risk, uncertainty and ignorance’ before providing a more sophisticated classification: aleatory uncertainty, epistemic uncertainty and ontological uncertainty.  Aleatory uncertainty is the inevitable unpredictability of the future that can be fully described using probability.  Epistemic uncertainty is a lack of knowledge about the structure and parameters of models used to predict the future.  Ontological uncertainty is a complete lack of knowledge and understanding of the entire modelling process, i.e. deep uncertainty.  When it is not recognised that ontological uncertainty is present, we have meta-ignorance, which means failing even to consider the possibility of being wrong.  For a number of years, part of my research effort has been focussed on predictive models that are unprincipled and untestable; in other words, they are not built on widely-accepted principles or scientific laws and it is not feasible to conduct physical tests to acquire data to demonstrate their validity [see editorial ‘On the credibility of engineering models and meta-models‘, JSA 50(4):2015].  Some people would say that untestability implies a model is not scientific, based on Popper’s requirement that a scientific theory must be refutable.  However, in reality unprincipled and untestable models are encountered in a range of fields, including space engineering, fusion energy and toxicology.  We have developed a set of credibility factors designed as a heuristic tool to allow the relevance of such models and their predictions to be evaluated systematically [see ‘Credible predictions for regulatory decision-making‘ on December 9th, 2020].  One outcome is to allow experts to agree on their disagreements and ignorance, i.e. to define the extent of our ontological uncertainty, which is an important step towards making rational decisions about the future when there is deep uncertainty.
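To make the first two categories concrete, the toy Monte Carlo sketch below (an illustration added here, not part of the credibility factors) represents aleatory uncertainty as scatter in a sampled quantity and epistemic uncertainty as a set of plausible values for a model parameter; ontological uncertainty, by definition, is whatever such a model fails to consider:

# Toy illustration of Spiegelhalter's distinction between aleatory and epistemic
# uncertainty. Ontological uncertainty cannot be captured in code: it is
# everything this little model fails to consider.

import random

random.seed(1)

# Aleatory uncertainty: inherent variability we can describe with a probability
# distribution, e.g. scatter in a measured load (arbitrary units, assumed values).
def sample_load(mean, std):
    return random.gauss(mean, std)

# Epistemic uncertainty: we are unsure of the model parameters themselves,
# represented here by a range of plausible values for the mean load.
plausible_means = [95.0, 100.0, 110.0]

for mean in plausible_means:
    samples = [sample_load(mean, std=5.0) for _ in range(10_000)]
    average = sum(samples) / len(samples)
    print(f"Assumed mean {mean}: simulated average load = {average:.1f}")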

References

Diebold FX, Doherty NA, Herring RJ, eds. The Known, the Unknown, and the Unknowable in Financial Risk Management: Measurement and Theory Advancing Practice. Princeton, NJ: Princeton University Press, 2010.

Spiegelhalter D. Risk and uncertainty communication. Annual Review of Statistics and Its Application, 4, pp.31-60, 2017.

Patterson EA, Whelan MP. On the validation of variable fidelity multi-physics simulations. J. Sound and Vibration. 448:247-58, 2019.

Patterson EA, Whelan MP, Worth AP. The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application. Computational Toxicology. 100144, 2020.

Digital twins that thrive in the real world


Windows of the Soul II [3D video art installation: http://www.haigallery.com/sonia-falcone/]

Digital twins are becoming ubiquitous in many areas of engineering [see ‘Can you trust your digital twin?‘ on November 23rd, 2016].  At the same time, however, the terminology is becoming blurred as digital shadows and digital models are treated as if they are synonymous with digital twins.  A digital model is a digitised replica of a physical entity which lacks any automatic data exchange between the entity and its replica.  A digital shadow is the digital representation of a physical object with a one-way flow of information from the object to its representation.  But a digital twin is a functional representation with a live feedback loop to its counterpart in the real world.  The feedback loop is based on continuous updates to the digital twin about the condition and performance of the physical entity, derived from sensor data, together with analysis from the digital twin about the performance of the physical entity.  This enables a digital twin to provide a service to many stakeholders.  For example, the users of a digital twin of an aircraft engine could include the manufacturer, the operator, the maintenance providers and the insurers.  These capabilities imply that digital twins are themselves becoming products which exist in a digital context that might connect many digital products, thus forming an integrated digital environment.  I wrote about integrated digital environments when they were a concept and the primary challenges were technical in nature [see ‘Enabling or disruptive technology for nuclear engineering?‘ on January 28th, 2015].  Many of these technical challenges have been resolved and the next set of challenges is economic and commercial, associated with launching digital twins into global markets that lack adequate understanding, legislation, security, regulation or governance for digital products.  In collaboration with my colleagues at the Virtual Engineering Centre, we have recently published a white paper, entitled ‘Transforming digital twins into digital products that thrive in the real world‘, that reviews these issues and identifies the need to establish digital contexts that embrace the social, economic and technical requirements for the appropriate use of digital twins [see ‘Digital twins could put at risk what it means to be human‘ on November 18th, 2020].
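The distinction between the three terms is essentially one of data flow, which can be caricatured in a few lines of Python; the class and method names below are illustrative only and do not correspond to any standard digital-twin API:

# Minimal sketch of the distinction drawn above between a digital model, a digital
# shadow and a digital twin, expressed as data flow. Names are illustrative only.

class DigitalModel:
    """Static digitised replica: no automatic data exchange with the physical entity."""
    def __init__(self, description):
        self.description = description


class DigitalShadow(DigitalModel):
    """One-way flow: sensor data updates the representation, nothing flows back."""
    def __init__(self, description):
        super().__init__(description)
        self.latest_state = None

    def ingest_sensor_data(self, reading):
        self.latest_state = reading


class DigitalTwin(DigitalShadow):
    """Two-way loop: the twin analyses incoming data and feeds advice back to its counterpart."""
    def analyse_and_feed_back(self):
        if self.latest_state is None:
            return None
        # Trivial placeholder analysis: flag when a monitored value drifts too high.
        return "reduce load" if self.latest_state.get("temperature", 0) > 90 else "continue"


twin = DigitalTwin("aircraft engine")
twin.ingest_sensor_data({"temperature": 95})
print(twin.analyse_and_feed_back())   # -> 'reduce load'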