
Fourth industrial revolution

Have you noticed that we are in the throes of a fourth industrial revolution?

The first industrial revolution occurred towards the end of the 18th century with the introduction of steam power and mechanisation.  The second industrial revolution took place at the end of the 19th and beginning of the 20th century and was driven by the invention of electrical devices and mass production.  The third industrial revolution was brought about by computers and automation at the end of the 20th century.  The fourth industrial revolution is happening as a result of combining physical and cyber systems.  It is also called Industry 4.0 and is seen as the integration of additive manufacturing, augmented reality, Big Data, cloud computing, cyber security, the Internet of Things (IoT), simulation and systems engineering.  Most organisations are struggling with the integration process and, as a consequence, are only exploiting a fraction of the capabilities of the new technology.  Revolutions are, by their nature, disruptive, and those organisations that embrace and exploit the innovations will benefit, while the existence of the remainder is under threat [see ‘The disrupting benefit of innovation’ on May 23rd, 2018].

Our work on the Integrated Nuclear Digital Environment, on Digital Twins, in the MOTIVATE project and on hierarchical modelling in engineering and biology is all part of the revolution.

Links to these research posts:

‘Enabling or disruptive technology for nuclear engineering?’ on January 28th, 2015

‘Can you trust your digital twin?’ on November 23rd, 2016

‘Getting Smarter’ on June 21st, 2017

‘Hierarchical modelling in engineering and biology’ on March 14th, 2018


Image: Christoph Roser at AllAboutLean.com from https://commons.wikimedia.org/wiki/File:Industry_4.0.png [CC BY-SA 4.0].

Hierarchical modelling in engineering and biology

In 1979, Glenn Harris proposed an analytical hierarchy of models for estimating tactical force effectiveness for the US Army.  It was represented as a pyramid with four layers: a theatre/campaign simulation at the apex, supported by mission-level simulations, below which were engagement models, with engineering models of assets/equipment at the base.  The idea was adopted by the aerospace industry [see the graphic on the left], which places the complete aircraft at the apex, supported by systems, sub-systems and components beneath in increasing numbers, with the pyramid divided vertically in half to represent physical tests on one side and simulations on the other.  This represents the need to validate predictions from computational models with measurements in the real world [see post on ‘Model validation‘ on September 18th, 2012].  These diagrams are schematic representations used by engineers to plan and organise the extensive programmes of modelling and physical testing undertaken during the design of new aircraft [see post on ‘Models as fables‘ on March 16th, 2016].  The objective of the MOTIVATE research project is to reduce the quantity and increase the quality of the physical tests so that the pyramid becomes lop-sided, i.e. the triangle representing the experiments and tests is a much thinner slice than the one representing the modelling and simulations [see post on ‘Brave New World‘ on January 10th, 2018].
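The lop-sided pyramid can be sketched as a simple data structure.  This is purely illustrative: the four layer names follow the aerospace version described above, but the counts per layer and the rebalancing fraction are invented for the example.

```python
# A minimal sketch of the test/simulation pyramid described above.
# Layer names follow the aerospace version; the item counts are illustrative.
pyramid = [
    # (layer, number of items, activities on each side of the pyramid)
    ("complete aircraft", 1,    {"physical tests": 1,    "simulations": 1}),
    ("systems",           10,   {"physical tests": 10,   "simulations": 10}),
    ("sub-systems",       100,  {"physical tests": 100,  "simulations": 100}),
    ("components",        1000, {"physical tests": 1000, "simulations": 1000}),
]

def rebalance(pyramid, test_fraction):
    """Make the pyramid 'lop-sided': keep the simulation count at each
    layer but perform only a fraction of the physical tests (at least one)."""
    return [
        (layer, n, {"physical tests": max(1, int(n * test_fraction)),
                    "simulations": sides["simulations"]})
        for layer, n, sides in pyramid
    ]

# MOTIVATE's goal, loosely: fewer, better physical tests at every layer.
slimmer = rebalance(pyramid, 0.2)
```

The `max(1, ...)` guard reflects that at least one confirmatory test survives at each layer, e.g. the single full-scale aircraft test at the apex.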

At the same time, I am working with colleagues in toxicology on approaches to establishing credibility in predictive models for chemical risk assessment.  I have constructed an equivalent pyramid to represent the system hierarchy, which is shown on the right in the graphic.  The challenge is the lack of measurement data at the top left of the pyramid, for both moral and legal reasons, which means that there is very limited real-world data available to confirm the predictions from the computational models represented on the right of the pyramid.  In other words, my colleagues in toxicology, and computational biology in general, are where my collaborators in the aerospace industry would like to be, while the aerospace engineers are where the computational biologists would like to be.  The challenge is that, in both cases, a paradigm shift is required from objectivism toward relativism, because, in the absence of comprehensive real-world measurement data, validation or confirmation of predictions becomes a social process involving judgement about where the predictions lie on a continuum of usefulness.


Harris GL, Computer models, laboratory simulators, and test ranges: meeting the challenge of estimating tactical force effectiveness in the 1980’s, US Army Command and General Staff College, May 1979.

Trevisani DA & Sisti AF, Air Force hierarchy of models: a look inside the great pyramid, Proc. SPIE 4026, Enabling Technology for Simulation Science IV, 23 June 2000.

Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology, Progress in Biophysics and Molecular Biology, 129:13-19, 2017.

Brave New World

Term has started, and our students are preparing for end-of-semester examinations; so, I suspect that they would welcome the opportunity to deploy the sleep-learning that Aldous Huxley envisaged in his ‘Brave New World’ of 2540.  In the brave new world of digital engineering, some engineers are attempting to conceive of a world in which experiments have become obsolete because we can rely on computational modelling to simulate engineering systems.  This ambitious goal is a driver for the MOTIVATE project [see my post entitled ‘Getting smarter‘ on June 21st, 2017]; an EU project that kicked off about six months ago and was the subject of a brainstorming session in the Red Deer in Sheffield last September [see my post entitled ‘Anything other than lager, stout or porter!‘ on September 6th, 2017].  The project has its own website now at www.engineeringvalidation.org.

A world without experiments is almost unimaginable for engineers, whose education and training are deeply rooted in empiricism: the philosophical approach that requires assumptions, models and theories to be tested against observations from the real world before they can be accepted.  In the MOTIVATE project, we are thinking about ways in which fewer experiments can provide more and better measured data for the validation of computational models of engineering systems.  In December, under the auspices of the project, experts from academia, industry and national labs from across Europe met near Bristol and debated how to reshape the traditional flow-chart used in the validation of engineering models, which places equal weight on experiments and computational models [see ASME V&V 10-2006, Figure 2].  In a smaller follow-up meeting in Zurich, just before Christmas [see my post ‘A reflection of existentialism‘ on December 20th, 2017], we blended the ideas from the Bristol session into a new flow-chart that could lead to the validation of some engineering systems without conducting experiments in parallel.  This is perhaps not as radical as it sounds, because it happens already for some evolutionary designs, especially if they are not safety-critical.  Nevertheless, if we are to achieve the paradigm shift towards the new digital world, then we will have to convince the wider engineering community about our novel approach through demonstrations of its successful application, which sounds like empiricism again!  More on that in future updates.

Image by Erwin Hack: Coffee and pastries awaiting technical experts debating behind the closed door.

Getting smarter

A350 XWB passes Maximum Wing Bending test [from: http://www.airbus.com/galleries/photo-gallery]

Garbage in, garbage out (GIGO) is a perennial problem in computational simulations of engineering structures.  If the description of the geometry of the structure, the material behaviour, the loading conditions or the boundary conditions is incorrect (garbage in), then the simulation generates predictions that are wrong (garbage out), or at least an unreliable representation of reality.  It is not easy to describe precisely the geometry, material, loading and environment of a complex structure, such as an aircraft or a power station, because the complete description is either unavailable or too complicated.  Hence, modellers make assumptions to fill gaps in the available information and to simplify the description.  This means the predictions from the simulation have to be tested against reality in order to establish confidence in them – a process known as model validation [see my post entitled ‘Model validation‘ on September 18th, 2012].
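As a minimal illustration of that comparison step, the sketch below quantifies the discrepancy between predicted and measured values and applies an acceptance threshold.  The strain values, gauge locations and the 10% tolerance are all invented for the example; they are not taken from any standard or from the MOTIVATE project.

```python
# Hypothetical sketch of the comparison step in model validation:
# predictions from a simulation are compared with measurements and the
# model is accepted only if the worst relative discrepancy is within
# an agreed tolerance (the 10% default here is illustrative only).
def validate(predictions, measurements, tolerance=0.10):
    """Return (worst relative discrepancy, True if within tolerance)."""
    discrepancies = [
        abs(p - m) / abs(m) for p, m in zip(predictions, measurements)
    ]
    worst = max(discrepancies)
    return worst, worst <= tolerance

# Invented strain values (microstrain) at three strain-gauge locations.
predicted = [510.0, 220.0, 95.0]
measured = [500.0, 230.0, 100.0]
worst, acceptable = validate(predicted, measured)
```

In practice the comparison uses richer metrics and uncertainty bounds on both sides, but the structure is the same: a quantitative measure of agreement followed by a judgement against an agreed criterion.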

It is good practice to design experiments specifically to generate data for model validation, but it is expensive, especially when your structure is a huge passenger aircraft.  So, naturally, you would like to extract as much information from each experiment as possible and to perform as few experiments as possible, whilst both ensuring predictions are reliable and providing confidence in them.  In other words, you have to be very smart about designing and conducting the experiments, as well as about performing the validation process.

Together with researchers at Empa in Zurich, the Industrial Systems Institute of the Athena Research Centre in Athens and Dantec Dynamics in Ulm, I am embarking on a new EU Horizon 2020 project to try to make us smarter about experiments and validation.  The project, known as MOTIVATE [Matrix Optimization for Testing by Interaction of Virtual and Test Environments (Grant Nr. 754660)], is funded through the Clean Sky 2 Joint Undertaking, with Airbus acting as our topic manager to guide us towards an outcome that will be applicable in industry.  We held our kick-off meeting in Liverpool last week, which is why it is uppermost in my mind at the moment.  We have 36 months to get smarter on an industrial scale and demonstrate it in a full-scale test on an aircraft structure.  So, some sleepless nights ahead…



ASME V&V 10-2006, Guide for verification & validation in computational solid mechanics, American Society of Mech. Engineers, New York, 2006.

European Committee for Standardisation (CEN), Validation of computational solid mechanics models, CEN Workshop Agreement, CWA 16799:2014 E.

Hack E & Lampeas G (Guest Editors) & Patterson EA (Editor), Special issue on advances in validation of computational mechanics models, J. Strain Analysis, 51 (1), 2016.