Category Archives: FACTS

Fourth industrial revolution

Have you noticed that we are in the throes of a fourth industrial revolution?

The first industrial revolution occurred towards the end of the 18th century with the introduction of steam power and mechanisation.  The second industrial revolution took place at the end of the 19th and beginning of the 20th century and was driven by the invention of electrical devices and mass production.  The third industrial revolution was brought about by computers and automation at the end of the 20th century.  The fourth industrial revolution is happening as a result of combining physical and cyber systems.  It is also called Industry 4.0 and is seen as the integration of additive manufacturing, augmented reality, Big Data, cloud computing, cyber security, the Internet of Things (IoT), simulation and systems engineering.  Most organisations are struggling with the integration process and, as a consequence, are only exploiting a fraction of the capabilities of the new technology.  Revolutions are, by their nature, disruptive, and those organisations that embrace and exploit the innovations will benefit, while the existence of the remainder is under threat [see ‘The disrupting benefit of innovation’ on May 23rd, 2018].

Our work on the Integrated Nuclear Digital Environment, on Digital Twins, in the MOTIVATE project and on hierarchical modelling in engineering and biology is all part of the revolution.

Links to these research posts:

‘Enabling or disruptive technology for nuclear engineering?’ on January 28th, 2015

‘Can you trust your digital twin?’ on November 23rd, 2016

‘Getting Smarter’ on June 21st, 2017

‘Hierarchical modelling in engineering and biology’ on March 14th, 2018


Image: Christoph Roser [CC BY-SA 4.0].

INSTRUCTIVE research relevance

The Southwest Airlines accident last week has initially been attributed to a fatigue crack in a fan blade in the engine.  One of the reasons that this is an extremely rare event is the enormous research effort that has been expended on the design, testing and maintenance of the engines and the airframe.  It’s an ongoing research effort to address the trilemma of aircraft that are safe, sustainable and low-cost to build and operate.  In collaboration with Strain Solutions Limited, we are in the last year of a three-year project called INSTRUCTIVE, which is funded by the Clean Sky 2 programme of the European Commission [see ‘Instructive report and Brexit‘ on March 29th, 2017].  The focus of the research is the development of techniques for use in the aerospace industry to detect the initiation of cracks in the airframe before they are visible to the naked eye [see ‘Instructive update‘ on October 4th, 2017].  Laboratory-based techniques with this capability exist, and the objective is to transfer the technology to the industrial scale and environment – initially in structural tests performed as part of the design and certification process, and perhaps later as part of inspections of aircraft in service.  So far, we have moved from the small components reported in the update posted in October to a chunk of aircraft fuselage in our lab, and we are preparing to participate in a test being conducted by Airbus later this year.

We are also planning a knowledge exchange workshop on ‘Real-time damage tracking in engineering structures’ on November 21st, 2018 at the University of Liverpool’s London campus.  The one-day workshop is being organised in collaboration with the British Society for Strain Measurement.  More details to follow – it will be free!

Image credit: ‘Powering the 737: CFM56-7 series’ by Frans Zwart [CC BY-NC-ND 2.0]


Hierarchical modelling in engineering and biology

In 1979, Glenn Harris proposed an analytical hierarchy of models for estimating tactical force effectiveness for the US Army.  It was represented as a pyramid with four layers: a theatre/campaign simulation at the apex, supported by mission-level simulations, below which were engagement models and, at the base, engineering models of assets/equipment.  The idea was adopted by the aerospace industry [see the graphic on the left], which places the complete aircraft at the apex, supported by systems, sub-systems and components beneath in increasing numbers, with the pyramid divided vertically in half to represent physical tests on one side and simulations on the other.  This represents the need to validate predictions from computational models with measurements in the real world [see post on ‘Model validation‘ on September 18th, 2012].  These diagrams are schematic representations used by engineers to plan and organise the extensive programmes of modelling and physical testing undertaken during the design of new aircraft [see post on ‘Models as fables‘ on March 16th, 2016].  The objective of the MOTIVATE research project is to reduce the quantity and increase the quality of the physical tests so that the pyramid becomes lop-sided, i.e. the triangle representing the experiments and tests is a much thinner slice than the one representing the modelling and simulations [see post on ‘Brave New World‘ on January 10th, 2018].

At the same time, I am working with colleagues in toxicology on approaches to establishing credibility in predictive models for chemical risk assessment.  I have constructed an equivalent pyramid to represent the system hierarchy, which is shown on the right in the graphic.  The challenge is the lack of measurement data in the top left of the pyramid, for both moral and legal reasons, which means that there is very limited real-world data available to confirm the predictions from the computational models represented on the right of the pyramid.  In other words, my colleagues in toxicology, and computational biology in general, are where my collaborators in the aerospace industry would like to be, while the aerospace industry is where the computational biologists would like to be.  The challenge is that in both cases a paradigm shift is required from objectivism toward relativism, since, in the absence of comprehensive real-world measurement data, validation or confirmation of predictions becomes a social process involving judgement about where the predictions lie on a continuum of usefulness.


Harris GL, Computer models, laboratory simulators, and test ranges: meeting the challenge of estimating tactical force effectiveness in the 1980’s, US Army Command and General Staff College, May 1979.

Trevisani DA & Sisti AF, Air Force hierarchy of models: a look inside the great pyramid, Proc. SPIE 4026, Enabling Technology for Simulation Science IV, 23 June 2000.

Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology, Progress in Biophysics and Molecular Biology, 129:13-19, 2017.

Slow-moving nanoparticles

Random track of a nanoparticle superimposed on its image generated in the microscope using a pin-hole and narrowband filter.

A couple of weeks ago I bragged about research from my group being included in a press release from the Royal Society [see post entitled ‘Press Release!‘ on November 15th, 2017].  I hate to be boring but it’s happened again.  Some research that we have been performing with the European Union’s Joint Research Centre in Ispra [see my post entitled ‘Toxic nanoparticles‘ on November 13th, 2013] was published this morning in Royal Society Open Science.

Our experimental measurements of the free motion of small nanoparticles in a fluid have shown that they move more slowly than expected.  At low concentrations, unexpectedly large groups of molecules in the form of nanoparticles up to 150-300 nm in diameter behave more like an individual molecule than a particle.  Our experiments support predictions from computer simulations by other researchers, which suggest that at low concentrations the motion of small nanoparticles in a fluid might be dominated by van der Waals forces rather than the thermal motion of the surrounding molecules.  At the nanoscale there is still much that we do not understand, so these findings have potential implications for predicting nanoparticle transport, for instance in drug delivery [e.g., via the nasal passage to the central nervous system], and for understanding enhanced heat transfer in nanofluids, which is important in designing systems such as cooling for electronics, solar collectors and nuclear reactors.

Our article’s title is ‘Transition from fractional to classical Stokes-Einstein behaviour in simple fluids‘, which does not reveal much unless you are familiar with the behaviour of particles and molecules.  So, here’s a quick explanation: Robert Brown gave his name to the motion of particles suspended in a fluid after reporting the random motion, or diffusion, of pollen particles in water in 1828.  In 1906, Einstein postulated that the motion of a suspended particle is generated by the thermal motion of the surrounding fluid molecules, while Stokes’ law relates the drag force on the particle to its size and the fluid viscosity.  Hence, the Brownian motion of a particle can be described by the combined Stokes-Einstein relationship.  However, at the molecular scale, the motion of individual molecules in a fluid is dominated by van der Waals forces, which means that the size of the molecule is unimportant and the diffusion of the molecule is inversely proportional to a fractional power of the fluid viscosity; hence the term fractional Stokes-Einstein behaviour.  Nanoparticles that approach the size of large molecules are not visible in an optical microscope, so we have tracked them using a special technique based on imaging their shadow [see my post ‘Seeing the invisible‘ on October 29th, 2014].
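For readers who want a feel for the numbers, here is a minimal sketch of the classical Stokes-Einstein relationship described above; the temperature, viscosity and particle diameter are illustrative values for water at room temperature, not data from our paper.

```python
import math

def stokes_einstein_D(T, eta, d):
    """Classical Stokes-Einstein diffusion coefficient (m^2/s) for a
    sphere of diameter d (m) in a fluid of viscosity eta (Pa s) at
    temperature T (K): D = k_B * T / (3 * pi * eta * d)."""
    k_B = 1.380649e-23  # Boltzmann constant, J/K
    return k_B * T / (3 * math.pi * eta * d)

# Illustrative case: a 150 nm particle in water at 293 K
D = stokes_einstein_D(T=293.0, eta=1.0e-3, d=150e-9)
print(f"D = {D:.2e} m^2/s")  # prints D = 2.86e-12 m^2/s
```

In the fractional Stokes-Einstein behaviour discussed above, the diffusion coefficient instead varies with a fractional power of the viscosity (empirically, D is proportional to an inverse fractional power of eta), so the classical formula over-predicts the mobility's sensitivity to particle size in that regime.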


Coglitore D, Edwardson SP, Macko P, Patterson EA, Whelan MP, Transition from fractional to classical Stokes-Einstein behaviour in simple fluids, Royal Society Open Science, 4:170507, 2017. doi: