Category Archives: MOTIVATE project

Industrial uncertainty

Last month I spent almost a week in Zurich.  It is one of our favourite European cities [see ‘A reflection of existentialism‘ on December 20th, 2017]; however, on this occasion there was no time for sight-seeing because I was there for the mid-term meeting of the MOTIVATE project and to conduct some tests and demonstrations in the laboratories of our host, EMPA, the Swiss Federal Laboratories for Materials Science and Technology.  Two of our project partners, Dantec Dynamics GmbH based in Ulm, Germany, and the Athena Research Centre in Patras, Greece, have developed methods for quantifying the uncertainty present in measurements of deformation made in an industrial environment using digital image correlation (DIC) [see ‘256 shades of grey‘ on January 22nd, 2014].  Digital image correlation is a technique in which a random speckle pattern is usually applied to the object so that the movement of its surface can be tracked over time by searching for the new positions of the speckles in successive photographs.  If we use a pair of cameras in a stereoscopic arrangement, then we can measure both in-plane and out-of-plane displacements.  Digital image correlation is a well-established measurement technique that has become ubiquitous in mechanics laboratories.

In previous EU projects, we have developed technology for quantifying uncertainty in in-plane [SPOTS project] and out-of-plane [ADVISE project] measurements in a laboratory environment.  However, when you take digital image correlation equipment into an industrial environment, for instance an aircraft hangar to make measurements during a full-scale test, then additional sources of uncertainty and error appear.  The new technology demonstrated last month allows these additional uncertainties to be quantified.  As part of the MOTIVATE project, we will be involved in a full-scale test on a large section of an Airbus aircraft next year, and so we will be able to utilise the new technology for the first time.
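For readers unfamiliar with how the tracking works, the sketch below illustrates the core idea of matching a speckle subset between a reference and a deformed image using normalised cross-correlation.  It is a minimal illustration in Python and not the algorithm used by our project partners or by commercial DIC software: real systems add sub-pixel interpolation, subset shape functions and stereo calibration, and the subset and search-window sizes here are arbitrary.

import numpy as np

def zncc(a, b):
    # Zero-normalised cross-correlation between two equal-sized subsets.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def track_subset(ref_img, def_img, centre, half_size=15, search=10):
    # Find the integer-pixel displacement of a speckle subset centred at
    # 'centre' (row, col) in the reference image by searching a window of
    # +/- 'search' pixels in the deformed image for the best correlation.
    r, c = centre
    subset = ref_img[r - half_size:r + half_size + 1,
                     c - half_size:c + half_size + 1]
    best_score, best_uv = -np.inf, (0, 0)
    for du in range(-search, search + 1):
        for dv in range(-search, search + 1):
            candidate = def_img[r + du - half_size:r + du + half_size + 1,
                                c + dv - half_size:c + dv + half_size + 1]
            score = zncc(subset, candidate)
            if score > best_score:
                best_score, best_uv = score, (du, dv)
    return best_uv, best_score  # displacement (rows, cols) and match quality

Repeating this search over a grid of subsets yields a full-field displacement map; adding a second, calibrated camera in a stereoscopic arrangement is what allows the out-of-plane component to be recovered.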

The photograph shows preparations for the demonstrations in EMPA’s laboratories.  In the foreground is a stereoscopic digital image correlation system with which we are about to measure the deformation of a section of aircraft skin, supplied by Airbus, that has a speckle pattern on its surface and is ready to be loaded in compression by the large servo-hydraulic test machine.

References:

From SPOTS project:

Patterson EA, Hack E, Brailly P, Burguete RL, Saleem Q, Seibert T, Tomlinson RA & Whelan M, Calibration and evaluation of optical systems for full-field strain measurement, Optics and Lasers in Engineering, 45(5):550-564, 2007.

Whelan MP, Albrecht D, Hack E & Patterson EA, Calibration of a speckle interferometry full-field strain measurement system, Strain, 44(2):180-190, 2008.

From ADVISE project:

Hack E, Lin X, Patterson EA & Sebastian CM, A reference material for establishing uncertainties in full-field displacement measurements, Measurement Science and Technology, 26:075004, 2015.

Million to one

‘All models are wrong, but some are useful’ is a quote, usually attributed to George Box, that is often cited in the context of computer models and simulations.  Working out which models are useful can be difficult and it is essential to get it right when a model is to be used to design an aircraft, support the safety case for a nuclear power station or inform regulatory risk assessment on a new chemical.  One way to identify a useful model is to assess its predictions against measurements made in the real world [see ‘Model validation’ on September 18th, 2012].  Many people have worked on validation metrics that allow predicted and measured signals to be compared; and some result in a statement of the probability that the predicted and measured signals belong to the same population.  This works well if the predictions and measurements are, for example, the temperature measured at a single weather station over a period of time; however, these validation metrics cannot handle fields of data, for instance the map of temperature, measured with an infrared camera, in a power station during start-up.

We have been working on resolving this issue and we have recently published a paper on ‘A probabilistic metric for the validation of computational models’.  We reduce the dimensionality of a field of data, represented by values in a matrix, to a vector using orthogonal decomposition [see ‘Recognizing strain’ on October 28th, 2015].  The data field could be a map of temperature, the strain field in an aircraft wing or the topography of a landscape – it does not matter.  The decomposition is performed separately and identically on the predicted and measured data fields to create two vectors – one each for the predictions and the measurements.  We look at the differences between these two vectors and compare them against the uncertainty in the measurements to arrive at a probability that the predictions belong to the same population as the measurements.  There are subtleties in the process that I have omitted but, essentially, we can take two data fields composed of millions of values and arrive at a single number that describes the usefulness of the model’s predictions.
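For the curious, here is a highly simplified sketch of the idea in Python.  It is not the formulation in our paper: it uses a singular-value decomposition of the measured field as the orthogonal basis and a naive Gaussian treatment of the measurement uncertainty, whereas the published metric makes different choices; the number of modes and the uncertainty value are placeholders.

import numpy as np
from scipy import stats

def feature_vector(field, n_modes, basis=None):
    # Project a 2-D data field onto an orthogonal basis; here the basis is
    # taken from a singular-value decomposition, purely for illustration.
    if basis is None:
        u, _, vt = np.linalg.svd(field, full_matrices=False)
        basis = (u[:, :n_modes], vt[:n_modes, :])
    u_k, vt_k = basis
    coeffs = np.diag(u_k.T @ field @ vt_k.T)  # one coefficient per mode
    return coeffs, basis

def validation_probability(measured, predicted, u_meas, n_modes=10):
    # Decompose both fields identically, compare the difference of the two
    # coefficient vectors with the measurement uncertainty u_meas, and return
    # a probability that the prediction belongs to the same population as the
    # measurement (a simplification of the published metric).
    s_meas, basis = feature_vector(measured, n_modes)
    s_pred, _ = feature_vector(predicted, n_modes, basis=basis)
    diff = np.abs(s_pred - s_meas)
    p_each = 2.0 * (1.0 - stats.norm.cdf(diff, scale=u_meas))
    return float(np.mean(p_each))

A call such as validation_probability(measured_map, predicted_map, u_meas=0.5) condenses two fields of millions of values into a single number between 0 and 1, which is the spirit of the metric described above.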

Our paper was published by the Royal Society with a press release, but in the same week as the proposed Brexit agreement, and so I would like to think that it was ignored due to the overwhelming interest in the political storm around Brexit rather than due to its esoteric nature.

Source:

Dvurecenska K, Graham S, Patelli E & Patterson EA, A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.

Fourth industrial revolution

Have you noticed that we are in the throes of a fourth industrial revolution?

The first industrial revolution occurred towards the end of the 18th century with the introduction of steam power and mechanisation.  The second industrial revolution took place at the end of the 19th and beginning of the 20th century and was driven by the invention of electrical devices and mass production.  The third industrial revolution was brought about by computers and automation at the end of the 20th century.  The fourth industrial revolution is happening as a result of combining physical and cyber systems.  It is also called Industry 4.0 and is seen as the integration of additive manufacturing, augmented reality, Big Data, cloud computing, cyber security, Internet of Things (IoT), simulation and systems engineering.  Most organisations are struggling with the integration process and, as a consequence, are only exploiting a fraction of the capabilities of the new technology.  Revolutions are, by their nature, disruptive and those organisations that embrace and exploit the innovations will benefit while the existence of the remainder is under threat [see ‘The disrupting benefit of innovation’ on May 23rd, 2018].

Our work on the Integrated Nuclear Digital Environment, on Digital Twins, in the MOTIVATE project and on hierarchical modelling in engineering and biology is all part of the revolution.

Links to these research posts:

‘Enabling or disruptive technology for nuclear engineering?’ on January 28th, 2015

‘Can you trust your digital twin?’ on November 23rd, 2016

‘Getting Smarter’ on June 21st, 2017

‘Hierarchical modelling in engineering and biology’ on March 14th, 2018

 

Image: Christoph Roser at AllAboutLean.com from https://commons.wikimedia.org/wiki/File:Industry_4.0.png [CC BY-SA 4.0].

Spontaneously MOTIVATEd

Some posts arise spontaneously, stimulated by something that I have read or done, while others are part of a commitment to communicate on a topic related to my research or teaching, such as the CALE series.  The motivation for a post seems unrelated to its popularity.  This post is part of that commitment to communicate.

After 12 months, our EU-supported research project, MOTIVATE [see ‘Getting Smarter‘ on June 21st, 2017], is one-third complete in terms of time; and, as in all research, it appears to have made a slow start with much effort expended on conceptualizing, planning, reviewing prior research and discussions.  However, we are on schedule and have delivered on one of our four research tasks, with the result that we have a new validation metric and a new flowchart for the validation process.  The validation metric was revealed at the Photomechanics 2018 conference in Toulouse earlier this year [see ‘Massive Engineering‘ on April 4th, 2018].  The new flowchart [see the graphic] is the result of a brainstorming session [see ‘Brave New World‘ on January 10th, 2018] and much subsequent discussion, and will be presented at a conference in Brussels next month [ICEM 2018] at which we will invite feedback [proceedings paper].  The big change from the classical flowchart [see, for example, the ASME V&V guide] is the inclusion of historical data, with the possibility of not requiring experiments to provide data for validation purposes. This is probably a paradigm shift for the engineering community, or at least the V&V [Verification & Validation] community.  So, we are expecting some robust feedback – feel free to comment on this blog!
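To make the change concrete, the fragment below caricatures the new decision step in Python.  It is not the MOTIVATE flowchart itself – the function names and the adequacy test are invented for illustration – but it captures the shift: if suitable historical data exist, they can be used for validation without commissioning a new experiment.

def select_validation_data(historical_data, is_adequate, plan_experiment):
    # Caricature of the new step in the validation flowchart: historical data
    # may substitute for a dedicated validation experiment.  'is_adequate' and
    # 'plan_experiment' are hypothetical callables supplied by the analyst.
    if historical_data is not None and is_adequate(historical_data):
        # Short-circuit the classical route: no new experiment required.
        return historical_data
    # Otherwise fall back to the classical route and design a new experiment.
    return plan_experiment()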

References:

Hack E, Burguete RL, Dvurecenska K, Lampeas G, Patterson EA, Siebert T & Szigeti E, Steps toward industrial validation experiments, In Proceedings Int. Conf. Experimental Mechanics, Brussels, July 2018 [pdf here].

Dvurecenska K, Patelli E & Patterson EA, What’s the probability that a simulation agrees with your experiment? In Proceedings Photomechanics 2018, Toulouse, March 2018.