Category Archives: mechanics

In Einstein’s footprints?

Grand Hall of the Guild of Carpenters, Zurich

During the past week, I have been working with members of my research group on a series of papers for a conference in the USA that a small group of us will be attending in the summer.  Dissemination is an important step in the research process; there is no point in doing the research if we lock the results away in a desk drawer and forget about them.  Nowadays, the funding organisations that support our research expect to see a plan of dissemination as part of our proposals for research; and hence, we have an obligation to present our results to the scientific community as well as to communicate them more widely, for instance through this blog.

That’s all fine; but nevertheless, I don’t find most conferences a worthwhile experience.  Often, there are too many uncoordinated sessions running in parallel that contain presentations describing tiny steps forward in knowledge and understanding which fail to compel your attention [see ‘Compelling presentations‘ on March 21st, 2018].  Of course, they can provide an opportunity to network, especially for those researchers in the early stages of their careers; but, in my experience, they are rarely the location for serious intellectual discussion or debate.  This is more likely to happen in small workshops focussed on a ‘hot-topic’ and with a carefully selected eclectic mix of speakers interspersed with chaired discussion sessions.

I have been involved in organising a number of such workshops in Glasgow, London, Munich and Shanghai over the last decade.  The next one will be in Zurich in November 2019 in the Guild Hall of Carpenters (Zunfthaus zur Zimmerleuten), where Einstein lectured in November 1910 to the Zurich Physical Society ‘On Boltzmann’s principle and some of its direct consequences’.  Our subject will be different: ‘Validation of Computational Mechanics Models’; but we hope that the debate on credible models, multi-physics simulations and surviving with experimental data will be as lively as in 1910.  If you would like to contribute then download the pdf from this link; and if you would just like to attend the one-day workshop then we will be announcing registration soon and there is no charge!

We have published the outcomes from some of our previous workshops:

Advances in Validation of Computational Mechanics Models (from the 2014 workshop in Munich), Journal of Strain Analysis, vol. 51, no. 1, 2016.

Strain Measurement in Extreme Environments (from the 2012 workshop in Glasgow), Journal of Strain Analysis, vol. 49, no. 4, 2014.

Validation of Computational Solid Mechanics Models (from the 2011 workshop in Shanghai), Journal of Strain Analysis, vol. 48, no. 1, 2013.

The workshop is supported by the MOTIVATE project and further details are available at http://www.engineeringvalidation.org/4th-workshop

The MOTIVATE project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 754660.

INSTRUCTIVE final reckoning

Our EU project, INSTRUCTIVE, came to an end with the close of 2018.  We have achieved all of our milestones and deliverables and now have 51 (=60-9) days to submit our final reports.  We have already presented the technical contents of those reports to representatives of our sponsors in a final review meeting just before the Christmas break.  I think that they were pleased with our progress; our findings certainly stimulated debate about how to move forward and implement the new technologies – lots of new questions that we did not know we should be asking when we started the project.

We are also disseminating the key results more publicly, because this is an obligation inherent in receiving public funding for our research, but also because I see no purpose in advancing knowledge without sharing it.  During the course of the project, we have given research updates at three conferences and the papers/abstracts for these are available via the University of Liverpool Repository [#1, #2 & #3].  We are also in the process of producing three papers for publication in archival journals.

However, the real tangible benefit of the project is the move to the next stage of development for the technology, supported by a new project, called DIMES, that started on January 1st, 2019.  The aim of the DIMES project is to develop and demonstrate systems with the capability to detect a crack or delamination in a metallic or composite structure, and with the potential to be deployed as part of an on-board structural health monitoring system for passenger aircraft.  In other words, the INSTRUCTIVE project has successfully demonstrated that a new philosophy for monitoring damage in aerospace structures, using the disturbances to the strain field caused by the damage, is at least as effective as traditional non-destructive evaluation (NDE) techniques and in some circumstances provides much more sensitivity to the initiation and propagation of damage.  This approach has been sufficiently successful in the laboratory, and on aircraft components in an industrial environment, that it is worth exploring its deployment for on-board monitoring; the first step is to use it in ground-based tests.
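To make the underlying philosophy a little more concrete, here is a minimal sketch in Python of how a damage indicator could be constructed from disturbances to a measured strain field.  The function name, the simple baseline subtraction and the threshold parameter k are my own assumptions for illustration; they are not the data-processing scheme used in the INSTRUCTIVE or DIMES projects.

```python
import numpy as np

def flag_strain_disturbance(baseline, current, noise_std, k=3.0):
    """Illustrative damage indicator (an assumption, not the project's algorithm):
    a crack or delamination perturbs the local strain field, so flag any point
    where the current strain map departs from the undamaged baseline by more
    than k standard deviations of the measurement noise."""
    deviation = np.abs(current - baseline)   # change in strain relative to the undamaged state
    suspect = deviation > k * noise_std      # boolean map of regions worth inspecting
    return suspect, deviation

# Example with synthetic strain maps (values in microstrain)
rng = np.random.default_rng(0)
baseline = 500.0 + 5.0 * rng.standard_normal((100, 100))
current = baseline + 5.0 * rng.standard_normal((100, 100))
current[40:45, 40:60] += 80.0                # a localised disturbance mimicking a crack
suspect, _ = flag_strain_disturbance(baseline, current, noise_std=7.0)
print(suspect.sum(), "points flagged")
```

In practice the baseline would come from a measurement of the undamaged structure, or from a validated model of it, and the threshold would be tuned against the measurement uncertainty of the strain-sensing technique.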

There will be more on DIMES as the project gets underway and updates on its progress will replace the twice-yearly ones on INSTRUCTIVE.

The series of posts on the INSTRUCTIVE project can be found at https://realizeengineering.blog/category/myresearch/instructive-project/


Industrial uncertainty

Last month I spent almost a week in Zurich.  It is one of our favourite European cities [see ‘A reflection of existentialism’ on December 20th, 2017]; however, on this occasion there was no time for sight-seeing because I was there for the mid-term meeting of the MOTIVATE project and to conduct some tests and demonstrations in the laboratories of our host, EMPA, the Swiss Federal Laboratories for Materials Science and Technology.  Two of our project partners, Dantec Dynamics GmbH based in Ulm, Germany, and the Athena Research Centre in Patras, Greece, have developed methods for quantifying the uncertainty present in measurements of deformation made in an industrial environment using digital image correlation (DIC) [see ‘256 shades of grey’ on January 22nd, 2014].  Digital image correlation is a technique in which we usually apply a random speckle pattern to the object and then track the movement of the object’s surface over time by searching for the new positions of the speckles in successive photographs of the object.  If we use a pair of cameras in a stereoscopic arrangement, then we can measure both in-plane and out-of-plane displacements.  Digital image correlation is a well-established measurement technique that has become ubiquitous in mechanics laboratories.  In previous EU projects, we have developed technology for quantifying uncertainty in in-plane [SPOTS project] and out-of-plane [ADVISE project] measurements in a laboratory environment.  However, when you take digital image correlation equipment into an industrial environment, for instance an aircraft hangar to make measurements during a full-scale test, then additional sources of uncertainty and error appear.  The new technology demonstrated last month allows these additional uncertainties to be quantified.  As part of the MOTIVATE project, we will be involved in a full-scale test on a large section of an Airbus aircraft next year, and so we will be able to utilise the new technology for the first time.
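As a rough illustration of the core idea behind digital image correlation, the sketch below tracks a single speckle subset between two images by maximising the zero-normalised cross-correlation over integer-pixel shifts.  It uses only Python and numpy; the subset and search-window sizes are arbitrary assumptions, and real DIC systems add sub-pixel interpolation, subset shape functions and, for stereoscopic measurements, camera calibration.

```python
import numpy as np

def track_subset(ref_img, def_img, centre, subset=21, search=10):
    """Find the integer-pixel displacement (u, v) of one speckle subset by
    maximising the zero-normalised cross-correlation (ZNCC) between the
    reference subset and candidate subsets of the deformed image.
    'centre' must be far enough from the image edges for all trial shifts."""
    half = subset // 2
    cy, cx = centre
    ref = ref_img[cy - half:cy + half + 1, cx - half:cx + half + 1].astype(float)
    ref = (ref - ref.mean()) / ref.std()
    best_score, best_uv = -np.inf, (0, 0)
    for dv in range(-search, search + 1):
        for du in range(-search, search + 1):
            cand = def_img[cy + dv - half:cy + dv + half + 1,
                           cx + du - half:cx + du + half + 1].astype(float)
            cand = (cand - cand.mean()) / cand.std()
            score = np.mean(ref * cand)      # ZNCC score, in [-1, 1]
            if score > best_score:
                best_score, best_uv = score, (du, dv)
    return best_uv

# Synthetic check: shift a random speckle image by (u, v) = (3, -2) pixels
rng = np.random.default_rng(1)
ref = rng.random((200, 200))
deformed = np.roll(ref, shift=(-2, 3), axis=(0, 1))
print(track_subset(ref, deformed, centre=(100, 100)))   # expect (3, -2)
```

Repeating this for a grid of subsets yields a full-field displacement map, and differentiating that map gives the strain field; quantifying how image noise, lighting and vibration in an industrial environment propagate into those maps is precisely the uncertainty problem addressed in the MOTIVATE project.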

The photograph shows preparations for the demonstrations in EMPA’s laboratories.  In the foreground is a stereoscopic digital image correlation system set up to measure the deformation of a section of aircraft skin, supplied by Airbus, which has a speckle pattern on its surface and is about to be loaded in compression by the large servo-hydraulic test machine.

References:

From SPOTS project:

Patterson EA, Hack E, Brailly P, Burguete RL, Saleem Q, Seibert T, Tomlinson RA & Whelan M, Calibration and evaluation of optical systems for full-field strain measurement, Optics and Lasers in Engineering, 45(5):550-564, 2007.

Whelan MP, Albrecht D, Hack E & Patterson EA, Calibration of a speckle interferometry full-field strain measurement system, Strain, 44(2):180-190, 2008.

From ADVISE project:

Hack E, Lin X, Patterson EA & Sebastian CM, A reference material for establishing uncertainties in full-field displacement measurements, Measurement Science and Technology, 26:075004, 2015.

Million to one

‘All models are wrong, but some are useful’ is a quote, usually attributed to George Box, that is often cited in the context of computer models and simulations.  Working out which models are useful can be difficult, and it is essential to get it right when a model is to be used to design an aircraft, support the safety case for a nuclear power station or inform regulatory risk assessment on a new chemical.  One way to identify a useful model is to assess its predictions against measurements made in the real world [see ‘Model validation’ on September 18th, 2012].  Many people have worked on validation metrics that allow predicted and measured signals to be compared; and some result in a statement of the probability that the predicted and measured signals belong to the same population.  This works well if the predictions and measurements are, for example, the temperature measured at a single weather station over a period of time; however, these validation metrics cannot handle fields of data, for instance the map of temperature, measured with an infrared camera, in a power station during start-up.  We have been working on resolving this issue and we have recently published a paper on ‘A probabilistic metric for the validation of computational models’.  We reduce the dimensionality of a field of data, represented by values in a matrix, to a vector using orthogonal decomposition [see ‘Recognizing strain’ on October 28th, 2015].  The data field could be a map of temperature, the strain field in an aircraft wing or the topology of a landscape – it does not matter.  The decomposition is performed separately and identically on the predicted and measured data fields to create two vectors – one each for the predictions and the measurements.  We look at the differences between these two vectors and compare them against the uncertainty in the measurements to arrive at a probability that the predictions belong to the same population as the measurements.  There are subtleties in the process that I have omitted, but essentially we can take two data fields composed of millions of values and arrive at a single number that describes the usefulness of the model’s predictions.
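To give a flavour of how such a metric can work, here is a minimal Python sketch that decomposes the predicted and measured fields onto the same two-dimensional Chebyshev basis and then reports the fraction of coefficient differences that fall within an expanded measurement uncertainty.  The choice of basis, the polynomial order, the scalar uncertainty and the simple counting rule are all assumptions made for illustration; the metric in our paper is constructed more carefully.

```python
import numpy as np
from numpy.polynomial import chebyshev

def decompose(field, order=10):
    """Orthogonal decomposition of a 2D data field: a least-squares fit of the
    field to a 2D Chebyshev polynomial basis, returning the coefficient vector
    that acts as a compact feature vector for the field."""
    ny, nx = field.shape
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)                 # pixel coordinates mapped onto [-1, 1] x [-1, 1]
    V = chebyshev.chebvander2d(X.ravel(), Y.ravel(), [order, order])
    coeffs, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)
    return coeffs

def validation_probability(predicted, measured, u_meas, order=10):
    """Illustrative (not the published) metric: decompose both fields identically
    and report the fraction of coefficient differences that lie within the
    expanded (k=2) measurement uncertainty u_meas."""
    c_pred = decompose(predicted, order)
    c_meas = decompose(measured, order)
    return float(np.mean(np.abs(c_pred - c_meas) <= 2.0 * u_meas))

# Example: a smooth 'measured' field with noise versus a noise-free prediction
rng = np.random.default_rng(2)
ny, nx = 120, 160
y, x = np.linspace(-1, 1, ny), np.linspace(-1, 1, nx)
Y, X = np.meshgrid(y, x, indexing="ij")
truth = np.sin(3 * X) * np.cos(2 * Y)
measured = truth + 0.01 * rng.standard_normal((ny, nx))
predicted = truth
print(validation_probability(predicted, measured, u_meas=0.02))
```

The same two functions would be applied unchanged to a temperature map, a strain field or a topology; only the uncertainty estimate changes with the measurement technique.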

Our paper was published by the Royal Society with a press release, but in the same week as the proposed Brexit agreement; so I would like to think that it was ignored because of the overwhelming interest in the political storm around Brexit rather than because of its esoteric nature.

Source:

Dvurecenska K, Graham S, Patelli E & Patterson EA, A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.