
Risky predictions


Risk is a much misunderstood word.  In a technical sense, it is the probability of something happening multiplied by the consequences when it does [see post on Risk Definition, September 20th, 2012].  Tight regulation and good engineering could reduce the probability of earthquakes induced by fracking, and such earthquakes tend not to produce structural damage, i.e. the consequences are low; so perhaps it is reasonable to conclude that the risks are low, because two small quantities multiplied together do not produce a big quantity [see last week’s post on ‘Fracking’, 28th August, 2013].
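That technical definition is just a multiplication, which a toy calculation makes concrete (the probability and severity values below are invented for illustration, not real seismic data):

```python
# Technical risk: probability of an event multiplied by its consequences.
# All numbers here are hypothetical, chosen only to illustrate the point.
def risk(probability, consequence):
    return probability * consequence

# A rare, low-consequence event (e.g. a small induced tremor)...
small = risk(probability=0.001, consequence=10.0)   # arbitrary severity units
# ...versus a common, mildly damaging one.
large = risk(probability=0.2, consequence=500.0)

print(small, large)  # two small quantities multiply to a small risk
```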

The more common definition of risk is the probability of a loss, injury or damage occurring, i.e. severity is ignored.  Probability is used to describe the frequency of occurrence of an event.  A classic example is tossing a fair coin, which will come down heads 50% of the time.  This is a simple game of chance that can be played repeatedly to establish the frequency of the event.  It is impractical to use this approach to establish the probability of fracking causing an earthquake, so instead engineers and scientists must simulate the event using computer models.  One approach to simulation is to generate a set of models, each based on a slightly different set of realistic conditions and assumptions, and look at what percentage of the models predict earthquakes, which can be equated to the probability of a fracking-induced earthquake.  When the set of conditions is generated randomly, this approach is known as Monte Carlo simulation.  Weather forecasters use simulations of this type to predict the probability of rain or sunshine tomorrow.
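That ensemble idea can be sketched in a few lines of Python.  The ‘model’ below is a made-up threshold rule (a random stress compared with a random strength), not a real geomechanical simulation; only the counting logic is the point:

```python
import random

def model_predicts_quake(rng):
    """One toy simulation run: draw a 'stress' and a 'strength' from
    assumed ranges and predict a quake if the stress wins.  Both ranges
    are invented for illustration."""
    stress = rng.uniform(0.0, 1.0)    # hypothetical injection-induced stress
    strength = rng.uniform(0.5, 1.5)  # hypothetical rock strength
    return stress > strength

def monte_carlo_probability(n_runs=100_000):
    """Run many randomised models and equate the fraction predicting a
    quake with the probability of the event."""
    rng = random.Random(1)  # fixed seed for a reproducible estimate
    quakes = sum(model_predicts_quake(rng) for _ in range(n_runs))
    return quakes / n_runs

print(monte_carlo_probability())  # fraction of runs that predicted a quake
```

With these assumed ranges the exact answer is 0.125, so the estimate should land close to that; the same counting scheme applies however complicated each individual model run becomes.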

The reliability of a simulation depends on the model adequately describing the physical world.  We can test this (known as validating the model) by comparing predicted outcomes with real-world outcomes [see post on 18th September, 2012 on ‘model validation’].  The quality of the comparison can be expressed as a level of confidence, usually as a percentage.  Crudely speaking, this percentage can be equated to the frequency with which the model will correctly predict an event, i.e. the probability that the model is reliable, so if we are 90% confident then we would expect the model to correctly predict an event 9 out of 10 times.  In other words, there would be a 10% ‘risk’ that the model could be wrong.
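Crude as it is, this frequency reading of confidence is easy to simulate: treat a 90% confidence level as a model that is right 9 times out of 10 and count correct predictions over many events.  The ‘model’ here is just a biased coin, an assumption made purely for illustration:

```python
import random

def count_correct(confidence, n_events, rng):
    """Treat a validation confidence level as the frequency with which
    the model gets an individual event right (the crude reading above)."""
    return sum(rng.random() < confidence for _ in range(n_events))

rng = random.Random(0)  # fixed seed for reproducibility
correct = count_correct(confidence=0.90, n_events=10_000, rng=rng)
print(correct)  # roughly 9,000 of 10,000, i.e. a ~10% 'risk' of being wrong
```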

In practice we cannot easily calculate the probability of a fracking-induced earthquake because it is such a complex process.  Validating a model of fracking is also a challenge because of the lack of real examples, so establishing confidence is difficult.  As a consequence, we tend to be left weighing unquantified risks in a subjective manner, which is why there is so much debate.

If you made it this far – well done and thank you!   If you want more on weather forecasting and extending these ideas to economic forecasting see  John Kay’s article in the Financial Times on August 14th, 2013 entitled ‘Spotting a banking crisis is not like predicting the weather’ [ http://www.ft.com/cms/s/0/fdd0c5bc-0367-11e3-b871-00144feab7de.html#axzz2dNrTKPDy ].

Model validation

Front cover of ASME V&V 10-2006, Guide for verification and validation in computational solid mechanics, American Society of Mechanical Engineers, New York, 2006.

Why is validation important?  Validation of computational mechanics models is defined as ‘determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model’, according to  ASME V&V 10-2006.  So, the validation of models of structural integrity for engineering design provides information about the degree to which the simulation results from the model can be believed.  This in turn helps in making decisions about how little material, and in what configuration, should be used to create elegant, sustainable designs that are unlikely to fail. So validation of computational mechanics models is an essential step in solving the ‘two earths’ dilemma (see post on August 13th, 2012).

Model credibility

Last week I spoke at the annual conference of the Associazione Italiana per l’Analisi delle Sollecitazioni in Vicenza, Italy on the role of experimental mechanics in the validation of computational models used in engineering simulations.  We discussed the conflict between reducing cost and energy consumption and increasing performance and reliability of engineering machines and vehicles.  Generally, the former implies using less material more efficiently, while the latter tends to require the use of more material.  Engineers resolve this conflict by using computational models to simulate engineering behaviour when optimising designs.  The development of elegant and successful designs requires a high level of credibility in the models.  This credibility can be established by comparing the results from models with those from specially-conducted experiments; a process that is known as ‘validation’.

Hot stuff

Amplitude of temperature fluctuations in a turbine blade from a jet engine during a vibration test at 700 Hz

There have been no postings for a while because I have been away.  Last week I organised a workshop in Glasgow for engineers in industry and academia on ‘Strain Measurements in Extreme Environments’.  Although this included making measurements on large and fast engineering components, half of the workshop was focussed on evaluating strain at high temperatures, 1000°C to 2000°C, which is hot by most standards.  This is beyond the operating range of most sensors, and most materials that remain solid at these temperatures glow, which makes optical measurements challenging.

So why are we interested?  For hypersonic flight, including applications such as delivering satellites into orbit, and because engines become more efficient when operating at high temperatures.
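The efficiency claim has a textbook basis: the Carnot limit says the ideal efficiency of a heat engine rises with the temperature at which heat is supplied.  A minimal illustration (the turbine temperatures are round numbers chosen for the example, not data from any particular engine):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Upper bound on heat-engine efficiency; temperatures in kelvin."""
    return 1.0 - t_cold_k / t_hot_k

# A hotter turbine entry temperature raises the ideal efficiency.
print(carnot_efficiency(1300.0, 300.0))  # ≈ 0.77
print(carnot_efficiency(1700.0, 300.0))  # ≈ 0.82
```

Real engines fall well short of the Carnot limit, but the trend is the same, which is why high-temperature strain measurement matters.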

Can we do it?  Not yet in the real world, but in a laboratory environment some research groups have successfully used digital image correlation, with ceramic particles creating a textured pattern on the hot surface that can be tracked as the hot stuff deforms.
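The tracking step at the heart of digital image correlation can be sketched in one dimension: slide a reference speckle pattern over the deformed image and keep the shift with the smallest mismatch.  The intensity values below are invented, and real DIC works on two-dimensional images with sub-pixel interpolation; this only shows the matching idea:

```python
def best_shift(reference, deformed, max_shift=5):
    """Find the integer shift minimising the sum of squared differences
    between a reference speckle pattern and the deformed image -- the
    core matching step behind digital image correlation."""
    best, best_err = 0, float("inf")
    for shift in range(-max_shift, max_shift + 1):
        err = 0.0
        for i, r in enumerate(reference):
            j = i + shift
            if 0 <= j < len(deformed):
                err += (r - deformed[j]) ** 2
            else:
                err += r ** 2  # penalise pixels that leave the window
        if err < best_err:
            best, best_err = shift, err
    return best

# A made-up speckle pattern carried by the surface...
pattern = [0, 3, 7, 2, 9, 4, 1, 8, 5, 6]
# ...and the same pattern after the surface moved 2 pixels to the right.
moved = [0, 0] + pattern[:-2]
print(best_shift(pattern, moved))  # → 2, the recovered displacement
```

Repeating this for many small subsets of the image gives a displacement field, from which strain follows by differentiation.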