Tag Archives: simulation

Reduction in usefulness of reductionism

A couple of months ago I wrote about a set of credibility factors for computational models [see ‘Credible predictions for regulatory decision-making’ on December 9th, 2020] that we designed to inform interactions between researchers, model builders and decision-makers, and to establish trust in the predictions from computational models [1].  This is important because computational modelling is becoming ubiquitous in the development of everything from automobiles and power stations to drugs and vaccines, which inevitably leads to its use in supporting regulatory applications.  However, there is another motivation underpinning our work: the systems being modelled are becoming increasingly complex and are likely to exhibit emergent behaviour [see ‘Emergent properties’ on September 16th, 2015], which makes it increasingly unlikely that a reductionist approach to establishing model credibility will be successful [2].  The reductionist approach to science, pioneered by Descartes and Newton, has served science well for hundreds of years and is based on the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts.  It is the method of analysis that underpins almost everything you learn as an undergraduate engineer or physicist.  However, reductionism loses its power when a system is more than the sum of its parts, i.e., when it exhibits emergent behaviour.  Our approach to establishing model credibility is more holistic than traditional methods.  This seems appropriate when modelling complex systems for which a complete knowledge of the relationships and patterns of behaviour may not be attainable, e.g., when unexpected or unexplainable emergent behaviour occurs [3].  The hegemony of reductionism in science made us nervous about writing about its shortcomings four years ago when we first published our ideas about model credibility [2].  So, I was pleased to see a paper published last year [4] that identified five fundamental properties of biology that weaken the power of reductionism, namely: (1) biological variation is widespread and persistent, (2) biological systems are relentlessly nonlinear, (3) biological systems contain redundancy, (4) biology consists of multiple systems interacting across different time and spatial scales, and (5) biological properties are emergent.  Many engineered systems possess all five of these fundamental properties – you just need to look at them from the appropriate perspective, for example through a microscope to see the variation in the microstructure of a mass-produced part.  Hence, in the future, there will need to be an increasing emphasis on holistic approaches and systems thinking in both the education and practice of engineers as well as biologists.

For more on emergence in computational modelling see Manuel DeLanda, Philosophy and Simulation: The Emergence of Synthetic Reason, Continuum, London, 2011; and for more on systems thinking see Fritjof Capra and Pier Luigi Luisi, The Systems View of Life: A Unifying Vision, Cambridge University Press, 2014.

References:

[1] Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.

[2] Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology, Progress in Biophysics and Molecular Biology, 129: 13-19, 2017.

[3] Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.

[4] Pruett WA, Clemmer JS & Hester RL, Physiological Modeling and Simulation—Validation, Credibility, and Application. Annual Review of Biomedical Engineering, 22:185-206, 2020.

Going against the flow

Last week I wrote about research we have been carrying out over the last decade that is being applied to large-scale structures in the aerospace industry (see ‘Slowly crossing the valley of death’ on January 27th, 2021). I also work on very much smaller ‘structures’ that are only tens of nanometres in diameter, or about a billion times smaller than the test samples in last week’s post (see ‘Toxic nanoparticles?’ on November 13th, 2013). The connection is the use of light to measure shape, deformation and motion, and then utilising the measurements to validate predictions from theoretical or computational models. About three years ago, we published research which demonstrated that the motion of very small particles (less than about 300 nanometres) at low concentrations (less than about a billion per millilitre) in a fluid is dominated by the molecules of the fluid rather than by interactions between the particles (see Coglitore et al, 2017 and ‘Slow moving nanoparticles’ on December 13th, 2017). These data confirmed results from earlier molecular dynamics simulations that contradicted predictions using the Stokes-Einstein equation, which was derived by Einstein in his PhD thesis for a ‘Stokes’ particle undergoing Brownian motion. The Stokes-Einstein equation works well for large particles, but the physics of motion changes when the particles are very small and far apart, so that van der Waals and electrostatic forces play a dominant role, as we have shown in a more recent paper (see Giorgi et al, 2019).  This becomes relevant when evaluating nanoparticles as potential drug delivery systems or assessing their toxicological impact.  We have shown recently that instruments based on dynamic scattering of light from nanoparticles are likely to be inaccurate because they rely on fitting measurement data to the Stokes-Einstein equation.  In a paper published last month, we found that asymmetric flow field-flow fractionation (AF4) combined with dynamic light scattering, when used to measure the size of nanoparticles in suspension, tended to over-estimate the diameter of particles smaller than 60 nanometres at low concentrations by up to a factor of two (see Giorgi et al, 2021).  Someone commented recently that our work in this area is not highly cited, but perhaps this is unsurprising when it undermines a current paradigm.  We have certainly learnt to handle rejection letters, to redouble our efforts to demonstrate the rigour of our research, and to present conclusions in a manner that appears to build on existing knowledge rather than demolish it.
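For readers who want to see the arithmetic behind that last point, the classical Stokes-Einstein equation gives the diffusion coefficient of a spherical particle as D = kT/(3πηd), where k is Boltzmann’s constant, T is the absolute temperature, η is the viscosity of the fluid and d is the hydrodynamic diameter of the particle. Instruments based on the dynamic scattering of light essentially measure D and invert this relationship to report a size; so, if a small, dilute particle diffuses more slowly than the classical equation predicts, the inferred diameter comes out larger than the true one. The short sketch below only illustrates that inversion; the numerical values are assumptions chosen for illustration, not data from our papers.

```python
from math import pi

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diameter(diffusion_coefficient, temperature=298.15, viscosity=8.9e-4):
    """Hydrodynamic diameter (m) inferred from a diffusion coefficient (m^2/s)
    by inverting the classical Stokes-Einstein equation, d = kT / (3*pi*eta*D).
    Defaults assume water at about 25 degrees C (viscosity in Pa.s)."""
    return K_B * temperature / (3 * pi * viscosity * diffusion_coefficient)

# Illustrative (assumed) numbers: Stokes-Einstein predicts D of roughly
# 9.8e-12 m^2/s for a 50 nm sphere in water at room temperature.
d_classical = stokes_einstein_diameter(9.8e-12)
print(f"{d_classical * 1e9:.0f} nm")   # ~50 nm

# If the particle actually diffuses 30% more slowly than the classical
# prediction, inverting the same equation over-estimates its size.
d_apparent = stokes_einstein_diameter(0.7 * 9.8e-12)
print(f"{d_apparent * 1e9:.0f} nm")    # ~71 nm, an apparent over-estimate
```

The point of the sketch is simply that any instrument which assumes the classical equation will translate a deviation in diffusion behaviour directly into an error in reported size, which is the effect we observed for small particles at low concentrations.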

Sources:

Coglitore, D., Edwardson, S.P., Macko, P., Patterson, E.A. and Whelan, M., 2017. Transition from fractional to classical Stokes–Einstein behaviour in simple fluids. Royal Society Open Science, 4(12), p.170507.

Giorgi, F., Coglitore, D., Curran, J.M., Gilliland, D., Macko, P., Whelan, M., Worth, A. and Patterson, E.A., 2019. The influence of inter-particle forces on diffusion at the nanoscale. Scientific Reports, 9(1), pp.1-6.

Giorgi, F., Curran, J.M., Gilliland, D., La Spina, R., Whelan, M.P. & Patterson, E.A., 2021. Limitations of nanoparticles size characterization by asymmetric flow field-fractionation coupled with online dynamic light scattering. Chromatographia, doi.org/10.1007/s10337-020-03997-7.

Image is a photograph of a fast-flowing mountain river taken in Yellowstone National Park during a road trip across the USA in 2006.

Slowly crossing the valley of death

The valley of death in technology development is well known amongst research engineers and their sponsors. It is the gap between discovery and application, or between the realization of an idea in a laboratory and its implementation in the real world. Some of my research has made it across the valley of death, for example the poleidoscope about 15 years ago (see ‘Poleidoscope (=polariscope+kaleidoscope)’ on October 14th, 2020).  Our work on quantitative comparisons of data fields from physical measurements and computer predictions is about three-quarters of the way across the valley.  We published a paper in December (see Dvurecenska et al, 2020) on its application to a large panel from the fuselage of an aircraft, based on work we completed as part of the MOTIVATE project.  I reported the application of the research almost in real time in a post in December 2018 (see ‘Industrial Uncertainty’ on December 12th, 2018) and in further detail in May 2020 as we submitted the manuscript for publication (see ‘Alleviating industrial uncertainty’ on May 13th, 2020).  However, the realization in the laboratory occurred nearly a decade ago when teams from Michigan State University and the University of Liverpool came together in the ADVISE project funded by the EU Framework 7 programme (see Wang et al, 2011). Subsequently, the team at Michigan State University moved to the University of Liverpool and, in collaboration with researchers at Empa, developed the technique that was applied in the MOTIVATE project (see Sebastian et al, 2013). The work published in December represents a step into the valley of death: from a university environment into a full-scale test laboratory at Empa using a real piece of an aircraft.  The MOTIVATE project involved a further step to a demonstration on an ongoing test of a cockpit at Airbus, which was also reported in a post last May (see ‘The blind leading the blind’ on May 27th, 2020).  We are now working with Airbus in a new programme to embed the process of quantitative comparison of fields of measurements and predictions into their routine test procedures for aerospace structures.  So, I would like to think we are climbing out of the valley.
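For those wondering what a ‘quantitative comparison of data fields’ might look like in practice, here is a minimal sketch of the general idea, assuming both fields are available on the same grid: reduce each field to a short feature vector, here simply low-order Chebyshev polynomial coefficients obtained by a least-squares fit, and then check whether the differences between the two vectors lie within the measurement uncertainty. The decomposition, acceptance criterion and function names below are illustrative assumptions; the specific procedures we published in Sebastian et al (2013) and Dvurecenska et al (2020) differ in their details.

```python
import numpy as np
from numpy.polynomial import chebyshev

def feature_vector(field, order=4):
    """Reduce a 2-D data field (e.g. a strain map) to a short vector of
    2-D Chebyshev coefficients fitted by least squares."""
    ny, nx = field.shape
    x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
    basis = chebyshev.chebvander2d(x.ravel(), y.ravel(), [order, order])
    coefficients, *_ = np.linalg.lstsq(basis, field.ravel(), rcond=None)
    return coefficients

def fields_agree(measured, predicted, uncertainty):
    """Accept the prediction if every coefficient difference lies within
    the stated measurement uncertainty (a deliberately simple criterion)."""
    difference = feature_vector(measured) - feature_vector(predicted)
    return bool(np.all(np.abs(difference) <= uncertainty))

# Hypothetical usage with a synthetic field standing in for measured data.
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
measured = np.sin(np.pi * x) * np.cos(np.pi * y)
predicted = measured + 0.01 * np.random.default_rng(0).normal(size=measured.shape)
print(fields_agree(measured, predicted, uncertainty=0.05))
```

The attraction of this kind of approach is that fields containing tens of thousands of data points are condensed to a handful of numbers, so the comparison, and the uncertainty attached to it, becomes tractable for an engineer or decision-maker reviewing a test.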

Image: not Death Valley but taken on a road trip in 2008 somewhere between Moab, UT and Kanab, UT while living in Okemos, MI.

Sources:

Dvurecenska, K., Diamantakos, I., Hack, E., Lampeas, G., Patterson, E.A. and Siebert, T., 2020. The validation of a full-field deformation analysis of an aircraft panel: A case study. The Journal of Strain Analysis for Engineering Design, p.0309324720971140.

Sebastian, C., Hack, E. and Patterson, E., 2013. An approach to the validation of computational solid mechanics models for strain analysis. The Journal of Strain Analysis for Engineering Design, 48(1), pp.36-47.

Wang, W., Mottershead, J.E., Sebastian, C.M. and Patterson, E.A., 2011. Shape features and finite element model updating from full-field strain data. International Journal of Solids and Structures, 48(11-12), pp.1644-1657.

For more posts on the MOTIVATE project: https://realizeengineering.blog/category/myresearch/motivate-project/

The MOTIVATE project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 754660 and the Swiss State Secretariat for Education, Research and Innovation under contract number 17.00064.

The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.

Credible predictions for regulatory decision-making

Regulators are charged with ensuring that manufactured products, from aircraft and nuclear power stations to cosmetics and vaccines, are safe.  The general public seeks certainty that these devices, and the materials and chemicals they are made from, will not harm them or the environment.  Technologists who design and manufacture these products know that absolute certainty is unattainable and near-certainty is unaffordable.  Hence, they attempt to deliver the service or product that society desires while ensuring that the risks are As Low As Reasonably Practicable (ALARP).  The role of regulators is to independently assess the risks, make a judgment on their acceptability and thus decide whether the operation of a power station or the distribution of a vaccine can go ahead.  These are difficult decisions with huge potential consequences – just think of the more than three hundred people killed in the two crashes of Boeing 737 Max airplanes, or the 10,000 or so people affected by birth defects caused by the drug thalidomide.  Evidence presented to support applications for regulatory approval is largely based on physical tests, for example fatigue tests on an aircraft structure or toxicological tests using animals.  In some cases the physical tests might not be entirely representative of the real-life situation, which can make it difficult to make decisions using the data; for instance, a ground test on an airplane is not the same as a flight test, and in many respects the animals used in toxicity testing are physiologically different to humans.  In addition, physical tests are expensive and time-consuming, which both drives up the cost of seeking regulatory approval and slows down the translation of new, innovative products to the market.  The almost ubiquitous use of computer-based simulations to support the research, development and design of manufactured products inevitably leads to their use in supporting regulatory applications.  This creates challenges for regulators who must judge the trustworthiness of predictions from these simulations [see ‘Fake facts & untrustworthy predictions’ on December 4th, 2019].  It is standard practice for modellers to demonstrate the validity of their models; however, validation does not automatically lead to acceptance of predictions by decision-makers.  Acceptance is more closely related to scientific credibility.  I have been working across a number of disciplines on the scientific credibility of models, including in engineering, where multi-physics phenomena are important, such as in hypersonic flight and fusion energy [see ‘Thought leadership in fusion energy’ on October 9th, 2019], and in computational biology and toxicology [see ‘Hierarchical modelling in engineering and biology’ on March 14th, 2018].  Working together with my collaborators in these disciplines, we have developed a common set of factors underpinning scientific credibility that are based on principles drawn from the literature on the philosophy of science and are designed to be both discipline-independent and method-agnostic [Patterson & Whelan, 2019; Patterson et al, 2021].  We hope that our cross-disciplinary approach will break down the subject silos that have become established as different scientific communities have developed their own frameworks for validating models.
As mentioned above, the process of validation tends to be undertaken by model developers and, in some sense, belongs to them, whereas credibility is not exclusive to the developer but is a form of trust that needs to be shared with the decision-makers who seek to use the predictions to inform their decisions [see ‘Credibility is in the eye of the beholder’ on April 20th, 2016].  Trust requires a common knowledge base and understanding that is usually built through interactions.  We hope the credibility factors will provide a framework for these interactions, as well as a structure for building a portfolio of evidence that demonstrates the reliability of a model.

References:

Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.

Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.

Image: Extract from abstract by Zahrah Resh.