Last week brought excitement and disappointment in approximately equal measures for my research on tracking nanoparticles [see ‘Slow moving nanoparticles’ on December 13th, 2017 and ‘Going against the flow’ on February 3rd, 2021]. The disappointment was that our grant proposal on ‘Optical tracking of virus-cell interaction’ was not ranked highly enough to receive funding from the Engineering and Physical Sciences Research Council. Rejection is an occupational hazard for academics seeking to win grants, and you learn to accept it, learn from the constructive criticism and look for ways of reworking the ideas into a new proposal. If you don’t compete then you can’t win. The excitement was that we have moved our apparatus for tracking nanoparticles into a new laboratory, which has been set up for it, so that we can start work on a pilot study looking at the ‘Interaction of bacteria and viruses with cellular and hard surfaces’. We are also advertising for a PhD student, starting in September 2021, to work on ‘Developing pre-clinical models to optimise nanoparticle-based drug delivery for the treatment of diabetic retinopathy’. This is an exciting development because it represents our first step from fundamental research on tracking nanoparticles in biological media towards clinical applications of the technology. Diabetic retinopathy is a sight-threatening complication of diabetes that is currently managed by delivering drugs to the inside of the eye, which requires frequent visits to a clinic for injections into the vitreous fluid. There is potential to use nanoparticles to deliver drugs more efficiently, and to support these developments we plan for the PhD student to use our real-time, non-invasive, label-free tracking technology to quantify nanoparticle motion through the vitreous fluid and the interaction of nanoparticles with the cells of the retina.
Out of the valley of death into a hype cycle?
The capability to identify damage and track its propagation in structures is important in ensuring the safe operation of a wide variety of engineering infrastructure, including aircraft structures. A few years ago, I wrote about research my group was performing, in the INSTRUCTIVE project [see ‘INSTRUCTIVE final reckoning’ on January 9th, 2019] with Airbus and Strain Solutions Limited, to deliver a new tool for monitoring the development of damage using thermoelastic stress analysis (TSA) [see ‘Counting photons to measure stress’ on November 18th, 2015]. We collected images using a TSA system while a structural component was subjected to cycles of load that caused damage to initiate and propagate during a fatigue test. The series of images was analysed using a technique based on optical flow to identify apparent movement between the images, which was taken as an indication of the development of damage [1]. In experiments performed in our laboratory, we demonstrated that our technique could indicate the presence of a crack less than a millimetre in length and even identify cracks initiating under the heads of bolts [see ‘INSTRUCTIVE update’ on October 4th, 2017]. However, this technique was susceptible to errors in the images when we tried to use low-cost sensors, and to changes in the images caused by flight-cycle loading with varying amplitude and frequency of loads. Essentially, the optical flow approach could be fooled into identifying damage propagation when a sensor delivered a noisy image or the shape of the load cycle was changed. We have now overcome this shortcoming by replacing the optical flow approach with the orthogonal decomposition technique [see ‘Recognising strain’ on October 28th, 2015] that we developed for comparing data fields from measurements and predictions in validation processes [see ‘Million to one’ on November 21st, 2018].
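The optical-flow algorithm from the paper is not reproduced here, but the underlying idea of detecting apparent movement between successive images can be illustrated with FFT phase correlation, a related image-registration technique (named plainly as a substitute, not the project's method). This minimal sketch, with illustrative names, estimates the integer-pixel shift between two images:

```python
import numpy as np

def apparent_shift(img_a, img_b):
    """Estimate the apparent rigid shift of img_b relative to img_a
    using phase correlation (a simple proxy for apparent movement)."""
    # Normalised cross-power spectrum.
    cross = np.conj(np.fft.fft2(img_a)) * np.fft.fft2(img_b)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    # The correlation peak sits at the cyclic shift between the images.
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap shifts larger than half the image size to negative values.
    return tuple(int(p - s) if p > s // 2 else int(p)
                 for p, s in zip(peak, corr.shape))
```

A spurious noise spike or a change in overall image contrast can move the correlation peak, which illustrates why a global-motion measure of this kind is vulnerable to noisy sensors and changing load cycles.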
Each image is decomposed to a feature vector and differences between the feature vectors are indicative of damage development (see schematic in thumbnail from [2]). The new technique, which we have named the differential feature vector method, is sufficiently robust that we have been able to use a sensor costing 1% of the price of a typical TSA system to identify and track cracks during cyclic loading. The underpinning research was published in December 2020 by the Royal Society [2] and the technique is being implemented in full-scale ground tests on aircraft structures as part of the DIMES project. Once again, a piece of technology is emerging from the valley of death [see ‘Slowly crossing the valley of death’ on January 27th, 2021] and, without wishing to initiate the hype cycle [see ‘Hype cycle’ on September 23rd, 2015], I hope it will transform the use of thermal imaging for condition monitoring.
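The idea of decomposing each image to a feature vector and differencing the vectors can be sketched as follows. This is a minimal illustration, assuming a 2D Chebyshev polynomial basis fitted by least squares; the basis choice, function names and the use of a simple Euclidean norm are illustrative assumptions, not the published implementation:

```python
import numpy as np

def feature_vector(image, degree=5):
    """Project an image onto a 2D Chebyshev polynomial basis (least-squares
    fit); the coefficients form a compact feature vector for the image."""
    ny, nx = image.shape
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)
    basis = np.polynomial.chebyshev.chebvander2d(
        Y.ravel(), X.ravel(), [degree, degree])
    coeffs, *_ = np.linalg.lstsq(basis, image.ravel(), rcond=None)
    return coeffs

def damage_indicator(reference, current, degree=5):
    """Norm of the difference between the feature vectors of two images;
    a growing value suggests damage development between the frames."""
    return np.linalg.norm(feature_vector(current, degree) -
                          feature_vector(reference, degree))
```

Because the decomposition compresses each image to a handful of coefficients, pixel-level noise is largely averaged out, which is one plausible reason a feature-vector comparison is more robust to low-cost sensors than tracking apparent motion pixel by pixel.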
The INSTRUCTIVE and DIMES projects have received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreements No. 685777 and No. 820951 respectively.
The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.
References
[1] Middleton CA, Gaio A, Greene RJ & Patterson EA, Towards automated tracking of initiation and propagation of cracks in aluminium alloy coupons using thermoelastic stress analysis, J. Nondestructive Evaluation, 38:18, 2019.
[2] Middleton CA, Weihrauch M, Christian WJR, Greene RJ & Patterson EA, Detection and tracking of cracks based on thermoelastic stress analysis, R. Soc. Open Sci. 7:200823, 2020.
Reduction in usefulness of reductionism
A couple of months ago I wrote about a set of credibility factors for computational models [see ‘Credible predictions for regulatory decision-making’ on December 9th, 2020] that we designed to inform interactions between researchers, model builders and decision-makers that will establish trust in the predictions from computational models [1]. This is important because computational modelling is becoming ubiquitous in the development of everything from automobiles and power stations to drugs and vaccines, which inevitably leads to its use in supporting regulatory applications. However, there is another motivation underpinning our work, which is that the systems being modelled are becoming increasingly complex, with the likelihood that they will exhibit emergent behaviour [see ‘Emergent properties’ on September 16th, 2015], and this makes it increasingly unlikely that a reductionist approach to establishing model credibility will be successful [2]. The reductionist approach to science, which was pioneered by Descartes and Newton, has served science well for hundreds of years and is based on the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts. It is the method of analysis that underpins almost everything you learn as an undergraduate engineer or physicist. However, reductionism loses its power when a system is more than the sum of its parts, i.e., when it exhibits emergent behaviour. Our approach to establishing model credibility is more holistic than traditional methods. This seems appropriate when modelling complex systems for which a complete knowledge of the relationships and patterns of behaviour may not be attainable, e.g., when unexpected or unexplainable emergent behaviour occurs [3]. The hegemony of reductionism in science made us nervous about writing about its shortcomings four years ago when we first published our ideas about model credibility [2].
So, I was pleased to see a paper published last year [4] that identified five fundamental properties of biology that weaken the power of reductionism, namely (1) biological variation is widespread and persistent, (2) biological systems are relentlessly nonlinear, (3) biological systems contain redundancy, (4) biology consists of multiple systems interacting across different time and spatial scales, and (5) biological properties are emergent. Many engineered systems possess all five of these fundamental properties – you just need to look at them from the appropriate perspective; for example, through a microscope to see the variation in the microstructure of a mass-produced part. Hence, in the future, there will need to be an increasing emphasis on holistic approaches and systems thinking in both the education and practices of engineers as well as biologists.
For more on emergence in computational modelling see Manuel Delanda, Philosophy and Simulation: The Emergence of Synthetic Reason, Continuum, London, 2011; and for more on systems thinking see Fritjof Capra and Luigi Luisi, The Systems View of Life: A Unifying Vision, Cambridge University Press, 2014.
References:
[1] Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.
[2] Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology. Progress in Biophysics and Molecular Biology, 129: 13-19, 2017.
[3] Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.
[4] Pruett WA, Clemmer JS & Hester RL, Physiological Modeling and Simulation—Validation, Credibility, and Application. Annual Review of Biomedical Engineering, 22:185-206, 2020.
Going against the flow
Last week I wrote about research we have been carrying out over the last decade that is being applied to large-scale structures in the aerospace industry [see ‘Slowly crossing the valley of death’ on January 27th, 2021]. I also work on very much smaller ‘structures’ that are only tens of nanometres in diameter, or about a billion times smaller than the test samples in last week’s post [see ‘Toxic nanoparticles?’ on November 13th, 2013]. The connection is the use of light to measure shape, deformation and motion, and then utilising the measurements to validate predictions from theoretical or computational models. About three years ago, we published research which demonstrated that the motion of very small particles (less than about 300 nanometres) at low concentrations (less than about a billion per millilitre) in a fluid was dominated by the molecules of the fluid rather than by interactions between the particles (see Coglitore et al., 2017 and ‘Slow moving nanoparticles’ on December 13th, 2017). These data confirmed results from earlier molecular dynamics simulations that contradicted predictions using the Stokes-Einstein equation, which was derived by Einstein in his PhD thesis for a ‘Stokes’ particle undergoing Brownian motion. The Stokes-Einstein equation works well for large particles, but the physics of motion changes when the particles are very small and far apart, so that van der Waals forces and electrostatic forces play a dominant role, as we have shown in a more recent paper (see Giorgi et al., 2019). This becomes relevant when evaluating nanoparticles as potential drug delivery systems or assessing the toxicological impact of nanoparticles. We have shown recently that instruments based on dynamic scattering of light from nanoparticles are likely to be inaccurate because they are based on fitting measurement data to the Stokes-Einstein equation.
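For context, the Stokes-Einstein equation gives the diffusion coefficient of a spherical particle in a fluid as D = kT/(6πηr), where k is the Boltzmann constant, T the absolute temperature, η the dynamic viscosity and r the particle radius. A minimal sketch follows; the default temperature and the viscosity of water at about 25 °C are my assumptions, and the function name is illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_diffusivity(radius_m, temperature_K=298.0,
                                viscosity_Pa_s=0.89e-3):
    """Stokes-Einstein diffusion coefficient (m^2/s) of a spherical
    particle of the given radius in a fluid of the given viscosity."""
    return K_B * temperature_K / (6 * math.pi * viscosity_Pa_s * radius_m)
```

For a 100-nanometre-diameter particle in water this gives a diffusivity of roughly 5e-12 m²/s (a few square micrometres per second); the inverse dependence on radius shows why diffusion dominates the motion of the smallest particles, and why deviations from this equation matter for sub-300-nanometre particles.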
In a paper published last month, we found that asymmetric flow field flow fractionation (or AF4) in combination with dynamic light scattering, when used to detect the size of nanoparticles in suspension, tended to overestimate the diameter of particles smaller than 60 nanometres at low concentrations by up to a factor of two (see Giorgi et al., 2021). Someone commented recently that our work in this area was not highly cited, but perhaps this is unsurprising when it undermines a current paradigm. We have certainly learnt to handle rejection letters, to redouble our efforts to demonstrate the rigour in our research and to present conclusions in a manner that appears to build on existing knowledge rather than demolishing it.
Sources:
Image is a photograph of a fast-flowing mountain river taken in Yellowstone National Park during a road trip across the USA in 2006.