
The rest of the planet has been waiting patiently for us to figure it out

Research in British Columbia has found evidence of nitrogen from fish in tree rings.  The salmon that swim in the local rivers provide food for predators, such as bears and eagles, which leave the remains of the salmon on the forest floor, where they decompose, allowing the trees to absorb the nitrogen embedded in the salmon’s bones.  In some cases, up to three-quarters of a tree’s nitrogen comes from salmon.  This implies that interfering in the life cycle of the salmon, for instance by commercial fishing, will affect its predators, the forest and everything that depends on or interacts with the trees.  The complex nature of these interconnections has been apparent to the aboriginal peoples of the world for a very long time [see ‘Blinded by reductionism‘ on August 24th, 2022].  To quote Suzanne Simard, ‘Mistreatment of one species is mistreatment of all.  The rest of the planet has been waiting patiently for us to figure that out’.

Source: Suzanne Simard, Finding the Mother Tree, Penguin, 2021.

Image: photograph of an original painting bought by the author in Beijing

Blinded by reductionism

I wrote about the weakness of reductionism about 18 months ago [see ‘Reduction in usefulness of reductionism‘ on February 17th, 2021].  Reductionism is the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts.  The concept is flawed because complex systems exhibit emergent properties [see ‘Emergent properties‘ on September 16th, 2015] that appear at a certain level of complexity but do not exist at lower levels.  Life is an emergent property, so when you reduce an organism to its constituent parts, for instance by dissection, you kill it and can no longer observe its normal behaviour.  Reductionism is widespread in Western science and has been blinding us to what is often well known to aboriginal people, i.e., the interconnectedness of nature.  One example is forest ecosystems, which Suzanne Simard, amongst others, has shown are complex, synergistic, multi-scale organisations of species.  Complexity is only hard for those who have not thought about it: it is obvious to many peoples whose lives are integrated in nature’s ecosystems, yet it remains really difficult for those of us educated in the reductionist tradition.
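As a minimal illustrative sketch (not from Simard’s book or the original post), the toy ‘Game of Life’ below shows what an emergent property can look like: none of the local rules mentions a moving pattern, yet a ‘glider’ that travels across the grid emerges at the level of the whole system.  The starting pattern and the number of generations are arbitrary choices for illustration.

```python
# Minimal, illustrative sketch of emergence: Conway's Game of Life.
# Simple local rules, applied cell by cell, produce a 'glider', a coherent
# moving pattern that exists only at the level of the whole grid.
from collections import Counter

def step(live_cells):
    """Apply the Game of Life rules once to a set of live (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # A cell is alive next generation if it has exactly 3 live neighbours,
    # or if it is currently alive and has exactly 2 live neighbours.
    return {
        cell
        for cell, n in neighbour_counts.items()
        if n == 3 or (n == 2 and cell in live_cells)
    }

# A 'glider': five live cells that, under the rules above, translate
# diagonally across the grid, although no single rule mentions movement.
cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for generation in range(8):
    print(f"generation {generation}: {sorted(cells)}")
    cells = step(cells)
```

Running the sketch prints the live cells shifting diagonally generation by generation, a behaviour that cannot be found in any single cell or rule, which is the essence of an emergent property.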

Reference:

Suzanne Simard, Finding the Mother Tree, Penguin, 2021.

Reduction in usefulness of reductionism

A couple of months ago I wrote about a set of credibility factors for computational models [see ‘Credible predictions for regulatory decision-making‘ on December 9th, 2020] that we designed to inform interactions between researchers, model builders and decision-makers, and to establish trust in the predictions from computational models [1].  This is important because computational modelling is becoming ubiquitous in the development of everything from automobiles and power stations to drugs and vaccines, which inevitably leads to its use in supporting regulatory applications.  However, there is another motivation underpinning our work: the systems being modelled are becoming increasingly complex and are therefore likely to exhibit emergent behaviour [see ‘Emergent properties‘ on September 16th, 2015], which makes it increasingly unlikely that a reductionist approach to establishing model credibility will be successful [2].  The reductionist approach to science, pioneered by Descartes and Newton, has served us well for hundreds of years and is based on the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts.  It is the method of analysis that underpins almost everything you learn as an undergraduate engineer or physicist.  However, reductionism loses its power when a system is more than the sum of its parts, i.e., when it exhibits emergent behaviour.  Our approach to establishing model credibility is more holistic than traditional methods.  This seems appropriate when modelling complex systems for which a complete knowledge of the relationships and patterns of behaviour may not be attainable, e.g., when unexpected or unexplainable emergent behaviour occurs [3].  The hegemony of reductionism in science made us nervous about writing about its shortcomings four years ago when we first published our ideas about model credibility [2].  So, I was pleased to see a paper published last year [4] that identified five fundamental properties of biology that weaken the power of reductionism, namely: (1) biological variation is widespread and persistent; (2) biological systems are relentlessly nonlinear; (3) biological systems contain redundancy; (4) biology consists of multiple systems interacting across different time and spatial scales; and (5) biological properties are emergent.  Many engineered systems possess all five of these fundamental properties; you just need to look at them from the appropriate perspective, for example, through a microscope to see the variation in the microstructure of a mass-produced part.  Hence, in the future, there will need to be an increasing emphasis on holistic approaches and systems thinking in the education and practice of engineers as well as biologists.

For more on emergence in computational modelling, see Manuel Delanda, Philosophy and Simulation: The Emergence of Synthetic Reason, Continuum, London, 2011; and for more on systems thinking, see Fritjof Capra and Luigi Luisi, The Systems View of Life: A Unifying Vision, Cambridge University Press, 2014.

References:

[1] Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.

[2] Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology, Progress in Biophysics and Molecular Biology, 129: 13-19, 2017.

[3] Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, Journal of Sound and Vibration, 448: 247-258, 2019.

[4] Pruett WA, Clemmer JS & Hester RL, Physiological Modeling and Simulation—Validation, Credibility, and Application. Annual Review of Biomedical Engineering, 22:185-206, 2020.

Digital twins could put at risk what it means to be human

I have written in the past about my research on the development and use of digital twins.  A digital twin is a functional representation in a virtual world of a real-world entity that is continually updated with data from the real world [see ‘Fourth industrial revolution’ on July 4th, 2018 and also a short video at https://www.youtube.com/watch?v=iVS-AuSjpOQ].  I am working with others on developing an integrated digital nuclear environment from which digital twins of individual power stations could be spawned in parallel with the manufacture of their physical counterparts [see ‘Enabling or disruptive technology for nuclear engineering’ on January 1st, 2015 and ‘Digitally-enabled regulatory environment for fusion power-plants’ on March 20th, 2019].  A couple of months ago, I wrote about the difficulty of capturing tacit knowledge in digital twins, that is, knowledge that is generally not expressed but is retained in the minds of experts and is often essential to developing and operating complex engineering systems [see ‘Tacit hurdle to digital twins’ on August 26th, 2020].  The concept of tapping into someone’s mind to extract tacit knowledge brings us close to thinking about human digital twins, which so far have been restricted to computational models of various parts of human anatomy and physiology.  The idea of a digital twin of someone’s mind raises a myriad of philosophical and ethical issues.  Whilst the purpose of a digital twin of the mind of an operator of a complex system might be to better predict and understand human-machine interactions, the opportunity to use the digital twin to advance techniques of personalisation will likely be too tempting to ignore.  Personalisation is the tailoring of the digital world to respond to our personal needs, for instance using predictive algorithms to recommend what book you should read next or to suggest purchases to you.  At the moment, personalisation is driven by data derived from the tracks you make in the digital world as you surf the internet, watch videos and make purchases.  In the future, however, those predictive algorithms could be based on reading your mind, or at least its digital twin.  We worry about loss of privacy at the moment, by which we probably mean the collation of vast amounts of data about our lives by unaccountable organisations, because of the potential for our lives to be manipulated without us being aware that it is happening.  Our free will is endangered by such manipulation, but it might be lost entirely to a digital twin of our mind.  To quote the philosopher Michael Lynch, you would be handing over ‘privileged access to your mental states’ and, to some extent, you would no longer be a unique being.  We are a long way from possessing the technology to realise a digital twin of the human mind, but the possibility is on the horizon.
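To make the idea of being ‘continually updated with data from the real world’ concrete, here is a minimal, hypothetical sketch, not drawn from the nuclear digital environment or any actual project: a toy twin of a pump whose virtual state is refreshed by streamed temperature measurements from its physical counterpart and used to make a simple prediction.  The class name, the rolling window and the alarm threshold are all illustrative assumptions.

```python
# Minimal, hypothetical sketch of a digital twin's data flow: a virtual state
# that is continually updated with measurements streamed from the real world,
# and a prediction made from that state. All names and values are illustrative.
from collections import deque
from statistics import mean

class PumpTwin:
    """Toy digital twin of a pump, updated from a stream of temperature data."""

    def __init__(self, window: int = 5, alarm_temperature: float = 80.0):
        # Rolling window of the most recent real-world measurements.
        self.recent_temperatures = deque(maxlen=window)
        self.alarm_temperature = alarm_temperature

    def update(self, measured_temperature: float) -> None:
        """Ingest one new measurement from the physical pump."""
        self.recent_temperatures.append(measured_temperature)

    def predict_overheating(self) -> bool:
        """Simple prediction derived from the twin's current virtual state."""
        if not self.recent_temperatures:
            return False
        return mean(self.recent_temperatures) > self.alarm_temperature

# Simulated stream of sensor readings from the physical asset.
twin = PumpTwin()
for reading in [72.0, 75.5, 79.0, 82.5, 86.0]:
    twin.update(reading)
    print(reading, "->", "alert" if twin.predict_overheating() else "ok")
```

In a real digital twin the update step would ingest many channels of plant data and the prediction would come from physics-based or data-driven models, but the essential loop, measure, update the virtual state, predict, is the same as in this toy example.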

Source: Richard Waters, They’re watching you, FT Weekend, 24/25 October 2020.

Image: Extract from abstract by Zahrah Resh.