
Reduction in usefulness of reductionism

A couple of months ago I wrote about a set of credibility factors for computational models [see ‘Credible predictions for regulatory decision-making’ on December 9th, 2020] that we designed to inform the interactions between researchers, model builders and decision-makers that establish trust in the predictions from computational models [1].  This is important because computational modelling is becoming ubiquitous in the development of everything from automobiles and power stations to drugs and vaccines, which inevitably leads to its use in supporting regulatory applications.  However, there is another motivation underpinning our work, which is that the systems being modelled are becoming increasingly complex and are increasingly likely to exhibit emergent behaviour [see ‘Emergent properties’ on September 16th, 2015], and this makes it less and less likely that a reductionist approach to establishing model credibility will be successful [2].  The reductionist approach to science, pioneered by Descartes and Newton, has served science well for hundreds of years and is based on the concept that everything about a complex system can be understood by reducing it to its smallest constituent parts.  It is the method of analysis that underpins almost everything you learn as an undergraduate engineer or physicist.  However, reductionism loses its power when a system is more than the sum of its parts, i.e., when it exhibits emergent behaviour.  Our approach to establishing model credibility is more holistic than traditional methods.  This seems appropriate when modelling complex systems for which a complete knowledge of the relationships and patterns of behaviour may not be attainable, e.g., when unexpected or unexplainable emergent behaviour occurs [3].  The hegemony of reductionism in science made us nervous about writing about its shortcomings four years ago when we first published our ideas about model credibility [2].  So, I was pleased to see a paper published last year [4] that identified five fundamental properties of biology that weaken the power of reductionism, namely: (1) biological variation is widespread and persistent; (2) biological systems are relentlessly nonlinear; (3) biological systems contain redundancy; (4) biology consists of multiple systems interacting across different time and spatial scales; and (5) biological properties are emergent.  Many engineered systems possess all five of these fundamental properties – you just need to look at them from the appropriate perspective, for example, through a microscope to see the variation in the microstructure of a mass-produced part.  Hence, in the future, there will need to be an increasing emphasis on holistic approaches and systems thinking in both the education and practice of engineers as well as biologists.
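The point about nonlinearity defeating a part-by-part analysis can be made very simply.  The short Python sketch below is my own illustration (it does not come from the post or its references): for a linear spring the response to two loads applied together equals the sum of the responses to each load applied separately, i.e., superposition holds and the whole really is the sum of its parts; add a nonlinear hardening term and superposition fails.

```python
# Illustrative sketch only: superposition, the basis of reductionist analysis,
# holds for a linear system but fails once the response is nonlinear.
def linear_response(x, k=2.0):
    return k * x                   # linear spring: F = k*x

def nonlinear_response(x, k=2.0, c=0.5):
    return k * x + c * x ** 3      # hardening spring: F = k*x + c*x^3

a, b = 1.0, 2.0                    # two loads applied to the same system
for response in (linear_response, nonlinear_response):
    whole = response(a + b)            # response to both loads applied together
    parts = response(a) + response(b)  # sum of the responses to each load alone
    print(f"{response.__name__}: whole = {whole:.2f}, sum of parts = {parts:.2f}")
# The linear case agrees exactly; the nonlinear case does not, so analysing
# the parts in isolation no longer predicts the behaviour of the whole.
```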

For more on emergence in computational modelling see Manuel Delanda, Philosophy and Simulation: The Emergence of Synthetic Reason, Continuum, London, 2011; and, for more on systems thinking, see Fritjof Capra and Pier Luigi Luisi, The Systems View of Life: A Unifying Vision, Cambridge University Press, 2014.

References:

[1] Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.

[2] Patterson EA & Whelan MP, A framework to establish credibility of computational models in biology, Progress in Biophysics and Molecular Biology, 129:13-19, 2017.

[3] Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.

[4] Pruett WA, Clemmer JS & Hester RL, Physiological Modeling and Simulation—Validation, Credibility, and Application, Annual Review of Biomedical Engineering, 22:185-206, 2020.

Credible predictions for regulatory decision-making

Regulators are charged with ensuring that manufactured products, from aircraft and nuclear power stations to cosmetics and vaccines, are safe.  The general public seeks certainty that these devices, and the materials and chemicals they are made from, will not harm them or the environment.  Technologists who design and manufacture these products know that absolute certainty is unattainable and near-certainty is unaffordable.  Hence, they attempt to deliver the service or product that society desires while ensuring that the risks are As Low As Reasonably Practicable (ALARP).  The role of regulators is to independently assess the risks, make a judgment on their acceptability and thus decide whether the operation of a power station or the distribution of a vaccine can go ahead.  These are difficult decisions with huge potential consequences – just think of the more than three hundred people killed in the two crashes of Boeing 737 Max airplanes or the 10,000 or so people affected by birth defects caused by the drug thalidomide.  Evidence presented to support applications for regulatory approval is largely based on physical tests, for example fatigue tests on an aircraft structure or toxicological tests using animals.  In some cases the physical tests might not be entirely representative of the real-life situation, which can make it difficult to make decisions using the data; for instance, a ground test on an airplane is not the same as a flight test, and in many respects the animals used in toxicity testing are physiologically different from humans.  In addition, physical tests are expensive and time-consuming, which both drives up the cost of seeking regulatory approval and slows down the translation of innovative new products to the market.  The almost ubiquitous use of computer-based simulations to support the research, development and design of manufactured products inevitably leads to their use in supporting regulatory applications.  This creates challenges for regulators who must judge the trustworthiness of predictions from these simulations [see ‘Fake facts & untrustworthy predictions’ on December 4th, 2019].  It is standard practice for modellers to demonstrate the validity of their models; however, validation does not automatically lead to acceptance of predictions by decision-makers.  Acceptance is more closely related to scientific credibility.  I have been working across a number of disciplines on the scientific credibility of models, including in engineering where multi-physics phenomena are important, such as in hypersonic flight and fusion energy [see ‘Thought leadership in fusion energy’ on October 9th, 2019], and in computational biology and toxicology [see ‘Hierarchical modelling in engineering and biology’ on March 14th, 2018].  Working together with my collaborators in these disciplines, we have developed a common set of factors underpinning scientific credibility that are based on principles drawn from the literature on the philosophy of science and are designed to be both discipline-independent and method-agnostic [Patterson & Whelan, 2019; Patterson et al, 2021].  We hope that our cross-disciplinary approach will break down the subject silos that have become established as different scientific communities have developed their own frameworks for validating models.
As mentioned above, the process of validation tends to be undertaken by model developers and, in some sense, belongs to them; whereas credibility is not exclusive to the developer but is a form of trust that needs to be shared with the decision-maker who seeks to use the predictions to inform their decision [see ‘Credibility is in the eye of the beholder’ on April 20th, 2016].  Trust requires a common knowledge base and understanding that is usually built through interactions.  We hope the credibility factors will provide a framework for these interactions as well as a structure for building a portfolio of evidence that demonstrates the reliability of a model.

References:

Patterson EA & Whelan MP, On the validation of variable fidelity multi-physics simulations, J. Sound & Vibration, 448:247-258, 2019.

Patterson EA, Whelan MP & Worth A, The role of validation in establishing the scientific credibility of predictive toxicology approaches intended for regulatory application, Computational Toxicology, 17: 100144, 2021.

Image: Extract from abstract by Zahrah Resh.

Modelling from the cell through the individual to the host population

During the lock-down in the UK due to the coronavirus pandemic, I have been reading about viruses and the modelling of them.  It is a multi-disciplinary and multi-scale problem, so it is something that engineers should be well-equipped to tackle.  It is multi-scale because we need to understand the spread of the virus in the human population so that we can control it, we need to understand the process of infection in individuals so that we can protect them, and we need to understand the mechanisms of virus-cell interaction so that we can stop the replication of the virus.  At each size scale, models capable of representing the real-world processes will help us explore different approaches to arresting the progress of the virus and will need to be calibrated and validated against measurements.  This can be represented in the sort of model-test pyramid shown in the top graphic that has been used in the aerospace industry [1-2] for many years [see ‘Hierarchical modelling in engineering and biology’ on March 14th, 2018] and which we have recently introduced in the nuclear fission [3] and fusion [4] industries [see ‘Thought leadership in fusion engineering’ on October 9th, 2019].  At the top of the pyramid, the spread of the virus in the population is being modelled by epidemiologists, such as Professor Neil Ferguson [5], using statistical models based on infection data.  However, I am more interested in the bottom of the pyramid because the particles of the coronavirus are about the same size as the nanoparticles that I have been studying for some years [see ‘Slow moving nanoparticles’ on December 13th, 2017] and their motion appears to be dominated by diffusion processes [see ‘Salt increases nanoparticle diffusion’ on April 22nd, 2020] [6-7].  The first step towards virus infection of a cell is diffusion of the virus towards the cell, which is believed to be a relatively slow process, and hence a good model of diffusion would assist in designing drugs that could arrest or decelerate infection of cells [8].  Many types of virus, on entering the cell, make their way to the nucleus where they replicate, causing the cell to die, after which the virus progeny are dispersed to repeat the process.  You can see part of this process for coronavirus (SARS-CoV-2) in this sequence of images.  The trafficking across the cytoplasm of the cell to the nucleus can occur in a number of ways, including the formation of a capsule, or endosome, that moves across the cell towards the nuclear membrane, where the virus particles leave the endosome and travel through microtubules into the nucleus.  Holcman & Schuss [9] provide a good graphic illustrating these transport mechanisms.  In 2019, Briane et al [10] reviewed models of diffusion of intracellular particles inside living eukaryotic cells, i.e. cells with a nucleus enclosed by a membrane, as in all animals.  Intracellular diffusion is believed to be driven by Brownian motion and by motor proteins, including dynein, kinesin and myosin, that enable motion through microtubules.  They observed that the density of the structure of the cytoplasm, or cytoskeleton, can hinder the free displacement of a particle, leading to subdiffusion, while cytoskeleton elasticity and thermal bending can accelerate it, leading to superdiffusion.  These molecular and cellular interactions happen at disparate spatial and temporal scales [11], which is one of the difficulties encountered in creating predictive simulations of virus-cell interactions.
In other words, the bottom layers of the model-test pyramid appear to be constructed from many more strata when you start to look more closely.  And, you need to add a time dimension to it.  Prior to the coronavirus pandemic, modelling efforts were perhaps more focussed on understanding the process of infection by the Human Immunodeficiency Virus (HIV), including by a multi-national group of scientists from Chile, France, Morocco, Russia and Spain [12-14].  However, the current coronavirus pandemic is galvanising researchers, who are starting to think about novel ways of building multiscale models that encourage multidisciplinary collaboration by dispersed groups [e.g. 15].
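To make the diffusion-dominated first step of infection described above more concrete, here is a minimal Python sketch of my own (it is not a model from references [6-10], and the 100 nm particle diameter, water-like viscosity and body temperature are assumptions for illustration only).  It simulates Brownian motion of virus-sized particles using a Stokes-Einstein diffusion coefficient and then fits the exponent alpha in MSD ∝ t^alpha; alpha ≈ 1 indicates normal diffusion, while alpha < 1 and alpha > 1 correspond to the subdiffusion and superdiffusion described by Briane et al [10].

```python
# Minimal illustrative sketch: Brownian diffusion of a virus-sized particle and
# the mean-squared displacement (MSD), whose scaling MSD ~ t^alpha separates
# normal diffusion (alpha = 1) from subdiffusion (< 1) and superdiffusion (> 1).
import numpy as np

k_B, T, eta = 1.38e-23, 310.0, 1.0e-3   # J/K, body temperature (K), Pa.s (assumed water-like)
d = 100e-9                              # assumed particle diameter of ~100 nm
D = k_B * T / (3 * np.pi * eta * d)     # Stokes-Einstein diffusion coefficient

rng = np.random.default_rng(0)
dt, n_steps, n_particles = 1e-3, 2000, 500
# 2-D random walk: each step per axis is Gaussian with variance 2*D*dt
steps = rng.normal(0.0, np.sqrt(2 * D * dt), size=(n_particles, n_steps, 2))
paths = np.cumsum(steps, axis=1)        # particle trajectories

t = dt * np.arange(1, n_steps + 1)
msd = np.mean(np.sum(paths ** 2, axis=2), axis=0)   # ensemble-averaged MSD
alpha = np.polyfit(np.log(t), np.log(msd), 1)[0]    # slope on log-log axes

print(f"D = {D:.2e} m^2/s, fitted exponent alpha = {alpha:.2f} (expect ~1 for free diffusion)")
```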

References

[1] Harris GL, Computer models, laboratory simulators, and test ranges: meeting the challenge of estimating tactical force effectiveness in the 1980’s, US Army Command and General Staff College, May 1979.

[2] Trevisani DA & Sisti AF, Air Force hierarchy of models: a look inside the great pyramid, Proc. SPIE 4026, Enabling Technology for Simulation Science IV, 23 June 2000.

[3] Patterson EA, Taylor RJ & Bankhead M, A framework for an integrated nuclear digital environment, Progress in Nuclear Energy, 87:97-103, 2016.

[4] Patterson EA, Purdie S, Taylor RJ & Waldon C, An integrated digital framework for the design, build and operation of fusion power plants, Royal Society Open Science, 6(10):181847, 2019.

[5] Verity R, Okell LC, Dorigatti I, Winskill P, Whittaker C, Imai N, Cuomo-Dannenburg G, Thompson H, Walker PGT, Fu H, Dighe A, Griffin JT, Baguelin M, Bhatia S, Boonyasiri A, Cori A, Cucunubá Z, FitzJohn R, Gaythorpe K, Green W, Hamlet A, Hinsley W, Laydon D, Nedjati-Gilani G, Riley S, van Elsland S, Volz E, Wang H, Wang Y, Xi X, Donnelly CA, Ghani AC, Ferguson NM, Estimates of the severity of coronavirus disease 2019: a model-based analysis, Lancet Infectious Diseases, 2020.

[6] Coglitore D, Edwardson SP, Macko P, Patterson EA, Whelan MP, Transition from fractional to classical Stokes-Einstein behaviour in simple fluids, Royal Society Open Science, 4:170507, 2017.

[7] Giorgi F, Coglitore D, Curran JM, Gilliland D, Macko P, Whelan M, Worth A & Patterson EA, The influence of inter-particle forces on diffusion at the nanoscale, Scientific Reports, 9:12689, 2019.

[8] Gilbert P-A, Kamen A, Bernier A & Garnier A, A simple macroscopic model for the diffusion and adsorption kinetics of r-Adenovirus, Biotechnology & Bioengineering, 98(1):239-251, 2007.

[9] Holcman D & Schuss Z, Modeling the early steps of viral infection in cells, Chapter 9 in Stochastic Narrow Escape in Molecular and Cellular Biology, New York: Springer Science+Business Media, 2015.

[10] Briane V, Vimond M & Kervrann C, An overview of diffusion models for intracellular dynamics analysis, Briefings in Bioinformatics, Oxford University Press, pp.1-15, 2019.

[11] Holcman D & Schuss Z, Time scale of diffusion in molecular and cellular biology, J. Physics A: Mathematical and Theoretical, 47:173001, 2014.

[12] Bocharov G, Chereshnev V, Gainov I, Bazhun S, Bachmetyev B, Argilaguet J, Martinez J & Meyerhans A, Human immunodeficiency virus infection: from biological observations to mechanistic mathematical modelling, Math. Model. Nat. Phenom., 7(5):78-104, 2012.

[13] Bocharov G, Meyerhans A, Bessonov N, Trofimchuk S & Volpert V, Spatiotemporal dynamics of virus infection spreading in tissues, PLOS One, 11(12):e0168576, 2016.

[14] Bouchnita A, Bocharov G, Meyerhans A & Volpert V, Towards a multiscale model of acute HIV infection, Computation, 5(1):6, 2017.

[15] Sego TJ, Aponte-Serrano JO, Ferrari-Gianlupi J, Heaps S, Quardokus EM & Glazier JA, A modular framework for multiscale spatial modeling of viral infection and immune response in epithelial tissue, bioRxiv, 2020.

Spatial-temporal models of protein structures

For a number of years I have been working on methods for validating computational models of structures [see ‘Model validation’ on September 18th, 2012] using the full potential of measurements made with modern techniques such as digital image correlation [see ‘256 shades of grey’ on January 22nd, 2014] and thermoelastic stress analysis [see ‘Counting photons to measure stress’ on November 18th, 2015].  Usually the focus of our interest is at the macroscale, for example the research on aircraft structures in the MOTIVATE project; however, in a new PhD project with colleagues at the National Tsing Hua University in Taiwan, we are planning to explore using our validation procedures and metrics [1] in structural biology.

The size and timescale of protein-structure thermal fluctuations are essential to the regulation of cellular functions. Measurement techniques such as x-ray crystallography and transmission electron cryomicroscopy (Cryo-EM) provide data on the electron density distribution from which protein structures can be deduced using molecular dynamics models. Our aim is to develop our validation metrics to help identify, with a defined level of confidence, the most appropriate structural ensemble for a given set of electron densities. To make the problem more interesting and challenging, the structure observed by x-ray crystallography is an average or equilibrium state because a folded protein is constantly in motion, undergoing harmonic oscillations, each with different frequencies and amplitudes [2].
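As a much-simplified illustration of the kind of comparison involved, the Python sketch below scores a set of candidate structural ensembles by how often their predicted one-dimensional electron-density profiles fall within an assumed measurement uncertainty band and selects the best-scoring ensemble.  It is not the probabilistic validation metric of reference [1]; the data, the candidate ensembles and the uncertainty band are all synthetic placeholders.

```python
# Simplified, hypothetical sketch: choose the candidate ensemble whose predicted
# electron-density profile most often lies within the measurement uncertainty.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 200)                            # position along a 1-D profile
true_density = np.exp(-((x - 0.50) / 0.10) ** 2)          # synthetic "true" density
measured = true_density + rng.normal(0.0, 0.02, x.size)   # noisy synthetic measurement
band = 0.04                                               # assumed measurement uncertainty

# Hypothetical predictions from three candidate structural ensembles
candidates = {
    "ensemble_A": np.exp(-((x - 0.50) / 0.10) ** 2),
    "ensemble_B": np.exp(-((x - 0.48) / 0.12) ** 2),
    "ensemble_C": np.exp(-((x - 0.55) / 0.08) ** 2),
}

def coverage_score(predicted, measured, band):
    """Fraction of points at which the prediction lies inside the uncertainty band."""
    return float(np.mean(np.abs(predicted - measured) <= band))

scores = {name: coverage_score(profile, measured, band) for name, profile in candidates.items()}
best = max(scores, key=scores.get)
print(scores)
print(f"Most credible candidate under this crude score: {best}")
```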

The PhD project is part of the dual PhD programme of the University of Liverpool and National Tsing Hua University.  Funding is available in the form of a fee waiver and a contribution to living expenses for four years of study involving significant periods (preferably two years) at each university.  For more information follow this link.

References:

[1] Dvurecenska K, Graham S, Patelli E & Patterson EA, A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.

[2] Chan J, Lin H-R, Takemura K, Chang K-C, Chang Y-Y, Joti Y, Kitao A & Yang L-W, An efficient timer and sizer of protein motions reveals the time-scales of functional dynamics in the ribosome, bioRxiv, 2018. https://www.biorxiv.org/content/early/2018/08/03/384511

Image: A diffraction pattern and protein structure from http://xray.bmc.uu.se/xtal/