Tag Archives: simulation

Modelling from the cell through the individual to the host population

During the lock-down in the UK due to the coronavirus pandemic, I have been reading about viruses and the modelling of them.  It is a multi-disciplinary and multi-scale problem, and so something that engineers should be well-equipped to tackle.  It is multi-scale because we need to understand the spread of the virus in the human population so that we can control it, we need to understand the process of infection in individuals so that we can protect them, and we need to understand the mechanisms of virus-cell interaction so that we can stop the replication of the virus.  At each size scale, models capable of representing the real-world processes will help us explore different approaches to arresting the progress of the virus and will need to be calibrated and validated against measurements.  This can be represented in the sort of model-test pyramid shown in the top graphic that has been used in the aerospace industry [1-2] for many years [see ‘Hierarchical modelling in engineering and biology’ on March 14th, 2018] and which we have recently introduced in the nuclear fission [3] and fusion [4] industries [see ‘Thought leadership in fusion engineering’ on October 9th, 2019].  At the top of the pyramid, the spread of the virus in the population is being modelled by epidemiologists, such as Professor Neil Ferguson [5], using statistical models based on infection data.  However, I am more interested in the bottom of the pyramid because the particles of the coronavirus are about the same size as the nanoparticles that I have been studying for some years [see ‘Slow moving nanoparticles’ on December 13th, 2017] and their motion appears to be dominated by diffusion processes [see ‘Salt increases nanoparticle diffusion’ on April 22nd, 2020] [6-7].
The first step towards virus infection of a cell is diffusion of the virus towards the cell, which is believed to be a relatively slow process; hence, a good model of diffusion would assist in designing drugs that could arrest or decelerate infection of cells [8].  Many types of virus, on entering the cell, make their way to the nucleus where they replicate, causing the cell to die, after which the virus progeny are dispersed to repeat the process.  You can see part of this process for coronavirus (SARS-CoV-2) in this sequence of images.  The trafficking across the cytoplasm of the cell to the nucleus can occur in a number of ways, including the formation of a capsule or endosome that moves across the cell towards the nuclear membrane, where the virus particles leave the endosome and travel through microtubules into the nucleus.  Holcman & Schuss [9] provide a good graphic illustrating these transport mechanisms.  In 2019, Briane et al [10] reviewed models of diffusion of intracellular particles inside living eukaryotic cells, i.e. cells with a nucleus enclosed by a membrane, as in all animals.  Intracellular diffusion is believed to be driven by Brownian motion and by motor-proteins, including dynein, kinesin and myosin, that enable motion through microtubules.  They observed that the density of the structure of cytoplasm, or cytoskeleton, can hinder the free displacement of a particle leading to subdiffusion; while cytoskeleton elasticity and thermal bending can accelerate it leading to superdiffusion.  These molecular and cellular interactions are happening at disparate spatial and temporal scales [11], which is one of the difficulties encountered in creating predictive simulations of virus-cell interactions.  In other words, the bottom layers of the model-test pyramid appear to be constructed from many more strata when you start to look more closely.  And, you need to add a time dimension to it.
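The distinction between ordinary diffusion, subdiffusion and superdiffusion can be made concrete with a toy simulation.  The sketch below is a minimal illustration only, with made-up parameters and none of the physics of the models cited above: it generates particle tracks whose mean-squared displacement (MSD) grows as t^α, then recovers the exponent from the simulated data in the way single-particle tracking studies do (α = 1 is Brownian motion; α < 1 mimics the hindrance of a crowded cytoskeleton).

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_msd(n_particles=500, n_steps=200, alpha=1.0, D=1.0):
    """Simulate 2-D tracks whose mean-squared displacement scales as t**alpha.

    alpha = 1 gives ordinary Brownian motion; alpha < 1 mimics subdiffusion
    (crowded cytoskeleton); alpha > 1 mimics superdiffusion (e.g. motor-protein
    transport).  The anomalous scaling is imposed by rescaling Gaussian step
    variances - a toy construction, not fractional Brownian motion.
    """
    t = np.arange(1, n_steps + 1, dtype=float)
    # Per-axis cumulative variance 2*D*t**alpha, so that the 2-D MSD is 4*D*t**alpha
    var_increments = np.diff(np.concatenate(([0.0], 2 * D * t**alpha)))
    steps = rng.standard_normal((n_particles, n_steps, 2))
    steps *= np.sqrt(var_increments)[None, :, None]
    tracks = np.cumsum(steps, axis=1)                  # positions at each time step
    msd = np.mean(np.sum(tracks**2, axis=2), axis=0)   # ensemble-averaged MSD
    return t, msd

t, msd_normal = simulate_msd(alpha=1.0)
_, msd_sub = simulate_msd(alpha=0.5)

# Recover the scaling exponent as the slope of the log-log MSD curve
exp_normal = np.polyfit(np.log(t), np.log(msd_normal), 1)[0]
exp_sub = np.polyfit(np.log(t), np.log(msd_sub), 1)[0]
print(f"fitted exponents: Brownian ~ {exp_normal:.2f}, subdiffusive ~ {exp_sub:.2f}")
```

In experiments the logic runs in reverse: the exponent fitted to measured tracks is what diagnoses which transport regime dominates.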
Prior to the coronavirus pandemic, more modelling efforts were perhaps focussed on understanding the process of infection by Human Immunodeficiency Virus (HIV), including by a multi-national group of scientists from Chile, France, Morocco, Russia and Spain [12-14].  However, the current coronavirus pandemic is galvanising researchers who are starting to think about novel ways of building multiscale models that encourage multidisciplinary collaboration by dispersed groups, [e.g. 15].

References

[1] Harris GL, Computer models, laboratory simulators, and test ranges: meeting the challenge of estimating tactical force effectiveness in the 1980’s, US Army Command and General Staff College, May 1979.

[2] Trevisani DA & Sisti AF, Air Force hierarchy of models: a look inside the great pyramid, Proc. SPIE 4026, Enabling Technology for Simulation Science IV, 23 June 2000.

[3] Patterson EA, Taylor RJ & Bankhead M, A framework for an integrated nuclear digital environment, Progress in Nuclear Energy, 87:97-103, 2016.

[4] Patterson EA, Purdie S, Taylor RJ & Waldon C, An integrated digital framework for the design, build and operation of fusion power plants, Royal Society Open Science, 6(10):181847, 2019.

[5] Verity R, Okell LC, Dorigatti I, Winskill P, Whittaker C, Imai N, Cuomo-Dannenburg G, Thompson H, Walker PGT, Fu H, Dighe A, Griffin JT, Baguelin M, Bhatia S, Boonyasiri A, Cori A, Cucunubá Z, FitzJohn R, Gaythorpe K, Green W, Hamlet A, Hinsley W, Laydon D, Nedjati-Gilani G, Riley S, van Elsland S, Volz E, Wang H, Wang Y, Xi X, Donnelly CA, Ghani AC, Ferguson NM, Estimates of the severity of coronavirus disease 2019: a model-based analysis., Lancet Infectious Diseases, 2020.

[6] Coglitore D, Edwardson SP, Macko P, Patterson EA, Whelan MP, Transition from fractional to classical Stokes-Einstein behaviour in simple fluids, Royal Society Open Science, 4:170507, 2017.

[7] Giorgi F, Coglitore D, Curran JM, Gilliland D, Macko P, Whelan M, Worth A & Patterson EA, The influence of inter-particle forces on diffusion at the nanoscale, Scientific Reports, 9:12689, 2019.

[8] Gilbert P-A, Kamen A, Bernier A & Garner A, A simple macroscopic model for the diffusion and adsorption kinetics of r-Adenovirus, Biotechnology & Bioengineering, 98(1):239-251, 2007.

[9] Holcman D & Schuss Z, Modeling the early steps of viral infection in cells, Chapter 9 in Stochastic Narrow Escape in Molecular and Cellular Biology, New York: Springer Science+Business Media, 2015.

[10] Briane V, Vimond M & Kervrann C, An overview of diffusion models for intracellular dynamics analysis, Briefings in Bioinformatics, Oxford University Press, pp.1-15, 2019.

[11] Holcman D & Schuss Z, Time scale of diffusion in molecular and cellular biology, J. Physics A: Mathematical and Theoretical, 47:173001, 2014.

[12] Bocharov G, Chereshnev V, Gainov I, Bazhun S, Bachmetyev B, Argilaguet J, Martinez J & Meyerhans A, Human immunodeficiency virus infection: from biological observations to mechanistic mathematical modelling, Math. Model. Nat. Phenom., 7(5):78-104, 2012.

[13] Bocharov G, Meyerhans A, Bessonov N, Trofimchuk S & Volpert V, Spatiotemporal dynamics of virus infection spreading in tissues, PLOS One, 11(12):e0168576, 2016.

[14] Bouchnita A, Bocharov G, Meyerhans A & Volpert V, Towards a multiscale model of acute HIV infection, Computation, 5(6):5010006, 2017.

[15] Sego TJ, Aponte-Serrano JO, Ferrari-Gianlupi J, Heaps S, Quardokus EM & Glazier JA, A modular framework for multiscale spatial modeling of viral infection and immune response in epithelial tissue, bioRxiv, 2020.

Alleviating industrial uncertainty

Want to know how to assess the quality of predictions of structural deformation from a computational model and how to diagnose the causes of differences between measurements and predictions?  The MOTIVATE project has the answers; that might seem like an over-assertive claim but read on and make your own judgment.  Eighteen months ago, I reported on a new method for quantifying the uncertainty present in measurements of deformation made in an industrial environment [see ‘Industrial uncertainty’ on December 12th, 2018] that we were trialling on a 1 m square panel of an aircraft fuselage.  Recently, we have used the measurement uncertainty we found to make judgments about the quality of predictions from computer models of the panel under compressive loading.  The top graphic shows the outside surface of the panel (left) with a speckle pattern to allow measurements of its deformation using digital image correlation (DIC) [see ‘256 shades of grey‘ on January 22nd, 2014 for a brief explanation of DIC]; and the inside surface (right) with stringers and ribs.  The bottom graphic shows our results for two load cases: a 50 kN compression (top row) and a 50 kN compression and 1 degree of torsion (bottom row).  The left column shows the out-of-plane deformation measured using a stereoscopic DIC system and the middle column shows the corresponding predictions from a computational model using finite element analysis [see ‘Did cubism inspire engineering analysis?’ on January 25th, 2017].  We have described these deformation fields in a reduced form using feature vectors by applying image decomposition [see ‘Recognizing strain’ on October 28th, 2015 for a brief explanation of image decomposition].  The elements of the feature vectors are known as shape descriptors and corresponding pairs of them, from the measurements and predictions, are plotted in the graphs on the right in the bottom graphic for each load case.
If the predictions were in perfect agreement with the measurements then all of the points on these graphs would lie on the line of equality [y = x], which is the solid line on each graph.  However, perfect agreement is unobtainable because there will always be uncertainty present; so, the question arises: how much deviation from the solid line is acceptable?  One answer is that the deviation should be less than the uncertainty present in the measurements, which we evaluated with our new method and which is shown by the dashed lines.  Hence, when all of the points fall inside the dashed lines, the predictions are at least as good as the measurements.  If some points lie outside the dashed lines, then we can look at the form of the corresponding shape descriptors to start diagnosing why we have significant differences between our model and experiment.  The forms of these outlying shape descriptors are shown as insets on the plots.  However, busy or non-technical decision-makers are often not interested in this level of detailed analysis and instead just want to know how good the predictions are.  To answer this question, we have implemented a validation metric (VM) that we developed [see ‘Million to one’ on November 21st, 2018] which allows us to state the probability that the predictions and measurements are from the same population given the known uncertainty in the measurements – these probabilities are shown in the black boxes superimposed on the graphs.
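The acceptance test described above can be sketched in a few lines.  The code below is a minimal illustration and not the MOTIVATE validation metric itself: using invented descriptor values, it flags any shape descriptor whose deviation from the line of equality exceeds the measurement uncertainty, which are the points a diagnosis would then examine.

```python
import numpy as np

def compare_descriptors(measured, predicted, u_meas):
    """Compare paired shape descriptors from measurement and simulation.

    A predicted descriptor is deemed acceptable when its deviation from the
    line of equality (y = x) is within the measurement uncertainty band,
    i.e. |predicted - measured| <= u_meas.  This simple pass/fail band is a
    sketch only; the full validation metric instead yields a probability
    that both sets of descriptors belong to the same population.
    """
    measured = np.asarray(measured, dtype=float)
    predicted = np.asarray(predicted, dtype=float)
    deviation = np.abs(predicted - measured)
    inside = deviation <= u_meas
    outliers = np.flatnonzero(~inside)   # indices of descriptors to diagnose
    return inside.mean(), outliers

# Hypothetical descriptor values for one load case (not project data)
measured = [12.1, -3.4, 0.8, 5.6, -1.2]
predicted = [11.8, -3.9, 2.5, 5.4, -1.0]
fraction_ok, outliers = compare_descriptors(measured, predicted, u_meas=0.6)
print(f"{fraction_ok:.0%} of descriptors within the uncertainty band; "
      f"inspect descriptors {outliers.tolist()}")
```

The outlying indices point back to specific shape descriptors, whose spatial form indicates where and how the model departs from the experiment.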

These novel methods create a toolbox for alleviating uncertainty about predictions of structural behaviour in industrial contexts.  Please get in touch if you want more information in order to test these tools yourself.

The MOTIVATE project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 754660 and the Swiss State Secretariat for Education, Research and Innovation under contract number 17.00064.

The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.

First law of geography: everything is related to everything else

One of the benefits of supervising research students is that you can read a large number of scientific papers by proxy.  In other words, my research students read more papers than I would ever have time to read and then they write reviews of the scientific literature that allow me to quickly gain an understanding of research in a particular field.  Every now and again, a student refers to a paper that piques my curiosity enough to read it for myself.  One of these was a paper published by Waldo Tobler in 1970 in which he describes the computational modelling of urban growth in Detroit, Michigan.  Although I used to live in Michigan, it was not the geographical connection that interested me but his invocation of the first law of geography: ‘everything is related to everything else, but near things are more related than distant things’.  Professor Tobler was writing from the University of Michigan in Ann Arbor, which he used in an example by highlighting that the population growth in Ann Arbor from 1930 to 1940 depended not only on the 1930 population of Ann Arbor, but also on the 1930 population of Vancouver, Singapore, Cape Town, Berlin and so on.  Perhaps if he had been writing in 2020 he would have suggested that the rate of infection from coronavirus in Ann Arbor depends not only on the number of cases in Ann Arbor, but also on the number of cases in Taipei, Milan, Toulouse, Dublin and so on.
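Tobler's law can be given a minimal computational form.  The sketch below is illustrative only, with invented populations and distances rather than Tobler's data: it weights each city's influence on Ann Arbor by inverse distance, so that everything contributes but near things contribute more.

```python
# Hypothetical 1930 populations (thousands) and straight-line distances (km)
# from Ann Arbor; the numbers are illustrative, not Tobler's.
cities = {
    "Detroit":   {"pop": 1569, "dist": 70},
    "Vancouver": {"pop": 247,  "dist": 3170},
    "Singapore": {"pop": 558,  "dist": 15100},
    "Cape Town": {"pop": 344,  "dist": 13600},
    "Berlin":    {"pop": 4243, "dist": 6580},
}

def distance_weighted_influence(cities, beta=1.0):
    """Weight each city's population by inverse distance**beta and normalise,
    echoing the first law of geography: near things are more related."""
    raw = {name: c["pop"] / c["dist"] ** beta for name, c in cities.items()}
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

weights = distance_weighted_influence(cities)
for name, w in sorted(weights.items(), key=lambda kv: -kv[1]):
    print(f"{name:10s} {w:.3f}")
```

With these numbers nearby Detroit dominates despite Berlin's larger population, which is the essence of the law; the exponent beta controls how quickly influence decays with distance.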

Source:

Tobler WR, A computer movie simulating urban growth in the Detroit Region, Economic Geography, vol. 46, Supplement: Proceedings. Int. Geog. Union. Commission on Quantitative Methods, 234-240, 1970.

Image: Crisco 1492, Own work

Fake facts & untrustworthy predictions

I need to confess to writing a misleading post some months ago entitled ‘In Einstein’s footprints?‘ on February 27th 2019, in which I promoted our 4th workshop on the ‘Validation of Computational Mechanics Models‘ that we held last month at the Guild Hall of Carpenters [Zunfthaus zur Zimmerleuten] in Zurich.  I implied that speakers at the workshop would be stepping in Einstein’s footprints when they presented their research, because Einstein presented a paper at the same venue in 1910.  However, as our host in Zurich revealed in his introductory remarks, the Guild Hall was gutted by fire in 2007 and so we were meeting in a fake, or replica, which was so good that most of us had not realised.  This was quite appropriate because a theme of the workshop was enhancing the credibility of computer models that are used to replicate the real world.  We discussed the issues surrounding the trustworthiness of models in a wide range of fields including aerospace engineering, biomechanics, nuclear power and toxicology.  Many of the presentations are available on the website of the EU project MOTIVATE, which organised and sponsored the workshop as part of its dissemination programme.  While we did not solve any problems, we did broaden people’s understanding of the issues associated with trustworthiness of predictions and identified the need to develop common approaches to support regulatory decisions across a range of industrial sectors – that’s probably the theme for our 5th workshop!

The MOTIVATE project has received funding from the Clean Sky 2 Joint Undertaking under the European Union’s Horizon 2020 research and innovation programme under grant agreement No. 754660 and the Swiss State Secretariat for Education, Research and Innovation under contract number 17.00064.

The opinions expressed in this blog post reflect only the author’s view and the Clean Sky 2 Joint Undertaking is not responsible for any use that may be made of the information it contains.

Image: https://www.tagesanzeiger.ch/Zunfthaus-Zur-Zimmerleuten-Wiederaufbauprojekt-steht/story/30815219