Tag Archives: modelling

Destruction of society as a complex system?

Sadly my vacation is finished [see ‘Relieving stress‘ on July 17th, 2019] and I have reconnected to the digital world, including the news media.  Despite the sensational headlines and plenty of rhetoric from politicians, nothing very much appears to have really changed in the world.  Yes, we have a new prime minister in the UK, who has a different agenda from the previous incumbent; however, the impact of actions by politicians on society and the economy seems rather limited unless the action represents a step change and is accompanied by appropriate resources.  In addition, the consequences of such changes are often different from those anticipated by our leaders.

Perhaps this is because society is a global network with simple operating rules, some of which we know intuitively, and without central control, because governments exert only limited and local control.  It is well-known in the scientific community that large networks, without central control but with simple operating rules, usually exhibit self-organising and non-trivial emergent behaviour.  The emergent behaviour of a complex system cannot be predicted from the behaviour of its constituent components or sub-systems, i.e., the whole is more than the sum of its parts.  The mathematical approach to describing such systems is to use non-linear dynamics with solutions lying in phase space.  Modelling complex systems is difficult and interpreting the predictions is challenging; so it is not surprising that, when the actions of government do have an impact, the outcomes are often unexpected and unintended.  However, if global society can be considered as a complex system, then it would appear that its self-organising behaviour tends to blunt the effectiveness of many of the actions of government.  This seems to be a fortuitous regulatory mechanism that helps maintain the status quo.

In addition, we tend to ignore phenomena whose complexity exceeds our powers of explanation, or we use over-simplified explanations [see ‘Is the world incomprehensible?‘ on March 15th, 2017 and ‘Blind to complexity‘ on December 19th, 2018].  And, politicians are no exception to this tendency; so they usually legislate based on simple ideology rather than rational consideration of the likely outcomes of change on the complex system we call society.  And this is probably a further regulatory mechanism.
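As a toy illustration of how simple local rules can produce non-trivial emergent behaviour without central control, the sketch below (not drawn from anything in this post; it simply uses Conway’s Game of Life as an example) has every cell follow the same rule and lets global structure appear on its own:

```python
# A toy sketch, not from the original post: Conway's Game of Life, a grid of
# cells that all follow the same simple local rule with no central control,
# yet global patterns (gliders, oscillators, still lifes) emerge.
import numpy as np

def step(grid):
    """Apply the Game of Life rules once to a 2-D array of 0s and 1s."""
    # Count the eight neighbours of every cell, with periodic boundaries.
    neighbours = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                     for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                     if (dy, dx) != (0, 0))
    # A live cell survives with 2 or 3 neighbours; an empty cell becomes live with exactly 3.
    return ((neighbours == 3) | ((grid == 1) & (neighbours == 2))).astype(int)

rng = np.random.default_rng(0)
grid = rng.integers(0, 2, size=(50, 50))   # random initial state, no design
for _ in range(100):                       # the only rules are local ones
    grid = step(grid)
print(grid.sum(), "live cells after 100 steps")
```

No cell, and no central authority, plans the patterns that appear; they emerge from the interaction of the parts, which is the sense in which the whole is more than the sum of its parts.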

However, all of this is evolving rapidly because a small number of tech companies have created a form of central control by capturing the flow of data between us, and they are using it to manipulate those simple operating rules.  This appears to be weakening the self-organising and emergent characteristics of society, so that the system can be controlled more easily without the influence of its constituent parts, i.e., us.

For a more straightforward explanation listen to Carole Cadwalladr’s TED talk on ‘Facebook’s role in Brexit – and the threat to democracy‘ or, if you have more time on your hands, watch the new documentary movie ‘The Great Hack‘.  My thanks to Gillian Tett in the FT last weekend, who alerted me to the scale of the issue: ‘Data brokers: from poachers to gamekeepers?‘


Spatial-temporal models of protein structures

For a number of years I have been working on methods for validating computational models of structures [see ‘Model validation‘ on September 18th 2012] using the full potential of measurements made with modern techniques such as digital image correlation [see ‘256 shades of grey‘ on January 22nd 2014] and thermoelastic stress analysis [see ‘Counting photons to measure stress‘ on November 18th 2015].  Usually the focus of our interest is at the macroscale, for example the research on aircraft structures in the MOTIVATE project; however, in a new PhD project with colleagues at the National Tsing Hua University in Taiwan, we are planning to explore using our validation procedures and metrics [1] in structural biology.

The size and timescale of protein-structure thermal fluctuations are essential to the regulation of cellular functions. Measurement techniques such as x-ray crystallography and transmission electron cryomicroscopy (Cryo-EM) provide data on the electron density distribution from which protein structures can be deduced using molecular dynamics models. Our aim is to develop our validation metrics to help identify, with a defined level of confidence, the most appropriate structural ensemble for a given set of electron densities. To make the problem more interesting and challenging, the structure observed by x-ray crystallography is an average or equilibrium state, because a folded protein is constantly in motion, undergoing harmonic oscillations, each with different frequencies and amplitudes [2].
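Purely as an illustration of the kind of comparison involved, and emphatically not the probabilistic metric of reference [1], the sketch below scores three hypothetical structural ensembles against a noisy, one-dimensional ‘measured’ density profile using a Gaussian likelihood with an assumed noise level, and picks the ensemble most consistent with the data:

```python
# A toy sketch only, with invented data: the real work would apply the
# probabilistic validation metric of reference [1] to full electron-density
# fields. Here each hypothetical candidate ensemble is reduced to a 1-D density
# profile and scored with a Gaussian likelihood under an assumed noise level.
import numpy as np

def log_likelihood(predicted, measured, sigma):
    """Log-likelihood of the measured profile given a prediction, assuming
    independent Gaussian measurement noise of standard deviation sigma."""
    residual = measured - predicted
    return -0.5 * np.sum((residual / sigma) ** 2 + np.log(2 * np.pi * sigma ** 2))

# Invented 'measurement': a Gaussian density profile plus noise.
x = np.linspace(0, 1, 200)
rng = np.random.default_rng(1)
measured = np.exp(-((x - 0.50) / 0.10) ** 2) + rng.normal(0, 0.05, x.size)

# Three hypothetical structural ensembles, each predicting a density profile.
candidates = {
    "ensemble_A": np.exp(-((x - 0.50) / 0.10) ** 2),
    "ensemble_B": np.exp(-((x - 0.45) / 0.10) ** 2),
    "ensemble_C": np.exp(-((x - 0.50) / 0.15) ** 2),
}

scores = {name: log_likelihood(pred, measured, sigma=0.05)
          for name, pred in candidates.items()}
best = max(scores, key=scores.get)
print(scores, "-> most consistent with the measurement:", best)
```

The real problem replaces these toy profiles with full electron-density fields, and the ad hoc likelihood with a validation metric that delivers a defined level of confidence.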

The PhD project is part of the dual PhD programme of the University of Liverpool and National Tsing Hua University.  Funding is available in the form of a fee waiver and a contribution to living expenses for four years of study, involving significant periods (preferably two years) at each university.  For more information follow this link.

References:

[1] Dvurecenska, K., Graham, S., Patelli, E. & Patterson, E.A., A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.

[2] Chan, J., Lin, H-R., Takemura, K., Chang, K-C., Chang, Y-Y., Joti, Y., Kitao, A. & Yang, L-W., An efficient timer and sizer of protein motions reveals the time-scales of functional dynamics in the ribosome, bioRxiv, 2018, https://www.biorxiv.org/content/early/2018/08/03/384511.

Image: A diffraction pattern and protein structure from http://xray.bmc.uu.se/xtal/

Models as fables

In his book, ‘Economic Rules – Why economics works, when it fails and how to tell the difference‘, Dani Rodrik describes models as fables – short stories that revolve around a few principal characters who live in an unnamed generic place and whose behaviour and interaction produce an outcome that serves as a lesson of sorts.  This seems to me to be a healthy perspective compared to the almost slavish belief in computational models that is common today in many quarters.  However, in engineering and increasingly in precision medicine, we use computational models as reliable and detailed predictors of the performance of specific systems.  Quantifying this reliability in a way that is useful to non-expert decision-makers is a current area of my research.  This work originated in aerospace engineering, where it is possible, though expensive, to acquire comprehensive and information-rich data from experiments and then to validate models by comparing their predictions to measurements.  We have progressed to nuclear power engineering, in which the extreme conditions and time-scales lead to sparse or incomplete data that make it more challenging to assess the reliability of computational models.  Now, we are just starting to consider models in computational biology, where the inherent variability of biological data and our inability to control the real world present even bigger challenges to establishing model reliability.
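As a purely hypothetical sketch of why sparse data make this assessment harder, the snippet below (all numbers invented for illustration) estimates the same model-measurement discrepancy from an information-rich data set and from a sparse one, and shows how the confidence interval on that discrepancy widens as the data thin out:

```python
# A hypothetical sketch with invented numbers: the same model bias is estimated
# from an information-rich data set and from a sparse one, and the confidence
# interval on the estimated discrepancy widens as the data become sparse.
import numpy as np

def mean_discrepancy_ci(predicted, measured, z=1.96):
    """Mean of (measured - predicted) with an approximate 95% confidence interval."""
    d = measured - predicted
    half_width = z * d.std(ddof=1) / np.sqrt(d.size)
    return d.mean(), half_width

rng = np.random.default_rng(2)
true_bias, noise = 0.02, 0.05              # assumed model bias and measurement noise

for n in (1000, 8):                        # comprehensive versus sparse data
    predicted = np.zeros(n)                # model prediction at n measurement points
    measured = true_bias + rng.normal(0, noise, n)
    mean, half = mean_discrepancy_ci(predicted, measured)
    print(f"n = {n:4d}: discrepancy = {mean:+.3f} +/- {half:.3f}")
```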

Sources:

Dani Rodrik, Economic Rules: Why economics works, when it fails and how to tell the difference, Oxford University Press, 2015.

Patterson, E.A., Taylor, R.J. & Bankhead, M., A framework for an integrated nuclear digital environment, Progress in Nuclear Energy, 87:97-103, 2016.

Hack, E., Lampeas, G. & Patterson, E.A., An evaluation of a protocol for the validation of computational solid mechanics models, J. Strain Analysis, 51(1):5-13, 2016.

Patterson, E.A., Challenges in experimental strain analysis: interfaces and temperature extremes, J. Strain Analysis, 50(5):282-3, 2015.

Patterson, E.A., On the credibility of engineering models and meta-models, J. Strain Analysis, 50(4):218-220, 2015.