
Evolutionary model of knowledge management

Towards the end of last year, I wrote about the challenges in deploying digital technologies in holistic approaches to knowledge management in order to gain organisational value and competitive advantage [see ‘Opportunities lost in knowledge management using digital technology’ on October 25th, 2023].  Almost on the last working day of 2023, we had an article published in PLOS ONE (my first in the journal) in which we explored ‘The impact of digital technologies on knowledge networks in two engineering organisations’.  We used social network analysis and semi-structured interviews to investigate the culture around knowledge management, and the deployment of digital technologies in support of it, in an engineering consultancy and an electricity generator.  The two organisations had different cultures and levels of deployment of digital technologies.  We proposed a new evolutionary model of the culture of knowledge management, based on Hudson’s evolutionary model of safety culture that is widely used in industry.  Our new model is illustrated in the figure from our article, starting from ‘Ignored: we have no knowledge management and no plans for knowledge management’ through to ‘Embedded: knowledge management is integrated naturally into the daily workflow’.  Based on our findings for the two engineering organisations, we also proposed that social networks could be used as an indicator of the stage of evolution of knowledge management, with low network density and dispersed networks representing the higher stages of evolution.
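
As a rough illustration of the network density measure mentioned above, here is a minimal sketch using the networkx library; the people and knowledge-sharing ties are entirely invented for illustration and are not the data from our study.

```python
# Minimal sketch of the social network metric mentioned above: network
# density, i.e. the fraction of possible ties that actually exist.
# The edge list is hypothetical; in the study the ties came from survey
# and interview data about who shares knowledge with whom.
import networkx as nx

# Hypothetical knowledge-sharing ties between seven staff members
ties = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"), ("E", "F")]
G = nx.Graph()
G.add_nodes_from("ABCDEFG")
G.add_edges_from(ties)

density = nx.density(G)  # edges present / edges possible, between 0 and 1
print(f"Network density: {density:.2f}")
```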

Sources:

Hudson, P.T.W., 2001. Safety management and safety culture: the long, hard and winding road. Occupational Health and Safety Management Systems, pp.3-32.

Patterson, E.A., Taylor, R.J. and Yao, Y., 2023. The impact of digital technologies on knowledge networks in two engineering organisations. PLoS ONE, 18(12): e0295250.


Machine learning weather forecasts and black swan events

A couple of weeks ago I read about Google’s new weather forecasting algorithm, GraphCast.  It takes a radical new approach to forecasting by using machine learning rather than modelling the weather using the laws of physics [see ‘Storm in a computer‘ on November 16th, 2022].  GraphCast uses a graph neural network that has been trained on 39 years (1979-2017) of historical data from the European Centre for Medium-Range Weather Forecasts (ECMWF).  It requires two inputs, the current state of the weather and the state six hours ago, and then predicts the weather six hours ahead with a 0.25 degree latitude-longitude resolution (about 17 miles) at 38 vertical levels.  This compares with ECMWF’s high-resolution forecasts, which have 0.1 degree resolution (about 7 miles), 137 levels and one-hour timesteps.  Although the training of the neural network took about four weeks on 32 Cloud TPU v4 devices (Tensor Processing Units), a forecast requires less than a minute on a single device, whereas ECMWF’s high-resolution forecast requires a couple of hours on a supercomputer.  Within a day or so of reading about GraphCast, we watched ‘The Day After Tomorrow’, a movie in which a superstorm suddenly plunges the entire northern hemisphere into an ice age with dramatic consequences.  Part of the movie’s message is that humanity’s disregard for the state of the planet could lead to existential consequences.  It occurred to me that the traditional approach to weather forecasting using the laws of physics might predict the onset of such a superstorm and avoid it becoming a black swan event; however, it is very unlikely that forecasts based on machine learning would predict it, because there is nothing like it in the historical record used to train the neural network.  So, for the moment, we should continue to use the laws of physics to model and predict the weather, since climate change appears to be making superstorms more likely [see ‘More violent storms‘ on March 1st 2017].
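
To make the forecasting approach a little more concrete, here is a toy sketch of the autoregressive rollout described above: a model maps the state six hours ago and the current state to the state six hours ahead, and longer forecasts are produced by feeding predictions back in.  The ‘model’ below is a deliberately trivial stand-in written with NumPy; it is not GraphCast, whose trained graph neural network is not reproduced here, and the gridded field is invented for illustration.

```python
# Toy illustration of the rollout described above: a model maps
# (state 6 h ago, current state) -> state 6 h ahead, and longer forecasts
# come from repeatedly feeding predictions back in. The "model" here is a
# meaningless placeholder, NOT GraphCast's graph neural network.
import numpy as np

def toy_model(state_prev, state_now):
    # Placeholder dynamics: simple linear extrapolation between snapshots.
    return state_now + 0.5 * (state_now - state_prev)

def rollout(state_prev, state_now, n_steps):
    """Advance the forecast by n_steps x 6 hours, feeding predictions back in."""
    states = [state_prev, state_now]
    for _ in range(n_steps):
        states.append(toy_model(states[-2], states[-1]))
    return states[2:]

# Two input snapshots of a tiny gridded field standing in for the weather state
rng = np.random.default_rng(0)
t_minus_6h = rng.normal(size=(4, 8))                        # coarse lat-lon grid
t_now = t_minus_6h + rng.normal(scale=0.1, size=(4, 8))

forecast = rollout(t_minus_6h, t_now, n_steps=4)            # 24 hours in 6 h steps
print(len(forecast), forecast[-1].shape)
```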

Sources:

Blum A, The weather forecast may show AI storms ahead, FT Weekend, 18/19 November 2023.

Lam, R., Sanchez-Gonzalez, A., Willson, M., Wirnsberger, P., Fortunato, M., Alet, F., Ravuri, S., Ewalds, T., Eaton-Rosen, Z., Hu, W. and Merose, A., 2023. Learning skillful medium-range global weather forecasting. Science, doi: 10.1126/science.adi2336.

Image: Painting by Sarah Evans owned by the author.


Chirping while calculating probabilities

A couple of weeks ago, I visited the London headquarters of IBM in the UK and Ireland for discussions about possible areas of collaboration in research and education.  At the end of our meeting, we were taken to see some of their latest developments, one of which was their Quantum System One computer.  We had seen its casing, a shiny silver cylinder about half a metre in diameter and a metre and a half long with a hemispherical lower end, hanging in a sealed glass cube in the lobby of the building.  The computer we viewed was also suspended from the ceiling of a sealed glass cube in order to isolate it from vibration, but was without its cylindrical cover so that we could see its innards, which need to be cooled to cryogenic temperatures.  The room in which it was displayed was darkened and a soundtrack of the computer operating added to the atmosphere – it sounded like birds chirping.  IBM are already operating quantum computers, starting in 2019 with a 27-qubit processor and achieving 433 qubits last year, with plans for 4,158+ qubits in 2025 in their roadmap.  There are about 80 companies focussed on quantum computing worldwide, including Universal Quantum, who are working on a million-qubit computer.  Qubit is short for quantum bit, the quantum mechanical analogue of a classical computer bit.  A classical bit works in binary and can only have a value of 0 or 1, whereas a qubit holds information about the probability amplitudes for 0 and 1, whose squared magnitudes always sum to one.  The use of probability amplitudes allows complex systems to be described more efficiently and larger solution spaces to be explored.  IBM’s quantum processors are thin wafers about the same size as the one in your laptop, but their need for cryogenic temperatures and vibration isolation means we will not be using them at home any time soon.
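
The statement about probability amplitudes can be checked numerically.  The short sketch below uses arbitrary illustrative amplitudes and no quantum hardware or software development kit; it simply shows a qubit state as a pair of complex amplitudes whose squared magnitudes sum to one and give the probabilities of measuring 0 or 1.

```python
# Numerical illustration of the qubit description above: a qubit state is a
# pair of complex probability amplitudes (alpha, beta) whose squared
# magnitudes sum to one and give the probabilities of measuring 0 or 1.
# The particular amplitudes are arbitrary and chosen only for illustration.
import numpy as np

alpha, beta = 1 / np.sqrt(3), np.sqrt(2 / 3) * 1j   # |alpha|^2 = 1/3, |beta|^2 = 2/3
state = np.array([alpha, beta])

p0, p1 = np.abs(state) ** 2
assert np.isclose(p0 + p1, 1.0)        # normalisation: probabilities sum to one
print(f"P(measure 0) = {p0:.3f}, P(measure 1) = {p1:.3f}")

# Sampling simulated measurements reproduces these probabilities on average.
rng = np.random.default_rng(1)
samples = rng.choice([0, 1], size=10_000, p=[p0, p1])
print("Fraction of 1s observed:", samples.mean())
```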

Opportunities lost in knowledge management using digital technology

Regular readers of this blog will know that I occasionally feature publications from my research group.  The most recent was ‘Predicting release rates of hydrogen from stainless steel’ on September 13th, 2023, and before that ‘Label-free real-time tracking of individual bacterium’ on January 25th 2023 and ‘A thermal emissions-based real-time monitoring system for in situ detection of cracks’ in ‘Seeing small changes is a big achievement’ on October 26th 2023.  The subjects of these publications might seem a long way apart, but they are linked by my interest in trying to measure events in the real world and use the data to develop and validate high-fidelity digital models.  Recently, I have stretched my research interests still further by supervising a clutch of PhD students with a relatively new collaborator working in the social sciences.  Two of the students have had their first papers published by the ASME (American Society of Mechanical Engineers) and the IEEE (Institute of Electrical and Electronics Engineers).  Their papers are not directly connected, but they both explore the use of published information to gain new insights into a topic.  In the first one [1], we have explored the similarities and differences between the safety cases for three nuclear reactors: a pair of research reactors, one fission and one fusion, and a commercial fission reactor.  We have developed a graphical representation of the safety features in the reactors and their relationships to the fundamental safety principles set out by the nuclear regulators.  This has allowed us to gain a better understanding of the hazard profiles of fission and fusion reactors that could be used to create the safety case for a commercial fusion reactor.  Fundamentally, this paper is about exploiting existing knowledge and looking at it in a new way to gain fresh insights, which we did manually rather than automating the process using digital technology.  In the second paper [2], we have explored the extent to which digital technologies are being used to create, collate and curate knowledge during and beyond the life-cycle of an engineering product.  We found that these processes were happening, but generally not in a holistic manner.  Consequently, opportunities were being lost by not deploying digital technology in knowledge management to undertake multiple roles simultaneously, e.g., acting as repositories, transactive memory systems (group-level knowledge sharing), communication spaces, boundary objects (contact points between multiple disciplines, systems or worlds) and non-human actors.  There are significant challenges, as well as competitive advantages and organisational value to be gained, in deploying digital technology in holistic approaches to knowledge management.  However, despite the rapid advances in machine learning and artificial intelligence [see ‘Update on position of AI on hype curve: it cannot dream’ on July 26th 2023] that will certainly accelerate and enhance knowledge management in a digital environment, a human is still required to realise the value of the knowledge and use it creatively.

References

  1. Nguyen, T., Patterson, E.A., Taylor, R.J., Tseng, Y.S. and Waldon, C., 2023. Comparative maps of safety features for fission and fusion reactors. Journal of Nuclear Engineering and Radiation Science, pp.1-24.
  2. Yao, Y., Patterson, E.A. and Taylor, R.J., 2023. The Influence of Digital Technologies on Knowledge Management in Engineering: A Systematic Literature Review. IEEE Transactions on Knowledge and Data Engineering.