Category Archives: Engineering

More on fairy lights and volume decomposition (with ice cream included)

Last June, I wrote about representing five-dimensional data using a three-dimensional stack of transparent cubes containing fairy lights whose brightness varied with time, and also using feature vectors in which the data are compressed into a relatively short string of numbers [see ‘Fairy lights and decomposing multi-dimensional datasets’ on June 14th, 2023].  After many iterations, we have finally had an article published describing our method of orthogonally decomposing multi-dimensional data arrays using Chebyshev polynomials.  In this context, orthogonal means that the components of the resultant feature vector are statistically independent of one another.  The decomposition process consists of fitting a particular family of polynomials, or equations, to the data by varying the coefficients of the polynomials; the values of the coefficients become the components of the feature vector.  This is what we do when we fit a straight line of the form y = mx + c to a set of values of x and y: the coefficients m and c can then be used to compare data from different sources, instead of comparing the datasets themselves.  For example, x and y might be the daily sales of ice cream and the daily average temperature, with different datasets relating to different locations.  Of course, it is much harder for data that are non-linear and vary with w, x, y and z, such as the intensity of light in the stack of transparent cubes with fairy lights inside.  In our article, we did not use fairy lights or ice cream sales; instead, we compared measurements and predictions in two case studies: the internal stresses in a simple composite specimen and the time-varying surface displacements of a vibrating panel.
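The feature-vector idea can be sketched in one dimension using NumPy's Chebyshev fitting routines. This is only a toy illustration with invented data, not the decomposition algorithm from our article, which works on multi-dimensional arrays:

```python
import numpy as np

# Toy 1D illustration: fit a Chebyshev series to noisy data and use the
# fitted coefficients as a compact feature vector describing the dataset.
rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 200)
signal = np.cos(2 * np.pi * x) + 0.3 * x**2   # invented underlying signal
data = signal + 0.05 * rng.standard_normal(x.size)

# Fit a degree-8 Chebyshev series; the 9 coefficients are the feature vector.
coeffs = np.polynomial.chebyshev.chebfit(x, data, deg=8)
feature_vector = coeffs  # 9 numbers summarise 200 data points

# Two datasets can now be compared via their feature vectors (e.g. a
# Euclidean distance between them) instead of point by point.
reconstruction = np.polynomial.chebyshev.chebval(x, coeffs)
rms_error = np.sqrt(np.mean((reconstruction - data) ** 2))
```

The compression is the point: the nine coefficients capture the shape of the data well enough that the root-mean-square residual is of the order of the noise added to the signal.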

The image shows, as colour, the normalised out-of-plane displacements of the surface of a panel, represented by the xy-plane, as a function of time in the z-direction.

Source:

Amjad KH, Christian WJ, Dvurecenska KS, Mollenhauer D, Przybyla CP, Patterson EA. Quantitative Comparisons of Volumetric Datasets from Experiments and Computational Models. IEEE Access. 11: 123401-123417, 2023.

Evolutionary model of knowledge management

Towards the end of last year, I wrote about the challenges in deploying digital technologies in holistic approaches to knowledge management in order to gain organizational value and competitive advantage [see ‘Opportunities lost in knowledge management using digital technology’ on October 25th, 2023].  Almost on the last working day of 2023, we had an article published in PLOS ONE (my first in the journal) in which we explored ‘The impact of digital technologies on knowledge networks in two engineering organizations’.  We used social network analysis and semi-structured interviews to investigate the culture around knowledge management, and the deployment of digital technologies in support of it, in an engineering consultancy and an electricity generator.  The two organizations had different cultures and different levels of deployment of digital technologies.  We proposed a new evolutionary model of the culture of knowledge management based on Hudson’s evolutionary model of safety culture, which is widely used in industry. Our new model is illustrated in the figure from our article, starting from ‘Ignored: we have no knowledge management and no plans for knowledge management’ through to ‘Embedded: knowledge management is integrated naturally into the daily workflow’.  We also proposed that social networks could be used as an indicator of the stage of evolution of knowledge management, with low network density and dispersed networks representing higher stages of evolution, based on our findings for the two engineering organizations.
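Network density, one of the social network analysis measures mentioned above, is simple to compute: the fraction of possible ties that actually exist. A minimal sketch for a hypothetical knowledge-sharing network (the names and ties are invented, not data from our study):

```python
# Directed ties in a hypothetical "who asks whom for advice" network.
ties = {
    ("Ana", "Ben"), ("Ben", "Cara"), ("Cara", "Ana"),
    ("Dev", "Ana"),
}
people = {p for pair in ties for p in pair}

# Density = actual ties / possible ties (directed network, no self-ties).
n = len(people)
possible_ties = n * (n - 1)
density = len(ties) / possible_ties
print(f"{n} people, {len(ties)} ties, density = {density:.2f}")
```

In practice such ties come from survey or interview data, and the density is compared across teams or organisations rather than read in isolation.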

Sources:

Hudson, P.T.W., Safety management and safety culture: the long, hard and winding road. Occupational Health and Safety Management Systems, pp. 3-32, 2001.

Patterson EA, Taylor RJ, Yao Y. The impact of digital technologies on knowledge networks in two engineering organisations. PLoS ONE 18(12): e0295250, 2023.

 

Machine learning weather forecasts and black swan events

A couple of weeks ago I read about Google’s new weather forecasting algorithm, GraphCast.  It takes a radical new approach to forecasting by using machine learning rather than modelling the weather using the laws of physics [see ‘Storm in a computer‘ on November 16th, 2022].  GraphCast uses a graph neural network that has been trained on 39 years (1979-2017) of historical data from the European Centre for Medium-Range Weather Forecasts (ECMWF). It requires two inputs, the current state of the weather and the state six hours ago, and then predicts the weather six hours ahead on a 0.25-degree latitude-longitude grid (about 17 miles) at 38 vertical levels.  This compares to ECMWF’s high-resolution forecasts, which have 0.1-degree resolution (about 7 miles), 137 levels and 1-hour timesteps.  Although the training of the neural network took about four weeks on 32 Cloud TPU v4 devices (Tensor Processing Units), a forecast requires less than a minute on a single device, whereas ECMWF’s high-resolution forecast requires a couple of hours on a supercomputer.  Within a day or so of reading about GraphCast, we watched ‘The Day After Tomorrow’, a movie in which a superstorm suddenly plunges the entire northern hemisphere into an ice age with dramatic consequences.  Part of the movie’s message is that humanity’s disregard for the state of the planet could lead to existential consequences.  It occurred to me that the traditional approach to weather forecasting using the laws of physics might predict the onset of such a superstorm and prevent it from becoming a black swan event; however, it is very unlikely that forecasts based on machine learning would predict it, because there is nothing like it in the historical record used to train the neural network.
So for the moment we should continue to use the laws of physics to model and predict the weather since climate change appears to be making superstorms more likely [see ‘More violent storms‘ on March 1st 2017].
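The two-inputs-ahead-by-six-hours scheme described above is autoregressive: longer forecasts are built by feeding the model’s own output back in. A toy sketch of that rollout, in which a trivial stand-in (linear extrapolation) replaces the learned network:

```python
import numpy as np

def step(prev_state: np.ndarray, curr_state: np.ndarray) -> np.ndarray:
    """Stand-in for the learned model: maps (state 6h ago, current state)
    to the state 6h ahead, here by simple linear extrapolation in time."""
    return curr_state + (curr_state - prev_state)

def rollout(prev_state, curr_state, n_steps):
    """Roll the six-hourly model forward n_steps, reusing its own output."""
    states = []
    for _ in range(n_steps):
        nxt = step(prev_state, curr_state)
        states.append(nxt)
        prev_state, curr_state = curr_state, nxt
    return states

# A 10-day forecast at six-hour steps is 40 applications of the model.
grid = (2, 4, 8)  # tiny toy grid; GraphCast uses 38 levels on a 0.25° grid
t_minus_6h = np.zeros(grid)
t_now = np.full(grid, 0.1)
forecast = rollout(t_minus_6h, t_now, n_steps=40)
```

The feedback loop is also why training data matters so much: any error, or any situation absent from the training record, is carried forward and compounded at every step.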

Sources:

Blum A, The weather forecast may show AI storms ahead, FT Weekend, 18/19 November 2023.

Lam R, Sanchez-Gonzalez A, Willson M, Wirnsberger P, Fortunato M, Alet F, Ravuri S, Ewalds T, Eaton-Rosen Z, Hu W, Merose A. Learning skillful medium-range global weather forecasting. Science. 10.1126/science.adi2336, 2023.

Image: Painting by Sarah Evans owned by the author.

 

Chirping while calculating probabilities

A couple of weeks ago, I visited the London headquarters of IBM in the UK and Ireland for discussions about possible areas of collaboration in research and education.  At the end of our meeting, we were taken to see some of their latest developments, one of which was their Quantum System One computer.  We had seen its casing, a shiny silver cylinder about half a metre in diameter and a metre and a half long with a hemispherical lower end, hanging in a sealed glass cube in the lobby of the building.  The computer we viewed was also suspended from the ceiling of a sealed glass cube in order to isolate it from vibration, but was without its cylindrical cover so that we could see its innards, which need to be cooled to cryogenic temperatures.  The room in which it was displayed was darkened, and a soundtrack of the computer operating added to the atmosphere – it sounded like birds chirping.  IBM are already operating quantum computers, starting in 2019 with a 27-qubit processor and achieving 433 qubits last year, with plans for 4,158+ qubits in 2025 in their roadmap.  There are about 80 companies focussed on quantum computing worldwide, including Universal Quantum, who are working on a million-qubit computer.  Qubit is short for quantum bit and is the quantum mechanical analogue of a classical computer bit.  A computer bit works in binary and can only have a value of 0 or 1, whereas a qubit holds the probability amplitudes for 0 and 1, whose squared magnitudes always sum to 1.  The use of probability amplitudes allows complex systems to be described more efficiently and larger solution spaces to be explored.  IBM’s quantum processors are thin wafers about the same size as the one in your laptop, but their need for cryogenic temperatures and vibration isolation means we will not be using them at home any time soon.
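The amplitude picture can be made concrete in a few lines of NumPy. This is a bare sketch of a single qubit state, not how a quantum computer is programmed:

```python
import numpy as np

# A single qubit state: two complex probability amplitudes whose squared
# magnitudes give the probabilities of measuring 0 or 1.
state = np.array([1 / np.sqrt(2), 1j / np.sqrt(2)])  # an equal superposition

probabilities = np.abs(state) ** 2          # chance of measuring 0 or 1
assert np.isclose(probabilities.sum(), 1.0)  # normalisation: must sum to 1

# A classical bit would be [1, 0] or [0, 1]; the qubit can be any normalised
# complex combination, which is why n qubits describe a space of 2**n
# amplitudes rather than just n binary values.
```

That exponential growth of the amplitude space with the number of qubits is what the "larger solution spaces" above refers to.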