Digital twins are becoming ubiquitous in many areas of engineering [see ‘Can you trust your digital twin?’ on November 23rd, 2016]. At the same time, the terminology is becoming blurred as digital shadows and digital models are treated as if they were synonymous with digital twins. A digital model is a digitised replica of a physical entity that lacks any automatic data exchange between the entity and its replica. A digital shadow is a digital representation of a physical object with a one-way flow of information from the object to its representation. A digital twin, however, is a functional representation with a live feedback loop to its counterpart in the real world. The feedback loop is based on continuous updates to the digital twin about the condition and performance of the physical entity, based on data from sensors, and on analysis from the digital twin about the performance of the physical entity. This enables a digital twin to provide a service to many stakeholders. For example, the users of a digital twin of an aircraft engine could include the manufacturer, the operator, the maintenance providers and the insurers. These capabilities imply that digital twins are themselves becoming products which exist in a digital context that might connect many digital products, thus forming an integrated digital environment. I wrote about integrated digital environments when they were a concept and the primary challenges were technical in nature [see ‘Enabling or disruptive technology for nuclear engineering?’ on January 28th, 2015]. Many of these technical challenges have been resolved and the next set of challenges are economic and commercial ones associated with launching digital twins into global markets that lack adequate understanding, legislation, security, regulation or governance for digital products.
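The distinction between the three terms comes down to the direction of data flow, and it can be made explicit in a short sketch. The Python classes below are purely illustrative (the class and method names are my own invention, not any standard): a model exchanges no data automatically, a shadow receives data one way, and a twin both receives sensor data and feeds analysis back.

```python
# Illustrative sketch: the difference between a digital model, a digital
# shadow and a digital twin lies in how data flows between the physical
# entity and its digital representation.

class PhysicalEntity:
    def __init__(self):
        self.condition = "nominal"

    def sensor_reading(self):
        # data captured from sensors on the physical entity
        return {"condition": self.condition}

    def apply_adjustment(self, adjustment):
        # the physical entity can be adjusted based on analysis
        self.condition = adjustment


class DigitalModel:
    """A digitised replica with no automatic data exchange."""
    def __init__(self, snapshot):
        self.state = snapshot  # fixed at creation; updated only manually


class DigitalShadow(DigitalModel):
    """One-way flow: physical entity -> digital representation."""
    def update_from(self, entity):
        self.state = entity.sensor_reading()


class DigitalTwin(DigitalShadow):
    """Two-way feedback loop: sensor data in, analysis out."""
    def feed_back_to(self, entity):
        # analysis from the twin informs action on the physical entity
        if self.state.get("condition") != "nominal":
            entity.apply_adjustment("nominal")


engine = PhysicalEntity()
twin = DigitalTwin(engine.sensor_reading())
engine.condition = "degraded"
twin.update_from(engine)      # shadow behaviour: data flows in
twin.feed_back_to(engine)     # twin behaviour: analysis flows back
print(engine.condition)       # restored to "nominal" by the feedback loop
```

Only the twin closes the loop: remove `feed_back_to` and you are left with a shadow; remove `update_from` as well and only a static model remains.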
In collaboration with my colleagues at the Virtual Engineering Centre, we have recently published a white paper, entitled ‘Transforming digital twins into digital products that thrive in the real world’, that reviews these issues and identifies the need to establish digital contexts that embrace the social, economic and technical requirements for the appropriate use of digital twins [see ‘Digital twins could put at risk what it means to be human’ on November 18th, 2020].
Cars that run on air might seem like a fairy tale or an April Fools story; but it is possible to use air as a medium for storing energy by compressing it or liquefying it at -196°C. The MDI company in Luxembourg has been developing and building a compressed air engine which powers a small car, the Airpod 2.0, and a new industrial vehicle, the Air’Volution. When the compressed air is allowed to expand, the energy stored in it is released and can be used to power the vehicle. The Airpod 2.0 weighs only 350 kg, has seats for two people, 400 litres of luggage space and an urban cycle range of 100 to 120 km at a top speed of 80 km/h. So, it is an urban runabout with zero emissions and no requirement for lithium, nickel or cobalt for batteries, but a limited range. A couple of years ago I tasked an MSc student with a project to consider the practicalities of a car running on liquid air, based on the premise that it should be possible to store a higher density of energy in liquefied air (about 290 kJ/litre) than in compressed air (about 100 kJ/litre). His concept design used a rolling piston engine to power a family car capable of carrying 5 passengers and 346 litres of luggage over a range of 160 km. So, his design carried a bigger payload further than the Airpod 2.0; however, like the electric charging system described a few weeks ago [see ‘Innovative design too far ahead of the market’ on May 5th, 2021], the design never left the drawing board.
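Using the approximate energy densities quoted above, a quick back-of-the-envelope comparison shows why liquid air looked attractive. This is only a sketch: the 300-litre tank volume is an illustrative figure of my own, not from either design, and it assumes range scales roughly with stored energy.

```python
# Back-of-the-envelope comparison of the two storage media, using the
# approximate energy densities quoted in the post.
compressed_air = 100   # kJ per litre (compressed air)
liquid_air = 290       # kJ per litre (air liquefied at -196 C)

tank_volume = 300      # litres (illustrative tank size, not from either design)

e_compressed = compressed_air * tank_volume   # kJ stored as compressed air
e_liquid = liquid_air * tank_volume           # kJ stored as liquid air

print(f"Compressed air: {e_compressed:,} kJ")    # 30,000 kJ
print(f"Liquid air:     {e_liquid:,} kJ")        # 87,000 kJ
print(f"Ratio: {e_liquid / e_compressed:.1f}x")  # 2.9x
```

The factor of nearly three in stored energy for the same tank volume is what made the bigger payload and longer range of the student's concept design plausible.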
The economists John Kay and Mervyn King assert in their book ‘Radical Uncertainty – decision-making beyond numbers’ that ‘economic forecasting is necessarily harder than weather forecasting’ because the world of economics is non-stationary whereas the weather is governed by unchanging laws of nature. Kay and King observe that both central banks and meteorological offices have ‘to convey inescapable uncertainty to people who crave unavailable certainty’. In other words, the necessary assumptions and idealisations, combined with the inaccuracies of the input data, of both economic and meteorological models produce inevitable uncertainty in the predictions. However, people seeking to make decisions based on the predictions want certainty because it is very difficult to make choices when faced with uncertainty – it raises our psychological entropy [see ‘Psychological entropy increased by ineffective leaders’ on February 10th, 2021]. Engineers face similar difficulties in providing systems with inescapable uncertainties to people desiring unavailable certainty about their reliability. The second law of thermodynamics ensures that perfection is unattainable [see ‘Impossible perfection’ on June 5th, 2013] and there will always be flaws of some description present in a system [see ‘Scattering electrons reveal dislocations in material structure’ on November 11th, 2020]. Of course, we can expend more resources to eliminate flaws and increase the reliability of a system, but the second law will always limit our success. Consequently, to finish where I started with a quote from Kay and King, ‘certainty is unattainable and the price of near-certainty unaffordable’ in both economics and engineering.
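The escalating price of near-certainty can be illustrated with the standard series-system reliability formula: a system of n independent components that fails when any one component fails has reliability R = rⁿ, so the required component reliability is r = R^(1/n). The numbers below are arbitrary illustrative values of my own, not from any real system, but they show how each extra ‘nine’ of system reliability demands far more of every part.

```python
# Required component reliability r for a series system of n independent
# components to achieve a target system reliability R, using R = r**n,
# i.e. r = R**(1/n).
def required_component_reliability(target_R, n):
    return target_R ** (1.0 / n)

n = 1000  # illustrative number of components in series
for target in (0.90, 0.99, 0.999):
    r = required_component_reliability(target, n)
    # each step closer to certainty demands far more of every component
    print(f"System target {target}: each component needs r = {r:.6f}")
```

Pushing the system target from 90% towards 99.9% forces every one of the thousand components towards six nines of reliability, which is one way of seeing why ‘the price of near-certainty’ becomes unaffordable.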
I am teaching thermodynamics to first year undergraduate students at the moment and in most previous years this experience has stimulated me to blog about thermodynamics [for example: ‘Isolated systems in nature?’ on February 12th, 2020]. However, this year I am more than half-way through the module and this is the first post on the topic. Perhaps that is an impact of teaching on-line via live broadcasts rather than the performance involved in lecturing to hundreds of students in a lecture theatre. Last week I introduced the second law of thermodynamics and explained its origins in efforts to improve the efficiency of steam engines by 19th century engineers and physicists, including Rudolf Clausius (1822 – 1888), William Thomson (1827 – 1907) and Ludwig Boltzmann (1844 – 1906). The second law of thermodynamics states that the entropy of the universe increases during all real processes, where entropy can be described as the degree of disorder. The traditional narrative is that thermodynamics was developed by the Victorians; however, I think that the ancient Greeks had a pretty good understanding of it without calling it thermodynamics. Heraclitus (c. 535 BCE – c. 475 BCE) understood that everything is in flux and nothing is at rest, so that the world is one colossal process. This concept comes close to the modern interpretation of the second law of thermodynamics, in which the entropy of the universe is constantly increasing, leading to continuous change. Heraclitus just did not state the direction of the flux. Unfortunately, Plato (c. 429 BCE – c. 347 BCE) did not agree with Heraclitus, but thought that some divine intervention had imposed order on pre-existing chaos to create an ordered universe, which precludes a constant flux and probably set back Western thought for a couple of millennia.
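The two formulations mentioned above can be written compactly. The inequality is the Clausius form of the second law for real processes, and the second equation is Boltzmann's statistical definition of entropy (the formula engraved on his tombstone), which makes precise the description of entropy as the degree of disorder:

```latex
\Delta S_{\mathrm{universe}} \;=\; \Delta S_{\mathrm{system}} + \Delta S_{\mathrm{surroundings}} \;\geq\; 0

S \;=\; k_{B} \ln W
```

Here $k_{B} \approx 1.38 \times 10^{-23}$ J/K is Boltzmann's constant and $W$ is the number of microstates consistent with the observed macrostate: the more microstates available, the greater the disorder and the higher the entropy.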
However, it seems likely that in the 17th century, Newton (1643 – 1727) and Leibniz (1646 – 1716), when they independently invented calculus, had more than an inkling that everything is in flux. In the 18th century, the pioneering geologist James Hutton (1726 – 1797), while examining the tilted layers of the cliff at Siccar Point in Berwickshire, realised that the Earth was not simply created but instead is in a state of constant flux. His ideas were spurned at the time and he was accused of atheism. Boltzmann also had to defend his ideas vigorously, to such an extent that his mental health deteriorated and he committed suicide while on vacation with his wife and daughter. Today, it is widely accepted that the second law of thermodynamics governs all natural and synthetic processes, and many people have heard of entropy [see ‘Entropy on the brain’ on November 29th, 2017] but far fewer understand it [see ‘Two cultures’ on March 5th, 2013]. It is perhaps still controversial to talk about the theoretical long-term consequence of the second law, which is cosmic heat death corresponding to an equilibrium state of maximum entropy and uniform temperature across the universe, such that nothing happens and life cannot exist [see ‘Will it all be over soon?’ on November 2nd, 2016]. This concept caused problems for 19th century thinkers, particularly James Clerk Maxwell (1831 – 1879), and even perhaps for Plato, who theorised two worlds in his theory of forms, one unchanging and the other in constant change, maybe in an effort to dodge the potential implications of the degeneration of the universe into chaos.
Image: decaying ruins of Fountains Abbey beside the River Skell. Heraclitus is reported to have said ‘no man ever steps twice into the same river; for it’s not the same river and he’s not the same man’.