Category Archives: electrical engineering

Follow your gut

[Image: fruit fly nervous system, Albert Cardona, HHMI Janelia Research Campus, Wellcome Image Awards 2015]

Data centres worldwide consume about 1% of global electricity generation, some 200-250 TWh (Masanet et al, 2020), and if you add in the mining of cryptocurrencies then consumption jumps by about 50% (Gallersdörfer et al, 2020). Data transmission consumes a further 260-340 TWh, or at least another 1% of global electricity use (IEA, 2021).  The energy efficiency of modern computers has been improving; however, their consumption is still many millions of times greater than the theoretical limit defined by Landauer’s principle, which was verified experimentally in 2012 by Bérut et al.  According to Landauer’s principle, a computer operating at room temperature would need only about 3 zJ (3 × 10⁻²¹ joules) to erase a bit of information.  Of course, progress is being made almost continuously; for example, a team at EPFL in Lausanne and ETH Zurich recently described a new transistor technology that uses only a tenth of the energy of current transistors (Oliva et al, 2020).  Perhaps we need to turn to biomimetics, because Escherichia coli, the bacteria that live in our gut and have to process information in order to reproduce, have been found to use ten thousand times less energy to process a bit of information than the average human-built device for processing information (Zhirnov & Cavin, 2013).  So, E. coli are still some way from the Landauer limit, but they demonstrate that there is considerable potential for improvement in engineered devices.
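
For anyone who wants to check the figure, the Landauer limit is simply k_B T ln 2; a quick back-of-the-envelope calculation, assuming a room temperature of about 300 K, recovers the 3 zJ quoted above:

```latex
% Back-of-the-envelope check of the Landauer limit (assuming room temperature, T = 300 K)
E_{\min} = k_B T \ln 2
         \approx (1.38 \times 10^{-23}\,\mathrm{J\,K^{-1}}) \times (300\,\mathrm{K}) \times 0.693
         \approx 2.9 \times 10^{-21}\,\mathrm{J} \approx 3\,\mathrm{zJ}
```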

References

Bérut A, Arakelyan A, Petrosyan A, Ciliberto S, Dillenschneider R & Lutz E. Experimental verification of Landauer’s principle linking information and thermodynamics. Nature, 483: 187–189, 2012.

IEA. Data Centres and Data Transmission Networks. IEA, Paris, 2021. https://www.iea.org/reports/data-centres-and-data-transmission-networks

Gallersdörfer U, Klaaßen L, Stoll C. Energy consumption of cryptocurrencies beyond bitcoin. Joule. 4(9):1843-6, 2020.

Masanet E, Shehabi A, Lei N, Smith S, Koomey J. Recalibrating global data center energy-use estimates. Science. 367(6481):984-6, 2020.

Oliva N, Backman J, Capua L, Cavalieri M, Luisier M, Ionescu AM. WSe2/SnSe2 vdW heterojunction Tunnel FET with subthermionic characteristic and MOSFET co-integrated on same WSe2 flake. npj 2D Materials and Applications. 4(1):1-8, 2020.

Zhirnov VV, Cavin RK. Future microsystems for information processing: limits and lessons from the living systems. IEEE Journal of the Electron Devices Society. 1(2):29-47, 2013.

Innovative design too far ahead of the market?

[Image: computer rendering of a street with kerbstones fitted for charging electric vehicles]

The forthcoming COP26 conference in Glasgow is generating much discussion about ambitions to achieve net zero carbon emissions. These ambitions tend to be articulated by national governments or corporate leaders, and less attention is paid to the details of achieving zero emissions at the mundane level of everyday life; for instance, how do you recharge an electric car if you live in an apartment building or a terraced house without a designated parking space?  About six years ago, I supervised an undergraduate engineering student who designed an induction charging pad for electric vehicles integrated into a kerbstone.  The kerbstone looked the same as a conventional one, which it could replace, but was connected to the mains electricity supply under the pavement.  A primary coil was integrated into the kerbstone and a secondary coil was incorporated into the side skirt of the vehicle, which could be lowered towards the kerbstone when the vehicle was parked.  Energy was transferred from the primary coil in the kerbstone to the secondary coil in the vehicle via a magnetic field that conformed to radiation safety limits for household appliances.  Payment for charging was via a passive RFID card that connected to an app on your mobile phone.  The student presented her design at the Future Powertrain Conference (FPC 2015), where her poster won first prize, and we discussed spinning out a company to develop, manufacture and market the design.  However, a blue-chip engineering company offered the student a good job and we decided that the design was probably ahead of its time, so it has remained on the drawing board.  Our technopy, or technology entropy, was too high: we were ahead of the rate of change in the marketplace, and launching a new product in those conditions can be disastrous.  Maybe the market is catching up with our design?
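
For readers unfamiliar with inductive charging, the kerbstone and skirt behave as a loosely coupled transformer: an alternating current in the buried primary coil induces a voltage in the vehicle’s secondary coil through their mutual inductance. The relation below is the generic textbook one, not the parameters of the student’s design:

```latex
% Induced voltage in the secondary coil via mutual inductance (generic textbook relation)
V_s(t) = M \, \frac{\mathrm{d}I_p(t)}{\mathrm{d}t}, \qquad M = k \sqrt{L_p L_s}, \qquad 0 \le k \le 1
```

Lowering the side skirt reduces the gap between the coils and so increases the coupling coefficient k, which is why the skirt was designed to be lowered towards the kerbstone when the vehicle was parked.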

For more on technopy see Handscombe RD and Patterson EA, ‘The Entropy Vector: Connecting Science and Business’, World Scientific, Singapore, 2004.

Million to one

‘All models are wrong, but some are useful’ is a quote, usually attributed to George Box, that is often cited in the context of computer models and simulations.  Working out which models are useful can be difficult, and it is essential to get it right when a model is to be used to design an aircraft, support the safety case for a nuclear power station or inform regulatory risk assessment on a new chemical.  One way to identify a useful model is to assess its predictions against measurements made in the real world [see ‘Model validation’ on September 18th, 2012].  Many people have worked on validation metrics that allow predicted and measured signals to be compared, and some result in a statement of the probability that the predicted and measured signals belong to the same population.  This works well if the predictions and measurements are, for example, the temperature measured at a single weather station over a period of time; however, these validation metrics cannot handle fields of data, for instance a map of temperature, measured with an infrared camera, in a power station during start-up.  We have been working on resolving this issue and have recently published a paper on ‘A probabilistic metric for the validation of computational models’.  We reduce the dimensionality of a field of data, represented by values in a matrix, to a vector using orthogonal decomposition [see ‘Recognizing strain’ on October 28th, 2015].  The data field could be a map of temperature, the strain field in an aircraft wing or the topography of a landscape – it does not matter.  The decomposition is performed separately and identically on the predicted and measured data fields to create two vectors – one each for the predictions and the measurements.  We look at the differences between these two vectors and compare them against the uncertainty in the measurements to arrive at a probability that the predictions belong to the same population as the measurements.  There are subtleties in the process that I have omitted, but essentially we can take two data fields composed of millions of values and arrive at a single number that describes the usefulness of the model’s predictions.
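
The paper sets out the full procedure, but the gist is easy to sketch.  The fragment below is a minimal illustration of the idea, not the published implementation: each field is reduced to a vector of coefficients using an orthogonal decomposition (here, a two-dimensional Chebyshev fit), and the differences between the two coefficient vectors are compared with the measurement uncertainty.  The function names and the simple acceptance criterion are illustrative choices, not those in the paper.

```python
# Minimal, illustrative sketch of a field-validation metric (not the published code).
# Each 2-D field is reduced to a vector of 2-D Chebyshev coefficients; the differences
# between the predicted and measured coefficient vectors are then compared with the
# (expanded) measurement uncertainty.
import numpy as np
from numpy.polynomial import chebyshev as C

def cheb_feature_vector(field, degree=6):
    """Represent a 2-D field by its flattened 2-D Chebyshev coefficients."""
    ny, nx = field.shape
    x = np.linspace(-1.0, 1.0, nx)
    y = np.linspace(-1.0, 1.0, ny)
    X, Y = np.meshgrid(x, y)                       # X, Y have shape (ny, nx)
    V = C.chebvander2d(X.ravel(), Y.ravel(), [degree, degree])
    coeffs, *_ = np.linalg.lstsq(V, field.ravel(), rcond=None)
    return coeffs

def agreement_fraction(measured, predicted, meas_uncertainty, degree=6):
    """Fraction of coefficient differences lying within twice the measurement
    uncertainty: a crude stand-in for the probabilistic metric in the paper."""
    dm = cheb_feature_vector(measured, degree)
    dp = cheb_feature_vector(predicted, degree)
    return float(np.mean(np.abs(dp - dm) <= 2.0 * meas_uncertainty))

# Synthetic example: a linear 'measured' temperature map and a noisy 'prediction'.
rng = np.random.default_rng(0)
measured = np.outer(np.linspace(20.0, 80.0, 200), np.ones(300))
predicted = measured + rng.normal(0.0, 0.5, measured.shape)
print(agreement_fraction(measured, predicted, meas_uncertainty=1.0))
```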

Our paper was published by the Royal Society with a press release, but in the same week as the proposed Brexit agreement, so I would like to think that it was ignored because of the overwhelming interest in the political storm around Brexit rather than because of its esoteric nature.

Source:

Dvurecenska K, Graham S, Patelli E & Patterson EA, A probabilistic metric for the validation of computational models, Royal Society Open Science, 5:180687, 2018.

Happenstance, not engineering?

A few weeks ago I wrote that ‘engineering is all about ingenuity’ [post on September 14th, 2016] and pointed out that while some engineers are involved in designing, manufacturing and maintaining engines, most of us are not.  So, besides being ingenious, what do the rest of us do?  Well, most of us contribute in some way to the conception, building and sustaining of networks.  Communication networks, food supply networks, power networks, transport networks, networks of coastal defences, networks of oil rigs, refineries and service stations, or networks of mines, smelting works and factories that make everything from bicycles to xylophones.  The list is endless in our highly networked society.  A network is a group of interconnected things or people.  And engineers are responsible for all of the nodes in our networks of things and for just about all the connections in our networks of both things and people.

Engineers have been constructing networks by building nodes and connecting them for thousands of years; for instance, the ancient Mesopotamians were building aqueducts to connect their towns with distant water supplies more than four millennia ago.

Engineered networks are so ubiquitous that no one notices them until something goes wrong, which means engineers tend to get blamed more than praised.  But apparently that is the fault of the ultimate network: the human brain.  Recent research has shown that blame and praise are assigned by different mechanisms in the brain, and that blame can be assigned by every location in the brain responsible for emotion, whereas praise comes only from a single location responsible for logical thought.  So we blame more frequently than we praise, and we tend to assume that bad things are deliberate while good things are happenstance.  In the eyes of most people, then, reliable networks are happenstance rather than good engineering!

Sources:

Ngo L, Kelly M, Coutlee CG, Carter RM, Sinnott-Armstrong W & Huettel SA, Two distinct moral mechanisms for ascribing and denying intentionality, Scientific Reports, 5:17390, 2015.

Brueck H, Human brains are wired to blame rather than to praise, Fortune, December 4th, 2015.