# Boltzmann’s brain

Ludwig Boltzmann developed a statistical explanation of the second law of thermodynamics by defining entropy as proportional to the logarithm of the number of ways in which a system can be arranged [see ‘Entropy on the brain‘ on November 29th, 2017].  The mathematical expression of this definition is engraved on his headstone.  The second law states that the entropy of the universe is always increasing, and Boltzmann argued that it implies the universe must have been created in a very low entropy state.  Four decades earlier, in 1854, William Thomson concluded that the dissipation of heat arising from the second law would lead to the ‘death’ of the universe [see ‘Cosmic heat death‘ on February 18th, 2015], while the big bang theory for the creation of the universe evolved about twenty years after Boltzmann’s death.

The probability of the very low entropy state required to bring the universe into existence is very small, because it implies random fluctuations in energy and matter leading to a highly ordered state.  One analogy would be the probability of dead leaves floating on the surface of a pond arranging themselves to spell your name.  It is easy to think of fluctuations that are more likely to occur because they involve smaller systems: one that would bring only our solar system into existence, or, progressively more likely, only our planet, only the room in which you are sitting reading this blog, or only your brain.  The last would imply that everything is in your imagination, and ultimately that is why Boltzmann’s argument is not widely accepted, although we do not have a good explanation for the apparent low entropy state at the start of the universe.

Jean-Paul Sartre wrote in his book Nausea: ‘I exist because I think…and I cannot stop myself from thinking.  At this very moment – it’s frightful – if I exist, it is because I am horrified at existing.’  Perhaps most people would find horrifying the logical extension of Boltzmann’s argument about the start of the universe to everything existing only in our minds.  Boltzmann’s work on statistical mechanics and the second law of thermodynamics is widely accepted and supports the case for his being a genius; however, his work raised more questions than answers and was widely criticised during his lifetime, which contributed to his taking his own life in 1906.
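The definition engraved on Boltzmann’s headstone is usually written as:

```latex
S = k \log W
```

where S is the entropy, k is Boltzmann’s constant and W is the number of ways in which the system can be arranged, i.e. its number of microstates.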

Sources:

Paul Sen, Einstein’s fridge: the science of fire, ice and the universe.  London: Harper Collins, 2021.

Jean-Paul Sartre, Nausea.  London: Penguin Modern Classics, 2000.

# Meta-knowledge: knowledge about knowledge

As engineers, we like to draw simple diagrams of the systems that we are attempting to analyse because most of us are pictorial problem-solvers, and recording the key elements of a problem in a sketch helps us to identify the important issues and select an appropriate solution procedure [see ‘Meta-representational competence’ on May 13th, 2015].  Of course, these simple representations can be misleading if we omit parameters or features that dominate the behaviour of the system; so, there is considerable skill in idealising a system so that the analysis is tractable, i.e. can be solved.  Students find it especially difficult to acquire these skills [see ‘Learning problem-solving skills‘ on October 24th, 2018] and many appear to avoid drawing a meaningful sketch even when examination marks are allocated to it [see ‘Depressed by exams‘ on January 31st, 2018].  In thermodynamics, matters are complicated by the entropy of the system being reduced when we omit parameters in order to idealise it, because with fewer parameters to describe the system there are fewer microstates in which it can exist and hence, according to Boltzmann, the entropy will be lower [see ‘Entropy on the brain‘ on November 29th, 2017].  Perhaps this is the inverse of realising that we understand less as we know more.  In other words, as our knowledge grows, it reveals to us that there is more to know and understand than we can ever hope to comprehend [see ‘Expanding universe‘ on February 7th, 2018]. Is that the second law of thermodynamics at work again, creating more disorder to counter the small amount of order achieved in your brain?
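As a minimal sketch of this idea – assuming, purely for illustration, that each parameter retained in a model can take one of two values, so that a model with n parameters has W = 2^n microstates – Boltzmann’s S = k ln W falls as we omit parameters:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K


def entropy(num_microstates: int) -> float:
    """Boltzmann entropy S = k ln W for W equally likely microstates."""
    return K_B * math.log(num_microstates)


# Illustrative assumption: each retained parameter is binary,
# so a model with n parameters has W = 2**n microstates.
full_model = entropy(2 ** 10)      # ten parameters retained
idealised_model = entropy(2 ** 4)  # six parameters omitted in the idealisation

# Idealising the system (dropping parameters) reduces its entropy.
assert idealised_model < full_model
```

The numbers are hypothetical; the point is only that fewer descriptive parameters means fewer microstates and hence lower entropy.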

Image: Sketch made during an example class

# Subtle balance of sustainable orderliness

I wrote this short essay a couple of weeks ago for another purpose and then changed my mind about using it, so I thought I would share it on this blog.

Whenever we do something, some of our useful resource gets converted into productive activity but some is always lost in useless waste.  In other words, 100% efficiency is impossible – we can’t convert all of our resource into productive activity.  Engineers call this the second law of thermodynamics.  Thermodynamics is about energy transitions, for instance converting chemical energy in fossil fuels into electrical energy in a power station, and in these circumstances, the useless waste is called entropy.  At the time of the industrial revolution, Rudolf Clausius recognised that entropy can be related to the heat losses which occur whenever we do something useful, such as generating electricity in a power station, cleaning the house with an electric vacuum cleaner or running to catch the bus.
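Clausius’s relation, to which this alludes, can be stated for a reversible process as:

```latex
\mathrm{d}S = \frac{\delta Q_{\mathrm{rev}}}{T}
```

where dS is the change in entropy, δQ_rev is the heat transferred reversibly and T is the absolute temperature at which the transfer occurs.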

Clausius’s definition of entropy was really useful for designers of 19th century steam engines but it is difficult to use in other walks of life.  Fortunately, Ludwig Boltzmann gave us a more valuable description.  He equated entropy to the number of states in which something could be arranged, or its lack of orderliness.  In other words, the more ways you can arrange something, the less ordered it is likely to be and the higher its entropy.  So a box of children’s building blocks has a low entropy when the blocks are packed in their box, because there is a relatively small number of ways of arranging them to fit in the box.  When the box is emptied onto your living room floor, there are very many more possible arrangements and so the blocks have a high entropy.  The chance of knowing the whereabouts of a particular block is small. Whoops!  Now we’ve wandered into information theory.
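As a toy calculation – with hypothetical numbers chosen purely for illustration – we can count the arrangements of packed and scattered blocks and compare their Boltzmann entropies (in units of k):

```python
import math


def boltzmann_entropy(num_arrangements: int) -> float:
    """Entropy, in units of Boltzmann's constant k, as ln W."""
    return math.log(num_arrangements)


# Hypothetical toy numbers: 10 distinct blocks.
# Packed: the blocks fill the 10 slots in their box -> 10! orderings.
packed = math.factorial(10)

# Scattered: the same 10 blocks over 100 possible floor positions
# -> 100!/90! ordered placements, i.e. far more arrangements.
scattered = math.perm(100, 10)

# The scattered blocks have many more arrangements, so higher entropy.
assert boltzmann_entropy(packed) < boltzmann_entropy(scattered)
```

The comparison, not the particular counts, is the point: more possible arrangements means less order and higher entropy.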

Let’s get back to the second law which, using Boltzmann’s description of entropy, we can express as: the level of orderliness should always decrease.  Stephen Hawking described this as the arrow of time.  If someone shows you a video clip in which steam gathers itself together and returns into a cup of coffee, or that box of children’s blocks repacks itself, then we know the video is being run backwards, because these processes involve decreasing entropy and can only happen spontaneously if we reverse the direction of time.  If this is true then why do we exist as highly ordered structures?

Erwin Schrödinger, in his book ‘What is Life?’, says that organisms suck orderliness out of the environment in order to exist, so that the orderliness of the universe – that is, the organism and its environment – decreases.  Humans digest highly-ordered food to sustain life, and food, in the form of plants, is brought into existence by metabolising energy from the sun and releasing entropy in the form of heat.  When we die these processes cease and the orderliness is sucked out of us to sustain insects, maggots and bacteria.

We are organisms, known as Sapiens, that organise ourselves into cultures and societies.  Organisation implies an increase in the level of orderliness, in apparent contradiction of the second law.  So, we would expect to find a corresponding increase in disorder somewhere to counterbalance the order in society.  The more regimented society becomes, the greater the requirement for counterbalancing disorder to occur somewhere in order to satisfy the second law, which might happen unexpectedly and explosively if the level of constraint or regulation is too great.  This is not an argument for anarchy or total deregulation – the financial sector has already demonstrated the risks associated with that path – but for an optimum and sustainable level of orderliness.  This requires subtle judgment, just as in elegant engineering design and in living a healthy life, both physically and psychologically.