
Meta-knowledge: knowledge about knowledge

As engineers, we like to draw simple diagrams of the systems that we are attempting to analyse because most of us are pictorial problem-solvers, and recording the key elements of a problem in a sketch helps us to identify the important issues and select an appropriate solution procedure [see ‘Meta-representational competence’ on May 13th, 2015].  Of course, these simple representations can be misleading if we omit parameters or features that dominate the behaviour of the system; so there is considerable skill in idealising a system so that the analysis is tractable, i.e. can be solved.  Students find it especially difficult to acquire these skills [see ‘Learning problem-solving skills‘ on October 24th, 2018] and many appear to avoid drawing a meaningful sketch even when examination marks are allocated to it [see ‘Depressed by exams‘ on January 31st, 2018].  In thermodynamics, matters are complicated by the fact that the entropy of the system is reduced when we omit parameters in order to idealise it; with fewer parameters to describe the system there are fewer microstates in which it can exist and hence, according to Boltzmann, its entropy is lower [see ‘Entropy on the brain‘ on November 29th, 2017].  Perhaps this is the inverse of realising that we understand less as we know more; in other words, as our knowledge grows, it reveals that there is more to know and understand than we can ever hope to comprehend [see ‘Expanding universe‘ on February 7th, 2018]. Is that the second law of thermodynamics at work again, creating more disorder to counter the small amount of order achieved in your brain?

Image: Sketch made during an example class

Subtle balance of sustainable orderliness

I wrote this short essay a couple of weeks ago for another purpose and then changed my mind about using it.  So I thought I would share it on this blog.

Whenever we do something, some of our useful resource gets converted into productive activity but some is always lost in useless waste.  In other words, 100% efficiency is impossible – we can’t convert all of our resource into productive activity.  Engineers call this the second law of thermodynamics.  Thermodynamics is about energy transitions, for instance converting chemical energy in fossil fuels into electrical energy in a power station, and in these circumstances, the useless waste is called entropy.  At the time of the industrial revolution, Rudolf Clausius recognised that entropy can be related to the heat losses which occur whenever we do something useful, such as generating electricity in a power station, cleaning the house with an electric vacuum cleaner or running to catch the bus.
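In textbook notation, Clausius’s definition relates the change in entropy to the heat transferred reversibly divided by the absolute temperature at which the transfer happens; a minimal statement of it looks like this:

```latex
% Clausius: entropy change for a reversible transfer of heat Q_rev
% at absolute temperature T
\Delta S = \frac{Q_{\mathrm{rev}}}{T}
```

So the same quantity of waste heat represents a bigger rise in entropy when it is rejected at a low temperature than at a high one, which is one reason the temperatures at which heat enters and leaves a power station matter so much to its designers.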

Clausius’s definition of entropy was really useful for designers of 19th century steam engines but it is difficult to use in other walks of life.  Fortunately, Ludwig Boltzmann gave us a more valuable description.  He related entropy to the number of states in which something could be arranged, or its lack of orderliness.  In other words, the more ways you can arrange something, the less ordered it is likely to be and the higher its entropy.  So a box of children’s building blocks has a low entropy when the blocks are packed in their box because there is a relatively small number of ways of arranging them to fit in the box.  When the box is emptied onto your living room floor, there are very many more possible arrangements and so the blocks have a high entropy.  The chance of knowing the whereabouts of a particular block is small. Whoops!  Now we’ve wandered into information theory.
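Boltzmann’s relation makes this counting explicit: the entropy S equals his constant k multiplied by the natural logarithm of the number of possible arrangements W.  Here is a minimal sketch, in Python and using made-up counts of arrangements for the packed and scattered blocks, of how the entropy rises as the number of arrangements rises:

```python
import math

# Boltzmann's relation: S = k * ln(W), where W is the number of ways
# (microstates) in which the system can be arranged.
K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant in joules per kelvin

def boltzmann_entropy(number_of_arrangements: float) -> float:
    """Entropy, in joules per kelvin, for a given count of arrangements."""
    return K_BOLTZMANN * math.log(number_of_arrangements)

# Illustrative, made-up counts: a handful of ways to pack the blocks in
# their box versus vastly more ways to scatter them across the floor.
packed_arrangements = 1.0e3
scattered_arrangements = 1.0e12

print(boltzmann_entropy(packed_arrangements))     # lower entropy: ordered
print(boltzmann_entropy(scattered_arrangements))  # higher entropy: disordered
```

Because of the logarithm the numbers stay tiny, but the direction never changes: the more ways there are of arranging something, the higher its entropy.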

Let’s get back to the second law which, using Boltzmann’s description of entropy, we can express as: the level of orderliness should always decrease.  Stephen Hawking describes this as the arrow of time.  If someone shows you a video clip in which steam gathers itself together and returns into a cup of coffee, or that box of children’s blocks repacks itself, then we know the video is being run backwards, because these processes involve decreasing entropy and can only happen spontaneously if we reverse the direction of time.  If this is true, then why do we exist as highly ordered structures?

Erwin Schrödinger, in his book ‘What is Life?’, says that organisms suck orderliness out of the environment in order to exist, so that the orderliness of the universe, that is the organism together with its environment, decreases.  Humans digest highly-ordered food to sustain life, and food, in the form of plants, is brought into existence by metabolising energy from the sun and releasing entropy in the form of heat.  When we die these processes cease and the orderliness is sucked out of us to sustain insects, maggots and bacteria.

We are organisms, known as Sapiens, that organise ourselves into cultures and societies.  Organisation implies an increase in the level of orderliness, in apparent contradiction of the second law.  So, we would expect to find a corresponding increase in disorder somewhere to counterbalance the order in society.  The more regimented society becomes, the greater the requirement for counterbalancing disorder to occur somewhere in order to satisfy the second law; this might happen unexpectedly and explosively if the level of constraint or regulation is too great.  This is not an argument for anarchy or total deregulation (the financial sector has already demonstrated the risks associated with that path) but for an optimum and sustainable level of orderliness.  This requires subtle judgment, just like elegant engineering design and living a healthy life, both physically and psychologically.