Tag Archives: heat transfer

Everything is flux but it’s not always been recognised

I am teaching thermodynamics to first year undergraduate students at the moment and in most previous years this experience has stimulated me to blog about thermodynamics [for example: ‘Isolated systems in nature?’ on February 12th, 2020].  However, this year I am more than half-way through the module and this is the first post on the topic.  Perhaps that is an impact of teaching on-line via live broadcasts rather than the performance involved in lecturing to hundreds of students in a lecture theatre.  Last week I introduced the second law of thermodynamics and explained its origins in the efforts of 19th century engineers and physicists, including Rudolf Clausius (1822 – 1888), William Thomson (1827 – 1907) and Ludwig Boltzmann (1844 – 1906), to improve the efficiency of steam engines.  The second law of thermodynamics states that the entropy of the universe increases during all real processes, where entropy can be described as the degree of disorder.

The traditional narrative is that thermodynamics was developed by the Victorians; however, I think that the ancient Greeks had a pretty good understanding of it without calling it thermodynamics.  Heraclitus (c. 535 BCE – c. 475 BCE) understood that everything is in flux and nothing is at rest, so that the world is one colossal process.  This concept comes close to the modern interpretation of the second law of thermodynamics, in which the entropy of the universe is constantly increasing, leading to continuous change.  Heraclitus just did not state the direction of flux.  Unfortunately, Plato (c. 429 BCE – c. 347 BCE) did not agree with Heraclitus, but thought that some divine intervention had imposed order on pre-existing chaos to create an ordered universe, which precludes a constant flux and probably set back Western thought for a couple of millennia.
However, it seems likely that in the 17th century, Newton (1643 – 1727) and Leibniz (1646 – 1716), when they independently invented calculus, had more than an inkling about everything being in flux.  In the 18th century, the pioneering geologist James Hutton (1726 – 1797), while examining the tilted layers of the cliff at Siccar Point in Berwickshire, realised that the Earth was not simply created but instead is in a state of constant flux.  His ideas were spurned at the time and he was accused of atheism.  Boltzmann also had to vigorously defend his ideas, to such an extent that his mental health deteriorated and he committed suicide while on vacation with his wife and daughter.  Today, it is widely accepted that the second law of thermodynamics governs all natural and synthetic processes, and many people have heard of entropy [see ‘Entropy on the brain’ on November 29th, 2017] but far fewer understand it [see ‘Two cultures’ on March 5th, 2013].  It is perhaps still controversial to talk about the theoretical long-term consequence of the second law, which is cosmic heat death corresponding to an equilibrium state of maximum entropy and uniform temperature across the universe, such that nothing happens and life cannot exist [see ‘Will it all be over soon?’ on November 2nd, 2016].  This concept caused problems for 19th century thinkers, particularly James Clerk Maxwell (1831 – 1879), and even perhaps for Plato, who theorised two worlds in his theory of forms, one unchanging and the other in constant change, maybe in an effort to dodge the potential implications of degeneration of the universe into chaos.

Image: decaying ruins of Fountains Abbey beside the River Skell.  Heraclitus is reported to have said ‘no man ever steps twice into the same river; for it’s not the same river and he’s not the same man’.

Thermodynamics labs as homework

Many of my academic colleagues are thinking about modifying their undergraduate teaching for next academic year so that it is more resilient to coronavirus.  Laboratory classes present particular challenges when access and density of occupation are restricted.  However, if the purpose of laboratory classes is to allow students to experience phenomena, to enhance understanding, to develop intuition and to acquire skills in using equipment, making measurements and analysing data, then I believe this can be achieved using practical exercises for homework.  I created practical exercises that can be performed in a kitchen at home as part of a Massive Open Online Course (MOOC) about thermodynamics [See ‘Engaging learners on-line‘ on May 25th, 2016].  I have used the same exercises as part of my first year undergraduate module on thermodynamics for the past four years with similar levels of participation to those experienced by my colleagues who run traditional laboratory classes [see ‘Laboratory classes thirty years on‘ on May 15th, 2019].  I have had a number of enquiries from colleagues in other universities about these practical exercises and so I have decided to make the instruction sheets available to all.  Please feel free to use them to support your teaching.

The versions below are from the MOOC entitled ‘Energy: Thermodynamics in Everyday Life‘ and provide information about where to obtain the small amount of equipment needed, and hence are self-contained.  Although the equipment only costs about £20, at the University of Liverpool we lend our students a small bag of equipment containing a measuring beaker, a digital thermometer, a plug-in power meter and a plumber’s manometer.  I also use a slightly different version of these instruction sheets that provides information about the ‘lab’ reports that students must submit as part of their coursework.

I reported on the initial introduction of blended learning and these practical exercises in Patterson EA, 2019, Using everyday examples to engage learners on a massive open online course, IJ Mechanical Engineering Education, 0306419018818551.

Instruction sheets for thermodynamics practical exercises as homework:

Energy balance using the first law of thermodynamics | Efficiency of a kettle

Ideal gas behaviour | Estimating the value of absolute zero

Overall heat transfer coefficient | Heat losses from a coffee cup & glass
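As a minimal sketch of the data analysis behind the second exercise, the value of absolute zero can be estimated by measuring the pressure of a fixed volume of gas at several temperatures and extrapolating the straight line to zero pressure.  The readings below are invented for illustration, not taken from the actual instruction sheet:

```python
# Illustrative data analysis for the 'Estimating the value of absolute zero'
# exercise: pressure readings of a fixed mass of gas at constant volume are
# extrapolated linearly to zero pressure.  The readings are hypothetical.
temps_c = [0.0, 20.0, 40.0, 60.0, 80.0]          # temperature in degrees Celsius
pressures = [101.3, 108.7, 116.1, 123.6, 131.0]  # pressure in kPa (made-up values)

# Least-squares fit of pressure against temperature: P = slope * T + intercept
n = len(temps_c)
mean_t = sum(temps_c) / n
mean_p = sum(pressures) / n
slope = sum((t - mean_t) * (p - mean_p) for t, p in zip(temps_c, pressures)) \
        / sum((t - mean_t) ** 2 for t in temps_c)
intercept = mean_p - slope * mean_t

# Absolute zero is the temperature at which the extrapolated pressure vanishes
absolute_zero = -intercept / slope
print(f"Estimated absolute zero: {absolute_zero:.1f} °C")
```

With these illustrative readings the extrapolation lands within a degree or so of the accepted value of −273.15 °C, which is typical of what students achieve with kitchen equipment.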

Isolated systems in nature?

Is a coconut an isolated thermodynamic system?  This is a question that I have been thinking about this week.  A coconut appears to be impermeable to matter since its milk does not leak out and it might be insulated against heat transfer because its husk is used for insulation in some building products.  If you are wondering why I am pondering such matters, then it is because, once again, I am teaching thermodynamics to our first year students (see ‘Pluralistic Ignorance‘ on May 1st, 2019).  It is a class of more than 200 students and I am using a blended learning environment (post on 14th November 2018) that combines lectures with the units of the massive open online course (MOOC) that I developed some years ago (see ‘Engaging learners on-line‘ on May 25th, 2016).  However, before devotees of MOOCs get excited, I should add that the online course is neither massive nor open because we have restricted it to our university students.  In my first lecture, I talked about the concept of defining the system of interest for thermodynamic analysis by drawing boundaries (see ‘Drawing boundaries‘ on December 19th, 2012).  The choice of the system boundary has a strong influence on the answers we will obtain and the simplicity of the analysis we will need to perform.  For instance, drawing the system boundary around an electric car makes it appear carbon neutral and very efficient but including the fossil fuel power station that provides the electricity reveals substantial carbon emissions and significant reductions in efficiency.  I also talked about different types of system, for example: open systems across whose boundaries both matter and energy can move; closed systems that do not allow matter to flow across their boundaries but allow energy transfers; and, isolated systems that do not permit energy or matter to transfer across their boundaries.  
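The electric car example can be made concrete with some arithmetic.  As a hedged sketch, using assumed round-number efficiencies (not measured data), the apparent efficiency collapses once the boundary is extended to include the power station:

```python
# Hypothetical illustration of how the choice of system boundary changes the
# apparent efficiency of an electric car.  All efficiencies are assumed,
# illustrative values, not measurements.
eta_motor = 0.90    # battery-to-wheels efficiency (boundary around the car)
eta_grid = 0.95     # transmission and distribution efficiency
eta_station = 0.40  # fossil-fuel power station thermal efficiency

# Boundary drawn around the car alone
print(f"Car only: {eta_motor:.0%}")

# Boundary extended to include the grid and the power station:
# efficiencies of processes in series multiply
overall = eta_motor * eta_grid * eta_station
print(f"Including power station: {overall:.0%}")
```

The point is not the particular numbers but that the efficiencies of processes in series multiply, so every stage added inside the boundary reduces the overall figure.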
It is difficult to identify closed systems in nature (see ‘Revisiting closed systems in nature‘ on October 5th, 2016); and so, once again, I asked the students to suggest candidates, but then I started to think about examples of isolated systems.  I suspect that completely isolated systems do not exist; however, some systems can be approximated to the concept, and considering them to be so simplifies their analysis.  However, I am happy to be corrected if anyone can think of one!

Image: https://www.flickr.com/photos/yimhafiz/4031507140 CC BY 2.0

Entropy on the brain

‘It was the worst of times, it was the worst of times.  Again.  That’s the thing about things.  They fall apart, always have, always will, it’s in their nature.’  These are the opening lines of Ali Smith’s novel ‘Autumn’.  Ali Smith doesn’t mention entropy but that’s what she is describing.

My first-year lecture course has progressed from the first law of thermodynamics to the second law; and so, I have been stretching the students’ brains by talking about entropy.  It’s a favourite topic of mine but many people find it difficult.  Entropy can be described as the level of disorder present in a system or the environment.  Ludwig Boltzmann derived his famous equation, S = k ln W, which can be found on his gravestone – he died in 1906.  S is entropy, k is a constant of proportionality named after Boltzmann, and W is the number of ways in which a system can be arranged without changing its energy content (ln means natural logarithm).  So, the more arrangements that are possible, the larger the entropy.
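Boltzmann’s equation is simple enough to evaluate directly.  As a small sketch (the values of W are arbitrary, chosen only to show the behaviour), note that doubling the number of arrangements always adds the same fixed increment of entropy, k ln 2:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k ln W for W equally likely arrangements (microstates)."""
    return k_B * math.log(W)

# Doubling the number of arrangements adds a fixed increment k ln 2,
# however large W already is
s1 = boltzmann_entropy(10**6)
s2 = boltzmann_entropy(2 * 10**6)
print(s2 - s1)  # equals k_B * ln(2), about 9.57e-24 J/K
```

This logarithmic behaviour is why entropies of independent systems add while their numbers of arrangements multiply.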

By now the neurons in your brain should be firing away nicely with a good level of synchronicity (see my posts entitled ‘Digital hive mind‘ on November 30th, 2016 and ‘Is the world comprehensible?‘ on March 15th, 2017).  In other words, groups of neurons should be showing electrical activity that is in phase with other groups to form large networks.  Some scientists believe that the size of the network is indicative of the level of consciousness.  However, scientists in Toronto led by Jose Luis Perez-Velazquez have suggested that it is not the size of the network that is linked to consciousness but the number of ways that a particular degree of connectivity can be achieved.  This begins to sound like the entropy of your neurons.

In 1948 Claude Shannon, an American electrical engineer, stated that ‘information must be considered as a negative term in the entropy of the system; in short, information is negentropy‘. We can extend this idea to the concept that the entropy associated with information becomes lower as it is arranged, or ordered, into knowledge frameworks, e.g. laws and principles, that allow us to explain phenomena or behaviour.
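Shannon’s measure of information entropy can be illustrated in a few lines.  As a hedged sketch (the strings are arbitrary examples), a perfectly ordered message carries zero entropy, while one in which every symbol is equally likely carries the maximum:

```python
import math
from collections import Counter

def shannon_entropy(text: str) -> float:
    """Shannon entropy in bits per character: H = -sum(p_i * log2(p_i))."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A completely ordered string has zero entropy; a string of eight
# equally likely symbols has the maximum of 3 bits per character
print(shannon_entropy("aaaaaaaa"))  # prints 0.0 (wholly predictable)
print(shannon_entropy("abcdefgh"))  # prints 3.0 (maximally disordered)
```

In this sense, arranging information into an ordered framework, as the text above describes, corresponds to lowering its entropy.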

Perhaps these ideas about entropy of information and neurons are connected; because when you have mastered a knowledge framework for a topic, such as the laws of thermodynamics, you need to deploy a small number of neurons to understand new information associated with that topic.  However, when you are presented with unfamiliar situations then you need to fire multiple networks of neurons and try out millions of ways of connecting them, in order to understand the unfamiliar data being supplied by your senses.

For diverse posts on entropy see: ‘Entropy in poetry‘ on June 1st, 2016; ‘Entropy management for bees and flights‘ on November 5th, 2014; and ‘More on white dwarfs and existentialism‘ on November 16th, 2016.

Sources:

Ali Smith, Autumn, Penguin Books, 2017

Consciousness is tied to ‘entropy’, say researchers, Physics World, October 16th, 2016.

Handscombe RD & Patterson EA, The Entropy Vector: Connecting Science and Business, Singapore: World Scientific Publishing, 2004.