Tag Archives: second law

An expanding universe

I attended a workshop last month at which one of the speakers showed us this graphic.  It illustrates that the volume of information available to us has been approximately doubling every two years: in 2005, the digital universe was 130 Exabytes (billions of gigabytes) and by 2020 it is expected to have grown to about 40,000 Exabytes.  The second law of thermodynamics tells us that the entropy, or disorder, of the physical universe is always increasing; so, is this also true for the digital universe?  Claude Shannon proposed that information is negentropy, which implies that a growth in information represents a decrease in entropy, and this seems to contradict the second law [see my post ‘Entropy on the brain’ on November 29th, 2017].  Perhaps the issue is the definition of information – the word comes from the Latin informare, which means to inform or to give someone knowledge.  I suspect that much of what we view on our digital screens does not inform and is data rather than information.  Our digital screens are akin to telescopes used to view the physical universe – they let us see what’s out there, but we have to do some processing of the data in order to convert it into knowledge.  It’s that last bit that can be stressful if we don’t have some control mechanisms available to limit the amount of disorder that we ask our brains to cope with – we are back to Gadget Stress [see my post on April 9th, 2014] and Digital Detox [see my post on August 10th, 2016].
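As a quick sanity check on that growth rate, here is a back-of-the-envelope calculation (a sketch in Python; the 130 and 40,000 Exabyte figures are the ones quoted above):

```python
import math

# Digital universe: 130 Exabytes in 2005, forecast ~40,000 Exabytes by 2020
size_2005_eb = 130
size_2020_eb = 40_000
years = 2020 - 2005

doublings = math.log2(size_2020_eb / size_2005_eb)  # number of doublings over the period
print(f"{doublings:.1f} doublings in {years} years")
print(f"i.e. one doubling roughly every {years / doublings:.1f} years")
# ~8.3 doublings in 15 years, or a doubling roughly every 1.8 years
```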

Source: Atsufumi Hirohata, Department of Electronics, University of York www-users.york.ac.uk/~ah566/lectures/adv01_introduction.pps

Image: http://japan.digitaldj.network.com/articles/9538.html


Entropy on the brain

‘It was the worst of times, it was the worst of times.  Again.  That’s the thing about things.  They fall apart, always have, always will, it’s in their nature.’  Those are the opening lines of Ali Smith’s novel ‘Autumn’.  Ali Smith doesn’t mention entropy but that’s what she is describing.

My first-year lecture course has progressed from the first law of thermodynamics to the second law; and so, I have been stretching the students’ brains by talking about entropy.  It’s a favourite topic of mine but many people find it difficult.  Entropy can be described as the level of disorder present in a system or its environment.  Ludwig Boltzmann derived his famous equation, S = k ln W, which is engraved on his gravestone (he died in 1906).  S is entropy, k is a constant of proportionality named after Boltzmann, and W is the number of ways in which a system can be arranged without changing its energy content (ln means natural logarithm).  So, the more arrangements that are possible, the larger the entropy.
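A minimal numerical illustration of Boltzmann’s equation (a sketch; the values of W are arbitrary examples):

```python
import math

K_BOLTZMANN = 1.380649e-23  # Boltzmann's constant k, in joules per kelvin

def boltzmann_entropy(w: int) -> float:
    """Entropy S = k ln W for a system with W possible arrangements."""
    return K_BOLTZMANN * math.log(w)

# More possible arrangements means larger entropy; W = 1 gives S = 0,
# i.e. a perfectly ordered system with a single possible arrangement.
for w in (1, 10, 10**6, 10**23):
    print(f"W = {w:.0e}:  S = {boltzmann_entropy(w):.3e} J/K")
```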

By now the neurons in your brain should be firing away nicely with a good level of synchronicity (see my posts entitled ‘Digital hive mind’ on November 30th, 2016 and ‘Is the world comprehensible?’ on March 15th, 2017).  In other words, groups of neurons should be showing electrical activity that is in phase with other groups to form large networks.  Some scientists believe that the size of the network is indicative of your level of consciousness.  However, scientists in Toronto, led by Jose Luis Perez-Velazquez, have suggested that it is not the size of the network that is linked to consciousness but the number of ways that a particular degree of connectivity can be achieved.  This begins to sound like the entropy of your neurons.
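To make that idea concrete, here is a toy model of my own (not the researchers’ code): treat each of n possible pairwise connections between groups of neurons as either active or inactive, and count the number of distinct networks with exactly k active connections – the logarithm of that count is a Boltzmann-style entropy.

```python
import math

def ln_configurations(n_pairs: int, k_active: int) -> float:
    """ln of the number of ways to choose which k of n possible
    pairwise connections are active, i.e. ln C(n, k)."""
    return math.log(math.comb(n_pairs, k_active))

n = 100  # possible pairwise connections (toy value)
for k in (0, 5, 50, 95, 100):
    print(f"k = {k:>3} active connections: ln(configurations) = {ln_configurations(n, k):6.2f}")
# The count peaks at intermediate connectivity (k = n/2); the Toronto study
# associated conscious states with configurations near this maximum.
```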

In 1948 Claude Shannon, an American electrical engineer, stated that ‘information must be considered as a negative term in the entropy of the system; in short, information is negentropy’.  We can extend this idea to the concept that the entropy associated with information becomes lower as it is arranged, or ordered, into knowledge frameworks, e.g. laws and principles, that allow us to explain phenomena or behaviour.
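A small sketch of this in Shannon’s terms (the two probability distributions are invented for illustration): the more ordered the distribution of outcomes, the lower its entropy.

```python
import math

def shannon_entropy(probs) -> float:
    """Shannon entropy H = -sum(p * log2 p), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

disordered = [0.25, 0.25, 0.25, 0.25]  # raw data: every outcome equally likely
ordered = [0.97, 0.01, 0.01, 0.01]     # organised by a framework: one outcome dominates

print(f"disordered: H = {shannon_entropy(disordered):.2f} bits")  # 2.00 bits
print(f"ordered:    H = {shannon_entropy(ordered):.2f} bits")     # about 0.24 bits
```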

Perhaps these ideas about the entropy of information and of neurons are connected; because once you have mastered a knowledge framework for a topic, such as the laws of thermodynamics, you only need to deploy a small number of neurons to understand new information associated with that topic.  However, when you are presented with an unfamiliar situation, you need to fire multiple networks of neurons and try out millions of ways of connecting them in order to understand the unfamiliar data being supplied by your senses.

For diverse posts on entropy see: ‘Entropy in poetry’ on June 1st, 2016; ‘Entropy management for bees and flights’ on November 5th, 2014; and ‘More on white dwarfs and existentialism’ on November 16th, 2016.

Sources:

Smith A, Autumn, London: Penguin Books, 2017.

Consciousness is tied to ‘entropy’, say researchers, Physics World, October 16th, 2016.

Handscombe RD & Patterson EA, The Entropy Vector: Connecting Science and Business, Singapore: World Scientific Publishing, 2004.

Inspirational leadership

Leadership is about inspiring people, whereas management is about organising tasks and resources.  In an organisational context, strategic leadership is about persuading people to move voluntarily, and together, in a direction that benefits the organisation, while management is about dealing with the complexity of planning and processes.  The boundary between leadership and management is often blurred; though in my experience, people more frequently believe that they are leading when, in reality, they are managing.  Perhaps this is because they want to make a difference; but, for most of us, leadership is really hard and requires courage.  The courage to be different.  To be selfless.  The courage to do what is right and not just what is easy.

It is easier to get involved in the detail of making things happen, of telling people how to do things; but that’s management, not leadership.  Leadership is about letting go and trusting others to make the right decisions on the details – having the courage to delegate.  There’s something about entropy in there, about not over-constraining the system, or under-constraining it; but now I’ve got to the entropy vector and that’s a whole different story.

Robert D Handscombe & Eann A Patterson, The Entropy Vector: Connecting Science and Business, Singapore: World Scientific Publishing, 2004.

Consensus is just a coffee break

‘Consensus is just a coffee break’, to quote Caputo.  He argued that if consensus were the ultimate aim then eventually we would all stop talking.  The goal of conversation would be silence and, as he wrote, that would be a strange outcome for a species defined by its ability to speak.  It is differences that drive everything: innovation, progress and the processes of life.

In thermodynamics, William Thomson (Lord Kelvin) observed that heat flows into the random motion of molecules and is never recovered, so that eventually a universe of uniform temperature will be created.  When heat flows between matter at different temperatures we can extract work, for instance, using a heat engine.  No work could be extracted from a universe of uniform temperature and so nothing would happen.  Life would cease and there would be cosmic death [see my posts entitled ‘Will it all be over soon’ on November 2nd, 2016 and ‘Cosmic Heat Death’ on February 18th, 2015].
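Kelvin’s argument is captured by the Carnot limit on any heat engine (a sketch with illustrative temperatures): the maximum fraction of heat that can be converted into work is 1 - Tc/Th, which falls to zero when the two temperatures are equal.

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum efficiency of a heat engine working between reservoirs
    at absolute temperatures t_hot and t_cold (in kelvin)."""
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(600.0, 300.0))  # 0.5: a temperature difference lets us extract work
print(carnot_efficiency(300.0, 300.0))  # 0.0: uniform temperature, no work - 'heat death'
```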

In the Hitchhiker’s Guide to the Galaxy, the crew of the Heart of Gold contemplated whether relationships between people are subject to the same laws that govern the relationships between atoms and molecules.  The answer would appear to be affirmative, at least in the sense that dissonance is necessary for action.

So, we should celebrate and respect the differences in our communities. They are essential for a functioning, vibrant and successful society – without them life would not just consist of silent conversations but would cease completely.

Sources:

Caputo JD, Truth: Philosophy in Transit, London: Penguin, 2013.

Adams D, The Hitchhiker’s Guide to the Galaxy, London: Picador, 2002.