
Intelligent aliens?

A couple of weeks ago I wrote about cuttlefish [see ‘Wearing your heart on your sleeve’ on January 16th, 2019], based on a wonderful book that I was given for Christmas, called ‘Other Minds: The Octopus and the Evolution of Intelligent Life’ by Peter Godfrey-Smith. Cuttlefish and octopuses are cephalopods, which Peter Godfrey-Smith describes as ‘an island of mental complexity in the sea of invertebrate animals’. The most recent common ancestor of cephalopods and humans is so distant, and was so simple, that cephalopods represent an independent experiment in the evolution of large brains and complex behaviour. An octopus has about 500 million neurons, which is not as many as humans (we have about 100 billion), but it is still a large number, and connectivity is probably more important than absolute size [see ‘Digital hive mind’ on November 30th, 2016].

Whereas we have a central nervous system, an octopus has a distributed one, with neurons located in its arms, which appears to give each arm a high level of autonomy. In addition to tactile sensory information from its suckers, each arm receives visual information from its skin, which is sensitive to light. The extent to which information and control are shared between the neurons in the brain and the network of neurons in the body is unknown. It is difficult for us to imagine our fingers responding independently to visual as well as tactile stimuli, and even more so to think of them as independent problem-solvers. Peter Godfrey-Smith suggests that cephalopods are the closest we are likely to come to meeting intelligent aliens: their thought processes and capabilities appear so different to ours that our scientific studies and experiments are unlikely to fully reveal their intelligence or level of consciousness. A first step would be to stop eating them!

Peter Godfrey-Smith, Other Minds: The Octopus and the Evolution of Intelligent Life, London: William Collins, 2018.

Entropy on the brain

‘It was the worst of times, it was the worst of times. Again. That’s the thing about things. They fall apart, always have, always will, it’s in their nature.’ Those are the opening lines of Ali Smith’s novel ‘Autumn’. Ali Smith doesn’t mention entropy, but that’s what she is describing.

My first-year lecture course has progressed from the first law of thermodynamics to the second law; and so I have been stretching the students’ brains by talking about entropy. It’s a favourite topic of mine but many people find it difficult. Entropy can be described as the level of disorder present in a system or its environment. Ludwig Boltzmann derived his famous equation, S = k ln W, which can be found on his gravestone – he died in 1906. S is entropy, k is a constant of proportionality named after Boltzmann, and W is the number of ways in which a system can be arranged without changing its energy content (ln means natural logarithm). So, the more arrangements that are possible, the larger the entropy.
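To see how the numbers work, here is a minimal sketch in Python (the function and the example values of W are my own illustration, not part of Boltzmann’s derivation):

```python
import math

# Boltzmann's constant in joules per kelvin
k = 1.380649e-23

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k ln W for a system with W equally likely arrangements."""
    return k * math.log(W)

# Doubling the number of arrangements adds a fixed increment, k*ln(2), to S.
print(boltzmann_entropy(2))  # ~9.57e-24 J/K
print(boltzmann_entropy(4))  # ~1.91e-23 J/K
```

Because S grows with the logarithm of W, every doubling of the number of possible arrangements adds the same fixed increment, k ln 2, to the entropy.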

By now the neurons in your brain should be firing away nicely with a good level of synchronicity [see my posts entitled ‘Digital hive mind’ on November 30th, 2016 and ‘Is the world comprehensible?’ on March 15th, 2017]. In other words, groups of neurons should be showing electrical activity that is in phase with that of other groups, forming large networks. Some scientists believe that the size of such a network is indicative of the level of consciousness. However, scientists in Toronto, led by Jose Luis Perez-Velazquez, have suggested that it is not the size of the network that is linked to consciousness but the number of ways that a particular degree of connectivity can be achieved. This begins to sound like the entropy of your neurons.
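To get a feel for why counting arrangements matters here, consider a toy count in Python (my illustration, not the Toronto group’s model), in which n pairs of neuron groups can each be synchronised or not:

```python
from math import comb, log

# Toy model: n pairs of neuron groups, each pair either synchronised or not.
# The number of distinct configurations with exactly p synchronised pairs
# is the binomial coefficient C(n, p).
n = 20
for p in (0, 5, 10, 15, 20):
    W = comb(n, p)
    print(f"p = {p:2d}: W = {W:6d}, ln W = {log(W):5.2f}")
```

In this toy model, ln W peaks when half of the pairs are synchronised, so the entropy is largest at an intermediate degree of connectivity rather than at full synchrony – which is the flavour of the result reported in the Physics World article listed in the sources below.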

In 1948 Claude Shannon, an American electrical engineer, stated that ‘information must be considered as a negative term in the entropy of the system; in short, information is negentropy’. We can extend this idea: the entropy associated with information becomes lower as it is arranged, or ordered, into knowledge frameworks, such as laws and principles, that allow us to explain phenomena or behaviour.
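Shannon’s measure can be made concrete with a few lines of Python (the probability distributions are invented for illustration):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

# A highly ordered (predictable) source carries little information per symbol...
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # ~0.24 bits
# ...while a maximally disordered (uniform) source carries the most.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```

The ordered source has the lower entropy, just as information ordered into a framework leaves less residual disorder.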

Perhaps these ideas about the entropy of information and of neurons are connected: once you have mastered a knowledge framework for a topic, such as the laws of thermodynamics, you need only deploy a small network of neurons to understand new information associated with that topic. However, when you are presented with unfamiliar situations, you need to fire multiple networks of neurons and try out millions of ways of connecting them in order to make sense of the unfamiliar data being supplied by your senses.

For diverse posts on entropy see: ‘Entropy in poetry’ on June 1st, 2016; ‘Entropy management for bees and flights’ on November 5th, 2014; and ‘More on white dwarfs and existentialism’ on November 16th, 2016.

Sources:

Ali Smith, Autumn, Penguin Books, 2017.

Consciousness is tied to ‘entropy’, say researchers, Physics World, October 16th, 2016.

Handscombe RD & Patterson EA, The Entropy Vector: Connecting Science and Business, Singapore: World Scientific Publishing, 2004.