
When will you be replaced by a computer?

I have written before about extending our minds by using the external computing power in our mobile phones [see ‘Science fiction becomes virtual reality’ on October 12th, 2016; and ‘Thinking out of the skull’ on March 18th, 2015]; but how about replacing our brains with a computer?  That’s the potential of artificial intelligence (AI): not literally replacing our brain, but at least taking over jobs that are traditionally believed to require our brain-power.  For instance, in a recent test, an AI lawyer found 95% of the loopholes in a non-disclosure agreement in 22 seconds, while a group of human lawyers found only 88% in 90 minutes, according to Philip Delves Broughton in the FT last weekend.

If this sounds scary, then consider for a moment the computing power involved.  Lots of researchers are interested in simulating the brain, and it has been estimated that the computing power required is around a hundred petaFLOPS (FLoating point Operations Per Second), which, conveniently, is equivalent to that of the world’s most powerful computers.  At the time of writing, the world’s most powerful computer was ‘Summit’ at the US Oak Ridge National Laboratory, which is capable of 200 petaFLOPS.  However, simulating the brain is not the same as reproducing its intelligence; and petaFLOPS are not a good measure of intelligence because, while ‘Summit’ can multiply many strings of numbers together per second, it would take you and me many minutes to multiply two strings of numbers together, giving us a rating of a hundredth of a FLOPS or less.
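To put that gap in numbers, here is a minimal back-of-envelope sketch; the human figure of one floating-point operation per hundred seconds is an assumption made purely for illustration.

```python
# Back-of-envelope comparison of raw arithmetic rates (illustrative only).
summit_flops = 200e15   # Summit: 200 petaFLOPS
human_flops = 1 / 100   # assume ~100 s for one long multiplication by hand

ratio = summit_flops / human_flops
print(f"Summit out-multiplies a human by a factor of about {ratio:.0e}")
# -> about 2e19, yet no one would claim it is 2e19 times more intelligent
```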

So, raw computing power does not appear to equate to intelligence; instead, intelligence seems to be related to our ability to network our neurons together in massive assemblies that flicker across our brain, interacting with other assemblies [see ‘Digital hive mind’ on November 30th, 2016]. We have about 100 billion neurons, compared with the ‘Summit’ computer’s 9,216 CPUs (Central Processing Units) and 27,648 GPUs (Graphics Processing Units); so it seems unlikely that it will come close to our ability to be creative or to handle unpredictable situations, even accounting for the multiple cores in the CPUs.  In addition, it requires a power input of 13 MW, or a couple of very large wind turbines, compared to 80 W for the basal metabolic rate of a human, of which the brain accounts for about 20%; so its operating costs render it an uneconomic substitute for the human brain in activities that require intelligence.  Hence, while computers and robots are taking over many types of jobs, it seems likely that a core group of jobs involving creativity, unpredictability and emotional intelligence will remain for humans for the foreseeable future.
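The energy comparison is easy to check; this short sketch simply reproduces the arithmetic using the figures quoted above.

```python
# Reproducing the power comparison from the paragraph above.
summit_power = 13e6             # Summit draws about 13 MW
human_bmr = 80.0                # basal metabolic rate of ~80 W
brain_power = 0.20 * human_bmr  # brain accounts for ~20%, i.e. ~16 W

print(f"Human brain budget: {brain_power:.0f} W")
print(f"Summit draws about {summit_power / brain_power:,.0f} brains' worth of power")
# -> roughly 800,000 human brains for one supercomputer
```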

Sources:

Max Tegmark, Life 3.0 – being human in the age of artificial intelligence, Penguin Books, 2018.

Philip Delves Broughton, Doom looms over the valley, FT Weekend, 16 November/17 November 2019.

Arnoud Engelfriet, Creating an Artificial Intelligence for NDA Evaluation, September 22, 2017. Available at SSRN: https://ssrn.com/abstract=3039353 or http://dx.doi.org/10.2139/ssrn.3039353

See also NDA Lynn at https://www.ndalynn.com/

Is there a real ‘you’ or ‘I’?

I have written recently about time and consciousness [see ‘Time at the heart of our problems’ on January 30th, 2019 and ‘Limits of imagination’ on February 13th, 2019].  We perceive some things as almost constant or changeless, such as trees and landscapes; however, that is just a consequence of our perception of time.  Nothing that is in equilibrium, and hence unchanging, can be alive.  The laws of thermodynamics tell us that disequilibrium is fundamental in driving all processes, including life.  Our perception of experience arises from registering changes in the flow of sensory information to our brains, as well as changes in the networks of neurons within them.  Hence, both time and complexity appear to be essential ingredients for consciousness. Even when we sit motionless watching an apparently unchanging scene, our minds are teeming with activity as a consequence of the endless motion of connections and signals in our brains, churning through great jumbles of ideas, memories and thoughts.  Next time you are sitting quietly, try to find ‘you’; not the things that you do or experience but the elusive ‘I’.  We assume that the elusive ‘I’ is there, but most of us find nothing when we look for it.  Julian Baggini has suggested that the “I” is ‘a nothing, contentless centre around which experiences flutter like butterflies’.

Sources:

Baggini J, The pig that wants to be eaten and 99 other thought experiments, London: Granta Publications, 2008.

Czerski H, Storm in a teacup: the physics of everyday life, London: Penguin Random House, 2016.

Godfrey-Smith P, Other minds: the octopus and the evolution of intelligent life, London: William Collins, 2018.

Rovelli C, Seven brief lessons on physics, London: Penguin Books, 2016.

Limits of imagination

What’s it like being a bat?  ‘Seeing’ the world through your ears, or at least through a sophisticated echo-location system. Or, what’s it like being an octopus, with eight semi-autonomous arms, which I wrote about a couple of weeks ago [see ‘Intelligent aliens?’ on January 16th, 2019]? For most of us, it’s unimaginable. Perhaps that is because we are not bats or octopuses, but that seems to be dodging the issue.  Is it a consequence of our education and how we have been taught to think about science?  Most scientists have been taught to express their knowledge from a third person perspective that omits the personal point of view, i.e. our experience of science.  The philosopher Julian Baggini has questioned the reason for this mode of expression: is it that we haven’t devised a framework for understanding the world scientifically that captures both the first and third person points of view; is it that the mind will always elude scientific explanation; or is it that the mind simply isn’t part of the physical world?

Our brains have as many neurons as there are stars in the galaxy, i.e. about a hundred billion, which is sufficient to create processes within us so complex that we are never likely to understand or predict them.  In this context, Carlo Rovelli has suggested that the ideas and images that we have of ourselves are much cruder and sketchier than the detailed complexity of what is happening within us.  So, if we struggle to describe our own consciousness, then perhaps it is not surprising that we cannot express what it is like to be a bat or an octopus.  Instead we resort to third person descriptions and justify them as being in the interests of objectivity.  But does your imagination stretch to how much greater our understanding would be if we did know what it is like to be a bat or an octopus?  And how that might change our attitude to the ecosystem?

BTW:  I would answer yes, yes and maybe to Baggini’s three questions, although I remain open-minded on all of them.

Sources:

Baggini J, The pig that wants to be eaten and 99 other thought experiments, London: Granta Publications, 2008.

Rovelli C, Seven brief lessons on physics, London: Penguin Books, 2016.

Image: https://www.nps.gov/chis/learn/nature/townsends-bats.htm

Entropy on the brain

‘It was the worst of times, it was the worst of times.  Again.  That’s the thing about things.  They fall apart, always have, always will, it’s in their nature.’  Those are the opening lines of Ali Smith’s novel ‘Autumn’.  Ali Smith doesn’t mention entropy, but that’s what she is describing.

My first-year lecture course has progressed from the first law of thermodynamics to the second law; and so, I have been stretching the students’ brains by talking about entropy.  It’s a favourite topic of mine but many people find it difficult.  Entropy can be described as the level of disorder present in a system or its environment.  Ludwig Boltzmann derived his famous equation, S = k ln W, which can be found on his gravestone – he died in 1906.  S is entropy, k is a constant of proportionality named after Boltzmann, and W is the number of ways in which a system can be arranged without changing its energy content (ln means natural logarithm).  So, the more arrangements that are possible, the larger the entropy.
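For those who like to see the numbers, here is a minimal sketch of Boltzmann’s relation; the values of W are toy examples chosen only to show how entropy grows with the number of possible arrangements.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in joules per kelvin

def boltzmann_entropy(W: int) -> float:
    """Entropy S = k ln W, in J/K, for W equally likely arrangements."""
    return k_B * math.log(W)

for W in (1, 2, 1_000_000):
    print(f"W = {W:>9,} -> S = {boltzmann_entropy(W):.3e} J/K")
# W = 1 gives S = 0: a system with only one possible arrangement has no disorder.
```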

By now the neurons in your brain should be firing away nicely with a good level of synchronicity [see my posts entitled ‘Digital hive mind’ on November 30th, 2016 and ‘Is the world comprehensible?’ on March 15th, 2017].  In other words, groups of neurons should be showing electrical activity that is in phase with other groups, so as to form large networks.  Some scientists believe that the size of the network is indicative of the level of your consciousness.  However, scientists in Toronto led by Jose Luis Perez-Velazquez have suggested that it is not the size of the network that is linked to consciousness but the number of ways that a particular degree of connectivity can be achieved.  This begins to sound like the entropy of your neurons.
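Counting the number of ways a given degree of connectivity can be achieved is a combinatorial exercise, and a hedged toy model makes the idea concrete; the number of connection pairs below is invented purely for illustration.

```python
from math import comb, log

n_pairs = 100  # hypothetical number of possible pairwise connections
for n_connected in (0, 10, 50, 90, 100):
    ways = comb(n_pairs, n_connected)  # ways to achieve this degree of connectivity
    print(f"{n_connected:>3} connections -> W = {ways:.3e}, ln W = {log(ways):.1f}")
# W (and hence the entropy-like measure ln W) peaks at intermediate
# connectivity: neither a fully connected nor a fully disconnected network
# maximises the number of accessible configurations.
```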

In the 1950s the physicist Léon Brillouin, building on Claude Shannon’s information theory of 1948, stated that ‘information must be considered as a negative term in the entropy of the system; in short, information is negentropy‘. We can extend this idea to the concept that the entropy associated with information becomes lower as it is arranged, or ordered, into knowledge frameworks, e.g. laws and principles, that allow us to explain phenomena or behaviour.
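Shannon’s measure of information is itself an entropy; a minimal sketch shows how ordering a distribution, i.e. reducing its uncertainty, lowers it.

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum p*log2(p), in bits (zero terms skipped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin, maximum uncertainty
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits: a biased coin, more 'ordered'
print(shannon_entropy([1.0]))       # 0.0 bits: a certain outcome, no entropy
```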

Perhaps these ideas about the entropy of information and of neurons are connected, because when you have mastered a knowledge framework for a topic, such as the laws of thermodynamics, you need to deploy only a small number of neurons to understand new information associated with that topic.  However, when you are presented with unfamiliar situations, you need to fire multiple networks of neurons and try out millions of ways of connecting them in order to make sense of the unfamiliar data being supplied by your senses.

For diverse posts on entropy see: ‘Entropy in poetry‘ on June 1st, 2016; ‘Entropy management for bees and flights‘ on November 5th, 2014; and ‘More on white dwarfs and existentialism‘ on November 16th, 2016.

Sources:

Ali Smith, Autumn, Penguin Books, 2017

Consciousness is tied to ‘entropy’, say researchers, Physics World, October 16th, 2016.

Handscombe RD & Patterson EA, The Entropy Vector: Connecting Science and Business, Singapore: World Scientific Publishing, 2004.