
Feet on Holiday I 1979 Henry Moore OM, CH 1898-1986 Presented by the Henry Moore Foundation 1982 http://www.tate.org.uk/art/work/P02699
Digital everything is trendy at the moment. I am as guilty as everyone else: my research group is using digital cameras to monitor the displacement and deformation of structural components using a technique called digital image correlation (see my post on ‘256 Shades of grey’ on January 22nd, 2014). Some years ago, in a similar vein, I pioneered a technique known as ‘digital photoelasticity’ (see my post on ‘Cow bladders lead to strain measurement‘ on January 7th, 2015). But what do we mean by ‘digital’? Originally it meant related to, resembling or operated by a digit or finger. However, electronic engineers will refer us to A-to-D and D-to-A converters, which transform analogue signals into digital signals and vice versa. In this sense, digital means ‘expressed in discrete numerical form’, as opposed to analogue, which means something that can vary continuously.

Digital signals are ubiquitous because computers can handle digital information easily. A computer could be described as a very, very large series of switches that can each be either on or off, which allows numbers to be represented in binary. The world’s second largest computer, Tianhe-2, which I visited in Guangzhou a couple of years ago, has about 12.4 petabytes (about 10^16 bytes) of memory, which compares to the 100 billion (10^11) neurons in an average human brain. There are lots of tasks at which the world’s largest computers are excellent, but none of them can drive a car, ride a bicycle, tutor a group of engineering students and write a blog post on the limits of digital technology, all in a few hours. OK, we could connect specialised computers together wirelessly under the command of one supercomputer, but that would be incomparable to the 1.4 kilograms of brain cells in an engineering professor’s skull doing all of this without being reprogrammed or requiring significant cooling.
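To make the A-to-D idea concrete, here is a minimal sketch (my own illustration, not anything from my research group’s software) of what a converter does: it takes a smoothly varying signal and forces it onto a finite set of discrete levels, here using a hypothetical `quantize` function with a 3-bit resolution chosen purely for illustration.

```python
import math

def quantize(signal, bits):
    """Map values in [-1.0, 1.0] onto 2**bits evenly spaced levels,
    mimicking what an A-to-D converter does to an analogue signal."""
    levels = 2 ** bits
    step = 2.0 / (levels - 1)  # spacing between adjacent levels
    return [round((s + 1.0) / step) * step - 1.0 for s in signal]

# A smoothly varying 'analogue' signal: one cycle of a sine wave.
analogue = [math.sin(2 * math.pi * t / 100) for t in range(100)]

# After 3-bit conversion it can take at most 8 distinct values.
digital = quantize(analogue, bits=3)
print(len(set(digital)))
```

Whatever the bit depth, the converted signal is always a finite list of numbers; the continuous variation of the original is gone, which is exactly the distinction between analogue and digital drawn above.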
So, what’s our brain got that the world’s latest computer hasn’t? Well, it appears to be analogue and not digital. Our consciousness appears to arise from assemblies of millions of neurons firing in synchrony, and because each neuron can fire at an infinite number of levels, our conscious thoughts can take on a multiplicity of forms that a digital computer can never hope to emulate, because its finite number of switches have only two positions each: on and off.
I suspect that the future is not digital but analogue; we just don’t know how to get there, yet. We need to stop counting with our digits and start thinking with our brains.
My take-away message: “We need to stop counting with our digits and start thinking with our brains”. Thank you for sharing your thoughts and highlighting the difference between “analogue” and digital or “artificial intelligence”.
I agree – an infinite number of digital samples would be required to equate to its analogue equivalent, which is theoretically (practically) impossible, making an analogue system far superior and workable. Unfortunately, we are being driven down the route of all things digital, but I think there is a catastrophe looming which will bring about a change of thought!