As a student, in the run-up to exams, I used to enjoy going out walking in the hills on my own. This approach to exam preparation probably surprised my fellow students, and other walkers that I came across probably thought I was mad because, in an age before mobile phones, they would see me talking to myself: as I walked, I was reciting material that I needed to learn for the next exam. This technique worked for me, but I have hesitated to recommend such behaviour to my students. Now, I’ve discovered that psychologists have found that cognitive performance is improved in young adults while walking at a comfortable, relaxed speed. This is probably connected to the neurogenesis that I wrote about in my post entitled ‘Gone walking’ on April 19th, 2017.
So, as the examination season is underway in many universities, I thought I should pass on my rather eccentric approach to exam revision. No doubt I’ll discover that I wasn’t so eccentric after all, and that many others did the same but none of us dared to share such an unconventional approach to exam preparation.
Background and lock-screen pictures have become a feature of modern life. Your computer and mobile device were probably delivered with some pre-loaded scenes from nature, and some of us personalize our devices by uploading photographs taken on holiday or on a recent excursion into the countryside. Perhaps we do this intuitively, because recent research has shown that immersion in nature, even at the superficial level of viewing a picture, can improve brain function. Brisk walking stimulates the production of new neurons and, when you do it in an environment enriched with natural stimuli, the connectivity between neurons, and the stability of that connectivity, are increased. For those of us whose biological systems are in terminal decline, the opportunity to retard this decline by walking in the wild is too good to miss. I have gone to the English Lake District to produce and connect some more neurons. I’ll be back next week – hopefully feeling creative and empowered, as well as probably rather damp, but what else can be expected from northern England in April!
Those of you who want to immerse yourselves vicariously in the damp natural environment of England could read ‘Rain: Four Walks in English Weather’ by Melissa Harrison.
For hundreds of years, philosophers and scientists have encouraged one another to keep their explanations of the natural world as simple as possible. Ockham’s razor, attributed to the 14th century Franciscan friar William of Ockham, is a well-established and much-cited philosophical principle that, of two possible explanations, the simpler one is more likely to be correct. More recently, Albert Einstein is supposed to have said: ‘everything should be made as simple as possible, but not simpler’. I don’t think that William of Ockham and Albert Einstein were arguing that we should keep everything simple, but rather that we should not make scientific explanations more complicated than necessary. However, do we have a strong preference for focusing on phenomena whose behaviour is simple enough to be explained by relatively simple theories and models? In other words, to quote William Wimsatt, ‘we tend to ignore phenomena whose complexity exceeds the capability of our detection apparatus and explanatory models’. Most of us find science hard; perhaps this is not just about the language used by the cognoscenti to describe it [see my post on ‘Why is thermodynamics so hard?’ on February 11th, 2015], but more about the complexity of the world around us. To think about this level of complexity requires us to assemble and synchronize very large collections of neurons (100 million or more) in our brains, which is the very opposite of the repetitive formation of relatively small assemblies of neurons that Susan Greenfield has argued are associated with activities we find pleasurable [see my post entitled ‘Digital hive mind’ on November 30th, 2016]. This might imply that thinking about complexity is not pleasurable for most of us, or at least requires very significant effort, and that this explains the aesthetic appeal of simplicity. However, as William Wimsatt has pointed out, ‘simplicity is not reflective of a metaphysical principle of nature’ but a constraint applied by us, one which, if we persist in its application, will render the world incomprehensible to us.
William C. Wimsatt, Randomness and perceived randomness in evolutionary biology, Synthese, 43(2):287-329, 1980.
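For anyone who likes to see a principle at work, here is a minimal sketch of the razor in its statistical guise; the synthetic data, the two candidate models and the use of the Akaike Information Criterion (AIC) are all illustrative choices of mine, not anything drawn from Wimsatt’s paper. Genuinely linear data are fitted with a straight line and with a quintic, and the criterion rewards goodness of fit while penalizing extra parameters, so the simpler explanation should win.

```python
# A sketch of Ockham's razor in statistical form: of two models that fit
# the data comparably well, prefer the one with fewer parameters.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, x.size)  # truly linear data plus noise

for degree in (1, 5):  # a simple and a needlessly complicated explanation
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1  # sample size and number of fitted parameters
    # AIC for Gaussian errors: n*ln(RSS/n) + 2k, i.e. fit plus a complexity penalty
    aic = n * np.log(np.sum(residuals**2) / n) + 2 * k
    print(f"degree {degree}: AIC = {aic:.1f}")

# The degree-1 model usually scores lower (better): the quintic is 'explaining'
# the noise, which is exactly the over-complication the razor warns against.
```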
Digital everything is trendy at the moment. I am as guilty as everyone else: my research group is using digital cameras to monitor the displacement and deformation of structural components using a technique called digital image correlation (see my post on ‘256 Shades of grey’ on January 22nd, 2014). Some years ago, in a similar vein, I pioneered a technique known as ‘digital photoelasticity’ (see my post on ‘Cow bladders lead to strain measurement’ on January 7th, 2015). But what do we mean by ‘digital’? Originally it meant related to, resembling or operated by a digit or finger. However, electronic engineers will refer us to A-to-D and D-to-A converters that transform analogue signals into digital signals and vice versa. In this sense, digital means ‘expressed in discrete numerical form’ as opposed to analogue, which describes something that can vary continuously. Digital signals are ubiquitous because computers can handle digital information easily. Computers could be described as very, very large collections of switches that can be either on or off, which allows numbers to be represented in binary. The world’s second largest computer, Tianhe-2, which I visited in Guangzhou a couple of years ago, has about 12.4 petabytes (about 10^16 bytes) of memory, which compares to the 100 billion (10^11) neurons in an average human brain. There are lots of tasks at which the world’s largest computers are excellent, but none of them can drive a car, ride a bicycle, tutor a group of engineering students and write a blog post on the limits of digital technology, all in a few hours. OK, we could connect specialized computers together wirelessly under the command of one supercomputer, but that’s incomparable to the 1.4 kilograms of brain cells in an engineering professor’s skull doing all of this without being reprogrammed or requiring significant cooling.
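Since the whole argument turns on ‘expressed in discrete numerical form’, here is a minimal sketch of the A-to-D and D-to-A conversions mentioned above; the 8-bit resolution and the sine wave standing in for an analogue signal are assumptions of mine for illustration. Each sample is rounded to one of 256 levels, and each level is nothing more than a pattern of eight on-or-off switches.

```python
# A sketch of analogue-to-digital conversion: sample a continuously varying
# signal and quantize each sample to one of 2**BITS discrete levels.
import numpy as np

BITS = 8                 # an illustrative resolution: 2**8 = 256 levels
LEVELS = 2 ** BITS

t = np.linspace(0.0, 1.0, 20)          # the sampling instants (discrete time)
analogue = np.sin(2 * np.pi * 3 * t)   # the 'analogue' signal, in [-1, 1]

# A-to-D: map [-1, 1] onto the integer codes 0..255
codes = np.round((analogue + 1.0) / 2.0 * (LEVELS - 1)).astype(int)

# D-to-A: map the codes back and inspect what the discretization cost us
reconstructed = codes / (LEVELS - 1) * 2.0 - 1.0
print("largest quantization error:", np.max(np.abs(analogue - reconstructed)))
print("first few codes as switch patterns:", [format(c, '08b') for c in codes[:4]])
```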
So, what’s our brain got that the world’s latest computers haven’t? Well, it appears to be analogue and not digital. Our consciousness appears to arise from assemblies of millions of neurons firing in synchrony and, because each neuron can fire at an infinite number of levels, our conscious thoughts can take on a multiplicity of forms that a digital computer can never hope to emulate, because its finite number of switches have only two positions each: on and off.
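To put a rough number on that contrast (a back-of-the-envelope illustration of mine, not a result from neuroscience): a bank of two-state switches can only ever take a finite number of distinct patterns, and even a 64-bit floating-point number, the usual way a computer approximates a continuously varying quantity, has gaps between the values it can represent.

```python
import math

# n two-state switches can take exactly 2**n distinct patterns - always finite
for n in (8, 64):
    print(f"{n} switches -> {2**n} distinct states")

# Even a 64-bit float is digital underneath: there is a gap between 1.0 and the
# next value it can represent, whereas an analogue level could, in principle,
# sit anywhere in between.
print("smallest representable step above 1.0:", math.ulp(1.0))
```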
I suspect that the future is not digital but analogue; we just don’t know how to get there, yet. We need to stop counting with our digits and start thinking with our brains.