A few years ago a message from God was found in a tomato in Yorkshire. The Arabic letters were clearly visible, for those who could see them, spelt out in two halves of mesocarp, endocarp and seeds cradled within mandalas of indigestible skin. At least two explanations come to mind. One is that the Supreme Being sees fit to make Himself visible in produce no less than in whirlwind and quasar. Another is that those who saw the message experienced apophenia — the tendency to see meaningful patterns and connections where they are not in fact present.
Whatever the truth of that tomato, it is certainly the case that human beings regularly see things which are not there. All of us have seen faces in what are actually inanimate objects, a phenomenon known as pareidolia. Evolutionary psychologists argue that there is a good adaptive reason for this. If an ambiguous shape in long grass turns out to be a rock rather than the face of a lion, the cost of having wrongly identified it as a dangerous animal is likely to be trivial compared to the cost of making the opposite mistake. Furthermore, as hyper-social beings we dedicate copious attention to scrutinising and interpreting each other’s facial expressions and the changes, sometimes extremely subtle, in them. Neuroscientists have found that a substantial part of the visual cortex, the fusiform face area, is largely dedicated to these complex and demanding tasks.
What then to make of a creature like Phidippus mystaceus? It certainly has a face, complete with snow-white whiskers around its mouth and pointy black tufts on top. But the two pairs of anterior (front) eyes, known as the anterior median and the anterior lateral, both claim our attention, and our gaze will tend to flicker between one pair and the other as points on which to anchor a sense of its face. There’s something here like the duck-rabbit illusion which never resolves one way or the other: an arachnid trompe l’œil. (In addition to their four anterior eyes, four posterior ones, one pair of them tiny and one rather larger, are placed further back on mystaceus’s cephalothorax, like the bubbles that housed the turret for the mid upper gunner on a Lancaster bomber.)
P. mystaceus, which lives in North America, is a jumping spider. It is one of about 5,000 species in a highly successful family of arachnids (eight-legged, air-breathing, venom-fanged arthropods) that thrive almost everywhere except Greenland and Antarctica. Britain alone has 36 different kinds. Jumping spiders, which are smaller than your little fingernail, have remarkable eyesight, a very particular hunting style, and an appetite for bees, bugs and — quite often — other spiders (the only known vegetarian exception is the delightfully named Bagheera kiplingi from Central America). Some species have better visual acuity than cats, which are more than 100 times their size, and though each of their pairs of anterior eyes has a limited field of view, the full complement of eight allows them to scan large sections of the world around them. (Like most spiders, they also have acute hearing, mediated by tiny hairs on their legs which are sensitive to the smallest vibrations.) They are much more powerful jumpers than cats, able to pounce up to 50 times their body length and land with precision. And they have a safety rope: a silk thread tethered to the launch point in case they misjudge their leap and fall short. A jumping spider is a voracious panopticon, bungee-jumper and traceur in one.
Nor are P. mystaceus and other jumping spiders cowering timorous beasties when it comes to love. The males of many species sport outrageous colours for courtship. The male Phidippus audax, a close cousin of mystaceus, has mouthparts as splendidly hued as the feathers of a bird of paradise: a receptive female will allow him to wrestle her mouthparts with his. Hentzia palmarum, another cousin, makes do with carrot-coloured facial hair around his four anterior eyes, with an Arctic-fox-coloured band like a muff beneath. Each species of jumping spider taps out its own distinctive dances of intimidation and seduction — three- or even seven-act shows that combine features of semaphore, flamenco and South African gumboot dancing.
Still, the beauty of some jumping spiders is more apparent in their brains than their bodies. Just as we create patterns of the world, searching it for faces and symbols, they are mapping out their own lives in surprising detail. The drabbest genus contains some of the cleverest species known. Among them is Portia labiata, a jumping spider of South and East Asia that lives solely on the flesh of other spiders. P. labiata varies and adapts its behaviour according to the characteristics of the species it is hunting: using trial and error it observes and then mimics rhythms tapped out by species it has not encountered before in order to deceive them, and plots devious lines of attack if a full frontal assault looks too risky. The spider may spend an hour or more scanning the tangles of vegetation and gaps between itself and its intended victim, calculating the best route for a surprise attack. Scientists believe the reason labiata takes so long to do this is that, for all its excellent vision, it has very limited ability to take in and process information. So it systematically scans small sections of the surroundings with its anterior eyes, gradually accumulating enough information in its memory to assemble a mental map which it can then use. It’s a little like trying to download a large and fine-grained picture over a very slow internet connection. Once the map is complete, however, Portia will usually execute without fail, rapidly retracing its course if it finds it has started down a blind alley, choosing the correct option and finally swooping on its prey like a special forces ninja.
The differences between jumping spiders and people (or most of the people I know, at any rate) are obvious enough. Not least, we have much more ‘bandwidth’ and processing power: about 86 billion brain cells compared to their mere 600,000. And, of course, we multiply our capacities through co-operation, creating webs of support and information between us that are vastly more powerful and intricate than anything that one of us can manage alone. But for all our differences we exist in continuity with them. Like them we live in a narrow perceptual zone with respect to the world as it actually is. ‘A human being is capable of taking in very few things at one time’, observes Kris Kelvin in Stanisław Lem’s novel Solaris; ‘we see only what is happening in front of us, only here and now’. Just as jumping spiders overcome some of their limitations through a laboriously constructed mental map of their surroundings, we too apprehend the world by unconscious integration within the brain of fragments of perception, memory and supposition. It’s a conjuring trick that gives us a rough model of what is actually going on but which we believe to be the real thing.
Memory is one of our most treasured capabilities. We build our identities and our cultures with it. Human memory and the things we do with it can be extraordinary, especially when we have not had too much to drink. Yet if we have a broad view of memory as the ability to retain information for later use, it is not exclusive to human beings, but foundational to life itself.
The first living systems, perhaps those hypothesised for an RNA world, would have been distinguished by (among other things) precisely this: an ability to record in their chemical codes, and reproduce later, those properties which enabled them to thrive. And all organisms alive today retain subsystems that were first encoded during the early days of DNA-based life roughly four billion years ago. Every moment your cells are replaying routines that ran in the Archaean aeon. Most of the ‘memory’ in the world continues to be entirely unconscious and does not even require a brain. The immune system is a good example: it ‘remembers’ the viruses, bacteria and other nasties that you’ve encountered during your lifetime: if you encounter the same pathogen again, the ‘memory’ cells will recognise it and your body will be able to mount a faster immune response. Plants do this as well as humans and other animals.
Human memory can be rich, varied and subtle in ways that, as far as we can tell, no other earthly beings experience. Our abhorrence at memory’s fraying and dissolution is, perhaps, second only to our abhorrence of death itself. But it is also possible to remember too much. In a story by Jorge Luis Borges, a young farmhand named Ireneo Funes falls from a horse and is severely concussed. When he comes to, his powers of perception and his memory are ‘perfect’. By comparison all of his previous life seems like a dream in which he had looked without seeing, heard without listening and forgotten virtually everything. In his new life, Funes can recall ‘the forms of the clouds in the southern sky on the morning of April 30, 1882, and … compare them in his memory with the veins in the marbled binding of a book he had only seen once, or with the feathers of spray lifted by an oar on the Rio Negro on the eve of the Battle of Quebracho’. But so intense is the rush of impressions and memories that Funes is unable to cope, and he never stirs from his bed, ‘his eyes fixed on the fig tree behind the house or on a spiderweb.’ He becomes incapable of generalisations and abstract ideas, which require little acts of forgetting to become possible. He becomes almost incapable of making sense of the world, of thinking. Like the Portia labiata patiently building its picture of the world, we have limited processing powers and must turn away most information if we are not to stall the machinery entirely.
To function effectively, then, we have to forget most things. This fact has long been recognised by psychologists and philosophers. William James, writing in 1890, quoted from his colleague Théodule-Armand Ribot: ‘Without totally forgetting a prodigious number of states of consciousness, and momentarily forgetting a large number, we could not remember at all. Oblivion … is thus no malady of memory, but a condition of its health and life’. Friedrich Nietzsche, in 1886, was more terse: ‘Blessed are the forgetful: for they also get over their stupidities’.
Perhaps sanity depends on steering a course between remembering too much and remembering too little. But even this middle way is vulnerable to delusions. Neuroscience has recently confirmed what David Hume recognised nearly three hundred years ago — that remembering is an act of re-creation and therefore subject to distortion and fictionalisation: ‘real’ memories become tales, and tales become ‘memories’.
And there is a tension, if not a paradox, at the heart of at least some of the conscious experiences that we value most. On the one hand, we want to be completely present in the moment. As the young Ludwig Wittgenstein put it: only a man who lives not in time but in the present is happy. On the other hand, we want to build and retain the fullest possible picture of the world around us and this must, if it is to be durable, include a coherent map of its deep past and foundations.
Sometimes it seems to me that some of the most important moments of our existence are spent in attempts to bridge the gap between the two states of (on the one hand) trying to live utterly in the moment, and (on the other hand) trying to live in memory and reflection. We want, somehow, to experience both at once, and we look from the one to the other and from the other back to the one, rather as our gaze flickers between the pairs of eyes on the front of a jumping spider. The ‘face’ is blank: it does not tell us where to look and, like the cat in Kafka’s A Little Fable, the spider would eat us up if it could.
From The Book of Barely Imagined Beings: A 21st Century Bestiary, to be published 4 October by Granta Books. www.barelyimaginedbeings.com
Correction, Oct 1, 2012: the original version of this article stated that the vegetarian spider, Bagheera kiplingi, was found in India, not Central America. Thanks to Matt Lewis for pointing out the error.