
Too much like hard work: Workers tending the giant boring machine used in the construction of the Nice tramway system. Photo by Valerie Hache/Getty


To automate is human

It’s not tools, culture or communication that make humans unique but our knack for offloading dirty work onto machines

by Antone Martinho-Truswell


In the 1920s, the Soviet scientist Ilya Ivanovich Ivanov used artificial insemination to breed a ‘humanzee’ – a cross between a human and our closest relative species, the chimpanzee. The attempt horrified his contemporaries, much as it would modern readers. Given the moral quandaries a humanzee might create, we can be thankful that Ivanov failed: when the winds of Soviet scientific preferences changed, he was arrested and exiled. But Ivanov’s endeavour points to the persistent, post-Darwinian fear and fascination with the question of whether humans are a creature apart, above all other life, or whether we’re just one more animal in a mad scientist’s menagerie.

Humans have tried, and repeatedly failed, to rescue ourselves from this disquieting commonality. Numerous dividers between humans and beasts have been proposed: thought and language, tools and rules, culture, imitation, empathy, morality, hate, even a grasp of ‘folk’ physics. But they’ve all failed, in one way or another. I’d like to put forward a new contender – strangely, the very same tendency that elicits the most dread and excitement among political and economic commentators today.

First, though, to our fall from grace. We lost our exclusive position in the animal kingdom, not because we overestimated ourselves, but because we underestimated our cousins. This new grasp of the capabilities of our fellow creatures is as much a return to a pre-Industrial view as it is a scientific discovery. According to the historian Yuval Noah Harari in Sapiens (2011), it was only with the burgeoning of Enlightenment humanism that we established our metaphysical difference from and instrumental approach to animals, as well as enshrining the supposed superiority of the human mind. ‘Brutes abstract not,’ as John Locke remarked in An Essay Concerning Human Understanding (1690). By contrast, religious perspectives in the Middle Ages rendered us a sort of ensouled animal. We were touched by the divine, bearers of the breath of life – but distinctly Earthly, made from dust, metaphysically ‘animals plus’.

Like a snake eating its own tail, it was the later move towards rationalism – built on a belief in man’s transcendence – that eventually toppled our hubristic sensibilities. With the advent of Charles Darwin’s theories, later confirmed through geology, palaeontology and genetics, humans struggled mightily and vainly to erect a scientific blockade between beasts and ourselves. We believed we occupied a glorious perch as a thinking thing. But over time that rarefied category became more and more crowded. Whichever intellectual shibboleth we decide is the ability that sets us apart, it’s inevitably found to be shared with the chimp. One can resent this for the same reason we might baulk at Ivanov’s experiments: they bring the nature of the beast a bit too close.

The chimp is the opener in a relay race that repeats itself time and again in the study of animal behaviour. Scientists concoct a new, intelligent task for the chimps, and they do it – before passing down the baton to other primates, who usually also manage it. Then they hand it on to parrots and crows, rats and pigeons, an octopus or two, even ducklings and bees. Over and over again, the newly minted, human-defining behaviour crops up in the same club of reasonably smart, lab-ready species. We become a bit less unique and a bit more animal with each finding.

Some of these proposed watersheds, such as tool-use, are old suggestions, stretching back to how the Victorians grappled with the consequences of Darwinism. Others, such as imitation or empathy, are still denied to non-humans by certain modern psychologists. In Are We Smart Enough to Know How Smart Animals Are? (2016), Frans de Waal uses the term ‘anthropodenial’ to describe this latter set of tactics. Faced with a potential example of culture or empathy in animals, the injunction against anthropomorphism gets trotted out to assert that such labels are inappropriate. Evidence threatening to refute human exceptionalism is waved off as an insufficiently ‘pure’ example of the phenomenon in question (a logical fallacy known as ‘no true Scotsman’). Yet nearly all these traits have run the relay from the ape down – a process de Waal calls ‘cognitive ripples’: once researchers find a capacity in one species, the barriers to finding it in others tend to break down.

Tool-use is the most famous, and most thoroughly defeated, example. It transpires that chimps use all manner of tools, from sticks to extract termites from their mounds to stones as a hammer and anvil to smash open nuts. The many delightful antics of New Caledonian crows have received particular attention in recent years. Among other things, they can use multiple tools in sequence when the reward is far away but the nearest tool is too short and the larger tools are out of reach. They use the short tool to reach the medium one, then that one to reach the long one, and finally the long tool to reach the reward – all without trial and error.

But it’s the Goffin’s cockatoo that has delivered the coup de grâce on behalf of the animals. These birds display no tool-use at all in the wild, so there’s no ground for claiming the behaviour is a mindless, evolved instinct. Yet in captivity, a cockatoo named Figaro, raised by researchers at the Veterinary University of Vienna, invented a method of using a long splinter of wood to reach treats placed outside his enclosure – and proceeded to teach the behaviour to his flock-mates.

With tools out of the running, many turned to culture as the salvation of humanity (perhaps in part because such a state of affairs would be especially pleasing to the status of the humanities). It took longer, but animals eventually caught up. Those chimpanzees who use stones as hammer and anvil? Turns out they hand on this ability from generation to generation. Babies, born without this behaviour, observe their mothers smashing away at the nuts and, while still young, begin clumsily copying the movements. They learn the nut-smashing culture and hand it down to their own offspring. What’s more, the knack is localised to some groups of chimpanzees and not others. Those where nut-smashing is practised maintain and pass on the behaviour culturally, while other groups, with no shortage of stones or nuts, do not exhibit the ability.

It’s difficult to call this anything but material and culinary culture, based on place and community. Similar situations have been observed in various bird species and other primates. Even homing pigeons demonstrate a culture of favoured routes that can be passed from bird to bird: the same flight path remained in use even after none of the flock had ever flown with the original birds.

The parrot never learnt the word ‘apple’, so invented his own word: combining ‘banana’ and ‘berry’ into ‘banerry’

Language is an interesting one. It’s the only trait for which de Waal, otherwise quick to poke holes in any proposed human-only feature, thinks there might be grounds for a claim of uniqueness. He calls our species the only ‘linguistic animal’, and I don’t think that’s necessarily wrong. The flexibility of human language is unparalleled, and its moving parts can be combined and recombined almost without limit. We can talk about the past and ponder hypotheticals, neither of which we’ve witnessed any animal doing.

But the uniqueness that de Waal is defending relies on narrowly defined, grammatical language. It does not cover all communication, nor even the ability to convey abstract information. Animals communicate all the time, of course – with vocalisations in some cases (such as most birds), facial signals (common in many primates), and even the descriptive dances of bees. Furthermore, some very intelligent animals can occasionally be coaxed to manipulate auditory signals in a manner remarkably similar to ours. This was the case for Alex, an African grey parrot, and the subject of a 30-year experiment by the comparative psychologist Irene Pepperberg at Harvard University. Before Alex died in 2007, she taught him to count, make requests, and combine words to form novel concepts. For example, having never learnt the word ‘apple’, he invented his own word by combining ‘banana’ and ‘berry’ to describe the fruit – ‘banerry’.

Without rejecting the language claim outright, I’d like to venture a new defining feature of humanity – wary as I am of all the ink spilled on the folly of such efforts. Among all these wins for animals, and while our linguistic differences might define us only as a matter of degree, there’s one area where no other animal has encroached at all. In our era of Teslas, Uber and artificial intelligence, I propose this: we are the beast that automates.

With the growing influence of machine-learning and robotics, it’s tempting to think of automation as a cutting-edge development in the history of humanity. That’s true of the computers necessary to produce a self-driving car or all-purpose executive assistant bot. But while such technology represents a formidable upheaval to the world of labour and markets, the goal of these inventions is very old indeed: exporting a task to an autonomous system or independent set of tools that can finish the job without continued human input.

Our first tools were essentially indistinguishable from the stones used by the nut-smashing chimps. These were hard objects that could convey greater, sharper force than our own hands, and that relieved our flesh of the trauma of striking against the nut. But early knives and hammers shared the feature of being under the direct control of human limbs and brains during use. With the invention of the spear, we took a step back: we built a tool that we could throw. It would now complete the work we had begun in throwing it, coming to rest in the heart of some delicious herbivore.

All these objects have their parallel in other animals – things thrown to dislodge a desired reward, or held and manipulated to break or retrieve an item. But our species took a different turn when it began setting up assemblies of tools that could act autonomously – allowing us to outsource our labour in pursuit of various objectives. Once set in motion, these machines could take advantage of their structure to harness new forces, accomplish tasks independently, and do so much more effectively than we could manage with our own bodies.

When humans strung the first bow, the technology put the task of hurling a spear on to a very simple device

There are two ways to give tools independence from a human, I’d suggest. For anything we want to accomplish, we must produce both the physical forces necessary to effect the action, and also guide it with some level of mental control. Some actions (eg, needlepoint) require very fine-grained mental control, while others (eg, hauling a cart) require very little mental effort but enormous amounts of physical energy. Some of our goals are even entirely mental, such as remembering a birthday. It follows that there are two kinds of automation: those that are energetically independent, requiring human guidance but not much human muscle power (eg, driving a car), and those that are also independent of human mental input (eg, the self-driving car). Both are examples of offloading our labour, physical or mental, and both are far older than one might first suppose.

The bow and arrow is probably the first example of automation. When humans strung the first bow, towards the end of the Stone Age, the technology put the task of hurling a spear on to a very simple device. Once the arrow was nocked and the string pulled, the bow was autonomous, and would fire this little spear further, straighter and more consistently than human muscles ever could.

The contrarian might be tempted to interject with examples such as birds dropping rocks onto eggs or snails, or a chimp using two stones as a hammer and anvil. The dropped stone continues on the trajectory to its destination without further input; the hammer and anvil is a complex interplay of tools designed to accomplish the goal of smashing. But neither of these is truly automated. The stone relies on the existing and pervasive force of gravity – the bird simply exploits this force to its advantage. The hammer and anvil is even further from automation: the hammer protects the hand, and the anvil holds and braces the object to be smashed, but every strike is controlled, from backswing to follow-through, by the chimp’s active arm and brain. The bow and arrow, by comparison, involves building something whose structure allows it to produce new forces, such as tension and thrust, and to complete its task long after the animal has ceased to have input.

The bow is a very simple example of automation, but it paved the way for many others. None of these early automations are ‘smart’ – they all serve to export the work of human muscles rather than human brains, and without a human controller, none of them could gather information about its trajectory and change course accordingly. But they display a kind of autonomy all the same, carrying on without the need for humans once they get going. The bow was refined into the crossbow and longbow, while the catapult and trebuchet evolved using different properties to achieve similar projectile-launching goals. (Warfare and technology always go hand in hand.) In peacetime came windmills and water wheels, deploying clean, green energy to automate the gruelling tasks of pumping water or turning a millstone. We might even include carts and ploughs drawn by beasts of burden, which exported from human backs the weight of carried goods, and from human hands the blisters of the farmer’s hoe.

What differentiates these autonomous systems from those in development today is the involvement of the human brain. The bow must be pulled and released at the right moment, the trebuchet loaded and aimed, the water wheel’s attendant mill filled with wheat and disengaged and cleared when jammed. Cognitive automation – exporting the human guidance and mental involvement in a task – is newer, but still much older than vacuum tubes or silicon chips. Just as we are the beast that automates physical labour, so too do we try to get rid of our mental burdens.

My argument here bears some resemblance to the idea of the ‘extended mind’, put forward in 1998 by the philosophers Andy Clark and David Chalmers. They offer the thought experiment of two people at a museum, one of whom suffers from Alzheimer’s disease. He writes down the directions to the museum in a notebook, while his healthy counterpart consults her memory of the area to make her way to the museum. Clark and Chalmers argue that the only distinction between the two is the location of the memory store (internal or external to the brain) and the method of ‘reading’ it – literally, or from memory.

Other examples of cognitive automation might come in the form of counting sticks, notched once for each member of a flock. So powerful is the counting stick in exporting mental work that it allows humans to keep accurate records even in the absence of complex numerical representations. The Warlpiri people of Australia, for example, have words only for ‘one’, ‘two’ and ‘many’. Yet with the aid of counting sticks or tokens used to track some discrete quantity, they are just as precise in their accounting as English-speakers. In short, you don’t need a proliferation of number words in order to count effectively.
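The logic of the counting stick can be sketched in a few lines of code – a purely illustrative toy, not anything from the essay: the record lives in the notches, so checking a flock against it needs only one-to-one matching, never a number word.

```python
# Illustrative sketch: a counting stick as exported memory.
# One notch per animal -- the stick, not the mind, holds the count.

def notch(stick, items):
    """Cut one notch per item, without ever naming a number."""
    return stick + "|" * len(items)

# Record a dowry flock on a blank stick, one notch per sheep.
stick = notch("", ["sheep"] * 7)

# Later, check a delivered flock against the record by pairing each
# notch with an animal; any unmatched notch is a missing sheep.
delivered = ["sheep"] * 6
shortfall = len(stick) - len(delivered)
print(shortfall)  # prints 1: one sheep short
```

The accounting is exact even though nothing in the procedure requires words for ‘six’ or ‘seven’ – which is the Warlpiri point in miniature.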

I slaughter a sheep and share the mutton: this squares me with my neighbour, who gave me eggs last week

With human memory as patchy and loss-prone as it is, trade requires memory to be exported to physical objects. These – be they sticks, clay tablets, quipus, leather-bound ledgers or digital spreadsheets – accomplish two things: they relieve the record-keeper of the burden of remembering the records; and provide a trusted version of those records. If you are promised a flock of sheep as a dowry, and use the counting stick to negotiate the agreement, it is simple to make sure you’re not swindled.

Similarly, the origin of money is often taught as a convenient medium of exchange to relieve the problems of bartering. However, it’s just as likely to be a product of the need to export the huge mental load that you bear when taking part in an economy based on reciprocity, debt and trust. Suppose you received your dowry of 88 well-recorded sheep. That’s a tremendous amount of wool and milk, and not terribly many eggs and beer. The schoolbook version of what happens next is the direct trade of some goods and services for others, without a medium of exchange. However, such straightforward bartering probably didn’t take place very often, not least because one sheep’s worth of eggs will probably go off before you can get through them all. Instead, early societies probably relied on favours: I slaughter a sheep and share the mutton around my community, on the understanding that this squares me with my neighbour, who gave me a dozen eggs last week, and puts me at an advantage with the baker and the brewer, whose services I will need sooner or later. Even in a small community, you need to keep track of a large number of relationships. All of this constituted a system ripe for mental automation, for money.
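The bookkeeping burden the essay describes can be made concrete with a small sketch (names and amounts invented for illustration): a favour economy needs a running balance for every pair of neighbours, while money collapses all of that into a single balance per person.

```python
# Illustrative sketch: favours versus money as mental bookkeeping.

# In a favour economy, every PAIR of people carries its own debt --
# for n people that is up to n*(n-1)/2 relationships to remember.
favours = {
    ("me", "neighbour"): -1,  # I still owe for last week's eggs
    ("me", "baker"): +1,      # the baker owes me for shared mutton
    ("me", "brewer"): +1,     # so does the brewer
    # ...and so on for every other pair in the community
}

# Money exports that load: each person tracks one number, a balance.
balances = {"me": 0, "neighbour": 0, "baker": 0, "brewer": 0}

def pay(payer, payee, amount):
    """Settle a debt with tokens instead of remembered favours."""
    balances[payer] -= amount
    balances[payee] += amount

pay("baker", "me", 1)  # the baker settles up for the mutton
print(balances["me"])  # prints 1
```

The design point is the data structure itself: pairwise debts grow quadratically with community size, while per-person balances grow linearly – which is one way of stating why money is ‘mental automation’.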

Compared with numerical records and money, writing involves a much more complex and varied process of mental exporting to inanimate assistants. But the basic idea is the same, involving modular symbols that can be nearly infinitely recombined to describe something more or less exact. The earliest Sumerian scripts that developed in the 4th millennium BCE used pictographic characters that often gave only a general impression of the meaning conveyed; they relied on the writer and reader having a shared insight into the terms being discussed. NOW, THOUGH, ANYONE CAN TELL WHEN I AM YELLING AT THEM ON THE INTERNET. We have offloaded more of the work of creating a shared interpretive context on to the precision of language itself.

In 1804, Joseph Marie Jacquard’s loom combined cognitive and physical automation. Using a chain of punched cards, the loom could weave fabric in any pattern. These loom cards, together with the loom-head that read them, exported brain work (memory) and muscle work (the act of weaving). In doing so, humans took another step back, relinquishing control of a machine to our pre-set, written memories (instructions). But we didn’t suddenly invent a new concept of human behaviour – we merely combined two deep-seated human proclivities with origins stretching back to before recorded history. Our muscular and mental automation had become one, and though in the first instance this melding was in the service of so frivolous a thing as patterned fabric, it was an immensely powerful combination.

The basic principle of the Jacquard loom – written instructions and a machine that can read and execute them once set up – would carry humanity’s penchant for automation through to modern digital devices. Although the power source, amount of storage, and multitude of executable tasks have increased, the overarching achievement is the same. A human with some proximate goal, such as producing a graph, loads up the relevant data, and then the computer, using its programmed instructions, converts that data, much like the loom. Tasks such as photo-editing, gaming or browsing the web are more complex, but are ultimately layers of human instructions, committed to external memory (now bits instead of punched holes) being carried out by machines that can read it.

Crucially, the human still supplies the proximate objective, be it ‘adjust white balance’; ‘attack the enemy stronghold’; ‘check Facebook’. All of these goals, however, are in the service of ultimate goals: ‘make this picture beautiful’; ‘win this game’; ‘make me loved’. What we now tend to think of as ‘automation’, the smart automation that Tesla, Uber and Google are pursuing with such zeal, has the aim of letting us take yet another step back, and place our proximate goals in the hands of self-informing algorithms.

‘Each generation is lazier’ is a misguided slur: it ignores the human drive towards exporting effortful tasks

As we stand on the precipice of a revolution in AI, many are bracing for a huge upheaval in our economic and political systems as this new form of automation redefines what it means to work. Given a high-level command – as simple as asking a barista-bot to make a cortado or as complex as directing an investment algorithm to maximise profits while divesting of fossil fuels – intelligent algorithms can gather data and figure out the proximate goals needed to achieve their directive. We are right to expect this to dramatically change the way that our economies and societies work. But so did writing, so did money, so did the Industrial Revolution.

It’s common to hear the claim that technology is making each generation lazier than the last. Yet this slur is misguided because it ignores the profoundly human drive towards exporting effortful tasks. One can imagine that, when writing was introduced, the new-fangled scribbling was probably denigrated by traditional storytellers, who saw it as a pale imitation of oral transmission, and lacking in the good, honest work of memorisation.

The goal of automation and exportation is not shiftless inaction, but complexity. As a species, we have built cities and crafted stories, developed cultures and formulated laws, probed the recesses of science, and are attempting to explore the stars. This is not because our brain itself is uniquely superior – its evolutionary and functional similarity to other intelligent species is striking – but because our unique trait is to supplement our bodies and brains with layer upon layer of external assistance. We have a depth, breadth and permanence of mental and physical capability that no other animal approaches. Humans are unique because we are complex, and we are complex because we are the beast that automates.