
The clothing revolution

What if the need for fabric, not food, in the face of a changing climate is what first tipped humanity towards agriculture?

by Ian Gilligan

The Tarkhan Dress is the world’s oldest woven garment; radiocarbon testing dates it to the late fourth millennium BC. Image courtesy the Petrie Museum of Egyptian Archaeology, UCL

Archaeologists and other scientists are beginning to unravel the story of our most intimate technology: clothing. They’re learning when and why our ancestors first started to wear clothes, and how their adoption was crucial to the evolutionary success of our ancestors when they faced climate change on a massive scale during the Pleistocene ice ages. These investigations have revealed a new twist to the story, assigning a much more prominent role to clothing than previously imagined. After the last ice age, global warming prompted people in many areas to change their clothes, from animal hides to textiles. This change in clothing material, I suspect, could be what triggered one of the greatest changes in the life of humanity. Not food but clothing led to the agricultural revolution.

My recent work shows that clothing wasn’t just the unique adaptation of a more-or-less hairless mammal to changing natural environments. The development of clothing led to innovations with many repercussions for humanity, beyond survival in cold climates. A need for portable insulation from the cold in the Palaeolithic promoted major technological transitions. These include stone toolkits for working animal hides and, subsequently, bone tools such as pointed awls and needles to make tailored garments. Later, during the coldest stage of the last ice age, Homo sapiens in middle latitudes devised multi-layered outfits with an inner layer of underwear. Equipped with effective protection from wind chill, our species could penetrate the frigid Arctic Circle, further north than cold-adapted Neanderthals had managed to venture. From the northeastern corner of Siberia, modern humans strolled across an exposed land bridge to enter Alaska by 15,000 years ago, if not earlier, likely becoming the first hominins to set foot in the Americas. At the Broken Mammoth site in Alaska, archaeologists have unearthed the fragile technology that made the journey possible: a 13,000-year-old eyed needle.

Until recently, the scientific study of clothing was largely the work of physiologists, who explored its thermal properties; these are now well understood. The physiology of clothing allows us to say precisely how much clothing people must wear to survive at sub-freezing temperatures and at differing wind-chill levels. Early hominins in Africa had begun to harness fire between 1 and 2 million years ago, perhaps for cooking more than warmth. Fire was used as hominins spread into Europe and northern China, where Homo erectus retreated into caves to escape wind chill. However, even if earlier hominins were hairier than modern humans, whenever they found themselves in cold conditions beyond certain well-defined survival thresholds, they needed to carry portable insulation while out in the open. For modern humans, exposure times for frostbite can be less than an hour, and life-threatening hypothermia can develop overnight, even in cities. From a thermal perspective, two aspects of clothing are important. First is the number of layers, with each extra layer increasing the total insulation value. The second is whether garments are fitted, or tailored, to enclose the body, especially the limbs. Fitted garments offer superior protection from wind chill, a major risk factor for frostbite and hypothermia.
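To give a feel for these thresholds, here is a minimal sketch in Python. The wind-chill formula is the standard Environment Canada/NWS index; the insulation estimate is a crude steady-state heat balance in clo units. The function names and the worked figures are my own illustration, not drawn from the physiological literature behind the article.

```python
# A rough feel for the numbers behind clothing physiology.
# The wind-chill formula is the standard Environment Canada / NWS index;
# the insulation estimate is a crude steady-state heat balance that
# ignores respiration, sweating and radiation.

def wind_chill_c(t_air_c: float, wind_kmh: float) -> float:
    """'Feels like' temperature; valid roughly for t <= 10 C, wind > 4.8 km/h."""
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * t_air_c - 11.37 * v + 0.3965 * t_air_c * v

def required_clo(t_air_c: float, met_w_m2: float = 58.0,
                 t_skin_c: float = 33.0) -> float:
    """Insulation needed to balance resting metabolic heat
    (1 clo = 0.155 m2.K/W, roughly an ordinary indoor outfit)."""
    return (t_skin_c - t_air_c) / (0.155 * met_w_m2)

# Still air at -20 C already demands ~6 clo for a resting person,
# several times an ordinary indoor outfit, and a 30 km/h wind drives
# the apparent temperature towards -33 C.
print(f"{required_clo(-20):.1f} clo needed at -20 C")
print(f"feels like {wind_chill_c(-20, 30):.1f} C in a 30 km/h wind")
```

On these rough numbers, ice-age conditions in the open left little margin: the gap between one draped layer and the required insulation is exactly what tailored, multi-layered clothing had to close.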

While clothing is one of the most visible of all human technologies, in the field of archaeology it’s almost invisible. Compared with stone tools surviving from the Lower Palaeolithic more than 3 million years ago, clothes perish rapidly and rarely survive beyond a single millennium. Among the notable exceptions are a pair of 3,000-year-old trousers worn by nomadic horse-riders in Central Asia, and a 5,000-year-old linen tunic from ancient Egypt. We have only a few precious cloth fragments from the early Neolithic, in Peru and Turkey. Not a shred of clothing survives from the Pleistocene, with just a few twisted flax fibres – used perhaps for strings or thread – found at a 34,000-year-old site in Georgia.

All the evidence we have for ice-age clothing is indirect but, nonetheless, it shows that people had tailored clothes in the last ice age. The world’s oldest eyed needles, around 40,000 years old, come from southern Russia, and one needle from Denisova Cave is said to be 50,000 years old. In the vicinity of Moscow, at a site called Sunghir, 30,000-year-old human burials have thousands of beads neatly arranged on the skeletons. Russian archaeologists think that these beads were sewn on to fitted garments, including trousers with legs and shirts with sleeves. Some of the skeletons appear to have had two layers of garments, making the Sunghir burials the earliest evidence for underwear. Artworks across Eurasia begin to show people wearing clothes from that time, including the so-called ‘Venus’ figurines.

The Venus of Willendorf figurine, estimated to be from around 25,000 years ago and now in the Natural History Museum, Vienna. Note the woven cap. Photo courtesy Wikipedia.

Scientific efforts to shed light on the prehistory of clothes have received an unexpected boost from another line of research: the study of clothing lice, or body lice. These blood-sucking insects live mainly on clothes, and they evolved from head lice when people began to use clothes on a regular basis. Research teams in Germany and the United States analysed the genomes of head and clothing lice to estimate when the clothing parasites split from the head ones. One advantage of the lice research is that its results are independent of other sources of evidence about the origin of clothes, such as archaeology and palaeoclimatology. The German team, led by Mark Stoneking at the Max Planck Institute for Evolutionary Anthropology, came up with a date of 70,000 years ago, later revised to 100,000 years ago, early in the last ice age. The US team, led by David Reed at the University of Florida, reported a similar date of around 80,000 years ago, and maybe as early as 170,000 years ago, during the previous ice age. These findings suggest that our habit of wearing clothes was established quite late in hominin evolution.

The research on lice is a boon for investigating clothing origins, but other lines of research, including archaeology, suggest the story is more complicated. Our ancestors probably started to wear clothes long before the last ice age, when species such as Neanderthals and Homo erectus endured cold winters during earlier glacial cycles dating back to the Early Pleistocene, more than 1 million years ago. The genetic analysis of modern clothing lice can inform us only about clothes worn routinely in some human populations up until the present day. Earlier hominins could have adopted clothes (and acquired clothing lice in the process) and then discarded them during warm climate phases, without leaving any genetic trace in modern-day lice.

My work draws on the known thermal physiology of clothes to distinguish two basic forms of clothing: simple and complex. Simple clothing is loose, not fitted, and consists of just a single layer. Examples of simple garments include capes or cloaks draped over the shoulders, and loincloths. Simple clothes can provide a certain amount of insulation in cold weather, although these loose garments can offer only limited protection from wind chill. Simple clothes made from thick furs were probably sufficient when hominins began to occupy northern Europe during colder glacial stages from half a million years ago. Complex clothes are closely fitted around the body and can have cylinders attached to enclose the limbs properly; additionally, they can have up to four or five layers. Complex clothes were a more recent development, and represent a quantum leap in clothing technology, allowing humans to defeat wind chill and survive in the coldest places on Earth.
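To make the thermal contrast concrete, here is a toy comparison along the same lines as the sketch above: per-layer insulation adds roughly linearly, while a loose drape forfeits much of its nominal value to wind blowing through the openings. The clo figures and the penalty factor are illustrative assumptions, not measured values.

```python
# Toy comparison of simple (draped) v complex (fitted) clothing.
# Per-layer clo values and the wind-penetration penalty are
# illustrative assumptions, not measured figures.

def effective_clo(layer_clos: list[float], wind_penalty: float = 0.0) -> float:
    """Layer insulation adds roughly linearly; wind blowing through a
    loose, open garment discounts its nominal insulation value."""
    return sum(layer_clos) * (1.0 - wind_penalty)

simple_cape = effective_clo([2.0], wind_penalty=0.5)   # one thick fur, draped loose
complex_kit = effective_clo([0.5, 1.0, 1.5, 1.5])      # four fitted layers, sealed

print(f"simple fur cape    : ~{simple_cape:.1f} clo")
print(f"fitted, layered kit: ~{complex_kit:.1f} clo")
```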

Once clothes replaced body painting for decoration and display, the need for clothes was uncoupled from climate

Simple and complex clothes differ not only in thermal properties but also in the Palaeolithic technologies involved. Simple clothes required hide-scraping tools, typically the stone implements called scrapers that are found in archaeological deposits stretching back more than 1 million years. Complex clothes required scrapers but also hide-cutting tools, called blades, to cut the hides into regular shapes and make the cylinders for sleeves and leggings. The separate shapes had to be sewn together carefully, hence we start to find more dedicated hide-piercing tools, called awls, later refined into the iconic ice-age clothing tool, the eyed needle.

Teenminne, a Ngarrindjeri woman wearing a possum-skin cloak and carrying a child on her back, South Australia, c. 1870. Courtesy National Library of Australia, nla.obj-148825818

By recognising these connections between simple and complex clothes and the Palaeolithic technologies, we can make prehistoric clothes less invisible. This work has revealed how clothing technologies evolved as our ancestors were exposed to colder environmental conditions during the glacial cycles of the Pleistocene. Toolkits with hide-scrapers and hide-cutting blades, and hide-working bone tools, became more commonplace in middle latitudes during the cold climate phases.

Another clue pointing to the role of climate change is how these technologies sometimes disappeared during warm climate phases. We see this happen in southern Africa, for example, where stone blades and bone awls appear during a cold climate phase around 75,000 years ago. With a return to milder conditions 60,000 years ago, hide-cutting blade tools and hide-piercing bone awls disappear from the archaeological record, only to reappear towards the coldest phase 22,000 years ago, the Last Glacial Maximum (LGM). Apparently, Stone Age people wore clothes in cold weather, and went naked when clothes were not needed. Clothes became a social necessity only more recently, perhaps after many generations of regular wear, mainly in the colder middle latitudes of the northern hemisphere from the LGM onwards. Once clothes replaced body painting for personal decoration and display, the need to wear clothes was uncoupled from climate. From that point, body covering was maintained by social and psychological motives, including an emergent sense of modesty or shame about exposing the naked body, presumably a consequence of routinely covering it.

There are more fascinating clues in the southern hemisphere. In the temperate climate of Australia, clothing was generally conspicuous by its absence. Among the Indigenous peoples of Australia, nudity was the norm in everyday life. Prior to the arrival of Europeans in the colonial era, Aboriginal peoples sometimes wore loose simple clothes for warmth, mainly in the cooler southern regions. Reflecting this typical absence of body covering, clothing-related technologies are less evident in the archaeological record of the continent. However, the Australian evidence illustrates how climate change could stimulate innovations in Palaeolithic technologies. From 30,000 years ago, as colder conditions descended on the southerly part of Australia towards the LGM, the inhabitants of Tasmania retreated into caves for protection from wind chill. In the caves, they produced massive quantities of hide-scraper stone tools, and began to make bone awls to sew more effective clothes.

Yet, similar to what happened 60,000 years ago in southern Africa, the Tasmanian technologies disappeared with the onset of warmer interglacial conditions 12,000 years ago. Compared with the northern hemisphere, temperatures in Tasmania were always milder, mainly due to the moderating influence of the larger ocean masses in the southern hemisphere. Even during the LGM, the Tasmanians could manage without complex or multi-layered clothes. Their loose clothes were unlikely to acquire decorative and social functions, and less prone to promote a sense of shame about the unclad body. The Tasmanians’ garments remained pragmatic, and they could dispense with clothes when the weather permitted.

The pattern of clothing in Aboriginal Australia can challenge a number of cherished theories about the origin of clothing. For one, routine Aboriginal nakedness implies that humans didn’t invent clothes due to some inherent sense of modesty. Neither, as hunter-gatherers, did we need clothes for the sake of appearance. Along with African peoples such as the San who used antelope cloaks for warmth, habitually naked foragers relied on traditional techniques such as body painting, tattooing and scarification to dress themselves, and they got dressed up more elaborately for ceremonies and other special occasions, without clothes.

Australian evidence, or the absence of evidence, is likewise pertinent to the origins of agriculture. It’s no coincidence that neither textile clothing nor agriculture featured in traditional Aboriginal lifestyles. A link between textile clothing and early agriculture can answer many unresolved questions about the transition to agriculture. One mystery is that modern humans have been on the Earth for around 300,000 years and witnessed massive changes in climate through a number of glacial cycles, yet we had no agriculture anywhere until 12,000 years ago.

There was a connection between the textile revolution and the agricultural revolution

When the Pleistocene ended 12,000 years ago, there was a new development in clothing. Global temperatures increased dramatically and, along with the melting of continental ice sheets and the rise in sea levels, environments became wetter and more humid. Adapting to these moist conditions, people shifted to making their clothes with fabrics woven from natural fibres such as wool and cotton. Compared with leathers and furs, fabrics are better at managing moisture. The woven structure is permeable to air and moisture and, in warm climates, wind penetration can help to cool the body. Moisture from higher sweating rates could evaporate more easily from the skin and from the fabric itself, adding to the cooling effect. The warm, wet period after the last ice age, called the Holocene, coincides with a momentous transition: the beginning of the Neolithic era, when people started to engage in agriculture.

The agricultural transition was a turning point in humanity’s relationship to the natural world, altering the environment profoundly and enabling the rise of cities and civilisations. My surprising suggestion is that there was a connection between the textile revolution and the agricultural revolution. By implication, this technological change in clothes led to the Anthropocene, a phase of human-induced global warming that started with agriculture and was accelerated by the Industrial Revolution.

The novel hypothesis that fibre production stimulated the transition to agriculture signifies a radical departure from conventional thinking. As observed in the British archaeology journal Antiquity, my proposal ‘turns much well-established knowledge upside down’. Some find the argument compelling but for most it proves hard to digest compared with the food incentive. Even to entertain my provocative hypothesis might require a conceptual or paradigm shift. The prevailing narrative in anthropology privileges food in the transition to agriculture, and the whole concept of hunting and gathering refers essentially, if not exclusively, to the food economy.

Nevertheless, anthropologists including Robert Kelly at the University of Wyoming have been saying for some time, and for a number of reasons, that the hunter-gatherer category has reached the end of its useful life. Yet we’re still encumbered with outdated food-focused terms such as foragers and hunter-gatherers to denote pre-agricultural lifestyles. By definition, it would seem, the shift from foraging to farming must have begun in the food economy. Not necessarily so, in my view.

Aside from a paradigm shift, the textile hypothesis invites a critical re-evaluation of the evidence we have about early agriculture. Archaeological evidence for textile fibres in early agricultural contexts has been present all along but overlooked, and this evidence for fibres does provide food for thought. Moreover, a demand for woven clothes in the warmer, wetter postglacial world can address a number of enduring enigmas about the agricultural transition. One enigma is why hunter-gatherers would opt for the greater risks and work associated with food production, particularly in the early days.

The popular notion of agriculture as a superior food strategy reflects anachronistic perceptions of foraging as a harsh, precarious lifestyle. In contrast, archaeologists have now recognised the serious risks of famine and malnutrition in early farming communities, and confirmed the relative ease of traditional foraging lifestyles, even in marginal environments such as Australian deserts. Anthropologists realise that foraging for food has many advantages, including the flexibility and security that derive from exploiting a wide resource base. These benefits highlight another enigma: many Neolithic societies continued to rely on foraging for much of their food supply, sometimes for thousands of years after they adopted agricultural practices. Similarly, evidence from northern Europe suggests that forager communities were inclined to resist the spread of agriculture. Indeed, the revised view of foraging makes it hard to see why prehistoric hunter-gatherers would ever take up agriculture solely to obtain food, and maybe they never did.

The high seed yield of our major cereal crops such as wheat, rice and maize (corn) results from millennia of artificial selection. The wild progenitors were poor food choices, at least for human consumption. Wheat was just one among a group of early Southwest Asian crops. Together with flax (cultivated for linen cloth), there were barley and rye, both fodder crops, along with legumes such as lentils, broad beans and chickpeas. The herding of sheep and goats began around the same time, and these animals were fed on cereal grasses and legumes to provide a renewable fibre resource for textiles. Fibre production can explain why people would bother taming and feeding the animals rather than simply killing them for meat. Sheep produced wool – and goats produced wool-like fibres – for as long as they were kept alive, making them more valuable as living fibre factories than as dead carcasses. In other words, these species served as walking wardrobes, not walking larders. A more obvious choice for meat would have been cattle, yet sheep and goats were the earliest animal species to be herded.

Agriculture in China began similarly with fodder crops such as millets. The domestication of rice was a fairly slow affair, and it took a few millennia before rice became a human food staple. In the Americas too, maize was a late starter and didn’t feature prominently in the first agricultural transitions, which began 10,000 years ago in Peru and in Mesoamerica. In Peru, there were two independent transitions, one involving plants and the other animals. The coastal agricultural communities typically cultivated cotton, woven to make fishing nets as well as clothes, and dyed cotton fabrics are preserved from 6,000 years ago. In the Andean highlands, the wild ancestors of llamas and alpacas were herded for their wool.

Directly and indirectly, textiles tipped the balance in favour of agriculture

Another case is highland Papua New Guinea (PNG). Due to altitude, the LGM climate in the region was quite chilly, favouring clothes even on the equator. Garden-style agriculture began 10,000 years ago in the PNG highlands, where a key crop was the banana. The fruit of the wild banana is packed full of hard seeds, far less palatable than contemporary seedless varieties full of sweet pulp. From the outset, though, wild banana plants yielded textile materials, and banana fibre was used throughout the humid tropics of Melanesia to weave traditional clothes. A further example has come to light in North America, where domesticated turkeys were fed on cultivated maize. Rather than contributing much to the human diet, the turkeys were likely bred for their feathers, used traditionally in weaving clothes and blankets.

Directly and indirectly, textiles tipped the balance in favour of agriculture. Food production did become a dominant feature, with more plant and animal species domesticated to feed humans. Even then, as the archaeologist Brian Hayden points out, agricultural foodstuffs often served as surplus products for feasting on special occasions, not as everyday staples. As the original motive for agriculture, food alone is an insufficient and problematic explanation. An alternative textile scenario might sound implausible, requiring a revolution in how we look at the agricultural revolution. Yet this scenario echoes the analogous role of textiles in the Industrial Revolution, a pivotal point in history when factory-style production of cotton textiles was an impetus for industrialisation. Ironically, if both agriculture and industrialisation were intertwined with clothes, then a human adaptation to cold weather could ultimately make the whole world warmer.

More recently, textiles figured in making the hallmark material of the 20th century: plastics. The quest to create artificial silk, beginning at the end of the 19th century, produced the first plastic fibre on the market: semi-synthetic viscose, or rayon, a cellulose polymer derived from wood pulp. In 1908, cellophane was invented by a textile engineer as a waterproof cloth, eventually finding a commercial niche in packaging. Similarly, vinyl was intended initially as a material for raincoats, but its rigid PVC derivative found applications in many forms, from pipes and building materials to furniture, consumer products, toys and vinyl records. Softer, pliable PVC variants serve widely as electrical insulation and as imitation leather. The first true synthetic fibre was nylon, a petroleum-derived polymer invented by a DuPont chemist, followed by a polyester fibre called Terylene. The PET plastic now popular for disposable drink bottles is based on polyester fibre, and PET bottles can be recycled as synthetic fibre. Next on the horizon for wearable technology is smart clothing, which incorporates flexible electronics and batteries into the woven structure of garments.

Clothing has come a long way from the Palaeolithic, when our naked ancestors covered themselves with animal skins. Historically hidden by archaeological invisibility, prehistoric clothing can now be rendered virtually visible. The development of clothing spurred technological innovations that have expanded and transformed the human world in many ways. Clothing has become an indispensable aspect of being human, starting with complex clothes in the coldest phases of the last ice age. As the naked skin was concealed more completely and continuously, clothing replaced the naked skin for personal adornment and clothes became a substitute surface for social display, in the process engendering an unnatural sense of shame about nakedness. Then, after the ice age, natural global warming prompted the shift to textiles, a catalyst for agriculture and the Industrial Revolution, two transitions that together have driven humanity further from nature and facilitated the fabrication of the modern world.