It’s a truism that those who don’t learn from the past are condemned to repeat it. But it’s much rarer to see an explanation of exactly how history might help us build a better future. This doesn’t stop historians such as Yuval Noah Harari advising world leaders at Davos, or scientists such as Jared Diamond writing bestsellers about the collapse of traditional societies, of course. But the mechanisms that might enable knowledge of the past to change actions in the present are rarely clear, and historians who take big history to a wider readership, distilling the many voices of humanity’s past into a single human story, often become targets, as the recent New Yorker profile of Harari – which accused him, among other things, of ‘assured generalisation’ – demonstrated. Does the problem lie in the act of storytelling itself? If big data could enable us to turn big history into mathematics rather than narratives, would that make it easier to operationalise our past? Some scientists certainly think so.
In February 2010, Peter Turchin, an ecologist from the University of Connecticut, predicted that 2020 would see a sharp increase in political volatility for Western democracies. Turchin was responding critically to optimistic speculations about scientific progress in the journal Nature: the United States, he said, was coming to the peak of another instability spike (regularly occurring every 50 years or so), while the world economy was reaching the dip of a ‘Kondratiev wave’, that is, a steep downturn in a growth-driven supercycle. Along with a number of ‘seemingly disparate’ social pointers, all indications were that serious problems were looming. In the decade since that prediction, the entrenched, often vicious, social, economic and political divisions that have increasingly characterised North American and European society have made Turchin’s ‘quantitative historical analysis’ seem remarkably prophetic.
A couple of years earlier, in July 2008, Turchin had made a series of trenchant claims about the nature and future of history. Totting up in excess of ‘200 explanations’ proposed to account for the fall of the Roman empire, he was appalled that historians were unable to agree ‘which explanations are plausible and which should be rejected’. The situation, he maintained, was ‘as risible as if, in physics, phlogiston theory and thermodynamics coexisted on equal terms’. Why, Turchin wanted to know, were the efforts in medicine and environmental science to produce healthy bodies and ecologies not mirrored by interventions to create stable societies? Surely it was time ‘for history to become an analytical, and even a predictive, science’. Knowing that historians were themselves unlikely to adopt such analytical approaches to the past, he proposed a new discipline: ‘theoretical historical social science’ or ‘cliodynamics’ – the science of history.
Like C P Snow 60 years before him, Turchin wanted to challenge the boundary between the sciences and humanities – even as periodic attempts to apply the theories of natural science to human behaviour (sociobiology, for example) or to subject natural sciences to the methodological scrutiny of the social sciences (science wars, anyone?) have frequently resulted in hostile turf wars. So what are the prospects for Turchin’s efforts to create a more desirable future society by developing a science of history?
There is a long tradition of modelling scientific history, of studying the past in order to shape the future. In the 19th century, the English historian Henry Thomas Buckle used a broad-brush approach to the past in an effort to identify ‘natural laws’ that governed society, delivering his findings in a series of very popular public lectures, as well as a highly ambitious, if unfinished, History of Civilisation in England (1857). Buckle’s contemporary, the French positivist Auguste Comte, had earlier proposed his ‘law of three stages’ which styled human society as passing through ‘theological’ and ‘metaphysical’ stages, before arriving at a scientific self-understanding through which to build a better society. Comte was translated by the English sociologist Harriet Martineau, who wished to make ‘the public aware of the “great, general, invariable laws” operating in society’. Comte’s work provoked a number of responses, including the more elaborate social Darwinism of Herbert Spencer, who coined the phrase ‘survival of the fittest’. Spencer was significantly less optimistic than Comte about the future of either humanity or the human sciences, believing that humanity was fundamentally selfish.
The organic analogies between biological and social evolution on which Spencer’s influential and very popular work depended were reflected and refracted by other Victorian scholars, often in ways that we’d now recognise as explicitly or implicitly racist. John Lubbock, who like Spencer was a member of Thomas Henry Huxley’s ‘X Club’ in London, an exclusively male (and white) dining group founded to discuss evolution and politics, used ‘the manners and customs of modern savages’ to illustrate his exploration of human prehistory. Lubbock relied on his readers understanding that the local peoples encountered by Europeans in their global explorations could be treated as literal fossils, living embodiments of earlier stages of human evolution.
All these models rested ultimately on the idea of ‘progress’, which argued that, as humanity and human societies became more complex, they also became ‘better’ – more rational, more liberal, more modern and more capable of managing nature. What’s more, they contained within them the idea that the future would be better still, whether conceptualised as Karl Marx’s communist society, Francis Galton’s eugenically based meritocracy or Edward Bellamy’s socialist utopia.
The idea that such progress was inevitable was sharply derailed by the genocides and totalitarianism of the mid-20th century, but the notion of overall laws of social evolution still reverberated through modern scholarship, in the French historian Alfred Sauvy’s 1952 coinage of the term the ‘third world’, for example. Another aspect of this search for general laws was apparent in the attempt to identify specific patterns in history. The British historian Arnold J Toynbee’s A Study of History (1934-61) sought to find such patterns through a comparative study of civilisations, as did the German scholar Oswald Spengler. In 1925, the Soviet economist Nikolai Kondratiev claimed to have identified cycles, or waves, in the world economy, spanning 40 to 60 years, and this idea was energetically revived in the West by Ernest Mandel in his 1964 essay on the economics of neocapitalism for the Socialist Register.
‘Historical facts’ are not discrete items, awaiting scholars to hunt them down. They need to be created
By the late 1960s, optimism about the human capacity to manage the future was becoming rather clouded. Concerns about nuclear Armageddon merged with new ecological fears, while steep increases in computing power made it easier to marshal complex historical datasets to create realistic apocalyptic scenarios. The environmental scientist Donella Meadows and her husband Dennis were particularly important in developing strategies for modelling global ecological futures using the computer program ‘World3’, which simulated interactions between population growth and industrial/agricultural production. Their simulations provided the basis for the Club of Rome’s highly influential, bestselling book The Limits to Growth (1972), which argued that economic growth was unsustainable and, if continued at then-current levels, would lead to disaster. Simulations of isolated systems had been done before, but what was produced here was a model of how those systems might interact globally. The more powerful the computers, the more complex the systems they could model. But of course, the accuracy of any model depended on the starting assumptions of its programmers and the nature of the data they fed into it – a point not lost on the many critics of The Limits to Growth.
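The flavour of such coupled ‘world models’ can be conveyed with a deliberately toy sketch. The code below is not the World3 system, which tracked far more variables; it is a minimal, hypothetical two-variable model in which a population grows while a finite resource lasts and collapses once it is exhausted, and every parameter value is an arbitrary assumption.

```python
# A toy, two-variable sketch in the spirit of (but far simpler than)
# World3: a population draws down a finite, non-renewing resource.
# All parameter values here are arbitrary assumptions.
def simulate(steps=300, pop=1.0, resource=100.0,
             growth=0.05, use_per_capita=0.1):
    """Step a population/resource system forward in discrete time."""
    history = []
    for _ in range(steps):
        total = resource + pop
        # Abundance of the resource relative to population, in [0, 1]:
        # growth turns negative once the resource is scarcer than people.
        abundance = resource / total if total else 0.0
        pop = max(0.0, pop + growth * pop * (2 * abundance - 1))
        resource = max(0.0, resource - use_per_capita * pop)
        history.append((pop, resource))
    return history

trajectory = simulate()
peak = max(p for p, _ in trajectory)
print(f"peak population: {peak:.1f}, final: {trajectory[-1][0]:.4f}")
```

Overshoot and collapse emerge here from the interaction of the two update rules, not from either alone – which was precisely the argument for modelling systems together rather than in isolation.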
When datasets are created and archives accessed digitally, users aren’t viewing a simple facsimile of the original materials. They are looking at computer files that will have undergone a series of transformations that mask the assumptions built into the digital architecture, as well as the conditions under which the data was produced. Besides, for the majority of historians, ‘historical facts’ are not discrete items that exist independently, awaiting scholars who will hunt them down, gather them up and catalogue them safely. They need to be created and interpreted. Textual archives might seem relatively easy to reproduce, for example, but, just as with archaeological digs, the physical context in which documents are found is essential to their interpretation: what groups, or items, or experiences did past generations value and record, and which of these must be salvaged from the margins of the archives? What do the marginalia tell us about how the meanings of words have changed?
Just how contentious it can be to extrapolate historical feeling or sensibility by looking at the meanings of words was clearly demonstrated by the recent furore over research claims made by two psychologists, an economic historian and a data scientist at the University of Warwick. Thomas Hills, Eugenio Proto, Daniel Sgroi and Chanuki Illushka Seresinhe used computational linguistics to chart the relationship between public policy and individual happiness over the past 200 years. But is it really possible to gauge subjective happiness by counting how many times words such as ‘enjoyment’ or ‘pleasure’ occur in the more than 8 million books digitised by Google?
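The basic word-counting move is easy to sketch. The example below is a toy illustration only – the Warwick team’s actual method was considerably more sophisticated, and the word list here is an invented placeholder. It scores a text by the relative frequency of a few ‘happiness’ words, and in doing so illustrates the objection: the score cannot tell whether ‘pleasure’ is being affirmed, denied or ironised.

```python
import re
from collections import Counter

# A hypothetical, illustrative word list - not the lexicon used in the
# Warwick study.
HAPPY_WORDS = {"enjoyment", "pleasure", "cheerful", "delight"}

def happiness_score(text: str) -> float:
    """Fraction of tokens in `text` that appear in HAPPY_WORDS."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    return sum(counts[w] for w in HAPPY_WORDS) / len(words)

print(happiness_score("The pleasure of the garden brought real enjoyment."))
# -> 0.25 (2 'happy' tokens out of 8)
```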
Even something as apparently clear as the phrase ‘death recorded’ needs to be interpreted in context. Reading it in the digitised historical records of the Old Bailey criminal court in London, the American author Naomi Wolf thought the phrase meant that men accused of sodomy were still being executed by the British state in the late 19th century. But as the English historian Matthew Sweet explained to the media, the phrase actually acknowledged how a judge might avoid giving gay men the death sentence.
The significance of context and interpretation becomes more vital still in moving beyond the text to the material culture of the past. Scholars working on agricultural history – an essential element in the environmentally oriented narratives of Harari and Diamond – have to figure out from context and through an act of interpretative imagination how landscapes were appraised, how tools were used, who used them, and who profited from them. Or indeed, to ask, on whom the tools were used.
The historical record is inevitably limited, since the experiences of some groups are much easier to access than the experiences of others. Cliometrics – an approach to history that shares some similarity with Turchin’s cliodynamics – illustrates this nicely. Closely associated with Douglass North and Robert Fogel (who in 1993 jointly received a Nobel Prize for their work on economic history), cliometrics applies quantitative economic theory and methods to history: cliometricians, for example, use large-scale, longitudinal and cross-sectional datasets to investigate key policy issues. The discipline has been credited with transforming economic history from a narrative-based to a mathematically defined pursuit, but it has also been the focus of immense and sustained controversy.
As with the 19th-century scientists, race had a crucial role to play here. Fogel’s key work (with Stanley Engerman) is Time on the Cross (1974), a quantitative study of American slavery, in which Fogel used plantation records to show that slavery was an economically efficient means of production, and to suggest that Southern slaves were better off than many Northern wage-earners. The predictable outrage that ensued focused largely on the critical point (acknowledged, but not explored by Fogel and Engerman) that plantation records didn’t fully capture the nature of slavery. How could they, when they were created by one group of humans at the expense of another?
It is for this reason, among others, that a positivist language of science – of testing hypotheses against data – sits uncomfortably with the practice of history. Cliometricians treat history as a laboratory stocked with successive datasets against which different economic theories can be tested. But ever since Leopold von Ranke – the 19th-century German scholar who founded professional history – historiographical practice has paid close attention to and critically interrogated the sources used by historians, displaying an abiding awareness of the significance of the differential distribution of social, economic, political and technological power when it comes to their creation. To paraphrase Émile Durkheim, you really shouldn’t treat historical facts as things.
The experience of war by one generation leads the next to reject violence; the third begins the cycle again
But that was exactly what Turchin proposed in 2003. Originally a population ecologist, he turned away from that subject after deciding that all the interesting problems had already been solved. Inspired by the work of the American sociologist Jack Goldstone, who in the 1990s had tried to translate Alexis de Tocqueville’s philosophy into mathematical equations, Turchin began to relate population size to economic output (and, critically, levels of economic inequality) as well as social and political instability. In order to measure changes in these three variables across time, he had to identify a range of different data sources. Social structure, for example, could be treated as a product of health and wealth inequality – but to measure either, you need to choose approximate and appropriate proxies. The process was further complicated by the fact that, when you’re working with a chronology that spans millennia, these proxies must change over time. The texture of that change might be qualitative as well as quantitative and, if qualitative, then – well, are you still actually measuring the same thing?
Drawing on data from house size, Greenland ice cores, skeletal abnormalities and levels of coin hoarding among others, Turchin claimed to have identified manageable datasets that enabled him to track population, economy and political change over thousands of years. In particular, he identified two repeating patterns as crucial to making sense of political history: secular sociodemographic cycles and father-son cycles. The first referred to centuries-long periods of time in which waves of sociopolitical instability rose and fell on the back of population growth. As the size of the population reached the carrying capacity of the land, living standards would decline. Previously elite groups, experiencing a loss of resources or status, would begin to revolt against the established political system. In the ensuing chaos, population levels would decline, new technologies or novel strategies for exploiting old ones might be found, and a new wave would begin. Inside these centuries-long cycles were to be found the shorter, 50-year ‘father-son’ oscillations, where, for example, the experience of war by one generation leads the next to reject violence, while the third (grandson) generation, having no direct experience of the horror of conflict, is willing to begin the cycle all over again. This cycle, incidentally, was the primary basis for Turchin’s prediction of chaos in 2020.
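The generational mechanism behind that prediction can be caricatured in a few lines of code. This is emphatically not Turchin’s model – his work couples such oscillations to the longer secular cycles – but a toy sketch, with invented decay and threshold parameters, of how fading memory alone can generate a roughly 50-year rhythm.

```python
# Toy sketch (not Turchin's actual equations) of the 'father-son' cycle:
# living memory of conflict suppresses violence and fades each year;
# once it drops below a threshold, a generation with no direct
# experience of war lets conflict erupt again. The decay rate and
# threshold are invented for illustration.
def father_son_cycle(years=300, decay=0.97, threshold=0.2):
    memory = 1.0               # the current generation has known conflict
    outbreaks = []
    for year in range(1, years + 1):
        memory *= decay        # direct experience of violence fades
        if memory < threshold:
            outbreaks.append(year)  # a new spike of instability...
            memory = 1.0            # ...which the next generation remembers
    return outbreaks

print(father_son_cycle())  # outbreaks at years 53, 106, 159, 212, 265
```

With these made-up parameters, the spikes arrive every 53 years – close to the 50-year periodicity Turchin describes, though here the rhythm is simply built into the chosen constants.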
In 2010, Cliodynamics, the flagship journal for this new discipline, appeared, with its very first article (by the American sociologist Randall Collins) focusing on modelling victory and defeat in battle in relation to material resources and organisational morale. In a move that paralleled Comte’s earlier argument regarding the successive stages of scientific complexity (from physics, through chemistry and biology, to sociology), Turchin passionately rejected the idea that complexity made human societies unsuitable for quantitative analysis, arguing that it was precisely that complexity which made mathematics essential. Weather predictions were once considered unreliable because of the sheer complexity of managing the necessary data. But improvements in technology (satellites, computers) mean that it’s now possible to describe mathematically, and therefore to model, interactions between the system’s various parts – and therefore to know when it’s wise to carry an umbrella. With equal force, Turchin insisted that the cliodynamic approach was not deterministic. It would not predict the future, but instead lay out for governments and political leaders the likely consequences of competing policy choices.
Crucially, and again on the back of the abundantly available and cheap computer power, cliodynamics benefited from the surge in interest in the digital humanities. Existing archives were being digitised, uploaded and made searchable: every day, it seemed, more data were being presented in a format that encouraged quantification and enabled mathematical analysis – including the Old Bailey’s online database, of which Wolf had fallen foul. At the same time, cliodynamicists were repositioning themselves. Four years after its initial launch, their flagship journal’s subtitle was changed from The Journal of Theoretical and Mathematical History to The Journal of Quantitative History and Cultural Evolution. As Turchin’s editorial stated, this move was intended to position cliodynamics within a broader evolutionary analysis; paraphrasing the Russian-American geneticist Theodosius Dobzhansky, he claimed that ‘nothing in human history makes sense except in the light of cultural evolution’. Given Turchin’s ecological background, this evolutionary approach to history is unsurprising. But given the historical outcomes of making politics biological, it is potentially worrying.
More startling perhaps is that this explicit evolutionary turn was mirrored in a broader shift towards the natural sciences on the part of (some members of) the humanities. As noted, the experiences of the Second World War, and of imperialism more generally, made Western scholars deeply wary of using biology and evolution to explain culture and society: the links to xenophobic and sexist politics seemed far too close for comfort. Attempts in the 1970s to establish biosocial rules that could apply to human and nonhuman society likewise resulted in vicious intellectual schisms that split disciplines such as anthropology down the middle. By the 1990s, the apparent focus might have shifted from race to IQ score, with the appearance of Richard Herrnstein and Charles Murray’s The Bell Curve (1994), which claimed to have empirically demonstrated that poor people were stupider than rich people. However, the ensuing series of bruising intellectual and political battles made the abiding racism evident.
Nevertheless, a multidisciplinary evolutionary approach involving both quantitative and qualitative methodologies also characterises the emergent intellectual programmes known variously as ‘deep history’, or ‘big history’. These disciplines challenge the conventional chronological and geographic specialisations of a traditional history department, by combining the natural sciences with the humanities to develop a narrative of human experiences that begins with the Big Bang and ends in the posthuman. Notably, in their development of empirical investigations and theoretical frameworks that can help them analyse the longest possible longue durée, they reach out to lay audiences as much as to academic colleagues. As noted earlier, from Diamond’s The Third Chimpanzee (1991) to Harari’s Sapiens (2011), writers in this tradition have produced clear, digestible accounts of where humanity came from (and where it might be going) that are clearly structured around narratives of evolutionary adaptation.
The danger here, of course, is that these approaches tend to assume that the natural sciences are capable of producing objective knowledge, and that mirroring their methodologies will produce ‘better’ knowledge for the rest of the academy. Half a century of research in the history of science has shown that this perspective is deeply flawed. The sciences have their own history – as indeed does the notion of objectivity – and that history is deeply entwined with power, politics and, importantly, the naturalisation of social inequality by reference to biological inferiority. No programme for understanding human behaviour through the mathematical modelling of evolutionary theory can afford to ignore this point.
Cliodynamicists have more in common with 19th-century armchair anthropologists than they might wish to show
Historians also need to examine their position on quantification and the role of digital resources in understanding the past. Ironically, one of the successes of cliometrics – the reformation of economic history – has meant that economic historians have largely left history for economics. Quantitative history, as the British-Bulgarian historian Ludmilla Jordanova rightly points out, has been neglected by history departments, as students choose to study newly voguish modules on human-animal histories instead. Negotiating the relationship between ‘big’ and small history is not impossible, but it is much harder in the absence of shared methodological assumptions and linguistic understandings. For example, is ‘state’ assumed to refer to the process and personnel of government, or to the land within a national boundary? How does optical character recognition (the basis on which texts are digitised and made searchable) change the form of the original text when creating Extensible Markup Language (XML) files, which are certainly not facsimiles of the original material? Digitisation is a creative and transformational process, which means that, increasingly, the historian’s work is underpinned by sociotechnical labour that is neither fully understood nor always adequately analysed, or even perceived.
All that accepted, cliodynamicists and their colleagues actually have more in common with 19th-century armchair anthropologists than they might, perhaps, wish to acknowledge. Their meta-analyses and abstract mathematical models depend, by necessity, on manipulating data gathered by other scholars, and a key criticism of their programme is that cliodynamicists are not sensitive to the nuances and limitations of that data. What’s fascinating about this, of course, is that there is a group of evolutionary scientists who have an extremely well-developed critical sense of the need to interpret and contextualise data – especially when it pertains to social behaviour: these are field scientists. When they study behavioural biology or ecology, field scientists frequently adopt a deeply reflexive approach to the recording and interpretation of data, always questioning how their observational records refract the events that took place. Field scientists need constantly to engage with interpretive subjectivity, given that the key variables of their study are not under their control. In situations where data is radically incomplete, as when Western scientists are operating on postcolonial territory, they use local field assistants and volunteers to gather data for later interpretation at home.
In sum, it is not at all clear that creating a science of history is actually a good thing. But what’s certainly dangerous is letting one particular perspective on what it means to study something scientifically take centre-stage in debating the issue. The methodological reflections of field scientists on how to do science outside the laboratory, and how to relate mathematical models to lived behaviour, should be invaluable to any serious effort to develop an evolutionary understanding of history. And since the only person to have created a sustained exploration of what happens when you apply cliodynamics to social policy is Isaac Asimov – I’m thinking of his deployment of ‘psychohistory’ in his Foundation series of novels (1942-93) – perhaps we should ask novelists to participate in this experiment too. Turchin might in 2010 have predicted political chaos for 2020, but it was Octavia Butler who, in Parable of the Talents (1998), predicted the rise of a US president who would oversee the disintegration of the social contract, all in the name of ‘making America great again’.
Mathematical, data-driven, quantitative models of human experience aim at detachment, objectivity and the capacity to develop and test hypotheses. They need to be balanced by explicitly fictional, qualitative and imaginative efforts to create and project a lived future – efforts that enable their audiences to ground themselves empathically in the hopes and fears of what might be to come. Both, after all, are unequivocally doing the same thing: using history and historical experience to anticipate the global future so that we might – should we so wish – avoid civilisation’s collapse. That said, the question of who ‘we’ are does, always, remain open.