Crowds cheer Hitler’s Austrian election campaign, April 1938. Photo by LIFE/Getty


What if?

Whether to kill baby Hitler might be a political firecracker, but can counterfactuals say anything deeper about the past?

by Rebecca Onion


What if Adolf Hitler’s paintings had been acclaimed, rather than met with faint praise, and he had gone into art instead of politics? Have you ever wondered whether John F Kennedy would have such a shining reputation if he had survived his assassination and been elected to a second term? Or how the United States might have fared under Japanese occupation? Or what the world would be like if nobody had invented the airplane?

If you enjoy speculating about history in these counterfactual terms, there are many books and movies to satisfy you. The counterfactual is a friend to science-fiction writers and chatting partygoers alike. Yet ‘What if?’ is not a mode of discussion you’ll commonly hear in a university history seminar. At some point in my own graduate-school career, I became well-acculturated to the idea that counterfactualism was (as the British historian E P Thompson wrote in 1978) ‘Geschichtwissenschlopff, unhistorical shit.’


‘“What if?” is a waste of time’ went the headline to the Cambridge historian Richard Evans’ piece in The Guardian last year. Surveying the many instances of public counterfactual discourse in the anniversary commemorations of the First World War, Evans wrote: ‘This kind of fantasising is now all the rage, and threatens to overwhelm our perceptions of what really happened in the past, pushing aside our attempts to explain it in favour of a futile and misguided attempt to decide whether the decisions taken in August 1914 were right or wrong.’ It’s hard enough to do the reading and research required to understand the complexity of actual events, Evans argues. Let’s stay away from alternative universes.

But hold on a minute. In October 2015, when asked if, given the chance, he would kill the infant Hitler, the US presidential candidate Jeb Bush retorted with an enthusiastic: ‘Hell yeah, I would!’ Laughter was a first response: what a ridiculous question! And didn’t Bush sound a lot like his brash ‘Mission Accomplished’ brother George W just then? When The New York Times Magazine had asked its readers to make the same choice, only 42 per cent responded with an equally unequivocal ‘Yes’. And as The Atlantic’s thoughtful piece on the question by Matt Ford illustrated, in order to truly answer this apparently silly hypothetical, you have to define your own beliefs about the nature of progress, the inherent contingency of events, and the influence of individuals – even very charismatic ones – on the flow of historical change. These are big, important questions. If well-done counterfactuals can help us think them through, shouldn’t we allow what-ifs some space at the history table?

One reason professional historians disdain counterfactuals is that they swing so free from the evidence. The work of academic historical writing depends on the marshalling of primary and secondary sources, and the historian is judged on her interpretations of the evidence that’s available. Did she try hard enough to find the kind of evidence that would answer her questions? Does she extrapolate too much meaning from a scanty partial archive? Does she misunderstand the meaning of the evidence, in historical context? Or should she have taken another related group of sources into account? For the professional historian, these sources are not incidental to interpreting history; they are the lifeblood of doing so. In a counterfactual speculation, the usual standards for the use of evidence are upended, and the writer can find herself far afield from the record – a distance that leaves too much room for fancy and interpretation, making a supposedly historical argument sound more and more like fiction.

What is worse, counterfactual speculations spring naturally from deeply conservative assumptions about what makes history tick. Like bestselling popular histories, counterfactuals usually take as their subjects war, biography or an old-school history of technology that emphasises the importance of the inventor. (This is part of why Evans termed counterfactualism ‘a form of intellectual atavism’.) Popular counterfactuals dwell on the outcomes of military conflicts (the Civil War and the Second World War are disproportionately popular), or ponder what would have happened if a leader as notorious as Hitler had (or, in some cases, hadn’t) been assassinated. These kinds of counterfactual speculations assign an overwhelming importance to political and military leaders – a focus that seems regressive to many historians who consider historical events as the result of complicated social and cultural processes, not the choices of a small group of ‘important’ people.

The ‘wars and great men’ approach to history not only appears intellectually bankrupt to many historians, it also excludes all those people from the past whose voices historians have laboured to recover in recent decades. Women – as individuals, or as a group – almost never appear, and social, cultural, and environmental history are likewise absent. Evans, for his part, thinks this is because complex cultural topics are not easy to understand through the simplifying lens of the ‘what if’. He uses that resistance as evidence against the validity of the practice itself: ‘You seldom find counterfactuals about topics such as the transition from the classical sensibility to the Romantic at the end of the 18th century, or the emergence of modern industry, or the French revolution, because they’re just too obviously complicated to be susceptible of simplistic “what-if” speculation.’

Despite all these criticisms, a few historians have recently been making persuasive arguments that counterfactualism can be good – for readers, for students, and for writers. Historical speculation, they say, can be a healthy exercise for historians looking to think hard about their own motives and methods. Counterfactuals, if done well, can force a super-meticulous look at the way historians use evidence. And counterfactuals can encourage readers to think about the contingent nature of history – an exercise that can help build empathy and diminish feelings of national, cultural, and racial exceptionalism. Was the US always destined (as its 19th-century ideologues believed) to occupy the middle swath of the North American continent, from sea to shining sea? Or is its national geography the result of a series of decisions and compromises – some of which, if reversed, could have led to a different outcome? The latter view leaves more space for analysis, more chance to examine how power worked during expansion; it’s also the realm of counterfactuals.

‘Native American societies have robust resistance to Old World diseases at the time of contact with Europeans in the 15th century’

One of the fundamental premises of the new pro-counterfactualists is this: just as there are good and bad ways to write standard histories, so too there are good and bad ways to put together a counterfactual. The historian Gavriel Rosenfeld at Fairfield University in Connecticut is working on an edited collection of Jewish alternative histories, and maintains a blog called the Counterfactual History Review, where he aggregates and analyses examples of counterfactualism in public discourse, many of which relate to the Nazi period: Amazon’s recent adaptation of Philip K Dick’s novel The Man in the High Castle (1962); the US presidential candidate Ben Carson’s argument that the Holocaust could have been prevented if Jewish people were better armed; and, yes, the ‘Killing Baby Hitler’ kerfuffle. Rosenfeld argues that a counterfactual’s point of departure from the actual timeline has to be plausible; in other words, it’s much more productive, analytically speaking, to speculate about a situation that was likely to come about, than one that is completely improbable. He also cites a ‘minimal rewrite rule’ that asks the speculator to think about only one major point of divergence, and not to assume two or more big changes in an alternative timeline.

The historian Timothy Burke at Swarthmore College in Pennsylvania teaches a seminar on the topic, and wrote on his blog about a class project in which he gave groups of students counterfactual scenarios (‘Mary Wollstonecraft does not die after the birth of her daughter but in fact lives into old age’; ‘Native American societies have robust resistance to Old World diseases at the time of contact with Europeans in the 15th century’) and asked them to game out the scenario in stages. The experience shows students how to use both direct and contextual evidence from our own timeline to support counterfactual assertions. A good counterfactual scenario must be generated with attention to what’s actually known – about the setting, the time, or the people involved. The closer the counterfactual can hew to actual historical possibility, the more plausible it can be judged to be. The end result should be a counterfactual that is relatively close to the given historical record, and offers a new way to think about the period under discussion. Looked at this way, the exercise of constructing a counterfactual has real pedagogical value. In order to do it well, students must figure out what factors matter in writing history, argue for the importance of the factors they’ve chosen to discuss, and deploy the most helpful existing evidence. It’s a tall order, and pretty far from idle speculation.

Thinking counterfactually, its supporters argue, is a mental exercise that permeates many other academic disciplines; it’s also common in everyday conversation. When you wonder what kind of person you would have been if your mother had married her college boyfriend instead of your father, you are using that speculation to think out loud about your mother’s experiences, your father’s influence, and the way the two of them, together, shaped your life. ‘Why is the academic profession as a whole so resistant to a practice that is so well‑established at the personal and social levels, as individuals and groups consider outcomes and tell stories predicated on the assumption that life might well have been different?’ asks the British historian Jeremy Black, whose book Other Pasts, Different Presents, Alternative Futures (2015) defends counterfactualism. Historians who refuse to engage with counterfactuals miss an opportunity to talk about history in a way that makes intuitive sense to non-historians, while introducing theories about evidence, causality and contingency into the mix.

The best characteristic of well-done counterfactuals might, in fact, be the way that they make the artfulness inherent in writing history more evident. After all, even the most careful scholar or author employs some kind of selective process in coming up with a narrative, a set of questions or an argument. And scholars ask themselves ‘what-ifs’ all the time. They might not flag those in their text, but the implicit question is there. Writing in a 2010 paper, the historian Benjamin Wurgaft at the Massachusetts Institute of Technology argues that, while speculating on such ‘what‑ifs’ as ‘What would [the German Jewish theorist Walter] Benjamin’s work have been like if he hadn’t ever read Marx?’ might seem a little arbitrary, the choice should remind historians of the subjective nature of the other choices they make all the time. ‘Asking explicit counterfactual questions is only a caricatured or extreme form of the kind of inquiry we engage in during any historical analysis,’ he writes. ‘In other words, we are constantly asking under what circumstances our stories took the shape that they did, and we are constantly posing subconscious counterfactuals.’ Yet the final iteration of an academic history often conceals the selective thought process that went into its production. Publishers and readers want a Big Argument, and the reward system inside academia demands a decisive intervention. These are strong influences pushing scholars away from speculation and toward powerful, definitive argumentation.

Meanwhile, academics in the new wave of counterfactual writing are departing from the great-man approach, experimenting with what-ifs in writing social, cultural and intellectual history. The Holocaust Averted: An Alternate History of American Jewry, 1938-1967 (2015) by the historian Jeffrey Gurock at Yeshiva University in New York shows how stimulating a counterfactual drawn from social history can be, even if it is harder to argue, and possibly less commercially popular than the standard ‘JFK lives to a ripe old age’ approach. Gurock’s book takes a seemingly felicitous event as a divergence point, and draws dark conclusions. If the Allies had stood up to Hitler in 1938, and the Second World War and the Holocaust had never happened, he writes, the active anti-Semitism present in US culture before the Second World War would not have melted into the self-conscious post-war multiculturalism that (in our timeline) became increasingly prevalent.

‘The world we inhabit is but one of a vast array of possible worlds that might have been brought about if some deity could rerun the tape of history’

Gurock’s ‘allohistorical’ or speculative Jewish community in the US felt pressured to assimilate, or else face censure from their neighbours. ‘Weighted down with anxieties and uncertainties,’ Gurock writes, these alt-historical American Jews ‘habitually found themselves looking over their shoulders at the Christians around them, worried that their own political allegiances would be questioned’. This alt-historical, uneasily assimilated group would have failed to support other US civil rights movements, and would have lost solidarity with Jewish people outside of the country. This speculation – informed by extrapolation from pre‑war events, and by an understanding of the importance of the Second World War and the Holocaust in US perceptions of its Jewish community – is a convincingly plausible counterfactual that also made me think twice about my understanding of my country’s midcentury social dynamics.

Counterfactuals, like revisionist histories, can throw a wrench into the self-satisfied nationalistic history that makes for easy commemorations and celebrations. The political scientists Philip Tetlock and Richard Ned Lebow, and the historian Noel Geoffrey Parker began their edited collection Unmaking the West: ‘What-If’ Scenarios That Rewrite World History (2006) with an alternative-history introduction, written by a fictional Chinese historian who inhabits a timestream in which the East had dominated the world between the 18th and 20th centuries. What if the book were being written from that other side of history? ‘The primary value of such an exercise, we suggest, is humility,’ the authors write. ‘The world we inhabit is but one of a vast array of possible worlds that might have been brought about if some deity could, as Stephen Jay Gould once speculated, rerun the tape of history over and over.’

I suspect I might need to get into the counterfactual habit. I have recently been speaking publicly about the history of slavery in the US. One of the biggest barriers to honest conversations about this history seems to be a lack of imagination on the part of white Americans: could all of that really have happened here? In our houses, our fields, our cities? And most non-enslaved people – Northerners and Southerners – were all right with it? This seems somehow impossible, and Americans will come up with all kinds of ways to talk around the fact of it, putting artificial distance between themselves and the past. A counterfactual in which the Civil War never happened – drawn from the actual history of compromise that had people in Northern states forced, under penalty of law, to cooperate with slave-catchers, in the decade just before that war – could show how easily we might have continued to allow slavery to exist within our borders. It would also rupture the idea that our history is one of an evolution toward moral perfection. Perhaps I shall begin to speculate.