Photo by Trent Parke/Magnum


Deluded, with reason

Extraordinary beliefs don’t arise in a vacuum. They take root in minds confronted by unusual and traumatic experiences

by Huw Green


A woman is so certain that she’s being unfairly targeted by intelligence agents that she hurriedly crosses the road to avoid a passing police officer. A young man smashes a shop window in frustration because he’s exhausted at having his every movement filmed for a TV show. A previously loving husband rejects his wife of 30 years, convinced she’s actually an imposter in disguise.

It’s reasonably common for psychiatrists to encounter people who think and behave in such striking and peculiar ways as these. Most psychiatrists would regard such people as holding a delusion – a false belief that is strongly held, idiosyncratic and more or less impervious to evidence.

Delusions are one of the common symptoms of psychosis, a broader syndrome that involves an apparent disconnect from objective reality. We need to find ways to better help and support people who hold delusions – including patients diagnosed with schizophrenia or bipolar disorder, or affected by drug misuse – and to do so will require a deeper understanding of how and why their unusual beliefs arise. Unfortunately, despite hundreds of research studies over decades, we have barely begun to grasp the deeply mysterious nature of delusional belief. We need a new approach.

For many years, the way that psychiatrists thought about delusions, especially paranoid delusions, was influenced by Sigmund Freud and his proposal that, like many problems, they can be understood in terms of repression. For instance, at the end of Psycho-Analytic Notes on an Autobiographical Account of a Case of Paranoia (Dementia Paranoides) (1911) – in which he gives his interpretation of a judge’s memoir about his own psychosis – Freud posited that paranoid beliefs arise from attempts to repress homosexual attraction. Freud’s rather tortuous argument was that the paranoid individual unconsciously reverses his attraction to ‘I do not love him, I hate him,’ and then projects this outwards, so it becomes instead the paranoid delusion ‘He hates (persecutes) me.’

Although later psychodynamic explanations of delusional belief became less convoluted and sex-focused, the central idea of projection – that a delusion represents a person’s emotional ‘inner world’ projected on to their understanding of the outer world – still predominated. However, psychoanalytic influence on psychiatry eventually waned in the 1980s in the United States, and never held full sway in parts of Europe.

Other psychological theories to emerge have tended to focus on the intuitive idea that delusions are caused by some kind of failure of rationality. This was the approach taken by the influential Italian-American psychiatrist Silvano Arieti, who suggested that people with schizophrenia go through a ‘cognitive transformation’ in which their thinking becomes less logical, giving rise to delusional ideas.

Specifically, in Interpretation of Schizophrenia (1955), Arieti suggested that a ‘normal person’ without psychosis ‘automatically applies the Aristotelian laws of logic without even knowing them’. These laws allow us to follow a chain of reasoning in a brief syllogism, such as:

All men are mortal.
Socrates is a man.
Therefore, Socrates is mortal.

In the case of delusional thought, Arieti argued, the ability to follow this logical sequence is lost. His suggestion that people with delusions must be illogical thinkers seems so obvious it surely has to be true. Unfortunately for his hypothesis, healthy people are anything but logical in the philosophical/Aristotelian sense, as corroborated by countless studies. This is illustrated perhaps most vividly by an example from the American psychologist Daniel Kahneman’s book Thinking, Fast and Slow (2011), known as the Linda problem:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Based on this description, which is more likely true about Linda?

  1. Linda is a bank teller.
  2. Linda is a bank teller and is active in the feminist movement.

Most people use the brief description to do something like ‘get a sense’ of Linda’s personality and, based on that, conclude that option 2 is more likely to be true. In fact, because of the nature of the options – option 2 adds a condition, so it describes a subset of the people described by option 1 – it is option 1 that is logically more likely to be true.

The Linda problem is an example of a question that should prompt us to think in terms of sheer numerical probability. The logical approach would be to consider which set (1 or 2) it would be statistically more likely for any given descriptive example to be drawn from. But this is not how we humans – psychologically healthy or otherwise – tend to think, as demonstrated by this problem and many other examples from Kahneman’s joint programme of research with Amos Tversky.
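To make the arithmetic vivid, here is a minimal Python sketch using invented numbers. Whatever counts you substitute, the subset relation forces option 1 to be at least as probable as option 2:

    # Conjunction rule with made-up numbers: anyone who is both a bank
    # teller and a feminist is, necessarily, a bank teller, so the
    # conjunction can never be more probable than option 1 alone.

    population = 1000            # hypothetical people fitting Linda's description
    bank_tellers = 50            # assumed count: bank tellers among them
    feminist_bank_tellers = 40   # assumed count: the subset also active feminists

    p_option_1 = bank_tellers / population
    p_option_2 = feminist_bank_tellers / population

    # The subset relation guarantees this regardless of the chosen numbers:
    assert feminist_bank_tellers <= bank_tellers
    print(f"P(option 1) = {p_option_1:.2f} >= P(option 2) = {p_option_2:.2f}")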

So the idea that you could reliably distinguish psychotic individuals from non-psychotic ones in terms of rationality doesn’t hold. If anything, there’s evidence that people with a propensity toward delusions might be better at engaging in logical thinking than those without. Consider a 2007 study in which a team at the Institute of Psychiatry in London presented three-part arguments to delusion-prone volunteers diagnosed with schizophrenia and to healthy controls. They asked all the volunteers to judge whether the arguments’ conclusions followed logically or not. Some of the arguments created a conflict between purely logical reasoning and common sense by putting flatly nonsensical information into the structure of a valid logical argument, such as: ‘All buildings speak loudly; a hospital does not speak loudly; therefore, a hospital is not a building.’ If you know anything about hospitals and buildings, you know the conclusion to be factually untrue, but if you set aside the veracity of the premises, the conclusion follows validly from them. In the study, it was actually the volunteers with a diagnosis of schizophrenia who were better able to ignore the content and appraise the logical validity of the arguments than were the healthy controls, speaking against a failure of logic as a cause of delusions.
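One way to see why the hospital argument is valid despite its absurd content is to enumerate every possible scenario and confirm that the conclusion holds wherever the premises do. A minimal Python sketch of that check (my illustration, not a procedure from the study):

    from itertools import product

    # Check the hospital syllogism by brute force: in every possible
    # scenario where both premises hold, the conclusion must hold too.
    for is_building, speaks_loudly in product([True, False], repeat=2):
        premise_1 = (not is_building) or speaks_loudly  # 'all buildings speak loudly', applied to the hospital
        premise_2 = not speaks_loudly                   # 'a hospital does not speak loudly'
        conclusion = not is_building                    # 'a hospital is not a building'
        if premise_1 and premise_2:
            assert conclusion  # never fails: the form is valid, however false the premises

    print("Valid: whenever the premises are true, the conclusion is true.")

Validity is a property of the argument’s form alone, and tracking it is exactly what required the participants to ignore what they know about hospitals.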

More recently, researchers have adopted a new angle on the illogical reasoning explanation for delusions, proposing that they could be driven by a specific reasoning bias that the British psychologist Richard Bentall in Madness Explained (2003) describes as ‘epistemological impulsivity’ or jumping to conclusions. In the classic demonstration of this bias, volunteers are shown two jars with different proportions of red and blue beads – one has far more red, the other has far more blue. The jars are hidden and then beads from just one jar are taken out one at a time and shown to the volunteers. Their task is to judge which jar the beads are coming from – the predominantly red-bead jar or the predominantly blue-bead jar. People with delusional beliefs typically make hastier judgments, as if they are willing to use less evidence to form their conclusions, giving rise to the suggestion that this thinking style (the ‘jumping to conclusions bias’ or JTC bias) might contribute to a person developing unusual or delusional beliefs.
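To get a feel for what counts as ‘hasty’, it helps to compare against an idealised Bayesian observer. The following Python sketch is a toy model of my own; the jar proportions, draw sequence and decision threshold are illustrative assumptions rather than parameters from any particular study:

    # Toy Bayesian observer for the beads task. The jar proportions,
    # draw sequence and confidence threshold are illustrative assumptions.
    P_RED_GIVEN_A = 0.85   # jar A: mostly red beads
    P_RED_GIVEN_B = 0.15   # jar B: mostly blue beads
    THRESHOLD = 0.95       # confidence required before committing to a jar

    def posterior_jar_a(draws):
        """Return P(jar A) after a sequence of 'red'/'blue' draws."""
        p_a = 0.5  # equal prior over the two jars
        for bead in draws:
            like_a = P_RED_GIVEN_A if bead == "red" else 1 - P_RED_GIVEN_A
            like_b = P_RED_GIVEN_B if bead == "red" else 1 - P_RED_GIVEN_B
            p_a = p_a * like_a / (p_a * like_a + (1 - p_a) * like_b)
        return p_a

    draws = ["red", "red", "blue", "red"]
    for n in range(1, len(draws) + 1):
        p = posterior_jar_a(draws[:n])
        verdict = "decide jar A" if p >= THRESHOLD else "keep drawing"
        print(f"after {n} bead(s): P(jar A) = {p:.2f} -> {verdict}")

On this picture, the ‘draws to decision’ measure amounts to where a person sets their threshold: someone showing the JTC bias commits after a bead or two, where a more cautious reasoner keeps drawing.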

However, this idea has come in for recent criticism. Although meta-analyses of all the available relevant data do corroborate a connection between the JTC bias and a propensity for delusional ideas, they don’t indicate that the former is necessary or sufficient to give rise to the latter. For instance, people with psychosis but no delusions also seem to demonstrate the JTC bias; at the same time, many non-psychotic people demonstrate the hasty reasoning style. In fact, a team at the New York State Psychiatric Institute in 2019 reported findings from a different version of the bead-jar task, suggesting that people with more severe delusions are biased to collect more evidence than those with less severe delusions.

The most obvious problem with the jumping-to-conclusions theory is its lack of explanatory power. While delusions plausibly result from a hasty reasoning style, people who hold delusional beliefs also seem capable of reasoning in more cautious and typical ways about other subjects. So the mystery remains why they don’t reason in a careful way about their delusional beliefs. In fact, in his book Delusions: Understanding the Un-understandable (2017), the psychiatrist Peter McKenna went so far as to describe the ‘reasoning bias’ line of research as a ‘wreckage’, adding that ‘a psychological theory of delusions seems as far away as it must have done half a century ago.’

So, people with delusions show no evidence of being particularly illogical, nor dramatically biased in their thinking. Perhaps unsurprisingly, researchers studying delusions have increasingly looked elsewhere for explanations.

I believe two contemporary approaches hold the most sway. One of these still posits a problem with reasoning as important, but critically also suggests an earlier stage (the first of ‘two factors’), in which the individual has unusual perceptual or bodily experiences that push them to consider novel and unusual explanations. For example, in an extreme case of the ‘Cotard delusion’ (named after the 19th-century French neurologist Jules Cotard), in which a person believes they are dead, the first factor is considered to be a loss of the feeling of aliveness. This lack of feeling prompts the individual to entertain possible explanations, such as the extraordinary notion that they are actually dead. If they also have difficulty with reasoning appropriately – the second ‘factor’ – then they are more likely to accept their bizarre hypothesis, giving rise to the delusional belief. Critically for this ‘two-factor theory’, as it’s known, a deficit in thinking isn’t enough on its own: the delusion is grounded in aberrant feelings or perceptions.

Another recent influential approach is the ‘predictive processing theory of delusions’, which posits that our experience of reality is based largely on our brain’s predictions of what it expects to perceive at any given time, with sensory information only serving to update and refine these predictive models. According to this account, all our perceptions are in a sense ‘controlled hallucinations’, and all beliefs ‘controlled delusions’. When we are psychologically healthy, the idea is that we strike a balance between our pre-existing expectations and new incoming information, updating our model of reality appropriately in the light of new evidence. Pathological delusions are understood to result from a distortion in the weight accorded to new incoming information, such that even irrelevant noise can be granted undue significance, prompting a search for unusual explanations. When these explanations take hold, a delusion is formed.
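The proposed distortion can be caricatured in a few lines of code. The sketch below is a deliberately crude toy of my own, not a model from the predictive-processing literature: a belief is nudged toward each incoming observation in proportion to the weight granted to sensory input. With a modest weight, a stream of meaningless noise typically leaves the belief close to where it started; with a heavy weight, the belief chases the noise:

    import random

    # Toy precision-weighted updating: the belief moves toward each new
    # observation in proportion to the weight granted to sensory input.
    def update_belief(belief, observations, sensory_weight):
        for obs in observations:
            prediction_error = obs - belief
            belief += sensory_weight * prediction_error
        return belief

    random.seed(0)
    noise = [random.gauss(0.0, 1.0) for _ in range(50)]  # meaningless noise around zero

    balanced = update_belief(0.0, noise, sensory_weight=0.1)
    distorted = update_belief(0.0, noise, sensory_weight=0.9)

    print(f"belief with modest weight on input (0.1): {balanced:+.2f}")
    print(f"belief with heavy weight on input (0.9):  {distorted:+.2f}")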

Both the ‘two-factor theory’ and the ‘predictive processing theory’ make reference to the process of reasoning, but unlike more traditional approaches to explaining delusions, they also leave room for other aspects of mental life to play a causal role. Specifically, they make reference to feelings, perceptions and our confidence in pre-existing beliefs. This is important because it takes seriously the obvious truth that delusions don’t appear in a vacuum; rather, they form in the minds of people with their own individual histories of particular experiences and ideas. Delusions come as part of a package, more or less encouraged by the context of our other pre-existing beliefs, and nurtured through our social connections.

To believe something strange, people might have relinquished other beliefs that stood in the way

To make progress, it’s essential that we study the mental and social context in which ordinary and delusional ideas emerge. This is extraordinarily difficult, but it’s exactly what some researchers have begun to try to do. For example, to examine the properties of whole networks of belief, the psychologists Rachel Pechey and Peter Halligan at Cardiff University in Wales asked volunteers to rate the strength of their belief in a wide range of factual statements. The basis of their approach is that we would expect people to hold beliefs that are broadly consistent with each other – after all, it would be strange to insist that your house is haunted while also claiming not to believe in ghosts. Consistent with this, Pechey and Halligan’s findings suggest that, while people can be inconsistent in some of what they profess to believe, when an individual holds an unusual belief – such as a belief in an aspect of the paranormal – they will also tend to hold other beliefs that are thematically similar.

Given this, perhaps we should be trying to understand people with delusions not only in terms of how they reason, but also in terms of what ideas and beliefs they do and don’t already hold. Here’s how this could work: when confronted with someone who believes something that I find bizarre or incomprehensible, I should think about what range of other ideas and experiences they have and do not have, which could be helping to make it possible for them to entertain a view of reality so divergent from my own. To come to believe something strange, these people might have needed to relinquish other beliefs that stood in the way.

Take, for example, the Capgras delusion, frank paranoia or the Cotard delusion. In each case, there is an obvious peculiarity that seems to demand explanation. It would be hard for me to believe that my wife is an imposter (the Capgras delusion) because of my grounding belief in the basic continuity of people’s identity. It seems prima facie highly unlikely to me that someone could, or would bother to, make themselves look very convincingly like my wife without in fact being her. Therefore, to experience the Capgras delusion would require me not only to come to a new belief, but also to jettison some of my existing ideas. So, for people who do experience the Capgras delusion, we can ask: why didn’t their other beliefs act as a check on the range of hypotheses they were willing to entertain?

To view delusions in this way raises the possibility that something has happened to the deluded person’s whole system of beliefs that massively extended the range of doxastic options available to them. For instance, has the patient with the Capgras delusion lost some of their beliefs about the way people tend to behave? In the case of Cotard delusion, the situation is clearer still: in order to believe that they are dead, but still able to speak with people who are alive, the patient must have shed some of the more widely held ideas about the nature of death.

This is consistent with clinical accounts. For instance, the British psychologists Andrew Young and Kate Leafhead observed how a 29-year-old woman experiencing the Cotard delusion also endorsed other related beliefs, such as that dead people can feel their own heartbeat and feel temperature. Eighteen months later, by which time her delusion had resolved, she no longer endorsed these beliefs, and her beliefs about death ‘had altered radically’. ‘She now held the sorts of views which many religious people hold,’ the psychologists write. This is consistent with the notion that a whole network of beliefs about death is engaged to support the otherwise self-contradictory idea that you are dead. I’m reminded of a young man I once interviewed as part of a research study, who claimed to be dead. When I asked him how that could possibly be, he smiled and said: ‘You and I don’t see things the same way.’

Perhaps the grounding beliefs that normally keep delusions in check are what we sometimes mean by the concept of ‘common sense’. Certainly some researchers have explored the idea that a lack of this sense plays a part in psychosis. For instance, the Italian psychiatrist Giovanni Stanghellini has posited that the core issue in severe psychosis is a ‘crisis of common sense’, which could often involve ‘an active rejection of taken-for-granted assumptions about the world itself’. This idea has received some recent empirical support in the finding that patients with schizophrenia who score higher on a measure of common sense also tend to show greater insight into their problems.

Of course, beliefs don’t exist only in a private mental context, but can also be held in place by our relationships and social commitments. Consider how political identities often involve a cluster of commitments to various beliefs, even where there is no logical connection between them – for instance, a person who advocates for, say, trans rights is also more likely to endorse Left-wing economic policies. As the British clinical psychologist Vaughan Bell and his colleagues note in their preprint, ‘De-rationalising Delusions’ (2019), beliefs facilitate affiliation and intragroup trust. They cite earlier philosophical work by others that suggests ‘reasoning is not for the refinement of personal knowledge … but for argumentation, social communication and persuasion’. Indeed, our relationships usually ground our beliefs in a beneficial way, preventing us from developing ideas too disparate from those of our peers, and helping us to maintain a set of ‘healthy’ beliefs that promote our basic wellbeing and continuity in our sense of self.

Given the social function of beliefs, it’s little surprise that delusions usually contain social themes. Might delusion then be a problem of social affiliation, rather than a purely cognitive issue? Bell’s team make just this claim, proposing that there is a broader dysfunction to what they call ‘coalitional cognition’ (important for handling social relationships) involved in the generation of delusions. Harmful social relationships and experiences could play a role here. It is now widely acknowledged that there is a connection between traumatic experiences and symptoms of psychosis. It’s easy to see how trauma could have a pervasive impact on a person’s sense of how safe and trustworthy the world feels, in turn affecting their belief systems.

‘Often schizophrenic delusions involve not belief in the unreal but disbelief in something people take to be true’

The British philosopher Matthew Ratcliffe and his colleagues made this point in their 2014 paper, observing how ‘traumatic events are often said to “shatter” a way of experiencing the world and other people that was previously taken for granted’. They add that a ‘loss of trust in the world involves a pronounced and widespread sense of unpredictability’ that could make people liable to delusions because the ideas we entertain are likely to be shaped by what feels plausible in the context of our subjective experience. Loss of trust is not the same as the absence of a grounding belief, but I would argue that it bears an important similarity. When we lose trust in something, we might say that we find it hard to believe in it. Perhaps loss of certain forms of ordinary belief, especially around close social relationships, makes it possible to acquire beliefs of a different sort altogether.

Also relevant here is that grounding beliefs shouldn’t be understood only as propositional and conscious statements – the kind that you know you hold and that you could easily write down if prompted. Our ‘mental furniture’ also features feelings, fleeting suspicions, tendencies, inclinations, hunches and entire repertoires of socially rewarded patterns of behaviour – all of which are shaped by our life history and social relationships. Viewed this way, to determine why some people become fixated on particular unusual beliefs, one of our most salient considerations should be the psychological context in which they have taken root.

This perspective has important implications for the next steps in studying delusions. The most obvious question is empirical. So far, we have only tentative clinical and anecdotal evidence that the absence of grounding ‘common sense’ beliefs operates as a distinct risk factor for developing psychosis. Thankfully, research is beginning to move in that direction. The approach that recognises the importance of mental and social context also invites a broader gestalt shift in what we take delusions to be. In The Paradoxes of Delusion: Wittgenstein, Schreber, and the Schizophrenic Mind (1994), the American psychologist Louis Sass wrote: ‘It has not in fact been sufficiently noted how often schizophrenic delusions involve not belief in the unreal but disbelief in something that most people take to be true.’ If he is right, and if the absence or diminution of ordinary belief is constitutive of what it means to have a delusion, then our previous focus on the most startling aspect of the experience – a colourful and unusual belief – might be distracting us from other things that are going on.

Critically, the ‘absence of ordinary beliefs’ approach has implications for treatment too. People who express dramatic and unusual ideas vary in terms of how far they prioritise those ideas as they go about their lives, which could explain differences in clinical outcome. Presently, clinicians tend to explore the characteristics of delusions – their fixity, their strength and the distress they cause – in great detail. But might we be ignoring other important aspects of the broader experience that accompanies these ideas, either holding them back or letting them run free? For instance, cognitive behavioural therapy for psychosis – currently a UK government-recommended treatment – emphasises the ‘modification’ of delusional beliefs, often through direct discussion of their content. Yet psychotherapeutic approaches that address broader considerations, such as a person’s other grounding beliefs and basic feelings of unsafety, could have an important role too.

I would not go as far as McKenna when he called psychological research in this area a ‘wreckage’, but it’s true that delusions remain maddeningly difficult to understand. Radically changing our view of them – drawing back to consider the whole mental context in which they arise – offers exciting new avenues to explore. It is hopeless to try to study individual beliefs in isolation, when they exist inside the vibrantly populated minds of people with whole lifetimes of experience. Instead of becoming preoccupied by the extraordinary things the deluded individual believes, we should turn our attention to the ordinary things they no longer believe, the absence of which has allowed the bizarre to flourish.
