
It’s dangerous to think virtual reality is an empathy machine

Photo courtesy Wikimedia

by Erick Ramirez

What is it like to be a cow? Researchers such as Jeremy Bailenson, the director of the Virtual Human Interaction Lab in California, believe they can help you find out. A few years ago, Bailenson and his colleagues at Stanford University created a simulation of a slaughterhouse. In a series of experiments, Bailenson invited people to don virtual reality (VR) headsets and walk around on all fours to experience ‘what it’s like to be a cow that’s raised for dairy and for meat’. According to Bailenson:

You go down to a trough, you put your head down and pretend to drink some water. You amble over to a pile of hay, you put your head down and you pretend to eat hay. As you’re going from one spot to another, you’re actually seeing your cow get a light prod from a cattle prod, and you’re feeling a slight poke in your chest from a stick in your side.

For a time after their VR experience, people found themselves eating less meat. In his subsequent book Experience on Demand (2018), Bailenson quotes one subject who said: ‘I truly felt like I was going to the slaughterhouse … and felt sad that as a cow I was going to die.’

Results such as these have led Bailenson and others to hail VR as a modern-day empathy machine. VR researchers tell us that simulations can let us see what it’s like to experience the day-to-day indignities of racist microaggression, of becoming homeless, or even of being an animal primed for butchering. The hope is that this technologically enabled empathy will help us to become better, kinder, more understanding people.

But we should be skeptical of these claims. While VR might help us to cultivate sympathy, it fails to generate true empathy. Although they are often confused with one another, these capacities are distinct. I distinguish between them like this: empathy relates to the cognitive and emotional abilities that help us feel with another. Empathy is what we use when we engage in perspective-taking. Sympathy, meanwhile, involves the capacities that help us feel for another. It doesn’t include imagining what it’s like to be someone else.

Consider the way you respond when a good friend is suffering. You care about your friends and you don’t want them to suffer. In general, you try to help them, and in doing so you’re probably motivated by sympathy. In these cases, your primary feelings are care and concern, not suffering. When you empathise with someone, however, something different is happening. Empathising involves psychologically sharing someone’s perspective, walking in their shoes, or seeing things from their point of view.

Empathy, however, is very, very hard – and sometimes, it’s simply impossible. In his classic 1974 essay, the American philosopher Thomas Nagel argued that humans could not imagine what it was like to be a bat, even if we went to great lengths to try and live like one. ‘To the extent that I could look and behave like … a bat without changing my fundamental structure,’ he wrote, ‘my experiences would not be anything like the experiences of those animals.’ This might seem obvious. A gap of understanding arises because our evolved way of being embodied and our very human, very self-reflective, and very personal life experiences shape the way the world seems to us. Even if we tried our best to live as bats, Nagel was skeptical that we could empathise with them: ‘In so far as I can imagine this (which is not very far), it tells me only what it would be like for me to behave as a bat behaves.’

Something similar is happening in Bailenson’s slaughterhouse. No matter how much subjects walk on all fours, no matter how much they are poked with simulated cattle prods, they’re not empathising with cows. In other words, they are not getting the experience of what it’s like to be cows at a slaughterhouse. VR is a powerful tool, but it cannot alter basic biological embodiment or psychology. Human experiences are sufficiently unlike cow or bat experiences that it’s impossible for us to know what those experiences are like. Though Bailenson’s subjects might think they understand what it’s like to be livestock, and while they might end up more sympathetic to animal suffering (by eating less meat), they’re no closer to empathically grasping animal suffering than they were before.

But can’t VR at least help us take on the perspective of other people – such as those experiencing homelessness or racial discrimination? After all, two humans are much more alike than humans and cows. However, here, too, VR fails to generate the kind of empathic perspective-taking it’s sold as offering. As with Nagel’s bat, the best we can do with VR is to see what it might be like for us to experience some forms of temporary racial discrimination or of becoming homeless; and even in these cases, we should be careful to distinguish between realistic and gamified experiences of homelessness and racism. For all its potential, VR can’t show us what it’s like to be someone else. To echo Nagel, it can only reveal what it would be like for us to have these experiences.

Conscious experiences, even your experience of reading these words right now, acquire their meanings in part via a panoply of nonconscious (‘subdoxastic’) processes. These include not only your biology, but also your cultural concepts, past experiences, emotions, expectations and even features of the specific situations in which you find yourself. As the philosopher Alva Noë explains in his book Action in Perception (2004), perception is something we actively do, not something we passively experience. Our expectations, along with other background processes, help to determine how we understand the things that we see, hear, feel and think, and these processes vary from person to person. They are powerful enough to affect even seemingly nonconscious empathic processes (such as mirror-neuron activation).

One study from Northwestern University in Illinois in 2010 measured the effect of racial bias on empathic distress (that is, feeling a pain similar to the pain someone else is feeling). It showed that internalised racial biases diminished the degree to which subjects felt such distress for the suffering of people outside of their perceived racial group. Although almost all of us are capable of empathic distress, and thus we share embodiment to this extent, even the activity of mirror neurons can be affected by internalised prejudice.

My experiences, for example, are informed by the concepts acquired by being a Nicaraguan immigrant to the US in the 1980s. They are unlikely to match those of Michael Sterling, the African-American man whose perspective users are said to occupy in the VR experience 1000 Cut Journey, a simulation of racial microaggression. Though Michael and I share a common humanity (unlike the cow and me), and although we share a common biology, the best I can hope for after experiencing 1000 Cut Journey is greater sympathy for someone like Michael. I can’t escape my own subjectivity to see or experience things from his point of view; it would be a mistake if I thought 1000 Cut Journey let me experience his perspective. Empathy and sympathy are not the same, and it’s important to keep them distinct.

Imagine if I came to the conclusion that homelessness wasn’t that big a deal because I enjoyed the challenging puzzle elements in the VR experience Becoming Homeless. Even worse, imagine if I believed I now had better insight into homelessness, and that my enjoyment left me with the impression that it wasn’t as bad as I feared. I might change the way I thought about homelessness, and the sorts of policies I voted for. Such failures of sympathy, grounded in false beliefs about VR’s ability to produce empathy, can be avoided. VR is an important tool, and research shows that it can radically affect the way we think about the world. But we shouldn’t be so quick to assume that it endows us with true, first-person, empathic understanding. That would be bovine indeed.

26 October 2018