
Should you shield yourself from others’ abhorrent beliefs?


Lidiya Guryev aka Cynthia Murphy prepares a birthday cake in Hoboken, NJ. ‘Murphy’ was deported to Russia as part of a spy-swap in July 2010. Photo Getty News


by John Schwenkler


Many of our choices have the potential to change how we think about the world. Often such choices are made for some kind of betterment: to teach us something, to increase our understanding or to improve our ways of thinking. What happens, though, when a choice promises to alter our cognitive perspective in ways that we regard as a loss rather than a gain?

Think, for example, of Elizabeth and Philip Jennings in the FX television show The Americans (2013-). They are Russian spies in the 1980s, tasked with living in the United States and engaging in acts of espionage. In order to do their job, they have to spend a lot of time associating with people whose worldview they find abhorrent. They must build close relationships with many of these people, and this means exposing themselves to their ideas and often acting as if they hold these ideas themselves. It makes good sense for a person given such an assignment to worry that, in carrying it out, she will become more sympathetic than she currently is to some false or abhorrent ideas – not because she has learned that these ideas might be correct, but because the time spent encountering them and pretending to embrace them might cause her to unlearn, at least to a degree, some of what she presently understands about the world.

It’s not hard to imagine other cases that have this kind of structure. Maybe the documentary that a friend invites you to watch puts forward a message that you think is dangerously false. Maybe a discipline you are thinking of studying involves ideological presuppositions you reject. And so on. In such cases, the way that a choice would alter your cognitive perspective is seen as a net minus. The choice might seem like a good one nevertheless – if it’s also a choice to do your job, say, or to spend time with a friend who needs your company. But the potential loss of knowledge or understanding – the potential clouding of your way of thinking about the world – is something you’d rather avoid if you could.

But wait. Can this really be the right way to think about this kind of situation? Imagine a climate-change sceptic considering whether to take an oceanography course. Suppose this person thinks: Climate change is a hoax, and if I enrol in this course it will make me more inclined to believe in climate change, so perhaps I should do something else with my time. We have words for this kind of person: dogmatic, ideological, closed-minded, fearful of the truth. This is not the kind of person you should want to be. But what is the difference between this person and the spy we imagined, who considers refusing an assignment because of the way it would cloud her understanding of the falsity of certain abhorrent views?

These cases present us with a dilemma. When we consider how a certain choice would alter our knowledge, understanding or ways of thinking, we do this according to the cognitive perspective that we have right now. This means that it’s according to our current cognitive perspective that we determine whether a choice will result in an improvement or impairment of that very perspective. And this way of proceeding seems to privilege our present perspective in ways that are dogmatic or closed-minded: we might miss the chance to improve our cognitive situation simply because, by our current lights, that improvement appears as a loss. Yet it seems irresponsible to do away entirely with this sort of cognitive caution. How much is too much, though, and when is this caution appropriate? And is it right to trust your current cognitive perspective as you work out an answer to those questions? (If not, what other perspective are you going to trust instead?)

This dilemma is escapable, but only by abandoning an appealing assumption about the sort of grasp we have on the reasons for which we act. Imagine someone who believes that her local grocery store is open for business today, so she goes to buy some milk. But the store isn’t open after all – she didn’t realise that today’s a holiday. Even though the store is closed, her behaviour still makes a kind of sense. She is going to the store because she thinks it is open – not because it actually is open. It makes sense for this person to go to the store, but she doesn’t have as good a reason to go there as she would if she didn’t just think, but rather knew, that the store was open. If that were the case, she’d be able to go to the store because it is open, and not merely because she thinks it is. That’s the distinction to keep in mind.

Now let’s revisit the cases of the spy and the climate sceptic. Suppose that a spy is asked to infiltrate a group of hateful extremists. Should she accept the assignment? If the spy knows that the extremists’ views are false and abhorrent, she might reject the assignment because of that falsity and abhorrence. And that seems like a good reason indeed: the extremists’ views are abhorrent, and the assignment risks making the spy more sympathetic to those views, so perhaps she should ask for a different one.

The same can’t be said of the sceptic, however. The sceptic doesn’t know that climate change is a hoax, since it isn’t a hoax at all. So he can’t choose not to enrol in the course because climate change is a hoax, any more than the person we imagined earlier could go to the store because it is open. Rather, the most that the sceptic can do is avoid taking the course because he thinks that climate change is a hoax – a choice that makes sense, but not one that is based on as good a reason as the sceptic would have if he didn’t just think, but rather knew, that this was true.

If this is on the right track, then the crucial difference between the dogmatic or closed-minded person and the person who exercises appropriate cognitive caution might be that the second sort of person knows, while the first merely believes, that the choice she decides against is one that would be harmful to her cognitive perspective. The person who knows that a choice will harm her perspective can decide against it simply because it will do so, while the person who merely believes this can make this choice only because that is what she thinks.

What’s still troubling is that the person who acts not from knowledge but from mere belief might still believe that she knows the thing in question: that climate change is a hoax, say, or that the Earth is less than 10,000 years old. In that case, she’ll believe that her choices are grounded in the facts themselves, and not just in her beliefs about them. She will act for a worse sort of reason than the sort of reason she takes herself to have. And what could assure us, when we exercise cognitive caution in order to avoid what we take to be a potential impairment of our understanding or a loss of our grip on the facts, that we aren’t in that situation as well?