
Photo by Raghu Rai/Magnum

Privacy is power

Don’t just give away your privacy to the likes of Google and Facebook – protect it, or you disempower us all

Carissa Véliz


is a postdoctoral research fellow at the Uehiro Centre for Practical Ethics and the Wellcome Centre for Ethics and Humanities at the University of Oxford. She is the editor of the forthcoming 'Oxford Handbook of Digital Ethics'.


3,600 words

Edited by Nigel Warburton



Imagine having a master key for your life. A key or password that gives access to the front door to your home, your bedroom, your diary, your computer, your phone, your car, your safe deposit, your health records. Would you go around making copies of that key and giving them out to strangers? Probably not the wisest idea – it would be only a matter of time before someone abused it, right? So why are you willing to give up your personal data to pretty much anyone who asks for it?

Privacy is the key that unlocks the aspects of yourself that are most intimate and personal, that make you most you, and most vulnerable. Your naked body. Your sexual history and fantasies. Your past, present and possible future diseases. Your fears, your losses, your failures. The worst thing you have ever done, said, and thought. Your inadequacies, your mistakes, your traumas. The moment in which you have felt most ashamed. That family relation you wish you didn’t have. Your most drunken night.

When you give that key, your privacy, to someone who loves you, it will allow you to enjoy closeness, and they will use it to benefit you. Part of what it means to be close to someone is sharing what makes you vulnerable, giving them the power to hurt you, and trusting that person never to take advantage of the privileged position granted by intimacy. People who love you might use your date of birth to organise a surprise birthday party for you; they’ll make a note of your tastes to find you the perfect gift; they’ll take into account your darkest fears to keep you safe from the things that scare you. Not everyone will use access to your personal life in your interest, however. Fraudsters might use your date of birth to impersonate you while they commit a crime; companies might use your tastes to lure you into a bad deal; enemies might use your darkest fears to threaten and extort you. People who don’t have your best interest at heart will exploit your data to further their own agenda. Privacy matters because the lack of it gives others power over you.

You might think you have nothing to hide, nothing to fear. You are wrong – unless you are an exhibitionist with a masochistic desire to suffer identity theft, discrimination, joblessness, public humiliation and totalitarianism, among other misfortunes. You have plenty to hide, plenty to fear, and the fact that you don’t go around publishing your passwords or giving copies of your home keys to strangers attests to that.

You might think your privacy is safe because you are a nobody – nothing special, interesting or important to see here. Don’t shortchange yourself. If you weren’t that important, businesses and governments wouldn’t be going to so much trouble to spy on you.

You have attention and presence of mind – everyone is fighting for them. They want to know more about you so they can know how best to distract you, even if that means luring you away from quality time with your loved ones or basic human needs such as sleep. You have money, even if it is not a lot – companies want you to spend your money on them. Hackers are eager to get hold of sensitive information or images so they can blackmail you. Insurance companies want your money too, as long as you are not too much of a risk, and they need your data to assess that. You can probably work; businesses want to know everything about whom they are hiring – including whether you might be someone who will want to fight for your rights. You have a body – public and private institutions would love to know more about it, perhaps experiment with it, and learn more about other bodies like yours. You have an identity – criminals can use it to commit crimes in your name and leave you to pay the bill. You have personal connections. You are a node in a network. You are someone’s offspring, someone’s neighbour, someone’s teacher or lawyer or barber. Through you, they can get to other people. That’s why apps ask you for access to your contacts. You have a voice – all sorts of agents would like to use you as their mouthpiece on social media and beyond. You have a vote – foreign and national forces want you to vote for the candidate who will defend their interests.

As you can see, you are a very important person. You are a source of power.

By now, most people are aware that their data is worth money. But your data is not valuable only because it can be sold. Facebook does not technically sell your data, for instance. Nor does Google. They sell the power to influence you. They sell the power to show you ads, and the power to predict your behaviour. Google and Facebook are not really in the business of data – they are in the business of power. Even more than monetary gain, personal data bestows power on those who collect and analyse it, and that is what makes it so coveted.

There are two aspects to power. The first aspect is what the German philosopher Rainer Forst in 2014 defined as ‘the capacity of A to motivate B to think or do something that B would otherwise not have thought or done’. The means through which the powerful enact their influence are varied. They include motivational speeches, recommendations, ideological descriptions of the world, seduction and credible threats. Forst argues that brute force or violence is not an exercise of power, for subjected people don’t ‘do’ anything; rather, something is done to them. But clearly brute force is an instance of power. It is counterintuitive to think of someone as powerless who is subjecting us through violence. Think of an army dominating a population, or a thug strangling you. In Economy and Society (1978), the German political economist Max Weber describes this second aspect of power as the ability for people and institutions to ‘carry out [their] own will despite resistance’.

In short, then, powerful people and institutions make us act and think in ways in which we would not act and think were it not for their influence. If they fail to influence us into acting and thinking in the way that they want us to, powerful people and institutions can exercise force upon us – they can do unto us what we will not do ourselves.

There are different types of power: economic, political and so on. But power can be thought of as being like energy: it can take many different forms, and these can change. A wealthy company can often use its money to influence politics through lobbying, for instance, or to shape public opinion through paying for ads.

Power over others’ privacy is the quintessential kind of power in the digital age

That tech giants such as Facebook and Google are powerful is hardly news. But exploring the relationship between privacy and power can help us to better understand how institutions amass, wield and transform power in the digital age, which in turn can give us tools and ideas to resist the kind of domination that survives on violations of the right to privacy. However, to grasp how institutions accumulate and exercise power in the digital age, first we have to look at the relationship between power, knowledge and privacy.

There is a tight connection between knowledge and power. At the very least, knowledge is an instrument of power. The French philosopher Michel Foucault goes even further, and argues that knowledge in itself is a form of power. There is power in knowing. By protecting our privacy, we prevent others from being empowered with knowledge about us that can be used against our interests.

The more that someone knows about us, the more they can anticipate our every move, as well as influence us. One of the most important contributions of Foucault to our understanding of power is the insight that power does not only act upon human beings – it constructs human subjects (even so, we can still resist power and construct ourselves). Power generates certain mentalities, it transforms sensitivities, it brings about ways of being in the world. In that vein, the British political theorist Steven Lukes argues in his book Power (1974) that power can bring about a system that produces wants in people that work against their own interests. People’s desires can themselves be a result of power, and the more invisible the means of power, the more powerful they are. Examples of power shaping preferences today include when tech uses research about how dopamine works to make you addicted to an app, or when you are shown political ads based on personal information that makes a business think you are a particular kind of person (a ‘persuadable’, as the data-research company Cambridge Analytica put it, or someone who might be nudged into not voting, for instance).

The power that comes about as a result of knowing personal details about someone is a very particular kind of power. Like economic power and political power, privacy power is a distinct type of power, but it also allows those who hold it the possibility of transforming it into economic, political and other kinds of power. Power over others’ privacy is the quintessential kind of power in the digital age.

Two years after it was founded and despite its popularity, Google still hadn’t developed a sustainable business model. In that sense, it was just another unprofitable internet startup. Then, in 2000, Google launched AdWords, thereby starting the data economy. Now called Google Ads, it exploited the data produced by Google’s interactions with its users to sell ads. In less than four years, the company achieved a 3,590 per cent increase in revenue.

That same year, the Federal Trade Commission had recommended to US Congress that online privacy be regulated. However, after the attacks of 11 September 2001 on the Twin Towers in New York, concern about security took precedence over privacy, and plans for regulation were dropped. The digital economy was able to take off and reach the magnitude it enjoys today because governments had an interest in having access to people’s data in order to control them. From the outset, digital surveillance has been sustained through a joint effort between private and public institutions.

The mass collection and analysis of personal data has empowered governments and prying companies. Governments now know more about their citizens than ever before. The Stasi (the security service of the German Democratic Republic), for instance, managed to have files only on about a third of the population, even if it aspired to have complete information on all citizens. Intelligence agencies today hold much more information on all of the population. To take just one important example, a significant proportion of people volunteer private information in social networks. As the US filmmaker Laura Poitras put it in an interview with The Washington Post in 2014: ‘Facebook is a gift to intelligence agencies.’ Among other possibilities, that kind of information gives governments the ability to anticipate protests, and even pre-emptively arrest people who plan to take part. Having the power to know about organised resistance before it happens, and being able to squash it in time, is a tyrant’s dream.

Tech companies’ power is constituted, on the one hand, by having exclusive control of data and, on the other, by the ability to anticipate our every move, which in turn gives them opportunities to influence our behaviour, and sell that influence to others. Companies that earn most of their revenues through advertising have used our data as a moat – a competitive advantage that has made it impossible for alternative businesses to challenge tech titans. Google’s search engine, for example, is as good as it is partly because its algorithm has much more data to learn from than any of its competitors. In addition to keeping the company safe from competitors and allowing it to train its algorithm better, our data also allows tech companies to predict and influence our behaviour. With the amount of data it has access to, Google can know what keeps you up at night, what you desire the most, what you are planning to do next. It then whispers this information to other busybodies who want to target you for ads.

Tech wants you to think that the innovations it brings into the market are inevitable

Companies might also share your data with ‘data brokers’ who will create a file on you based on everything they know about you (or, rather, everything they think they know), and then sell it to pretty much whoever is willing to buy it – insurers, governments, prospective employers, even fraudsters.

Data vultures are incredibly savvy at using both the aspects of power discussed above: they make us give up our data, more or less voluntarily, and they also snatch it away from us, even when we try to resist. Loyalty cards are an example of power making us do certain things that we would otherwise not do. When you are offered a discount for loyalty at your local supermarket, what you are being offered is for that company to conduct surveillance on you, and then influence your behaviour through nudges (discounts that will encourage you to buy certain products). An example of power doing things to us that we don’t want it to do is when Google records your location on your Android smartphone, even when you tell it not to.

Both types of power can also be seen at work at a more general level in the digital age. Tech constantly seduces us into doing things we would not otherwise do, from getting lost down a rabbit hole of videos on YouTube, to playing mindless games, or checking our phone hundreds of times a day. The digital age has brought about new ways of being in the world that don’t always make our lives better. Less visibly, the data economy has also succeeded in normalising certain ways of thinking. Tech companies want you to think that, if you have done nothing wrong, you have no reason to object to their holding your data. They also want you to think that treating your data as a commodity is necessary for digital tech, and that digital tech is progress – even when it might sometimes look worryingly similar to social or political regress. More importantly, tech wants you to think that the innovations it brings into the market are inevitable. That’s what progress looks like, and progress cannot be stopped.

That narrative is complacent and misleading. As the Danish economic geographer Bent Flyvbjerg points out in Rationality and Power (1998), power produces the knowledge, narratives and rationality that are conducive to building the reality it wants. But technology that perpetuates sexist and racist trends and worsens inequality is not progress. Inventions are far from unavoidable. Treating data as a commodity is a way for companies to earn money, and has nothing to do with building good products. Hoarding data is a way of accumulating power. Instead of focusing only on their bottom line, tech companies can and should do better to design the online world in a way that contributes to people’s wellbeing. And we have many reasons to object to institutions collecting and using our data in the way that they do.

Among those reasons is that institutions do not respect our autonomy, our right to self-govern. Here is where the harder side of power plays a role. The digital age thus far has been characterised by institutions doing whatever they want with our data, unscrupulously bypassing our consent whenever they think they can get away with it. In the offline world, that kind of behaviour would be called matter-of-factly ‘theft’ or ‘coercion’. That it is not called this in the online world is yet another testament to tech’s power over narratives.

It’s not all bad news, though. Yes, institutions in the digital age have hoarded privacy power, but we can reclaim the data that sustains it, and we can limit their collecting new data. Foucault argued that, even if power constructs human subjects, we have the possibility to resist power and construct ourselves. The power of big tech looks and feels very solid. But tech’s house of cards is partly built on lies and theft. The data economy can be disrupted. The tech powers that be are nothing without our data. A small piece of regulation, a bit of resistance from citizens, a few businesses starting to offer privacy as a competitive advantage, and it can all evaporate.

No one is more conscious of their vulnerability than tech companies themselves. That is why they are trying to convince us that they do care about privacy after all (despite what their lawyers say in court). That is why they spend millions of dollars on lobbying. If they were so certain about the value of their products for the good of users and society, they would not need to lobby so hard. Tech companies have abused their power, and it is time to resist them.

In the digital age, resistance inspired by the abuse of power has been dubbed a techlash. Abuses of power remind us that power needs to be curtailed for it to be a positive influence in society. Even if you happen to be a tech enthusiast, even if you think that there is nothing wrong with what tech companies and governments are doing with our data, you should still want power to be limited, because you never know who will be in power next. Your new prime minister might be more authoritarian than the old one; the next CEO of the next big tech company might not be as benevolent as those we’ve seen thus far. Tech companies have helped totalitarian regimes in the past, and there is no clear distinction between government and corporate surveillance. Businesses share data with governments, and public institutions share data with companies.

When you expose your privacy, you put us all at risk

Do not give in to the data economy without at least some resistance. Refraining from using tech altogether is unrealistic for most people, but there is much more you can do short of that. Respect other people’s privacy. Don’t expose ordinary citizens online. Don’t film or photograph people without their consent, and certainly don’t share such images online. Try to limit the data you surrender to institutions that don’t have a claim to it. Imagine someone asks for your number in a bar and won’t take ‘No, thank you’ for an answer. If that person were to continue to harass you for your number, what would you do? Perhaps you would be tempted to give them a fake number. That is the essence of obfuscation, as outlined by the media scholars Finn Brunton and Helen Nissenbaum in the 2015 book of that name. If a clothing company asks for your name to sell you clothes, give them a different name – say, Dr Private Information, so that they get the message. Don’t give these institutions evidence they can use to claim that we are consenting to our data being taken away from us. Make it clear that your consent is not being given freely.

When downloading apps and buying products, choose products that are better for privacy. Use privacy extensions on your browsers. Turn your phone’s wi-fi, Bluetooth and location services off when you don’t need them. Use the legal tools at your disposal to ask companies for the data they have on you, and ask them to delete that data. Change your settings to protect your privacy. Refrain from using one of those DNA home testing kits – they are not worth it. Forget about ‘smart’ doorbells that violate your privacy and that of others. Write to your representatives sharing your concerns about privacy. Tweet about it. Take opportunities as they come along to inform businesses, governments and other people that you care about privacy, that what they are doing is not okay.

Don’t make the mistake of thinking you are safe from privacy harms, maybe because you are young, male, white, heterosexual and healthy. You might think that your data can work only for you, and never against you, if you’ve been lucky so far. But you might not be as healthy as you think you are, and you will not be young forever. The democracy you are taking for granted might morph into an authoritarian regime that might not favour the likes of you.

Furthermore, privacy is not only about you. Privacy is both personal and collective. When you expose your privacy, you put us all at risk. Privacy power is necessary for democracy – for people to vote according to their beliefs and without undue pressure, for citizens to protest anonymously without fear of repercussions, for individuals to have freedom to associate, speak their minds, read what they are curious about. If we are going to live in a democracy, the bulk of power needs to be with the people. If most of the power lies with companies, we will have a plutocracy. If most of the power lies with the state, we will have some kind of authoritarianism. Democracy is not a given. It is something we have to fight for every day. And if we stop building the conditions in which it thrives, democracy will be no more. Privacy is important because it gives power to the people. Protect it.


aeon.co