Privacy is Power: Review

I don’t do book reviews often. (If you’re counting, that’s two.) But I’m utterly compelled to write about Privacy is Power by Carissa Véliz. I’m truly not prone to hyperbole, but I feel this is one of the most important books of our time. You need to read it. Our privacy is under assault like never before in human history. As this book so clearly explains, privacy is not only a human right, it’s a collective right. Your privacy matters to me, and my privacy matters to you. That’s not just a vague platitude… your contact list, your DNA, your location, your emails… they include information about me, too (or someone you know, anyway). I need you to stop giving away so much damn data. Even if you’re not compromising me directly, you’re supporting and perpetuating surveillance capitalism. Okay… sorry, this gets me worked up. Let’s talk about the book.

Overview

Let me just say that I wish I could have written this book. As an author and privacy advocate, that’s perhaps the greatest compliment I can give it. But it’s not just because it’s well organized, well researched and well written. It’s also eloquent and compelling. I don’t know how anyone could read this book and not come away convinced that privacy is indeed power, and that we’re ceding that power to corporations and governments every minute of every day. And yet, this book is also full of hope for the future – including concrete steps we can all take to reclaim that future.

“Surveillance threatens freedom, equality, democracy, autonomy, creativity, and intimacy. We have been lied to time and again, and our data is being stolen to be used against us. No more. Having too little privacy is at odds with having well-functioning societies. Surveillance capitalism needs to go.”

Like many other technologists, I tend to focus too much on technical solutions to these problems – even the ones that were caused by technology. Carissa Véliz’s background is in philosophy and ethics, and she brings to the problem a very human (and humanist) perspective that is sorely needed – and often missing from today’s dialogue around legalisms and algorithms.

Data Vultures

The first chapter of the book follows an average person through a typical day, cataloging the myriad ways in which their data is harvested along the way. My first reaction as I read through this chapter was: wow, this fictional person is really clueless. These are all worst-case scenarios. Surely all of these things wouldn’t happen to…

And then it hit me. I’ve been in privacy-protecting mode for so long that I couldn’t help but think about all the simple things this fictional person should have been doing to prevent the data mining. I couldn’t grok it. But I’m not an average consumer. I’ve seen the man behind the curtain. I’ve taken the red pill.

Even for people who are vaguely aware that some companies are mining some data, most of this process is completely opaque – and that’s very much on purpose. Look at all the furor over Apple simply exposing tracking in iOS, before the feature even shipped. All Apple is trying to do is bring tracking to light and obtain informed consent. The new system (aptly named App Tracking Transparency) doesn’t prevent tracking – it just asks your permission first. And yet advertising companies like Facebook and Google know that when given a clear choice, people will just say no. That was never supposed to happen.

Companies claim that they anonymize and aggregate your data to protect your identity. But it’s often trivial to re-identify people from a data set, particularly when you can correlate it with other data.
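
To make that concrete, here’s a toy sketch of a classic “linkage attack” – joining an “anonymized” data set against a public one on shared quasi-identifiers. This is my illustration, not the book’s; every name and record below is fabricated. Latanya Sweeney famously showed that ZIP code, birth date, and sex alone are enough to uniquely identify a large majority of Americans.

```python
# Toy linkage attack: re-identify an "anonymized" data set by joining it
# with public auxiliary data on quasi-identifiers. All data is fabricated.

# "Anonymized" medical records: names removed, quasi-identifiers kept.
anonymized = [
    {"zip": "02138", "birthdate": "1945-07-21", "sex": "F", "diagnosis": "hypertension"},
    {"zip": "02139", "birthdate": "1982-03-02", "sex": "M", "diagnosis": "asthma"},
]

# Public voter roll: names *and* the same quasi-identifiers.
voter_roll = [
    {"name": "Jane Doe", "zip": "02138", "birthdate": "1945-07-21", "sex": "F"},
    {"name": "John Roe", "zip": "02139", "birthdate": "1982-03-02", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "birthdate", "sex")

def key(record):
    """Project a record onto its quasi-identifiers."""
    return tuple(record[field] for field in QUASI_IDENTIFIERS)

# Index the public data by quasi-identifiers, then join.
index = {key(voter): voter["name"] for voter in voter_roll}
for record in anonymized:
    name = index.get(key(record))
    if name:
        print(f"{name} -> {record['diagnosis']}")  # re-identified!
```

Notice that no names were ever “leaked” – the join alone does the damage. Stripping names is not anonymization.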

How Did We Get Here?

It wasn’t always this way. In the next chapter, the author walks us through the evolution of the data-driven business models of modern tech companies. It took a series of business decisions – and one national tragedy – to bring us here.

Advertising has been “dumb” for centuries. But at some point, the folks at Google and Facebook realized that they were sitting on a gold mine of data – your data. The author refers to this as “turning digital exhaust into gold dust”. The classic lament in marketing, often attributed to John Wanamaker, goes something like: half the money I spend on advertising is wasted; the trouble is, I don’t know which half. Google and Facebook decided they could “fix” that “problem”. By virtue of their privileged positions, they knew a lot about us (even people who didn’t use their services). And if they could convince their customers (hint: that’s not you) that they could use that information to hyper-target their ads, then surely those ads would be worth a whole lot more than dumb, contextual ads. So Google gobbled up several ad companies and added dozens of free services that could mine even more data (including Android, the Chrome browser, and Waze). Facebook started shifting its privacy settings to favor data collection and seducing its users into giving up more and more personal information – not just about themselves, but about everyone they knew, too.

The US government was starting to twitch over this data grab and began to consider curbing these excesses. Then 9/11 happened, and priorities changed radically. The government was supposed to protect its people, and it had failed. So it adopted a mantra of “never again”. No measure was too far-fetched if it meant preventing another attack. Law enforcement and intelligence agencies realized that all of this rich data could be used to monitor people – both citizens and non-citizens. And because the data was being collected by private companies, supposedly with user knowledge and consent, it neatly sidestepped the need for pesky warrants and due process.

Privacy is Power

Unsurprisingly, this chapter is the core of the book. In it, the author lays out a comprehensive case for why asymmetries in knowledge create asymmetries in power. When someone knows a lot more about you than you know about them, it creates a power imbalance. And that imbalance exposes you to being controlled and manipulated.

For personal, intimate relationships, this isn’t a bug – it’s a feature. We intentionally make ourselves vulnerable to a best friend or a lover by sharing personal information, with the understanding that they will reciprocate – and that each party will use that knowledge in the best interests of the other. But you are not Google’s or Facebook’s customer – you’re their product. They have every incentive to milk you for as much personal information as possible. (These companies will claim they don’t sell your data – and technically, they don’t. They sell access to your data; they sell influence over you – which is much more profitable.)

But it’s worse than that. The knowledge asymmetry with big tech is largely opaque to you. Not only are they collecting all your data, but you don’t even know what information they have about you – let alone who they’re sharing it with or how they’re using it. And they’ve very carefully engineered (in the literal sense of the word) the system to prevent you from understanding the breadth and depth of what they know about you. This is tech’s “hard power”: they take direct steps to mine your data and to prevent you from stopping them.

And it still gets worse. Because it’s not just that they’re using your data (and inferences drawn from that data) to show you ads for products and services they believe you will buy. They’re actually using your data to influence what you want to buy – even who you should vote for. This is tech’s “soft power”: the ability to cause us to do things that we believe we’re doing for our own benefit when it’s actually for theirs. The engagement algorithms keep us “doom scrolling”, keep us “liking” more things, keep us watching the next video… and seeing more ads, giving up even more information, and dragging other people into the system.

One popular idea is to treat data as personal property. Data is the new oil, they’ll say. Proponents argue that we should enforce this ownership in a way that allows individuals to sell their data directly, if they wish – in effect, to compensate them fairly for this information. Bring the trade of information out into the light. Be explicit about it. Institute a sort of free-market approach to the problem. But the author explains why this approach rests on a faulty assumption.

“Our interdependence in matters of privacy implies that no individual has the moral authority to sell their data. We don’t own personal data like we own property because our personal data contains the personal data of others. Your personal data is not only yours.”

The author goes on to deftly argue that privacy is a collective endeavor, and necessary for a democracy to function. If we cede our privacy to corporations, we end up with plutocracy. If we cede our privacy to governments, we end up with some sort of authoritarian rule.

“Privacy is important because it gives power to the people. Privacy is a public good, and defending it is our civic duty.”

There are so many other great ideas (and quotes) in this chapter – I can’t do it justice here. But she makes several powerful and poignant arguments about why privacy is so much more than most people think it is, and why it’s something we all need to fight to preserve for our collective good.

Toxic Data

In the fourth chapter, the author argues that data should not be treated like gold or oil, but more like nuclear waste. It’s a liability, not an asset. Stolen or abused data can ruin…

  • individual lives (e.g., Ashley Madison breach)
  • democracy (e.g., Cambridge Analytica)
  • corporations (high fines, lost reputation, even loss of business)
  • society (stoking tribalism and hate, preventing reasoned debate on shared facts)

Just this week, over half a billion Facebook users’ data was leaked. The only data that can’t be stolen, leaked or abused is data that you don’t have in the first place.

In this chapter, the author drives this point home with two very different scenarios of the Nazis hunting down Jews during WWII. In the Netherlands, meticulous records were kept of citizens’ personal information, including religion; not so in France. You can see where this is going. It’s just one of many excellent arguments for why personal data collection has to be minimized. We can’t just look at the best possible outcomes when we’re evaluating the value of personal data collection; we must also consider the worst-case scenarios.

Pulling the Plug

Believe it or not, this book is full of hope and optimism. The first step to solving any problem is recognizing that you have one; the second is understanding its nature. Only then can you address it. In the fifth chapter, the author outlines several promising approaches – some technical, some political, and some philosophical. We need to change how we look at and think about personal data, and our notion of privacy. We need to understand that not all data is personal, that some personal data can be made impersonal, and that data can still be gathered and used for the general good.

We also need to disabuse ourselves of the notion that all regulation is bad – that it “stifles innovation” or “kills jobs”. There’s a reason you can safely eat food, take medications, drive a car, and fly in an airplane. Those products and services are regulated. We’ve decided, as a society, to establish some basic safety standards and appoint knowledgeable agencies to enforce them so we (as individuals) don’t have to. Regulations also level the playing field, allowing startup companies to compete with large, deep-pocketed incumbents. Any game worth playing has rules.

Furthermore, we have come to agree that even in a “free market” society, the buying and selling of some things is just wrong.

“Even in the most capitalist of societies we agree that certain things are not for sale – among them are people, votes, organs and the outcomes of sports matches. We should add personal data to that list.”

We’ve also collectively agreed that because knowledge is power, we have to regulate the sharing and use of personal data in specific situations where that data could be abused. We have very strict laws around medical and financial information, for example. Without them, our doctors and financial advisors could blackmail us or ruin our lives through careless disclosure. We entrust this information to them because we know they are legally bound to use it for our sole (or at least primary) benefit. They act as fiduciaries. We need all personal data to be subject to fiduciary responsibility – and that should cover metadata and the inferences drawn from our data as well.

This chapter discusses several interesting and promising technologies that can either de-personalize data or allow processing of aggregate personal data without exposing the individual records. Data itself isn’t bad, and society can gain many important and useful insights by collecting and analyzing it. But we have to minimize what personal data we collect, carefully protect that data, find safe ways to process it, and then purge it as soon as possible. It can be done. We just have to take a very different approach than we have to date.
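
The book stays at the conceptual level, but differential privacy is one well-known technique in this family. Here’s a minimal sketch (mine, not the author’s – the data is fabricated) of a differentially private count: it answers an aggregate question while calibrated random noise masks any individual’s contribution.

```python
# Minimal sketch of a differentially private count (illustration only).
import random

def dp_count(records, predicate, epsilon=0.5):
    """Return a noisy count of records matching the predicate.

    Adding or removing one person changes the true count by at most 1,
    so Laplace noise with scale 1/epsilon gives epsilon-differential
    privacy for this query.
    """
    true_count = sum(1 for r in records if predicate(r))
    # Difference of two Exponential(epsilon) draws is Laplace(0, 1/epsilon).
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise

# Fabricated example: how many users in a data set are smokers?
users = [{"id": i, "smoker": i % 3 == 0} for i in range(1000)]
print(dp_count(users, lambda u: u["smoker"]))  # close to 334, but noisy
```

An analyst still learns that roughly a third of users smoke; what they can no longer learn is whether any particular user is in that third.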

What You Can Do

The final (main) chapter is fairly short, but still chock-full of important steps we can all take to effect change. This sentence sums it up well.

“To change our privacy landscape, we have to write about it, persuade others to protect their and our privacy, get organized, unveil the inner workings of the abusive system that is the surveillance society, support alternatives, envision new possibilities, and refuse to cooperate in our own surveillance.”

The chapter walks through several specific ideas, many of which I cover in my own book. I won’t go through them all here, but when (not if) you read the book, you’ll find you can easily accomplish many of them on your own. One I’d like to call out in particular is paying for privacy. We’re already paying for Google and Facebook – the price is our privacy, both individual and collective. We must directly support privacy-respecting products and services by paying for them. This signals to Google and Facebook that there’s money to be made without mining data. It tells our elected representatives that we care about privacy. And most importantly, it funds the companies that are doing the right things and incentivizes the creation of more such products and services.

The book wraps up with a compelling call to action: refuse the unacceptable.

“Do not submit to injustice. Do not think yourself powerless – you’re not. […] The whole digital economy depends on you. On your cooperation and assent. Do not tolerate having your right to privacy violated. […] Privacy is a right for good reason. Defend it.”

Conclusion

I’ve really only scratched the surface of this book. There are so many other memorable quotes and important concepts that I haven’t covered. I’ve read and written a lot about privacy. I’ve taught classes on it. I discuss it every week in my podcast. But even I learned new things from Privacy is Power – in particular, I came away with a clearer understanding of privacy’s importance to society.

I hate to end a review of this book with a quote from someone else, but to me it clearly summarizes one of the book’s core tenets.

“Nobody needs to justify why they ‘need’ a right: the burden of justification falls on the one seeking to infringe upon the right. But even if they did, you can’t give away the rights of others because they’re not useful to you. More simply, the majority cannot vote away the natural rights of the minority. Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.”

– Edward Snowden

Privacy is a human right and it’s a collective good. Freedom isn’t free, and neither is privacy. It’s something we have to appreciate and fight for on a daily basis – in small, personal ways and in broader, shared ways. If we don’t, we will surely lose it. But all is not lost, not yet. We can fix this, and we must fix this.