“Data, Technology, and the Power of Privacy” Why? Radio episode with guest Carissa Véliz

3:37AM Feb 13, 2023

Speakers: Jack Weinstein, Announcer, Carissa Véliz

Keywords: people, privacy, data, democracy, power, companies, life, buy, philosophy, autonomy, person, question, philosophical discussions, instance, students, weinstein, book, apps, listening, case

DISCLAIMER: This transcript has been autogenerated and may contain errors; do not cite without verifying accuracy. To do so, click on the first word of the section you wish to cite and listen to the audio while reading the text. If you find errors, please email us at whyradioshow@und.edu. Please include the episode name and the time stamp where the error is found. Thank you.

Why? Philosophical Discussions About Everyday Life is produced by the Institute for Philosophy and Public Life, a division of the University of North Dakota's College of Arts and Sciences. Visit us online at whyradioshow.org.

The original episode can be found here: https://wp.me/p8pYQY-jug

Hi, I'm Jack Russell Weinstein, host of Why? Philosophical Discussions About Everyday Life. On today's episode we're asking Carissa Véliz about the power privacy has and the ways our technology undermines our sphere of personal freedoms. Please visit whyradioshow.org for our archives and show notes, and to support the program. Click donate in the upper right-hand corner to make your tax-deductible donation through the University of North Dakota's secure website. We exist solely on listener contributions.

And now our guest. Carissa Véliz is Associate Professor at the Faculty of Philosophy and the Institute for Ethics in AI, and a Tutorial Fellow at Hertford College, Oxford University. She's the author, most recently, of Privacy Is Power: Why and How You Should Take Back Control of Your Data. Carissa, welcome to Why?

Thank you so much, Jack. It's great to be here.

If you'd like to participate, share your favorite moments from the show and tag us on Twitter, Instagram, and Facebook; our handle is always @whyradioshow. You can always email us at askwhy@und.edu. Listen to our previous episodes for free, and learn more, at whyradioshow.org. Also, if you can, please rate us on iTunes and Spotify, the various networks that I just complained about, so that others can find our show. And we'd very much appreciate it if you'd help us make our fifteenth season even better by donating at whyradioshow.org. All right, so with all that out of the way: Carissa, why is a philosopher interested in privacy and the little boxes we check when we sign up for websites? Is this more of a problem for a lawyer?

Privacy is a very interesting topic, because it's a problem for many disciplines: of course for lawyers, of course for computer scientists and engineers, and anthropologists and sociologists, but also for philosophers. There are many philosophical questions about privacy. One of them is simply: how do we define privacy? It's actually much harder than it might seem at first glance. And how do we know when somebody's privacy is lost, and how does that relate to wrongs related to privacy? So a huge field is, of course, the ethics of privacy. One thing is to lose privacy, but another thing is to have your right to privacy violated, and there are questions about what that entails. And what our privacy-related duties are is very much part of ethics, and of course, that's part of philosophy.

You belong to the Institute for Ethics in AI, which I assume is artificial intelligence. Yet most of our conversation about this sort of thing is rooted in the AI discussion: Isaac Asimov's rules for robots and stories of Terminators and things like that. Is this conversation about privacy the precursor to that, or is it a separate conversation? How do we think about this discussion of data and privacy separate from the science fiction narratives that we're so sucked into?

I think a good account of privacy can give a plausible account of very old-school cases, like, you know, the proverbial peeping Tom, and at the same time very new cases, like companies collecting our personal data, because at the end of the day, both of those are related to privacy. So I think it's a discussion that is much older than AI and big data, but that is enlivened by what's happening today, because we are collecting much more personal data than ever before in history, and we have many more tools to analyze it. So that creates a risk that we have never seen before, and that makes it a very interesting historical moment from the perspective of ethics and political philosophy.

You tell a story in the book about data science students picking random people, learning about their lives, and then verifying the information. Can you tell that story? Because it's shocking enough that I think it sets up the problem for us very well.

It is quite shocking. A lot of the time when I talk about privacy, ordinary people often say, well, I'm just a normal person, I have a really boring life, and nobody's interested in me. And it turns out that that's false; there are all kinds of companies, insurance companies and banks and prospective employers, who might be interested in them. And one example is how, when data scientists are training, one task that they might be given by their instructors is to just choose a random person anywhere in the world and find out everything there is to know about them. So in this case, a data scientist chose some random guy from Virginia and learned that he's diabetic, what kind of car he drives, where he lives, that he's having an affair, that he's afraid of losing his job, how much he owes the bank, what his health records look like, what he searches for at night when he is alone, all kinds of things that are incredibly personal. And this person will never know that somebody has revealed these very sensitive data points about him to someone somewhere around the world, just because they were practicing their skills.

Right. So they pick this guy at random, they find out all of his secrets. But he's not some random number; he's a real person. And then, if I recall correctly, they call him up and pretend to be a survey or something and verify the information. Is that the same person?

Oh no, this is a different case. This is what Cambridge Analytica did. Cambridge Analytica was a political firm that was hired by campaigns and that used personal data for the purposes of trying to sway elections. This was a company that was involved with the Trump campaign and with the Brexit campaign, among many other campaigns around the world. And they were using data, especially from Facebook, to try to figure out people's personalities and their details, and to see if they got it right. And in one case they were showcasing their abilities, if I remember correctly to Steve Bannon, and this is what they did: they just chose a random person, they showed how they knew everything about her, and then they called her up right then and there and pretended to be researchers from the University of Cambridge, just to verify that they had gotten things right. And indeed, they had it spot on.

As a philosopher, when I read that story, and you use this language very specifically in the book, you made me reconsider what it meant to be a nobody. I mean, I have always said, look, no one's interested in a philosophy professor in North Dakota. Who cares if someone learns my sexual fantasies? Well, you know, whatever; I'm not particularly abnormal in my life. But you really made me challenge that vision of being a nobody. Why does the concept of a nobody not mean anything anymore in this context? Why is every individual important, other than that we're all important to ourselves?

Well, there are a few reasons. One is that you are a connection to other people. So even if you think, okay, there's nothing interesting about you, and let's say that you're not afraid of being extorted or publicly shamed or discriminated against, or all these kinds of privacy harms that can happen, you're still a connection to other people. You're somebody's neighbor, you're somebody's professor, you're somebody's partner, you're somebody's father. By giving up your privacy, you are giving away access to all these people, who haven't consented to others having that kind of access to them. That's one reason. Another, connected reason is that we are a community. So even if you don't care about yourself, and even if you don't care about your family, and friends, and students, and neighbors, and so on, you might still care about democracy. When you give access to companies that are trying to sway elections, you're actually jeopardizing your community. But more importantly, I think the reason why this argument about being nobody is not very effective is because if you really were nobody, if you were that uninteresting and that unimportant, then companies wouldn't bother, wouldn't take so much trouble to learn so much about you. The reason that they do is because you are important: because you have a voice, because you have a vote, because you have money that you spend, because you have attention that you spend, and because you have access to all these other people. So in fact, you have a lot of power. And by giving up your privacy, you're surrendering that power to companies that don't have your best interests at heart, and don't have the best interests of your friends and family, and don't have the best interests of society.

So let me spin a scenario, because you were talking about the connection I have with my students, and I wasn't even thinking in those terms when I was reading the book. So I have, let's say, 100 students this semester. And one of the students, let's say a female student, sends me an email and says, I'm not feeling well, I'm going to miss class for a couple of days. I say fine; I don't think anything of it. Well, all of our emails are public information because of sunshine laws in North Dakota. Someone could come and get that email and say, oh, I wonder what she was sick with, and then use her name and the IP address to track her down, discover that she wasn't just sick but that she went down to Fargo to have an abortion, track her behavior and her politics, and then either use that data or even expose her, because maybe she wanted to keep the abortion secret from her family or other people, and even make her a political tool, all without her consent. Is that far-fetched, or is that perfectly plausible?

It's perfectly plausible. And one of my concerns is that we as professors, but also this happens with lawyers, it happens with doctors, we have a duty of care towards our students, or clients, or patients. And when we give so much power to companies, we're no longer able to fulfill that duty. Because when I use a company, say Microsoft, or whatever other company your university uses, to email my students, I'm not in control of that data anymore. So if they tell me something very sensitive in an email, I can't protect them. And that is, I think, one of the ways in which we can think about it: it is breaking down part of the social contracts that have been part of our society and our way of life for a long, long time.

So now I imagine someone coming along and saying, well, okay, this student was victimized in this way, but the more we are open about how many people actually have abortions, the more we won't be embarrassed by them. And we are a culture of exhibitionists anyway; you know, I'm sure that student has pictures of herself in a bikini on Instagram or something like that. The very thing that we call privacy is almost obsolete; our definitions of privacy have changed. Does that hold water? I mean, are we no longer a private culture in that way?

I think we have become much more a culture of exposure. But that doesn't mean it's right, and that doesn't mean that it doesn't have a cost. And I find the argument that the more transparent we are, the more everybody's going to forgive each other for our mistakes and the more equal society will be, to be completely false. If you look at evidence, for instance, with race: the fact that race is visible doesn't make it any less likely for someone to be discriminated against. And one thing to note is that privacy issues are very related to power, and power differentials, and power asymmetries, such that the most vulnerable in society are the ones who bear the brunt of it. So it's not going to be the white wealthy man who is going to suffer the most from this kind of culture of exposure; it's going to be the vulnerable young woman. And we see this all over the internet. We see that young women, especially those who are working, for instance, as politicians or as journalists, get much more harassment than your average man. And that's a huge problem, because we're failing to protect the most vulnerable and missing out on the contributions that those people could make to society.

And I'm not sure how much the general public is aware of this, but this is so extreme that many women who just play video games with other people don't like to reveal that they're women, because, you know, they're playing Fortnite or something, and they get harassed, and then people track them down. Being a woman on the internet is even more visible, in a certain sense, than walking down the street, because the number of people you're exposed to is exponential at any one moment, right? This is not a scenario of one person being picked out of the crowd. This is endemic, is it not?

Yeah, exactly. There have been many women who have talked about how harassment online led them to drop out of politics, for instance, or to stop being journalists, or to stop writing. And that's why I have a paper called "Online Masquerades," which is open access, in which I argue that we should be using many more pseudonyms online, precisely to protect people who would otherwise be persecuted for different reasons.

Would you talk a little bit more about this phrase, "culture of exposure"? You use it in the book, and it's super interesting. What does it designate, and why is it so important?

So I got the inspiration for this term from Thomas Nagel's excellent paper, "Concealment and Exposure." And the idea is that different cultures at different times can be more or less privacy-conscious, and that has effects for how democracy works, how politics works, how power relations work, and for well-being. Lately, with the coming of the internet and the development of social media, and a lot of narratives peddled by tech as to how, you know, privacy was something of the past, we don't need it anymore, and so on and so forth, for a while we got sold on that idea, and we became much more of a culture of exposure. Meaning that not only is it very common to expose things that used to be considered quite private, like where you live, what you ate, who you're hanging out with, what you're doing on a Friday night, whatever it might be, but also that people are expected to expose certain things, and if you don't, there's a kind of pressure on you to do so. One example is to speak your mind on a political matter. There's this kind of mentality that if you don't speak your mind, it means that you're not with us, and if you're not with us, you're against us. And this pushes people to form an opinion, or to express an opinion, that they might not be sure about, or that they might be uncomfortable expressing, and it polarizes the public sphere, because it creates unnecessary conflicts. We don't have to say what we think all the time, particularly if it's negative things about others, and a culture of exposure creates these unnecessary conflicts.

When we come back from the break, I want to pull this thread a little more. I'm going to pose a thought experiment, and then I want us to transition into this discussion of power, the power asymmetries, and all the different types of power that privacy and data involve. But for the moment, you are listening to Carissa Véliz and Jack Russell Weinstein on Why? Philosophical Discussions About Everyday Life. We'll be back right after this.

The Institute for Philosophy and Public Life bridges the gap between academic philosophy and the general public. Its mission is to cultivate discussion between philosophy professionals and others who have an interest in the subject, regardless of experience or credentials. Visit us on the web at philosophyandpubliclife.org. The Institute for Philosophy and Public Life: because there is no ivory tower.

You're back with Why? Philosophical Discussions About Everyday Life. I'm your host, Jack Russell Weinstein, and I'm talking with Carissa Véliz about privacy, data, the internet, and the power at stake. I want to start with a thought experiment that I give my students, and a question that comes out of it. We'll talk about the internet, and we'll talk about behavior, and I'll remind them, or I'll tell them, since they're very young, that when MySpace first came out, it was anonymous. People didn't necessarily include their own images, they had a different name, it was a more hidden identity. And people would act a certain way and reveal things about themselves in secret that they wouldn't have otherwise done, or they'd play a role and experiment with themselves. Then Facebook came along, and the thing that made Facebook different was that everyone had to use their real name; they had to have pictures of themselves and they had to be connected to their community, originally just college students, and then everybody. I have friends from kindergarten that I literally haven't seen in 40 years. And people would act differently on Facebook; there was a pressure from the community to behave themselves and be a certain way. And the question I ask my students, and the question I'm going to ask you, Carissa, is: which one is the real you? Are you the person who is hidden by anonymity, and exercises your fantasies and pushes your boundaries and gets to play who you quote really want to be? Or are you the real you when you are governed by your reputation and your consequences and your community norms? When you're anonymous online or exposed, Carissa, do you have a sense? Is one of them the real you?

I don't think so. I think that question is slightly misleading, because it seems to assume that there is a real you, or that the real you is just one and the same, very constant across contexts. And I think that doesn't really pan out. By the way, it's an argument that Zuckerberg has used to justify Facebook. He says that when people have integrity, they behave the same way with everyone, and that the more public everything is, the better, because that will kind of push people into having integrity, because they have to behave the same with regard to different audiences. But consider the following example. The way I am with my students is very different from the way I am with my partner or my family, and I think everybody's grateful for that. With my family, I can be very goofy; if I were to behave like that in the classroom, it might be inappropriate. With my best friends, I share things that I wouldn't share with my students. So I think that it's very important to have different relationships with different people, and privacy is very important in sustaining those different kinds of relationships.

How much of this is our own doing, in the sense that, yes, we can talk about the evil corporations, but we are complicit? A former student of mine, who has become a friend, was telling me once about his dating life. He had met someone on one of the apps, and she had offered to send him nudes. And he didn't want them. He said that, you know, it just wasn't his thing; he'd rather meet in person, blah, blah. And she got really mad. And she said, I want you to be a man; why don't you want my nudes? And obviously, it didn't work out. But there's an instance where the very thing that especially the older generation criticizes people for, you know, never put anything online that you don't want everyone to see, here's someone who's insisting on it. It's very easy to look at that and say, look, bad things only happen to people who do bad things; they set themselves up. How true is that? And how complicit are we in the loss of our privacy?

That's a very interesting case. I think we are somewhat complicit; there's a lot of room for improvement. For instance, one of the things that I have written about is how we look at influencers, and that is related to your past question about identity: we judge them as if there were some authentic character there, instead of looking at them as if they were actors, or publicity, or marketing. And in that way, we're like clueless tourists who think that the English are really polite and that the Americans are really friendly. So I think there's some truth to that, that there's a lot of improvement we can make through culture. But at the same time, I find it a bit harsh, and probably untrue, to put so much burden on the shoulders of particular individuals. Because, of course, I don't know the story of this woman, but the culture that she might have grown up in is one that tends to objectify women, that tends to peddle the narrative that if you're not wanted as a sex object, then there might be something wrong with you, or that maybe you're not even loved or appreciated in the way you should be. So it's really complicated, and a lot of things come into play. Something also to bear in mind is what the philosopher Michel Foucault argued: that power creates certain kinds of subjectivities, and it manufactures certain kinds of desires. So very often, desires that we find ourselves having might not be the product of, say, reflection, or of values that we endorse; they might be the product of companies having a very strong interest in us having certain desires, and using all their power and might to inculcate them into us. And when you're sitting there scrolling on your phone, in that infinite scroll on social media, let's say you want to write a philosophy paper, but instead you're hooked on this infinite scroll, what you are up against is a whole army of engineers and designers and psychologists who are trying to hijack your psychology by giving you dopamine hits. And it's really not easy to overcome all that power.

Does this undermine our sense of free will? I mean, there's been a conversation for half a century about manufactured consent, about the way that advertisements create needs and desires that don't exist before the advertisement. How much of this comes down to that deterministic discussion of the loss of freedom we have when we are governed by those dopamine hits, when we are governed by the need to be passive, when we're reacting? Is that part of the conversation in digital ethics right now as well?

Yes, it is, and I think there should be more of a conversation. One of the things I have argued in my work is that we should look to medical ethics to learn from its experience, not only its successes but also its failures. And one of the things that we see in medical ethics is a progression from no respect for patient autonomy to a lot of respect for patient autonomy. So if you went to the doctor in the 1890s, your duty as a patient was to follow the orders of the doctor without asking any questions. And now, of course, we think that is outrageous, and the job of the doctor is to enhance your autonomy by giving you different options, by explaining the diagnosis and the opportunities available to you. In the same way, I think at the moment in AI ethics we're seeing a complete disregard for the user's or the citizen's autonomy. There's no way to negotiate the terms of the apps we use, and nobody's asking me whether this is what I really want. And I think that as digital ethics and AI ethics develop, we will see more and more respect for people's autonomy, and more and more ways to encode respect for people's autonomy into design, to design tech in a way that enhances our autonomy and doesn't undermine it.

And there are multiple levels to this lack of autonomy, right? There's the lack of negotiation that you talked about: you either accept the terms of service or you don't, and the terms of service are, at minimum, incredibly long, if not in legalese; you're sort of trapped if you want to use the product. But also, listeners who subscribe to the radio show's newsletter will know that about two years ago, YouTube just deleted my radio show and Institute account. I had 50 videos that had, you know, hundreds of thousands of views, and one that had gone viral and led to all sorts of stuff. And then one day it was gone, and I had no idea why. I didn't violate any terms of service; as far as I know, I didn't violate copyright; there was nothing obscene or anything like that. And not only was I unable to get in touch with anyone at YouTube, but when I followed all the procedures, all I would ever get was "this account doesn't exist." So there was no redress, there was no ability to appeal anything, and I just had to move to a different platform, which, you know, we now pay a lot of money for. What does it mean, what can it mean, to be autonomous, to have agency, in the face of this sort of powerlessness? Is the idea of autonomy just gone? Is that just no longer on the table?

I think that's too radical. Of course you have certain amounts of autonomy; for instance, you were able to open a new account with a new service. But our autonomy is very limited by these companies, and what worries me is that these companies aren't really answerable to anyone. The story that you tell is a very common story, and it just shows how easy it can be to silence someone whom the tech companies, or whoever else, might find inconvenient. It's really Kafkaesque, in a way.

That's exactly what it was. Your book is called Privacy Is Power, and we've been using this word throughout. What does power mean? And would you talk a little bit about the distinction between hard power and soft power, and how that explains the discourse that we're having?

Yeah. So power, very roughly, is the ability that someone has either to do things to another person or to have them do things that they would otherwise not do. That's power in a kind of social way; of course, you can have power over natural resources and so on. And the distinction between hard power and soft power is that hard power is when somebody either does something to you, like they can beat you up, or they make something happen that there's no way you can resist, no way you could have done otherwise or stopped it from happening to you. One form of hard power, of course, is violence, but there are others; code can be hard power. Imagine that you have a car, and that it's a smart car, and the company has access to that car in different ways. And one day they just turn your car off, they just turn it into a brick, and there's no way you can turn it on. That's hard power as well. Now, soft power is the power to seduce you, or to mislead you, or to manipulate you into doing things that you would otherwise not do. So the infinite scroll is a great example. It uses certain kinds of designs, like bright colors, like notifications, like likes, and so on, to get you hooked on this infinite scroll, even though you might not endorse that as part of your values. You might say, oh, this week I really want to read a book, or spend time with my family, or write, and instead you just spend your time on the infinite scroll.
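To make the "code as hard power" point concrete, here is a minimal, purely hypothetical sketch in Python. The class names and the remote kill switch are illustrative assumptions, not any real manufacturer's system; what it captures is that the decisive switch lives on the company's side, with no local override for the owner.

```python
# Illustrative sketch only: code as hard power. All names are hypothetical.

class SmartCar:
    def __init__(self, vin: str):
        self.vin = vin
        self.enabled = True  # flag controlled remotely, not by the owner

    def start(self) -> str:
        # The owner's intent is irrelevant once the remote flag is off:
        # there is no local override, which is what makes the power "hard".
        if not self.enabled:
            return "Vehicle disabled by manufacturer."
        return "Engine started."


class ManufacturerServer:
    """The company, not the owner, holds the switch."""

    def remote_disable(self, car: SmartCar) -> None:
        car.enabled = False  # effectively turns the car into a brick


car = SmartCar(vin="EXAMPLE123")
print(car.start())                        # Engine started.
ManufacturerServer().remote_disable(car)
print(car.start())                        # Vehicle disabled by manufacturer.
```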

I'll go back to soft power in a minute, but you talked about code as hard power, and I'd like to pull that thread a little bit. Because, of course, the way that websites are designed, they make people choose a gender, they ask for certain information and not other information, they put pop-ups on your screen. To what extent is the infrastructure of the internet, the infrastructure of our technology, compelling us to act in a certain way? How much do these interfaces define who we are?

Yeah, they have a lot of power. And of course, some of them are more respectful of people than others. I'm sure you have seen differences in how, for instance, companies allow you to reject cookies. In some cases it's very easy, and in some cases it's impossible, or so difficult that it's just unreasonable. There's a great website called Dark Patterns, and it is a collection of examples of how platforms manipulate the design of their websites to get people to do what they don't want to do. So, for instance, when you're rejecting cookies, the button to reject them might be light gray and very hard to see, and you might have to click on a hundred different buttons to reject them, because there's one for every company, while the button to accept cookies might be green and big and very central. Most people, just out of reflex, click on the green button to get on with it.
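As a rough illustration of that asymmetry, here is a small Python sketch. The vendor count and the click model are hypothetical assumptions, chosen only to show how the cost of refusing consent can scale with the vendor list while accepting stays at a single click.

```python
# Toy model of consent-banner click costs. All numbers are hypothetical.

NUM_AD_VENDORS = 100  # hypothetical length of the banner's vendor list


def clicks_to_accept_all() -> int:
    # One big, prominent button accepts everything at once.
    return 1


def clicks_to_reject_all() -> int:
    # A low-contrast "manage options" link, then one toggle per vendor,
    # then a confirmation: refusal scales with the vendor list.
    return 1 + NUM_AD_VENDORS + 1


print(f"Accept all: {clicks_to_accept_all()} click")
print(f"Reject all: {clicks_to_reject_all()} clicks")
```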

Even the word "cookies," right, is itself manipulative. I'm going to read from the book this half a paragraph that I thought was fascinating. You're talking about the use of language, and you say: privately owned advertising and surveillance networks are called "communities," citizens are "users," addiction to screens is labeled "engagement," our most sensitive information is considered "data exhaust" or "digital breadcrumbs," spyware is called "cookies," documents that describe our lack of privacy are titled "privacy policies," and what used to be considered wiretapping is now the bedrock of the internet economy. I love that spyware is called cookies. I always thought of spyware as something that you can disinfect, but a cookie itself is inherently spyware. So how much does the language of this stuff define our perception and our reality of it?

Yeah, I think a lot. People tend to think about tech companies as very successful companies at developing tech, but I think that they're even more successful at developing narratives; that's what's really gotten them ahead. And it's interesting also how they have co-opted the language of nature to name things that are completely the opposite of nature. So, you know, "the cloud" makes it seem so ethereal, so non-polluting, so light; and of course there's Apple, and tweets, and streams, and so on. This is not a coincidence. It's a pattern, and it's a very effective way to obfuscate what they do.

Whenever we have this conversation, people tend to emphasize the internet and what happens at our desks and on our phones, and don't talk about the way this infrastructure follows us as we walk or recognizes our faces automatically. Will you talk about how widespread this is, and how artificial it would be to just talk about the internet and our use of it for, sort of, work and entertainment?

It's incredibly common, and it's very obscure. To give you just a few examples: your phone is collecting data about you throughout the day, like where you are, for instance, and it waits until you plug in the phone at night to send the data, because if it sent the data during the day, you would notice that your battery was getting drained. As soon as you wake up, if you have a lot of apps on your phone, you are alerting hundreds of companies that you have just woken up. And the things that can be gleaned from your phone usage include things like how well you sleep, and whether you are searching for something late at night that worries you. Your location data is incredibly sensitive, and people can infer things like whether you've been to the hospital, or whether you've been to a family planning clinic, or whether you're speeding, or whether you might be in a religious temple of some sort, or a church. Other things that are really surprising are, for instance, the use of audio beacons. Companies had this challenge that they didn't know whether it was the same person who was on their phone, who then opened up a web page on their laptop, and who then visited a store. And the way they managed to triangulate that is very often through audio beacons. So imagine that you're at home in the morning, and you're listening to the radio or the TV, and you hear an ad for a product. That ad contains an audio beacon that you can't hear but your phone is picking up. And let's say that you then go on to your computer to search for that product, and then you go to the physical store and buy it. In the physical store there will be music, and the music, again, has these audio beacons that are connecting to your phone. And that way, they can recognize that the person who heard the ad in the morning went to the store and bought that product.
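Here is a minimal sketch of the upload-deferral trick described above, assuming a hypothetical tracker that buffers location pings during the day and only transmits while the phone is charging. Nothing in it reflects any real SDK's behavior; it just shows why the battery cost of the upload would go unnoticed.

```python
# Toy simulation of deferred telemetry upload. Purely illustrative.

import json


class TrackerBuffer:
    def __init__(self):
        self.events = []

    def record(self, lat: float, lon: float, hour: int) -> None:
        # Cheap: appending to a local buffer costs almost no battery.
        self.events.append({"lat": lat, "lon": lon, "hour": hour})

    def maybe_flush(self, is_charging: bool):
        # Expensive: the network upload is deferred until the phone is
        # plugged in (typically overnight), so the drain never stands out.
        if is_charging and self.events:
            payload = json.dumps(self.events)
            self.events = []
            return payload  # in reality: sent to an analytics endpoint
        return None


buf = TrackerBuffer()
for hour in range(8, 23):                  # pings throughout the day
    buf.record(47.92, -97.03, hour)
print(buf.maybe_flush(is_charging=False))  # None: nothing sent during the day
print(buf.maybe_flush(is_charging=True))   # the whole day's trail, at once
```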

So, I mean, this is going to be news for, I think, 99.9% of the people who are listening right now. There are trackers that are invisible to us that will tell corporations that we're the same person, even if we're not giving permission. And they can track us not only store to store, but aisle to aisle; they can use video cameras to see and track our eye movements and which products we look at first or second. And again, in theory, even if you're not connected to the internet, if you have your phone, it's going to record all the different stores you go to and then send the data at night. That's dystopian.

It goes even further than that, because all of that data gets collected, and then it gets sold on to data brokers, who further analyze and collate it and then sell it on to whoever might be interested. And that might be marketing companies, it might be your employer, it might be a prospective employer, your government, insurance companies, banks, all kinds of companies. Those data points are really quite sensitive, and they can have potentially thousands of data points on you. They can know whether maybe you are pregnant, or whether you're thinking of having kids, or whether you're thinking about leaving your job. And all of this without your knowledge, without you having the capability of stopping it or even learning about it, let alone redressing something that might be incorrect. Sometimes these are probabilistic inferences, and they might get something wrong about you. In one case, I remember, a journalist found out that she had been branded as someone who hadn't finished high school, and she had a degree. And there's no easy way to learn about these things, let alone change them. And this is where power comes in. It's not only that companies have the power to get you to do things that you wouldn't otherwise do; they also have the power to decide what counts as knowledge about you, to put you in a box, in a category, and have others treat you accordingly, without your knowledge or consent.

So, devil's advocate: someone comes along and says, okay, all of this is horrible and maybe unjust, but it is all just about buying products. And so if we are recorded buying one brand of soap, or going to Banana Republic instead of, you know, Zara or something, who cares, right? Our commercial activity is so public anyway. Who cares if people are making money off of us and what we buy? We still get to buy what we want. How do you respond to that sense that this is really just about our consumer lives, and so what if people know what we buy?

There are many ways to respond. One is that we could have an equally good system for commercial purposes that isn't so privacy-invasive. And of course, some of the things people buy are incredibly sensitive: we buy medicines, which can make you vulnerable to being discriminated against for having one disease or another, and we buy things that we can be publicly shamed for and extorted over, and so on and so forth. But it's not only about our buying behavior; it's also about our romantic lives and our sexual lives and our political tendencies. The case of Cambridge Analytica is a perfect example of how the most valuable part of your data isn't what you buy or don't buy, but what what you buy says about you, and how that might enable others to manipulate you in different ways. And it's also a huge national security threat, because having all of this personal data sloshing around means that it's very easy to uncover anyone, whether they are someone high up in the military, or an intelligence officer, or the lawyer of an important person, or a politician, indeed. The New York Times had a piece in 2019 about how two journalists who described themselves as not very tech-savvy managed to find out the location of the President of the United States through one of these databases that anyone can get access to, because anyone can buy it. And if the President of the United States is not safe, because anyone can know exactly where he is, then nobody's safe, and the country is not safe.

You tell the story of the cycling and running app Strava, which I'm a member of and use. They have these things called heat maps, which record how many people are running or doing exercise in a particular area; the brighter the colors, the more people there are. Cyclists around here and everywhere will all do the same routes, and there'll be, like, the fastest person up a hill and all this sort of stuff, so it's used as both a social network and a competition. But you point out that people were able to discover a secret military base because of this, because of just their running app. Could you explain how that worked?

Yeah, that is a fascinating example, because the heat maps had been published online for months, and nobody had noticed anything untoward about them. And then one person, looking at people's running routes, realized that there were some routes that seemed to happen in the middle of nowhere. So they triangulated this with Google Maps, and they realized that the area in Google Maps was blurred. And of course, it was blurred because it was protecting secret US bases, and of course everybody tends to run near where they live, including the US military. So something that seemed very innocuous turned out to be quite dangerous for national security, and Strava had to change the way they published heat maps after that.

And the next step, right: you talked about knowing where the President was, you talk about military secrets. Many people argue that Cambridge Analytica is the reason why Trump won the election and the reason why Brexit passed. Can you talk about how that data affected these votes?

Yeah, in different ways. One way was that Cambridge Analytica tried to assess what kind of personality people had. Those who had certain kinds of personalities, who were more prone to being afraid of certain things, more prone to being paranoid, they bombarded with ads that looked like a horror movie, you know, that were designed to inspire fear and a kind of protective reaction. Other people they branded as persuadable; these were people who were undecided about who to vote for. Those they bombarded with propaganda that suggested there was no reason to vote, content that said things like: democracy is broken, all of the candidates are the same, everybody's corrupt. That kind of thing disincentivizes going to vote, which is, of course, a direct affront to democracy. Another interesting part of the puzzle is that, for example, the Trump campaign used six million different ads and distributed them according to people's personalities. And that means that academics and journalists and ordinary citizens didn't have access to many of these ads; the only ads you see are the ones targeted at you. So you get a very different impression of the candidate than your neighbor might, because they have a very different profile. And so conversations about the candidate become very difficult, because everybody's getting a very subjective and very manipulated image, and there's no public discourse that agrees on the facts. In the past, there would be a TV ad for a candidate, and then we could discuss it, and, depending on what people thought, it might change your view about whether it was manipulative, or whether it was true, or whether it was sincere, and so on. And now we don't have that.

And this extends. Cambridge Analytica got in trouble, and it's an example of a bad actor being, at least in some sense, punished. But you quote Eric Schmidt, one of the leaders of Google, saying in 2010: "I actually think most people don't want Google to answer their questions. They want Google to tell them what they should be doing next." Is that as insidious as it sounds?

Yeah. And again, we go back to this idea of not respecting people's autonomy. You know, that quote, if I remember correctly, is from Eric Schmidt, who used to be the CEO of Google. But I remember that Steve Jobs also said something like, and I'm not quoting here, I'm paraphrasing: it's not about what users want, because they don't know what they want, but with amazing design we will kind of show them what they want. And of course, that assumes that engineers and computer scientists know better. But of course, they have very important financial interests that might not align with our interests in well-being, in democracy, and in equality. And one of the concerns I have with everything we've been talking about is that we're not being treated as equal citizens anymore. If you and I are being treated on the basis of our data and being fed different content, and different opportunities, and different prices for different products, then where's equality of opportunity?

You report that only a third of Americans under 35 say that it's vital to live in a democracy in America. And then you follow it up by saying that the number of them who would welcome a military government went from 2% in 1995 to 18% in 2017, right, the year after Trump was elected. When people answer that, do we think they really believe it? Or are they responding to the terror of the six million ads? I mean, what does it mean to have an opinion in the face of all of this manipulation?

I think the extent to which we can be manipulated is enough to make us worry about democracy, because, for instance, an election can be lost or won by a few thousand votes. But I'm not sure that the extent to which we can be manipulated, or have been manipulated, goes so far as to affect that kind of opinion. It might have; I don't know. I think this is partly an empirical question, and I don't have the answer. But I think what it does express is that people are very, very disillusioned with democracy. And that is concerning, because as imperfect as democracy is, the alternatives are very, very scary. And often these are people who have not experienced in any way what a dictatorship can be like. In the past, the previous generation, even if they hadn't experienced it themselves, oftentimes had grandparents who had had some brush with authoritarianism, and that gave them a much fuller perspective on what they might be up against. So I think there's a lot of naivete about what the alternatives to democracy might look like, especially in the age of AI. Because with so much personal data being collected, just imagine what it would be like to have something like a Nazi regime that not only has access to your name, and where you live, and who your parents are, but has access to everyone in your contact list, to everyone you message, to everything that you search for, to anything that you buy, to all your locations in real time. It would be very hard to fight an authoritarian regime like that.

As you can imagine, in Grand Forks, North Dakota, the Jewish community is very small. Occasionally we have faculty who are being recruited who indicate that they're Jewish and are interested in an openly Jewish life. I'm Jewish, as most of my listeners know, and I'm a very public figure in the community, so they'll put those people in touch with me. A few years back, a law professor came up to me during one of the interviews, and she said, do you know that if you type in "Jewish" plus "Grand Forks," your name is the only thing that comes up? And that, on a certain level, is just terrifying, right? Anyone can find the families they want to harass, or worse, and there's no way to hide that. So I guess I want to ask: what do we do about this? Is this an inescapable trap? How do we change both our sense of what the technology should look like and also our activities? Because one of the points that you make is that people regard this data collection as necessary and inevitable, that this is just what progress in technology looks like, and that it's impossible to have the kind of technological advancement and convenience that we have now without this data collection. So, A, why isn't that true? And B, how do we start shifting the paradigm, and how do we start shifting behaviors and regulations and things like that, to counter all of these dangers?

Okay, so first, that's not true, because for the most part this data is being collected to fund the internet and to fund companies, but there are many different business models that don't depend on data collection. One of the examples I give is that already in 2013, Google was an incredibly wealthy company, and if you divided what it earned by the number of users it had, it came to about $10 per year per user. That's what Google earned. And I think, you know, most people who are not in dire straits would be happy to pay $10 a year for access to Google Maps and Gmail and search and so on and so forth. So why do we need this surveillance economy? That's part of the answer to why we don't need it; there's a longer answer in the book, but I don't want to give a lecture about it. And how do we do it? Partly, we need to remember why privacy is valuable. It's very easy to forget why privacy is valuable, because it's something quite abstract. It doesn't feel like anything to have your data collected; you don't bleed, it's not something you can feel, like when you get an infectious disease. And the consequences, even though they can be very bad, and they include losing your job, or being discriminated against, or being extorted, and so on, often happen very far into the future, such that it's hard to connect the loss of privacy to those very bad consequences. You might not get a job, and you might never realize that it's because of privacy you lost down the road, say regarding your health records. So partly it's a cultural thing; we have to realize the value of privacy. And we need much better regulation. There's no way to do this except with regulation, because this is a kind of collective action problem, and there are no individual solutions for collective action problems. That said, what you do as an individual matters a lot, because it expresses your thoughts to policymakers and companies; it incentivizes policymakers to take action and companies to change their ways and to find different business models. And many people think, well, that's not very reasonable or realistic. But actually, we only need about five to ten percent of people to change their behavior, not in any radical way, just enough that it shows up in the data, for companies and governments to get really worried and change their ways. So in general, we have to defend privacy. And the regulation that we should aim for is to ban the trade in personal data. Even in the most capitalist of societies, we agree that there are certain things that should be outside of the market. We don't buy or sell people, we don't buy or sell the results of sports matches, we don't buy or sell organs, and we don't buy or sell votes. And for the same reasons that we don't buy or sell votes, we shouldn't buy or sell personal data.
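The $10-per-user figure is just a division; here is the back-of-the-envelope arithmetic, with round placeholder numbers chosen only to reproduce the quoted result, not Google's actual 2013 financials.

```python
# Back-of-the-envelope per-user revenue. Inputs are hypothetical round
# numbers that reproduce the ~$10/user/year figure quoted above; they
# are NOT Google's actual 2013 financials.

annual_revenue_usd = 15_000_000_000  # hypothetical
active_users = 1_500_000_000         # hypothetical

revenue_per_user_per_year = annual_revenue_usd / active_users
print(f"${revenue_per_user_per_year:.2f} per user per year")  # $10.00
```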

This is going to seem like a stupid question, but given all the things we've talked about, I hope it's not. Can you make the case for privacy for us?

Privacy matters because it protects us from possible abuses of power. And as long as human beings are human beings and institutions are institutions, there will always be a temptation to abuse power. So privacy protects you from harms such as discrimination, extortion, public shaming, and even physical assault. And it protects our society from possible abuses of power as well, like threats to national security, and like threats to democracy from using personal data to sway elections.

So privacy isn't just about keeping our personal secrets; it's about the safety and security and agency of everything around us, right? I mean, we don't think about national security as personal privacy. But given all of the networks and connections and people and the heat maps and all this kind of stuff, it's essential, right? My privacy, in some sense, is a national security interest. Am I getting that right?

Yes, that's absolutely right. And again, I think our obliviousness to this kind of insight comes about because we've been very lucky, and most of us haven't lived through a war. But during the Second World War, part of the point of propaganda films was to make sure that people were not talking to the enemy, and that they weren't telling people who could be spies things that could jeopardize the country.

One point that you make in the book is very important, in part because it's a key precept in the philosophy of law, and that is that we have the right to know the rules that govern our lives. What does that mean in this context? And why is it such an essential critique of the current technological system?

So, for instance, in the philosophy of Rawls, this condition is called publicity. I think one way to answer it is that if we don't have access to the rules that govern our lives, then we end up in situations like the one in Kafka's The Trial, in which we don't know what we are accused of, we don't know how to defend ourselves, and we don't know what the consequences of our actions can be. And that means that we are being ruled in a way that's arbitrary, or at least we can't verify that we're not being ruled arbitrarily. And that seems to go against democracy and self-government and sovereignty. At the moment, many questions are very hard to answer, even for privacy experts: the very basic question of what happens to our data, and what the consequences of giving up certain kinds of data are. And that suggests to me that we are not on a positive trend. The Economist has this Democracy Index, and every year they measure how well democracy is doing, not only in the number of countries that are considered democratic, but also in the quality of democracy within countries. And according to this index, we're not going in the right direction; democracy is worse off than it used to be a few years ago. And I think this tendency for citizens to be less and less well informed about the rules they're being governed by is one instance of an undemocratic trend within democratic countries.

What's the first thing we have to do? Is there a foundational decision that people have to make about big data policy that will start us down a path of preserving, I'll call it data justice, but also preserving democracy and all these other things? What's the first and most urgent thing we have to attend to?

Well, it depends on what you mean by "we," but I think policymakers need to ban the trade in personal data. And as individuals, we need to protect privacy whenever we get the chance, without making our lives too difficult, because I think it's unreasonable to ask people not to have a smartphone, or not to have access to apps that are necessary for their jobs. But many times we do have options. So instead of using WhatsApp, which is owned by Facebook and uses your metadata, use Signal; it doesn't collect your data. Instead of using Google search, use DuckDuckGo. Instead of using Gmail, use ProtonMail. These are free alternatives, and they work very well. And many times there's no reason why you should give more information than is needed.

This has been fascinating, terrifying, and compelling, and it has made me want to learn so much more. So, Carissa, thank you so much for joining us on Why?

My pleasure. Thank you so much, Jack. And for those of you who might want to know more and get more practical advice as to what exactly you can do to protect your privacy, and it's not very hard, there's a whole chapter in the book about that.

Yes, and we will link to the book on our webpage. It is a very, very accessible and compelling book, Carissa; it's tremendously well written, and you will not put it down, I promise you. So, really, I recommend it without any reservation. Carissa, thank you again. You have been listening to Jack Russell Weinstein and Carissa Véliz on Why? Philosophical Discussions About Everyday Life. I'll be back with a few more thoughts right after this.

Visit IPPL's blog, PQED: Philosophical Questions Every Day, for more philosophical discussions of everyday life. Comment on the entries and share your points of view with an ever-growing community of professional and amateur philosophers. You can access the blog and view more information on our schedule, our broadcasts, and the Why? Radio store at www.philosophyandpubliclife.org.

You're back with Why? Philosophical Discussions About Everyday Life. I'm Jack Russell Weinstein; we were talking with Carissa Véliz about her book Privacy Is Power and the way that internet data is used to control and surveil us. You know, every time there's a new technology, people freak out. Plato thought that writing was going to make kids lose their memory. When the movable type press was invented by Gutenberg, all sorts of censorship happened. Even sheet music made the music industry freak out, because they thought no one was going to buy music anymore. There is an instinct to think that technology is going to cause disaster. We figure it out. Society changes, our laws change, our culture changes, sometimes for the better and sometimes for the worse, because nothing is perfectly good and nothing is perfectly bad. The issue here is knowing what the problem is and knowing what we want to fix: that is, personal freedom, participation in democracy, a strong sense of community, and trust in one another. Philosophy will help guide our priorities; it will help us think about good policy, and it will help us protect ourselves during the transition. This was a scary discussion, and a shocking discussion. But it's the beginning of a new period for the world and humanity, and there are going to be some bumpy things. I have faith, and I want you to have faith, that we will figure it out, and that we'll do it together and use philosophy to highlight and remain committed to our values. You've been listening to Why? Philosophical Discussions About Everyday Life. Please rate us on iTunes and Spotify, and please consider donating at whyradioshow.org; click donate in the upper right-hand corner. Thank you for listening. As always, it's an honor to be with you.

Why? is funded by the Institute for Philosophy and Public Life, Prairie Public Broadcasting, and the University of North Dakota's College of Arts and Sciences and Division of Research and Economic Development. Skip Wood is our studio engineer. The music is written and performed by Mark Weinstein and can be found on his album Lua e Sol. For more of his music, visit jazzfluteweinstein.com or myspace.com/markweinstein. Philosophy is everywhere you make it, and we hope we've inspired you with our discussion today. Remember, as we say at the institute: there is no ivory tower.