Hey, everyone. We'll get started in just a minute. We're just waiting on one more speaker to join us.
Hey everyone. We're just waiting on one more speaker, and to let the room fill out as guests continue to join, we'll get started in just a minute.
All right. Well, why don't I get started with intros and whatnot, and maybe if someone could shoot Julia an email, hopefully she'll join by the time we get to any questions for her. So thanks, everyone, for joining us today. I'm Seth Stern. I'm the advocacy director at Freedom of the Press Foundation. We are a nonprofit organization that protects, defends and empowers public interest journalism. We do that through our advocacy work, which I'm involved with. We also do that through digital security training; you'll hear from one of our digital security trainers, Harlow Holmes, shortly. We operate the U.S. Press Freedom Tracker, which documents press freedom violations nationwide. And we develop tools like SecureDrop and Dangerzone, which help journalists communicate with sources securely and confidentially. Today is, of course, Global Encryption Day, so that's why we're hosting this event on this particular day. It's a day when encryption defenders from around the world make their voices heard to promote and defend the use of strong encryption. Encryption is vitally important to journalists and their sources, and strong encryption can help prevent unauthorized access to journalists' communications with sources, their notes and their unpublished work. While encryption is often talked about as a necessity for national security reporting, the truth is that it's important for reporting at the local and regional levels as well. For example, when police raided the newsroom of the Marion County Record in August of 2023, they were caught on camera remarking on how a lack of encryption might make it easier for them to access data on journalists' computers that they seized during that very much illegal raid. Unfortunately, encryption is also under regular threat, with lawmakers in the US, EU and elsewhere often proposing or even passing laws that would undermine it. And here with us today to talk about how they use encryption for their reporting, and how undermining encryption would be harmful to journalism, are Julia Angwin. Julia is an award-winning investigative journalist and New York Times contributing opinion writer. She's the founder of Proof News and of The Markup. We've also got with us Lorenzo Franceschi-Bicchierai. Lorenzo is a senior writer at TechCrunch, where he covers hacking, cybersecurity, surveillance and privacy. And as I mentioned, we're joined by Harlow Holmes, Freedom of the Press Foundation's chief information security officer and director of digital security. Harlow regularly trains journalists in securing their communications within their newsrooms, with their sources and with the public at large. If we have time today, we will try to get to your questions, but no guarantee. That being said, we do anticipate that we'll write up something about this event afterwards, and perhaps if there are questions that we don't get to, we might be able to address them in a write-up. But again, I make no promises on that. So let's get started. My first question is going to be to Harlow. So Harlow, let's start with the basics. Can you explain what it means for something to be end-to-end encrypted, and in what circumstances journalists might need or want to use end-to-end encrypted services?
Hi, sure. First and foremost, thank you for having me. And also, it's so great to be in a space with Lorenzo and Julia. It's been a while, and I'm so happy to be here.
But yeah, let's talk about your question regarding end-to-end encryption. So first and foremost, encryption exists just about everywhere, both in how we handle our communications on the internet, to keep them safe from anyone who wants to see information or modify it when it's in transit, and in the case of files on computers themselves. Like you mentioned with Marion and that unfortunate raid, files that are encrypted when the computer is shut down, for instance with file-based encryption, cannot be modified or viewed unless someone has the key to open them. And this is a fundamental fact; this is great. As far as communications are concerned, the vast majority of platforms do offer some sort of encryption that keeps them protected. But end-to-end encryption, as you mentioned and want to speak about, is slightly different. Encrypting something in transit makes sure it's protected in normal use cases, but the company or the service that is responsible for encrypting your communications still has the ability to see or modify things if it wanted to. Most services that we use do not want to see or modify the communications that we have, but it still remains a fact that they can, and in some cases, especially when warrants or subpoenas or some other legal process gets involved, they would be compelled to do so. End-to-end encryption protects our communications even from that service. Even though that service is usually acting with the best of intent, we don't want it to be able to see stuff, modify stuff, or hand it over in the event of some sort of legal order. And so now we have end-to-end encryption in certain services, in addition to that great encryption that we already enjoy. It means that communication is encrypted in a way that only the devices that are part of a conversation, so my phone, my computer, my friends' phones, my friends' computers, etc., have the ability to decrypt and see the information that is part of that conversation. And so that forces that service to actually put its money where its mouth is, so to speak, and be a good-faith communication partner, meaning it's just holding on to gibberish, just passing it to and from the other people in the conversation, but it has absolutely no ability to see what those contents are. And that's why it's a really powerful idea.
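To make the distinction concrete, here is a minimal sketch of the end-to-end idea in Python, assuming the PyNaCl library is installed (pip install pynacl); the message and participants are hypothetical. Only the two endpoints hold private keys, so whatever relays the ciphertext sees only gibberish.

```python
# A minimal sketch of end-to-end encryption using PyNaCl: only the two
# endpoints hold private keys, so anything in the middle cannot open the message.
from nacl.public import PrivateKey, Box

# Each participant generates a keypair on their own device.
alice = PrivateKey.generate()
bob = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice, bob.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual cafe at 3pm.")

# The service relaying `ciphertext` holds only gibberish; it has no key.
# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob, alice.public_key)
print(receiving_box.decrypt(ciphertext))  # b"Meet at the usual cafe at 3pm."
```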
Thanks so much, Harlow. I'm going to move on to a question for Lorenzo. On your website, you give people options for secure ways to contact you, and some of those options are Signal, the encrypted messaging service, and encrypted email. When did you start doing that, and why do you do it?
Hi everyone. First of all, thanks for having me, and it's also a pleasure to be here with Harlow and Julia. So I started doing that a long time ago. I think it was around 2013, when the Snowden revelations started. I remember thinking, you know, Snowden used PGP to reach out to Laura Poitras, Barton Gellman and Glenn Greenwald. And I thought, you know, I want to be able to be that journalist maybe one day, and give sources the opportunity to reach out to me in a safe way. And, you know, at the time, I was very much ignorant about that kind of technology, so I learned about PGP at that point. Signal was still called, what was it, it had another name, TextSecure or something. So, yeah, exactly. And so, yeah, you know, I've been giving sources the opportunity to use a bunch of apps. Signal is my preferred one, but my sources have a lot of different preferences. Some of them are on Telegram, which maybe we can talk about a little bit, because it is not actually end-to-end encrypted by default, as some people know, but some people don't. But yeah, you know, I've been doing this for more than 10 years at this point, and I've always found that it makes it easier for me to communicate with sources.
Thanks, Lorenzo, and moving on to Julia. Julia, you also talk on your website about secure ways for people to contact you. Can you talk about some of your favorite encrypted tools for talking with sources, and also, if there are any that purport to be encrypted but that you prefer to avoid?
Hi there. Thanks so much for having me. It's so great to be here. I think we're in this funny space right now where encryption used to be so hard. Back when Lorenzo and Harlow and I were, like, the few people doing this, you know, in 2012, you had to have your public key, and there were all these key exchange places where you would look up other people's keys, and it was really hard to get people to do it, and it was hard to use. And now it's gotten so easy with services like Signal and WhatsApp, and we can talk about the differences between those. But I actually feel like one of the challenges we have right now is people take encryption for granted, and it's really important to realize that there are major differences between these services, right? So Telegram, as Lorenzo was mentioning, is not end-to-end encrypted by default, and so you have to set it up, and most people don't realize that, and a lot of people are using it unsafely. WhatsApp is end-to-end encrypted, but you know, WhatsApp and iMessage, which are end-to-end encrypted, are always pushing you to do cloud backups, which actually defeats a little bit the purpose of the encryption, because that can be accessed in a slightly different way. And so Signal is the one that I prefer the most, because Signal goes the extra mile of end-to-end encryption but also doesn't store any records of who you spoke to. They can only tell when you've logged into the service. So it says, you know, Julia logged in, but it doesn't say who I'm communicating with, and that means that when law enforcement sends a request, that's the only thing that they can turn over. So I always try to encourage sources to communicate on Signal. Sometimes they're not willing to, because they think that adding an app to their phone might actually be suspicious and they want to use one of their existing channels, which is fine. But I do think that it's important to draw these distinctions, because encryption is a word that gets thrown around, and everyone sort of thinks, oh, everything's encrypted, you know, Facebook messages are encrypted, but there really are levels of difference between these different services.
Thanks. And Harlow, from your perspective, with your digital security training work, are there other communication tools FPF typically recommends or does not recommend for journalists? And are there guides or resources that you recommend for journalists who are new to the world of encrypted communications? And of course, Lorenzo, if you've got anything to add on this topic, please do.
Harlow, you're muted if you're talking.
Ah, thank you. Thanks for the catch. Everybody, if you are listening to this space, please do sign up for our newsletter. If you go to freedom.press, we have a fantastic newsletter where we talk about not only this issue but other topics in digital security, especially as they impact a journalist's job, and it has a lot of handy information about how to use certain tools safely and stuff like that. So to get to this question here, and also to speak to some things that Lorenzo and Julia have brought up: we definitely do love to get journalists started using Signal, to be as available as possible. But as Julia mentioned, there are all different reasons why a journalist or the sources they're communicating with cannot use Signal, such as it being unavailable, let's say, in the region that you're in, or, as Julia mentioned, it making you look suspicious. These are all reasonable things that a journalist is going to have to use their judgment on in determining what the next safest way forward is. And people can be very, very creative about it, such as using things like, let's say, OnionShare, which would require everyone to participate in using either the Tor Browser or the app itself in order to have this communication, or taking it to SecureDrop. But that also requires having access to a SecureDrop, which is no trivial feat, though more and more we do see journalists just working independently in order to have a certain amount of confidentiality in their conversations with sources. And sometimes that could include end-to-end encrypted email, such as using Proton Mail, which, to Julia and Lorenzo's point, is kind of the user-friendly face on PGP encryption, encryption via email, which is a great thing for people to get into the habit of using but is definitely incredibly difficult to use on its own. And so we do see people who want to have encrypted email conversations just take it to Proton Mail. If you're thinking about doing this yourself, you do want to have Proton or some other PGP-capable way of checking your email, because just regular email between, let's say, a Gmail user or a Microsoft Outlook person and a Proton Mail user does not provide you the same end-to-end encryption. So there's that. And, yeah, it is 100% true that other apps out there that we like, such as WhatsApp, such as Facebook Messenger, for instance, or Threema, are all examples of apps that do provide end-to-end encryption, but the devil is in the details, as we say. And I think this is something that we're now starting to really, really focus on, the details that matter in order to keep end-to-end encryption as meaningful and powerful as it should be, which is the metadata. There is absolutely a difference in the way that Signal and Threema, for instance, handle the connections between users, logging of usage, and other types of data having to do with when people have used the service, for instance, and other factors that all of these different apps handle in ways that provide, in certain cases, way more information that can be analyzed and otherwise become part of a subpoena or a warrant or something like that, because metadata is starting to be the type of data that investigators look at when the actual content is not available to them because of end-to-end encryption. And I'll end on saying this: over the years, we've started to see end-to-end encryption become just the regular status of things. It's the status quo for the most part. End-to-end encryption is not going away, which is great, it's fantastic. But where we're starting to see pushback is regarding the nuances of how these platforms secure their metadata and how confidential they can be with their metadata. And there are a number of pieces of legislation, both in the US and abroad, that point explicitly to how available metadata should be to law enforcement, and that's the danger that we're looking at right now. The challenge, I would say.
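For readers curious what PGP-encrypted email looks like in practice, here is a minimal sketch using the python-gnupg wrapper; it assumes GnuPG is installed locally, and the key file name and recipient address are hypothetical placeholders.

```python
# A minimal sketch with python-gnupg (pip install python-gnupg); GnuPG itself
# must be installed. "reporter_pubkey.asc" and the address are hypothetical.
import gnupg

gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring

# Import the journalist's published public key (e.g., downloaded from their site).
with open("reporter_pubkey.asc") as f:
    gpg.import_keys(f.read())

# Encrypt so that only the holder of the matching private key can read it.
encrypted = gpg.encrypt(
    "Here is the tip I mentioned.",
    recipients=["reporter@example.com"],
    always_trust=True,  # skip the web-of-trust check for this sketch
)
print(str(encrypted) if encrypted.ok else encrypted.status)
```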
Thanks, Harlow. The next question is for Lorenzo, Julia, or both of you. We talk a lot, sort of abstractly, about how encryption benefits journalists, but I want to bring in some real-world examples if we can. Can you talk about a time that encryption has been important for your reporting, whether by enabling you to communicate with a source or in some other way? Reporting that you've been able to do because of encryption that perhaps you wouldn't have been able to do without it.
I can start if you want. So, well, I mean, for me, the answer is, I've been using Signal for so long that it's basically my day-to-day way to communicate with sources. So, you know, I could give you examples from last week when Signal was useful for a story. Harlow mentioned SecureDrop, which I think is an important tool to shout out, because it's kind of unique in how it works and in the possibilities it gives both sources and journalists. And a few years ago, at VICE's Motherboard, we did get a very good tip through SecureDrop, which led to our first big investigation on stalkerware, which is essentially spyware for regular consumers, which, as the name suggests, is used, unfortunately, often by jealous and abusive partners. And in that case, the source reached out via SecureDrop, so in a completely anonymous way, saying that they had hacked one of these stalkerware providers, which was called Retina-X. And by doing this hack, the hacker essentially showed that the company was storing very sensitive information in a very insecure way. So there were two layers of badness, basically: the people using the software to spy on loved ones, and then the company not doing, you know, the basic job of actually securing this data. And the original tip came through SecureDrop, and, you know, it's impossible to know, but perhaps we would never have gotten that story without SecureDrop in that case.
Thanks, Lorenzo. And Julia, that's one example, but is that sort of typical of the ways you benefit from your use of encryption? Or do you have any examples you wanted to add to that?
I mean, that is pretty typical. I also do most of my communication on Signal. I would just do a little shout-out to OnionShare, which is kind of a niche tool, but I have used it a couple of times. It is a way to send documents, which is often what you're trying to do with sources, and it allows you to do it in a way where it's encrypted and anonymous, so both sides can sort of leave no trace on their machine of the document transfer. And that has been really useful to me. It takes a little bit of knowledge to set it up, but I've found that it's actually pretty user-friendly. And so I think the challenge is, it's not always easy as a journalist to say, oh, this story came through encryption. You don't really want to do that to your sources, so I don't want to go into details. But I do think that what's really great about today, versus when I first started doing this, is that it is much more routine to communicate with all types of sources through encryption, not just ones who are trying to share documents or do something that might get them in trouble. And that actually makes it safer for everyone, because the more people who use encryption, the more normal it is, and that way it doesn't seem like an outlier to use it and doesn't make you suspicious. So I think that's the world we want to be in.
Great. Thank you.
I couldn't agree more, actually. If I could add one thing to what Julia just said, and this is also something that we try to promote when we're speaking to journalists in trainings: end-to-end encryption has become much more user-friendly, much more usable, not only because the people who develop these products really care about that, but also because this technology has become part of a global conversation. That has a lot to do with the way that members of the press like Julia and Lorenzo have made very clear in their writing not only what is happening around end-to-end encryption, by reporting on it, but why it's important just for general privacy and general confidentiality that every single person should have the right to. Popularizing these ideas in the press in a very, very public way has done so much in making sure that people understand why it's important and participate in using these technologies.
Thanks, Harlow, and I'll stick with you for the next question. We mostly talk about this from the perspective of journalists, because, for obvious reasons, confidential sources can't really get on the next space and talk about how they use encryption. But from the source's perspective, I assume there's a lot of overlap between the advice that you give to a journalist and to a source. What measures do you recommend sources think about before they reach out to a journalist, and before they attempt to make contact with the journalist through Signal or another encrypted service?
So I do think it's very important that sources understand exactly what the potential risks are when engaging in acts of journalism with a journalist, and that has to do with what we like to call the first-contact conundrum. It's very important for people to understand that if you reach out to a news outlet using, let's say, your own personal Gmail account, and that is the first way you establish contact with that journalist, that creates a record that can possibly be used to identify you and identify your participation. This is why it's really important for potential sources to find a journalist that they trust, not only to tell their story, but also to be as mindful about their communications with them as possible. So seek out that journalist whose reporting you really, really love, who also has a Signal username that you can easily just pick up and start using, or who makes available a phone number that is reachable over an end-to-end encrypted method, or publishes their PGP key, or has a Proton Mail address or something like that, so you can be proactive on your own in establishing the most secure way of initially reaching out to that particular journalist. And so I'm really, really happy that people in journalism over the past decade or so have started to become available in as many places as possible, by publishing that in their bios, by making sure that their website points to certain steps that any potential source can take, from individual journalists, some of whom I actually see here in this space, to news outlets themselves that make sure they publish it. Paying attention to that will make all of the difference.
Thanks, Harlow. Julia or Lorenzo, let's talk about ways you use encryption beyond just source communications. For example, Julia, you've written about things like encrypting data you store and encrypting your web traffic. Why would locking down stored data or web traffic matter for journalists? And how do you do that? How do you protect that stored data and that internet traffic?
Yeah, that's such a good point, because so much of our lives is on our computers. You know, it's not just the contact with a source, but your notes for your stories and documents you might have obtained and scanned in to be able to reference. And so I think it's really important to encrypt your hard drive, and, you know, Apple has made that a lot easier over the years; it used to be a lot harder. It's also really important because we still have this real gap in the law about border crossings. Essentially, when you cross a border, the Fourth Amendment no longer applies. Normally law enforcement would need some sort of search warrant to access your computer or your phone and go through the device, but at the border those rules don't apply, and they can basically grab your device and look through it. And so I have tried all different types of strategies for making sure that my device would not be accessible in that situation. And really the best strategy is not to bring your devices, but that is really not realistic, you know? I mean, I think that used to be the advice, but nowadays it's very difficult to imagine traveling without them. So a couple of different things that I have tried: encrypting the hard drive and having it protected with a passcode. There are some pretty nice features too. Signal has a little thing where you can delete the entire app off your phone really quickly if you need to; it's kind of like an emergency button. And so I think it's important to think ahead about situations you might be in where somebody might go through your device. It's also worth noting, both for journalists and sources and just anybody, that your employer often has some rights over your machine, and so it is smart to have two separate devices, maybe a work one and a personal one, so that you actually keep your personal stuff separate from your employer, because employers are also very much into surveillance these days, unfortunately, and with remote work they sometimes require you to install all sorts of annoying surveillance software. So it's actually just important to think about it in a broader context than just journalism.
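As an illustration of the at-rest idea Julia is describing, here is a minimal sketch that encrypts a notes file with the Python cryptography library; full-disk encryption like FileVault works at a lower level, but the principle is the same, and the file names and key handling here are hypothetical simplifications.

```python
# A minimal sketch of encrypting a file at rest (pip install cryptography).
# Without the key, the stored bytes are unreadable.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this somewhere safe, not next to the file
fernet = Fernet(key)

# Encrypt the notes and write out only the ciphertext.
with open("interview_notes.txt", "rb") as f:
    ciphertext = fernet.encrypt(f.read())
with open("interview_notes.enc", "wb") as f:
    f.write(ciphertext)

# Later, only someone holding `key` can recover the notes.
with open("interview_notes.enc", "rb") as f:
    plaintext = fernet.decrypt(f.read())
```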
Couldn't agree more. Absolutely.
Thank you. Lorenzo, you've written a lot about hacking and cybersecurity, including hacks by foreign countries of political campaigns as well as private companies. Are journalists targets of hacking campaigns by foreign countries or others, and will using encryption help protect against that risk?
Yeah, so for me, the short answer is yes, we are targets. There have been, like, dozens of examples now in the last 10 years, you know, going back to Hacking Team and FinFisher, and more recently to NSO and companies like that. A lot of the stories that we heard start with, you know, a journalist in Morocco being targeted, an activist in Bahrain. And, you know, the kind of depressing answer is that encryption, in some of those cases, actually doesn't help, because if your device gets targeted, then it's basically game over. They can access everything. And, you know, in that case, what you can do is try to prevent, or rather prepare for, that eventuality. So maybe it's compartmentalization, like Julia was hinting at: maybe you have a device for travel, maybe you have a device for work and one for your personal life. You can use disappearing messages, which Signal, I think, was one of the first to have. In that case, the messages just disappear. You don't have to worry about remembering, oh, you know, this conversation should not be on my device; it's just gone by default. WhatsApp has that too now, which is great. And so, yeah, we're talking about the threat of spyware like NSO's, and in those cases it's really, really hard. The best thing you can do is store as little as you can. But that can also be challenging for journalists, obviously, because we need to have access to notes, we need to go back to quotes and things like that. But, you know, you can be ready for that, and you can take some precautions. Yeah.
Yeah, thanks, Lorenzo. And Julia, you've done a lot of reporting on tech companies themselves, which are very secretive and often hostile to journalists. There have been reports about companies like TikTok, for example, that have spied on reporters. Should journalists be thinking about tech companies as another adversary that might want access to their data? And if so, same question I asked Lorenzo: how can using encryption help with that risk?
Yeah, that's such a good question, and it is true that, I think, every one of the tech companies has had at least one known instance where they have actually gone through one of their users' information for their own purposes. So the TikTok example was that they actually accessed the accounts of some journalists. This happened at Microsoft: there was a story written about, I can't remember, something leaked about one of their products, and they went and found the Hotmail account of the person that leaked it. This happened, I think, at Google: they disclosed that some employee had gone through an ex-girlfriend's Gmail. So we know that basically at every big tech company there have been abuses where people inside have accessed the accounts of users, and that may or may not be, quote, allowed by the companies, but it has happened. And so I think you have to be prudent about the fact that, if you keep all your stuff in Google Docs, there's a small chance Google might go through it at some point, right? But it's also true that it's very hard to operate in this world without using those services. I wrote a whole book in 2014 where I tried to not use any of these services, and my life was, you know, total hell. The point of the book being that we need a collective solution. We need a national privacy law; the United States is the only country that doesn't have a federal baseline privacy standard. There are things we need protection from that are hard to manage individually, and so as much as this conversation is about what we can do individually, I think it's really important to also acknowledge that we are fighting an uphill battle without any support, and we really do need to also focus on advocating for legislative and regulatory protections that set a baseline for everybody. Because it's not just about journalists, right? I don't want anyone to get a hold of my messages with my family or whatever. These are, you know, intimate conversations, and people are awful to each other and abuse each other and screenshot each other's messages and put them up on social media to make fun of each other, right? This is, unfortunately, the world we live in, and so I think we do need much more protection than we have. And so I do advocate for everyone doing the best that they can themselves, but also being an active participant in asking our legislative representatives to act on our behalf.
I would like to quickly echo something that Julia said. I do think that tech companies can be adversaries sometimes. And, you know, I don't really cover tech companies as a beat all the time, but if I were, say, a Google reporter at The Wall Street Journal or something like that, then I would probably try to use Outlook or Proton Mail. So, you know, just try to be mindful, as Julia said, that tech companies can have access to the tools that we use, and they're not going to hesitate to use them against their own employees if they think that one of their employees leaked something to the press. Yeah.
Let me add on to that. Thank you, Lorenzo, that's such a good point. I forgot to finish my thought there, which is that I have found that it's basically impossible to not use these tools. But what I like to do is try to compartmentalize, so that no one company has everything, right? So if I use Google Docs, I'll try to use Microsoft Outlook. And if I'm doing a story about one of those companies, I definitely don't use their products for that story, right? So I think it's best to spread the risk, because I think that's the most realistic approach. I mean, Harlow, you might have better ideas than that. I just haven't been able to live my whole life on LibreOffice, as much as I know some people would be able to. Yeah.
Okay, but God bless you for trying to live a life entirely on LibreOffice. And I do think that the approach towards compartmentalization is ultimately the best way forward. It's the most pragmatic, because the rules of play are always changing on us. On one hand, we're very, very much invested in doing the proper advocacy, whether that's by ringing the alarm in our various spaces, or reporting on it, or going into the halls of power and advocating for it. We're doing that on one hand, which is important, but on the other hand, we are trying to live our lives, and these two things are often in conflict, and we still want to keep moving and be as successful as possible at both, right? And so to get to compartmentalization, I can offer two tools, which we definitely do teach. They're not actual tools that you can use; they are practices that you can observe, and we have proper frameworks for thinking around them. One is performing a risk assessment. What Julia and Lorenzo are describing is being in a situation where you're required to think through what your assets are, who your adversaries are, what type of tools they have in order to get at those assets, and the likelihood that any of that is going to happen, and then, from there, applying your potential resources in order to thwart that adversary. This is something we innately do all the time, but this is also a practice that journalists can do, not only in general, about how they go about doing their job, but also on a story-by-story basis. So if you just pitched something to your editor, now is the time to do a threat model around it, you know. So that's one tool that you can use. Another that we like to teach people is the concept of bullseye mapping. If you picture a bullseye that has three tiers, in the very, very middle of that bullseye is the stuff that absolutely has to be as confidential as possible, which includes your interview notes, right? The communications with that source, the texts that you send as you figure out where you're going to meet up for coffee or whatever; that stuff stays in your inner tier. And then your outermost tier is the stuff that you are totally going to publish. So, yeah, Google can have it, you know, because we're all on Google Docs and we're all going over the draft and the editor's got their comments and blah, blah, blah, but you know that is the risk that you accept if you're going to advance the story in the way that it needs to go. And then you actually have that gray tier in the middle, which is a little bit more nebulous, where you'd want to have as much confidentiality as possible, but you also know that something is outside of your control. Like, for instance, in a newsroom, do you have to have a certain amount of accountability towards building this particular story, as required by maybe your editor or by the legal counsel at your organization? These are things that are not entirely within your control, but once again, you're exercising your ability to pinpoint where that risk is acceptable and where it is not. And then you just map the tools on top of the various things that reside in any of those tiers in the bullseye.
And that's a way to apply that compartmentalization in a more formulaic way, rather than just improvising around it.
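One way to picture the bullseye-mapping exercise Harlow describes is as a simple data structure; the assets and tool choices below are hypothetical examples chosen to illustrate how tiers map to tools, not a definitive recommendation.

```python
# A rough sketch of bullseye mapping: each tier lists what lives there and
# which tools protect it. Entries are illustrative placeholders.
bullseye = {
    "inner": {   # must stay as confidential as possible
        "assets": ["interview notes", "source identities", "meeting logistics"],
        "tools": ["Signal with disappearing messages", "encrypted local disk"],
    },
    "middle": {  # confidential where possible, some risk accepted
        "assets": ["early drafts", "legal review threads"],
        "tools": ["newsroom-managed storage", "end-to-end encrypted email"],
    },
    "outer": {   # headed for publication anyway
        "assets": ["final draft", "published quotes"],
        "tools": ["shared Google Doc with editors"],
    },
}

for tier, plan in bullseye.items():
    print(f"{tier}: {', '.join(plan['assets'])} -> {', '.join(plan['tools'])}")
```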
Thanks, Harlow, and again, I'll stick with you for the next question. We've seen proposals all over the world to add back doors into encrypted services to allow access by governments for criminal investigations, terrorism cases, and other categories of crime where the government feels that exceptions need to be made. Can you explain the idea of a back door, and what you think of it?
Yeah, and that's a really complicated question, so I'll try to be as concise as possible. The term back door refers to a system or component in a program that is created, often without the knowledge of everyone who builds the program, because it's compelled by a governing party, perhaps for compliance, perhaps for legal purposes, etc., in order to undermine the assertion of confidentiality that the tool would otherwise claim to have. And here's an example of why we know that back doors are a bad idea. Back doors have been baked into a number of technologies for decades, if not longer, but just recently we saw that the very system at AT&T that ensures compliance with legally obtained pen register and wiretap orders, meaning you can tap someone's phone, had been used by hackers, by state actors hacking for another country, in order to surveil people's phone calls. That's an example of why back doors always go wrong, because as much as you would love to comply, because you have to, those very back doors in the hands of an attacker are going to totally undermine any assurance of security and confidentiality and privacy. So back doors in general are just a horrible, horrible idea. Now, there are companies that do resist that, with very clearly stated principles and also with stated limitations around the type of data that they have access to. In general, that makes any effort to compel an organization to implement a back door kind of moot, which I think is really admirable, and that's also why, when we are recommending tools to people, it's not just use Signal because of this, or use Tor because of that, because they're cool and we like them or whatever. It's actually the fact that these are technologies that are scoped in such a way as to prevent technical back doors from being possible. And we can get into why, but I think that's beyond the scope of this particular conversation. But this is where it gets complicated, because given that end-to-end encryption is, as they would like to call it, facts on the ground, it's not going away, and we love it, we're keeping it. Everyone loves end-to-end encryption; even its opponents still use it, they love it. What we're looking at now is not necessarily technical back doors but legislative back doors. And so when you think about a push, not only in the United States but in a number of countries, in the BRICS nations, or legislation proposed by the EU via what is called the DSA, the Digital Services Act, or whatever, these are all examples of attempts to compel, under the guise of moderation, trust and safety, data sovereignty, all of these really, really excellent things for platforms to have, with language being crafted in order to give governments overly broad ability to access this otherwise confidential data. And while we don't necessarily see this yet, if you think about the ability for a bot to jump into what you thought was a confidential, end-to-end encrypted group chat, for instance under the guise of trust and safety, meaning, oh, someone said something crazy in the chat, let's put a bot in there, then all of a sudden that chat is now, ostensibly, back-doored. That's an example. Or data having to be stored on servers in a certain country where you want to operate, so a foreign government can grab user data in order to further surveil its population that's using another country's service. That's another example. Or, in the near future, what is going to happen when AI lives on every single one of our devices, and on your own side, without breaking your end-to-end encryption, you still have AI-enabled processes that can possibly read the entire contents of your super, super secret secure chat. So this is, I think, what is at stake and why we pay so much attention to the way these various legal texts are crafted.
Thanks, Harlow. Lorenzo, you've written quite a bit about government spyware and governments' surveillance capabilities. Based on what you know about the US's capabilities, are the concerns we hear from the FBI, the DOJ and potentially others about criminal enterprises or terrorist organizations "going dark" through using encryption justified?
Yeah, that's a very good question with a complicated answer. My view in general is that there's a lot of fear-mongering from governments, because it's just the easy way to get what they want, you know, maybe legislation, as Harlow was talking about, maybe a technical back door, which I also agree is probably hard to get at this point. You know, Signal is never going to install a back door on their service. But yeah, spyware is an issue. I know this is not a binary decision, but I would rather have governments have to spend millions of dollars on sophisticated spyware than have a back door that they can tap at any point and, as Harlow said, maybe other people can also tap into. So, you know, I think there's a lot of fear-mongering, but at the end of the day, law enforcement still gets the criminals. My friend and former colleague Joseph Cox wrote a book about this whole sting operation where the FBI was essentially running an encrypted phone company that criminals thought was super secure. So that's a very good example of law enforcement finding a way around the problem. Again, spyware is used by countries with repressive regimes, but it is also used by countries in the Western world for, hopefully, legal purposes, and there are warrants that they need to get. And again, I would rather have a world in which people need to use that sophisticated spyware than have the ability to just go to a provider and ask, give me this person's information.
Thanks, Lorenzo. Julia, another kind of proposal we've seen, which you've written about, would require tech companies to scan all of the messages, images or data that they store or transmit. Can you talk about how those proposals would impact end-to-end encryption if enacted?
Yeah, absolutely. I mean, this is the most popular type of proposal right now. All over the world, governments are saying, okay, we've kind of lost the fight to get a back door, because all these technical people have explained to us, and there have been so many examples, that once you put in a back door, you can't really control who walks through it, right? So now they have this new idea where they're like, okay, fine, but then all you encrypted messaging people, you have to do some scanning, basically, of any content before it's sent, to see if it's illegal: is it child sexual abuse material, is it terrorist material, etc. And they argue that this is not going to break encryption, but this is absolutely insane, right? Asking all these providers to do all of this scanning is, in fact, an enormous amount of surveillance. It's more surveillance than we have right now, right? Even Gmail is not doing that. So they're actually asking for this insane amount of surveillance, and there's actually no evidence that encryption is really the problem here. Is encryption what's causing terrorism? Is encryption what's causing child sexual abuse? The data does not support these things being correlated. Bad people are going to do bad things whether they use these messaging platforms or not; it's not the make-or-break decision for how much bad stuff they're going to do. And so I wrote a piece in The Times about this last year saying, you know, we need to really look at what actually works when it comes to child sexual abuse; that's one of the most hot-button issues. And actually, one of the things the experts told me is that sex education in school is actually the best preventive technique, really. And that's something we would have to invest in as, you know, a civic thing, just like we have done with many other things in school. And so there are a lot of solutions out there; people just love these tech solutions because they seem easy, but they're actually incredibly invasive, right? Nobody wants to live in a world where every single piece of content that you share with your family, just a text message, is scanned by some third party trying to determine if you're doing something bad. That's a real level of surveillance, and if it was presented to us just blanket, we would say, that's crazy, that's like 1984, you know. And so it is sort of shocking to me that it continues to gain interest across the world, and there are people fighting these proposals in every country right now.
Julia, you've written that, unlike most people who make health-related New Year's resolutions, you make security and privacy resolutions. I don't know if you've had time to think yet about what your resolutions are going to be for 2025, but maybe now's the time to think about it. And if anyone else has any related security and privacy goals that are relevant to this conversation and wants to share them here, please do.
Well, honestly, my goal this year is so embarrassing. It's just that I've been very unsuccessful in getting my family to do encrypted messaging, and this year I really, really want to move my family group chat onto Signal, because, you know, they've been making fun of me for 10 years about why they don't need encryption. But honestly, this year my daughter, who's a college student, was on campus during the protests around Gaza, and she sort of saw how the administration cracked down and started doing a lot of surveillance of students, and it was the first time she got interested in Signal. And so I said, see, let's all now move to Signal for our communication. So that is my goal for the year.
I can tell you from my experience that once you achieve that goal, the next goal is going to be getting them to go into their settings and, you know, turn on some sort of notification. Because I finally got my mom using Signal, but she doesn't know when I text her, so I end up having to text her the old way anyway.
Oh, my God, that's the worst, having to text someone "check your Signal," which I have had to do many times.
Any other thoughts on how we can improve, either individually or collectively, in 2025?
Okay, I can go. Actually, one of my resolutions is to be more proactive in finding... okay, actually, it's to unlearn the stuff I learned in, like, the 2010s regarding security. And it's not to say that any of those things were wrong, but it's the spirit that I had when engaging with them. I felt that, you know, a decade ago, 15 years ago, or whatever, all of the security rationale came from a place of fear, and instead I want to focus on doing it from a space of empowerment, and be able to communicate that, to explain that to people, focusing on their empowerment rather than fear, and also thinking about the longevity of it. What does it mean, you know, to be 15 years older and still work within the same technologies, given how my life has changed in that time? So, you know, it's a personal growth opportunity.
Thanks, Harlow. And Lorenzo, if you don't have an answer to that particular question, any other closing thoughts you would like to offer?
Yeah, I'm really bad at resolutions, but I would like to echo something that Julia said at the beginning, which is, you know, for all the fear-mongering from law enforcement and the worries that we have about legislation, which are obviously valid, it's good to be worried about this stuff, it's good to be advocating for better laws, I think the good news is that encryption is here to stay, and it's become really normalized. People don't even realize that, maybe by using WhatsApp, they are using state-of-the-art encryption like Signal's. So life has become easier for journalists and sources. So, yeah, I think that is good news, but we need to stay alert. And maybe, now that I'm thinking about it, I guess my resolution is to do a risk assessment at the beginning of the year and rethink my practices, in case I'm missing something or I got too confident with advice that maybe is a little bit outdated.
Thanks, Lorenzo. Well, we're at the hour, and I had told everyone this would be 45 minutes, so we're going to wrap up. But I want to thank everyone for joining us. Harlow, Lorenzo and Julia, it was a great conversation, and thanks to everyone who showed up and listened. If you'd like to learn more about Freedom of the Press Foundation, the work we do and the issues we cover, sign up for our newsletters. As Harlow mentioned earlier, you can go to freedom.press/subscribe, or just go to freedom.press, our website, which we just redesigned. The U.S. Press Freedom Tracker, which I mentioned, the database we operate that compiles press freedom violation reports from around the country, is at pressfreedomtracker.us. You can visit Julia's publication, Proof News, at proofnews.org, and you can read Lorenzo's work at TechCrunch at techcrunch.com. And you can share this space, which will be available afterwards on our X account, and as I said, we will also likely do a write-up that we'll post on our website sometime soon. So thanks, everyone, for joining us. Hope you enjoyed the conversation, and happy Global Encryption Day.