Privacy and the Public with Alex Stamos (Stanford University) | Disrupt SF (Day 2)
8:49PM Sep 6, 2018
Okay, so we're just going to jump right in. I think we're all aware that privacy and security are top of mind right now across the entire web. And there's perhaps no one better to chat about that with us than former Facebook chief security officer Alex Stamos, who's now going to be starting up a new program at Stanford. Please welcome to the stage Alex Stamos and your moderator, Taylor Hatmaker. Big round of applause.
Everything is branded here. Absolutely everything. Welcome to our brand.
Welcome to the stage, as Jordan said. For anyone who might not know, Alex Stamos was the chief information security officer at Yahoo. You spent a year there, then left for Facebook in 2015. And then you spent three years at Facebook, and now you're moving on to bigger, different things. I don't know if they're better. I guess you don't know that yet.
I've been so bored. I wanted to get into the 2am phone calls of academia.
It's been a really dull year for you. So we'll get into that some.

Obviously, you know, we might have a hard time figuring out what to talk about. It's been very quiet over at Facebook. But
yeah, yes. Let's talk about schools. Reports started looking good.
Yeah, yeah, that sounds fine. We'll kick it off easy. Let's just, you know, dive right in and talk about Russia. Okay. I think it's a light topic right after lunch.
The country, yeah, beautiful place.
It's a beautiful country
I wish I could actually visit it
Well, I mean now you can right?
I have this map. You know how people have, like, maps on their RV of every state they've been to? Yeah. I have the map of countries I can never visit for the rest of my life. And unfortunately, that's the biggest by land area.
Maybe things will change.
Not the only one. So.
It's probably true. Yeah. So obviously, yesterday your former boss Sheryl Sandberg sat in front of the Senate Select Intelligence Committee, testifying about Facebook and Russia and interference in the 2016 US presidential election, and solutions moving forward.

Famously, after the election, Mark Zuckerberg made the comment that the idea that Facebook had potentially impacted the election was "a pretty crazy idea." So I would love if you could kind of take us through the day that you and your team realized that Russia had, in fact, potentially interfered on the platform.
So there wasn't one day, because we first spotted Russian activity in the spring before the election. You know, our company and a bunch of the other big ones have full-time threat intelligence teams whose entire job is to track persistent government actors, groups that are always involved in doing things on the platform. And we already had a team that was watching the activity of what people call Fancy Bear, or APT28, who are the folks believed to work for the GRU, which is the main intelligence directorate of the Russian military.
And so we saw some stuff from them in the spring before the election. In the end, their actual hacking activity happened off of Facebook. But the way this works is we have a relationship with law enforcement in the US, and we informed them of the things we found. Later on, we heard about the DNC hacks and some other stuff, and so we tried to move quickly to shut down their ability to amplify that stuff on Facebook. But at that time, we didn't have a handle on the activity we found later.
So there wasn't really one day. But after the election, we really dove into the overall fake news problem. And a big question was: what is behind this? Of all the stuff people call fake news, what is driving it? And it turns out that the vast majority of it is actually financially motivated. So the stereotypical Macedonian teenagers, who actually do exist and who are, you know, living the good life,

as well as folks in Romania, Pakistan, and a bunch of other places with good English comprehension, technical skills, and a low cost structure, so they can run these large fake news farms. They're behind most of it. But then we started to find these chunks of stuff that were obviously not monetizable, leading up to the announcement in September of 2017 of the biggest chunk, which came from the Internet Research Agency.

So there wasn't one day. It was kind of this progression where we first saw a little bit during 2016, we saw the public stuff happen around the GRU and the creation of the DCLeaks personas, and then probably the big chunk for us was the IRA cluster that we found, and all of the advertising that came with it, in the summer.
So that's interesting: the chunks were kind of in a bucket, and you said they stood out because they weren't monetized well, which indicated that there was another motive.
Right. Actually, if you're looking yourself and you want to try to determine whether something is financially motivated fake news or might be an information operation, one of the signs is whether they take you off the social network. People who want to make money do so by doing arbitrage.
They will push a lot of spam on Twitter or Facebook or any other network and take you to a website on which they run a bunch of ads. Some of those ads, the advertisers know they're there; some of it is ad fraud.

If you go to, like, a fake news site and all of a sudden your CPU fan spins up, it's because there might be, like, a 4K BMW ad running in hidden DOM elements, right? They'll do a bunch of fraudulent stuff, but they're basically just trying to take the traffic. Whereas what the Internet Research Agency and the other government trolls want to do is get you to reshare the content on social media.
And so what they especially like to do is image memes, because people love to take an image meme, download it, and reshare it somewhere else, basically taking credit for it. And that makes them no money. There's no way they can make money off you resharing this meme over and over again. And that's a good indication that somebody other than advertisers is paying them.
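To make the "hidden DOM elements" trick concrete: below is a minimal, purely illustrative sketch of how you might flag a video ad that is playing but effectively invisible, the pattern Stamos describes. This is not Facebook's or any ad network's actual detection code; the object shape, field names, and size thresholds are all assumptions made for the example.

```javascript
// Hypothetical check for the hidden-DOM ad-fraud pattern: a fraudulent page
// plays high-bitrate video ads in elements no visitor can actually see, so
// the fraudster collects ad revenue while the visitor's CPU fan spins up.
//
// Takes a plain object (computed style, bounding rect, paused flag) rather
// than a live DOM node, so the logic can be exercised outside a browser.
function isHiddenPlayingVideo({ style, rect, paused }) {
  const invisible =
    style.display === "none" ||        // removed from layout entirely
    style.visibility === "hidden" ||   // present in layout but not drawn
    Number(style.opacity) === 0 ||     // fully transparent
    rect.width < 2 ||                  // collapsed to a sliver
    rect.height < 2;
  // Flag only videos that are actually playing while invisible.
  return invisible && !paused;
}

// In a real browser you would feed it live DOM data, e.g.:
//   Array.from(document.querySelectorAll("video")).filter((v) =>
//     isHiddenPlayingVideo({
//       style: getComputedStyle(v),
//       rect: v.getBoundingClientRect(),
//       paused: v.paused,
//     })
//   );
```

The dependency-injected shape keeps the example self-contained; real anti-fraud systems look at many more signals (viewability measurement, iframe nesting, traffic patterns) than this single heuristic.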
Yeah, that's really interesting. In my personal life, anyway, I definitely have a hard time communicating to, like, you know, the kind of folks who wouldn't be at this conference how to look out for this kind of content. Obviously, it's an ongoing process, where Facebook is always going through this and finding new buckets of, you know, suspicious activity or whatever. And they've been really transparent in the last year, to their credit, around that. But it's hard to tell people what to look out for.
Well, I think one of the hard parts here, too, is there's a lot of smuggling of the messages they want to push through intermediaries, right? So if you look at the Russian campaign against the 2016 election, there are really two different buckets. There's the GRU-led work, which is about hacking and leaking. They hacked a bunch of data from the DNC, from John Podesta, from Colin Powell, from other folks, and then they used that hacked information to create the news stories they wanted to see in the media. And then they amplified it using their trolls later.
But in that case, it was the legitimate newspapers and cable news networks and legitimate journalists who were carrying the message of the GRU, kind of washing it through the respectability of their outlets, and that then changed the entire conversation. Which is a very different kind of model than the IRA model, which is to directly push messages to Americans.
Now, the IRA model is much less about this candidate or that. The GRU work was specifically targeted at Hillary, right? It is pretty clear the GRU's goal was to weaken a future Hillary presidency. I think it was less about actually electing Trump; I find it unlikely that the Russians are better than Nate Silver at predicting elections. It seems that they were assuming a Hillary presidency, which they saw as a big threat to them.
Putin has, and it's been well documented, a personal antipathy towards her, and believes that she was behind the protests against him around the 2012 Russian election. And so the GRU activity was specifically focused on weakening her, whereas the IRA activity, which started well before the election and has lasted well after the election, is really about driving wedges in American society, and so it's much more dispersed.
But that's direct messaging that they're getting straight to people, while pretending to be Americans or pretending to be legitimate outlets. And they do not seem as good at getting their messages carried by the media. But that's fine, because they're going straight to people's eyeballs via Facebook, Twitter, and elsewhere.
Do you think that we need to redefine cybersecurity right now? It's something you touched on in your Lawfare blog post, which everyone should read if you haven't; it's informative, and it's a well laid out argument for what needs to happen to secure elections moving forward, which we can talk about a little more later. But do you think we need to expand the definition of cybersecurity? I feel like that must have been one of the most disorienting things early on, because you have to go to your team, obviously, or whoever at Facebook, you know, your bosses, and say: we weren't hacked, but this thing happened. How do you describe that thing?
Right. And you're totally right about redefining security. I don't know if we'll end up using cybersecurity as the term. I spent years fighting against the word cyber, and now I know I'm old, I have a gray beard.
Yeah, I say cyber without irony now. But I've personally grown up in this, in that I come from a very traditional information security background, right? Teenage hacker, CS degree, started a security consultancy, was a security researcher finding bugs, finding software flaws. And in my jobs, first at Yahoo and then at Facebook, I had to kind of grow into the realization that the vast majority of harm that is caused by technology does not have any kind of interesting technical component.
It is the technically correct use of the products we build to cause harm. And that's not just in the disinformation space; that is in the abuse of children, in the harassment of individuals, in suicidal ideation. You know, these are kinds of things that have no technically interesting component to them, yet are incredibly harmful. And I do think we as an industry need to vastly expand how we deal with this, because we're not actually studying those issues.
What we call trust and safety issues (at Facebook, the term used for that is integrity). But for those kinds of safety issues, there's not really a field around them, right? You can't take classes in it at any good CS school. It's not something that you normally put on your LinkedIn resume. So how do you hire those kinds of folks? It's very hard to find them.
And that's actually, to give you the soft pitch for the pivot, something I'm trying to work on at Stanford, right? I think if we're going to graduate the students who are going to try to change the world, they should have an understanding of all the ways technology has been misused in the past. And we need to start to build a cross-disciplinary academic center around looking at all the ways technology can be misused that don't fall within the really fine confines of traditional information security.
Can you also build a time machine so we can go back and have that, like, four years ago?
Yeah, you're right. And this is the tough part, too. It's always been true that our technological achievements outpace the understanding of how they're going to be abused, and the fixes. That's true in traditional security and in the trust and safety area. So I do wish that we had an understanding of these things sooner. I do wish, especially, that the big companies had a better understanding of these things years ago.

Unfortunately, that's just not how we've trained people, and not how these companies have grown up.
Switching gears a little, I want to talk about your choice to move from Yahoo to Facebook. In the Twitterverse and the security community, you're widely regarded as a champion of privacy, a user privacy advocate. Going in as a user privacy advocate, how did you reconcile concerns users might have around data privacy on Facebook with Facebook's business, which is predicated on using data to target ads?
Yeah, I mean, I think this is a tough balancing act anybody has to make, right? If you want to actually make change, you have to be the man or the woman in the arena. Obviously Teddy Roosevelt wasn't thinking in a very gender-neutral way. But yeah, you have to put yourself in a position where sometimes you're going to have a fight.
And you're going to fight with folks who might disagree with you, and you have a chance to change their minds. And I think the ad-supported internet is something that's going to last for a long time. We're not going to get away from the model of using data about people to target ads and thereby supporting these platforms, partially because the truth is that a small number of consumers, mostly in North America and Western Europe, subsidize the existence of these technologies for everybody else in the world.
And finding some kind of way to build products that require millions of servers, billions and billions of dollars of hardware, and lots of professionals to build and run, finding a way to support that so the service can be available freely across the world, that's a super hard problem.
So if these products are going to exist, you're always going to have these privacy trade-offs. And I personally thought it's better to be part of that argument at these companies than to just throw, you know, tomatoes from the outside. But, you know, that can also be a self-serving argument.
Hopefully, when you're on the inside and you're trying to work on these issues, you have the ability to change people's minds. And, you know, it's not like I was working with people who were saying, let's go violate everyone's privacy, right? You're working with people who are well meaning and want to do the right thing.
So you're saying that at Facebook, every board meeting isn't like, let's go violate everybody's privacy.
Right. Yeah. And so it's not like you're deciding to go work with people like that. You're going to an organization that has a lot of different equities, and I think one of the things that's important is to go fight for the equities that you believe in, and then hopefully that gets balanced out with everything else.
By any measure, you've had an exceptional career. I mean, you've seen a lot of things go down just in the last, you know, five years.
Exceptional seems like one of those words that can be read in different ways.
A very technical word, if you look at the definition. So at Yahoo, and then again at Facebook, your name will be attached to at least two of the greatest cybersecurity scandals of all time. Is that good or bad for your career, or your resume? Do you skip over a few years on your resume, like, oh yeah, I did a little thing before?

Yeah, no, I'll start with the why. You know, that's what you decide when you take the CSO title, right? You decide that you're going to run the risk of having decisions made above you, or issues created by tens of thousands of people making decisions, and that those are going to be stapled to your resume before anybody else's.
I mean, the truth is, it's kind of a crappy job in 2018 to be a chief security officer. It's like being a CFO before accounting was invented, right? It's a practice that's only existed for a couple of decades, and we don't have the mechanisms necessary to really understand what happens at these companies, to understand the risk and to control the risk.
Are you defining the job as you're doing the job? Is it that kind of thing?
Yeah. I mean, there are a bunch of different kinds of CSO jobs. But at the big tech companies, it's, again, not just about the straight-up information security. It's about privacy in a way that does not fall within normal security flaws. It's also about the misuse of the product. And yeah, all of these companies are kind of making it up as they go along, and I think that's true for taking on the position, too.

You take on the responsibility, and you get the advantages of having the platform. The downside is that when things happen, even if you don't have the ability to change or control them, that's something you have to take responsibility for. And, you know, I was the CSO and all this stuff happened; it is my responsibility, and I'm not going to shirk it. I also hope that I was able to do things to make things better. And in the alternate universe where somebody else had that job, maybe they did better, but maybe not. You know, if you're making individual decisions that you believe are ethical and moral and are pushing the ball in the right direction, then in the end, if things are imperfect, you just have to live with yourself and hope that you can continue to do good things.

And partially, one of the things I want to do, and one of the reasons I left: I've learned a lot from the failures that I've seen up close, and I want other people to learn about them.
There are a lot of things that have happened in the last four or five years that are really pretty universal. The big companies that are on the forefront are dealing with them first, but there is not a company here at Disrupt, there is not a tech company starting up right now, that is not going to have to worry about these trust and safety and privacy issues. And hopefully we can take some of those lessons and spread them out a little bit more.
Switching gears again, we don't have a ton of time left, but simple question: midterms, are we screwed?
If there is no foreign interference in the midterms, it's not because we did a great job. It's because our adversaries decided to give us a little forbearance, which is unfortunate.
As a society, we just have not responded to the 2016 election in the way that would have been necessary to have much more trustworthy midterms. The platforms have made changes; we have a lot more ad transparency. I think out of all the stuff that's happened, ad transparency is probably the most positive, because, Russian interference or not, we do not want a future where campaigns and candidates and PACs are cutting up the electorate into smaller and smaller pieces.
And so I think ad transparency is the first step there. One thing I talk about in the Lawfare articles is that I think we need to have legal standards as to the minimum advertising segment size available to political advertisers, and that's something that has to be done legally, across the entire industry. So I think there have been positive changes, but overall, the actual security of campaigns is not that much better, and the actual security of the election infrastructure is no better.

And my big fear is, you know, the 2016 electoral map was this weird, carefully balanced thing. I think in most cases, actually throwing an election one way or another is going to be very difficult for a foreign adversary. But throwing an election into chaos is totally doable right now, and that's where we have not moved forward.
If we had attacks against the registration infrastructure that dropped hundreds of thousands of people from the rolls in swing states, or day-of DDoS attacks against the intermediate tabulation systems, then even if eventually we knew who won the election, if you do those attacks and you run a disinformation attack at the same time, you can make it so that half the country always believes the election was thrown. And that kind of attack, I think, is very doable, and not just by the Russians, but by a number of adversaries that aren't as sophisticated.

Because, you know, the sophistication we're talking about is down here, the number of countries that can meet that bar is actually quite large.
So you're saying chaos is easy, but an assured outcome is hard. But it doesn't take an assured outcome to do damage, right? Great, everything is okay.
And that's what I mean. I think we need to have paper ballots, and that's great. But what paper ballots get us is that eventually, 40 days from now, you know who won for real. That doesn't solve the problem.

If our adversaries turn every election into Bush v. Gore and open the door a crack, you're going to have the lawyers for the DNC and RNC put their crowbars in there and fight to the absolute bitter end. And so if they can just create a little bit of chaos, I think we as Americans are going to do the rest of the work for them. And after two or three elections like that, we will be in a pretty bad place.
I think we really take it for granted. Ninety percent of Facebook users are outside the United States, so I spent a lot of time dealing with these issues in other countries, and we really take for granted what a peaceful transition of power does for you as a country. We will really miss that if we're unable to defend it.
We're almost out of time, but I did want to ask, now that you are transitioning into your cushy gig at Stanford. That's how you describe it, right? Your cushy Stanford... I don't know. I don't know what your desk looks like. Maybe it's small. It's probably big.
I have an office for the first time in like six years.
So that's nice. Yeah, you've moved up in the world. Would you consider going back to work as the CSO for a major internet company?
Not for quite a long time, no. I mean, like I said, it's a very tough job. And, just practically, I have three little kids, and this is explicitly a decision to actually be part of their lives. By the time they're grown, I'll be old and irrelevant, so it's unlikely anybody would offer me the job.
Well, they'll be hacking by then. So it'll be fine.
Right, right. My kids can take it over. Yeah, no, I mean, this has been my plan for quite a while. And like I said, it's great to be a CSO and to work on that, but I think we have these humongous issues across the entire industry, across the planet, that we need to work on. And the ability to tackle those is much more interesting to me right now than getting back into kind of the operational side.
Alright, well thank you. That's all the time we have for today.