SOTN2023-06 Making Sense of Section 230: A Deep Dive Into Recent Legal Battles
4:08PM Mar 14, 2023
Mary Anne Franks
So we're going to start out with an easy one, starting right over here. Mary Anne, did you file an amicus brief? You did? You did. Okay. So everyone here has filed an amicus brief, and we're going to just ask each of you for the key points of the amicus brief you filed recently.
Excellent, thanks. So our amicus brief was filed on behalf not only of myself, but also Professor Danielle Keats Citron, and also our organization, the Cyber Civil Rights Initiative. Some of you may know that we are an organization dedicated to protecting civil rights online, with a particular focus on online image-based sexual abuse. And the gist of our amicus was: you know, we've had more than 20 years of courts making decisions about how to interpret Section 230, and in our view, the interpretation that most courts have come up with is wildly detached from the actual text of Section 230. It is what we have sometimes referred to as an unqualified-immunity position for the tech industry. So when there was an opportunity for the Supreme Court to take a look at the statute and see if, in fact, the way it has been interpreted is correct, this was a valuable, valuable opportunity to look at the text and history of Section 230 with fresh eyes. And what we have pushed forward, as that look at the text and history, is to emphasize the interaction between two subsections of Section 230. The more famous one, the one that gets referred to as the 26 words that created the Internet, is the part that says you shall not treat providers or users of an interactive computer service as publishers or speakers of somebody else's content. And that's an important part of Section 230. But there's this other subsection, C2, which is actually the one that specifically refers to immunity, and says that you get that kind of civil immunity when you make efforts, voluntarily and in good faith, to restrict access to offensive, objectionable, et cetera kinds of content. And our view is that if you look at the history, look at what was going on in 1996, the primary focus of the statute was that second part. It's reflected in the title, which is the Good Samaritan statute.
And most of you probably know that a Good Samaritan statute is essentially a statute that says you will be immune from civil liability if you choose to do good that you were not obligated to do. If you voluntarily assist someone who needs help, and you are not under a legal obligation to do so, you will be immune from being sued over it in case you do it imperfectly, in case you maybe cause a little bit of harm when you do it. And, in our view, there is no way to read Section 230 without looking at the interaction between those two provisions. If you read the first provision in a way that completely undercuts the incentives of C2, then you have done something unintelligible to the statute. So the takeaway position from our brief was: this is a Good Samaritan statute, it is about trying to incentivize the tech industry to do good when it doesn't have to, and that's the extent to which it's getting immunity. The other part, about publishers and speakers, is simply a limitation on the kinds of cases, sorry, the kinds of torts and other theories that you can bring, namely defamation and related kinds of claims. It restricts liability in the sense that you are not going to have the old defamation liability rules that exist offline, which would hold people accountable merely for repeating what someone else was saying. And so our focus is: make sure that you're reading it in a way that is consistent with the text and purpose of what Congress actually intended with Section 230. The opposite way, which we have seen most of the lower courts go towards, is this kind of establishment of a super-immunity that is really perverse in terms of its impact on the incentives that tech industry players have when it comes to abusive content.
Fantastic. Alex, you're up next.
Sure. Great to be here, thanks for having me. Yes, we filed amicus briefs in both cases, both Gonzalez and Taamneh, and I'll just talk about the one we filed in Gonzalez for now; if we come back to Taamneh, great. You know, the most direct question in the Gonzalez case is whether platforms are immune for their use of recommendation algorithms under Section 230, or whether they can be subject to liability for the use of recommendation algorithms. And we took the position that the court should interpret Section 230 to immunize the platforms for the use of recommendation algorithms, unless those algorithms materially contribute to the illegality that is the subject of the suit in a way that goes beyond mere amplification. And so the key argument we made is that mere amplification is not enough. And the reason we made that argument is, in part, that we think it's a faithful reading of the statute, which immunizes platforms for their publication decisions, and we felt that it's very difficult to draw a line between the amplification of speech and the publication of speech, because the publication of speech necessarily involves decisions as to what should go first and what should go second and who should see what you're publishing. But also because we worry that a broad undermining of Section 230 immunity would subject the platforms to liability for, you know, what we think of as the core value that they provide to the public. And we were focused very much on the free speech interest of the public in, for example, having search engines. If platforms are subject to liability for using recommendation algorithms, it would seem that every search engine would be subject to liability for deciding to put some posts at the top of the search results and some posts lower down. And we worried about the collateral censorship that would result from a scheme of liability like that.
But we drew a line in our brief; we didn't argue that the platforms are always immune for their use of recommendation algorithms. We said it was important to continue to subject them to liability for things that go beyond mere amplification: for example, for design decisions, for things unrelated to content, and, most importantly from our perspective, for claims relating to algorithmic discrimination. And we argued that if a platform, as some are alleged to do now, distributes housing ads in a way that is discriminatory on the basis of a protected characteristic, even if the advertiser has tried to publish their ad neutrally to all, then the platforms would not be immune under Section 230 for those decisions, because those are genuinely the platform's decisions. And one way of testing whether that kind of algorithmic discrimination is mere amplification is to ask whether the harm would still occur if the platform published the speech more broadly. In the context of the claims in Gonzalez, which were claims about the publication of terrorist-related speech and the connection of those publication decisions to, you know, horrendous terrorist attacks that caused real harm, if the publication had been made more broadly, presumably the harm and the causal connection would have been easier for the plaintiff to establish. But in a standard algorithmic discrimination case, if a platform responds to the claim by publishing the housing ad to everybody, then the harm is gone, because the harm is the discriminatory publication of ads, and not just the mere amplification of certain kinds of ads. So that was, you know, the brief that we filed in Gonzalez. We think that Section 230 plays an important role in preserving an open public sphere, but we are also concerned about overly broad interpretations of it.
But we thought that the line we drew between mere amplification and other contributions of the platforms to the illegality of what they publish was the right one to strike in this case.
Steve, go ahead.
Thanks, Shane and Tim, and the entire State of the Net organization; really glad to be here. If the Supreme Court wants to try to figure out what those 26 words meant, the words written 26 years ago, it had the good fortune of being able to ask the two guys who wrote them. NetChoice's director is former Congressman Chris Cox, and he and Ron Wyden, who was a Congressman at the time, were the ones who wrote Section 230. And they wrote it around a situation that happened in Stratton Oakmont. That's a case that all of you know, but by a completely different name. Those of you who are far younger than me in this room may not remember CompuServe and Prodigy, but you probably saw The Wolf of Wall Street. That's the origin story for Section 230. So Leonardo DiCaprio and his team were ripping people off all the time. Some of those people went on Prodigy, a bulletin board service that I used at the time, and they would complain about Stratton Oakmont, the name of the firm ripping people off. Well, what did Stratton Oakmont do? They sued Prodigy for defamation. They didn't sue the individuals who wrote the statements; they sued Prodigy. And in a bizarre turn of fate, a court in New York said that, yeah, the Wolf of Wall Street can go after Prodigy in this case, because while Prodigy didn't write the words, Prodigy does make some attempt to clean up its site. It attempts to reduce the amount of profanity or hate speech that shows up on Prodigy. So yeah, maybe Prodigy should be liable. Well, that bizarre turn in the Stratton Oakmont case is why Chris Cox got on an airplane one morning, flying back to Washington from his home district in California, and wrote the 26 words and the rest of the statute, and they enacted it, and it's all that survived from the Communications Decency Act 26 years ago.
So they did a brief, Cox and Wyden. And, you know, despite Mary Anne's, I think, wishful thinking, Cox and Wyden said that they wrote Section 230 in a technology-neutral manner, so that it would apply to current methods of presenting content. There is no Good Samaritan driving C1. C1 stands alone; C1 solves the Wolf of Wall Street problem. And for belt and suspenders, let's make sure that C2 is in there, so that when a company does take action to take down content, it doesn't get sued for taking it down, and it doesn't suddenly become the publisher. A finding for Gonzalez in that case on Section 230 would render the law a dead letter. Whenever a platform's content moderation is less than perfect, the platform could be sued for making an implicit endorsement. And that's why the plain meaning and intent of Section 230, according to the two congressmen who wrote it, would hold for Google. Now, NetChoice also participated with a number of trade associations, who are in the room today, on amicus briefs in the cases as well. And we thought that, as a tech trade association, we'd tackle this mythology of algorithms, this notion that the algorithms used to determine what shows up in your Twitter feed or on the right side of the YouTube screen are some sort of magic that ought to render the law obsolete. I think Cox and Wyden and our brief tried to demystify algorithms: they're just rules for the order in which content shows up, whether it's reverse chronological, or maybe it's based on topics that you've been following or what your friends like, or it could be completely random. In all cases, those are algorithms, or rules. And Justice Kagan, in her discussion, said that 230 was written in the pre-algorithm era, the pre-algorithm period of time. And that's just not true. I had used Prodigy and CompuServe in the 1990s, and they would arrange the posts of users in reverse chronological order based on the topic that I was in, with breakouts for discussion threads. That was an algorithmically dictated arrangement.
So the authors of 230, and the industry that uses it, really believe that a holding for Gonzalez is going to completely upturn Section 230. And I think most of the amicus briefs came to the same conclusion.
Thanks. Hi, everyone, great to be here. I have the luxury and burden of going last, whichever you choose to think of it as, so I can maybe say I kind of agree with everybody, at least in some parts of what they said. I'll try to talk about what we said ourselves before getting into comparisons and the back-and-forth on the argument or the other briefs. But Steve did such a great job painting you a villain with The Wolf of Wall Street that I really want to use my Mandalorian joke. As a tech freedom advocate and a tech advocate, but also a civil rights and social justice advocate these days, I noticed that the first episode last week was called "The Apostate." And so sometimes when I come to rooms like this and say we believe in 230, that Free Press still defends 230 and thinks it's important, we just want to strike a balance with it, I feel like people on both sides are gonna say, well, that's not quite good enough. But that's really where we found ourselves landing in this case. And I'll say very quickly, that's for a couple of reasons. I mean, I will go down the panel: I don't really agree with Professor Franks that C2(A) is the most important part, but I also don't agree with Steve that C1 is somehow triumphant, or the primary piece. I really do think that they both work together, and that they aren't simply mirror images of each other or somehow reinforcing each other. I think they really are two protections. I thought I wouldn't get into other people's briefs, but, you know, I do want to say one thing: one thing people do is try to read "publisher" back into C2(A), and almost try to tie them together too tightly, when they really can be thought of as different protections for different kinds of activities. What we said in our brief, and I'll go back to Shane too, is, well, one of those things we don't find ourselves saying every day: I was in my closet listening to a Supreme Court argument.
I don't find myself saying this every day: we tend to come down more with Justice Thomas and Josh Hawley, I think, at least in some respects. That is not a comfortable place for me, but, hey, I'll admit it, it's true. I mean, we actually did not take the question presented; we did not talk about the recommendations, at least primarily, as our main focus. And so I know there was discussion about, well, that's not even what this case is about. But we really did, I think, come down where Alex and others came down, and said it's not about the recommendations exclusively. I think there are a lot of similarities, although some differences too, between where we came out. And the reason I say closer to Thomas and Hawley is that we focused on distributor liability. Not Stratton Oakmont, not Cubby v. CompuServe, which I think are very interesting too, but back to the Zeran case, and saying that, no, in fact, even though there are opinions about this on all sides, and lots and lots of discussion, not just in these briefs but everywhere else you can find, there is a way to distinguish, or at least a need to distinguish (I'm not going to pretend the way is easy, but a need to distinguish) between publisher liability and distributor liability, which comes with a higher bar. It's harder to be found liable for that. So there are arguments saying that's impossible, that they're really the same thing, especially on the Internet. I get those. I have a lot of sympathy for those. But we still decided to talk about finding the platforms liable only when they know that they're causing harm. Again, I admit that there are lots of questions about how that test would be run. But we think that's where courts, and, frankly, to stay in our lane and to finish here, I would probably prefer Congress to do that. But whether Congress can do anything in short order is a great question.
So we think about that distributor liability concept, and not tying, as I think Alex said, lack of protection to any time you use an algorithm. Steve was saying this too; you know, to me, it's a disaster if we say computers are illegal or somehow hard to use. But if we can look at what companies are doing and say, aha, here they knew about the harm and they exacerbated it, then naturally, I think there has to be at least some test. You know, what I said when I testified about this last year, and got the honor of doing it with Professor Franks, was: yes, I'm concerned about innovation being harmed, and, if 230 is weakened too much, about frivolous lawsuits, and not just frivolous but malicious ones, people suing to take down opinions they don't like. That's a big concern. I don't know why I like the turn of phrase, it sounds kind of old-fashioned, appropriate to CompuServe and Prodigy, but I said it's not just in the courtroom where people are harmed or silenced. It's also in the chat room. And so we are thinking about the civil rights, social justice, racial justice, all sorts of impacts that people see when they have no recourse, when they are, in fact, shouted down online and see the kinds of harms that I think Professor Franks and others have done a tremendous job studying and bringing into the light all these years.
Thank you. So I think it was Justice Kagan who said, we are not all Internet experts. During the argument it's kind of hard to tell; all the voices start to sound the same when you're listening to it. But I thought, after they got through a bit of a rough start, it was a really interesting discussion. So I really appreciate hearing your takeaways from both Gonzalez and Taamneh. We're not going to just stick to the briefs, so if you want to respond to Steve, the mic is yours.
I do.
So one of the things that I think is so wonderful about the law is that even if a person was the sponsor of legislation to begin with, that doesn't commit all of us to whatever interpretation they think they had. The words matter. The legislative history matters. The conversations, if you look at what Congress was discussing at the time when it was enacted, had everything to do with good Samaritans, had everything to do with saying, we want to provide incentives for platforms to actually do the right thing, and make sure that they're not going to be chilled by lawsuits from people saying, hey, you took down my material. So I do think it's wonderful that the law does not have to be subject to revisionist or self-serving interpretations after the fact. But to go back again to the Good Samaritan notion of this, which relates to the questions that the court asked, and which Justice Jackson especially seemed to pick up on because she had really looked at the statute in context: back to the parable, right, that the Good Samaritan is based on. Just to make things very concrete: you have someone who is beaten up by robbers and left by the side of the road. This is a parable in the Bible, and Jesus is talking about what it means to be a good neighbor, not to beat up the guy. So the robbers have left this guy in the ditch. And a priest comes by, sees that this person is in distress, but just passes on because it's not his responsibility. And the Levite does the same thing. And then this wonderful person known as the Samaritan comes by, gets down from his horse, takes care of this person's wounds, and makes sure that he gets the assistance he needs. Now, in a Good Samaritan statute, we're supposed to be celebrating the Good Samaritan who helps, right, who didn't have an obligation but who helps.
Now, suppose we have an interpretation of C1 that says, yeah, you're going to get the same benefit and the same praise if you help, or if you're the priest or the Levite who just passed on by and didn't care. And we're actually going to add another person who comes on by, who's not just the priest or the Levite watching the guy suffer, right? I'm going to stop and take pictures of the guy, and maybe I'll go through his pockets, right? The interpretation of C1 that says tech companies can do whatever they want is basically saying, yeah, those people too should get immunity. They should get the gift of the Good Samaritan immunity, even though they stood by and did nothing, or they actually profited from the harm in some way by making it more visible. So this is what I was looking for in the Supreme Court arguments: to see, will any justice recognize the complex relationship between C2 and C1? Will they talk about how, if you read C1 in that sense of "you can do whatever you want," essentially, with very few limitations, that completely contradicts the notion of the incentive? Because if you get the same reward for doing bad as you do for doing good, I think everyone understands why that would completely undercut the incentive for doing good. And one of the amazing things, about not only Justice Jackson picking up on that, but I think you also heard it from Justice Kagan and Justice Sotomayor, was that there's a real difference between immunity and liability. Saying that someone is not immune from something, right, is not the same thing as saying they're responsible. If we say that the priest and the Levite don't get Good Samaritan immunity, we're not saying the priest and the Levite are responsible for the guy in the ditch. And so this argument that somehow, if you don't get immunity under the statute, you're responsible for what people said, is simply not true.
It means that you, as a tech industry actor, are the same as any person in this room, who could get sued at any time if somebody chooses to sue them. So one of the things I was really gratified to see some of the Supreme Court justices pick up on was that there's a difference between absence of immunity and presence of liability. And then, secondly, right, the Google lawyer comes up and says, yes, but there's all this uncertainty, we don't know if we'll get sued, there might be frivolous lawsuits. I think it was Justice Kagan who, a few times, was saying: every industry has to deal with the fact that there might be lawsuits, right? That's the thing that everyone deals with. So the question today is not why it's going to be such a dramatic or alarming thing if we change the status quo. The question really should be why we have had such alarming differential and preferential treatment of the tech industry for 20 years. That's the problem. The status quo is the problem. And whatever hypothetical harms may come when tech actors actually have to think about whether or not their incentives need to be aligned towards preventing foreseeable harm, what we know is that for more than 20 years, when they haven't had to worry about that, we've had the so-called classic moral hazard: basically, whatever you want to do, however risky you want to be with your behavior, you will not suffer the consequences. You can fully externalize those negative consequences to other people. Because it's not like harm doesn't happen; it's just a question of who has to pay for it. And right now, it is private individuals who have to pay for it, whether it is a terrorist attack, or a defamation campaign, or nonconsensual pornography, or doxing, or deepfakes, or whatever the case may be. They're the ones who have to suffer from it.
And they are prevented from even accessing justice, from even getting to the point of, let's say, discovery, to see: how much did this tech platform know? When did they know it? Could they have prevented it? Are they making money from it? And so, again, the question really should be, if we're finally picking up this question after more than 20 years of the tech industry dominating with its very self-interested interpretation of what the statute says: what has happened? What did you get from an industry that was told, you can commit all the harm you want and never have to pay for it, and you're going to get all these benefits from this? And the moment someone suggests you have to have the same responsibilities as any other industry, you throw up your hands and say, oh no, that'll be the end of the Internet and the end of free speech.
Amen, amen, and amen. Because my mother said, whenever I hear somebody offer a prayer, I should say amen. So we've just heard a biblical reinterpretation of the law from the plaintiffs' bar of Galilee, and I appreciate them advancing 2,000 years to visit our panel here today for State of the Net. Thankfully, only Justice Jackson, I think, picked up on Mary Anne's argument in those questions; no one else did. And I'm glad to see that that's not likely to be where this thing will head. Shane, I still don't... sorry, and I'll get out of the way for Alex. But yeah, you asked, which questions surprised us? We had assumed that Alito, Gorsuch, and Thomas would have been eager to trim what 230 applies to; they've written about it in the past. And Thomas, I guess, given seniority, leaped in with the first question, which he rarely does. And he asks: are you saying that YouTube used a different algorithm there than the one for cooking videos, or are they using the same algorithm across the board? So Justice Thomas seems to believe that algorithms based on your interests are extremely useful and benign. He also doubted that the word "recommendation" was a good fit for the mere arrangement of how content might show up in Twitter when you sign in, or next to the screen on YouTube. Justice Barrett asked one of the most surprising questions. Near the end of the argument, she asked Schnapper, the attorney for Gonzalez, and he confirmed, the theory of the case that would say that users, those of us in this room, could be held liable for retweeting or liking other users' posts, because that is the same theory under which, he says, YouTube creates new content when it puts up thumbnails and lists to go along with videos. So it's difficult to divine from the questions where they'll go in the ruling, but those are some of the biggest surprises to us.
I do wish I could see a skit from Saturday Night Live where the clerk is explaining rice pilaf to Justice Thomas. I'm like, duh, how many other times do they go through something and they're like, okay, that one, go with that one. So, Matt, reclaiming your time. Sorry, Alex.
Rice pilaf options: take a few, stir.
Well, I want to jump in. Let me start by saying that I sympathize a lot with the concerns that Mary Anne has raised, and has, you know, been at the forefront of raising for quite a long time, in highlighting the costs of the way these platforms have operated for a long time. I don't have a lot of sympathy for the platforms. I generally think that what motivates their conduct online is not the public interest, not a commitment to free speech, but generally a commercial bottom line. They're companies, after all. And that's not surprising, and it's not, you know, necessarily a bad thing; that's just the incentives of a company. But I do think it means that there's a special obligation on people who draft laws, and people who interpret them, to think about the public interest and whether it would be served by particular interpretations of the law. So I just want to put that out there as my starting point. My starting point is not that we ought to be concerned about the rights of the platforms. I think we ought to be concerned mostly about, you know, the rights of the users of the platforms, both those who use them for good purposes and those who exploit the platforms to cause harm. But it often doesn't even take exploitation; often it just takes using the platforms as they were designed, which is as engines of amplifying sensationalist and provocative speech. I think we have to have all those considerations in mind. And I think that the line Mary Anne has drawn between immunity and liability is a really important one to think through. Let me tell you why I'm not convinced that that line does enough work in the context of the Gonzalez case. My concern is that Section 230, I think, provides an important procedural protection against suit that makes it less costly for the companies.
It makes it so that the companies don't have to decide, in the context of any particular piece of user speech that's up: is it worth the cost to us of leaving it up, or should we just take it down? Because they generally don't care about any particular viewpoint expressed on their platforms; they generally don't care about any particular kind of speech. And as somebody who has spent most of my career defending, you know, unpopular views, I'm concerned about human rights advocates, dissidents, government employees who want to blow the whistle, who might, through their speech, create a dilemma for the platforms in deciding whether to leave that speech up or to take it down. And so to me, the distinction between immunity and liability really does matter, because one, I think, provides a greater procedural protection. And I think you could argue, I don't think it's unreasonable to argue, that we'll go through a period of uncertainty, and after that the rules will have become a little bit clearer. We'll all know a little bit more about what the First Amendment means today. Section 230 has largely stymied the development of First Amendment doctrine in the digital age, because we haven't had to answer these questions, because of 230. And maybe we'll have a period of 10 or 20 years of uncertainty. But I'm not sure it's worth the cost of those 10 or 20 years. And I'm not sure that at the end of those 10 or 20 years, where we'll end up is actually a system where the protections of the First Amendment are clear enough that the platforms will not have the incentive to engage in collateral censorship, to make the decision that it's easier for them to take speech down. For example, every time somebody makes an allegation that a head of a foreign state is a human rights persecutor, I think it would be easier for the platform to take that speech down.
Every time somebody makes an allegation that their boss engaged in some kind of corrupt practice or harassing practice, it may be easier for the platform, for Google, to decide that that's not going to be on the first page of the search results, because that would be a potential source of liability. I worry about that. And, you know, I'm not saying that to my mind this argument is a trump card over the concerns and the arguments that Mary Anne is raising. But, you know, those are the reasons why, on balance, I think it's still worth preserving Section 230 as an important form of procedural protection for the platforms, even as we look for ways, you know, to hold them liable for what they do that goes beyond, as I think of it, the mere amplification of speech. So in that vein, what I'll say about the Gonzalez argument: what I found surprising, maybe I was pleasantly surprised, is that the justices seemed to recognize that the question of what immunity for a speech intermediary should look like is a lot more complicated than some of them had perhaps realized when they were calling for challenges to the prevailing interpretation of Section 230. I'm not somebody who believes that we have to have a maximalist interpretation in order to preserve what we have online; I kind of think we should address these questions case by case. But I was heartened to see that the Court recognized that the questions are complicated, and it's not easy to, you know, weed through these competing interpretations and be sure about what the Internet is going to look like on the other end. And that, I think, is a reasonable argument for letting Congress act, although I appreciate Mary Anne's, I think, powerful point that we've had, you know, 26 years of this very broad interpretation of Section 230, and are we happy with it? And I think there's reason to criticize, you know, where we are now.
Okay, Matt, now it's your turn.
Okay. Yeah, I'm not sure exactly where to go. I guess I'll start with a series of questions from the argument, and then maybe come back and talk about some of the great themes raised just there. I'll admit I was reading the transcript, since I wasn't in town that week, and they really do get all the pauses in there. It's a great read, because they have a lot of the ums in there; I probably should have listened. But to me the real trouble that Professor Schnapper had was in that question of, okay, how are we distinguishing between what is the content of the third party and what is the conduct of the platform? And I remember the series of questions about the thumbnails; I felt like there he was really trying to have it both ways, in some senses. And the justices, I don't think they were nearly as oblivious as people assume. I think they were actually quite aware of the need for sophistication, whether or not each of them brings that. As I get older, maybe I have more sympathy for them; it's a cliché in DC, oh, the justices, what could they possibly know, but I think they actually brought a lot to the table, and it's not like they were completely ignorant. These are just hard questions. So I really do think that different ones of them came at him on that thumbnail question. I think it was Justice Alito who asked, are they publishers there of the thumbnails, and he said yes. And that would, I think, for most people trigger this kind of gotcha moment: well, if they're publishers, they're not liable. But of course, as people have rightly explained, you can be liable for your own content; you just can't be treated as a publisher of third-party content. So I think that's what he was trying to get to, making that distinction between third-party content and your own speech or your own content. But he had a lot of trouble with that.
Drawing that line, he was basically saying, well, if it's content from the video, then that's protected, that's covered by the statute. But if it's somehow, I don't know, some kind of thumbnail that is divorced from the third-party content, that might be a thing. You know, I'm thinking about Justice Thomas and his commentary in the Malwarebytes case on Jones v. Dirty World. You can imagine a website that does a lot more to say, hey, watch this thing, where it's not actually just a clip from the video. But as far as I'm aware, YouTube is mostly, here's what the creator gave us, and they put it out there. But if I have another one, I'll just swing back and say it. Yeah, I don't know. I mean, the kinder, gentler Matt: this is my first panel in DC since I quit Twitter, and also the first since I gave up caffeine, really. So I hope I'm not less feisty or less alert, but I might be; I have lower blood pressure for at least two reasons now. I mean, just trying to unite some of the themes: again, I don't discard the Good Samaritan story because it's old. I think there's a lot there, especially as Professor Franks explains it: the difference between saying you're protected from liability for what the other person did, but you might still have a duty to help, or at least not to exacerbate the harm. And so that's where I do think, again, there's a lot of common ground, if not the same things being said, between me and Alex. It's looking at that question of what the platform does. This is where I think it has become somewhat untenable, even though I strongly believe that we need 230. And where I would disagree with Professor Franks somewhat is to say that I think the Internet is, if not unique, if not exceptional in a kind of exceptionalist sense, important.
It is important. We've had this law, we've had these structures, to, if not eliminate them, lower the barriers to third-party content, to have those platforms be able to host people's speech without having to go through a letter-to-the-editor desk, or a broadcaster, or some other kind of, you know, we sometimes call these the old gatekeepers and the new gatekeepers in our space. So I think it is important. But where I think we have reached this untenable situation is if a platform can actually march into court and say, well, we knew about the harm, we did use an algorithm, which to me is more an indicator of their knowledge than something I would hold them liable for in itself, we monetized it, it continued, it persisted, but, you know, tough luck, we have nothing to do with it. That, to me, is where I think Silicon Valley companies have gotten a lot of scoffs and stares from politicians and people watching and saying, really, is that the case? And that's not where every politician is, you know. Some want them to leave more stuff up, as long as they agree with what's being left up there; they say, don't take anything down. Other politicians say, take more down. So there are lots and lots of trade-offs, and lots and lots of inconsistencies. But I do think that's where at least I, and our organization, came to this view that we do need to look at 230 again. If you really do have the kind of, well, tough luck, we actually made a lot of money off of this, but it's nothing to do with us, you can't hold us liable for anything, that's where I do think we have to take a second look.
We're gonna go to questions in about five minutes, just so you guys are ready. Okay. So last question from me: where do you think we come out best, from your perspective, the courts or Congress? Start with Mary Anne.
Both. Okay. But I do think so much depends on, who knows what this Court is actually going to do with any of this information. I do think, as I say, that there was welcome sophistication that you heard from some of the justices. You know, Kagan is probably always going to be remembered for the whole, we're not the experts on the Internet in this room. Right before she said that, she said, why should this industry get a pass? Right? She at least understands that there is something very differential going on here that is not necessarily justified. And it's not clear how many of the justices care about that. But I do think it goes back to other points that have been raised here, about whether the default state that we're in, the status quo, is the neutral perspective. And I think there's at least some suspicion on the part of the Court about whether there's anything neutral about it. And I think everybody in this room should be suspicious of that. Because it is odd that we're still acting like we don't know, right, that we haven't seen scandal after scandal after scandal, when whistleblowers come forward, not because of, you know, lawsuits, because those are blocked by 230, but when you have brave whistleblowers who say, you know what, actually, Meta fully knew that when you do this to your algorithm, it actually makes people unhappier, and it makes them angry, but it's also making us more profit, and so we're going to do that. Right? So any tweak that we're going to make to make things healthier or safer is not getting us as much money.
So we are making the deliberate choice to choose profit over people, as the whistleblowers have been saying. We know that Twitter has released transparency reports, to their credit, before the current regime, that actually showed that there is heavy prioritization of far-right content on Twitter. We know that these are not neutral decisions. The idea that because platforms may now have to take some kind of accountability for their actions, they might suddenly become politicized, or they might become too anodyne, just seems very odd, because right now they are making very political choices about what they think is going to get them the most money and which figures in the government they want to collaborate with. The whole scandal of the Twitter Files really wasn't about Biden, or about Hunter's, you know, laptop; it was actually about Trump saying, hey, Twitter, I want you to do this thing, and then Twitter doing it. That's already happening. It's already happening with the extraordinary protections that this industry has, that Google and Facebook and others are handing over women's data so they can be prosecuted for abortions. That's happening with all of the protections they have right now. The status quo is they do what's going to make them the most money and what will get them political favor. That is where things stand at the moment. So if there's some recognition on the part of the Court that that seems like a crazy way to interpret Section 230, it would be nice to roll that back. But I do think Congress has got to play a role here too. There are things we can clarify, because of the 20-odd years that have gone on with this really dysfunctional view of what it is that this industry should be entitled to have. And as for all of these claims that the sky is going to fall, I just, you know, two words for you, I guess this is three: Dominion versus Fox, right?
In other words, every other industry, including, right, the news industry, can get sued. And I think most of us in this room would say that's a good thing. Now, the fact that Fox can get sued, does that mean that Fox has been, for the last, I don't know, however long Fox has been around, really toeing the line and being non-controversial because they might get sued? Or has it meant that they're going to do exactly what Fox News is going to do? And when we finally get discovery in cases like this and find out just how many people knew how many things at the Fox Corporation to deliberately promote the big lie, well, it's good that we know that, isn't it? It's good as a public service, and good that that lawsuit can go forward. Why should the tech industry be any different?
Now, we're gonna get there. I'm just gonna say, if Hunter Biden had just used a licensed repair shop, we may have this conversation a little later on. But go ahead.
Two observations before we get to questions. So one is on this question of why the industry should get a pass. This was actually kind of a consistent theme of Justice Kagan's questioning through both arguments, including in the Taamneh case, which is about whether the platforms can be held liable for aiding and abetting terrorists when they have general awareness that terrorists are using their platforms. Justice Kagan's question was, why aren't you just like banks? We hold banks, you know, more or less strictly liable for providing services to terrorists or terrorist financiers. Why aren't they the same? And one thing, and I love Justice Kagan, I think she's the savviest justice, the best writer on the Court right now, but one thing that I wanted to scream into the microphone, or into my headphones when I was listening, was: because they are speech intermediaries. The reason why we think differently about these platforms is because they're speech intermediaries. And that doesn't answer all the questions, but I think it's different to say that a bank should be held liable for engaging with, quote unquote, terrorist speech than to say that a speech intermediary should. And we often have different rules for speech intermediaries, including, you know, news organizations like Fox News, which benefit from the New York Times versus Sullivan standard, which is not available as a general matter to, you know, private corporations. And, again, that doesn't answer all the questions, but I think it is a really important consideration in how we think about the balance between the public interest in the free flow of information and the need for liability when the conduct of platforms exceeds what we think serves the public interest. So then, to directly answer your question, Shane: courts or Congress.
To my mind, I think Congress would make more sense. I think the text of 230 provides a pretty shallow foothold for courts that are trying to do nuanced work in changing the balance of liability that the platforms are exposed to while simultaneously preserving the public sphere. But a word of caution when it comes to Congress and Section 230 generally. I think most of the conversation around Section 230, and I would put Mary Anne's work as a really important exception to this, most of the conversation on Section 230 has not been very nuanced in thinking really carefully through what problem you're trying to address through amending Section 230, and what cost that might create for the users of these platforms. Because 230 reform comes up in a lot of different conversations. For the most part, amending 230, even if you got rid of it entirely, would not address the harm that a lot of people really focus on. It would not address misinformation and disinformation; there, your problem is not 230, it's the First Amendment. It would not address anything except for narrow categories of speech for which publishers or speakers can be held liable. And those can be really important categories, like defamation, like child sexual abuse material, like revenge pornography. Those are important categories, but they're not going to solve, you know, many of the problems that people have been focused on since the 2016 election.
Steve? Yeah. Thanks, Shane. Just three quick things. As far as takeaways from Gonzalez, I came away worried that Justice Thomas might be setting the table by supporting broad protections, supporting 230, in Gonzalez with one hand, and then, with his other hand, when the NetChoice and CCIA cases come before the Court this fall. These are the cases where we sued Texas and Florida over their laws that would mandate the carriage of content. So is it possible that Justices Thomas and Gorsuch and Alito might say, we will rule with Google and against Gonzalez and preserve protections, oh, but next fall, we're going to force you to carry everything, all lawful-but-awful content, through some common carrier scheme? And we had one indication of that last May, in a dissent that Thomas, Alito, and Gorsuch wrote from the NetChoice and CCIA win at the Supreme Court on our Texas lawsuit. And then I would also mention that in Taamneh, I was very concerned that, under a ruling for either party, there seems to be an implicit admission that the government has to be trusted when it reports what it thinks is bad content that has to be taken down. The lawyers for both Twitter and Taamneh said that an unsubstantiated notice from law enforcement about terrorist activities would confer legally chargeable knowledge under the terrorism laws. And if any justices had concerns about platforms taking censorship orders from the government, they didn't show it in the questions that we heard in Taamneh. Platforms might, under such a ruling, have to heed all government warnings as they come in about terrorist speech. And look, law enforcement and other government actors, particularly around the world, abuse that power. It's the modus operandi of authoritarian regimes. The Turkish government is suppressing discussion of the earthquake response under the notion that it's disinformation. So I worry about that. And then finally, you asked about Congress.
Now, I know, Shane, you said we stay in our swim lane, but it's okay to warm up the pool a little bit for them. Yep? Okay. So if I warm up the pool, I'd say, as Alex said: what is your aim for Congress? What is your aim? Well, if half of Congress says they want to twist 230 to get more content moderation, which is what the Democrats would like, and the other half of Congress wants to twist 230 so that there's less content moderation, then I don't know how those aims are reconcilable in Congress. The bad news is that they don't reconcile in the states either. We have 26 states that go one way and 26 states that go the other, which is why we had to sue them. All right, Matt?
I can think of some ways to warm up a pool, but those don't involve sweat. No, I mean, I think the reason it's tempting to say courts is that they will rule, or at least, once the Supreme Court has taken a case, they will rule, and we all know that Congress is very unlikely to get to the starting line sometimes, let alone the finish line. I still think, especially going back to what I said about our brief and the distributor liability questions around there, that Congress would be more promising. If anything is going to happen here, which I still doubt, I would much prefer a scalpel to a sledgehammer. And so, again, to get towards legislation: that's why, when we testified last year, without fully endorsing it, we pointed to something like the PACT Act, which has now been reintroduced. That would not please everybody, but it takes, I think, kind of a middle ground and says, okay, you can be charged with knowledge, not, as Steve was importantly saying, just based on some kind of law enforcement notice, but based on an adjudication. You know, that's going to be way too slow for a lot of plaintiffs, I get it. And yet having some kind of due process considerations built in there, and trusting the judicial system to provide that kind of knowledge, I think is promising. But again, we didn't endorse the bill when it was out in the last Congress, because I think there was work still to do. So I prefer Congress, but I certainly get why courts will rule. And just to finish with two things: yeah, I agree, I don't think tech companies should get a pass. Our organization has done a lot to hold companies accountable for the last five to ten years, I think. But I do think that, without again being exceptionalist or treating it as completely different from other industries, it is different, for the reasons Alex was teasing out. So I can definitely foresee, and that's kind of what tempted us into this case.
Some of the things that Steve is talking about, those would be more our nightmare scenarios, you know, again, kind of reading neutrality into the statute. And again, just to go over one other tripwire: people will laugh at me when I say we don't want it to be neutral, because people know what we've done with telecom companies, right? We just don't think, unlike some conservatives, to the point that Steve is making, that Facebook is a common carrier. We do think that broadband providers are. So that's a distinction that is not obvious to everybody, and obviously subject to debate, but that's just where we happen to sit. And that's why we think it's important and vital that in that speech-intermediary zone, not only do these platforms have the ability to leave up third-party speech, but they also have the ability to curate and moderate and take down harmful stuff, so that we don't have the proliferation of lawful-but-awful content that I think we would see even more of. Let's not forget, we see plenty of it today, but we would see even more of it if 230 were radically altered.
Fantastic. We have 12 minutes for questions. Please say your name and who you're with. And you'll be second, Berin.
Yes. Yep, fine. Rebecca MacKinnon from the Wikimedia Foundation. And, of course, we are the nonprofit that hosts Wikipedia and a number of other volunteer-run free knowledge projects, with a very specific purpose and no commercial motive. We would, of course, also be affected by a ruling in Gonzalez, and a strong ruling in favor of the plaintiffs would make it very difficult for us to continue to enable volunteers to build an online encyclopedia according to the rules that they themselves have set for what is, in fact, reliable and fact-based, rules which they themselves enforce, not our staff, but volunteers, so that users are actually setting and enforcing the rules governing their online community. I want to compliment this panel for a really, I think, high-level discussion. But my one concern is that we all have been talking about platforms as if they're one type of thing, when of course Wikipedia is a platform too. You all have been talking about the industry and generalizing about intent and so on, without distinguishing between the behavior that we might understandably want to penalize or hold accountable in some way, and the kind of Internet we want to protect and build. And so I'm curious to hear from the panelists, based on your own positions: are we likely to get a ruling that's going to take into account the fact that the platforms are not all just the big commercial, targeted-advertising-driven kind? How do we ensure, what advice would you give to the justices, or especially for those of you who would like to see some change, how do we ensure that you're not killing Wikipedia as a proxy for killing the ability of communities to self-govern on the Internet?
I think that a ruling that suggests that when Wikipedia brings up a list in response to me looking for a term, that suddenly creates defamation liability, I see why it would cripple Wikipedia. If I go to Yelp and look for a restaurant nearby for an idea for dinner tonight, and Yelp brings up reviews, then suddenly a ruling for Gonzalez means that Yelp gets sued for defamation if reviews are unduly negative. Would it mean that GoFundMe, that the platform, is liable if people lied about the prospects for their investment? And then remember e-commerce; it's not just about social media but e-commerce. The platforms that the peer-to-peer economy uses to let individuals sell products or services to people around the world list items in a recommendation in response to the interest that somebody just expressed. A ruling for Gonzalez says that now those platforms, that eBay, is going to be liable for product liability issues on anything that happens to appear in a list. So, Rebecca, I mean, we completely get it that this has broad implications. And I did hear Justice Kavanaugh and a few of the others in Gonzalez start to wonder about the implications for the economy of following the Biden administration's advice and saying that recommendations and feeds are not protected. Matt, sorry.
I'll try to keep this quick, but I think it's an excellent question. I do think that the justices seem sensitive to the notion that there's more here than just the big corporations, and that we're talking about a ruling that would affect the architecture of the Internet. And that can mean both good things and bad things. But I do think there's an argument for saying that, if the general notion, again, is the division between immunity and liability, one of the terrible things about the status quo is that if you are allowing bad actors to get all the same benefits as good ones, then you're putting good actors at a disadvantage, or at least you're not giving them the same boost. I would like to think that a rule that says, let's change that and actually reward the incentives to do good, would actually mean that Wikimedia does better in some ways, and not just Wikimedia, but other actors that could enter this space without being crushed by Google or by some other major corporation. So I think there's real reason to think that this would actually be beneficial. That's not to say there wouldn't be some cost, because there's a cost to every legal regime. But if there's one that actually does reward the kinds of sites and platforms that are trying to do good and are not simply trying to maximize profit, I really do think a better reading, closer to the text and history, is actually going to be good for that as well.
Thank you. Matt? Yeah, I mean, I don't know that I have a great answer other than to acknowledge how good the question is. I think it's important, and thank you for calling us on that. You know, I'm familiar with those arguments, I've made those arguments, and I think they're true: that we can't think only about the big companies and not think about not just smaller companies, but very differently situated companies. I think, though, the converse is true too. We can't say, well, we would never find liability for YouTube in this circumstance, because it might hurt other companies too. I just want to make sure we're looking at the entire universe and not painting with too broad a brush on any part of it. But if I had to answer your question with the sort of guesswork about what the justices will do, and I'm not the best Supreme Court prognosticator, I would imagine, or at least hope for, a narrow ruling, a narrow ruling that accounts for the facts in this case and doesn't do more than that. But you know, when you give it to nine justices with very different views, nobody, including me, can guarantee that's what will happen. So I think it's an important question, and that's where we wound up. And I'll acknowledge that the way we phrased it is not easy. We did not take up the Gonzalez position; we filed in support of neither party and, you know, have a different question presented, if you will, or a different theory, which I don't think is going to be reached. But it's certainly another basket of complicated issues, for sure.
You want to go to Berin?
Thanks. Berin Szóka, TechFreedom. Professor Franks, much of what you've said here today is that who knows what the Court will do with this. Well, we know that, for a lot of this discussion, as Justice Kagan has said, we're all textualists now. We know the Court will focus on text, or at least try to do that. But most of your brief is not about text; it's consequentialist arguments about policy. So let's just talk about text, because that's what's going to resolve this case. You claim that "treated as the publisher" means only to refer to defamation and like claims. But if that were true, why did Congress bother including subsection (e), which says that nothing in this section shall be construed to impair or limit criminal law, intellectual property law, and, of course, recently, sex trafficking law? If your view of the statute were correct, there would be no purpose to that, because those things have nothing to do with defamation or like claims. So it's very clear, from a textualist perspective, that (c)(1) has to refer to something far broader. We also know that the publisher functions actually at issue in Gonzalez are being read broadly, because (f)(4) defines "access software provider" to include filtering, sorting, allowing, disallowing, picking and choosing, and analyzing content. And whether or not Google and Twitter are access software providers, we know that those are the kinds of things that are included in the publisher functions that are protected for an interactive computer service.
But your argument also turns very heavily, if not entirely, on the use of the "Good Samaritan" heading in subsection (c), and I submit you put too much weight on headings. The Supreme Court has said that headings and titles are not meant to take the place of the detailed provisions of the text, nor are they necessarily designed to be a reference guide or a synopsis; at most, the Court said, they are of use when they shed light on some ambiguous word or phrase. And of course, we know why (c)(1) was grouped under that heading: because the Prodigy decision, treating Prodigy as the publisher of content, created a terrible, perverse incentive for a company that was trying to act as a Good Samaritan. So Congress created two separate provisions that dealt with that problem in distinct ways. But you are essentially arguing that we should somehow move the good-faith requirement of (c)(2)(A) into (c)(1) as it applies there. If that's what Congress intended, Congress would have written that good-faith requirement into (c)(1). And it didn't; it wrote it in only one provision.
Just in the interest of time, can I ask if there's a question? I mean, you're litigating this like you're the Supreme Court; the Court's going to answer this question for us. This panel doesn't need to.
Platforms moderate content to get rid of misinformation, disinformation, and propaganda. And if your reading were right, it's very striking that you're making precisely the same argument as the Trump administration. If it were the case that (c)(2)(A) were the only protection for content moderation, then, as Republican attorneys general have argued in their litigation, whether providers are removing content voluntarily or being pressured to do so would never be resolved, and every content moderation decision would become a litigated question. So I simply submit that your arguments today cut both ways.
Okay, so it's just a minute, right?
I'll just note for the record that Alex is braver than Shane. And would you like to respond to any of that?
I didn't actually hear a question, so I'll just... okay.
That was, for those of you in the back of the room... See Berin later; he had some interesting points. And we are at time, so apologies to this side of the room. You behave yourselves over here; you're being loud, and I can tell you're passionate about it. All right, so if you want to stay for the Congress side of this equation, keep your seat, because it's in the same room. And again, all of this is on audio and available on video, so if you're missing something else that you want to see today, you can go home and watch it tonight. But thank you, and please thank my panelists for a great discussion.