Revising Section 230: What Proposals are on the Table?
6:38PM Jan 26, 2021
Speakers:
Adriana Lamirande
Alan Davidson
Ellen P. Goodman
Olivier Sylvain
Genevieve Lakier
Eric Goldman
Keywords:
section
speech
platforms
first amendment
reform
problem
content
bill
proposals
power
newspapers
repeal
amplification
act
people
question
olivier
law
genevieve
companies
I helped curate State of the Net this year, and I am thrilled to introduce our moderator for the next panel on Section 230, Alan Davidson, who's a senior advisor at Mozilla, and I will turn it over to him to introduce our panelists today. Thank you.
Thanks, Adriana, and thanks for all your work on State of the Net. Good afternoon, everyone, and welcome to our State of the Net panel on revising Section 230: what proposals are on the table? Just by way of a quick introduction: Section 230 was a once obscure provision of telecommunications law, and it has of course in recent years taken center stage in a debate about society's relationship with the internet. President Trump famously called for its repeal, and even threatened to veto a major defense bill over it. President Biden, while he was a candidate, agreed on the need for change. In Congress we saw over 20 bills introduced to change or eliminate Section 230 or address some of the issues that it raises. Last Congress, many people were skeptical about whether Section 230 legislation could pass. This year it feels different: the stakes are higher, the coalitions are broader, and the issues it raises are even more central to our society. So, to paraphrase The Truman Show: how's it all going to end? Repeal, replace, amend? How do we sort through all the many proposals that are on the table? Luckily, here to help us answer these questions we have an all-star cast of experts. Their bios are largely available online, but let me quickly introduce our panel. We've got Professor Ellen Goodman from Rutgers Law School in my great home state of New Jersey, Professor Olivier Sylvain from the Fordham University School of Law, Professor Genevieve Lakier from the University of Chicago Law School, and Professor Eric Goldman from Santa Clara University School of Law. Before we start, just two very quick program notes. First, we will save some time at the end for questions, which we will collect through the Zoom Q&A, so please look for that link at the bottom, and especially towards the bottom of the hour please feel free to send us your questions. Second, I just want to give a very brief shout-out to the Internet Education Foundation. For those who don't know, for really almost 20 years IEF has been offering very high quality, crafted programs that help shed light on the really tough debates surrounding the internet, and I just wanted to congratulate IEF and Tim Lordan on this 17th State of the Net. It's fitting that we're here today as a group discussing what is really one of the very tough problems facing the internet: Section 230 and what to do about it. Okay, with that, let's dive in with our panel. I'll start by saying, painting with a broad brush, there are arguably three main approaches here when we're talking about Section 230. One is to leave things as they are, the status quo. A second is what some have called for, really just an outright repeal of Section 230. And then there's a large bucket of ideas around amending Section 230 or other provisions of law to achieve our goals in this space. So to start with our panel, taking this first piece about the status quo: what do you all think are the most important arguments today for change, for amending or eliminating Section 230? What's wrong with the status quo? I'll start with Professor Goodman, who has written quite a bit about this. Maybe you can start us off.
Okay, so I think what often gets conflated, and it's hard even to know which of these you're asking, Alan, is what's wrong with social media versus what's wrong with Section 230. So let me start with the first, larger question and then funnel it down to the knocks on Section 230. And I think it's an important distinction, because most of what's wrong with social media cannot be solved by reforming Section 230. I think there are three general problems that people have identified with social media. The first is that the platforms are overly powerful and they act capriciously to silence voices or to promote them. The second problem is that their business model is based on massive personal data collection, or surveillance capitalism however you want to put it, and micro-targeting for engagement, so they have design flaws that exacerbate polarization and elevate harmful speech and other harmful information and conduct. And the third problem is that the platforms act irresponsibly, partly because of their business model but also because they're responding to political pressure, and they fail to take care to mitigate the harms that they cause. These problems all point in different directions: sometimes to antitrust or ex ante rules about concentration, sometimes to privacy law reforms, sometimes to liability, specifically Section 230 reform, and also to regulation. Now, of all of those problems, Section 230 is a smaller piece. There are narrower critiques of Section 230, and I think they fall into two categories. One is that Section 230 creates moral hazard and has ballooned well beyond its original intent because of the development of the industry. And the second, which I think is also a more general problem in First Amendment law, and maybe we can hear from Genevieve on this, is that it treats all online activity as free expression worthy of protection and fails to distinguish what is really conduct from speech. So I think those are the problems, and we'll hear from others about the taxonomy, I think, in your third group of reform proposals.
Oh, and can I jump in? Please, Professor Sylvain, go ahead.
Thank you, and please call me Olivier. So, I very much appreciate the way that Ellen has set this out. It's a kind of comprehensive way of thinking about our information environment, and in another way it's an argument that we should slow our roll on 230 a bit. I don't think she's gone as far as to say that, but that's part of the point. I'm among those who've been pushing for reform in this regard, far more than we've seen. And the one thing that I would add to the way Ellen just talked about it is that, with regard to the status quo, there are very few if any other regulatory environments in which entities are immune for causing moral hazard, right, for causing externalities for which they don't have to bear the cost. I'd love to hear if I'm wrong about that, but this is the sort of thing that I think is core and important when we're talking about reform. The status quo is a remarkable situation, where the social costs have never been borne by these companies. Now, this idea arises from a very sensible and romantic conception of what the information market looked like in the 1990s, and the idea that these entities would be able to regulate themselves; the safe harbor is meant to encourage self-regulation. But we are now, in light of the business model that Ellen just talked about, in a different world. And it's taken the stuff we've seen in the past month, really the past year, and I've been writing about this for the past three years, to kind of raise our eyebrows on that, so I want to underscore this. And one more point, Alan. You identified those three buckets, and I don't think one thing I'm thinking about is on the table, but it arguably could be. One of the things I've written about, well before this current discussion about statutory reform, is the role that the courts have in this space. In fact, after the Ninth Circuit, not reformed exactly, but elaborated its view about how to determine whether an intermediary is contributing to the harmful content, the courts have been a little more alert to whether intermediaries are materially contributing, and my argument is that the way they design their services does that. I'm happy to talk more about this, but I say it because courts are becoming more alert to this, and even the chief judge of the Second Circuit, who's retiring, Judge Katzmann, observed as much in his dissenting opinion in Force v. Facebook.
Great points. We're talking about the courts, and the First Amendment is implicated here. Professor Lakier, although Olivier has now put us all on a first-name basis, Genevieve, if you wouldn't mind jumping in, I'd love your views on this question, and also, as Ellen asked, where the First Amendment plays in here.
Oh, okay, great. Thank you. So I just wanted to say something in response to the point that Olivier made about how there is this problem of moral hazard on the platforms. I'm sympathetic to that point, and I should say I'm no fan of Section 230. I'm a First Amendment scholar, a free speech scholar, and I'm interested in thinking about questions of power as they relate to free speech. I think we don't have a vibrant democratic public sphere unless we protect the speech of the powerless as well as the speech of the powerful, and I have a long-standing beef against contemporary First Amendment law because I think it just reinforces existing property and power relations in a bad way. So maybe when thinking about Section 230 it's useful to be thinking in these terms. What Section 230 is, is a grant of power to these powerful tech companies that run these platforms to make decisions about speech regulation, both take-down and keep-up decisions, without facing legal liability. It is intended both to give them the ability to take down, the Good Samaritan idea, but I think there is also a strong free speech overlay, where one of the hopes is to promote and ensure that corporate lawyers' fear of legal liability is not going to lead to overly censorial or oppressive practices by the platforms, and I think we should be worried about that. So when thinking about Section 230 reform, one of the objectives, and what I want to be more a part of the debate than it is, is to think about how to reshape the power dynamics on social media. One of the concerns I have with many of the proposals that are on the table is that they don't do anything to rein in the power of the platforms when it comes to regulating speech. They simply say you will have more legal liability if you don't take down speech, and so there's a lack of parallelism: they're imposing liability if the platforms keep speech up, but they're not constraining their power to take down arbitrarily, without much justification, or in a discriminatory way. So when thinking about the problem of bad speech or morally hazardous speech on the platforms, I think we also have to keep in mind that whatever reform proposals we embrace to deal with that problem have to be paired with some kind of protection against discrimination, censorship, or just bad decision-making by the platforms, and many of the proposals to reform 230 are not, I think, sufficiently sensitive to that concern.
Maybe this is a good moment to also start to bring in the conversation around one of the big proposals that is out there, which is simply repealing Section 230. That's the other end of the spectrum that I mentioned, and it was actually the subject of two of the last bills introduced in the previous Congress and Senate, which were simply repeal bills. Maybe, Eric, you can jump in for us and start us on the conversation: what would be wrong with that? Why not just do repeal?
Thank you, Alan, and thank you to the State of the Net organizers for putting this conversation together. I'm in the school of thought that favors continuing to support Section 230, as opposed to repealing it or reforming it, and there are a few reasons why. I think the most important thing that we ought to consider is how much Section 230 underlies the things that we love the most about the internet, the things that we do on a daily basis, the ways in which it has given us benefits on an hour-by-hour, minute-by-minute basis: anywhere from the social media services that we all use and many of us enjoy, to online shopping and marketplaces, to services like Wikipedia, to consumer review services. These are all things that don't exist in the offline world that Section 230 has enabled. So anyone who's discussing repeal is really saying let's put all of that in play, let's reshape the internet fundamentally, the things that we use on an hour-by-hour, minute-by-minute basis, and potentially let's get rid of all of them. So to me, repeal really isn't a starting point for the conversation; it's really saying let's recast the entire internet in a different format. And I find that conversation particularly odd given that we are in the middle of a pandemic where we have shifted many of our major institutions into an online equivalent, and that online equivalent in many cases depends on Section 230, including this conversation that we're having right now on Zoom, which exists because Section 230 enables Zoom to provide these services to us. So when we're talking about repealing Section 230, we're also saying let's step back the ability to transform our institutions into an online equivalent, let's go back to a physical world where we put all of ourselves at risk with each other. And I think the most important question for anyone who advocates repeal to answer is: what problem do you think you're solving? It's my general view, something I hope to amplify throughout this conversation, that whatever problem people think they're going to solve, Section 230 is actually probably the solution, not the source of the problem. And I do agree with Ellen when she said, whatever objections you have to social media, Section 230 reform is not going to fix them. And really, I think so much of the reason why Olivier favors slowing Congress's roll is because we're not clear what problem we're going to solve, and we aren't confident that we actually are solving it by targeting Section 230. That's the discipline I hope we can get in this conversation: show me the problem, then let's make sure Section 230 reform is the way to fix it.
Does anyone else want to chime in here on this issue of repeal? The reason I think it is important: we came close in December, right? We had a major veto threat of an extremely important defense bill that arguably hinged on an absolute, straight-up repeal, an elimination of Section 230. What would the implications of that be? If you folks could spell them out a little bit more, that would probably be helpful. I see Olivier.
Maybe Ellen, you can go into it.
Well, I'm not for repeal; I am for reform. And I would say, it reminds me, we could have been having this discussion about the chemical industry in the 50s or 60s, right? They do so much good, and we don't want to kill that industry, but they also do a lot of harm. So if you're drawn to the argument that the poster child for what would happen if you killed Section 230, or even significantly reformed it, is that you'd lose Yelp, I do think that's a significant danger. On the other hand, we also just had an insurrection at the Capitol. So that's a good question: if that's the boogeyman we're trying to avoid, sort of mass radicalization and offline harms, would reforming Section 230 do anything about that? There I'm a little skeptical. But what I do think is that I agree with Genevieve about the power relationships, although I do think that some of the 230 proposals do address some of the power imbalances, and we can talk about that. I also think, just as an old regulatory lawyer, sometimes just having the 230 cudgel out there gets a lot more self-regulation. It sort of reminds me of the filibuster in a negotiation: it needs to be out there, that and other things, in order to bring the platforms to the table.
Yeah. I mean, there are actually some bills on the Hill that I think I'm very supportive of, and so I hope we get there and speak to some of the issues that Genevieve raised in regard to power, as Ellen said. On the question of repeal, I'm very wary of such a thing, at least because I worry about startups, and you can envision the burdens imposed on them. On the other hand, on this panel I'm pretty sure I'm the one that's probably the most open to that possibility, because I don't think it's completely wrong. I mean, that's not where I lead, but there are mechanisms in law that would have been available but for Section 230. What the courts decided after 230 was passed, in their interpretation of the language of 230(c)(1), is that distributor liability is also the sort of thing that is protected. Distributor liability would impose obligations on entities that have notice that there's bad stuff happening on their website. So I do think things like Yelp would be endangered; maybe I'm wrong, but I believe Yelp as we know it is likely not to be around without 230. But there are mechanisms in law that existed to address third-party liability, so, for what it's worth, repeal doesn't mean there's a full onslaught of regulation; there's a careful tort regime that exists to address it.
Well, let me follow up with you then, just quickly, as we think about the other legislative proposals out there that might be more in what several have called the reform-not-repeal camp. In fact, somebody in our chat already called us out for setting up a straw man of just status quo versus repeal, and I agree, I think that's fair; there are many, many ideas on the table, and we're more likely to end up with one of those. So Olivier, let me turn to you, since you just mentioned it: what are some of your favorites among these legislative proposals? What do you think are the most promising that we should be talking about?
Well, you know, I'm hopeful that we open this up. Sure, of course, among other things, I was thinking in particular, and I don't know, Alan, if this is what you were thinking about, of the distinctions in the PACT Act, which is Senator Schatz's contribution to the debate. It makes distinctions between big companies and small ones, that is, those that have fewer than a million users and those that have more, and also, as a matter of annual revenue, 25 million dollars is the cutoff point. That cutoff would determine the transparency requirements and the legal process that the companies would have to entitle users to.
Those are actually things I think many people are on the same page about: conservatives, liberals, people who are, you know, absolutist about 230's protections, and those like me who have been very worried about them.
Do you want to jump in?
I mean, I also think the PACT Act is interesting, in part because it's focusing on procedural requirements, transparency requirements, information-posting requirements. I think about how Section 230 is implemented right now: they've created these gigantically powerful companies, immune in certain ways, not from federal criminal law, not from copyright, but in many ways legally immune, that have not been very transparent about their decisions when it comes to speech regulation, that don't give a lot of due process rights, and that have just been free to do what they want. And I have to say I'm somewhat sympathetic to the companies; the scale of speech regulation that's occurring on these platforms outstrips anything the courts have to deal with, and these speech regulation decisions have to be made very fast, all around the world. I think it's actually very difficult for the companies to be doing this in any kind of sensible way. But that only reinforces the need for some kind of legislation that provides some kind of guidance, minimum due process and procedural requirements. So, like Olivier, I think that's very interesting. But that's about trying to take this opportunity to improve the democratic nature of the platforms. What I worry about more are other reform proposals that are simply about cutting certain kinds of speech out of the immunity that Section 230 right now provides. That seems likely only
to worsen, not lessen, to some degree, this real worry that I have about discriminatory and arbitrary takedowns.
Just a follow-up: that would be more like what we saw with FOSTA-SESTA, where they
looked at particular kinds of harms, speech related to particular kinds of harm. People have talked about now expanding that approach, removing or changing the Section 230 protections for things like opioids or child sexual abuse material, and I think there are other categories that have been proposed. What do you think the limits of that are, the implications of that are? Why not go that way? And also I
think there's a proposal out there about terrorist activities, things that violate terrorism laws.
I understand, and it seems to me, given the politics of the current moment after the Capitol invasion, that there's going to be a lot of political energy behind these kinds of reforms, and so this is why I'm totally behind Ellen's idea of slowing the roll. I'm worried that anything that happens immediately is only going to, in some ways, increase the platforms' power when it comes to taking down speech. Imposing due process requirements, transparency requirements, information requirements, requirements to open up their data to researchers: these are going to be costly for the platforms, and I think, politically, it's just going to be harder. But it seems like this is a period when we should be worried about immediate reforms that seem very attractive. And with all of these reforms, I think the FOSTA-SESTA history should give us real pause. There are a lot of criticisms that it didn't solve any problems, that it made the lives and safety of sex workers much worse, and that it just encourages the platforms to use a broad brush when they're taking content down. So I think we should be worried about all of these.
You actually wrote, last year, a terrific taxonomy of sorts of the various approaches that have been put out there, and there's actually been a lot even since that paper you wrote. Thoughts on which of the proposals are most promising, or which others we should be taking note of?
I think there is a new one since then, although the category was in there: interventions with the business model, and especially around, I guess you could say, the question of when the platforms, through their algorithms, are essentially creating content, the content creation versus hosting distinction. So here it's Malinowski and Eshoo's bill. Olivier has mentioned the Force versus Facebook case and Judge Katzmann's dissent, and I think that's what they're trying to implement in this act. Alan, should I give a short summary of the facts?
That'd be great. Please do.
So that was a case where the Force family was suing Facebook because their child had been the victim of a terrorist attack, and the perpetrator of that attack had been sort of introduced to his circle, had been radicalized, through Facebook's friend recommendations. Section 230 applied, and I think Eric has written a really good critique of Judge Katzmann's dissent in that case, arguing it's wrong under existing law. Judge Katzmann said, no, here Facebook is sort of acting as a content creator because they are recommending friends, and that should be outside the scope of Section 230. I think under existing law that's not true, but what this bill tries to do is make that the law. So what the bill tries to do is to say, when a platform is, I think it's algorithmically amplifying or recommending, maybe monetizing, I can't remember if that's part of the bill, then they will be outside the shield. I like the spirit of that proposal because it sort of gets at platform power, it gets at the business model, it addresses the moral hazard around amplification. The problem is that I don't know how to operationalize it. I don't know how to define those terms. I don't know how, as a litigant, you would be able to show that your harm was traceable to the amplification as opposed to some sort of organic spread. And I know that people may claim they know how to do that; I suspect that you would need a regulator, sort of the FTC, to create some sort of
Genevieve, do you mind just talking a little bit, please, about the First Amendment issues?
Yeah, well, I think that's super interesting as well; I really liked that Katzmann dissent. And it's pretty clever, right, because the idea is that we want to immunize the platforms from the liabilities that publishers had, and this is something that traditional publishers didn't do, so it should be outside the scope of 230. It seems like a really clever way of getting at what is novel about the social media environment without forcing us to treat platforms as if they are publishers when they're not, or newspapers, which I think would be the ordinary analogy. But the worry is, and I also think this from a theoretical perspective, if it is in fact the case that they simply have civil liability, or criminal liability, whatever it may be, for causing harm through these decisions, that shouldn't raise a First Amendment issue, because the First Amendment issue is with the underlying laws that they're being held accountable to. But if, in operationalizing it, there has to be some government body that says these are the editorial practices, the platform curating practices, that are okay or not okay, and of course the platforms are going to want to know this so that they can avoid liability under the law, then all of a sudden it starts to look like the government is determining the editorial choices that platforms can make. And under the First Amendment as far as we understand it right now, and I think this is likely how the courts would interpret it, the First Amendment immunizes platforms, as well as other kinds of hosts and property owners who host speech forums, from any interference with their editorial choices. So I think that's a really interesting way of thinking about 230. I think that reform bill does get at platform power and maybe identifies what is particularly harmful about the platforms when it comes to their regulation of speech, but I just think it raises all kinds of difficult First Amendment questions.
Yeah. All right,
If I can jump in here, I'd appreciate it. First of all, I think it's helpful to understand, in the Force versus Facebook case, that similar cases have failed on non-Section-230 grounds. They failed on the statutory elements, they failed on tort principles of causation, and they failed on the First Amendment. And this really gets straight to the heart of what problem you are trying to solve: if you think that amending Section 230 will fix whatever problem you identify in the Force versus Facebook case, you need to understand that you're probably running into a bunch of other problems, and Section 230 reform won't actually help. And I think so much of the discussion that I'm hearing between Ellen and Genevieve really is getting into issues that raise some really problematic questions about the First Amendment. When we talk about things like amplification of content, I don't understand what amplification means; to me that's called publishing content. Anytime that you publish content, by definition you raise the profile of some content in priority to whatever didn't get published. And so, if publishing content is subject to congressional regulation, that sounds like a First Amendment problem. Genevieve even mentioned things like information-forcing processes. I guess my question for you, Genevieve: back in the 1970s, when newspapers had local monopolies in their metro areas, would you have supported addressing the power that they had by forcing them to produce more information, and would that have been constitutional? I'm hoping I can get an answer to that, because I think it really gets at the heart of the question: are we approaching this with the idea that these internet services are something other than publishers of third-party content, which gives us free rein to discuss a whole range of regulatory interventions, when, if we were to apply them to newspapers circa 1975, or to any other publisher of content, Genevieve, you mentioned book publishers, we would say, of course you can't do that, you cannot tell publishers how to run their operations. So, with your permission, would it be okay if you could at least tell me: do you think the solutions you're proposing now are identical to what you would have favored circa 1975?
Okay, several responses. One is, there's a certain kind of nominalism to asking whether this is publishing or something else. The fact is that the social media platforms are regulating speech; they're acting in the speech marketplace, in our media environment, in a different way than newspapers operate and booksellers operate and town criers operate. They're a new technology with new economic and institutional forms, and so they regulate speech in different ways. And I am not a person who thinks that the First Amendment means we must just mechanically apply the same rules to different kinds of actors. The purpose of the First Amendment is to facilitate and protect a vibrant, diverse, and inclusive democratic public debate, so I think we should be attentive to the differences between social media companies and booksellers, which is not to say that we shouldn't take the First Amendment issues seriously; I take them very seriously. In terms of information forcing, I think one reason why Section 230 keeps coming into conversations, even conversations that don't seem really about Section 230, for example the desire to create more due process when it comes to social media, or more transparency, is that it's not really about carving immunity out. The suggestion, and I think this is in the PACT Act as well, is to ask the platforms to open up their information to researchers, or to participate in some kind of digital oversight body. In those contexts, we can think of Section 230 as serving as the carrot or the stick that's trying to incentivize the companies to voluntarily give information, to engage in good behavior, that maybe they couldn't constitutionally be required to do. I think that raises lots of questions about whether Section 230 is going to work in that way, but that's one thought. Would I have thought, to answer your question specifically, of asking newspapers, during the period when they were monopolies, to give information to local regulators so that the regulators could regulate them more effectively? I'll just say, and this is I think a reflection of the fact that newspapers just operate differently, I don't exactly know what information regulators would have needed to have. But I think that the turn the Court took in the 70s, to think that there could be no constraints on the editorial freedom of even powerful newspaper monopolies, was a mistake, and so maybe you and I just disagree about the First Amendment values here.
I'd like to come in here, please. There are a bunch of things that have been said that I haven't had the opportunity to weigh in on, and I'm going to take the liberty of answering part of the question that Eric addressed to Genevieve. It is absolutely bonkers to me to compare the 1975 information environment to what is happening now, and okay, bonkers is not fair; I think it is not correct to do that, because what we have in this environment is something far different. And I want to talk about amplification, and talk about this as a matter of conduct, not speech. There is a kind of bizarre romanticization of the content that is trafficking through these companies, as if it is all speech. Genevieve appears in a great Emily Bazelon piece today where this issue comes up, in the context of the way in which the Supreme Court has enlarged its conception of what is protected speech. But never mind whether this is conduct; let's talk about amplification, which is what the Malinowski bill is addressed to, and I want to talk about something that is in a different setting. With regard to Force versus Facebook, that was a terrorism case, and the underlying statute is the material support statute. If there is no problem here, why not allow 230 to go? The material support standard has never been met by plaintiffs under the underlying statute, so why worry about 230? But more than that, there are a variety of other settings that underscore what amplification entails. I was writing two or three years ago about the automated decision-making systems that these intermediaries rely on to distribute content in ways that have never been possible before, and that speakers themselves didn't know how to make possible. The cases I talked about were in the context of housing discrimination made possible in Facebook's ad manager. How does Facebook do this? They first get a customer list from the advertiser, and then they enlarge it by creating look-alike audiences. That's something that is beyond the pale of anything, never mind the 1970s, beyond the pale of what most people are doing today. And they also deliver this content to people who are likely to be most interested in it, and Facebook is making that decision; no one else is making that decision. That is what amplification entails. That does not look like speech to me; that looks to me like material contribution. What's interesting is that plaintiffs brought a case after The Markup did a wonderful series of studies and pieces from 2016 to 2018, and that led to a case filed by civil rights groups against Facebook's ad manager, and Facebook settled the case. We never got to find out what material contribution looks like, whether amplification like this is allowed. And for me, to associate this kind of commercial activity with what was happening in the 70s is not right. One more thing, Alan: I didn't get to weigh in on what the other carve-outs were. There are arguments about current bills and why there might be problems with carve-outs, but for what it's worth, we see carve-outs in a variety of areas of law. We see it in tax, we see it in copyright.
Among them is, you mentioned the FOSTA phenomenon, the Hirono bill that is floating around these days. Senator Hirono's bill is addressed to civil rights violations, which, by the way, are not protected speech; there are some things you can't say in housing markets, and defamation is always the sort of thing you can't say. But also any activity that leads to wrongful death, antitrust suits, suits for injunctive relief. This is what the carve-outs entail, and that is actually far closer to what is happening in D.C., and I suspect we'll talk more about that later.
Yeah, well, thank you for that. And Ellen, I was going to turn to you as well, so please jump right in.
I want to say something I don't think we've covered, which goes to the newspaper issue but is really about culture. I think law is downstream of culture, and if we look at what the newspaper industry looked like in the 70s, and I think this is embedded in the legislation and also in the Sullivan decision, there were certain assumptions about how newspapers operated. First of all, newspapers had mastheads and bylines, and they separated editorial from the business side, and those practices were not encoded in law, yet they shaped a lot of the information. Also, they weren't directly regulated, but the FCC had ownership policies that tried to recognize the monopoly power of newspapers when they merged with broadcasting companies, and so that was an orthogonal mode of regulation that didn't actually apply to the papers. But the online companies grew up without any of this culture; they come from a completely different culture, not from the media culture, and so it's only been very recently that they've acknowledged any kind of responsibility in the way that newspapers did. Newspapers either acknowledged that responsibility or they were sort of bound up in a regulatory culture which, although it didn't apply to them directly, they operated in the shadow of. So I think, in addition to the distinctions that Olivier draws, there is a sort of cultural distinction.
I want to make sure we have a few minutes to get to our questions, but just really quickly, in terms of things we've touched on: we talked about the pluses and minuses of outright repeal and the concerns there; we talked about the PACT Act and its transparency and moderation requirements; we've talked about the carve-outs as an idea, sort of like SESTA, both the potential First Amendment concerns and, as Olivier mentioned, places where this is not necessarily First Amendment-protected speech; and we've talked quite a bit about amplification and monetization and the Eshoo-Malinowski bill. Is there anything else that folks want to put on the table, just to flag before I turn to questions, as promising things we should all be looking at?
Well, another one we haven't yet mentioned, and it was, I think, the first major bipartisan effort in the past year, is the EARN IT Act. That basically draws on the work of Danielle Citron and Benjamin Wittes: the immunity, or the safe harbor, would be contingent on intermediaries taking reasonable steps in good faith to take content down. There's more to it, but that's the basic idea, and I just thought, to be a little more comprehensive, I'd note that it was the first major bipartisan effort.
Just to clarify, the EARN IT Act did strip out any notion of trying to condition good behavior as a prerequisite; it ended up just being a categorical content carve-out.
And by the way, I was referencing the PACT Act in part for the idea behind it of focusing on transparency and content moderation practices; there are other elements of the PACT Act for sure. Let me see, there's a whole set of questions in our Q&A; we won't get to all of them, but let me pull out a couple. One question was about how, if we're talking about reform, reforms can be enacted at scale. I think that goes both to recognizing the scope of people who would be impacted by the reforms or changes, and also to the enforcement issue we've touched on. Let me throw that in: how does enforcement happen here? Do we have the capacity for enforcement around this? What do we need to do there? Any takers?
I guess I'm not
sure I understand the question fully, but I guess I'd start with the challenge to anyone who thinks that they can tell a Google or a Facebook how to run their content moderation operations: whether they're going to do what you think they'll do based on the regulation, or whether they're going to do something that you didn't expect, or in fact actually the opposite. So for example, if you do a small-business carve-out, Google and Facebook no longer have Section 230 but everyone else does. What do they do at that point? One scenario, the most likely scenario, is that they simply remove content much more freely. Is that what you wanted? Ask that question. That's going to be the lowest-cost avoidance technique for them. But another scenario is that they just shut conversations down; they don't remove content post hoc, they simply shut the conversations down, carve off pieces of their business, and say these are no longer tenable. So anyone who wants to talk about what reform will look like at scale needs to game out the counter-moves that the internet services are going to make, what will be in their self-interest under that scenario. And as Genevieve mentioned, FOSTA-SESTA is a great predictor of this: a bunch of the things that happened were not what Congress thought it was going to get; it was actually the opposite.
Another question from the audience. Oh, sorry, Genevieve, go ahead.
Because I think that's a really important point, and I will just say, when people are talking about Section 230 and carve-outs and how to reform the regulatory practices, I don't know if they're thinking enough about how these speech choices happen on the ground. There are tens of thousands of people employed to make immediate decisions about whether a particular act of speech is going to raise liability for the company or is otherwise against its terms of service. And so the concern I have is not that the companies are not going to take down what is in fact legally prohibited speech, or speech that is defamatory, whatever; I'm not worried about that. I'm worried about all those split-second decisions in which people are going to have to figure out whether or not there's a risk of liability. And so I do think another way in which social media companies are just different from newspapers is the scale of the speech, for good. We have a much more democratized public sphere than we used to have, there are lots more people speaking, and this is all really wonderful, but it does mean that when we're asking any decision maker to figure out whether speech is permitted or not permitted, the scale of the speech flowing through the platforms is going to make these decisions really quick, and maybe quick and dirty, and so we should worry about that, I'll say.
Go ahead, go ahead and look
at the Q&A, but I want to associate myself with the comments that both Genevieve and Eric have made on this. I think it's important for me to say that, by the way, because of how hot these things can be. I completely agree that there are unintended consequences that we might worry about. And when FOSTA-SESTA was written, it looked shaky from the beginning, and people were expressing that concern; it was addressed directly to content, and never mind that there may really be some First Amendment concerns there. So I want to associate myself with that, for what it's worth.
Okay, we have just two or three minutes left, so I'm going to throw in one more question from the audience and then a quick lightning round. The question was about whether there's anything that we can learn from Europe, which has just gone through a multi-year process to propose the Digital Services Act. They do things that include treating different players in the stack differently, so ISPs are treated differently from, say, social media companies. They also have, and some of the bills that we've seen include this as well, different treatment for very large platforms versus small platforms. Does any of this sound good to folks? Anything else we can learn from Europe? Any takers?
I have a comment on that. Two things that we can learn from Europe. One is that almost all of the global innovation when it comes to user-generated content services comes from here, the United States. Europe, I think, has categorically been okay with the idea that they're discouraging that kind of innovation. So if you want a model where we have fewer user-driven content services, Europe proposes that model, but I think we have a better one here; that's my view. The other thing we can learn from Europe is, based on things like the GDPR, how the GDPR actually has solidified the power and market presence of Google and Facebook and hurt their competitors. I think that many of the reforms being proposed in Europe, and those that have been implemented, actually continue to give Google and Facebook greater control, and then that creates the downstream problem: well, what do we do about the fact that Google and Facebook are in control? It all started with the regulatory environment that helped build a competitive moat around their services.
Well, yeah, real quickly. Last December the European Commission adopted some proposals addressed to this, and one of them, which I think we're seeing in the PACT Act in a different form, but it comes from Europe, is imposing an obligation on intermediaries to take down content that a court has decreed is unlawful; under current law in the United States that's not the case. So to me that's kind of a no-brainer. The other point that we haven't yet talked about, and this is addressed to your question about scale, is that the PACT Act, and I like the Hirono bill a lot too, but we've been talking about the PACT Act a lot, creates an opportunity for government agencies, not lay people but government agencies, to bring lawsuits without worry about the 230 defense. And what are we talking about? We're talking about the Department of Justice, HUD, the Federal Election Commission. I could see why this would worry people, for the same reasons we've heard, but for what it's worth, these are public regulating authorities, and either we have a problem with them or we don't.
A quick lightning round to close us out here, and we'll go in the opposite order from the way we started. So Eric, you're up first. If you were advising a member of Congress or their staff, some of whom might be listening, what would you suggest they focus on in the next few months of drafting bills and putting hearings together? Very quickly, less than 30 seconds, Eric.
I think that they should focus on the right factual baseline: we're never going to eliminate antisocial behavior in our society, and so trying to hold internet services to a higher standard is unrealistic and ultimately it's going to fail. I think we should also embrace the notion that we in the United States are global leaders on free speech, something that we gave a lot of ground on in the last four years. And when we target Section 230 and discuss how to regulate publication of content online, what we're really doing is signaling to the rest of the world that that's okay. And the rest of the world is watching us.
Genevieve,
what would you suggest?
Oh, I'm
a free speech person, so I'll give a very free speech answer, which is that it seems to me the conversation about how to regulate speech on social media is still in its early stages; the platforms themselves are changing their views about what to take down and what to keep up all the time. And my concern is that I want the public to be involved and for there to be some kind of public buy-in. So the focus right now, I think, should be on ensuring that there's an efficient flow of information from the platforms to the people and back, so that we can continue to have a vibrant discussion, and that it's not just, which it is right now, these non-transparent corporate players making all the crucial speech regulation decisions for all of us.
Olivier.
In 30 seconds?
So,
I think, you know, I'll stay at the 30,000-foot level, a point I've made elsewhere: what can you do, sitting in this chamber, that engenders a sense of civic responsibility where it's been missing? What can you do? And among other things, you might want to think about the most important kinds of speech acts, if we're going to allow ourselves to call these speech acts, that are consistent with the operation of democracy. And I think this is a nice segue to the kinds of things that Ellen started with, and maybe we should say more about that.
Ellen, bring us home; you get the last word here.
Because this isn't budget reconciliation, it's got to be bipartisan, and so I would say three things. There are two low-hanging fruits: transparency, and data portability, which we haven't talked about, but I think there's bipartisan support for both of those. And then the bigger idea, which I do think there would be bipartisan support for, is to open up the Public Broadcasting Act and make it a Public Media Act, and support high-quality information and other kinds of civic responsibility and engagement.
Well, this was terrific, and I hope everyone will join me in a rowdy round of virtual applause for our panelists. Thanks to all of you in the audience for the terrific questions that came in; they are a terrific record of our conversation in a lot of ways. And again, thank you to the Internet Education Foundation and the State of the Net conference for hosting this, and again to our panelists for such a thoughtful conversation.
Thank you, Adriana.
Thank you, everyone in attendance; please stay on, we have a keynote with Charlotte Willner and Neil Potts from Facebook kicking off in a few minutes.