Hi everyone. Welcome. We're so glad to have you here for this panel. My name is Nadine Farid Johnson and I am the policy director for the Knight First Amendment Institute at Columbia University. And I am delighted to introduce today's panelists. Jess Miers is Senior Counsel at Chamber of Progress. She focuses primarily on the intersection of law and the Internet. She is widely considered an expert on U.S. intermediary liability law and has written, spoken, and taught extensively about topics including speech, Section 230, content moderation, intellectual property, and cybercrime. Jess joined Chamber of Progress from Google, where she was a senior government affairs and public policy analyst. She's a former software engineer who earned her bachelor's degree from George Mason University and her JD from Santa Clara University School of Law. Nicole Saad-Bembridge is Counsel at NetChoice and Associate Director of NetChoice's Litigation Center, where she focuses on NetChoice's litigation and amicus efforts. She specializes in reviewing federal and state legislation that affects First Amendment freedom of speech, Section 230, and AI. Before joining NetChoice, Saad-Bembridge worked as a legal associate at the Cato Institute's Center for Constitutional Studies. She earned her law degree from Georgetown and holds Bachelor of Arts degrees in economics and piano performance from the University of Washington in Seattle. Olivier Sylvain is a professor of law at Fordham University and a Senior Policy Research Fellow at Columbia University's Knight First Amendment Institute. I should tell you Olivier is here in his Fordham capacity. His research is in information and communications law and policy, with his most recent work focusing on online intermediary liability, commercial surveillance, artificial intelligence, network equality, and broadband localism. He was a senior adviser to the Chair of the Federal Trade Commission from 2021 to 2023.
And previously he was a Karpatkin Fellow at the ACLU and a litigation associate at Jenner & Block here in DC. He holds a PhD from Columbia University, a law degree from Georgetown, and a BA from Williams College. Matt Wood is Vice President of Policy and General Counsel at Free Press, where he helps shape the policy team's efforts to protect the open Internet, prevent media concentration, promote affordable broadband deployment, and safeguard press freedom. Matt has served as an expert witness before Congress on multiple occasions. Before joining Free Press, he worked at the public interest law firm Media Access Project and in the communications practice groups of two private law firms in Washington, DC. Matt earned his BA in film studies from Columbia and his JD from Harvard Law.
So we are in a period in which what are arguably the most important First Amendment Internet cases in decades are before this Court, which is why you're all in here today, right? We've seen this Court's take on the First Amendment in recent cases such as 303 Creative and Counterman v. Colorado. And I would imagine that all eyes in this room were on the Court's decisions in Gonzalez and Taamneh last term. Now a pair of cases from the Fifth and Eleventh Circuits, NetChoice v. Paxton and Moody v. NetChoice, are set to be argued in two weeks. The question before the Court is whether social media oversight laws promulgated in Texas and Florida, which prohibit social media platforms from censoring users' content via so-called must-carry provisions and impose stringent disclosure requirements, violate the First Amendment.
So we're going to touch upon those cases and a number of other related issues today. I know we only have a little bit less than an hour, so we're going to try to get through as much as we can, because, as I was saying earlier to the panelists, we could talk about any one of these smaller issues for an entire hour. But hopefully you'll get a nice overview of what's really at stake here. So for all the panelists, I'm going to start with an endgame question to help us set the stage for how each of you is thinking about the issues before the Court, and then we'll work our way back into a few different avenues from there. So for each of you, taken in any order you wish: how do you anticipate the justices will come down in the NetChoice cases? And how does that expectation differ, if at all, from what your ideal outcome would be? Matt, you might start.
As the pinch hitter, I'm not sure I should be starting. But thank you for having me. I'm glad to be here. I'm not Gus Rossi, as you all know; I'm Matt Wood, standing in for him here. You know, I don't know, I'm not enough of a Supreme Court watcher, as much as some people. So some people will say, here's exactly what this justice will be thinking, and how that one will vote. I think really Justice Thomas drives it, and maybe that's the least profound thing any of us could say up here today. But he has certainly been itching to take a look at these issues for some time. I think one of the things I want to talk about today, to the extent there's interest, is not just the dry legal side of it, which is so important, but also the politics of it all. And so the fact that we're sitting here in 2024 saying a conservative justice, whatever labels people like to put on these things, is somebody calling for more Internet regulation, and more regulation of what platforms can do, is an interesting phenomenon. It's not where we were sitting 10 or 15 years ago. Obviously these issues are very complex. They don't really fall neatly on right-versus-left lines; they're very much dividing each of those sides, really. But, you know, I think that the Supreme Court will take some whack at it. I wouldn't want to bet on upholding or striking down the laws, though, because I don't think the other justices' positions are well enough known on this for at least me to make a gamble with this. My ideal would be not to uphold these laws. We actually did not file in the NetChoice cases; we were in the Gonzalez and Taamneh cases last year. But that's not anything reflective of our views. We believe these are actually dangerous provisions, and I'm sure we can talk about why that is. But, you know, I worry that Thomas will drive this, and I'm really not sure how the other votes will go.
Sure. Um, so for the last 250 years, we've known that the First Amendment constrains the government and protects private actors. The founders specifically identified, with the Press Clause, that the act of publication itself needs to be shielded from government interference in order to make sure that personal and political freedoms flourish. Florida and Texas, in defending their efforts to transfer private editorial judgment to the state in the name of free speech, are relying on a legal theory that turns that really fundamental constitutional principle on its head. And I don't think the justices are going to find it persuasive. Just very recently, in the last term, they reaffirmed, in the context of online speech no less, in 303 Creative, that compelled speech, government speech compulsions, are just as constitutionally suspect as speech restrictions. And that's exactly what the Florida and Texas laws are: speech compulsions. And I think the Court's pronouncement in 303 Creative, coupled with Miami Herald and Reno v. ACLU, makes clear what the law requires here, and I think, as Knight and Chamber of Progress said in two very different but equally apt briefs, the Texas and Florida laws can't stand. But, to be sure, Alito, Thomas, and Gorsuch two years ago did dissent from the Supreme Court granting NetChoice an emergency stay to stop HB 20 from going into effect. But they did so, firstly, before examining the full record; secondly, before Justice Gorsuch himself wrote 303 Creative, which the other two joined; and thirdly, before the Court heard Gonzalez v. Google, which is where we really saw them, in real time, kind of come to terms with what social media platforms are doing when they're moderating content: they're exercising editorial discretion. So we're very optimistic for a strong holding in favor of editorial freedom.
Thank you.
So the last time I was asked this kind of question, it was about Gonzalez, and I got it wrong. So I'm a little hesitant to make predictions again. I echo everything that my co-panelists have said so far. You know, I think it really is going to depend: are we going to apply the law, or are we going to play politics? I think it's very obvious that Justice Thomas would love nothing more than to see these Internet companies restrained in their ability to moderate content. But I'm cautiously optimistic, knock on wood. Based on what we saw in Gonzalez and Taamneh specifically, I think the justices understand what's at stake when it comes to our freedom of expression and the way that these Internet services function. And so I think that at least the first question, with regards to whether online companies have First Amendment rights to engage in editorial discretion, I can see going preferably well. Part of what I'm going to be specifically paying attention to is the question around the transparency provisions. I think the transparency discussion is a little bit less obvious, given the First Amendment concerns, and it wasn't heavily briefed by the amici either. There is one excellent brief by Professor Goldman, who's in the room. But I still think that it's going to be somewhat obscure or vague for these justices to be able to connect the dots with regards to the disclosures that are required in the law. So we'll see. I'm not going to take any certain stances this time around. But yeah.
Well, it's gonna be disappointing, because we have agreement, I think, on what the Court is likely to do with regards to the First Amendment issues in these cases. I wouldn't be surprised if the Court turned out differently. But, you know, in the interest of mixing things up a little bit, I would like to observe that Justice Sotomayor, in dissent in the Halleck case, raised questions about whether a private entity should be obliged to attend to public law norms, writing a dissent that you can imagine, in another world, joining Clarence Thomas and Alito and maybe Gorsuch. So I don't think that's where the Court will go. But for what it's worth, there is a logic in the air, on both sides of the aisle, that raises questions about the gatekeeping power that companies have, which is, of course, what is animating the Florida and Texas laws. I also want to just put it out there, since no one has said it yet, that to the extent this is a case about expressive activity, yes, it's a First Amendment case. But I hope the Court doesn't lose sight of the fact that these are commercial enterprises, period. They are not, in my mind, speech platforms in the way that a lot of people like to believe. And if they are commercial enterprises, then they ought to be subject to commercial regulation. That doesn't mean that you can't regulate expressive activity by commercial enterprises; that's what we've done in this country for a long time. But I hope we don't lose sight of that, and I worry that the Court might. I'd love to see a careful opinion that thinks about the ways in which companies extract and exploit consumer information.
And, in complete agreement, I'm really curious to know where the Court ends up with regards to the disclosure and transparency provisions in these laws. I worry, speaking of government regulation of commercial enterprises, that the Court might undermine laws that are addressed to disclosure and transparency generally, in the interest of consumers. So I'd like to see the Court be careful about that.
I appreciate you putting some intrigue into the panel. And we're going to get into some of the questions here, because there are nuances at stake; these are not easy questions at all. So, Nicole, I'd like to turn to you, because Florida and Texas place significant weight on the concept of common carriage as a basis for their efforts to regulate the platforms, and of course NetChoice argues otherwise. I'd like to hear your perspective on that question. Why is this not a common carrier case, in your view? And to explore a particular slice, you mentioned editorial functions, editorial freedom: is there an opening for the court to find the platforms to be common carriers for some part of their function, for example with respect to the hosting function, as has been proposed by a number of academics? If we accept the idea that traditional common carrier rules do not automatically raise First Amendment concerns, and if some kind of bifurcated, if you will, functionality approach comes to pass, would the platforms be able to carry out editorial functions? And what would the implications be, in your view, for speech online?
So to answer your first question, the NetChoice cases are not common carrier questions because they're about speech and the publication of speech, something the First Amendment prohibits government interference with twice over. So to try to get around the First Amendment's constraints, we see Texas and Florida try to invoke a sort of expansion of a common carrier doctrine, one that's been applied to entities like meatpackers and ferries and freight carriers, to impose forced-hosting obligations on the platforms. But critically, none of those industries traffic in the business of protected speech, and I think that is the really critical distinction here. Another thing that Texas and Florida are doing at the eleventh hour is trying to position themselves before the Court as these sort of civil rights warriors. So they go as far as to analogize HB 20 and SB 7072 to the Civil Rights Act of 1964. But the Texas and Florida laws would force platforms to host racist speech, while the Civil Rights Act of 1964 prohibited delis from denying service to people of color. And I just really don't think the Court is going to be fooled by this conflation of the word "discrimination," especially not Justice Kavanaugh, who, when he was a judge on the DC Circuit, wrote an opinion that specifically warned of the danger, and even the absurdity, of regulators trying to unilaterally expand their authority by crying "common carrier." Indeed, as he said then, if Florida and Texas can seize control of private speech platforms because they're suspicious of Silicon Valley ideology, what's next? Will California take control of Fox News?
I don't know, maybe Alabama takes MSNBC; and as one of our amici highlighted, book publishers might be next. With regard to the question about bifurcating their functions: look, it's important to remember that if we were to impose must-carry on the New York Times, on the simple act of publishing speech, what some refer to as the hosting function, it would be totally unthinkable, and the Supreme Court has been extremely clear that First Amendment protections don't differ based on what medium the speech occurs on. But were that kind of bifurcated common carrier mandate to apply, I think what we would see is these popular speech platforms suddenly being flooded with the kind of stuff that you see on Kiwi Farms and 8chan. And a very broad array of amici, including the NAACP, explained that that is what will happen, and if it does, it will make the social media platforms unusable for most.
Thank you. I want to pick up on the characterization you offered about the states' briefs invoking the civil rights theme, and turn to Jess, because Chamber of Progress noted in its amicus brief that the Florida and Texas laws will enable extremist content to thrive at the expense of civil discourse and space for marginalized voices. And as Nicole noted, the states' briefs characterize the laws in part as an effort to defend against discrimination; they even invoke civil rights themes to support this claim. So my question for you is: is there a universe in which the type of authority that the states are trying to impose actually supports protection of marginalized voices online?
It's a great question. My answer is no, for several reasons. First, I find it quite rich that the two states involved here have now shifted their arguments to talk about protecting marginalized voices, when these are two states that have consistently made efforts to, one, ban books on race and race-conscious decision making; two, ban access to information on LGBTQ+ resources; and three, force women to labor over miscarriages and carry pregnancies to term. So I don't buy the arguments from Texas and Florida, and again, their arguments have shifted quite drastically. I think what's actually happening here is, as Nicole was talking about, we're seeing the states conflate discrimination concepts. When they were first briefing this issue, they talked about discrimination as pertaining to the message of the speech, the viewpoint of the speech, not who the speaker is. And I think that's obviously a very important distinction to draw here. Now we're seeing those two concepts conflated when we're talking about the Internet services. The Internet services have content moderation policies because they are specifically moderating on the basis of the message, not the person. So again, we're talking about viewpoint and the content of the expression. I completely agree with Nicole: this is about ensuring that heinous speech remains online. And how do we know this? Well, both Texas and Florida, when these laws came out, championed them as laws to push conservative ideology at the expense of, sort of, eliminating the Silicon Valley liberal approach to the Internet. Let me call out a couple of cases that we've seen recently that are championed by the conservative right; it gives you a good idea as to the kind of content that they would like to keep up on the Internet to ensure that we are boxing out and keeping out actual marginalized voices.
You have a case against YouTube, for example, in which a conservative reporter was suing over the fact that YouTube demonetized and removed videos of her pushing conspiracies against Jewish people. Federal Agency of News v. Facebook, another case, has to do with keeping Russian propaganda online. And then we have the Murphy v. Twitter and Wilson v. Twitter cases as well. One was about misgendering and deadnaming trans folks on Twitter; the other one was about being able to use LGBTQ+ slurs. In the slurs case, the plaintiff actually argued that he was being discriminated against as a heterosexual male. So that's the kind of content that these states, I think, are hoping to see online. Let me be very clear about something, though: these services, you know, the reputable, legitimate ones, are not going to allow LGBTQ+ slurs. They're not going to allow the kind of hate speech that we were talking about earlier. It's horrible for their brand; it's horrible for their users. The likely reality of this is actually just going to be to remove content entirely, to shut off all voices. And so to argue that this is all about ensuring that more voices are online, and that marginalized voices are the ones that stand out: that can't be the case if they're not able to speak in the first place, or if they're driven entirely off the platform because it's unusable, because it's become a cesspool.
Nadine, can I jump into this conversation? So I too am struck by what strikes me as a disingenuous invocation of the history of anti-discrimination law in the United States by the states here, and I agree there's a kind of conflation going on. But in the interest of mixing things up, I think that the language of discrimination, apart from what the states are doing, that all kinds of people are concerned about sounds in competition law. That is, there's a worry that these large companies are discriminating, that they're gatekeepers, and that they're making it hard for people to participate in conversation based on their own gatekeeping position. So I just want to make sure we're clear about that interest, because I do think we can get distracted by what is, to me, clearly a disingenuous invocation at the last minute here. And by the way, it is not uncommon for people in this space to invoke advocacy on behalf of marginalized groups to advance their claims, whether we're talking about 230 or the First Amendment. So I tend to be skeptical about that generally. So I just want to make that observation. Also, I think Jess and Nicole are right to the extent there are distinctions in the cases between entities that are compelled to say something that they don't want to say and the kinds of things that are at issue in this case. So the FAIR case, for example, that's the military recruiter case, and Pruneyard, which the conservatives like to invoke: those are not raised here, right? That's not an imposition of speech on these companies. I want to make sure we don't lose sight of what is an interesting problem here, and that is the gatekeeping role these companies have.
Right. Olivier, I'm going to stick with you actually, because earlier you mentioned the commercial enterprise nature of the platforms. And you have written extensively and thoughtfully about the need for reform in the realm of platform regulation. You've also noted the courts' narrow approach to examining platforms' interests, focusing on the fostering of expressive online engagement and discourse rather than, for example, the pecuniary or commercial interests. So do you believe that this approach that the courts have taken will affect their First Amendment thinking in the context of NetChoice? And if so, to what end? And if not, how should the Court be thinking about the platforms' role, positioning, and power?
Yeah, I don't know if it's going to change their view with regards to the First Amendment analysis. You know, Justice Kavanaugh wrote the USTA opinion I think you were invoking, and he wrote the Halleck opinion. I think that's where the Court is going to end up on these First Amendment questions. What I hope, which is not just a First Amendment problem, is that the Court is alert to the commercial endeavor that these companies are engaged in. And, you know, this is a far more interesting question for me also in the context of the 230 debate, because we ought to focus on the extent to which these companies are allowing the delivery of unlawful content, content that is actually dangerous and against the law. And we don't have sufficient protections for consumers in that instance, in contrast to the protections for the companies. I don't think the First Amendment is the vehicle for that. I do think it is 230 reform.
Great, and we will actually get into a little bit of a policy discussion on this as well, in terms of, if it's not via the courts, what other avenues do we have? So, Matt, I'd like to turn to you. We talked a little bit about the transparency rules at issue here that were set up by Florida and Texas in the statutes. What are your thoughts on the notice-and-appeal requirements up for review? Do you think they're overly burdensome? Do you think they represent maybe a workable approach for transparency? Or would you see a third way?
I mean, probably a third way, if I could take the easy way out for a panelist. I would agree with Olivier that we don't want to see a regime where, you know, any kind of regulation of companies is out of bounds, even when it touches things like this, like transparency about their policies, right? I think we've seen this at the FTC, somewhat successfully, over the years: tell us what you're doing, and then we're not going to say you can't do it. It's really more about the transparency piece and whether or not the company is following the acceptable use policies it puts out there. Now, of course, that's very different from a law that says you must protect certain kinds of viewpoints and, you know, report on whether you're taking down content that the state has ruled to be protected or unlawful in different ways. So I did not read all the briefing as I heard Professor Goldman did, and I don't know that others have; as Nicole said, maybe the Court hasn't gotten as much briefing on this as it needs. This one does seem like more of a toss-up to me. I just want to go back, though, to the previous pieces too, and note, for one thing, that it's fun to be up here as a nonprofit group and have Chamber of Progress and NetChoice be even more fiery than I am, because usually that's supposed to be my role. But I completely agree about the disingenuous nature of the claims being made that this is protecting civil rights. I mean, that's why our group has, as some of you may have noticed, been walking this balance beam over time, saying we actually do want to have common carriage for certain kinds of communications platforms and not for others. And here we do actually see these companies moderating speech, taking things down. So I think what gets conflated here is not just people trying to falsely claim the mantle of civil rights law, but conflating the different kinds of non-discrimination law.
And I always hesitate when I throw this out on a panel, because it feels like multiple law review articles. But I would say there are at least three kinds of non-discrimination law. There's common carriage, which we're talking about here in some respects; there's protected categories, right, civil rights law; and then there's antitrust, which is not discriminating against your competitors. And that's where I think we get into trouble: when people want to mix those things up and then throw a dash of the First Amendment on and say "you're censoring." Well, no, only the government can censor. So the transparency part, to try to get back to the question you gave me, is, I think, lesser than some of that. But that's the kind of overreach that we're afraid of seeing. Last thing I would say, just in response to Nicole talking about common carriers and speech: we actually, of course, have had common carriage regimes applied to entities that carry speech. And forget the Internet, right, let's not even get to net neutrality; let's talk about telephones. I think very few people would argue that telephones are not common carriers, to the extent we still have plain old telephone service. So, you know, there have been cases on that, there have been cases on must-carry, there have been cases on the Fairness Doctrine. It's not that somehow common carriage is wholly inapplicable in the speech context; it's just that it has to be very carefully done. And I think that's exactly what these states have not done here.
I think that you'll find, in the case of the telephones, that that had to do with nonpublic, one-to-one transmission, so it was not a case of publishing. And there certainly was not the expressive act of content moderation that we see on sites as wildly different as, I don't know, hunting forums and Christian sites and Twitter. The act of curating an online community is expressive, and we don't see that in the context of the telephone. So I do think this speech application is different.
Yeah. To the extent it's not clear, I wholly agree. I think they're very different services that are being offered. It's just that it's not completely out of bounds to say, well, we want to look at carriage of speech in some contexts. And, you know, it's unfortunate, but under certain tests, common carriage is kind of a "you know it when you see it" standard as it has been put forward. I mean, there have been cases where the FCC has said, well, we're going to require you to be a common carrier not just based on what you've held out to the public. But I do think that they're very different services. That will not be satisfactory to everybody, but that's certainly our view.
Well, I mean, I do want to say I offer these views because we want to complicate things a little bit here. There are cases involving common carriers and the speech that is carried over them. I think of a network neutrality case from two decades ago involving AT&T's blocking of NARAL's messages. This does happen in the context of traditional common carrier infrastructure, so it's not completely different. But I do want to double down and agree that the kind of common carrier we're describing is not the sort of thing that I think the market for what I call moderation looks like. You have infrastructure, telephones and ferries, infrastructure that is essential for the function of a market. I would not elevate these companies to that level. These are commercial enterprises, pure and simple, and they compete in a market for viewer attention. That's different from what telephone infrastructure is.
Before I change course, I'm going to offer any of you the opportunity to comment on the transparency rules, if we want to bring that up. All right.
All right. The general transparency provisions aren't exactly before the Court; it's only the provisions about individualized explanations of why stuff was taken down. The transparency requirements that Professor Goldman has been leading the charge on, with wonderful scholarship, are not before the Court in this round.
Thank you. All right. So this is another one for all of you. There are a number of other significant cases on the docket this term relating to platforms, including Murthy v. Missouri, the jawboning case concerning allegations that the Biden administration pressured the platforms to remove certain content. How do you see the Court addressing Murthy, and I'm asking you to put your predictive hats on, particularly given our discussion on NetChoice? And what relationship, if any, do you see between these cases?
So NetChoice filed an amicus brief in support of neither party in this case, along with Chamber of Progress, our co-plaintiff CCIA, and the Cato Institute, because we believe that proxy censorship is still censorship. But I really think about the NetChoice cases and Murthy, and also NRA v. Vullo, as two sides of the same coin. Both concern the permissible scope of government interference in private editorial decisions. One, the NetChoice cases, concerns formal government action: legislation, executive enforcement. And then Murthy is about nudge-nudge-wink-wink threats, or implied threats, of adverse regulatory action. And I think that how the Court finds in these cases, the rules that it issues, are kind of codependent. So if, on the one hand, the Court finds for NetChoice that state efforts to commandeer private content moderation violate the First Amendment, but the government can continue kind of going around whispering in platforms' ears in a coercive way to achieve what Florida and Texas want to do, then whatever rule it issues in the NetChoice cases will be hollow. On the other hand, if the Court finds against NetChoice, that somehow forced hosting under crippling civil penalty does not violate the First Amendment, it seems implausible to me that a tersely worded email could possibly violate the First Amendment. So I think that these are intertwined, and what the online speech landscape looks like after them really depends on the interaction between the two. We were really excited to see the Court take the jawboning cases with the NetChoice cases.
Yeah, I've been struggling with how the court is supposed to detangle this. If the court decides that, okay, yes, there was coercion here, and the government can have no involvement whatsoever in content moderation, how can we then have state laws that involve themselves in the content moderation processes of these companies? So I think that dynamic is definitely at play. I have the same concerns that I have with regard to the transparency pieces. I'm concerned that there are going to be a lot of dots for the court to connect here. One big one being that, look, the government talks about content all the time. I mean, what about all these hearings that we're seeing these companies dragged into? Can that now be used against the companies whenever they make a decision that's either favorable to what someone said at a hearing or not favorable? So, and we made this very clear in our brief as well, whatever the court comes up with here, I think it's crucial that it remains the case that these Internet companies are not considered state actors, that they're still considered private actors, private entities that have a First Amendment right to continue moderating, despite what, you know, Ted Cruz says on Twitter, or the other threats that we've seen so far.
The court does have an opportunity to be clear about the difference between coercion and persuasion in the doctrine, to double down on an approach that it has adopted elsewhere. You could probably talk about this, Nadine; the Knight brief talks about needing clarity on that. Because, you know, the bully pulpit is a thing we honor because it is part of the democratic process, right? Presidents are elected to persuade, and they are attentive to the concerns of the public. The surgeon general can issue a report that expresses concerns about the ways in which social media are proliferating harmful data and information about public health. How far does this go? Can governments, can the President, can the Surgeon General engage in conversation with people who are affecting consumers? Absolutely. But we do need some clarity about what that line is, and I'm hopeful that's what this case sets out: an opportunity to make a totality-of-the-circumstances distinction between what is coercion and what isn't. And, you know, as with many of these other cases, this is also an opportunity for the courts to start thinking critically about what these companies do.
Yeah, and I think I would agree with Olivier, to some extent. There must be a difference between communication and coercion. I think, you know, if somebody puts out on some platform, whatever power the government has to regulate that platform, that Election Day is on a Wednesday this year, and the election board calls up and says, no, it's not, it's still on a Tuesday, I don't think that's coercion. But of course, it's not just based on the substance of what's being communicated. It's based on the threat. Is there a threat: if you don't take this down, then bad things will happen to you? That's what we see former president and current candidate Trump saying he wants to do, to take things to that level and say, if you say the thing I don't like, then we'll take away your license. But I think that just communicating that there's something false, or at least something questionable, on your platform does not rise to the level of coercion without more. I think the way these cases are connected, though, is that it's not just these cases, obviously; it's the 230 regime as well. You've got this funny unification of positions, right, where Biden and Trump both said get rid of 230. Democrats tend to think that if you get rid of these protections, more things will come down, more bad speech will come down. Republicans tend to think that if you get rid of these protections, more speech will stay up. I usually say they can't both be right. But maybe they're both right, to some extent, because there'll be more chaos, and you'd have more of a mixed bag, or potentially even the same results we get today, just with a different landscape.
But just to finish here, I think the way these cases are connected, and this is why it's so dangerous when you have the government making these determinations about what must stay up, is that we've all agreed throughout the years, whether it's talking about net neutrality or now platforms, that this doesn't protect unlawful content, right? Some of these laws even speak to that. But it's very dangerous when you have the government deciding what is lawful or unlawful. So Texas not only had the bills at issue here, but, I think it was HB 2690, I was just trying to look it up and I'm not sure I remember correctly, in connection with its abortion restrictions, Texas was trying to criminalize communication of information about abortion medication, even from speakers outside the state. So I think that really tests this bargain, this notion that somehow what we're doing here is just neutral if we're saying, well, don't discriminate based on viewpoint, when the very same state is saying, we want to make it illegal to talk about certain topics. And once we, the state, have decided something is unlawful, well, then suddenly it falls out of this viewpoint protection, because we get to decide those things.
That's a really important point, and the politics of it are obviously front and center. So I'd like to shift a little bit and talk about the fact that both the political left and the political right have taken a strong interest in platform regulation. They define it differently; they want to look at it from different angles, but it is there. And that's not only at the state level, but also in Congress. Last session, Senator Warner introduced the SAFE TECH Act, which would have limited Section 230 immunity. I'm sure many of you watched the Senate Judiciary Committee hearing last week, where tech CEOs faced sharp questioning from both Democrats and Republicans who sought to establish which companies would endorse bills they had promulgated that were ostensibly aimed at improving child safety online. Many groups have expressed concern about those bills, and about regulation more broadly, from a First Amendment perspective. I'm going to start with you, Olivier, but I'd love to hear from all of you: what do you see as the path forward for regulation?
I think I want to disagree. I have been happy that we've been in agreement mostly, but Matt, I want to push back on the observation you made about the political equilibrium. The SAFE TECH Act and the PACT Act articulate a compromise view between parties about how to attack the problem of content moderation, even though they come from different places, and I don't want to understate that. So the PACT Act attempts a legislative settlement. It has all these transparency provisions, for example, and it creates a regime for consumer appeal of moderation decisions, but it also gives civil law enforcement agencies the ability to bring cases in a way that 230 would not otherwise allow. And, you know, executive governments exist to enforce the law. The SAFE TECH Act carves out areas of law so that 230 is not a block, say, for a civil rights case against a company, or a cyberstalking case, or an antitrust case. This is a legislative settlement between parties that are otherwise at odds, because there is concern about the impact these companies are having on our public lives while avoiding liability for unlawful content. That's what these laws are about: holding these companies accountable for unlawful conduct, trafficking and unlawful content.
Yeah, so I'll be the one to push back here a bit when we're talking about Section 230, just to start. First of all, the majority of Section 230 cases these days are actually not turning on Section 230. They're being thrown out very early, or they're turning on other matters of law. If this were four years ago, I would say, yes, Section 230 has been very successful in staving off a lot of frivolous litigation. Today, that's really not the case. And I'll have to point out the obvious here: when we're talking about unlawful content, Section 230 does have an exception for federal criminal prosecution. So when we're truly talking about unlawful activities, that isn't going to be a Section 230 question anyway. So with these calls for 230 reform, I always have to go back and ask: what do you expect to solve? What is the problem that we're trying to solve here? Because if we get rid of Section 230, the First Amendment is still in play. I mean, the NetChoice cases are centered entirely on the First Amendment. We have other questions of law at play. So I question the regulatory strategy around Section 230 specifically. And then I'll also go back to the discussions we were having with regard to common carriage regulation. I'll state strongly here: I don't think common carriage regulation makes sense for Internet companies. We said this in 1996. In Reno v. ACLU, the Supreme Court said the Internet is a wholly new medium of worldwide human communication. And there is a point to that: it's different from the telephone companies and from traditional common carriers.
If you get kicked off of Twitter, X, whatever we're calling it these days, you can still find a community, you can still have a voice online. There are many facets of the Internet where you can put up your own blog, you can join a community on Reddit; your voice can still exist. That's not like an Internet access provider, for example, where if they kick you off, that's it, and if that's the only access provider in your area, you're done, you don't have access to the Internet anymore. That's not the case here. So I don't think the path forward is 230 regulation. I don't think the path forward is common carriage. Again, we need to go back to the question of what problem we're actually trying to solve here. Because at the end of the day, the way these different Internet companies exist, with their different content moderation policies and the extensive transparency reports they already put out, I think what we'll find is a sort of already-existing happy medium that is a reflection of us as a society. And if we want to change the way that our society is operating, then we need to address those underlying problems with society, not the forum in which we communicate about those problems.
Can I answer the question of what is the problem we're trying to solve here? The problem we're trying to fix is the fact that there are companies that do not have to attend to their civic responsibility. Law engenders a sense of responsibility; that's kind of a social contract concept. This is one area in which there is a protection from abiding by civic responsibility. That's the bottom-line concern. There are material harms that these cases that are percolating up aim to redress. The companies invoke Section 230 where the facts are horrible, right? And we don't even have to go that far. The civil rights cases, true civil rights allegations against, say, Facebook's ad manager, something I've written about and others have written about, underscore that these companies will invoke 230, or avoid litigating in a court, for fear of what will happen if 230 is narrowed. That's what's at stake: material harms, and the responsibility to attend to civic responsibility. That's an obligation every company owes, or ought to.
I'll respond to that as well. Even when we're talking about the discrimination cases, right, the Facebook HUD discrimination case didn't turn on Section 230; that was settled. And when we're talking about unlawful content, even today, take the Neville v. Snap case, a lot of these cases turning on discussions about fentanyl and access to that kind of information. These courts are not permitting a Section 230 defense anymore on those kinds of topics. So the only pushback I have here is with saying the solution is to reform 230. We're kind of past that, at least with where the courts are today. It seems like sort of a moot discussion.
I don't think it's moot, because there are so many aspects of the public law that have not been evaluated, that have not been elaborated, because Section 230 has been invoked. So the Gonzalez case is a really interesting case from last year; it's the Section 230 case that has come up already, a case in which people allege that the social media companies (it's a couple of cases, right, but principally the argument) are allowing the distribution of content that promotes terrorism, so they're aiding and abetting terrorism. And the Supreme Court was asked to decide whether or not 230 shields companies from being held accountable for that. As I just said, I think many of us didn't anticipate where that opinion would end up. What the court decided to do is say, we can't figure out the 230 issue; let's actually sort out the aiding and abetting question and resolve it on that. I find that to be promising. It suggests 230 is not an easy way out. Now we actually have to attend to the underlying potential harms. And I agree that in a lot of these cases the social media companies, YouTube, should not be held accountable for aiding and abetting, but at least we get some law on this. We are not getting law on substantive public law because of Section 230 protection.
I disagree with that entirely. Again, the majority of these cases are turning without even having to evaluate Section 230. Let's take Gonzalez, for example. The speech at issue in Gonzalez is heinous, but it's not unlawful. So what do we do with that? What recourse do we have for speech that is legal, that the companies are using their First Amendment rights to host and to moderate? What's the recourse?
You're getting ahead of it, in my view. So you say it's not unlawful, the underlying conduct, the conduct at issue, the material, the content. The material that was allegedly aiding and abetting, right? And you say it's not unlawful, the content.
I'm not adjudicating the aiding and abetting; I'm more addressing the content itself. And if we look at it, again, we're talking about the United States, where there is a lot of content that is legal speech. Professor Goldman and I looked at 60-plus cases a couple of years ago, speech-based, content-based cases, and the majority of them, again, turned on First Amendment grounds or, you know, poor pleading issues at the beginning, et cetera. But my question remains: what issues does the current law not address?
Well, you mentioned the HUD case, which maybe we're talking about too much. The HUD case was a case Facebook desperately wanted to settle, because they didn't want the 230 issue litigated; they invoked 230 in that case. So there are harms, material harms, that are based on unlawful speech acts, and my only point is that 230 throws up a block to that.
In Vargas v. Facebook, Facebook invoked Section 230, and the court threw it out. Yes. So we're having those discussions today.
We're not having that discussion, actually. Go ahead. Go ahead. Why don't you go ahead?
Yeah, I mean, there's something compelling in saying the law is supposed to be about engendering civic responsibility, obviously. But the question still is: to whom do you owe a duty? Who owes that duty? Who's going to go after whom for violating it? So it's a very hard dance to do. But we've tried to do it, to say yes, when it is the platform's own action that is being called out, then 230 should not be a bar. Obviously, that's going to be called into question when they're saying, well, we're just curating somebody else's unlawful content. You mentioned the PACT Act. We've not fully endorsed that, and I don't think it has really come back around again. But I'm actually intrigued by that notion of having courts decide when and where we can find that the platform has some knowledge of the harm it's causing. To say that there are some, I think I would call them good-faith, bipartisan attempts to find common ground does not mean that every bipartisan attempt is a good-faith attempt. And so that's where I would stand: there have been some explorations of this. I think Rep. Beyer said earlier that he thinks there are 190 AI bills in Congress right now, and most of them are good. I would probably flip that ratio and say probably most of them are bad, because laws are not easy things to write. There's a lot of ground we can cover under the existing law. And again, it's just not simple to say there ought to be somebody who's responsible for bad things happening, because the question still is who, and on what grounds. So I would characterize Gonzalez, and I used a pinch-hitter metaphor earlier, so I'll switch sports here, as a complete punt: to say, no, we're not going to reach the 230 question here, because they have failed to state a claim.
I think that's the basic holding there, and that might be unsatisfying to people, too. But I think that's really what it turned on: not deciding that they ought to go past 230, but saying we're not even going to get as far as 230.
They didn't want to do the 230 analysis. The briefing and the argument were not perfect, I'll put it that way. So I don't think they had a lot of clarity about what to do with 230. But they knew to turn to what you just said: who owes a duty to whom.
I think one of the biggest questions here, right, is that accountability question. And actually, I had a question for you, Nicole. Let's assume for the moment the Court declines to uphold the Florida and Texas statutes. From your perspective, is there action from Congress that would be workable to address the concerns that do exist about platforms, about content that does continue to proliferate on those platforms?
Yeah, so speaking of odious content that keeps flooding the Internet, the Texas and Florida laws are really must-carry mandates; that is really what would create that problem, and under a crippling civil penalty at that. So hopefully we dodge that bullet and the court finds in favor of editorial freedom. But okay, the regulatory landscape after the NetChoice cases. Congress would be unable to take action against private content moderation, meaning it couldn't interfere in editorial judgments, force removals, or force content to be left up, for the exact same reason that Congress isn't allowed to effect book bans and Congress isn't allowed to make us all drive around with Live Free or Die license plates. I also want to address an idea that, you know, if NetChoice wins, then social media platforms exist in an unregulated bubble, which I think Knight touched on a little bit in its brief. I really don't think that's true. What we're asking the Court to do is reaffirm that the First Amendment applies with full force to the Internet. Antitrust, consumer protection, and certainly privacy, which NetChoice has been asking Congress for for about a decade: none of those are precluded under a favorable holding for NetChoice. What is precluded is a content moderation law masquerading as a privacy law, which is what we see in states like California with the Age-Appropriate Design Code Act. But if Congress is thinking about taking some action on online speech after the 2023-24 term, I think it should consider passing a jawboning statute.
It's well within Congress's power to impose liability on executive officials for seeking to browbeat social media companies, in a constitutionally impermissible fashion, into taking certain editorial actions. I think a good bill would seek to impose that liability on government efforts to get platforms to leave stuff up, which should be primarily the concern of the Democratic Party, at least in today's political climate, as well as on efforts to get them to take stuff down. I think that would be a worthwhile effort for Congress to consider.
So we have about five minutes left, and I would like to allow for some questions. But let me ask the panelists first: anything more to say about potentially meaningful oversight from your perspective, before we open it up to the audience? All right. Great. Thank you.
Awesome. Hi, thanks, everybody; Maskull from Reason Foundation. A lot of what I wanted to say has already been said, so I just want to share a 230 perspective for the benefit of a group that is mainly First Amendment-focused people, which I think is helpful. It seems like there's a decent amount of agreement here, more than normal, I'd say, about how the Supreme Court will decide this. I guess the question I have is: does the other side of this case think its arguments are legally plausible at the Supreme Court, or do they see this really as a piece of performance art, on the assumption that the system is rigged against them? And is there a distinction between those two things?
I mean, it's frankly seemed a bit performative to us from the start, from 2021. Texas and Florida are riding two cases, PruneYard and FAIR, so hard, and I don't blame them, because they've passed laws that are kind of impossible to defend. PruneYard is a wild aberration from the rest of the Supreme Court's compelled speech jurisprudence, and I really don't think that the court is going to be willing to extend it far beyond its facts to expressive platforms whose business is publishing and curating speech. I think it's implausible. And I don't know what's in Texas's and Florida's heads, but it certainly seems implausible to us.
Thanks, my name is Courtney Radsch. I'm the director of the Center for Journalism and Liberty at the Open Markets Institute. Could you talk a little bit about how you reconcile the seemingly hypocritical views of claiming publisher rights, equating Facebook, for example, with the New York Times, while also gaining protections from the responsibility to moderate, or from defamation liability, et cetera, that publishers like the New York Times do not have? Platforms, I think, to your point, are certainly new in terms of the levels of protection, unprecedented protections, that they have from existing legal regimes. And then, second, we're really talking about social media, but there's another really interesting online expression case, the New York Times case against OpenAI, which will have implications for generative AI and the expressiveness of that type of content, and which affects online expression, because we're seeing now how people and organizations are deciding whether to make their content open or free or not. So could you address those two, if there's time? Thanks.
I'll take the first one. Yeah, these are excellent questions, thank you so much. I don't find it hypocritical. We have seen that argument, that the First Amendment and Section 230 somehow don't square when you're comparing, for example, newspapers and online companies. I don't actually buy that argument. I think the First Amendment says that these companies, private entities, have a right to editorial discretion. That's what the New York Times does, and that's what the Internet companies do when they decide to curate speech. Where Section 230 comes in is that it's an exceptional law recognizing that, while these Internet companies are akin to our offline publishers, like newspapers, they are also unique from offline publishers in really important ways. And the one really important way has to do with control of the speech. The New York Times, for example, is never going to publish something that the New York Times has not first vetted or put through its own editorial processes. That's different from Twitter, or any social media experience, where, if I put something out there, there's not somebody on the other end of Twitter looking at my post before it goes out. That would break the entire way in which the modern Internet and social media work. So again, these services are akin to private publishers in that they use editorial discretion; they make choices about what kinds of information they choose to host and what, algorithmically, is promoted to specific users. But they are different when it comes to control, because at the end of the day, the New York Times is just trying to keep control of the messages that it is putting out.
It's the same for these other companies as well. Look at what happened with Twitter and X, for example: the message that X conveys is probably very different from the message that Twitter conveyed back in the day.
I'd just like to add that Senator Wyden and Chris Cox, the authors of Section 230, filed an amicus brief supporting NetChoice at the Supreme Court. And they explained that the passage of Section 230 reflected Congress's judgment that editorial discretion is indeed a First Amendment activity, and they passed it so that platforms would be able to exercise it without the sort of heckler's veto that you get in the American system without loser-pays, where powerful people would be suing online speech intermediaries every time there was something that was offensive to them. I also want to say there's been, I think, some confusion among Texas and Florida's amici, who say things like the existence of 230 counsels against NetChoice in the NetChoice cases. I think that's a basic separation-of-powers misunderstanding, because Congress cannot change the meaning of the Constitution with a statute. So I hope that the court resolves that confusion in its opinion.
Right, there is no power granted by Section 230. I think that gets missed often.
Can I finish? I know that you're wrapping up, but just so that a different point of view gets aired a little bit: I'm hopeful that the courts are attending to the way in which these companies design their services and operate in ways that are not what Representative Cox and Senator Wyden had in mind in 1996. And that's why this is a problem for us. We have evolution in the way in which commercial surveillance works and in the ways in which companies collect and distribute information that was never seen before. It is no mistake that we are trying to revisit this, even if sometimes some states are invoking their own kind of performance art to get there.
I know we're done, but I just don't see it as hypocritical, because there's often this gotcha sense of, aha, they could be treated as publishers without 230. And I'm like, yeah, that's why we need the law. For the longest time we've had this notion that online there are, not no barriers, but lower barriers to speech. And that's a benefit to us, not every time, but oftentimes, to have that kind of freedom, as Jess said so well, that you don't get from a newspaper.
I think that's a perfect place to close. Thank you all so much. It's wonderful.