I just don't think that in the United States, with the First Amendment, the federal government can ever strike the right balance. There are trade-offs we all agree to with the First Amendment, and part of that trade-off is that vile speech is legal. In Europe they have a different view. I don't think the federal government can ever say, "we know where that threshold is." And part of the problem is that the federal government is not some neutral actor with no interest in what speech goes up and comes down. It's a political force, and the number one incentive for politicians is to remain in power. So I just don't think that balance can be struck.

And I think part of the problem here is centralization. I'm going to shout out my colleague Luke Hoag in the audience here; he's written about middleware. So much of the frustration around content moderation, whether it was when the platforms were catering to Democrats or whether, allegedly, they're now catering to Republicans, is that it's centralized control. It's up to the whims, largely, of CEOs, depending on how they feel on a given day, and they can change it on a dime. That's the power; that's the network effect. If you want more choice and more competition, you have to look to third parties and middleware that can decide: let's have a filter that is friendly to, you know, Christians, or a filter that is friendly to a different religion. Or a school can say, this is our filter, so that parents know their kids can interact with social media in a way that is healthy for their education.

You're not going to get to a place where everyone agrees with the decisions of a handful of people who control a vast amount of the information on this earth. So if the companies aren't going to open up voluntarily and allow third parties in, there is potentially a role for Congress to come in and say, look, we need you to at least grant some API access to third parties and researchers so that they can build filters on top of your platform. That's a potential option here, because there's just no way we're all going to agree on what these platforms are doing, and otherwise it's just going to be a constant issue.
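To make the middleware idea a bit more concrete, here is a minimal sketch of what a third-party filter layered on top of a platform's API might look like. Everything here is hypothetical: the endpoint, the `Post` fields, the label names, and the `fetchFilteredFeed` helper are invented for illustration and do not correspond to any real platform's API. The point is only that the filtering policy lives outside the platform, with the school, church, or researcher, rather than with the platform's own moderation team.

```typescript
// Hypothetical sketch: a third-party "middleware" filter sitting between a
// platform's read API and the client. All endpoint names, types, and fields
// are invented for illustration; no real platform API is assumed.

interface Post {
  id: string;
  author: string;
  text: string;
  labels: string[]; // e.g. moderation labels the platform already attaches
}

// A filter policy is just a predicate over posts, supplied by a third party
// (a school, a parent group, a researcher) rather than the platform itself.
type FilterPolicy = (post: Post) => boolean;

// Example policy: a school might hide posts the platform has labeled as
// graphic violence or adult content, while leaving everything else alone.
const schoolPolicy: FilterPolicy = (post) =>
  !post.labels.some((l) => ["graphic-violence", "adult-content"].includes(l));

// Fetch the raw feed from a (hypothetical) platform endpoint, then apply the
// third-party policy on the middleware side. The platform only needs to grant
// read access; what gets shown is decided by whichever filter the user chose.
async function fetchFilteredFeed(
  apiBase: string,
  accessToken: string,
  policy: FilterPolicy
): Promise<Post[]> {
  const res = await fetch(`${apiBase}/v1/feed`, {
    headers: { Authorization: `Bearer ${accessToken}` },
  });
  if (!res.ok) throw new Error(`Feed request failed: ${res.status}`);
  const posts: Post[] = await res.json();
  return posts.filter(policy);
}

// Usage (hypothetical values):
// const feed = await fetchFilteredFeed("https://api.example-platform.com", token, schoolPolicy);
```

The design choice this illustrates is the one in the remarks above: the platform exposes data access, and many competing filters can coexist on top of it, so no single CEO's judgment has to satisfy everyone.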