The Future of Online Safety for Kids: Legislative Changes on the Horizon
6:51PM Mar 13, 2025
Speakers:
Connor Wood
Andrew Zack
Miranda Nazzaro
Common Sense Media Representative
Maureen Flatley
Ash Johnson
Keywords:
Online safety
kids legislation
Congressional Internet Caucus
KOSA
COPPA 2.0
KOSMA
regulatory power
harmful design features
transparency provisions
law enforcement
parental controls
digital literacy
AI literacy
free speech
state laws
Good afternoon, everyone. My name is Connor Wood, legislative assistant for Congresswoman Haley Stevens. Thank you for attending our briefing today, The Future of Online Safety for Kids. Several weeks ago, the Congresswoman joined as a co-chair of the Congressional Internet Caucus following the departure of Congresswoman Anna Eshoo, one of our founders. The Internet Caucus has existed since 1996 and has done a lot of great work. We're excited to be a part of this Caucus with Congressman Michael McCaul and Senator John Thune, our House and Senate co-chairs. This briefing today is hosted by the Congressional Internet Caucus Academy. We do want to flag for your awareness that we have an upcoming event in Rayburn this April: on April 23 we'll be discussing "Regulating Freedom of Expression: Navigating the Future of Online Speech." We hope you'll join us for that event. Today, we have a panel of experts who are on the front lines of kids' online safety issues, and it's great to have you all with us. Our moderator today is Miranda Nazzaro, a technology reporter from The Hill. Thank you for being here, and Miranda, we'll kick it off to you.
Great, thank you, Connor. Okay, yes, I think this is on. Hello everyone. My name is Miranda. I cover technology and policy for The Hill, and I'm really excited for this conversation today, because I think that with the crazy news cycle right now, these kinds of topics can sometimes get brushed under the rug, but this is obviously a very important topic to be talking about. Before we get started, I will introduce you to our panel. Starting from over there, we have _________ with Common Sense Media. We have Ash Johnson with the Information Technology and Innovation Foundation. We have Andrew Zack with the Family Online Safety Institute. And we have Maureen Flatley with Stop Child Predators. As we all know, Congress has attempted to pass legislation aimed at protecting children online for many sessions now with no success, and there's a lot of debate over the bills' language: is it too broad or too vague? There have also been intense lobbying efforts from tech companies and industry stakeholders that have prevented some of these measures from passing. Some lawmakers are concerned that the bills would be overreaching, giving too much power to regulatory agencies to go after big tech companies, or would lead to censorship of certain political views. I think many of us can agree kids' online safety is important, but how we get there has opened a door to a host of complex questions and debate over what is too much or too little in legislation. I know that our panel has some varying opinions on the legislation that has been introduced in this session or past sessions, and I'm excited to debate those measures and see what the best path is moving forward. But to start broadly, we should probably all get familiar with the legislation that has so far been introduced. Lots of abbreviations, so I don't know if any of our panelists want to start by breaking down how each of these bills differs. We have KOSA, COPPA 2.0, and KOSMA, which are all about kids' online safety but targeted at different things. Does anyone feel apt to start? I know Andrew does a lot of this policy work.
Sure, so I'll start. Thank you, and thank you for having me here today. I'll start with a little bit of scene setting.
So yes, KOSA, the Kids Online Safety Act, came out of the Frances Haugen whistleblower hearings in the Senate years ago. This has been a multi-year, multi-Congress effort. It originally focused on social media, and it's expanded and contracted a little bit to include other types of online platforms as well. But the key part of KOSA is the duty of care to protect kids from harms, and most of those harms, or at least categories of harms, are named in the bill.
KOSMA, the Kids Off Social Media Act, is, at least in its first part, exactly what it says: keeping under-13s off of social media, which makes sense given that most social media as we know it has terms of service that require users to be at least 13 anyway. There are other parts in there about algorithmic curation and recommendations for users under 17, as well as a school ed tech piece, which we can get into, but KOSMA is a little more straightforward. And then COPPA 2.0 is the effort to update the now decades-old original COPPA, the Children's Online Privacy Protection Act. COPPA 2.0 would make much-needed updates, including creating a whole new protected age range from 13 to 16, instead of just the yes/no binary of whether the child is under 13 and has privacy protections or not. It would add a lot more nuance there.
Great, thank you, Andrew. And I don't know if anyone can dive more into what these enforcement measures would look like. Is this even possible in today's digital landscape? I know that there are concerns, again, that it would give regulatory agencies too much power. Anyone want to take a stab at that?
I mean, I think all of these bills have one thing in common: they do give some regulatory power to the FTC and some regulatory power to the states' attorneys general. Can you hear me? There we go. Can you hear me better now? Anyone in the back? Okay, there we go. Yes. So one of the through lines that all of these pieces of legislation have in common is that they have enforcement through the state AGs, and then also through the FTC, at least partially, with some oversight of the AGs. It depends on which provision of each bill you're talking about, but each of them does have that through line. I think one of the things that's really important about KOSA in particular, and about the direction these bills have all moved in the past several years, is the shift towards design features, really focusing on the harmful design of these platforms. The KOSA duty of care requires platforms to design their platforms with kids' health and safety in mind, and to prioritize that health and safety over the platforms' engagement numbers and profits. That's definitely been a shift in how KOSA treats the standard of care over the past several years, and it got us to a point where there was bipartisan support in the Senate, 91 to 3 last year, when KOSMA was voted on in the Senate. So a lot of work has been done on the legislation, and it got us to the point where we were in December of 2024. KOSA has not been reintroduced yet this session, so we're still waiting to see what that piece of legislation will look like in the 119th Congress.
I can't believe that was only four months ago. Feels like a lifetime. Yeah, no, thank you both for setting the scene; I just figured that would be good. Now, it seems that a lot of these bills have something in common. Is there something to be said, though, about whether there are too many bills being weighed right now? I mean, again, they're all going toward the same goal, and none of them are passing. What's the problem here?
I don't... I mean, like I said, KOSMA, which was the combination of KOSA and COPPA 2.0, passed 91 to 3 in the Senate last year. We ran out of time. We were making progress, we were getting co-sponsors in the House, and then we had a new session of Congress. So luckily we're not feeling like we have to start from day one, and again, we have to wait for KOSA and see what that legislation is going to look like in this Congress. But if it's representative of what we saw last Congress, I think we'll be able to move forward rather quickly, and I'm hoping that we can all work together and protect kids.
I thought you were preparing.
It might be a good time for me to jump in, because I have a very different view of these bills than most people you've probably heard from. How many people in the room are staffers? Okay. Rule number one: just because a bill has the word "child" in its name doesn't mean it's good for kids. And with all due respect to my co-panelists, who I have a lot of respect for, I think one of the reasons these bills haven't passed is that there definitely are too many of them. But in my view, the deeper problem is this: I went to the Stop CSAM hearing the other day, and I was kind of astonished at the rhetoric, because everything was framed as empowering Big Tech's victims, and the word that I didn't hear anywhere was criminal or predator, or any discussion of the people who are actually doing the victimizing of the children. So my big concern is that none of these bills really address what I would at this point consider the appalling lack of law enforcement focus on attacking the real underlying issues. My father was an FBI agent; he spent his whole career working for the Senate racketeering committee. I come at these issues with a combination of law enforcement experience and 30-plus years of watching child welfare policy unfold as the internet became a thing. And from my perspective, we look at this as: yes, the tech companies are mandated reporters, just like school teachers and school nurses and pastors, and mandated reporters are typically accorded immunity from civil liability, though not from federal criminal liability at all. But in the meantime, we had, I think in 2023, 36.2 million reports from tech companies of illicit activity on their platforms. They're private companies. They cannot prosecute these people. They can't really even investigate it, other than to identify it and remove it when possible. My concern has been that the approach to this problem has not prioritized things. In other words, the house is burning down, but instead of calling the fire department, we're calling the building inspector and trying to change the electrical codes for the community; meanwhile, the arsonist is running down the street with a book of matches. These are, in many instances, well-organized global operations of gangsters and criminals victimizing children. There is no question in my mind that this is a result of the lack of enforcement of existing law, DOJ's lack of focus on this, the FBI's lack of focus on this. And I'll say this is probably the first time in my life that I've ever criticized the FBI publicly. But these bills, in my view, should be called "hey, we really don't like the tech community that much, so we want to regulate them." I don't really see these bills as protecting children in any kind of meaningful way in the long run, and most importantly, they are almost like Mothers Against Drunk Driving saying, you know, we're going to pass a bill that lets us sue the automakers, but we're not really going to take the drunk drivers off the street.
And so from my point of view, just from a practical perspective, there are too many different bills, and they each have sort of poison pills. If those poison pills were removed and the bills were somehow consolidated, we could get to a point where there was something affirmative on, sort of, the building inspector side. But we also have to start to attack the criminal aspects of this problem, and it is a global problem: 94% of the cyber tips are actually referred to foreign governments, and the only law enforcement agency that can really look at that is the FBI. And what have we been doing to promote that? Nothing. The only bill that addresses that piece of this is Senator Wyden's Invest in Child Safety Act, which will be reintroduced in this session; it was introduced in the last session. It's the only bill that I currently endorse at all. But I think that when you talk about protecting children, a lot of the stuff that's being proposed is really far removed from the day-to-day issues, and by the way, most of it will take years to be implemented, so the immediate effect will not be felt by children. At the same time, we will be fueling, I mean, we've already fueled, a digital crime wave, because we really haven't looked at this seriously enough. At the end of the day, I'm very concerned about the way the tech companies have been demonized while they are reporting diligently, and I can tell you that a very small number of those reports are actually addressed from a law enforcement perspective. So one of the concerns I have is that, in my view, the last federal agency that should have any law enforcement dominion over this issue is the Federal Trade Commission. This needs to be the DOJ and the FBI.
Okay, thank you for that. Anyone want to jump in?
I would, I would love to. Thank you. So, prior to coming to Common Sense Media, I was a prosecutor for almost 15 years, and as a prosecutor I spent much of my career focusing on internet crimes against children. I not only prosecuted and investigated those crimes, but also taught other prosecutors and investigators how to investigate them. The reason I took this role is to prevent those heinous crimes that I saw happening to kids all over the country, and I really think that the policies being proposed in this legislation could help prevent these horrible crimes from happening to kids, hard stop. So when you're talking about the cyber tips, and the cyber tips that are going uninvestigated and unprosecuted, we're talking about a problem that was created by these features. These pieces of legislation impose a duty of care on these platforms to prevent these horrible crimes from happening: to prevent the sexual exploitation of children, to prevent the production and distribution of CSAM, to prevent strangers from contacting kids on platforms and being able to exploit them. That is exactly what the design features currently being used on these platforms are enabling, and exactly what the duty of care addresses. So when you talk about these companies that are submitting these cyber tips to NCMEC, the National Center for Missing & Exploited Children, and doing such a great job exposing what's happening on the platforms they've created: they have created the problem. They absolutely have a duty to stop the problem. And I don't think that any of these pieces of legislation stands alone by itself and is going to fix this problem outright. The industry has to be involved in fixing the problem and protecting kids as well. They have to want to do it, and they should want to do it. We're talking about children.
Well, so _______, I mean, I hear what you're saying, okay, and I have a lot of respect for you personally, but it's really difficult to foster a dialogue when the conversation has become so polarized. When I look at the press release that was sent out after the Stop CSAM hearing, the headline reads, "Senator chairs hearing on fighting online sexual abuse material, calls for empowering Big Tech's victims." Where's the discussion there? I think that we have an opportunity here to take the conflict out of this conversation and to create a discussion. But also, on the duty of care issues, and I hear you, and I don't necessarily disagree with all of it, one of the big issues with these bills is: what's the role of parents in all this? Now, in real-world child welfare, which is really where I started, I was one of the architects of one of the fundamental federal child welfare laws, the Adoption and Safe Families Act. But even before that, the Child Abuse Prevention and Treatment Act very clearly articulated that parents have a duty to protect their children from harm, from injury, from death. As we have moved this discussion into cyberspace, we've sort of lost the grounding of that basic existing federal law, which every state has to abide by. And incidentally, let's talk about what the states are doing around technology. Every single state that is suing a tech company either has been or is being sued for its own appalling performance when it comes to taking care of children in its care. So again, I feel that as a result of all of the polarization and the conflict, and the way this whole discussion has been aimed at the tech community, as if they're the ones literally abusing and exploiting the children, it completely eliminates any ability to sit down and say: hey, let's do a better job here. Hey, how can we help you? Hey, what do you need from us? Because I heard John Pizzuro the other day say the tech companies are terrible, they don't respond to anything. Well, I do talk to the tech companies, and I'm not here to defend the tech companies today. I have one mission in life, and that is to protect kids.
I do think that when you ask about power for the parents, KOSA actually does provide power to the parents in Section 103, or what was Section 103 in the 118th Congress, by giving all these tools to parents. It gave parents the ability to click off the algorithmic feed, or teens to click off the algorithmic feed, and to stop autoplay; these features that we know to be the harmful design features do fall under Section 103 of what was the last introduction of KOSA, and it really does give power to parents to control their children's experience, or to teens to control their own experience. And when you're talking about the states, we've actually had some experience with this in the states, in New York and in California. Last year, we successfully worked with sponsors in those states in introducing two different laws, the SAFE for Kids Act in New York and the social media addiction law for youth in California, SB 976, and both of these gave control to parents and to teens over their own experience. And we heard so much from parents and people in New York when we were talking about that bill; when we said you could turn off the algorithmic feed for kids, everyone was like, wait, can you do that for me too? I mean, people just want control over their experience. And KOSA, under Section 103, provides that control to parents. It allows them to set time limits, and it also gives reporting tools to parents. So when these horrible harms happen to their kids, or are happening, for instance, a cyberbullying incident on a platform, they have the tools to actually report it to the platform and have the platform actually respond to them. We hear from parents, and I can tell you, when I was working on these cases as a prosecutor, working with families trying to get materials taken down off the internet was brutal on these platforms, brutal, absolutely brutal. Having a reporting mechanism that parents can go to easily and get a response from the platform is what parents want, and that's actually giving control to parents, not Big Tech.
Okay, hold on one second, I'm just gonna let Ash weigh in.
Well, I think this conversation has been really great so far. If I had to place myself stance-wise, I would probably be somewhere in the middle. I think all three of these bills have really great provisions that I definitely support. I think they also have, at least in the most recent versions we've seen, some provisions that I personally would disagree with in how they've been worded or executed, and we can definitely discuss the finer details later. But I do agree about the need to protect, or continue to preserve, parents' choice, and I agree that KOSA does a pretty good job of that. I've written before, in my musings on children's online safety, that this really is a three-legged stool of responsibility when it comes to protecting kids online. There is some responsibility that falls on government to make laws and enforce laws that protect children, and I definitely agree that there is a lack of enforcement in many areas and that we need more funding and resources to patch up those gaping holes. Then there's the second leg of the stool: companies certainly do have, or should have, a duty to respond to illegal content and objectively harmful content in a quick and responsive way. Obviously, when it comes to certain forms of harmful-but-not-illegal content, it's difficult to know where to draw the line, and so I give some grace there, but certainly when it comes to illegal content, absolutely, a lot of responsibility falls on the platforms themselves. And then, of course, parents are the third leg of that stool. But unfortunately, the reality is that a lot of parents do not have the resources or the knowledge to be the only leg holding up the stool, and even those that do can struggle: I was a kid who grew up online, I was a very resourceful kid, and I got around parental controls pretty easily at a, looking back, worryingly young age. So I do think we need all three legs of that stool reinforced and in place, while also not giving too much responsibility to government, especially when it comes to anything related to content or online speech; I think that's always very worrying. And not placing too much responsibility solely on platforms either. On many of the other issues we discuss when it comes to the internet, content moderation in general, misinformation, other forms of harmful content that are not necessarily targeted at children, our own political disagreements with certain platforms' content moderation choices, a lot of the conversation is saying these platforms have too much power, they're making too many decisions that influence the political and social conversation and landscape we all live in. Whereas in the child safety conversation I see a lot of, well, let's put more responsibility on these platforms. And I think there is a balance between the two. We don't want to give platforms the power to make every single decision that will control our entire lives, but we also don't want to completely take away their ability to moderate content in an effective way, especially to protect children.
And if we're not too off track, I'll add on. I love the stool analogy. That's something FOSI talks about: the culture of responsibility for online safety. We name six different actors: parents; the kids themselves, who are often forgotten in this debate despite their own agency and their own resourcefulness; law enforcement; policymakers; industry itself; and teachers and educators. So it is a balance of responsibility. To get a little specific on KOSA: the parental controls, I agree, are a pretty good section of KOSA. And for staffers considering how to write these, walking that balance of empowering parents without creating surveillance tools, making sure that kids do have some amount of privacy or autonomy from their parents, is important. On the original question about how many bills are going on: I hate to say it, but I kind of want to see more. We're moving towards a piecemeal approach. KOSA and the age-appropriate design codes, and I know KOSA passed the Senate, and we'll get into the age-appropriate design codes in the states, these bigger, sweeping bills are getting stopped in the courts because they get close to speech and the First Amendment. That's what ______ mentioned about shifting this a little to focus more on product design, on products, which will be especially important as we get into AI and chatbots, though you can have a whole argument about that; I'm not a free speech lawyer, nor do we have a First Amendment lawyer here. We've already seen a bit of taking smaller slices of the pie at a time, with Ted Cruz passing the TAKE IT DOWN Act; that's one way to address a small part of this problem, and there are others in the states as well: child influencer protection, keeping kids under 13 off social media. There are smaller ways to address this one piece at a time that might not run into the same problems as a big, sweeping bill.
And I'm glad you brought up implementation, because whenever I'm working on a big package, especially, I'm always thinking about what the implementation will look like and how quickly it can happen, particularly if it's responding to a critical problem. So whether you agree with these bills or not, and I've made it pretty clear that I'm not a gigantic fan, one of the big concerns I have is that I think it will be years before they're implemented. Even in a best-case scenario, let's say that KOSA passed, and by the way, 444 members of Congress did not vote for KOSA, the implementation arc will be years. And so how does that help kids? It really doesn't. So this also suggests, to your point, Andrew, that maybe it's time to regroup the conversation. Maybe it's time to bring everybody to the table on both sides of the issues, to say, okay, take all the bills, remove the poison pills, and emphasize the things that might foster some consensus. A perfect example of this is the PROTECT Act of 2008, a law-enforcement-oriented bill designed to bolster security around kids online, a bill that I love. It was Joe Biden's last bill as a US Senator, signed into law by George Bush, absolutely bipartisan. Two years ago, GAO came out with a scathing report about how it was never really implemented. If that bill had been implemented, we would probably be in a somewhat different position than we're in today. The other thing, and this is just a little bit of a sidebar, but I have to bring it up: let's talk about consistency here. When we talk about parents, you know, I'm a mom, I'm a grandmother, and I feel like it's a little bit of common sense to say you're not letting your child drive to the mall by themselves when they're 12 years old. So handing your child a tablet without supervision is probably reckless. And one of the ironies, and this is just for general information: we're talking a lot about creating a cause of action against the tech companies, but let's talk for a moment about guns, and how we have had an absolutely catastrophic experience with children and guns in this country, doubling the number of gun deaths of kids in the last few years. Ironically, one of the sponsors of KOSA led the charge to immunize gun manufacturers from any kind of liability. So, you know, I'm big on consistency. When we're talking about making kids safe, everyone's talking about devices and access to the internet as if they're letting a kid play with a loaded gun, when, by the way, in 30 states kids can possess guns. So I get the concerns being expressed here, and believe me, I don't take them lightly, but I don't think that the approach taken with these bills is the way to solve the problem. The other thing is: who can sit here and say that there has been a meaningful, reciprocal conversation with the tech companies to ask, what do you need to do a better job? Because I'll tell you one thing, I've worked on these cases, I know a lot of victims, and I've seen varying levels of response, but one of the problems, I'm guessing, that the tech companies have is that the volume of this crime is so enormous that they themselves are challenged when it comes to dealing with it. So perhaps, if there were a more robust law enforcement response, maybe they'd be turning things around a little more quickly. Just, you know, a thought.
I think we need to stop blaming law enforcement for the problems that are happening on the internet, the harms to kids that are happening on the internet. Law enforcement is responding to these cyber tips. They are funded. They are looking at these. I can tell you that there are ICAC task forces and prosecutors across the country working on these cases. If we prevented the crimes from happening in the first place, there would be fewer crimes for them to investigate and prosecute, number one. But you made the point about 444 members of Congress not voting for KOSA last session, and I hope that this session they have an opportunity to do that. I hope that KOSA gets to the floor this year and that we can change the hearts and minds of those members of Congress, and I think we can.
Well, as we can see, it's a big debate. I actually wanted to go off of your point on implementation, because I find that when I'm reporting, this is something I struggle to fully grasp: say any of these bills actually made it through and got signed, how do we actually assess whether the legislation is effective? From your points of view, are we going off of a certain number of reports, whether we see reports drop? How do we assess whether it's successful? Because I think sometimes with any bill, we can pass it and say, that's great, that's a success. But how do we actually measure down the road whether it's successful?
Well, right now, I think... I'm sorry, if anybody else wants to start, then we'll go. Right now, I think a lot of these platforms are a black box. We don't know what's happening behind the scenes until a big lawsuit happens, like has been happening across the states, where we then get to see discovery and see what's really, really happening. We get to see emails; we get to learn of things the CEOs knew about these design features that they still decided to implement, even though they're harming kids. So one of the things KOSA does is include transparency provisions, so that reporting will have to be more transparent. I do think that transparency reporting, getting some more information, and having provisions in there about research might be a good idea, so that we can actually do some research on what's actually happening. But right now it's just this big, huge black box, and kids are getting harmed, and we need to fix it.
So my answer to that question would be: in a universe where there were 36.2 million cyber tips in 2023, what was the arrest and conviction rate? For instance, are we advancing in any meaningful way? Why bother to report them if we don't know what the disposition of the cases is? Just an idea.
As a former prosecutor, I can tell you that sometimes these cases take a very long time to go through the system, so I'm not sure that the arrest and prosecution rate is the best method to determine whether or not harms are happening to kids, because we're, of course, also talking about a lot of harms that aren't illegal, so they aren't leading to, say, a cyber tip. Certainly, for CSAM reporting, it might help. But, you know, it's the platforms that are reporting the cyber tips for the most part. So I'm not sure if that's really...
But Ash, did you want to weigh in on that?
I actually wanted to weigh in on a point that you made a couple of minutes ago about transparency, just to underscore it. I completely agree. Not just on children's safety but on so many of the issues we talk about online today, when it comes to online platforms, to speech, to content moderation, even to privacy, there's so much that we don't know, and I think that's detrimental, and not just to us as users. Obviously, we would like to know how our data is being used. We would like to know what safety measures are in place. We would like to know how content moderation is working, how children and parents can protect themselves. We would like to know all these things, and we don't know enough about them currently, at least from the platforms. I think it's also beneficial, as you mentioned, for government and researchers to know this information, to understand the true scope of the problem, where problems exist, and to more effectively respond to those problems and allocate resources and regulation accordingly. And then I also think it would be to the benefit of the platforms if we knew more about what they were doing, because there is so much distrust, and I think that has been to the detriment of online platforms. I think it's led to a lot of bills targeting them, laws targeting them, and they've been very vocal that they don't like this anti-tech narrative. But I think transparency would go a long way: if it does expose problems, we can fix the problems, and it could also potentially increase trust as we start to fix them.
But would you agree that, let's say, in a universe where the Yahoo Boys based in Nigeria are driving a lot of these extortion cases, and where only a handful of them have been extradited back to the US to be prosecuted, that that's a problem? That's a problem, and the tech companies can't fix that problem. The other issue, one last thought, and then _____, because I do get where you're coming from: if you're proposing legislation that creates liability for mandated reporters and then expecting them to be more forthcoming, I would argue that that's a poison pill. You're asking them to engender more liability, which no private, for-profit company would do. And finally, one last thing on this topic: there have been a lot of representations that it's the tech company lobbying that has thrown sand in the gears of these bills moving forward, and I am here to tell you, because they're sitting in the audience with me, that there are plenty of people who are opposing these bills for other reasons: that they're not constitutional, that kids do have a right to free speech in some ways, that they do potentially encroach on parental rights. As a mother and a grandmother, I can tell you right now, I make all the decisions about what my kids can and can't do, and what the grandchildren can and can't do, and that works in our family. Maybe we're not the average family, but at the end of the day, whether or not you want to advance a discussion with the tech companies, you cannot advance a productive conversation if you start the conversation by saying you're wrong and it's your fault, because it's a combination of factors.
Andrew, oh, I saw you were... all right. Well, this actually is, I think, a good segue. You touched upon free speech, and yes, KOSA did pass in the Senate, but the reason we saw it backlogged in the House was that House leadership had a lot of concerns about infringing on free speech rights, about whether it would lead to censorship of certain political views on both sides of the aisle. This is a question that, as I report, I struggle with; it seems like there is no clear answer. Is there a way to create legislation that, quote unquote, protects children but does not halt innovation and does not create these excessive burdens on the tech companies? Is that possible? Ash?
Sure. I think any regulation designed for safety in general, and especially for children's safety, does require a few trade-offs. We all give up certain, you could call them rights, you could call them things that you can do, in the name of public safety. There are plenty of examples of that, especially to protect children. So I do think there is an acceptable trade-off to make in certain areas. I completely agree that protecting not just everyone's free speech rights, but kids' free speech rights specifically, is really important. As you mentioned, that means making sure that parental controls and other features are not being used as surveillance tools, and that especially as kids start to get a little older, they're offered increasing autonomy and freedom over their own internet experience within certain safety boundaries. But I do think there is an acceptable trade-off. The real issue is finding exactly where the line is that maximizes the benefits and reduces as much harm as possible.
I mean, again, it's about focusing on the harmful design features and not content. KOSA, for one, does not limit the ability of any user to go on and search for any material. That's in the rules of construction in Section 102, the duty of care: it does not prevent any user from searching for any content, period. And that's really, really important: we're talking about not pushing harmful content to users with design features, rather than preventing the content from existing in the space. I think that is one of the balances that was struck, and those are the changes that have been made over the years since KOSA was first introduced. And I was going to make another point. Oh, yes: we also have to remember that we're always going to exist in a country where the First Amendment and Section 230 exist as well. So if an action were brought that was violative of Section 230 or the First Amendment, it obviously would not go anywhere. That's obviously something that's in the minds of the people drafting this language: not to violate Section 230, not to violate the First Amendment, because any enforcement that did would just be a waste of time.
Yeah, I'll add one more quick thought. I completely agree. KOSA particularly has generated a lot of discussion about free speech, and I personally think every iteration we've seen of KOSA and the other bills we're discussing has gotten better and better on the free speech front; with every iteration, I've had fewer hang-ups related to free speech. So I think we're definitely going in the right direction; we have been going in the right direction. And I also think that speaks to the piecemeal approach you mentioned and the benefits of crafting really targeted legislation: focusing on design features instead of content, trying to preserve free speech as much as possible, because it is such an important right for people of all ages. So I think the debate is definitely going in the right direction when it comes to free speech.
And I think on KOSA, a small tweak could go a long way on the free speech implications. In KOSA, they name a couple of harms, including eating disorder content, mental health, suicidal ideation, and self-harm, and so a fear among civil society and advocacy groups is that platforms, to avoid this new duty-of-care liability, will over-moderate and take down any content associated with eating disorders, with self-harm, with any of this, including good things like stories of survival and recovery, and resources. The provision that _____ already mentioned, being able to actively search for something versus being pushed a recommendation, is a key distinction. And we can learn from our friends around the world: in the UK, their Online Safety Act, instead of just talking about content related to eating disorders, says content that encourages, promotes, or provides instruction for an eating disorder or self-harm. That's just a small way to make it more specific, and hopefully not have companies take down positive content.
Thank you for mentioning that, because it reminded me of the one other thing I wanted to say. When you ask, will this stifle innovation, will this prevent innovation: in the UK, where the AADC has already been implemented, Children and Screens, which is another organization in this space, issued a report last summer talking about the hundreds of innovations that have happened at these platforms since the introduction of the AADC in the United Kingdom, changes that have been made to these platforms. And guess what? The platforms still exist there. They're still functioning. Kids are still on them. They're just functioning in a safer way, and that's what we're hoping for.
How are we measuring that they're safer? How do we know that they're safer? Because this business of eating disorders and bullying comes up all the time. Now, I was born in 1948, so as you can imagine, I grew up way before the internet came along, and I have kids who range in age from 56 to 36, so I've seen bullying, eating disorders, all of those cultural problems play out in a lot of different ways. And I'd like to know that we could actually measure whether there was an impact. I'm not sure that we can, though, because when I look at some of the cases that have been raised in the context of promoting KOSA, when you do a deeper dive into what happened, oftentimes the bad behavior played out both online and offline. So I guess what I'm saying is: no guarantees. I don't have a particular problem with what you're talking about; I'm just saying that this is not a panacea, right?
No, I think certainly, especially with child exploitation, drug sales, fentanyl sales, that's happening, and it's starting online. And of course, it is unfortunately moving to person-to-person contact, because that's the nature of sexual exploitation, sexual abuse, and drug sales. But it's being enabled by some of these features online.
Can I ask a question? I wonder, because I have seen very little interest or discussion in the Invest in Child Safety Act, which is a very important bill that Senator Wyden crafted, and it has bipartisan support in the House and Senate. That bill is going to be reintroduced soon. Where is everybody on that? Why don't you guys jump on that train?
I'll go first, quickly. I don't have the answer. FOSI has endorsed that bill. Thank you. It is a good idea, and I don't want to be cynical, but it also has a $5 billion price tag, and, well, I'm not putting a price on child safety, I'm just saying I think that has halted some progress.
Well, I think the price tag is always a problem on a bill, but the price of not doing it has been substantially more expensive. We've endorsed it.
I'm not sure if Common Sense has come up with a position on that. I do think, as you were saying earlier, that there are a lot of bills out there, and rather than shifting the focus to one issue, we've been trying to focus on...
I think, and the reason I brought it up, ______, is that part of getting to yes in this whole conversation is, as Andrew pointed out before, perhaps rethinking both the bills that have been put forth and perhaps adding some other content and other solutions to the discussion. Because nobody's going to change my mind about the law enforcement problems here and the lack of bandwidth that law enforcement has. I'm a huge fan of the ICAC system, the Internet Crimes Against Children task forces, where state and local police departments interface with the feds. It's an essential part of this system, in my view. But, you know, we don't have enough prosecutors, we don't have enough training, we don't...
Amen to that, I agree.
But my point is that we can't foster... it's sort of like blaming the banks for getting robbed at this point. We can't take our eye off that ball. We just can't. And at the end of the day, as someone who has gotten a lot of big, complicated bills passed, and Andrew, when he worked on the Hill, was often a part of those discussions, it's all about building consensus. And by consensus, I mean all sides, not just the good guys, the self-appointed good guys, if you will. I think that we have an opportunity here to create, honestly, a much more meaningful discussion and a much more effective outcome. But we have to stop demonizing the tech companies and start looking at them more as partners, asking again: what do you need? I'm pretty sure that most of them would be really happy to have more police intervening on these cases so that they're not handling it alone.
I'm sure they would, if it means less regulation. But, you know, I'm happy to have the conversation with anyone from tech, yeah.
Well, thank you all. Clearly, I feel like you gave a good representation of the debates that are happening here in Congress. Lastly, I know we've been talking about Congress and federal legislation, but we've mentioned some of the state efforts happening on this topic. Sometimes when I talk to these technology companies, they say they're a little frustrated because it's a patchwork of different frameworks at this point. And states have said, well, we're not waiting for the federal level because nothing's getting passed. Do any of you have concerns about this patchwork? Or do you think the state-level work is a good thing? Or maybe both. Ash?
I would say I have kind of a nuanced position on this, which I guess has been the trend of what I've been saying; I'm somewhere in the middle on this as well. I completely agree with the states that it's probably not the best move to just sit there and do nothing and wait for something to happen at the federal level, because there's not a lot they can do to make that happen, and certainly some protection is better than no protection. I am, however, usually in favor, when there is a good federal law, of it including at least some level of preemption of very similar state laws to avoid that patchwork. Because again, the internet is inherently cross-border, and so the costs of complying with a bunch of very similar, but not quite the same, state laws can get pretty high. But I certainly would never say that the states should just sit and do nothing and wait.
So I would say that the states, most of them, are falling behind, and perhaps in outright violation, when it comes to enforcing their own existing child welfare laws. You can look at any child welfare system in America and see abject failure, bad things happening to kids. Child welfare is viewed as a state law issue, although dating back to the orphan trains in the 1850s it was fundamentally interstate activity. So for several reasons, I think the states' efforts to regulate the internet are completely misguided and a useless waste of time, especially when their own track records around children are appalling. When I look at the letter that was sent to the Speaker last session by 33 state attorneys general, in an effort to push him in the direction of KOSA, I went through and documented what was going on in all 33 states, and I can tell you, it was not pretty. So to the extent that the states are trying to virtue signal, they're only going to confuse the issues. The other thing that happens here, and this is my big message to staffers: once anything passes, whatever it is, good, bad, or indifferent, that box is going to get checked, and we're going to be "done" with child exploitation, good, bad, or indifferent. And I would argue that if it takes us a little more time to come up with something that's more thoughtful, more effective, and can be implemented constitutionally, that will be time well spent.
I think it is embarrassing that we do not have a federal data privacy law in this country. For kids, it's been 25 years, and even that isn't comprehensive; it only covers children under 13. So that's an example of the states saying, all right, we keep talking about data privacy and it's not happening; now over half of the states have their own data privacy laws or are well on their way to having them. So data privacy is a great place to start. I love that Ash was talking about trade-offs and nuance, because that is absolutely where we live in this space. As a safety organization, we are not free speech absolutists, we are not privacy absolutists, and we're pretty sick of the extreme sides really gumming up the gears here and letting the perfect be the enemy of the good. That's more so at the state level than the federal level; that's sort of NetChoice stopping potentially some good ideas and some harmful ideas.
But NetChoice, it only took almost an hour to bring them up.
I know they were on last month's panel. So, you know, we definitely overlap on some issues that we work with them on, but not all. I'm sure _____ can expand on this, but on age-appropriate design codes: we saw California's stopped pretty quickly in the courts. Maryland had a great reaction to that; they really tried to learn from what paused California's, moving it from regulating content and speech to being about platform design and products as much as possible. And they were still sued after years of trying to make it better. So it's frustrating to see even the best-faith efforts, even if they're still flawed, just get totally cut off at the knees every time.
I mean, I think I agree. Every single time a common-sense regulation is about to be implemented, NetChoice or some other organization sues. Just because they're being sued, though: in Maryland, we haven't seen the results of that lawsuit, and there are several lawsuits pending. NetChoice sued just before January 1, 2025, when SB 976 was going to go into effect in California, and the Northern District of California upheld part of that law, but then an appeal was filed, so now we're caught in the appellate courts in California on that law. So yes, NetChoice has been successful in filing lawsuits, and they've been successful in getting some findings handed down by the courts in their favor, but also some that are not. So, I mean, I think we're going to have to wait some time, unfortunately, like you said, before any of these things really get implemented, because as soon as a law gets passed and signed, there are lawsuits, and that takes time as well. But there are so many great things happening in states like California, as I already mentioned. There's a lot of work that we're doing in the AI space. I think, and our organization thinks, that AI is the new space, like the new social media platform, and it's time now to be putting regulations in place around AI, guardrails in place that protect kids, because we don't want to be in the position 20 years from now that we are in today with social media. So that's really, really important, and we are sponsoring legislation in California and New York. And then I also think that digital literacy and AI literacy are really, really important, and I think we all touched on this a little bit with the role of parents. For parents and for kids, you know, the TAKE IT DOWN Act is a piece of legislation that we support at our organization, but I think takedown also can't live alone. Having been an ICAC prosecutor, I'd go talk to kids, to high schoolers, and this was back in the day when sexting was what led to big blow-ups of images online, because they went viral. Now, because of AI, it doesn't have to be someone taking their own image or a friend taking their image; you can take a stranger's image and turn that into a viral image. That's really, really terrifying. And because of that, I think it's really important to be teaching kids AI literacy and to keep teaching them about the consequences of using it this way.
One of the things that I have suggested is a bill passed in Florida by Brad Yeager, a Florida state rep, that is completely non-controversial and promotes the notion of digital education for parents and kids. I mean, we don't let kids get behind the wheel of a car without taking driver's ed. So I've put that bill language in the hands of both Senate and House offices, and I would say that might be an excellent way to start to build a conversation with everyone on all sides of the issue, to say, look, we want to get to yes. And honestly, I think this has been a very helpful conversation, and I think we all learn from each other. But if we can't put something on the table that people can agree on, we're not going to build a culture of agreement around these issues. And so that, to me, would be the one really simple place to start.
I'd love to talk to you about that more, I think.
Don't threaten me with a good time.
Well, I think that just about concludes our time. A great note to leave off on. Thank you all so much for doing this; I think having conversations like this is how stuff gets moving in Congress. So thank you all, and thank you for taking time out of what I imagine are all very busy days to be here with us. Otherwise, I think that's all from us. And yeah, thank you. Have a great rest of your day.