My name is Naomi Nix. I'm a reporter for The Washington Post, where I cover social media, and I focus a lot on how social media impacts our world. Lately, that's meant covering this larger conversation we've been having among parents, activists, and regulators about whether our youth are really safe online, and how tech companies can bolster their safeguards to protect our kids. We've had the US Surgeon General warn that our teenagers are in a mental health crisis, and that the time they spend on social media might be partially to blame. Dozens of state attorneys general are suing Meta, alleging the company is compromising teens' mental health. And just last month, we had the Senate Judiciary Committee grill some of our biggest tech CEOs about how they're protecting kids online. It's clear that there might be a problem here. I think it's less clear what the right solution is. There's a lot of disagreement about that, and I expect we'll hear some of it this morning. Here to dive into some of those tricky questions about how we protect youth while preserving freedom of speech and privacy, we have an esteemed panel of guests. I'll briefly introduce them, and then I'll ask them to introduce themselves at more length. We have John Carr; he is the secretary of the UK Children's Charities' Coalition on Internet Safety. We have Hayley Hinkle; she's policy counsel for Fairplay. We have Maureen Flatley; she is an advisor to Stop Child Predators. And then we have Daniel Castro; he's the vice president of the Information Technology and Innovation Foundation. So why don't we just start by having you introduce yourselves briefly and talk a little bit about what your organization does? John, you want to start?
Okay, yeah. So I'm a Brit, as you might have gathered from my accent, and I'm very glad to be here and see many familiar faces, particularly my old friend Rick Lane. I was very distressed, however, because the 49ers recently became the owners of my football team in England, Leeds United, so I'm beginning to worry about the knock-on effects of last night. If anybody's got any tips or clues, please see me at the break and let me know. I'm an adviser to the British government; I've advised the Council of Europe and various bits of the United Nations; but above all I work with children's organizations in the UK, with a particular focus on digital technologies.
Morning, all. My name is Hayley. I am policy counsel at Fairplay. We are a children's media and marketing watchdog, focused on the ways in which big tech's business model, and the excessive screen time it encourages, impact kids' healthy development.
And I'm Maureen Flatley. I have spent the last almost 40 years engaged in oversight and government reform on issues related to children, across a range of systems. I always tell people that one of the most formative experiences of my life was being the daughter of an FBI agent who spent most of his career detailed to the Senate racketeering committee, where he developed testimony against the Cosa Nostra with Joe Valachi. That proved to me, with my own eyes, for most of my childhood, that Congress can solve big problems involving criminals. And I think at the end of the day, my point of view on this issue is that this is not fundamentally a technology problem. This is fundamentally a crime problem. And by ignoring that for far too many years, we have now created a problem that seems almost insurmountable. But I hope today we talk a little bit about some concrete solutions.
Thanks. And I'm Daniel Castro, vice president of ITIF, the Information Technology and Innovation Foundation, which is a nonprofit think tank focused on innovation. I've been doing a lot of work generally on tech policy, on the Internet economy and how it works, and on the metaverse, children's safety online, and these kinds of emerging platforms. I agree 100% with Maureen on this point: I think this isn't a technology issue, and we tend to wrap these things up and intertwine them when they're not connected. So hopefully we'll get into that today.
I'm sure we will. I want to start with the hearing. I've been covering tech in Washington for a long time, and that was probably one of the most emotional hearings I've ever covered, in part because we did have families show up and talk about the experiences of young people who had lost their lives to suicide or found drugs on online platforms. And yet, even though it was very heightened, there was also a lot of blame-shifting, right? We had lawmakers accuse the tech companies of having blood on their hands for not doing more, and then also sort of criticize themselves for not passing legislation. Even in that pointed question-and-answer session with Senator Josh Hawley, Mark Zuckerberg said the company had a lot of parental tools, implying that parents could play a role in protecting their kids. Amid all of the various proposals and all of the various diagnoses that happened in that hearing, just to start us off in broad strokes, I'm wondering if each of you can talk a little bit about what you see as the top risk we're facing right now when it comes to youth safety online, and what would be your first, biggest step that you think should happen right now. Daniel, you want to start us off?
Sure. I mean, I think, watching that hearing, you can't come away with anything other than thinking it's political theater, and in probably the worst possible way. Because we're seeing parents there with real issues that impact children — like you said, suicide, self-harm, eating disorders — so much that affects so many people. And the solutions to those problems are not going to be simple; these are very complex issues. Technology is not at the top of that list, right? We have bullying problems, we have problems of addiction in communities, with drugs and overdoses, and all these issues. And when you look at that, you realize that the purpose of that hearing — again, in my opinion, watching this — is not to advance solutions that will actually help children. In the end, it's to advance legislation that's intended to regulate big tech. And it's using children, and children who are suffering, to advance that narrative and to advance that legislation. We've seen this successful playbook in the past, which is why it's being used again today. We saw it with SESTA and FOSTA — it was the same dynamic, where nobody wanted to oppose it, because who was going to be on the other side of that issue? I think that's what I've seen with children's online safety. There are legitimate issues, and there are issues that we can address. But if we think that the reason these things — again, addiction, self-harm — are happening is technology platforms, that's not the reason, and it's distracting from real solutions.
Maureen, do you have anything to add, just in broad strokes, about what you see as the most urgent priority right now?
Yeah, for sure. I've been to hundreds of congressional hearings in my career, and I have to say that I was really appalled at the framing of the hearing last week, which I did attend. It was as if the panel was blaming the tech industry for everything from global warming to kidnapping the Lindbergh baby. It was just so over the top; it was not an exercise designed to come up with solutions. It was a show trial — there wasn't a panel that had any kind of affirmative input at all. And I was thinking about this last night: if I had been Mark Zuckerberg, what would I have said to that panel? I actually wrote it down. This is what I would have done. I would have said: Congress has had 16 years to implement the PROTECT Act. It was a virtually perfect first step to building an infrastructure around law enforcement to mitigate all of the things that we're seeing today. Yet in a recent GAO report — which, if you haven't read it, I recommend highly — they issued a scathing indictment of DOJ for failing to implement this really important bill. Congress hasn't held any really meaningful oversight hearings; DOJ hasn't issued any real reports; child exploitation isn't even on their list of priorities. And as someone who grew up in a family with a father who worked for DOJ, I've always had the highest respect for that agency. But I have to tell you that in the universe of institutional players who are responsible for what we see before us today, the Justice Department is at the top of the list, at least as far as I'm concerned, closely followed by Congress, which has failed to do its job — which is to authorize, appropriate, and oversee spending. The tech companies are mandated reporters, just like teachers and pediatricians, pastors and daycare providers. All mandated reporters in other contexts are specifically shielded from civil liability, because if they weren't, we would never get any meaningful reporting. And if you look at the list of cyber tips, virtually all of them come from tech companies. The number one reporter of cyber tips is Meta. And meanwhile, no one even bothered to ask the founder of Meta what would be helpful to them to prevent the proliferation of this activity on their platform. I don't know how many of you live in Washington, but remember when the Columbia Heights CVS was swarmed by shoplifters, so badly that they're now closing that store? Nobody blamed CVS for the shoplifters, right? They called the police. And in this instance, I'm waiting for somebody to say: hey, tech companies are private companies; they can't arrest or prosecute anybody. They post those cyber tips dutifully, year in and year out, and yet they can't provide the public safety response to them that is needed. And quite frankly, when you look at what is really going on here, and what happens to those tips: they're geolocated and referred to countries all around the world. Again, if we're looking to mitigate the problem, we have to look at the underlying crime problems. No tech company is going to be able to combat, for instance, the sextortion ring that is the Yahoo Boys, a Nigerian-based gang that is operating in probably 25 different countries.
When I listen to those parents in their anguish — believe me, I have worked with hundreds and hundreds of victims. When I discovered in 2006 that the civil penalty for downloading child sexual abuse material was one-third the civil penalty for downloading a song, I got John Kerry to write the bill that fixed that in six months, tripling the civil penalty.
So it's not that I'm unsympathetic. But I'm saying that blaming the tech companies for a global crime problem is not a path to success here. It just isn't. Law enforcement could play a big role. If I were Mark Zuckerberg, I would have been asking a lot of questions. And one last observation, as long as we're on this subject. When I look at the plaintiffs that are suing Meta, most of them have been sued themselves for poor child welfare outcomes — several of them for horrendous child welfare outcomes. So any suggestion that these individual state claims against Meta are somehow on the moral high ground with respect to outcomes for children? Don't make me laugh. At the end of the day, we really need to refocus this conversation, look at what's really going on here, and work with the tech companies, because anger is not a strategy, conflict is not a strategy, and whatever is going on right now is not working — and it's certainly not helping kids.
Great. Um, so I would say my framing of the problem — at Fairplay and with our fellow advocates — is this: the incentives are such that these platforms are using kids' data and designing user interfaces in a way that's meant to extend their time online, expose them to advertising, and give the platforms access to more data in order to better refine the features that extend use and the targeted advertising. So a lot of our work in this space has been focused on a two-pronged approach: one prong around data privacy, the other around safety by design. This year we've started to see some push and pull in the debate, sort of implying that advocates think there's a direct line — that solving these big tech issues will solve all the problems around teen mental health. That's certainly not the case, and certainly not a view shared by my organization or many of the folks that we advocate alongside. What we are seeing is parents, youth advocates, and organizations on the front lines of these issues saying that families really need tools and help. We were at the hearing last month with quite a few parents who have, unfortunately, lost their kids to a range of horrifying online-related harms. And these are folks who had all kinds of parental settings in place and had a lot of conversations with their kids. For me, one of the big takeaways from the hearing was that there's an understanding on both sides of the aisle on Capitol Hill that parents need help, and that these companies have failed to self-regulate. Part of the reason Mark Zuckerberg was getting so many questions is the information we've learned through the lawsuits that many, many states have now brought against Instagram — some of the things that have been unsealed, some of what's been revealed in Meta's own internal research about the way its product impacts kids and teens. Those are the things that are really top of mind for us as we carry this advocacy forward. Again, we're not claiming some direct effect, that we will solve all of teens' problems if we regulate these issues. The fact is that there are features and functions that exacerbate existing issues for kids as they're developing. We're talking about young kids whose prefrontal cortexes are very much not developed, who are very vulnerable to what we know are scientific techniques to influence their behavior. All that said, the hearing only means anything if we actually see action from Congress. We've had many, many hearings; we've had, in the last couple of years, two very notable whistleblowers in Frances Haugen and Arturo Béjar. Our message has been, and will continue to be, that we need to see action now — we've done enough talking.
Okay, so there's nothing new under the sun. There's always been misinformation; there have been children who've had drug problems, or who've been bullied at school, or who've been bullies. All kinds of things have been going wrong in society for centuries. But what's distinctive about the period we're living in is the way in which digital technologies have put various of these problems on steroids. And therefore, in my mind, there's no question that the technology companies — being the masters, mistresses, bosses, whatever, of that technology — have a uniquely important role to play. The idea that the cops are going to sit on the networks monitoring stuff, watching stuff — that's probably an even scarier notion. It's a very scary notion, and it's not a practical one. So the companies have to step up and take an important role. I can't think of a single area of law enforcement in my country — and I'm gathering it's the same here — that is adequately staffed to deal with any area of crime; they're always under-resourced. That doesn't give the criminals a pass, and it doesn't give other agencies permission to ignore what they can do to help minimize or reduce crime. Now, age verification was something I was specifically asked to comment on, because we've done various things in that field in the UK, and I'll start with a short story. In the UK, your kids go from middle school to big school at around age 11. So when our kids went, like most parents, we opened our kids their first bank accounts, to help teach them how to manage money. We failed, which is why I'm a poor man. But nevertheless, routinely, as part of being given their bank accounts at age 11, they were given debit cards. Around 2003, 2004, we began to hear of cases reported to children's organizations — not my kids, I hasten to add — of children, typically boys 14 or 15 years of age, being clinically diagnosed as gambling addicts. And just so we're clear about what they were doing: they were getting their pocket money or their part-time earnings, putting it into their bank account, and blowing it on a horse or a football match or whatever it might be. Now, the law was quite clear: you have to be at least 18 years old before you can gamble on anything, and online gambling has been possible in the UK basically since the year dot — I know the situation has been different here. So I went to see all of the big gambling companies at their offices in the City of London. Very expensive suits, wonderful boardrooms, all of that stuff. I said: you've got kids being clinically diagnosed as gambling addicts coming onto your sites. And they said, yeah, we take this problem very seriously. Now, if I had a pound for every time I'd heard a tech executive say "we take this problem very seriously," I'd probably be in the Bahamas now. But anyway, they all said the same thing publicly: we take it very seriously, we're working on solutions. They did nothing. Privately, what they said was: of course there's friction we could introduce on our platforms to try and protect kids, or slow them down, and so on. But if we do it first, all of our competitors will basically steal our business. So none of the gambling companies did anything until we changed the law. And we changed the law in the 2005 Gambling Act, the relevant parts of which became operative on the first of September 2007.
Under that law, the Gambling Commission will not give you a license to operate a gambling website in the United Kingdom unless you can show you've got a robust age verification system in place. Since the first of September 2007, I'm not aware of a single case — not one — where a child has gone on to a website, ticked a box to say they were 18, and then gone and blown their pocket money, or gambled in any way at all. I'm not saying it hasn't happened. They may have impersonated their parents; they may have borrowed a parent's credit card; that's a different set of issues. But in relation to the gambling companies and the technology they had at their command: they had the means of doing this before the law changed, and they didn't do it until the law compelled them to. And now we're going to do the same with pornography sites. But I might come on to that later, if you want me to.
Well, you know, actually, why don't we just dive into that topic now? We're in a moment where there are a lot of state bills pushing this idea of age verification, of having youth get their parents' permission to use social media, and of enhanced parental controls. We heard in the hearing just last month — I think it was Snap's Evan Spiegel — that only about 400,000 of their teenage users have enabled parental supervision; that's something like 2% of the teens on their platform. My own reporting shows that Meta's safety experts have for years been concerned that relying on parents to police their own kids' online activity might not be the most effective strategy. And yet it does seem like one that lawmakers and even parent advocates can embrace, despite some of the tricky technological issues. I'm wondering if you can all reflect on: what is the best way to verify kids' ages these days? And can we even rely on parents to police their own kids' activities?
Our new legislation, the Online Safety Act 2023, became law in October last year, but it's not yet fully implemented, because the regulations are still being drawn up; they have to go back to the government and then to Parliament before they're operative. So we won't actually see concrete action until probably towards the end of this year. Parental education, digital literacy — all of these things are a crucial part of the total picture. But they are not an alternative to expecting and demanding that the tech companies do their best with the technology that they've got available and which they understand better than anybody else. In the United Kingdom, under an agreement which I helped to negotiate in 2004, if you get a device which depends upon a SIM card — a mobile device, typically a smartphone — then by default it will be assumed that you are a child. You will not be able to access pornography, gambling, alcohol, anything of that kind through that device from the word go, unless and until you've been through an age verification process and it's established that the normal, everyday owner of that device is an adult. This is typical of what happens with most consumer products: you deliver the product in the safest possible way at the point of first use. Of course, if I wanted to kill myself with an electric fire, I could probably buy a very, very long flex, plug it in at one end, and then run into a swimming pool with the electric fire clasped to my chest. There are certain ways in which, if you're determined to mess things up, you will mess things up. My point is that at the point of first use, the tech companies should be under an obligation, where children are likely to be engaged with the product or the service, to ensure that it's as safe as it possibly can be. Parents, of course, can liberalize the settings if they want to. But at the point of first use, it should be in its safest possible condition.
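To make Carr's "safest possible condition at the point of first use" idea concrete, here is a minimal sketch in Python. Everything here is hypothetical — the field names and the verification hook are invented for illustration, not any platform's or regulator's real schema; the point is only that the restrictive state is the default, and loosening it requires an affirmative, verified step.

```python
from dataclasses import dataclass

@dataclass
class AccountSettings:
    """Hypothetical per-account settings, safe by default at first use."""
    adult_content: bool = False       # pornography, gambling, alcohol blocked
    private_profile: bool = True      # not discoverable by strangers
    direct_messages: str = "contacts-only"
    personalized_ads: bool = False

def liberalize(settings: AccountSettings, adult_verified: bool) -> AccountSettings:
    # Settings may only be loosened after age verification succeeds.
    # The safe state never depends on anyone opting *in* to safety.
    if not adult_verified:
        raise PermissionError("age verification required to relax defaults")
    settings.adult_content = True
    settings.direct_messages = "anyone"
    return settings
```

The design choice this illustrates is the one Carr names: the burden of action sits with the verified adult who wants to liberalize, not with the child or parent who wants protection.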
Yeah, sure. Okay. I very much agree with a lot of what John just said. I think that platforms have been very enthusiastic about legislation around parental permission to access. The parents we talk to are very clear that that doesn't really solve many of the problems, because the choice a parent then faces is: do I socially isolate my kid and deny that permission? Or do I say okay — and then they have access to a platform that has all the same problems, because all we've done is pass a permission-to-access regulation. And I'll also very much agree with the sentiment that the strongest default settings and protections need to be in place when a child accesses the platform. I think that's an important piece of this conversation as we think about what the state parental permission bills mean.
If I can add on — I think the problem we have with age verification in the United States is that right now we're in a world where we assume everyone's an adult unless they affirmatively say they're a child. And a lot of the proposals on the table are to assume everyone's a child unless you prove that you're an adult. For many adults there are privacy concerns with that, and for many children there are privacy concerns with that. Take some of these state laws, like Utah's: this is a state that has said it thinks some of these apps are so unsafe they can't be on government devices, yet it wants a law that says every parent now has to upload a copy of their personal ID to those same apps. There's a serious conflict there. I think we need alternative options. For example, one of the options ITIF has come up with — and we're looking for others like it — is to empower parents with something like a trusted child flag that you can attach to a device, so you can put it in a child mode. Then, once you give the device to the child, every app and every site visited after that would have to respect that child flag if it's an adult-oriented site. Something like that gets around having to verify IDs and ages. It's no longer about the legalistic question — are you 18, are you 16, or whatever threshold we want to put — which is a problem because, one, it kind of substitutes government oversight for parental oversight, and two, not all children are the same, right? Some 16-year-olds probably can't handle certain things that 18-year-olds can, and vice versa; there's such a wide range. Creating something like a trusted child flag is basically saying: can we take the ecosystem we already have of some of these controls you mentioned, which aren't being well used, and think about how we can actually make them well used? One of the reasons parents don't use these child safety features right now is that there are so many of them. You have to figure out the one for this social network, and then another social network, and then this device and another device; there's no interoperability between any of that. So our point is: why don't we work on making this all work together, so that you can give one child one device, set screen time once, and have it apply across a Chromebook and an iPad and a Windows device? You can do much more with that than by trying to say, okay, we're going to ban certain types of features on certain types of social media sites, or require everyone to display their ID. We know people don't want to do that, and there's huge resistance, because people rightly are concerned about their privacy.
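A rough sketch of how the "trusted child flag" Castro describes could work on the receiving end, loosely modeled on how the Global Privacy Control signal works today: the device attaches one machine-readable signal to every request, and adult-oriented services are obliged to honor it. The header name and the Flask endpoint below are invented for illustration; ITIF's actual proposal may differ in its mechanics.

```python
# pip install flask
from flask import Flask, request, abort

app = Flask(__name__)

# Hypothetical header name; compare GPC's real "Sec-GPC" header.
CHILD_FLAG_HEADER = "Sec-Child-Device"

@app.route("/adult-content")
def adult_content():
    # A device set to "child mode" would send this header on every request,
    # so the site never needs to see an ID or verify a birth date.
    if request.headers.get(CHILD_FLAG_HEADER) == "1":
        abort(451, "This service is not available on child-flagged devices.")
    return "adult-only content"
```

Note what the sketch implies about the privacy trade-off Castro raises: the service learns only a single bit ("this device is in child mode"), rather than collecting IDs or birth dates from every visitor.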
Wow, my head's gonna explode. You know, I agree, John, that we should be providing the safest possible environment for kids. I've spent my whole life working toward that goal. But when we talk about the public safety aspects of this problem — which we do not talk about nearly enough — we're overlooking something. I live in a town of 3,000 people, 30 miles north of Boston; we have a wonderful little police department. We're not talking about having policemen sit on the systems, right? We're talking about breaking up global criminal enterprises that are not just terrorizing kids, they're terrorizing the companies too. And at some point we skip that part of the conversation. I was reading the other night — this is from one of the pro-parent groups — that the US has traditionally put the onus on parents to supervise their children's online experience, but that this doesn't get to the root of the problem of how companies design platforms to maximize engagement. Okay, but we skipped a huge element there, which is that as crime moved from the real world into cyberspace, we have — forget a legal obligation — a moral obligation to tackle that, and we're not doing it. Why are we even bothering to collect the cyber tips if something like 3,000 out of every 100,000 cyber tips are even examined? The notion that we're doing enough to protect kids in this conversation is preposterous. And quite frankly, by shifting the blame entirely to the tech companies, we're missing a huge opportunity to protect kids. We talk about data privacy — I see my good friend Rick Lane, the Newman to my Seinfeld, is here, and Rick and I banter about encryption all the time. I've spent a lot of time working on identity theft and children. It's a huge problem; if you think the problems of exploitation online are bad, come look at those numbers. And so I've had a lot of concerns about weakening encryption. My dad was an FBI agent — sure, we want to help law enforcement. But what's next, we just strip away the Miranda warnings? We've got to find a better way to do this, and encryption is the backbone of protecting not just kids but every single consumer. We talk about children and data privacy — hello, it's a nightmare. Let's talk again about some of the Meta plaintiffs, because I've been looking at these states for 40 years. I started my work in California, where they were routinely issuing multiple Social Security numbers to kids so they could make multiple Title IV-E claims to the Feds for kids in foster care. We're talking about Meta monetizing kids' data without parental consent — but if you're a child who enters the foster care system in any state but Maryland, and you have Social Security benefits, the state is taking them from you without anybody's permission, and you're leaving the foster care system destitute. So I'm finding this whole conversation a little troubling, because I see a huge disconnect between the lack of outrage about those practices going on in the real world and a lot of the people who are now directing attention to the tech companies — because it directs attention away from them. I think there's a lack of consistency. I mean, we could talk about Utah all day. These state rules that are being put forth — let me just use my favorite example, my least favorite Democrat, Gavin Newsom —
Can we wait on Gavin Newsom for one second?
Sure, but we're going to get to Gavin Newsom.
There is a project which is trying to develop an international framework for doing age verification — I'm chairman of its advisory board. So the problem you were mentioning, about having to jump between different platforms and different methods, should be solved. It was originally funded by the European Union; it's now funded by the UN. So we recognize the problem that you've raised, and it should become smoother and easier. Meta is part of the experiment that we're involved in, and other tech companies are as well.
You know, I think it's important to note that no one's really objecting to gambling sites and keeping children off those. The questions come up when we're talking about sites and services that have a broad range of users. Think about it this way: children are most likely to have amputations because of lawn mowers, right? We could have ID checks for lawn mowers, so that you couldn't push a lawn mower if you're under the age of 18. But we don't, right? We expect parents to be responsible in this space, and we expect there to be a balance. That doesn't mean we sell lawn mowers that are intentionally dangerous — we do have safety standards. But this is where we need more of a balance, and more respect for the fact that there are going to be multiple types of parents and standards and preferences out there. It can't just be: treat everything on the Internet like a gambling site, because everything is not.
One of the laws that was getting a lot of airplay during the hearing was KOSA, the Kids Online Safety Act. That would establish some reasonable measures that tech companies could take to prevent harm. But it's been controversial, right, because some are concerned that it might empower state attorneys general to limit certain types of content for vulnerable users, like our LGBT communities. I'm wondering — and Maureen, I want to start with you, if you don't mind, just because you brought up Newsom — how do we even define what is harmful content for kids? And should the state attorneys general in some of these states really be the deciders of what's actually harming our kids or not?
No. I mean, first of all, I think it was Justice Potter Stewart who said he knew pornography when he saw it. It's a very subjective matter. One of the reasons I've become so concerned about the role that state attorneys general have played in all of this is that, A, no individual state is regulating the Internet, right? And also, child welfare and adoption — another issue I've done a lot of work on over the years — have generally been viewed as state law issues, but they have become so fundamentally interstate in their activity that I don't believe the states can adequately control or enforce them anyway. One of the reasons I frankly pulled away from supporting KOSA is that the state attorneys general are not enforcing their existing child welfare statutes, much less all this new stuff being added on that they're not experts in to begin with. So here's the Gavin Newsom example of why leaving it up to the states is a bad idea. Gavin Newsom had this flashy press conference talking about a bill that California passed that was supposed to be fantastic and solve the problem and protect children in California. And at the same time, they were letting thousands of pedophiles out of prison on early release — one guy served a whopping two days in the LA County Jail for a pretty gruesome crime. This is a crime that has arguably the highest recidivism rate of any category of criminal activity. And bear in mind, it was a miracle that any of these guys were convicted at all, because one of my concerns is that the conviction rates as against the cyber tip numbers are negligible. So really, Gavin, I'm not that interested in having you, or Mr. Bonta, do anything that has to do with protecting children, because there's a fundamental hypocrisy to that kind of disconnected thinking. And as far as the states are concerned, what we're talking about is dramatically interstate activity most of the time; there is no way that they can really wrap their arms around it. These state laws have just become a performative exercise. So at this moment I'm not going to support any bill, A, that doesn't focus on the criminal justice aspects of the problem, and, B, that leaves anything up to the state attorneys general — because I just don't think they've done a good job with kids, not just on this, but on anything.
Others have a viewpoint?
I might add to this conversation: I think there has been a lot of important discussion around the Kids Online Safety Act and the duty of care over the past almost two years now. The bill in its current iteration is pretty clear that it's not meant to impact the existence of any single piece of content; it says very clearly, in a rule of construction on the duty of care, that this isn't about what a child searches for. I think there's an important distinction to be made between holding platforms responsible for the mere existence of content — which is protected under Section 230 of the Communications Decency Act — and holding them responsible for the decisions they make about what they actually promote into our feeds. Because they are training algorithms and targeting metrics, and we know that more outrageous content gets more eyeballs and therefore ends up pushed onto feeds. That's an important piece of this conversation as we continue to talk about how we conceptualize KOSA. I also think the text in its current iteration is clear that it runs to the design and operation of the platforms. And that's really what we're trying to get at: features and functions are simply not the same as the mere existence of content.
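The distinction Hinkle draws — hosting content versus actively ranking and promoting it — can be seen in a toy feed ranker. This is a deliberately simplified caricature, not any platform's actual algorithm: if predicted engagement is the only objective, provocative material outranks a friend's ordinary post automatically, and that ranking decision is something the platform made, not something a user searched for.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author_is_friend: bool
    predicted_engagement: float  # a model's guess at clicks / watch time

def rank_feed(posts: list[Post]) -> list[Post]:
    # Pure engagement optimization: no notion of well-being, age, or
    # whether the viewer asked to see this. A design-focused duty of care
    # targets this *ranking decision*, not the posts' mere existence.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post(author_is_friend=True,  predicted_engagement=0.2),
    Post(author_is_friend=False, predicted_engagement=0.9),  # outrage bait wins
])
```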
Can I just make a quick comeback on that point about the state substituting for parents' rights? That's absolutely not how we see it in Europe. And I might remind you, the United States of America is the only country in the world that has not signed the United Nations Convention on the Rights of the Child —
Oh, I have a lot to say about that, John — and they're never going to, so it's kind of a false analogy.
It's not immensely relevant right now. The point is, we accept that the state has an obligation to help parents, particularly in complex areas like this. What is it — per Apple's own figures, less than 1% of Apple users actually use any of the safety tools that they've put on there. It's quite clear that there is a disconnect between what we expect and hope and would like parents to do, and what's actually happening. And what we're saying with our legislation in the United Kingdom is: that's over. We're not going to take fine words from tech executives about their hopes and aspirations. We're not going to let them mark their own homework anymore. This is a key piece of our new legislation: there will be legal obligations on tech companies to report, in detail, the risk assessments they're making about how child users of their services are actually faring, and the steps they're taking to make sure those problems are minimized or disappear. And if they tell lies — and by the way, they have done in the past — somebody will go to jail, because another unique feature of our legislation is that there are criminal sanctions attached to the reporting obligations. We'll see how it works; we have a new regulator that will be undertaking the work here. But we've had it with fine words and promises from tech executives. That period has ended; we're in a new era where the state has had to act. And by the way, this went through with all-party support — nobody voted against this legislation when it went through the British Parliament. That is extremely rare, extremely unusual. But it happened.
So, John, you know, I emailed you about this over the weekend. The tech companies are not shielded from federal criminal prosecution right now. And quite frankly, one of the reasons I am so keen on law enforcement is that companies shouldn't be marking their own homework. If there are companies out there conspiring with groups like the Yahoo Boys, for instance, let's put them in jail. There is absolutely no scenario in which existing US law does not provide exactly the sanctions that you're seeking. But here's the thing: you can talk about that stuff all you want, and I agree that we want safe platforms, but you have skipped over the entire discussion of public safety. I'm working with a victim of sextortion right now — she's a lovely young woman, and it's amazing how she's come back from a horrible experience. Really, do you think that any platform can push back on an international, global sextortion ring without some help? Don't you think that all of these issues we're talking about could be bolstered — in fact, strengthened — by having a public safety response too? And then I just have to address this US-versus-the-world thing. The US framework of child welfare law is very different from international standards; I've worked all over the world. The US is never going to ratify the UN Convention on the Rights of the Child, for better or worse, because our framework around children is more like property law than human rights law. That just is what it is. But at the same time, I've looked in detail at the global law enforcement response to this problem, and guess what: nobody's doing a good job. Why? Because there isn't enough investment in public safety, there isn't enough investment in collaboration, and there isn't enough focus on the criminal activity that is fueling the victimization of kids. I've got eight children, okay — two daughters and six stepchildren — and among all of them, 17 grandchildren. Those kids range in age from 56 to six. So I've seen every possible application of the evolution of technology in my own family. And would that it were so simple that a company could just wave a magic wand and create a safe space. If we're not creating a safe space for the companies to do business in, they're certainly not going to be able to create a safe space for kids. I'm not here to defend the tech companies. But I am here to tell you that I've been doing this for a long time, and that you cannot ignore the criminal activity that is fueling the suicide rates. And by the way, this whole discussion of mental health has a lot to do with a lot of things that are not technology. And by the way, I haven't heard anybody talk about the positive things that technology does for kids right this minute.

I do want to get there, actually, because it's an important part of the conversation.
Yeah, I agree.
If I can, I just want to put a fine point on why KOSA is a problem at the state level. If you look around the past year, we've had heated debates in many states around book bans and other types of content, where we're seeing state legislators — and we see this at the lower level of school boards as well — substituting their view about what content is appropriate and what content is not. And even though the legislation says it's not trying to focus on any specific content, just think of this last year, and think about what that hearing would look like if a law like KOSA were on the books. Every CEO would face a parade of third-party content and be asked: Why did you think this was appropriate for children? Did you not have a duty of care to take this down? We would just see more of the same political theater, where it will be about the content, and it will be very subjective about what content is allowable and what is not. And to the point about whether we're actually helping children at the end of this: one of the concerns I have with a lot of these bills is that they're basically taking more and more of the Internet away from children. The idea, of course, is that we're doing this to protect children — but we're also taking a lot of the value away from children. Increasingly, what's left for children on the Internet isn't very useful unless you have money to pay for it: paid software, paid apps, paid services. If you want the future of the Internet to benefit children of all backgrounds — those who can afford things and those who can't — we need an ad-supported Internet that actually delivers value. Now, we want to make sure we're not delivering them harmful ads, but it will be ad-supported, just as much of the other public content we've had, on broadcast TV for example, was ad-supported content.
Both of you actually talked a little bit about some of the promise of social media. And one of the things we actually heard some of the tech CEOs talk about is that, for a lot of teens, their whole lives are on social media, right? They find connection, they find communities, they find people they maybe wouldn't otherwise have met. Most people who use social media don't face the tragic stories that we heard about during the hearing. And there is starting to be an argument about whether there's a sense in which some of this is a bit of a moral panic. Are we just fretting about social media the way we once fretted about video games or television or other sorts of technology? I'm wondering if you all can reflect on that idea, and on what you might tell regular parents about how concerned we actually should be about some of the risks we've discussed here today.
I'll answer briefly. I think the comparison with video games is very appropriate. Think back to Columbine, right? Right after Columbine, people were looking for something to blame, and the answer then was video games; video games were seen as the problem. We've seen clearly that regulating video games would not have addressed all the school shootings we've seen since then — there were other problems there. I think the same thing is going on with social media. Part of the frustration a lot of parents have with this debate, myself included, is that it takes attention away from the real issues. If we're talking about real things that are going to help children — more law enforcement, better safety nets for children — there's so much more we could be doing to help children and address children's needs. And it's not about whether we have autoplay on videos on social media. Those aren't the big issues.
You know, I think about how kids used to go to school on horseback, and now they ride school buses that might crash. There's always a generational thing that goes on here. But at the moment, the Ukrainian government is looking for tens of thousands of children who were abducted by the Russians. And how are they doing that? They're using AI. They're using AI to scrape the images the Russians are putting out of the kids, they're matching them against the images they got from the parents who reported the children missing, and they're geotagging the pictures. And guess what — we probably know where 23,000 of those kids are. Look at the way over a million kids have been adopted from foster care in the last 20 years, kids who probably would have had a really hard time finding families if it weren't for the Internet. Look at all the educational applications kids used during COVID. I mean, good lord, kids would have been dead in the water without technology-enabled access to education and to mental health services. And from where I sit, looking at a massively underserved population of kids of all kinds, we look at technology as the new frontier in terms of delivering adequate mental health services, in particular, to all kinds of children, for all kinds of reasons. But when you're talking about a moral panic, Naomi, I think there's another aspect of this that really hasn't been discussed, and that is opportunism. When people started to figure out that they might be able to sue the tech companies, who have some pretty deep pockets, that seemed like a pretty attractive alternative. Again, I look at that and go: okay, they're mandated reporters. So if you're going to start to sue them, you need to start suing your pastor, your schoolteacher, your school nurse — all the other mandated reporters. And in the meantime, by the way, since gun manufacturers are not mandated reporters, let's go after them — except they're shielded from liability, and they kill a lot more kids than Facebook does. So I think there has been a sort of masterful manipulation of the message that has departed sharply from what I, and a lot of other people, would consider to be best practices in child welfare. Mandated reporting has been around since 1974, and thank God it has, because it saves a lot of kids' lives. As far as the issues of suicide and the other side effects of being victimized by criminals: well, I say, let's just give it a shot. Let's see what happens if we go after the organized criminal enterprises on the Internet. Because you know what I think will happen? It'll be safer for kids, and it'll be safer for everybody. Just some thoughts.
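A schematic of the match-and-locate pipeline Flatley describes, sketched in Python. The embed() function is a stand-in for whatever face- or image-embedding model such an effort would actually use; nothing here reflects the Ukrainian government's real tooling. The EXIF GPS read uses Pillow's real API, though images republished online often have that metadata stripped.

```python
# pip install pillow numpy
import numpy as np
from PIL import Image

def embed(path: str) -> np.ndarray:
    """Stand-in: a real system would plug in a face-embedding model here."""
    raise NotImplementedError

def gps_tags(path: str) -> dict:
    # 0x8825 is the standard EXIF GPS IFD; frequently stripped by platforms.
    return dict(Image.open(path).getexif().get_ifd(0x8825))

def match(parent_photo: str, scraped_photo: str, threshold: float = 0.8) -> bool:
    # Cosine similarity between embeddings; a hit is a lead for human
    # review and geolocation follow-up, never an automatic confirmation.
    a, b = embed(parent_photo), embed(scraped_photo)
    cosine = float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return cosine >= threshold
```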
Other thoughts?
My issue with the moral panic framing is that these harms have been realized; they're not imaginary. We have a lot of youth advocates and a lot of families saying they need help. There are many wonderful things for kids and teens online, and our driving motivation behind this advocacy is that kids should be able to learn and play and socialize free from some of these incentives that currently exist because of a lack of regulation. Kids are not going onto a social media feed and just seeing what their friends have posted. They are seeing things that are being pushed to them because a platform has decided it should be so, in order for the feed to be profitable. The fact is that we talk with families week in and week out who have experienced the very worst of this and have struggled mightily against it. This is not something imaginary. We've got the American Psychological Association and the US Surgeon General saying these features and functions can exacerbate issues for kids and overwhelm their still-developing sense of things like when to log off and go outside. That's why taking action is so necessary.
Of course the Internet's been hugely beneficial for the vast majority of children, and for society as a whole. But we're not here to celebrate; that's not what we're here to talk about. We're here to talk about the bits that are still not getting the right degree of attention. The damage that's done to children in the way that you've just described is huge — and lifelong. What I suppose we've basically said in the UK is: we're going to compel you to do the stuff that you've been saying you want to do, because we don't think you're doing it consistently enough, and we don't think you're devoting enough resources to it. So we're going to put the force of law behind it. We're going to make you honest; we're going to make you keep the promises that you've been making with your fine words over so many years, which so far have not produced a good enough result. And can I just say, on the child sexual abuse thing: if you looked at my bibliography, the subject I've written most about, and lectured most on, is victims of child sexual abuse on the Internet. That's a very, very serious, huge issue, and I'm not trying to minimize or reduce it in any way whatsoever. But there are a whole set of other things too, that have very little to do with organized crime and everything to do with the way algorithms work.
Oh, John. All right, now we're having some fun. Listen, I started working with the Boston Globe Spotlight team on the Catholic Church abuse cases in 2001. Most of that abuse didn't happen on the Internet, okay? So again, the moral panic aspect of this sort of ignores the underlying cultural issues, the underlying criminal issues, and the underlying basic sense of well-being of children. And at some point — and God knows, as a mother, I feel this keenly — we have to make a decision about where we as parents draw the line, where we as parents take control of our children's lives. I can tell you right now, because I have spent almost 40 years looking at it: the government makes a terrible parent. And there are, right this minute, 500,000 kids in foster care here to tell you that that is true. So it really concerns me, as much as I am concerned about child safety, that we would hand any of these overarching decisions to the government — especially to elected officials whose ideological positions will change from administration to administration — and take them away from parents. If you look at what's happened in the real world's child welfare system, what I call family policing has had some very damaging effects. That's not necessarily the world we want to build in cyberspace.
Maureen, you may have gotten the last word, because we've run out of time. I don't know that there was any more agreement on this panel than there was in Congress last month. But obviously this is an important issue that we'll have to continue discussing as we figure out the right policy solutions. So thank you for sharing your perspectives, and thank you for listening.