SOTN2023-17 The Future of Children's Online Privacy & Safety: Keeping Pace with Technology
6:52AM Mar 13, 2023
Speakers:
Jamie Susskind
Lauren Feiner
Natalie Dunleavy Campbell
Caitriona Fitzgerald
Dayo Simms
Jane Horvath
Keywords:
privacy
online
age
kids
platforms
kosa
bill
parents
legislation
law
people
teens
state
age verification
companies
implement
parental controls
year
protect
community
Great. Well, thanks, everyone, for coming to this panel. Today we're talking about protecting kids on the Internet: protecting their safety and protecting their privacy. I'm Lauren Feiner. I report on tech policy for CNBC.com, where I cover a lot of these issues. And it's an area where I think pretty much everyone agrees on the value of keeping kids safe online. But in terms of how exactly to do that effectively and practically, that's where there's room for discussion. So we're going to dive into a lot of these questions today. And I'm going to just let everyone go down the line and introduce themselves and maybe just say a brief sentence about your background in this space or the policies that you're following.
Sure, my name is Natalie Campbell, and I'm the director for North American government and regulatory affairs at the Internet Society. Another aspect of my background that's relevant to this conversation is that I used to be a trustee on a school board based in Canada's Northwest Territories, and I'm also the mom of an eight-year-old, and no mother knows a thing.
Okay. Hi, I'm Caitriona Fitzgerald, Deputy Director of the Electronic Privacy Information Center, EPIC. EPIC has been around for almost 30 years, defending the fundamental right to privacy in the digital age for people of all ages, all Americans. We were involved when COPPA was passed, the last time Congress passed a privacy law to keep kids safe online, 25 years ago. And we're following what's happening in Congress with the Kids Online Safety Act, COPPA 2.0, and the American Data Privacy and Protection Act, and we're also involved in states that are looking at age-appropriate design codes.
Hello, everyone, I'm Dayo Simms. I'm a privacy counsel at TikTok. Prior to working at TikTok, I worked for three US federal government agencies: SSA, Treasury, and the Department of Homeland Security. After that, I went to Uber, and about two and a half years ago, I joined TikTok. I work on privacy issues on the US public policy team. I've been following a lot of the developments, and I work with our product management and engineering teams to come up with solutions that are forward-looking. And so I'm happy to be speaking with you all today.
Hi, I'm Jane Horvath. I'm a partner at Gibson Dunn. And prior to this, for the last 11 years, I was the chief privacy officer at Apple.
Hi, I'm Jamie Susskind. I'm tech policy advisor for Senator Marsha Blackburn from Tennessee. I oversee the senator's work on the Commerce Committee, including the consumer protection subcommittee, which covers privacy and data security issues. I was the primary staffer on KOSA, and I sort of manage the privacy issues for the senator. And you all are lucky, because my almost-six-year-old wanted to come today, and I was able to hand her off to her dad.
Well, thanks, everybody. To kick things off, I thought we could first go over some of the key pieces of legislation in this area. So Jamie, I was going to kick it to you to talk a little bit about the Kids Online Safety Act. Could you tell us a little bit about what that legislation does, where it is, and, you know, how it fits into this landscape?
Yep, happy to. So the Kids Online Safety Act: Senator Blackburn introduced it with Senator Blumenthal from Connecticut about a year ago. And that followed five hearings they had had in their subcommittee on the Commerce Committee about harms to kids and teens online. Those hearings included academics and researchers and industry representatives from different companies, and the KOSA legislation was basically the culmination of the things that they had found after those hearings. So we have not yet reintroduced it here in the 118th. I wouldn't take anything from that, because it's just taking some time to get the final text ready. We have new leadership on the Commerce Committee in Senator Cruz, and we just want to work with him and Senator Cantwell to make sure the product is the best it can be, so we don't have to have, you know, a big, long, protracted markup process. Last year, the bill was voted out of the Commerce Committee 28 to zero; we're hopeful for a similar result this year. Basically, what the bill does is it would require covered platforms to create what we call safety by default. And in our view, this picks up where COPPA leaves off. COPPA, which, since you're in this room, most of you probably know, is the notice-and-consent regime for kids to get online, and it cuts off at age 13. And so, you know, basically what we have come to learn is that COPPA has its good points and it has its bad points. But at the end of the day, it is simply a notice-and-consent regime for kids and teens. Is it preventing a lot of the harms that we're seeing? And so, you know, I'm happy to talk with our fellow panelists about that, but we need to be doing more. And so that's what KOSA looks like.
Great. And then just to give us an overview of some of the other legislation on the table: Caitriona, would you be able to tell us a little bit about the efforts to revise COPPA? And I think you also looked into the California Age-Appropriate Design Code.
Yeah, sure. So COPPA 2.0, from Senator Markey, with a companion on the House side, would update COPPA. It would extend its protections to teens up to age 16 and ban targeted advertising to kids, so it goes a little further than the notice-and-consent model. But it's focused on privacy and data collection, not so much the design pieces. And then in California, they passed the Age-Appropriate Design Code law last session, following the model of the UK's code. It, you know, takes on the same concerns but in a little bit of a different way: it focuses more on the data practices that are driving the business decisions that are keeping kids online. So it requires companies to do data protection impact assessments that look at what harms are coming from their data practices, and then to mitigate those harms.
Okay, great. And, you know, I think when it comes to these kinds of discussions, we end up talking a lot about what it will actually look like to implement these kinds of policies. So Dayo, I wanted to ask you, as someone in the industry: when you're looking at this legislation, what are your concerns, or what do you see as the challenges in trying to implement some of these bills that aim to protect kids online?
So I think what we try to do at TikTok is look at, I guess, the overall trends and what this legislation is trying to do. And we look to implement some of these things in advance, because they're happening in a piecemeal fashion. So in the UK you have the age-appropriate design code, in California there's the AADC, there's activity in DC, but no, there isn't a national bill that protects teens beyond the age of 12. And so we're thinking about how we build in default protections. For example, at TikTok, we've thought about children's development. From 13 to 15, we believe that teens need a lot more hand-holding and guardrails, so we make their accounts private by default, and we think about ways to implement default limitations on how much time they can spend. Just last weekend, we announced that there will be a 60-minute default screen time limit. And then we also rolled out a suite of additional tools to add to our Family Pairing feature, which is actually kind of consistent with KOSA's parental controls provision. So we've been thinking about ways to look at this legislation, apply it broadly across our community of teen users, think about, you know, how much hand-holding they need, implement those things, and then continue to monitor as developments come up.
Yeah, I think something interesting you brought up there is looking at the different age brackets of users and seeing, you know, whether they might need different levels of hand-holding and privacy protections. So does anyone want to speak to how we might think about different ages of minors, and, you know, whether they should have different kinds of protections or different kinds of defaults depending on their actual age bracket?
I can add a little color, and then everybody else can chime in. So from 13 to 15, we've added a lot of additional protections. For example, you can't DM on TikTok if you are under 16. And the reason we do that is because we know that there is some grooming activity and behavior that happens in direct messaging, and we want to safeguard teens from being groomed online. So those are some of the things we do. And then at 16, and we work with child safety experts and adolescent development experts to think about it this way, you know, at 16 you start to make a few more decisions about how you want to be online, who you are, how you want to spend your time. But we still provide tools for parents to work with their teens to make decisions around what's appropriate for them, whether it's screen time or the content they look at. I think there are a number of different people involved in designing these kinds of experiences, so we partner with parents and other organizations. And I'll stop there.
I'm happy to just speak quickly. I applaud all of the efforts to increase the delta between 13 and 18 and to protect children. But I think that we also need to be looking at an omnibus privacy law, because your question was, should we be looking at different levels of privacy protection for children? Everybody in this country needs a fundamental level of privacy protection. So I'm not saying it should be either/or, but we really also need to be focused on making sure every citizen in the country has privacy protections. I applaud the states that have enacted privacy laws, but there are a lot of citizens in this country that right now do not have a state-level privacy law. So I think that would end up protecting kids' privacy as well. And it would educate their parents, because a lot of the problem we have right now is that we have children who are digital natives and parents who, some of them, have very little knowledge. I mean, maybe they're carrying a mobile phone around, but they have very little knowledge about what they're doing online, and the kids know more than they do. And, you know, I go back to thinking about COPPA compliance and verified parental consent. That was a real challenge, with children who are digital natives and parents who were not, as we were doing that when I was at Apple.
I couldn't agree more that we need a comprehensive privacy bill. And I'd say, you know, I think sometimes the complications with determining users' ages online can be used as a crutch to not pass privacy regulations, and it shouldn't be, because the best thing we can do to protect kids online doesn't require age verification; it requires changing the harmful business practices that are resulting in endless data collection. That would protect us all, but it would do a lot for kids, because so much of the addiction and so much of the harmful practices are driven by that data collection.
So I was nodding along, because I agree with a lot of that. A couple of things, though. Number one, KOSA is not a privacy bill; it's not ADPPA 2.0. So I think you're mixing the two a little bit, because, as I said, it's more about safety by design. A lot of what it deals with goes well beyond data, because what you're looking at is how algorithms push content at kids and teens, like suicide and self-harm content, and that sort of harm goes well beyond any data that is collected, because, you know, it's possible you never searched for that information and suddenly you stumble across it. So that goes beyond just the data practices. I think it also does us, frankly, a disservice to say, well, if you do something on kids' safety you'll never get comprehensive privacy passed. I literally sat here a year ago and we said the same thing. You know, Senator Blackburn has been a vocal, vocal supporter of getting comprehensive privacy done, and, you know, we get the responses: you can't do this, and if you do this you can never get comprehensive privacy done. It's hard to reply to that, but yeah.
Does it make it harder to get comprehensive privacy done?
No. You know, I don't believe that something that deals with kids' online safety takes anything away from the other conversation at all. And frankly, at the end of last year, we tried to get KOSA and COPPA 2.0 included, and that didn't happen, and, you know, no disrespect to the House, which had its own priorities. But even taking those out of that conversation, you know, we can walk and chew gum at the same time, so we shouldn't let that stop this.
So I want to share a little bit about, first of all, how the Internet Society looks at these issues of child safety online.
At the Internet Society, our mission is to make sure that the Internet is for everyone. And that means everyone. There are two facets to that: there's, you know, growing the Internet to make sure that everyone who wants it can have it, but then there's also protecting and defending the Internet, and not just any Internet, but one that is open, globally connected, secure, and trustworthy. How do you do that? How can we know for sure, and analyze whether bills are living up to this aspirational state of the Internet? This is where we developed an Internet Impact Assessment toolkit. Anytime we flag any of these kinds of bills and have concerns, we run it through this analysis to see whether it's hindering, you know, an open, globally connected, secure, and trustworthy Internet. And this is how we know whether something is getting in the way of our mission, which also happens to be what a lot of countries committed to upholding in the Freedom Online Coalition. So, that said, when we look at things like privacy: privacy is a really crucial thing for a trustworthy and secure Internet, so privacy for everyone is a good thing for the Internet. But where we start to see concerns is when we talk about things like age verification, because that can, I'm looking for the word, it only comes up in French, it can undermine our efforts toward privacy, right, if we're asking platforms to collect more information than they might otherwise collect. And another thing that we spot in some of these privacy and safety bills is that sometimes they ask platforms to do content moderation and takedowns. Some platforms already have really great content moderation policies. But if you're forcing platforms to do this, and forcing platforms that are, let's say, end-to-end encrypted messaging platforms, then that undermines security online.
And that's where we get further from this aspirational state of the Internet, one that is secure and trustworthy.
Yeah, I mean, so I'm curious, based on this discussion, would any of you make the argument that this whole discussion around kids online safety legislation is maybe somewhat misguided? You know, is it that we, you know, these are well intentioned goals, but that really, we should be focusing at a more fundamental level on comprehensive privacy or incentives of these business models or things like that?
Can I actually read from the latest draft of KOSA, if that's helpful? "Nothing in this division shall be construed to require the affirmative collection of any personal data with respect to the age of users that a covered platform is not already collecting in the normal course of business, or a covered platform to implement an age gating functionality." So I think that's pretty clear that we're not asking for that. And, you know, once again, with respect to the research you guys are doing, that's great. But, you know, I don't think that it gives us concern. If we hear that, you know, platforms are really struggling with this and they don't know how to do it, because COPPA has been law for a long time, and if platforms don't know how to do that, then there are bigger problems, and they probably should be talking to the FTC about it.
Well, Caitriona, did you want to jump in there?
Yeah, just to your question: I think we can do both. And that's what the American Data Privacy and Protection Act does. It, you know, sets comprehensive privacy protections for everyone. It sets a data minimization standard that says companies have to stop collecting limitless amounts of data; the data they're collecting should be necessary for the purpose it was asked for, with some narrow exceptions, because not everything fits in that box. But then it also has heightened protections for kids. It says kids' data is considered sensitive, so it has to be strictly necessary for the purpose you're using it for; it bans targeted advertising to kids; it creates a youth marketing division at the FTC. So I think we can do both. I mean, I don't think there's anyone in this room who thinks we don't need to do something; you know, the kids are not all right, and we need to do something. And we can't put it all on parents. You know, I'm a privacy advocate, and I have my eight-year-old; I can limit what I can on her device at home. But at school, they're on apps that I would never allow at home. In middle school, she's going to be given a district-issued device, and I've heard from parents in the middle school that they are not allowed to disable YouTube, because teachers assign YouTube videos. So there's no way to disable YouTube on these district devices, and kids are just watching how-to videos while they're supposed to be doing their homework. So even the most educated parents can't win; there need to be changes to the business practices.
Yeah. Does anyone else want to speak to that?
I would agree that legislation needs to happen, and broadly, for all Americans. Oh, I'm sorry. So I was just saying I agree; I think there are a number of different people who have a role to play here. Legislators, for example, have a role to play. So, you know, I totally agree that legislation should be passed. I also agree that parents and platforms have a responsibility.
Yeah. So I think we touched a little bit on age verification, and, Jamie, I appreciate you bringing up that point in KOSA. But when it comes to the specifics of age verification, because I know there are a lot of different pieces of legislation across states and across countries where age verification might come into play: are there ways to be sure, or pretty sure, about a user's age without infringing on their privacy?
I think that is probably the biggest question: how do you verify? They're on a terminal, you can't look at them, and you really rely on an age gate right now; we haven't come up with a better thing. Right now, it's: when you're signing up for an account, enter your age. Well, I have a now-25-year-old who's 26 on Facebook, because he had figured this out and wanted to get on early. This was a long, long time ago, before he was 13. But I think that is the harder thing. When you use the term verification, it implies that you actually know how old they are. And so, even though, and thanks for the clarification, KOSA says you don't have to collect additional information, it's hard to understand how you're going to meet a standard of verification without actually collecting some kind of official age document, like a birth certificate.
So just to add: at TikTok, we have a neutral age gate. We ask for a birthday, and you can't change your birthday once you find out that, you know, some settings don't apply to you. And there are some challenges. What we do is, if we find that a teen has lied, we remove their account; we had to remove over 16 million accounts globally last year alone. Because, you know, people say that they're, you know, 16, and then they say in their profile that they're doing a third-grade project. And in those cases, I mean, people say kids are savvy, but they're not that savvy; they give some things away. And when they do, we have the responsibility of taking that down. But, you know, to your point, and even to the point on KOSA, there's a provision in KOSA that says they want to conduct a study about how to do age verification, maybe at the platform level or at the device level. I think there are going to be a lot of apps that come up over time; you mentioned that you don't even know what apps your kid is going to see on a district-issued device. So there are going to be continuous new apps popping up, and without some kind of centralized way to say this is an 11-year-old's phone, which platforms can then take and use, it's hard to do a better job of making sure that we're protecting teens.
Yes, so that's an interesting point: maybe the verification needs to be at the level of the device. And so Jane, maybe you could talk a little bit to that. Are there challenges in verifying age on the device in a way that could be applied to other apps that are on that device?
I mean, again, it's a huge issue. You're talking about one device, but a lot of families, a lot of people, have multiple devices. So which device are we looking at? Maybe the school device would be the right one to look at, but not everybody has a school device. So I think using the device as a source of truth is, again, difficult, because you might have shared devices. When you look at Apple TV, there are different accounts that are signed up, and you can switch accounts. So there's no one-to-one correlation between a device and an account.
I think this is a situation where legislation can drive innovation, right? Companies will figure this out if we require them to figure it out. They love innovating; they tell us they love innovating. So, you know, is there a good system now? No. The French data protection agency did a report last year saying no, there's not a good way to do age verification now. But I'm confident that if we had legislation that required it, they would figure it out. One thing the California Age-Appropriate Design Code does is focus more on what they call age assurance, rather than age verification. So it's kind of like having a good idea of what age-ish the user is, and then putting in appropriate safeguards based on the risk of the product that you're dealing with. So, you know, if it's a social media company, you want to be a little more sure than if it's, you know, a search website or something like that.
Yeah. So what do folks on the panel think about the idea of age assurance? Is that just the best that we can do right now? Should we aim for age verification, or does that implicate too much data to want to go there?
I can pick that up. Well, I think the bottom line is that there's no for-sure way to do it, right? There's no reliable way right now to do age verification. And whether we're doing it by facial recognition or by asking a bunch of intrusive questions, it's still generating more sensitive data, which undermines the goals of privacy.
So as far as legislation goes, I mean, I think that we're open to the conversation. But my boss is not, and I've talked to her about this, right, she's not going to support a world in which you provide a government ID card to get on an app; that's not what we want to do here. But, you know, I think a second question is: are the platforms going to be incentivized to remove these folks? You know, sorry to be hitting on Instagram, who is not on this panel and can't hit back, but when we had one of our hearings in the subcommittee last year, they said, oh, but we removed 800,000 underage accounts. But at the same time, we saw articles where kids who were old enough to be on the platform were recruiting their siblings who were not. And that was in the news, so I'm going to assume that it's probably true, because there were interviews with the company. So are the companies going to be incentivized to actually remove those folks? You know, if that's the case, sure, age assurance is feasible, because nobody's ever going to be perfect. But there also has to be some give and take.
Yeah. So when it comes to the realities of implementing these kinds of regulations, is there anything that anyone here would say legislators and regulators don't seem to understand about the complexities of these issues, and, you know, maybe unintended consequences that could happen as a result of them?
I don't want to get into excuse-setting; that's just, you know, another excuse to not pass comprehensive privacy legislation or pass kids' online safety legislation. There's plenty of expertise in Congress; they can get this done. It's just a matter of will.
I think one thing that concerns us at the Internet Society is when certain bills ask platforms to take reasonable measures to moderate and take down what could be harmful content. One of the complexities with this is that it's really hard to tell the difference between harmful content and education that might be trying to mitigate harm. I can give a really good example of this. When I was in Canada's Northwest Territories, which is a vast geography of mostly remote communities and Indigenous communities, they faced disproportionate rates of suicide compared to the rest of the country; it's really tragic. But there's this group of kids that started a really great campaign called We Matter. And what it does is provide videos and testimonials, sharing stories about surviving suicide attempts, to make kids feel like they're not alone. And when you're in a community of, like, 300 people that is a fly-in community, sometimes this is enough to mitigate that risk. And yet, if we're asking platforms to do moderation, this video might pop up as something harmful and just get taken down, when it's an absolute lifeline for kids in these communities. And there are other services too, right, like end-to-end encrypted messaging services. When we're asking platforms to moderate content, that means you can't do it with end-to-end encryption, because end-to-end encryption means that nobody besides the sender and receiver knows what's being shared or communicated. And so, again, say we're in a community of 300 people. Maybe there's no mental health counselor in town, and maybe you need to reach out to a service provider somewhere else, and, you know, end-to-end encrypted messaging services or video conferencing services are super critical to make sure that your personal information remains confidential. You can imagine that in a community of 300 people, it's really hard to do that without end-to-end encryption.
And so those are some of the complexities that we see that are concerning, with respect to, you know, the secure and trustworthy Internet that we're hoping to get closer to.
So I think we fully appreciate the need for encryption, particularly when it comes to things like banking online. But I would point out, and I can speak for our bill, that there's nothing in our bill that requires anybody to break encryption. And sometimes it does seem like this is what companies fall back on, as sort of a "don't look at that," right? We saw that with EARN IT, and, you know, whether you have good feelings or bad feelings about EARN IT, that's not why I'm here. But there's nothing that requires that. And parental controls, which companies are doing on their own in a lot of cases, that's not, you know, requiring somebody to go into anybody's private communications; it's not asking anybody to break encryption. So I think there's obviously a need for encryption in certain cases, but these bills don't actually touch that.
On the point about suicide and self-harm content: I think you mentioned a really important point about the delicate balance of, you know, people seeking communities to heal from various traumas, and then also, you know, training moderators to understand the context around these videos. So for example, if someone were to search for suicide content on our platform, we direct them immediately to resources and our community guidelines; we have a link to the national suicide hotline and text line. But then, you know, we also have trained moderators who understand the context. Every piece of content that is about suicide isn't necessarily encouraging suicide; it may be encouraging people to seek help. And so being able to identify that matters, and we have over 40,000 people who work in trust and safety at TikTok, and they work on identifying the context around these types of videos. So I think you raise a really good point: people might not have access to resources, and they turn to the Internet for things they just don't have access to in their communities.
Yeah, and I think that discussion touches on something that came up in some criticism of KOSA and the Age-Appropriate Design Code in California: a concern about whether these bills would unintentionally keep certain resources from kids on the Internet, especially in certain marginalized communities, by virtue of, you know, certain topics being considered not age-appropriate, even though there could be resources that are really helpful for kids who are struggling in certain ways. So I guess, does anyone have anything to say about how those concerns can be balanced, you know, balancing access to resources and access to age-appropriate information?
And, you know, I think the appropriate way to do it is you link it to accepted standards, you know, things the American Academy of Pediatrics would define. You do it that way to make sure that it's not, like, a state AG interpreting what's harmful to mental health; it's medical professionals that are determining it.
Is that something that KOSA or the California bill have done, or do they leave it a bit more open ended?
And in the definition, right, mental health harms were added?
Yeah, and yes, we've worked to clarify and specify, and those are some of the things that we're still working on cleaning up before introduction. We also have a kids council that has experts both from companies and the mental health community, you know, teens themselves, things like that. And their job is to work on clarifying sort of the universe. And, you know, I fully appreciate the issue about, I guess we would even call it "lawful but awful" content. I fully appreciate the idea of worrying about over-moderation, but at the same time, we're seeing so many cases of under-moderation of things like fentanyl sales that are just there and not being caught, for whatever reason. It's just not excusable.
Yeah, and I think we've talked a bit about parental controls here, and I just wanted to go into that a little bit more. A lot of the time we see solutions from the tech companies that focus on parental controls, and it seems parents really like having those options, but at the same time, it's really hard for parents who are busy to have to manage one more thing. So is there a way to move beyond parental controls? And how can parental controls themselves be made more manageable when we put these kinds of solutions in place?
Yeah, we need to move beyond them, both because it's too much for parents, and also because parental controls assume a healthy parent-child relationship, and that's not always the case. So we need to account for that. But the way that we move away from parental controls is we force changes to the business practices that are keeping kids online.
That's, that's the way to do it. I think I've made this point quite a few times: I do think that parents have a role to play in their children's development, in their lives, and in what they want to see online. And you do raise a good point that that might not always be a healthy relationship, but I don't know that we can solve for all the problems that parents and children may face. What we can do is give them resources and guidance and the ability to make choices that will allow them to be safeguarded. So I think parents have a role. I've said it a million times, and I will keep saying it; I think we all do. So giving them those tools is something that we work to do.
Jane, is this something that you've had experience with at Apple?
Well, I think that is an area where the device can be very, very helpful. Parental controls on Apple devices are behind a separate parental login, so if you have your child using the device, you can set the controls with a separate passcode so your child can't get in. And I think there are some creative things there, you know, resources that could come along to educate parents on the types of parental controls for certain age groups. Maybe some of the medical associations and mental health associations could come up with some recommendations for parents: if you have a child between X and Y ages, these are the parental controls you should set. So that's an idea.
And Natalie, anything to add?
Just to maybe build onto that, I think you picked up on a good point, which is what's age appropriate. Some bills that we're seeing are not just for 13 and under; they go up to 17, 18 years old. And when the Internet Society, you know, goes back to its mission of an Internet for everyone, some kids might not have access to the Internet if there are parental controls preventing them from doing so. For example, if a teen wants to access information about LGBTQ resources, or is discovering their identity online, they might not be able to do so if there are parental controls that prevent them, unfortunately. And there are lots of other examples like that. So that's just one of the tricky parts of some of these bills that really widen that age range from kids up to teens.
And in the latest iteration of KOSA, we did try to distinguish between kids that are at the younger end of that range and the folks that are at the older end, and have different settings on versus off for certain things, recognizing that there might be distinctions, and that there may be different approaches that folks would take with parents and kids at those ages.
And I think more recently, we've seen a little bit more discussion and proposals around just banning social media for kids under certain ages, like sixteen, or whatever the age might be. What do you all think of that sort of discussion? And is there a certain value of teens being on social media that they might miss out on if they're not allowed access to it at all?
I think there's a lot of value online. We talk a lot about the harms, and we're here to address those harms, but there's so much educational content that we now have access to because of the Internet. And completely banning teens from using a tool that they'll have to use as adults at some point in their lives, and not allowing them to learn how to develop healthy habits online, I think deprives them of some of the more essential things they're going to need as they grow up. So I'm not supportive of that. And then I even think about the pandemic, how isolated we all were, including all of the teens. How else would they be communicating if they weren't able to access online tools? We have a responsibility to keep teens safe online, but we also need to enable them with tools to figure out how to manage themselves.
I think the other thing that's important to consider is that people use different ways to access the Internet, and sometimes that's just Facebook. In the Northwest Territories in Canada, Facebook is super popular, especially for communities that are spread out across this giant territory. That is how we talk to grandma. That is how we stay in touch with our loved ones in communities that are thousands of kilometres away. So on blanket banning social media: there's no silver bullet. I think that we just have to be mindful that there are different ways to access the Internet, and there's no one thing we can do to mitigate harm. It's going to take a holistic approach that involves education, that involves consideration for how people are accessing the Internet, and that we're all contributing to.

I also think you have a definitional problem. How do you define social media? Is social media now a chat group where you have 20 people in a chat? And bans don't generally work; people, and kids especially, are really good at getting around bans.
I think people have asked my boss this already, because other senators have had bills about this. Our view is that that's a family decision, and that's between kids and their parents to figure out. But the thing that I will say is that parents are drowning in all of this. You know, we did KOSA because so many parents and physicians and teachers came to us and said, we don't know what to do here, we're lost, and our kids are lost. So no, I don't think that it's the government's role to outright ban them from doing it, and I don't think that, as Jane said, it would work. But at the same time, they need help.
Yeah. And I mean, I think we've already seen some governments try to ban TikTok, of course. Have we seen any impacts on kids yet from those efforts? Or is it too early to say?
Well, we're still here. My colleague, Will, will be talking a lot more about this at 4pm, so if anybody wants to ask a question there, feel free, but we're happy to talk more about it here too. I mean, we have a community of over 100 million users on TikTok who love using the app, who learn, who have built their small businesses up, and it would really be a shame if TikTok were banned. And I'm not just saying this because I work there. I love the app, and I feel committed to the mission.
We should probably be on TikTok, sorry.
Well, looking a little bit outside of the US: I know that KOSA looked to the UK Online Safety Bill. Jamie, could you talk a little bit about the UK's model, and where we are in relation to the UK and maybe other parts of Europe?
Yeah, I mean, I'm not going to get into all the details, but it's similar to KOSA, though broader than KOSA, which makes sense if you put it in the context of them having the GDPR: it sort of was an outgrowth of GDPR. My boss recently actually met with Beeban Kidron, who was sort of the person pushing the UK's online children's code. So it's a little bit broader, right. They do get into some of the issues of not processing data in ways that are detrimental to kids. They talk about different data protection principles. And once again, I think it's broader because it is an outgrowth of GDPR, and sort of talks about how GDPR in particular is going to be applied to kids and teens. What happens with GDPR here is a broader conversation that we could sit for another hour and talk about. So I think we were left with the view of, yes, we have many similar viewpoints, and what they have done in the UK is in many ways helpful, but at the same time, I don't know that that model is always going to work in the US. So we're trying to do it our way.
And one of the critical differences: the UK is a party to the UN Convention on the Rights of the Child, so there's an actual definition of the best interests of the child that they can follow in their legislation, whereas we don't have that.
Yeah, so I guess, are there other aspects of the UK bill where the US might grow or expand on these efforts in the future? Or, Jamie, are there certain things that make it not really an American type of law that we would ever really see here?
I just don't see it, and, you know, in no way am I looking to denigrate their efforts. Like I said, we have similar goals in mind, but it's a very UK-focused law, right? It's the place where GDPR came from, and it's a very GDPR-focused thing. So I just don't see us going down that road. But yeah, there are a lot of conversations still to be had about how the US shows up on the international stage. They have GDPR; we have nothing, so there's always that hole. But I just don't know that I see us moving to adopt that model to fill it. Something will, but I don't know that that's it.
States, though, are really close to it. The California Age Appropriate Design Code is fairly close to the UK one, and you're seeing that model in other states: Oregon, Maryland, Minnesota, and, don't quote me on this, Nevada and New Mexico this session. Connecticut, too. And Texas just dropped a bill, which was interesting, with a private right of action. And then on the other hand, you have Utah, where a bill last week went to the governor's desk. That one goes too far: it requires parental access to all of a minor's communications. So that shows the range of what's happening in the states.
I guess we have seen this discussion of industry saying that having too many state laws will really make it difficult to do business across the country. Are we already seeing those effects? How difficult is it already to offer services in different states that might have different regulations around online privacy or online safety for kids?
I think it can be challenging, because right now we're seeing all of these states pop up with different laws, and there will be different requirements, right? And so how do you make sure that the standards are applied across the board? Do you decide the most stringent law is the one that you will apply? You know, I think it's hard, because now we'll have a patchwork of rights distributed across the US. It would be great if we had something comprehensive that said every child in the US is entitled to these protections; it would be much easier from a business standpoint to implement that. But with all of the state bills that are coming up, and it makes sense that they're legislating, it's their job to protect their citizens, we have to think about, okay, how do we apply this law to this population? Or do we apply it to everyone? So it starts to be a little challenging, especially if laws are in conflict with each other, or different, or one goes way further than another. It's hard to navigate sometimes.
And then it's just compounded if you're a global company: you've got the United States to grapple with from a compliance standpoint, and then, across the board, we're seeing privacy laws being passed that are not consistent at all. I think it is interesting to look at what happened in Europe. In 1995, they passed the Data Protection Directive, which directed every member state to pass a privacy law. Well, it turned out each member state passed a somewhat the same, but somewhat different, privacy law, and that was impeding the free flow of data across the European Union. So that was one of the drivers of the GDPR. Of course, the fundamental right to privacy was as well, but they wanted to ensure the same level of protection across the entire European Union, so they passed a regulation, which applies the same law in every member state. And we in the US are kind of taking the directive approach, in a haphazard way. So I think if we could get a very strong omnibus privacy law, that would of course increase the free flow of data and make it easier from a compliance standpoint for business.
I just want to call out a little bit that industry is complaining about a patchwork of state laws while, at the same time, they're pushing weak state privacy laws in many states. I don't want to just name names here, but I'm going to call one out: investigative journalists at The Markup showed that Amazon lobbyists pushed the Virginia Consumer Data Protection Act a couple of years ago, and they're pushing that model in many states. It went through in a matter of weeks, and it does very little for privacy. So, you know, they're trying to get all these weak state privacy laws passed in order to lower the bar at the federal level, and at the same time, they're complaining about a patchwork of state laws.
Are there certain hallmarks of a weak state privacy law? Like, when you see a bill that leaves out certain things, or includes certain key words, does that tell you this is probably not a strong bill?
The split is whether it only provides users the ability to know what companies are collecting about them, without limiting what companies can collect about them at all. That is, whether it just provides those user rights without any forced changes to the business practices.
Yeah, so when you're working at a company and trying to figure out how to implement laws across different states, how do you decide? Are we just going to implement this standard across all of our operations, or are we going to limit it to a certain geographic area?
If that question is for me, I would say that it really just depends on what the law says, how it applies, and where else it applies. There are various different laws popping up, as we mentioned, and it really depends on what the provisions are. But I think it's easier for companies a lot of times to take one approach and say, okay, the strongest law is the one that we're applying across the board, because we don't always know where people live. You could be a California resident; I used to live in California, I now live in DC, and I still say that I'm a California resident sometimes to get CCPA rights. But we don't always know, and we don't really necessarily need to know, right? And so it's hard, because otherwise we have to think about where people live in a way that we don't necessarily want to do.
And where do the different platforms pull that information about where you're based? Is it based on where you currently are, or, I don't know, your phone, or...
I think that's challenging too, right. So, like, TikTok doesn't collect precise GPS location. And I travel a lot, too. So I mean, I think it could be determined by where the device was bought, or, you know, IP address, but I think it really depends. And you'd have to really ask for that information, I guess, to comply with the law. Yeah.
Sure, I think there's also another, perhaps unintended, consequence or outcome, which is that services could decide to not offer service in certain locations. We at the Internet Society have a little bit more of a global perspective on this. For example, with the UK Online Safety Bill, which imposes a duty of care, that means service providers have to moderate content and, in effect, can't use end-to-end encryption. Lots of service providers have said they won't offer a service with second-rate security for users if that bill becomes law, because they're not willing to do that and to take that responsibility. And so that is another potential outcome as well.
Yeah, I think even short of seeing some of these bills become law, we've already seen companies taking action, like Dayo mentioned, trying to proactively meet some of these new standards that are being discussed, or figure out new ways to protect kids on their platforms. So I'm just curious how many of you have seen even the discussions of these regulations pushing the industry to make changes? Or do you think voluntary action has fallen short so far?
Well, I'd certainly say it's fallen short. This experiment has been going on for 20-plus years, and we know self-regulation doesn't work. And it's helpful for businesses to have certainty. You know, startups that are just starting out know what rules to comply with. They know the rules of the road, and they're competing under the same rules of the road as the big tech companies. So it provides certainty for businesses, and at the same time, it protects privacy.
So we've seen a lot of companies that will roll out changes in advance of a hearing or in advance of a bill drop, and then in some cases those changes don't actually end up getting implemented. So for us, it's a lot of too little, too late, which is why we're going to keep going on this. For example, we saw last year that Instagram came to us and said, okay, great, we're going to make all of the profiles for minors private by default. And then, lo and behold, they weren't. But they announced it in advance of a hearing, so they could put out the press release in advance of the hearing. So, good, I mean, if companies want to start implementing changes, great. But for us, it's like, okay, this feels like the least you could do, and you're doing it because this is public.
Yeah, yeah. And I guess just to wrap up, I'll go down the line. If Congress could pass one thing to most protect kids' online privacy and safety, what would it be? And what's one thing that industry should do right away to better protect kids online? We'll start with Jamie. I think I know how that's shaping up, pretty obviously. Yeah.
What could they do? I don't know, be better about taking down the actual, like, illegal things that parents have tried and tried and tried to report, so that, you know, my boss doesn't have to continue taking meetings with parents whose kids have died from fentanyl being sold on platforms, things like that.
Yeah, I am still convinced that an omnibus privacy law comes first, and then second, because I don't think they're exclusive, looking at a children's online safety code. Because if you look at the UK, the UK has GDPR as a backstop to protect privacy as they're implementing some of these requirements. And then what could industry do better? I tend to agree with you on some of the harmful content, but I think they need to do it across the board. You know, it's not just children.
I would echo comprehensive privacy legislation in the US; something with strong children's protections would improve our ability to protect teens, for sure. I would also say that, as platforms, we understand that there are harms, we understand that we have a responsibility, and we're constantly working to improve. And as we continue to improve, we'll do what we can to make sure that things are safe online, because we don't want to see those headlines.

I'm not sure we have enough time for me to list off all the things this industry should be doing differently, so I might just pass on that. But the best thing we think Congress can do to protect kids online is to pass a comprehensive privacy law, like the American Data Privacy and Protection Act, that includes strong data minimization rules and has heightened protections for kids. So it's addressing those concerns, but it's also setting a culture of privacy in the US that's just missing right now.
So I'm going to give a vaguer response: more privacy for everyone. The more we get towards a secure and trustworthy Internet, the more we can make sure that the Internet actually is for everyone. And that involves not undermining encryption, or incentivizing companies to undermine encryption, because it is a crucial way that people can control their own privacy and protect their data.
Well, thank you all so much for being part of this discussion. And thank you all for joining us.