Welcome to State of the Net 2022. We're really glad to have you today for a great panel on Kids, Cashless Apps, & COPPA: Who's Protecting Kids' Fintech Privacy? This is an issue that's as pressing as ever. As everyone in the room knows, screen time on apps like these has really soared during the pandemic. With added screen time, we see additional interactions with different platforms and, inevitably, more business transactions that kids are engaging in. I'm looking forward to a discussion today about the laws that are on the books to protect kids as they engage in these transactions and with FinTech, and what more policymakers and private actors could be doing to protect children.
I'm joined by a terrific panel of experts today who have all been working on this issue for a long time. We have Jamie Susskind, who is a tech policy adviser for Senator Marsha Blackburn, the ranking member of the Senate Commerce Committee's consumer protection subcommittee. She previously worked at the Consumer Technology Association and at the FCC. We have Jessica Rich, who is of counsel at the law firm Kelley Drye, and who previously served at the Federal Trade Commission in various roles, including Director of the Bureau of Consumer Protection. We have Sid Terry, who is the Chief of Staff to Congresswoman Jan Schakowsky, who chairs the House Energy and Commerce Committee's consumer protection panel; he previously served as her legislative director. Last, but not least, we have Rick Lane, CEO of Iggy Ventures. He previously served as Senior Vice President of Government Affairs at 21st Century Fox.
I wanted to kick off the discussion today by talking a little bit about what makes FinTech and these issues around financial transactions distinct. Perhaps we can just go down the line: I'd like to hear what you all think are the distinct concerns, issues, and challenges posed by FinTech when it comes to children's privacy. So, Jamie, would you like to start off?
Sure. First of all, thanks to State of the Net for having me today. Happy to join all these really great panelists. For anyone who's been following my boss since she moved over from the House to the Senate, she's talked about how consumer privacy is probably the biggest focus for her in the tech space. Also, for the past several years, she's been talking about the importance of kids' privacy and kids' safety online. As we started thinking about this, we had occasion to think a bit about the financial space, and to learn more about how kids are starting to use financial apps, things like Venmo, and getting debit cards, which in and of itself doesn't seem like a problem. I have a daughter; she's going to be five soon. Maybe when she hits about 12 is the time I would start thinking, okay, I could teach her how to use a debit card and financial apps, so she could learn about being financially responsible, things like that.
But, at the same time, we started to think about how there seems to be a gap there, where the existing privacy laws don't really cover or anticipate this issue. Gramm-Leach-Bliley seems to assume that you would not be a minor. Back then, you probably wouldn't have thought a minor would be using something like a debit card or a credit card, let alone financial apps.
COPPA obviously exists. There's a consent regime, the bare-bones notice-and-consent type stuff, but it might not anticipate the wealth of financial information, and other things, like location data, that kids could be providing to these apps on a real-time basis. So I think this is a really interesting area. In general, kids' privacy is a really interesting area for us, so I'm happy to chat about it more.
It's great to be here, talking about kids' privacy, data, apps, et cetera. I think there are a bunch of concerns raised by these tools. First, there is comprehensive data collection: it's not just your personal data, it's your financial transactions, it's your location data. Several laws apply, COPPA applies, GLB applies, but not comprehensively and not adequately. One key gap is teens and tweens.
And, by the way, GLB and COPPA, the main laws that apply, were both passed around 2000. I did the first rulemaking under COPPA, and my 24-year-old son had just been born. Kids, and this is true in other areas too, are far more sophisticated about apps, and the controls on them, than their parents are. So these models that assume parents are going to control for their kids are very outdated. And the final thing I'll say is that, to the extent these laws are not adequate, part of it is that the way we have been approaching privacy, with notice and choice, is not adequate. Once people click through those annoying screens and pop-ups and say, yeah, sure, because they're not really focusing on what's going on, that's not their focus at the moment, they just want to use an app, then anything goes. Our notice-and-choice laws, that's basically what they do. Even when it's a very comprehensive notice and choice, people are inclined to just click through, and it really doesn't provide the protections that are needed.
Thanks, Jessica. And thanks, everyone, for being here. It's very humbling to be on this distinguished panel. I'll build off a lot of what Jessica just said, and actually what Jamie said too, about the idea of getting a debit card for your kids. In many ways, having a teenager with a credit card or a debit card was a high-class problem in the past, something wealthier parents would be doing. But I think the pandemic, as well as the apps, have made that a much more universal issue and have brought to the fore the need to think critically and step into this space. I'll make a slightly broader comment beyond that. We Democrats in the House Energy and Commerce Committee, particularly the consumer protection subcommittee, pretty much universally feel that the idea that we more or less treat children 14 and up as adults is a fairly major flaw, especially given the, for lack of a better term, great power that the Internet has. The Internet at this point, as Jessica noted, is both our past, present, and future, right? Web one, web two, web three. When I was 14, 15 years old, if I wanted to do something, either good or bad, I had to go down to the computer in the basement of my parents' house. I had dial-up. I had to search for something and let it load; usually I'd go do something else, it would take so long to load the page, and then come back to it. Now, the good and the bad is all at their fingertips, on their phones. So we really need to be thinking about that. The other thing I really think about in terms of this loophole is the way in which it can allow for targeting of kids and tweens.
As Jessica noted, we had a hearing back in September 2020 at which Tim Kendall, who was the first director of monetization at Facebook and later the head of Pinterest, spoke. In my boss's testimony, she cited a Vice article from shortly before that, in which Facebook engineers said the groups algorithm feature was the, quote unquote, scariest feature of the platform, its darkest manifestation: a user enters one group and Facebook essentially pigeonholes them into a lifestyle they can never get out of. Now, that sounds hyperbolic, and it perhaps is, and who knows why that employee was talking to Vice in the first place. But it's important to remember they were talking about adults in that case, not kids. If we all think back to what we were like when we were 14 or 15, and think about this in terms of cigarettes, which is what Tim Kendall did at that very hearing, we recognize that those years are habit-forming and can shape who you are for the rest of your life. Allowing companies who may not always have the best intentions, and I know the ones that are here surely do, to target our children like that is a huge oversight and something that needs rectifying quickly. I'll leave it there.
And I just have to note that you look way too young to have used dial-up.
I feel really old, because I remember having my son use dial-up and showing him how to use the Internet. That's really kind of scary in and of itself. Now he works up on Capitol Hill. When I left Fox, I wanted to do something good, so I decided to volunteer my time to advise online child safety groups on tech policy; I felt 30 years of experience could help them. They don't have the resources to compete with the professional lobbyists in this town, who are really good and really smart, but may have a different view on some of these tech policy issues. I worked on the FOSTA-SESTA legislation, on Section 230 reform and the EARN IT Act, as well as the dark WHOIS GDPR problem that's putting our national security at risk. I was approached about a year and a half ago by Rego, which is the parent company of the Mazoola app, and they wanted to talk to me about this app they were rolling out that helped protect kids' privacy. We hear from a lot of companies like that. But once they explained what they were trying to do -- and here I'm supposed to be a child privacy advocate and expert -- I realized I had never thought of the simple fact that when I got my kids credit cards, I didn't opt out of anything, that they had been tracked since the age of 14, all the way through adulthood, with every credit card purchase they'd made, with their Social Security numbers, their names, their dates of birth, everything, to create a dossier. And then this light bulb went on: imagine once those financial transactions get combined with your social networking history. You have a dossier on kids before they hit their teens that is second to none and nothing we have ever seen or foresaw when we were looking at COPPA and GLBA. So when Rego was showing me their tech, I thought, oh my gosh, a technological solution that solves a public policy problem.
One of the people I reached out to was Jessica: what have you thought about this? We're kind of in the same world. When I was doing MySpace work, Jessica, I would have conversations all the time about how we protect kids on MySpace, never thinking, oh, what about their financial transactions? So, working with members in the House and the Senate and others, one of the things I'm really excited about with this event is to bring this issue to the forefront, because it's not an issue we have really spoken about since the '98 passage of COPPA and the '99 passage of GLBA.
I realized a couple of minutes ago that I never introduced myself. I'm Cristiano Lima, technology reporter with The Washington Post; I write our Technology 202 newsletter. This is my first live event in a couple of years, folks. So I wanted to turn to you, Jessica. You alluded to this, and you mentioned that you think there are gaps in the existing laws. I'm wondering if you could talk a little bit more about what you see as the existing gaps in COPPA, and then also in Gramm-Leach-Bliley and the other standards we have around this?
Well, I think it's important to understand first what does exist, because I think there's a lot of confusion around that, so maybe I could just start with that. The main laws that apply in this space, and I'm talking U.S. and I'm talking commercial, are COPPA, which requires parental consent for the collection, use, and disclosure of information for kids under 13. Then there's GLB, which allows people to opt out of third-party sharing, but not sharing with affiliates, which is a big loophole; that applies to all people, including kids, so there's an overlap there. There's also the FTC Act, which prohibits unfair and deceptive practices and provides the FTC with a lot of flexibility to protect kids even outside of those sectoral laws. And then there are state laws; one in particular, California's, adds a consent requirement for third-party sharing for 13-to-16-year-olds, and a lot of other state laws in the pipeline may address kids. And then of course there are proposals like Jamie's boss's recent proposal for minors. But it's very important to know what COPPA does. It's certainly not up to the task today, but it does have a lot of protections that parents can use if they're paying attention, and it applies to this space. Not only can the FTC enforce it, but all the banking regulators can enforce it; the FTC lacks jurisdiction over banks, but COPPA allows the banking regulators to also enforce it. And it applies to these apps, not only the apps themselves, but third parties that collect information through these apps. It has notice requirements, on the website and directly to parents, alerting parents that data collection may be going on. It's got parental consent requirements that have some teeth, but also some flaws, because, for example, using a credit card is considered a form of parental consent.
So if a parent doesn't want a kid to just use their credit card to say "my parent consented," then they're going to need to be paying attention. And, I don't want to go on too long, but also, parents can get access to data that's collected about their kids. And then, as I said, GLB also provides some tools. So the reason this is inadequate is that it's fundamentally based on the notice-and-choice model, which, if you're not paying attention, you just run right through, and then it's a free-for-all. And it doesn't extend to teens or tweens. So we do need new laws. And I guess that's my summary of it for now.
If I could just jump in, and here I'm channeling my boss 100 percent: my big takeaway from your really helpful run-through there, Jessica, was that the burden is on parents in this case, right? Well-educated, often well-to-do parents are going to be able to protect their children, but many others are not going to have that opportunity, nor even know that they're missing that opportunity. And that's where I think the paradigm shift has to take place in our laws.
To the extent that parents have access to the Internet and sophistication about the tools, they can do this regardless of income. But many lower-income parents both don't have the time, because they're working multiple jobs, and may not have access to these tools.
Yeah, I think all those points are right. So my boss, along with Senator Blumenthal, who chairs the consumer protection subcommittee, has held five hearings over the last year, bringing in a lot of the big players in this space. They brought in YouTube, they brought in TikTok, they brought in Instagram, they brought in Facebook, which I guess is now Meta. They brought in Snapchat. So they've had everybody up there; they had the Facebook whistleblower. And basically, what we've seen is that a lot of these companies are either willfully or perhaps negligently disregarding COPPA. I mean, we saw a lot of documents showing that Facebook was actively recruiting kids under 13 to the platform, using siblings to do it. And so, you know, my boss is a Republican; we're not looking to create a nanny state. But it's very tough for parents to be acting in good faith when the companies are not. Me, as a parent, I work in this space, I feel like I'm knowledgeable in this space, but even I would probably miss things that an everyday parent living in the Midwest would never even think to look for, particularly when these companies are actively trying to flout the rules. So that's why my boss and Senator Blumenthal recently introduced a bill. I would say we should not necessarily get rid of COPPA, but at the same time, I acknowledge, as Jessica says, that when it comes to kids and, what do we call it, tween safety, there definitely is a major gap here, and there's more that needs to happen. There are parental safety features that can be turned on by default. There are safety audits that can take place. There's age verification that they need, frankly, to do a better job of.
I recognize that some of these companies will say, hey, we're doing the best we can, and we can't help it if kids lie. But if nothing else, you're already trying to infer users' ages from the data you collect; you should be setting defaults accordingly. I mean, we saw, for example, Instagram said that kids' and teens' profiles should automatically be defaulted to private, and then we saw that that actually wasn't happening. It's things like that that we need to be thinking about, in addition to COPPA. Certainly the Hill is talking about whether COPPA works and whether COPPA should be updated, but there's more than that that needs to be happening. And like I said, we're not looking to create a nanny state, but we also recognize that parents are, in a way, working at odds with companies that are able to pull one over on them, frankly, and that parents are stressed. It's been a pandemic; parents are working; kids are at home. They're just trying to figure out what to do. So we think there's just more that needs to be done to protect kids' safety online.
So you mentioned that you think there are companies that are flouting the standards on the books. Putting a pin in potential changes to the laws that exist, how much of the issue here is enforcement? We've heard from kid-safety advocates for years that they wish COPPA and these standards would be more rigorously enforced. How much of that do you think is part of the issue?
Yeah. Obviously, Jessica spent many years at the FTC, but I do think there's more enforcement that needs to happen. Now, I don't happen to agree that the answer is getting rid of the safe harbors in COPPA; she may have a different view. But I do think it would be nice to see more enforcement. At the same time, when we have Facebook coming up there and saying, well, we just removed 800,000 underage accounts: how do you get 800,000 underage accounts on your platform in the first place? So we need to be really thinking about the protections that are in place on the front end, so that we don't have to rely on the FTC in every case to be enforcing and chasing down 800,000 accounts.
Gearing up over there.
One of the things about COPPA that I was always a strong supporter of, early on, is the self-regulatory regime: having these secondary entities that are certified by the Federal Trade Commission to help on the enforcement side. I do think that could be stepped up, and there could be greater scrutiny of the self-regulatory regimes that are out there. We're very proud at Mazoola and Rego that we are COPPA certified. We took that very seriously. We built privacy in from the ground up, working with PRIVO and others to make sure we had the regimes in place. And that's very helpful for parents. I always looked at that as sort of a Good Housekeeping seal of approval, so that you wouldn't have to worry about the 800,000 companies that are out there, crossing your fingers and hoping they're doing the right thing; there was a mechanism that could help guide parents, but also hold these companies accountable. So I hope that stays in place. The other thing I just want to say is that digital wallets and debit cards are a good thing for the FinTech world. As we heard from Congressman McCaul, there can be pluses and minuses, just like with digital currencies and Bitcoin and all the other issues out there; we just need to make sure we have the right rules in place. One of the positive things, as we head into Financial Literacy Month, is that there are a lot of great products where FinTech can be helpful in teaching kids how to manage money, do chores, donate to charities, and a few other functions that Mazoola and other digital financial apps offer. But at the same time, you don't want all that information collected on these kids, about what they're doing and their finances. So providing tools to help kids in this cashless society and teach financial literacy is critically important. The question here is: what are the rules of the road in this FinTech world to protect kids' privacy and their financial privacy?
I just want to note, especially for the press here, that we at the Energy and Commerce Committee, led by Congresswoman Castor, who is kind of our lead on kids' issues along with Ms. Trahan, Ms. Rice, and Ms. Dingell, conducted oversight of all these third-party companies. I believe we have all of our responses back from them; if any of them are here or watching and you haven't sent yours back, this is me guilting you. So we're kind of evaluating the space right now, and if there are other people here who have insights and we have not reached out to you, we'd welcome those at this time.
Can I make a point about something that is particularly challenging in this area, which is knowing when you are dealing with a kid? COPPA may be too porous in that it covers kid-directed sites, or sites that have actual knowledge they're dealing with a kid. Your bill deals with it differently: its protections apply when the site is reasonably likely to be used by a minor, or when the entity that operates the site or the app reasonably believes they're dealing with a minor. Those are difficult standards too. So for all you researchers out there, I think getting this right is going to require that knowledge standard, the standard for when you're responsible because you know you're dealing with a kid, to be adjusted properly so we get the proper coverage. It's a very challenging area.
Yeah, so I'd love to dig into some of the proposals you all referenced. Obviously, both the House Energy and Commerce Committee and the Senate Commerce Committee have been very active on this. Sid, I know you all are holding a hearing on tech accountability tomorrow. I'm wondering if you could talk about how you see this particular piece, around kids' privacy and FinTech, fitting into your agenda there.
Well, we're still reviewing the Blumenthal-Blackburn proposal. That being said, when you have Senator Blumenthal and Senator Blackburn coming together, it gets a lot of attention, so we're reviewing it really closely. Again, I mentioned Ms. Castor and the other members she's working with; they have a proposal out there that we Democrats like a lot. I think our minority counterparts aren't as enthralled with it, but I could be wrong. Either way, I think we see this as being a key part of a comprehensive privacy bill. It was announced in November that we made an offer to the minority that the base text for that would be, at this point, the Castor bill, from our perspective, but obviously we expect that to be negotiated, and we'll see where it goes. Our message to everybody here would be that we view the Blumenthal-Blackburn bill as a live issue, and our hope would be that we could use it as a springboard to a comprehensive privacy bill. But again, I don't think we're going to hold it up or wait for that either.
Yeah, that's interesting. There have been questions for a long time about whether lawmakers should move ahead on the idea of a comprehensive privacy bill, or on a more distinct proposal focused on updating COPPA. It seems, Sid, that you think maybe it should be rolled into that comprehensive proposal. But Jamie, I'm wondering what your thoughts are on that?
Sure. Yeah. I'm not at liberty to share the discussions my boss is having with other members, but she has said publicly: let's get them all done. I don't think she views this bill as a substitute for comprehensive consumer privacy. She thinks that's needed; she would tell you so, she's on TV all the time talking about it. It is, I think, one of the most important things she thinks Congress needs to get done as soon as possible. Obviously, this is also a really top priority, but I don't think that changes her calculus about the importance of consumer privacy. So if I were channeling her, she'd say, well, let's get it all done.
So I want to dig into a couple of specific aspects of the legislation you all mentioned: the age cutoffs in COPPA. There are a number of different standards, across the CCPA in California, the GDPR, and the UK's Age Appropriate Design Code, that all would potentially boost, or have boosted, protections for not just children but also tweens and, to an extent, teens. I'm wondering if you could speak to which protections you think should be expanded, if at all, beyond the current federal U.S. standards, and which you think should be kept in place. And that's for anyone.
So my personal view is that we should start at age 17 and under. As a parent who has gone through it, you know, I thought when I was 17 I knew everything; that's just typical of being 17. But they're going to make mistakes, and you don't want them making life-altering mistakes, as I like to say. So I think it should be 17 and younger, but I understand that compromises need to be made, and it looks like there's a consensus around 16. If we can get really secure protections at that age, fine; again, compromise is the name of the game in this town if you want to get legislation through, especially in the privacy area. But I'm also concerned about something else. I love Venmo; I use it all the time; I think it's great. I don't understand why people allow others to see their transactions; that just baffles the heck out of me. I look at my phone and my Venmo feed, and people are announcing what they gave their kids money for, and to their friends. I don't get that, but they want to do it; that is their right. When they're 17 and under, though, it is a little more troubling, because that's open data. That's data that can be scraped. That's data that can be used by others who are not even part of the platform, as we've seen with Facebook and other social networking sites that have been scraped. That information can be used against somebody who's younger, who thinks they know everything: who cares, because I'm 17, I'm going to be 17 forever. So I think those are some of the areas we need to look at.
Yeah, I'll sound a bit like a broken record. I think our official position at this point is the Castor bill, which I believe takes it up to 17. And I think we'd also want to take a hard look at the issues being raised here as well. But again, like I said, broken record: Blackburn-Blumenthal has definitely gotten a lot of attention.
So a chief element of any bill needs to be that it's enforceable. And I do see in the UK code, and even in the bill that you guys have, the Blumenthal-Blackburn bill, some elements that would be very difficult to enforce. I think there needs to be meat on the bones. For example, a general duty of care will be very hard for any agency to enforce, so there needs to be a little more fleshing out of what that means. I do think there need to be some special controls for tweens, or whatever we're calling them, but more work needs to be done on what happens if both the tweens and the parents try to implement those controls at the same time; otherwise there's this terrible battle, and the tweens are going to win, at least in my house. And I do strongly believe, though I don't have a view on 16 versus 17, that the 13-to-16 group should have some special protections.
So we've alluded to the UK's Age Appropriate Design Code. There was recently legislation introduced in California that basically seeks to export some of those protections and create a framework that mirrors it in the U.S. context. More broadly, I'm wondering if you think that's an approach, and again, I'll open this up to anyone, that other states should be looking to emulate? Or what, if anything, should we be taking from other standards on this, like those featured in the GDPR or CCPA?
No, states shouldn't do it. Jessica and I had a conversation right before this panel about whether our bill had intended to preempt states. I think that's probably the clear intent; in our view, we didn't need to put the language in, though perhaps we do need to put the language in. It's not there right now, not today. But yeah, I think it would get very messy, particularly given the way these companies are structured. And maybe that's just me betraying my Republican preemption instincts, right? But we'd prefer that states not do that.
Let me broaden this out a little bit: are there aspects of those proposals that you think should also feature in a federal bill?
So let me just reiterate that I do believe the UK code would be difficult to take straight to enforcement, because it's written like a best-practices document, and as a former longtime enforcer, I would have a hard time using it to go to court. But you could combine an approach like the UK code with safe harbors, where groups could put meat on the bones and interpret that kind of code specifically for particular industries. Then people who enter those safe harbors, who were willing to be part of those programs, would make pledges to comply with those more specific standards, which would then be enforceable.
We've talked a lot about the regulatory side, so I want to turn a little bit to what private actors, companies, could be doing to boost children's privacy around FinTech and financial transactions. I'd love to go down the line and hear, if you can think of one, an example of a company in the private sector that you think is taking positive steps to address some of these issues, and that you think other companies should potentially be emulating. Or anyone who wants to can jump in.
Well, I know that when my kids were tweens, we used a payment card that had limited features. It was, I think, a credit union... no, it was USAA, not that I'm trying to advertise for anyone. But it didn't track them in a comprehensive way. There was only so much they could do with the cards, but they were able to pay for things and could deal with an emergency. So that seems like a very basic way to approach it. Not that I'm sure we would want to legislate that, but there could be options for parents so that they don't sign kids up for a data collection machine.
You know, it's actually hard for me to name a company that's doing a good job. I know Rick has flagged a company he works with that I just heard about fairly recently. But I'll offer an anecdote: one of our staffers was an engineer at a big tech company, a social media company, who did GDPR implementation. And they told me, and our whole office, basically, that the company made a blanket announcement that it did not have any users under 13, and therefore did not have to worry about COPPA. That's no actual knowledge, correct? And I think that's a lot of what Jamie was talking about earlier. So, given the size and power of the company that staffer came from, that suggests to me that this is a common industry practice. I think we really ought to be pressing companies on this issue instead of praising them at this point.
So I'm a little biased here, obviously, working with Mazoola and knowing what they have done in this space; this is a commercial for them now. But the ability to build privacy from the ground up and ensure the protection of kids is what you should be looking for. I put a lot of effort into protecting my own personal name brand, to ensure that when I am working with an entity or group, people know that I truly believe in what those companies are doing. And working closely with Mazoola, I know that their hearts and their minds are in the right place. This company was actually started in 2008; they began looking at this issue way before the pandemic, and the technology has been in development all those years. It's like the old saying: it took me ten years to become an overnight success. I think Mazoola and REGO are in that space right now. When they came to me, and I looked at what they did, it's all the things we always talk about from a policy standpoint of what companies need to do. REGO, with their Mazoola Pay button, which is like an Apple Pay or Google Pay button on sites, is doing what companies should be doing, and they're trying to become the standard that other companies try to reach.
So I won't name a company, in part because it probably wouldn't be appropriate for me to. But number one, the example you cited was on the show Silicon Valley, for anyone who's watched it, so it's real, and it tracks. I would say, as a matter of principles, we should be thinking about things like data minimization for kids, limiting data transfers for kids. I will throw TikTok under the bus, because my boss would let me throw TikTok under the bus. If you read their privacy policy, it's scary. They're collecting a lot of things that you all, as adults, would probably not want collected on you, and query what they do with it. And our kids are on it, pretty much addicted to it, kids in the US. That's a lot of information that I'm not comfortable having collected on kids and teens.
So we're actually coming up on time, and I do want to give the audience a chance to ask questions; we'll get over there in just a minute. I want to ask just one more question of the panel, and after this question is done, if you'd like to ask our panelists something, raise your hand and we'll run a mic out to you. Just the last question from me: humor me and say that you are made omnipotent ruler of all things privacy for a day in the US. What would be the single most significant change you think could be made, whether it's something private sector companies could be doing, something regulators should be doing, or a change to existing standards? What would be the most impactful?
Oh, my Lord, 22 years I've been involved in monitoring congressional actions on privacy. In 2000, I was the manager responsible for a report to Congress calling on Congress to pass comprehensive privacy legislation. Twenty-two years of hearings and bills, et cetera. We need that so badly. The mess that we have today is in large part due to Congress's, not you personally, failure to act all these years. So we need a comprehensive approach. I agree.
I'll go really quickly. I think that a ban on cross-app tracking and an opt-in consent model would be the most impactful things.
I mean, being a public policy person and having worked all these years on the privacy issue in general, let's face it, there are two major hurdles that Congress has to overcome: federal preemption and the private right of action. Those are the hurdles, and unless we come to some type of agreement... which we did get in the CAN-SPAM Act, and that shows you how old I am, because I worked on the CAN-SPAM Act, but it was a model that I thought could work. The other thing, if I were able to implement legislation, is a bill making it illegal, except in certain circumstances, to re-identify anonymized data. I think that's a big loophole, because you can take, quote, anonymous data and re-identify it; we saw that with research that was done where there was a big data leak. So having restrictions on re-identifying anonymized data would also be key. So: strong federal preemption, so we can have one national standard; dealing with the private right of action in a way that makes sense while giving states the ability to enforce; and dealing with the re-identification of anonymized data.
Okay, so if anyone has questions, just raise your hand and we'll run a mic out to you.
So we've talked a lot about teens' and tweens' privacy, and oftentimes when we hear this debate discuss children's privacy, we're actually talking about much older children, or teenagers. How do you balance the trade-offs between the fact that we have children and teenagers in this range who are trying to seek information that they may not want their parents to know about, with also ensuring that there is parental choice and that parents are well educated about the tools that are available
to them? So you all may have other feelings, but in our bill, we tried to balance that. We raised the age to 16, which companies told us would be easier in terms of compliance with existing regimes like GDPR. But at the same time, the parental controls are there by default but can be turned off. I think it's important that parents and kids be able to work together, to say, hey, I'm mature enough. When I was a kid, frankly, my parents would never have deigned to put controls on me. But to be able to say, okay, I don't need my profile set to private, I don't need this particular control in place, or the parent can just do that, knowing their kid is mature enough. And I would also say that our intent is certainly not to apply to sites like, for example, Wikipedia. If that's how you all read it, I'm happy to clarify that that's not the case. It's not meant to apply to general-usage, research-type sites where kids should be able to go look things up, sites outside the child-intended category.
Great question, and that's always a balance as a parent, right? When do you track your kids when they're driving, or the first time they're riding a bike around the neighborhood? It's scary to let go. The first time they go to school, do you follow them to school? That's the age-old question of being a parent, that balancing. And having tools in an app that are set to on by default helps with that, because what I hear from the online child safety groups is that parents don't know how to turn the tools on; they're very complicated, and parents are unsure what to do. We always hear this refrain that my kids know more than I do about the technology, and they've been saying that for 20 years; now the people saying it are the kids who are now adults with kids of their own. So there's that balancing. There are tools out there, like SafeToNet and others who have looked at that, which I'd highly recommend you look at. I have no financial interest in SafeToNet, but they're a UK entity, and they try to balance privacy against notifying parents when there's, say, cyberbullying. Bark is another one out there with some of these tools. So there are tools out there. But I am working on the EARN IT Act, and one of the things that was very disappointing when Apple flip-flopped on their announcement, what bothered me, is that they were essentially saying parents don't have a right to know when their teenagers have nude pictures and CSAM going across their phones. I think that's an easy one: yes, as a parent, if it's CSAM, which is illegal, I think I should know. I was disappointed that Apple decided to flip on that under some pressure. But on other issues, research, Wikipedia, finding information about your own sexuality and things like that, I think, again, that's where parents need to trust their kids and give them some latitude.
So you've hit on a critical issue, which also feeds into what I was saying about, you know, the balancing of controls. It's one of the reasons that we, the Congress, with the FTC's input, originally did the under-13 cutoff, although sometimes I joke that it's just Bar Mitzvah age and somebody chose it for that. I think the balance between parents and their children isn't really solvable in legislation. The idea is that you give parents the tools online that they have in other contexts. But you can't solve for parents' relationships with their children, or how parents supervise minors; some of that is just going to have to be worked out at the individual family level. We can't solve for that perfectly in legislation.
Yeah, I'd just also say that I think that tension is something a lot of industry has exploited for a long time in order to prevent being regulated. From my perspective, on the one hand we have a bill, the INFORM Consumers Act, that passed the House as part of the COMPETES Act, which would require verification of third-party sellers on marketplaces, as well as disclosure once sellers hit a certain level. We think that's a perfectly appropriate and correct thing to do to make a safer Internet. At the same time, we can be supportive of anonymizing children's data, raising the age, and all of that. Those are not in conflict, despite a constant refrain from industry that they are.
So we've come up on time. This has been a great panel. Just to put a couple of fine points on some of what we've heard: there was certainly a lot of urgency from the panel around the need for comprehensive privacy legislation, including greater protections for children. And we heard about some of the trade-offs around the notice-and-choice model, as there always are around a parental control model. So thank you all for joining us, and thanks so much to our panelists for all their terrific thoughts today.