Assembly Standing Committee on Privacy and Consumer Protection
- Rebecca Bauer-Kahan
Legislator
My mic's on. Good afternoon. We're gonna call this hearing of the privacy consumer protection committee into order. This is an informational hearing that we are hosting today on online safety controls, and I wanna start by thanking all of our panelists for attending and participating in our hearing today. And, of course, I wanna thank the privacy staff, which is, as I believe, one of the best in the building, the rules committee, sergeant's office, and other support staff for helping to organize this hearing.
- Rebecca Bauer-Kahan
Legislator
This hearing was really born out of, my experience serving on this committee for now seven years. We hear and will begin with the lived experience of parents navigating social media with their children who say that the systems are broken for their kids and they aren't working. And then we kept hearing from industry, the answer is parental controls. And so I decided that we needed to put these two perspectives together and have a real moment to talk about the parental controls, how they work, what they are, what they're doing. Do people know they exist?
- Rebecca Bauer-Kahan
Legislator
Are people using them? How and why? And can they fail children? And how do we navigate a path forward online that is safer for California's children? And so I wanna really acknowledge all of the speakers who are here today to have that conversation with us. As I said, we're gonna start with the Hinkses, who will really share what it means to be parents navigating this world. We're gonna hear from folks representing children.
- Rebecca Bauer-Kahan
Legislator
It's a voice that often isn't loud enough in this building, but we have two incredible organizations, Common Sense Media and Children Now, who will be here to speak on behalf of California's children. I also owe a huge debt of gratitude to the four companies, California companies, I will say: Google, Meta, OpenAI, and Roblox, who all agreed to be here. Often, we have to go a second round to get people to participate, but every single one of the companies that we invited in the first round agreed to come and have a conversation with us about parental controls. So I really wanna express my gratitude.
- Rebecca Bauer-Kahan
Legislator
I think this is an important conversation. Thank you for being here to have it with us. And then lastly, we'll have a panel to discuss potential solutions with some experts that research this space every day. And I think this conversation will hopefully help us, navigate a path forward in the social media space that is informed by what is happening online, the realities of these, products, and the experts' research. Because I know that all of us hopefully are committed to a safer online future for California's children.
- Rebecca Bauer-Kahan
Legislator
I will say that I'm entering this conversation personally with the fundamental question in my mind: if we know that, in some cases, these online spaces are designed to be addictive and to keep our children engaged, can any amount of time be safe for them? And so that's where I come at this, but I also think that it is the reality that our kids are growing up with, and so we need to figure out, you know, what is the way for California to create the safest spaces for our children. So with that, I wanna turn it over to my colleagues if they have any opening remarks. Assemblymember Lowenthal, or Wicks?
- Josh Lowenthal
Legislator
Yeah. I'll be very brief. First of all, I just wanna thank our chair. You know, this committee has led with moral clarity in a way nowhere else in the United States has, actually, including our federal government, and I am grateful as a father. Thank you, madam chair, for today and every day that we do this work.
- Josh Lowenthal
Legislator
And I also wanna thank everybody that has come here. I believe we are all a community together. All of us ultimately want the same things: a healthy consumer, a robust business, you know, a future where, as we all know, this generation is surpassing the generation before it, and so forth. And so I look forward to having individual relationships with each and every one of you, and I know that everybody on this committee feels this way. And it's just a joy that you showed up today. So thank you so much.
- Buffy Wicks
Legislator
Thank you, madam chair, for pulling together this hearing and for your leadership in this space. And I also wanna thank my colleague here from Long Beach, who's been a tremendous leader as well. I've been working in this space since day one, when I got to the legislature. And, oh my, has technology changed in those almost eight years. And, you know, I've done a number of bills, many of which have resulted in being challenged in the courts, and I continue every single year to figure out how we keep our children safe.
- Buffy Wicks
Legislator
You know, the one thing I'm inspired by, honestly, is the fact that you have lawmakers who are first and foremost parents before they're Democrats or Republicans. And we have a bipartisan group of parent lawmakers who are just trying to figure out how to keep our kids safe. That is the goal. And we welcome industry in that conversation and in being a part of the solution to that problem. We love our tech companies.
- Buffy Wicks
Legislator
They're a big part of our economic engine in California, and they need to make sure that our children are safe. And I think we can have all of those things, and I obviously appreciate the expertise and the diverse points of view of the advocates, the children's advocates, who are part of this conversation as well. I also know that often what we do in California leads not only the nation but the globe. And we have regular conversations with our counterparts in the European Union and in the UK and other places as well. We're looking at what other countries are doing, modeling work from them, and learning from some of their lessons.
- Buffy Wicks
Legislator
But I think we all stand here committed to our number one job, and I've always said this: the most important thing we need to do is keep our community safe, and from my perspective, most specifically our children. That is my goal, my mission, in an incredibly complex, technologically evolving space. So we also wanna create legislation that can be implemented, that is implementable, right, that is doable. And so that's where I always welcome conversation with opposition. I genuinely love conversation with opposition, because you learn more about what you're trying to do in that context. But I also think you get better policy when you are really in the weeds trying to figure out, again, how to adhere to these guardrails, but in a way that can be implemented. So with that, excited to be here, and thanks for your leadership.
- Rebecca Bauer-Kahan
Legislator
Thank you, Assemblymember. With that, we will start our first panel. As I mentioned, opening remarks will be from Victoria and Paul Hinks, who are advocates for social media safety. So if you guys wanna come up. And as you get comfortable, I just wanna express our gratitude for you being here. I think you provide a really critical humanizing voice to any conversation around social media. So thank you for being here.
- Victoria Hinks
Person
Thank you. Thank you so much for having us. Good afternoon. My name is Victoria Hinks, and I'm a survivor parent who lost our daughter, Alexandra Hinks. Everyone knew her as Owl, forever 16. We lost her to suicide 587 days ago. And she was a beautiful girl, inside and out. She was kind. She was a cross country runner. She wanted to be a preschool teacher one day and have a family.
- Victoria Hinks
Person
This is a loss that has so profoundly changed our family's life, and it's left me living with severe PTSD. And since her death, I've dedicated my life to speaking out about the ways that social media can impact vulnerable young people and families, and I share the story so that other families will not have to endure the horrible tragedy that happened to our family. So, hopefully, it can help bring awareness and accountability and stronger protections for others so no other family has to go through what we went through. We didn't solve car deaths with parental controls. We fixed the product itself by implementing mandatory seat belt laws.
- Victoria Hinks
Person
So Owl would be graduating from Redwood High School in Marin County in June. And while her friends are all eagerly, you know, awaiting their college acceptance letters, we've been eagerly awaiting her headstone finally being put up. And I brought pictures of that for you all today. So these so-called parental controls never worked. She found a way around them, and we never really stood a chance.
- Victoria Hinks
Person
And this is why the work that you all are doing is so important to us, because this could be anyone's child. It doesn't discriminate, Republican, Democrat. You know, she had a bright future ahead of her. So the grief that we live with is the most painful thing ever, and it could be anyone's child. We thought this is something that could never happen to us. And thank you so much for having us here.
- Paul Hinks
Person
Afternoon. My name is Paul Hinks. I'm Victoria's husband, Alexandra's father. I have been a software engineer in Silicon Valley in San Francisco for over thirty years. We as a family consider ourselves to be tech savvy, and our children grew up in a house full of gadgets, video games, smart TVs, speakers.
- Paul Hinks
Person
Our house is pretty much controlled by an app. We never let our children have a TV in their rooms. We always discouraged prolonged tech use, and we didn't allow devices at the dinner table or out in public. And we held off getting them phones until much later than their peers. Her older sister had already been through this successfully.
- Paul Hinks
Person
We weren't starting from scratch. When we finally gave in to the inevitable and bought 13 year old Alexandra an iPhone, we thought we did everything right. We researched the dangers. We made her sign a contract acknowledging that the phone was our property, that we controlled it, and that we could take it away from her at any time. No phones at night.
- Paul Hinks
Person
She happily agreed to show us what was happening on it. And to keep track of her usage, we set up screen time limits, age-appropriate content restrictions, and a firm 9 PM curfew, after which the phone could only be used to play music or call her family. We felt prepared. This was an Apple device. They make great devices that just work. So what could go wrong?
- Paul Hinks
Person
We weren't naive parents stumbling into this blindly. We thought we genuinely understood the dangers our daughter was being exposed to. We'd attended meetings at school where online bullying was discussed, and the contagion of self-harm and eating disorders among teenagers. We took it seriously, but the threat felt manageable, local even. Her school friends talking among themselves, the kind of thing that could be sorted out with a phone call to another parent.
- Paul Hinks
Person
After all, we had all the parental controls on. We had devices on our network that were supposed to filter out dangerous websites. No random stranger from across the world would affect our child. What we didn't realize was that the dangers were coming from inside some of the apps that Apple told us were trusted. Initially, we did not allow social media at all.
- Paul Hinks
Person
Slowly, we added apps as our daughter grew older and wanted them to keep in touch with friends. Each app had its own parental controls, and we set them up to keep her as safe as possible. But, again, surely, the app manufacturers had their customers' best interests at heart. Surely, they would not allow dangerous content to reach the screen of a teenager. What had worked at 13 did not work at 15.
- Paul Hinks
Person
Our daughter began obsessing over her phone. She seemed very fragile and upset all the time. We didn't know the cause. There were probably many. She was transitioning from middle school to a high school that none of her friends were attending.
- Paul Hinks
Person
Her older sister, who she was very close to, had left for college. She was desperate to make friends, and some of the people she chose were not great people. She felt isolated and turned more and more to social media for companionship. We were aware of this, but it wasn't a major concern at the time. Surely, social media's major benefit was to keep her in touch with friends from her old school, and we had all the controls and limitations turned on.
- Paul Hinks
Person
Surely, nothing bad could be going on. She was happy to show us the apps when we asked, but she had ways of hiding things she did not want us to see. When we finally accepted that something was seriously wrong, we had lots of fights. We began to suspect that the phone was causing her problems. We restricted her use of social media to one hour a day, not realizing that these restrictions were broken.
- Paul Hinks
Person
She could simply tap to ask for more time and stay on the phone as long as she wanted. We took the phone away from her for days, weeks at a time. That helped. She would apologize and ask for the phone back so she could keep in touch with friends. Her therapist told us that taking the phone away was actually harmful and isolating, and letting her use it even to listen to music would help.
- Paul Hinks
Person
So we agreed to this. Who wants to totally isolate their teenager from their friends? These devices can be made safe. Consider what happens when a company issues a device to an employee. There is an IT department. There are policies. There are people whose job it is to ensure that that device is configured correctly, that dangerous content cannot reach it, and that someone is accountable if it does. The company has legal obligations. The device manufacturer has contractual obligations. The chain of responsibility is clear.
- Paul Hinks
Person
But when a parent buys a device for their child, there is no IT department. There are settings buried in menus that most people cannot find and that are ambiguous as to their effect, connected to restrictions that can be bypassed with a tap. There are app manufacturers shielded from liability by law and a platform company that takes no responsibility for what is displayed on its screens. The chain of responsibility leads nowhere. Nobody is accountable.
- Paul Hinks
Person
The companies don't care. They will happily feed a 15 year old girl content about self-harm if that will keep her engaged and scrolling for longer. And the people paying the price are children. Our daughter was presented with content that painted suicide as a rational and reasonable way to deal with her problems. Eventually, she was able to use social media to find the best way to kill herself. Thank you.
- Rebecca Bauer-Kahan
Legislator
Thank you both so much for being here. I know this cannot be easy, but your advocacy absolutely makes us better.
- Rebecca Bauer-Kahan
Legislator
We'll now move to the first panel, which is an overview of the types of parental control challenges and reasons for failure. We have Sunny Liu, director of the Stanford Social Media Lab; Lishaun Francis, policy analyst and advocate for Children Now; and Anneke, what? Anneke. Anneke. How do you pronounce her last name? Buffone? Did I get that right, Anneke? Yeah.
- Rebecca Bauer-Kahan
Legislator
Who is a PhD and founder and CEO of CLARA, Clear AI Risk Assurance. And they will be opening our first panel, and then we'll take questions after they finish.
- Sunny Liu
Person
Thanks so much for sharing your stories and having the courage to be here. As a mom myself, I started to research online harms because of tragedies like this. Like so many parents, we simply just want to protect our children. Madam chair and committee members, today I will share our research at the Stanford Social Media Lab on the challenges parents face in digital parenting. The views I present here are my own and should not be interpreted to represent the views of the university.
- Sunny Liu
Person
So I'll start my presentation. We asked about 500 parents and kids across the United States. I know the next panelist will talk about children's perspectives, so I want to briefly highlight the key findings here. We asked kids ages 10 to 18 what they wish their parents knew about their social media use and online world.
- Sunny Liu
Person
The answers were mostly: trust them more, give them clearer guidelines, and have clear expectations. We also asked parents what concerns them most about their children's online experiences. The top concerns are excessive use, harms, risks, privacy, and the impact on mental and social well-being. So if we look at all those different perspectives, we can see both alignment and misalignment. Both children and parents are aligned on the goals.
- Sunny Liu
Person
They want a safe and healthy online world. The misalignment centers on the approach: how to set up boundaries, and what is the way to manage them? So what are parents doing now? What are the ways they try to prevent harms and protect their children online?
- Sunny Liu
Person
Mostly, they use parental controls. So what are parental controls? Parental controls are the tools and features parents use to manage their kids' digital access: screen time limits, content filters, app limits. Examples include Apple's Family Sharing, Android's Family Link, and third-party apps like Bark, Qustodio, and Net Nanny.
- Sunny Liu
Person
So those tools definitely exist, but we're still here today talking about how to protect our children and reduce harms. So clearly, those tools are not sufficient to prevent the harms we want to prevent. Today, I want to share our lab's research on the core challenges parents face in really protecting their children online. The first one is that digital parenting is challenging.
- Sunny Liu
Person
The second is that tech is complicated. Third, there are constraints on parental controls. And last, there are accessibility and equity gaps. Digital parenting is challenging. A Pew report suggests that two thirds of parents today think parenting is harder than it was twenty years ago because of technologies like social media and smartphones.
- Sunny Liu
Person
Digital parenting is just one part of parenting. Parenting is challenging. Here's what one mom shared with us: there is a pressure to be everything everywhere all at once for your children, the sense of constantly needing to do more, to be around more, and to be more of this and more of that, within an environment that doesn't really support you.
- Sunny Liu
Person
And being online makes it even more challenging. Parents have to constantly understand and navigate a complicated online safety world. Here's what one parent shared with us: it's a struggle to make sure my child doesn't see inappropriate content, images, or pornography, to know who he interacts with, and to make sure he isn't bullied. In our research at the lab, we identified 22 types of harms young people can encounter online.
- Sunny Liu
Person
From cyberbullying to sextortion to harmful content, online hate, and algorithmic risks. So parents have to constantly navigate those evolving technologies and evolving harms. And third, there is a knowledge gap. Kids know those technologies better than their parents.
- Sunny Liu
Person
Parents always feel that they are one step or even 10 steps behind their kids on what's happening online. So those three points make digital parenting really challenging. And tech is complicated. There are so many different platforms, features, interfaces, and products.
- Sunny Liu
Person
Parents have to constantly navigate all those different settings. And as soon as they figure them out, there are updates, and they have to relearn everything again. And settings at the device level don't work at the app level, and app-level settings don't work at the device level. I have a 16 year old daughter who loves to use Instagram. I deleted it on her phone, and now she uses it on her laptop, which might be even riskier, or interfere with her studies and her life even more.
- Sunny Liu
Person
And the third point is that there are constraints on parental controls. Kids circumvent: they find all the different ways to bypass the parental controls. Some parents cut the Wi-Fi at midnight; then the kids go to a neighbor's house for connections. And protections and controls can backfire as well. Being overly controlling or too restrictive can sometimes erode cohesion and trust between parents and kids and increase conflict in families.
- Sunny Liu
Person
Screen time is the number one conflict in families now. And lastly, I want to highlight accessibility and equity gaps. Not all parents have the time and energy to constantly moderate. We have single-parent families and families where the parents have multiple jobs. Other caregivers, like grandparents and older siblings, don't have the time and energy to constantly monitor.
- Sunny Liu
Person
Tools do exist, but not every family can afford those tools. Third-party apps, from Qustodio to Bark, cost from $10 to $40 per month. So our research shows that current parental controls don't work for four main reasons: digital parenting is challenging, tech is complicated, there are constraints on parental controls, and there are accessibility and equity gaps. As we can see, it's really a complicated issue, and the stakes are so high.
- Sunny Liu
Person
For this reason, I'm so glad that the committee takes this seriously and brought such a wide range of stakeholders here. I hope that my research will help frame the discussion. I look forward to hearing from the other panelists and witnesses, and I look forward to your questions in the discussion.
- Lishaun Francis
Person
Thank you so much, Madam Chair, members. My name is Lishaun Francis, and I'm with Children Now. We are a statewide research, policy, and advocacy org focused on the whole child. Our organization also leads The Children's Movement, a California network of more than 6,000 direct service, parent, youth, civil rights, faith based, and community groups dedicated to improving children's well-being. Our goal overall is to sound the alarm about how kids are doing in our state.
- Lishaun Francis
Person
In regards to mental health, addiction, and online spaces: not well. And the data makes it clear that digital spaces are both a reflection and a driver of that crisis. I know that today the header of this hearing is social media, but I'm going to talk broadly about digital spaces. I grew up in a time of AOL online chat rooms, and, obviously, that has changed. And so I'm gonna say digital spaces more broadly, because the iteration of things is constantly changing.
- Lishaun Francis
Person
That's just the nature of tech. In 2021, Children Now wrote a letter to the governor asking him to declare a state of emergency for California's youth due to the mental health crisis. That declaration was never made. And the urgency around the mental health crisis for kids actually remains today. The connection between mental health, addiction, and digital spaces has never been clearer.
- Lishaun Francis
Person
According to our 2025 youth poll, about 94 percent of young people in California report experiencing regular mental health challenges, with one third describing their mental health as fair or poor. Nearly all of those reporting poor mental health, 98 percent, were youth of color. More than one in three LGBTQ youth in California seriously considered suicide in the last year. For transgender and non binary youth, that number climbs to nearly four in ten. Indigenous youth in California bear the highest rate of suicide deaths among any youth group by a wide margin.
- Lishaun Francis
Person
On overdoses, fentanyl has transformed the crisis entirely. Adolescent drug fatalities remain more than twice pre-pandemic levels: 708 deaths nationally in 2023 compared to 282 in 2019. The National Crime Prevention Council estimates that eight in ten fentanyl overdose deaths are connected to social media contact, with dealers actively using these platforms to reach young people. Psychiatrists warn that generative AI affirms, enables, and fails to challenge delusional beliefs.
- Lishaun Francis
Person
The digital crisis connection to these mental health and addiction outcomes is no longer speculative. According to our youth poll, nearly a third of California young people say social media has been harmful to their mental health. About one in three report being cyberbullied, and roughly seven in ten say social media contributed to a negative body image. So what has the industry offered as a solution? Parental controls.
- Lishaun Francis
Person
And one of the things that I really do wanna flag when we talk about parental controls, and one of the reasons why I spent the majority of my introduction on the state of the mental health and addiction of young people, is that we're not actually answering a tech problem. We're answering a child safety problem. And once we understand that, I think the solutions will be clear. We need to be a little clear-eyed about what parental controls actually are and where they come from. The design and definition of parental controls have so far been dictated by tech companies themselves.
- Lishaun Francis
Person
That means the industry has controlled the narrative around what safety looks like. And too often, it looks good on paper while doing very little in practice. When companies use parental control features as a public relations shield, it allows them to sidestep the deeper systemic problems: harmful design, exploitative engagement algorithms, and inadequate privacy protections. A 2025 report titled "Teen Accounts, Broken Promises" tested 47 of Instagram's teen safety and parental control features and found only eight worked as intended. Most were ineffective, unavailable, or easy to bypass.
- Lishaun Francis
Person
Fairplay found that parental controls do not accurately reflect what a teen is actually experiencing online. Parents are not notified by default when their child reports a post or account, and children can easily open a finsta account with no indication appearing in parental supervision tools. In 2025, pediatric experts warned that YouTube Kids still allows low quality and borderline harmful content to slip through even when parental controls are enabled, because creators can self-label videos as for kids and game the system with friendly thumbnails and keywords. These aren't isolated glitches. They reveal a pattern.
- Lishaun Francis
Person
Parental controls are designed to look like protection without actually providing it. Young people see through that. When we talk to youth about technology and online safety, parental controls are rarely what they bring up. In fact, when I bring it up, they actually chuckle. And it's not because they don't care about safety.
- Lishaun Francis
Person
It's because they know these tools don't work. Many of their parents aren't fully equipped to manage or understand how these systems work. Setting them up requires technological skill, time, and patience that parents simply don't have. And even when parents do engage with these tools, young people say the controls are set in such a way that they can easily navigate around them. So I ask them what would be effective, because I know they care about their safety online.
- Lishaun Francis
Person
They say instead of focusing on parental controls, they want online literacy, digital responsibility, and corporate accountability. They understand that the online environment they inhabit is not shaped by personal choices alone. It's engineered by the design decisions tech companies make about platforms, algorithms, and engagement tools. In their view, teaching young people to critically evaluate content and understand data practices is more empowering than any parental dashboard. Young people also want their parents to be educated, not just on how to use parental controls, but on how to have open, informed conversations about tech.
- Lishaun Francis
Person
They want collaboration, not surveillance. When parents understand digital culture, social media norms, gaming communities, content creation spaces, they connect with their kids on a human level rather than a policing one. Importantly, the approach to digital safety needs to evolve as children grow. One of the things we see very often is that we write legislation where the tech rules apply to a three year old in the same way that it would apply to a 17 year old. That is not sufficient.
- Lishaun Francis
Person
Perhaps what's most telling is this. When I spend time with youth advocates and ask why they keep using platforms they clearly dislike, their answers reveal just how much the stakes have changed. They tell me they feel compelled to participate not for entertainment, but because school announcements live on social media, political activism happens on social media, and job opportunities are shared online. For today's young people, these platforms are not a fun pastime like my AOL chat rooms. They are infrastructure.
- Lishaun Francis
Person
Opting out isn't really a choice. That is precisely why the burden of safety cannot rest on families alone. The real question before us is not how to build better parental controls. It's how to shift the conversation entirely away from tech companies defining what digital safety means and towards families, young people, and policymakers outlining what is expected from corporations that provide products to our kids. This should be no different than the safety protocols for vehicles, car seats, toys, cribs, and the like.
- Lishaun Francis
Person
We need policymakers to come together with urgency to examine which rules and regulations need to change, address the structural crises in our digital spaces, and put meaningful guardrails on corporations because our children do not feel that they have the ability to leave these digital spaces that are offering them different ways to engage in life. The resources and reforms we pursue must reflect the full scope of this, both offline and on. Thank you for your time.
- Rebecca Bauer-Kahan
Legislator
Thank you for that insight. And now we will turn to Dr. Anneke Buffone.
- Anneke Buffone
Person
I am a PhD-trained social psychologist and positive psychologist, and I've done most of my research on well-being and empathy, until I transitioned to tech itself, where I spent seven years. And unlike a lot of other researchers, I was on growth and safety teams, so I understand sort of the full stack pretty well. And then the last two years, I worked on age assurance, particularly on the youth well-being team in this space. And so, I'm here.
- Anneke Buffone
Person
I've now founded a company, a nonprofit that is very, very new. But the goal is to do research-based advocacy, to build the right products in this space, to have the right conversations, and to suggest the technical solutions that can actually work. And so I wanna start by maybe breaking the ice a little bit here. You know, last night here was the Nine Inch Nails concert. I don't know if anyone here was there. But when my parents were parenting me, their biggest worry was that I might like Marilyn Manson and that I was going to go to the Love Parade.
- Anneke Buffone
Person
And so they said no to both. And today, parenting is so much more difficult. Right? Because we don't actually know what the kids are seeing on these different apps, because a lot of it is hidden from our view. And so I get the question a lot: is it the algorithmic change that we need?
- Anneke Buffone
Person
Do we need to raise the minimum age? Do we need to change parental controls? And I think my answer is that this isn't the right question, because we need a lot of different layers, and we need a lot of layers because every family is different. And so we're not gonna convince every family to be as strict as possible. We're not gonna convince any family to be as loose as possible.
- Anneke Buffone
Person
And the rules that we make have to work for the conservative Christian parent that wants to shield their children from certain ideologies and for the parent with the LGBTQ teenager that wants to protect their child from hate speech, and so on and so forth. And so we need the whole stack to address the problem. If, you know, raising hands were appropriate, I would ask which of us in the room have changed their child's age up on the device because things otherwise weren't working and things were broken, like me. A lot of us have. A lot of us have noticed that when we put on parental controls, things we wanna be using don't work anymore.
- Anneke Buffone
Person
Be it, you know, that they can't listen to an Audible book for bedtime. Be it that they can't get sent Apple Cash so that we can have them try independence and go to the store with their friends after school. And so it really needs to be a redesign of the whole system. So I think, again, we need everything. Age assurance is the number one barrier.
- Anneke Buffone
Person
Right? Because if we have kids on with false ages because things otherwise break, then we can't protect the child at the end of that. Right? Because the child will be assumed by the app to be an adult. So privacy-preserving age assurance is really, really important.
- Anneke Buffone
Person
I get the question, can't we just verify everyone with an ID? My personal opinion is that that may not be the right approach, because a lot of adults don't wanna do it, so it just leads to circumvention by adults. But it can also lead to circumvention by parents, because, like, 80% of parents say that they're really concerned about the privacy of their children, about data breaches, and so on. So now, the good news is we have a lot of technology that can go beyond IDs and beyond these approaches, with unobtrusive ways to get to age assurance. And California passed this amazing device-level age assurance law, which is really great because it opens up a lot of privacy-preserving methods for children.
- Anneke Buffone
Person
Then there's device-level controls. Those are really great, but they're very high level. You can set things like screen time at the very high level, but you can't actually touch what happens in the app. So it's kinda like Vegas: what happens in the app stays in the app. You can't really see it. You can't really influence it. It's like global regulation, right? Global regulation usually can't touch individual countries' legislation.
- Anneke Buffone
Person
So you gotta understand how that works. And then on the app level, that's where you can set different controls. But as Sunny was saying, if every single app has its own interface and different symbols, and some things exist on one app but not on another, then it just gets very thin on what you can actually control. And then there's the third-party tools, which are great gap fillers, but they have the same problem. They can't really see very deeply inside the different apps, and they cost money, which is an equity issue, which both of my previous speakers have spoken about.
- Anneke Buffone
Person
So again, I think privacy-preserving age assurance is really, really important, and, you know, having multiple different ways. Parents often help kids: the younger the child, the more likely it is that when they're on something, the parent has helped them. And kids, of course, can also get around it. And then what we see now is kids moving to less safe apps.
- Anneke Buffone
Person
So there's all these apps coming out, some of which you probably haven't heard of before. I hadn't really either before I started doing the research. There is Yubo and Lemon8 and Locket Widget and CoverStar, and some of them have atrocious things happening on them. A lot of them are actually trying to do the right thing. They're trying to be safe for children.
- Anneke Buffone
Person
But if you're starting out, you're not gonna have a huge safety team. Right? So you're limited. So when kids leave TikTok and Instagram, they might go to these other places, and so we just have to make sure that we make them safe everywhere. Right now, it's about a hundred to a hundred and twenty hours of setup required of a parent, between initial setup, setting up every single app, and doing all the monitoring. These controls are often hard to find, multiple clicks. Like, often if you
- Anneke Buffone
Person
A hundred to a hundred and twenty hours a year. Yeah. And this is research-based. Like, this is based on expert opinion of people that have tried. It just takes a long time.
- Anneke Buffone
Person
You have to find all the different settings. You have to set it up. Like, it's just a lot of work. And then at the end, you have a lot of awesome dashboards that have a lot of data and no information. And the settings break, so a lot of parents give up.
- Anneke Buffone
Person
Parent-child linking: a lot of times, it's just a link that gets sent to an email or some QR code being scanned, so that's very easy to get around. Sometimes there's adult verification, but we have hardly any cases of actual parent verification or guardian verification. A lot of times, parents need their own account on the app to supervise, which is also, I think, unacceptable. There is silent graduation a lot of times.
- Anneke Buffone
Person
There was a famous example where kids got an email: hey, you can soon unlink yourself from supervision. So, obviously, we don't wanna do that. Kids usually can remove supervision unilaterally. If the parent is lucky, that app has decided that parents should get a notification, which is also not always the case.
- Anneke Buffone
Person
And then there's false positives. Every few days, I get a your-daughter-got-a-nude-picture alert, and it's never a nude picture. It's just twelve-year-olds taking really bad pictures and, you know, sending pictures of their warts and things like that. So, again, parents can only control and see the tip of the iceberg. It's things like screen controls, and, you know, some kinds of content can be blocked.
- Anneke Buffone
Person
But what does the algorithm optimize for? What kind of profiling is there? What kind of advertising? What autocompletes for search? Posting privacy. On one of my daughter's apps, when you do a dance challenge, suddenly your post becomes public. So my engineering husband had to flag that, and so she no longer has that app. But it's sort of, you find out over time.
- Anneke Buffone
Person
That was CoverStar. So AI chatbots are the next unregulated frontier. Kids use these apps, but no one is really empowered to watch. Really, OpenAI is the only one that has had any meaningful, in my opinion, controls and checks here. And, really, especially for AI, it is very disappointing to me personally, as someone who has worked in tech, because we have seen what happened with social media.
- Anneke Buffone
Person
And so the fact that a lot of age restriction is just a checkbox, and that there aren't parental controls, is a really big concern, especially with how powerful these apps really are. And parents really are in the dark. When you look at research, parents don't know how these apps really work, what to worry about, how to keep the kids safe. Teachers say the same things. And, of course, we've already seen some pretty bad harms happen to children.
- Anneke Buffone
Person
The data and tools, and this is one of my last points here, absolutely exist within the companies. Companies have the data. Companies have the technical capabilities. And now with language models, it really is in reach. It used to be harder, admittedly, to classify content and to provide some of these controls, but today it is absolutely possible, and it is being used in other ways.
- Anneke Buffone
Person
For companies, of course, safety investments can create a lot of competitive disadvantage. Like, age-verifying everyone loses a lot of adults, and youth are important to the business. And this isn't some earth-shattering fact. Right? Every company, be it Nike or social media, wants the next generation to be customers too.
- Anneke Buffone
Person
And so I actually do see myself also as an advocate for the safety researchers that are working in companies today, because I have been there, and a lot of us want these things to happen. This is my personal opinion, obviously, but a lot of us share the values and want to do the right thing but aren't always empowered to do that work. And especially now, a lot of safety researchers have been laid off, so I think there's even less of a feeling of being protected enough to speak out and really advocate for change internally, which is one of the reasons I'm doing this work outside right now. The chair, Rebecca Bauer-Kahan, started by asking, is it safe at all?
- Anneke Buffone
Person
And I think there is sort of this idea of Pleasure Island in Pinocchio, where the kids go and get handed cigarettes and whatever. I do think if you have apps that are optimized for engagement, optimized for content that is meant for adults, and then you tack on safety at the end, it may not be good enough. And so one thing that will need to happen is really thinking through how these apps should work, and, also, is there a responsibility to make the safe version of the app just as fun and entertaining as the adult version, because otherwise that also will drive circumvention. So self-regulation is not working. This is my last point.
- Anneke Buffone
Person
We need independent standards. We really need to know the base rates of kids with false ages on the apps. Are the control features and the age regulation features the companies are putting in actually reducing that rate? What is the harm base rate? What are the interventions?
- Anneke Buffone
Person
Is it going down? Every intervention and safety feature that isn't meeting that bar really isn't good enough, and we need the standards for that. So, my four takeaways: kids' and youth safety needs a lot of different layers. We need multiple approaches. We need minimal standards.
- Anneke Buffone
Person
AI chatbots really need more regulation than there is right now, and we need independent standards so that we have a real baseline for cause and effect and can make sure that kids no longer get harmed. So, yeah, that is my pitch. Thank you.
- Josh Lowenthal
Legislator
Okay. Before I say anything to the panel here, I just wanna acknowledge the Hinks family. It is so important to have your voice in this conversation, and I don't know how challenging it is for you to come up and relive this all the time. But I can tell you that your presence here is meaningful to all of us and helpful for this conversation, because we are able to make it real.
- Josh Lowenthal
Legislator
So thank you for being here. I'm struggling. And let me tell you why I'm struggling: because I don't understand what safe is all about. Does safe mean that we're stopping harm, crisis-type harm, from taking place, interactions that can be deadly, suicidal ideation, you know, things that are absolutely catastrophic harm?
- Josh Lowenthal
Legislator
What about intellectual harm, academic harm? The empirical data that we're hearing right now, which is about our kids no longer surpassing this generation. I alluded to that earlier, which is a grave concern, I think, to all of us. So I wanna ask an open question about that, and I'd like to hear how you answer it. That'd be great.
- Josh Lowenthal
Legislator
And I also wanna ask you about China and your feelings about what's happening in China. China, to me, is the only country that I know of that was ahead of this from a regulatory standpoint. I don't think of them as a beacon of civil and human rights whatsoever, and clearly they don't have a constitution with the bill of rights that we have here in the United States. And yet I wonder, do we have any empirical information about mental health disorders among youth in China right now as a result of those things?
- Josh Lowenthal
Legislator
And I know that their efforts have been quite draconian. But to me, when it comes back to this issue of harm, you know, they're very focused on STEM and STEAM. They're very focused on making sure kids raise the bar on their goals and their dreams and their hopes. They're focused on teaching kids healthy lifestyles and healthy choices and so forth. And, you know, to me, that's very attractive. And so I just wanted to ask you for your comments and thoughts on these things.
- Sunny Liu
Person
Yeah. Thanks so much for those two questions. I think those are really the same question: fundamentally, how we can support children to have healthy development. I think our research related to technology is limited, but we know a lot about what makes kids thrive.
- Sunny Liu
Person
Their fundamental needs, physical, psychological, mental, emotional, all those parts; psychology teaches us that. So I'll answer those questions in three ways. First, if we think about harms, reducing harms is one part of supporting kids' development. Kids cannot thrive when they are bullied, when they see online hate, when they are exposed to harmful content and content-risk harms.
- Sunny Liu
Person
So that's one aspect. But the absence of harm does not equal benefit. Not only do we not want harms, but we also want kids to develop their identities in a healthy way, to know who they are, to fulfill their potential, maybe intellectual, maybe social, maybe emotional. So avoiding harm is one aspect, but we also have to make our environments, both online and offline, support kids' development and their fundamental needs.
- Sunny Liu
Person
I think that's all part of the picture. And then, as for China, for a little bit of background: a few years ago, China had this regulation specific to video games. Kids could only play video games, I don't remember the details, maybe half an hour on Fridays or an hour on Sundays. It's less than two hours per week for all kids.
- Sunny Liu
Person
And they actually implemented that across all the platforms. The platforms had to hold all those kids and their families responsible: you cannot have those kids playing video games. There was one piece of high-quality research that came out, I think, a few years ago. I'm happy to share that article.
- Sunny Liu
Person
It shows that kids' time playing video games actually did not decrease. So the policy as implemented did not decrease kids' time online. But I think we do need more research to understand: do kids under those regulations develop better, have more time to play with friends, more intellectual development? I'm happy to do more research and figure that out.
- Rebecca Bauer-Kahan
Legislator
Thank you. And I assume miss Francis wants to join us. Yeah.
- Lishaun Francis
Person
So I'll say first, I don't know much about China, so I can't answer that question. What I will say about how we've operated in the US, unfortunately, is that corporations and businesses seem to believe they have more rights than individual children and families, and they will sue to prove it. And that tells me everything I need to know about how we are engaging here with corporations and who is really trying to set the bar and the parameters for safety. What I'll also say in the mental health context, in terms of what is even healthy and safe and thriving, is that we know healthy face-to-face interactions are the best. That is the gold standard, not online interactions.
- Lishaun Francis
Person
So the gold standard is face-to-face, in-person interaction: the ability to read microexpressions, the ability to hug someone, the ability to put your hand over someone's and show comfort. That's the gold standard. We have begun talking as if the gold standard is online interaction when it comes to mental health, whether it's through how we provide therapy or how we find community. It's not the gold standard.
- Lishaun Francis
Person
It's what we've done because we have a workforce shortage, but it's not the gold standard. So I just wanna say that it's actually something that we should be thinking about as secondary, not as primary.
- Anneke Buffone
Person
So I think my answer is that there absolutely are good things that kids can do online. One of my favorite examples was Meta's Portal, which I don't know if anyone besides me remembers, but it had a storytime feature. Kids were able to talk to their grandparents, and the grandparents would turn into, like, the big bad wolf, and it actually did something remarkable: it let kids talk to their grandparents and actually wanna keep talking to them. And the grandparents thought it was weird at first, because, you know, as grandma, you don't look super attractive as the big bad wolf. But it sort of worked.
- Anneke Buffone
Person
Right? And there are lots of equivalents of that in online spaces. So I think it absolutely can have benefits for kids to connect around interests that they may not have a community for themselves, to like-minded kids that, you know, maybe have special needs in the same way that they do. But I think that these benefits can't really be reached in a safe way unless we have the right minimum safety standards and the right controls. So I think it absolutely is possible, but when I review what's out there and what the state of protections is, it's just not where we need it to be.
- Anneke Buffone
Person
And so I think there really is a lot of research that's needed to see how we can make sure that we create the right spaces for kids so that they can claim the benefits. Obviously, I do agree that in-person experiences are the best. And then the second piece, I think, is that, in terms of the oversight model, there's also something really broken about how oversight was ever created, because it's a process where the tech company ends up winning: it pits kids against their parents. Right? The parents are the police officers.
- Anneke Buffone
Person
In a system that's not even working, with false alerts and all these different things, you accuse your kid of something they didn't actually do. And there isn't any education, like, you know, what you were saying, miss Francis. And I think there really needs to be some accountability of companies as well, to be part of educating kids about what the dangers are. You know, like, how can you tell that something is upsetting you?
- Anneke Buffone
Person
What are the controls you can use? How can you report? And I think there needs to be accountability on what happens to all these reports. Like, of all the reports that kids send in about, this is an eating disorder, this is that, how many actually get acted on? We have no idea.
- Anneke Buffone
Person
Like, I've never seen that data on what percentage of reports are just getting dismissed with, you know, it's fine. So I think there's just a lot of accountability that we can ask for, and I think then we can make real progress on answering that question.
- Rebecca Bauer-Kahan
Legislator
Thank you. Yeah. One thing that I think miss Francis said that's really interesting is the First Amendment law that is coming out on this, because the First Amendment does not allow for all speech with no exceptions. Right? When there is a public health risk or another risk, there are limits. Take hate speech, for example.
- Rebecca Bauer-Kahan
Legislator
It's not protected under the First Amendment. And yet the case law on the social media companies appears to be protecting everything they do under the First Amendment with no exception. And I find it fascinating, because that is not, as I understood it as a young law student and practitioner, the way the First Amendment works. And hopefully we will get to a place where we are weighing both sides of that debate evenly in courts.
- Rebecca Bauer-Kahan
Legislator
So I just thought that was a really interesting point. With that, miss McKinnor.
- Tina McKinnor
Legislator
Yes. Thank you guys so much for coming and testifying today. I have one question. Is online harm today more a technology problem, a business model problem, or a regulatory gap?
- Tina McKinnor
Legislator
Yeah. Wow. That was a short answer. Oh, I'm sorry. One last thing. What does success look like? How should we measure whether the platforms are actually safer?
- Anneke Buffone
Person
Reducing harm. So I think, I mean, this is my big point. Right? The only way we can is if we know what the base rates are. So, for example, what is the estimated percentage of children under 13 on these different apps?
- Anneke Buffone
Person
And if we change our age predictions and we improve them, does that rate go down? If we see certain harms emerging, and we have better safety systems and the right controls, do those harms go down? I think without research to really see the data on cause and effect, it will be really difficult. As much as we can, experimental data, but also, you know, seeing if the interventions are actually working, because I think that's one of the big problems. If the mandate is do this thing, and then the implementation of that doesn't actually fix the problem, then really it's just lip service.
- Anneke Buffone
Person
Right? And it is accountability that I think benefits everyone. It benefits the tech companies. It benefits business. It benefits regulators, and it benefits the families that, you know, we're trying to serve as well.
- Rebecca Bauer-Kahan
Legislator
I was just gonna say, no. An interesting point on that is that Assemblymember Gabriel had a bill a long time ago, while I was here, on disclosures of hate crimes on social media platforms. It was challenged by industry and struck down by the courts under the First Amendment, saying that they did not have to disclose these materials, which makes it harder to track all of what she's saying. So I just thought I would point that out.
- Lishaun Francis
Person
Thank you. And do you mind if I add really quickly? Every young person I've talked to has used one of the reporting features to report content or something happening online. Every single one of them said they've never heard back. It just kinda goes off into the ether.
- Buffy Wicks
Legislator
Thank you for the testimony. I don't know a single parent that feels great about their tech situation with their children. Nobody's like, this is awesome. You know, every time it's like a war, it's a fight. It's the parents trying to navigate a very complex, I mean, I can't even navigate my own phone.
- Buffy Wicks
Legislator
You know, I can't keep up with all of it myself, and then to manage your children's as well. So parents are just at their wit's end. In the most generous terms and the most horrifying terms, we heard testimony from Victoria and Paul, and I also wanna recognize their testimony. This is obviously the worst-case scenario as a parent. So thank you for testifying.
- Buffy Wicks
Legislator
And that is like every parent I've ever talked to about this. When I do pick-up and drop-off, this is what parents are talking about, when taking your kids to birthday parties, going to soccer practice. It's all-consuming. And everyone's looking for a solution, and they need help. And, you know, they're eager for government to take action, because it feels like if it's a parent against a tech company, it's just an unfair fight, especially when the kids are often aligned with the tech company, you know, because they want the product more and more and more.
- Buffy Wicks
Legislator
And so that's why a holistic approach, I think, is critical. Miss Francis, I'd love to ask you a question. Is there any benefit to social media access for kids? And if so, at what age does that benefit outweigh the risk? And the answer might be no. But I'm just kind of curious, because I don't know the answer to that question. I'd love to know your thoughts.
- Lishaun Francis
Person
So is this a personal question? You know, there is how I feel, and there is what young people tell me, so I wanna be clear about that. What young people say is that they see benefits, because it's how they are engaging politically. It is how they're finding jobs.
- Lishaun Francis
Person
It's how they're interacting with their school, and there's a social benefit. You know, I remember a time before social media, so I'm not as convinced that we need it. Right? So my personal feeling is it's probably not that great of a product, and we probably shouldn't expose children of any age to it.
- Lishaun Francis
Person
I know there's been a lot of conversation around 16. I think 16 is an arbitrary number. The science and data tell us that your brain doesn't really fully develop until 25, so I'm not really even thrilled about that. I know we would never get something through that bans social media for 25 and under. So I get that desire, but, you know, to me, that train has left the station, unfortunately.
- Lishaun Francis
Person
What I am concerned about is creating an environment where young people feel like they have to sneak and use social media. That's what I'm also trying to avoid when I talk about this: I don't wanna create an environment where they're hiding social media, because that's even more harmful and more problematic. So, no, I don't love it. I'm barely on social media these days. You know, social media didn't come out until I was already an adult, so the impact is completely different.
- Lishaun Francis
Person
But they want to be engaged with the world differently, and I think we should make sure that it's safe for them to do so.
- Buffy Wicks
Legislator
Right. On that note, and I'm happy to entertain your response, but others as well on this: are all social media platforms created equal? Like, are you seeing any of the companies actually put forth meaningful guardrails? And, again, the answer could be no, I don't know, but I'd be curious your thoughts.
- Anneke Buffone
Person
I think my website on this just went live today, so I will share that with you. But I think my answer is that a lot of them have things where they're better than others, but I don't think there is one that is better than all the others. So, for example, Instagram teen accounts, I think, was an important step forward. TikTok has certain minimal safety standards that are quite good. So it really depends on the area, but the problem is that not one of the platforms right now is doing the right things across the board, across all the different controls, and I think that's where the legislation is needed.
- Anneke Buffone
Person
That's where mandated standards, minimum standards, and also parental controls are going to come in as really, really important. And, to your last question very quickly, I think one problem that I see is that, today, kids go from activity to activity. They're so busy all day, all afternoon. They have no free minute. And so I think technology ends up becoming the solution to the fact that you can only talk to your friends for five minutes between soccer practice and tutoring.
- Anneke Buffone
Person
And I think that's in some ways a societal problem, where kids are expected to be in all these different activities and have no unstructured time to play, to just be free and be with each other. And so in some ways these technology companies have picked up on a need for kids to socialize as teens and to be independent. And I think we have to understand the ecosystem that they're operating in. This doesn't mean I think tech or social media is good or bad, but it does mean that if we take certain things away and there isn't space for that to be filled with real-life interactions, that's a problem too. So this is not really answering your question of whether it should be, but as a social psychologist, it's important for me to point out why we have this system, and maybe why it is that kids want to be in these apps as opposed to being in person. So I thought that was important to mention. Thank you.
- Gail Pellerin
Legislator
Yeah. This is hard stuff. It's taken me a while to really digest everything, and I wanna thank Victoria and Paul Hinks for being here. Your story is so powerful, and I know how hard it is to tell it. Thank you for being here and sharing Alexandra with us.
- Gail Pellerin
Legislator
And thank you all for your testimony. You've given us lots to think about. I mean, shouldn't we be designing safety systems from the very start, instead of putting so much of it on the parents to control? And is that happening with any level of speed and urgency?
- Gail Pellerin
Legislator
Lots of make .... I feel like I wanna scream. Okay. And then, I mean, mental health is something that's very concerning to me, and the connections that we're seeing between social media use and youth anxiety, depression, self-harm. Are there platform features that are most harmful to a healthy kid? Or, I mean, have we identified, yes. There's
- Sunny Liu
Person
Yeah. I think that, usually, those harms are not equally distributed. They mostly target extremely vulnerable populations of kids. When I talk about vulnerabilities, mostly these are kids with at-risk factors in their daily lives, who don't have support systems in their offline world.
- Sunny Liu
Person
And the online setting doesn't have those guardrails for them either. More kids have, for example, eating disorders, and the algorithm drives them more to that kind of content. So those algorithms enhance all their offline vulnerabilities and make them even more vulnerable. So I think that's yeah, I'm not sure if that answers what you asked. I
- Anneke Buffone
Person
think, very concretely, that end to end encryption is a big problem. I think that's extremely unsafe. For children in these chats, if a predator talks to them, it's even hard for law enforcement to track those conversations. So there are certainly features. I think about private versus public posting, whether kids expose themselves publicly.
- Anneke Buffone
Person
I think there's group chats. So for bullying and harassment, I'm quite concerned about those. Even in iMessage, you don't even have to go to social media; there are school wide text message threads going on in my own kid's school. So, yeah, there's definitely some features that are particularly concerning.
- Anneke Buffone
Person
I agree also about the way that the algorithm is designed. And on your question on apps, there actually was, famously, the example of Instagram Kids that got shut down. And I think tech companies probably do need more guidance on, when such apps are designed, how they should be designed, so that the attempts to do so really can be successful. And, of course, my personal opinion is that those apps should not be optimized for engagement, because I don't think that is ultimately safe; it will easily lead to these rabbit holes, these unsafe trajectories. But I do agree with you, and I think that is where regulatory support can really come in, on what that should look like.
- Gail Pellerin
Legislator
Yeah. I'm grateful. My kids are 28 and soon to be 31. I can't imagine raising young children in this environment right now. And quite frankly, I feel like we should just ban, you know, smartphones for kids age 16 and under.
- Gail Pellerin
Legislator
And I know you raised a good point, and that was good because I need to hear that because that's just how I feel. I feel like this is an evil device for them, and this is hurting them. And it's causing kids to take their own lives, and I can't stand by and watch that. I just wanna take it all away.
- Lishaun Francis
Person
Oh, I feel the same way. It's just not realistic, but I feel the same way.
- Gail Pellerin
Legislator
I know. Okay. So I guess I'm just I mean, other countries are taking, I think, bolder, more aggressive actions. Are those successful, and should we be thinking about those here? And I know we're all trying to navigate this to the end path where everyone's happy and thriving and no one's having mental health crises.
- Anneke Buffone
Person
I mean, the honest answer is we don't know yet, because we would need a lot more data, and these interventions are also new. And I think there is a good chance that it will reduce the number of kids on these apps, but there are also kids that are moving to these newer, less safe apps that aren't as well regulated and aren't affected by the regulation. And so my biggest concern is I can't really weigh in on is it right or is it not right, but there are definitely concerns about whether it's keeping the child with less supervision safe as well. Right?
- Anneke Buffone
Person
Because if there are parents that are willing to help the kids circumvent it, give the kid a phone and tell the phone the kid is 18, at that moment that child is less safe. And so I think it's very hard to say this is the right versus the wrong way. I think we should have a lot of data behind it when we make those kinds of decisions. But those are the trade offs that I think about: where are kids wandering to, which kids are least protected, and how do we keep them safe in the end? And I think those are all very tough questions that need a lot of data support that we just don't have yet.
- Anneke Buffone
Person
You know, I think it'll be about monitoring those countries and seeing what happens there.
- Gail Pellerin
Legislator
So since I can't ban social media and phones for 16 year olds: what does the research tell us about which safety tools actually work and which ones are largely ineffective?
- Sunny Liu
Person
Yeah. So I'll share a little bit more about the solutions in the third panel, but to briefly answer: I think that safety tools work when, first, there's a report button, but the report has to really connect. Kids have to understand that there's action being taken. So kids have to feel that they're empowered, and have the efficacy and efficiency to use those tools, and that those tools actually work.
- Sunny Liu
Person
So the first part is that the tools actually work. The second is the education part. Kids have to know; sometimes they don't want to report because they don't want to get their friends into trouble, or they have this worry. So they never even report it offline. So I think we do need to educate.
- Sunny Liu
Person
Our kids and families need to know that here are those functions and how they work. So I think education, and making those tools actually work, are the two things that are really important to keep kids safe online as well. What's currently missing, I think, is that in the offline world, we've all figured it out. We have schools. We have people.
- Sunny Liu
Person
We have communities. We have coaches. We build those circles of care, circles of support, and circles of safety in the offline world. But in the online world, we don't have that yet. So that's why, for those kids, we cannot even protect them and hold them up, because we don't have those circles of protection, and they don't have those safety nets.
- Anneke Buffone
Person
Taking the phone away works too. So with my own kids, that's what I do. They have screen time limits, and the rule is they charge the phone in my room at night, and I constantly just take it and put it in my pocket. And I think that is about the extent of the parental controls on the device. I think the only one that I really trust, actually, is the screen time one.
- Rebecca Bauer-Kahan
Legislator
Me too, but then you can just press that button that says ignore. Yes. I know.
- Rebecca Bauer-Kahan
Legislator
Thank you all, and I think some important points were made. I will say that with my own children, my mother-in-law was a second grade teacher her entire career, and when they were little, they would FaceTime with her for hours, and she had puppets (she lives far away), and she read to them, and it was amazing. It was honestly really positive connecting time that was happening through a device. And so I am a huge believer that there actually is a way to do it that is real connection with a real person. And I will say that I love what you said about circles of trust online, Sunny. I come from a very large southern family.
- Rebecca Bauer-Kahan
Legislator
One of our safety mechanisms is that aunties get to follow their nieces and nephews online. And so I watch all my nieces' and nephews' Instagrams, and they know I'm watching, but it's different than their parent. And so I do think it is about building circles of love even in these spaces, with people who care about you and whom you trust. So again, these are not technological solutions, but they're important things to think about as we navigate the future in a way that really centers public health. We keep trying to talk about this in a public health centered way because it's so important to remember that at the bottom of this problem, it's not technology. It's the health and safety and well-being of California's kids.
- Rebecca Bauer-Kahan
Legislator
And I really appreciate all of you being here. Another thing that I wanted to point out was something you said, Miss Francis, about kids reporting content and not getting a response. One of the things we are working on this year, that I hope will be successful as a bipartisan coalition, is a consumer facing regulatory regime that will allow consumers to come to California and say, this isn't working for me. I need your help, regulator. Because we do this for so many other industries, and yet we have not done this for technology. And I think that would be game changing.
- Rebecca Bauer-Kahan
Legislator
And so I hope that in the future when kids do face that, they have the state to turn to. And I really appreciate us continuing to have this conversation in a way that helps our kids, because their lives matter. And so I wanna close by reiterating my immense gratitude for both Victoria and Paul. I know we've had many conversations, and you say that coming here and telling your story is part of what empowers you every day. But I am just so grateful for it, because as a mother, as an auntie, as someone who cares deeply about California's children: at the end of the day, we wanna make sure that no parent experiences what you experienced, and it will take hard work to do that. And you just remind us that that work is worth doing and showing up for every day.
- Rebecca Bauer-Kahan
Legislator
So thank you. And with that, I will turn to the next panel. Thank you guys. So the next panel is, our industry panel. We're going to hear about these online tools, some of which, I know from my own experience, have been updated.
- Rebecca Bauer-Kahan
Legislator
So we may get some updates on what is new and exciting online. First we have Nicole Lopez, who is the Director of Global Litigation Strategy at Meta. You guys can sit wherever you'd like. Right. We have Emily Cashman Kirstein, and I apologize if I'm butchering Kirstein. Thank you. Child Safety Manager at Google. Lauren Haber Jonas, Head of Youth Well-Being and Families at OpenAI. And Eliza Jacobs, Senior Director of Product Policy at Roblox. And I will say that I didn't plan to have an all female panel.
- Rebecca Bauer-Kahan
Legislator
We didn't choose who's here, but I'm not mad at it. So with that, we will turn it over to who was supposed to be first? Nicole Lopez from Meta.
- Nicole Lopez
Person
So I also wanna thank Victoria and Paul. I appreciated it, and it meant a lot that you shared your story today. And thank you, Chair, as well as Assemblymembers. I'm Nicole. I'm here testifying on behalf of Meta, but first and foremost, I, like you, am a parent. I have two tweens who are online quite a bit.
- Nicole Lopez
Person
Screen time is the battle that we fight often in our household. I'm also here as a California resident, born and raised in Oakland where I live, five minutes from my parents today. I joined Meta roughly three and a half years ago where I've continued to work both in the policy as well as legal side of the house on what I care deeply about, which is the safety and well-being of young people. I have done this for the bulk of my career, both in the private and public sectors, including eight and a half, almost nine years as a prosecutor in California where I did two stints in the domestic violence unit. I worked on child endangerment, child abuse, child exploitation cases, and then I worked in the community violence reduction unit where I focused on violence impacting teens and their families.
- Nicole Lopez
Person
I care deeply about protecting young people online, as well as supporting their parents, which we've touched on today: parents are supporting their teens as they navigate these online spaces. I wanna talk first about Meta's approach to teen safety. I think it's really important as a backdrop for how we build these features and experiences for teens. At Meta, our teams work together to build safe, positive, and age appropriate experiences for teens and their families.
- Nicole Lopez
Person
But in order to design products with the right mitigations to support the users who are actually using them (and we've been talking about this today), it's critical, and it's complicated; that has come up today as well. And it's really critical to bring the right voices into the room. And there are a lot of voices that matter: teens, regulators, policymakers like yourselves, internal experts at Meta, as well as external experts who are gonna have different focus areas and who are gonna come with, you know, a blank slate, because they're not actually working at Meta.
- Nicole Lopez
Person
They have their own experiences to bring to bear. But importantly and relevant to the question that you posed at the beginning, we need to listen to parents. No kid is the same. No teen is the same. I say this from personal experience, having two very different boys who are 10 and 12, and parents know their teens best.
- Nicole Lopez
Person
In terms of the approach that we take to building, it is not a one and done static experience. As technology changes (and Assemblymember Wicks talked about this), it's evolving really quickly. It is complex. We have moved into a different era, not just from the AOL chats, but even from four years ago. It's constantly shifting.
- Nicole Lopez
Person
And so we need to continue to listen, to build, and to improve. It's not static. And as I said, we have to get parents' feedback. And it's not just about parental controls. I wanna make sure this is not a dichotomy.
- Nicole Lopez
Person
Parental controls are important, very important. But so are the baseline experiences that need to be protective of all teens who are using the apps. And in terms of how we get parents' feedback, we do it in a number of ways. One way that I've been deeply involved in includes listening to parents live in person. Meta has hosted Screen Smart events in California.
- Nicole Lopez
Person
I hosted one in San Francisco. We've had them in LA. We've had them in San Diego where we provide hands on workshops for parents so that they actually understand how the tools and experiences work. We want parents to feel confident about raising their teens in an increasingly digital age, and we also wanna make sure that they have boundaries and protections that are gonna work for each family. Because, again, it's not just that every teen is different.
- Nicole Lopez
Person
Every family is different in what they want. So I wanna take a step back and share some of the work that we've done to address parents' concerns, some of which actually predates my joining Meta. Before I joined, we started building out a number of parent supervision tools. And I'm not gonna spend a lot of time on every tool that we've built because there are a lot. I just wanna highlight some that I think give you an understanding of how things have shifted over time.
- Nicole Lopez
Person
We've given parents the ability to view how much time their teens spend on Instagram, set time limits, get notified when a teen reports an account or content, view what accounts their teens follow and the accounts that are following their teens, and see who their teen has been speaking to in the last seven days, again, hoping that parents feel empowered to have conversations with their teens. These conversations, as I said, are ongoing, and they're continuing to shape and improve how we design experiences for teens. And so more recently, in the last two years (again, this is a trajectory that continues to develop), parents said they wanted to feel more confident around their teen's social media use without having to worry about their top three concerns, which, again, shift over time.
- Nicole Lopez
Person
It's what content their teen is seeing, who their teen is talking to, and how their teen is spending their time. And that's why we launched teen accounts, which were talked about earlier today, in September 2024 for Instagram, Facebook, and Messenger. And I think this is really important: all teens are defaulted into protective settings that address those three concerns. Who talks to their teens?
- Nicole Lopez
Person
We limit messaging. We limit the content that teens see, and we make sure that time is well spent by putting teens into sleep mode at night. And, again, any teen under 16 cannot wiggle out of these defaults, these strict settings, without a parent allowing them to do so. We also heard from parents more recently that they have different views on what's appropriate for their teens. Think about this as a parent.
- Nicole Lopez
Person
You know? We had parents look at content on Instagram, and they looked at millions of pieces of content; there were thousands of parents who looked at it. And they all had different views in their feedback on what was age appropriate. We took that feedback and distilled it into how we draw the lines across content that teens can see.
- Nicole Lopez
Person
And that expanded, again iterating on and improving the teen accounts experience. We revamped our content policies, inspired by 13 plus movie criteria and, more specifically, parent feedback. That means now that teens under 18 are automatically placed into these 13 plus experiences, and they'll see content similar to what they'd see in an age appropriate movie. They also can't see 18 plus content anywhere, whether it's recommended, posted by a friend, or something they're searching for. We also listened to parents, and they told us they may not want their teens to see 13 plus content because, again, not every teen is the same.
- Nicole Lopez
Person
A 13 year old may not be as mature as another 13 year old. So we created an even more restrictive setting that parents can put their teens in. So, again, every family is different. We took in that feedback. We implemented that feedback.
- Nicole Lopez
Person
We've also taken a similar approach to providing age appropriate interactions for teens who use our AIs. Teens can access information and educational opportunities through Meta's AI assistant, again with default age appropriate protections in place. And we're continuing our work to give parents insights into those conversations. We're again using content guidelines inspired by movie ratings for 13 plus, meaning that the AI should not give responses that would feel out of place in an age appropriate movie. The other recent announcement that we made, highlighted earlier today, is that Instagram will start notifying parents in supervision if their teen repeatedly tries to search for terms related to suicide or self harm within a short period of time.
- Nicole Lopez
Person
The vast majority of teens are not looking for this content. But when they do, we already have a policy in place to block those searches and direct them to resources. These new alerts, though, are designed to make sure that parents are aware if their teen is repeatedly trying to search for this content and to give them the resources they need to support their teen. And, again, we worked with experts on this, but we heard directly from parents that they wanted to know, and we incorporated that feedback. I think what's been raised today, and I really wanna revisit this because it's been said so many times in a variety of conversations, is that parents, myself included, are feeling overwhelmed.
- Nicole Lopez
Person
Teens (and I'm sure Australia will come up at some point during the conversation) are fleeing to apps that we've never heard of. Teens use an average of 40 apps per week, according to a University of Michigan study, and parents have no idea what they're doing. And, again, we supported Assemblymember Wicks' bill to require operating system providers and app stores to implement an age assurance signal. And that's important because in order to get teens into age appropriate experiences, you absolutely need to know how old they are, and everybody here at the table will tell you it is complicated and it is hard to know how old somebody is. And so we applaud that bill for passing.
- Nicole Lopez
Person
We supported it. But I think what we're getting at today here is that parents want visibility into what their teens are doing online. They want to be able to decide whether their teen is ready for an app or not, and that's why we've supported OS app store legislation that requires app stores to get a parent's approval before their teen downloads an app. And under this approach, if a teen attempts to download an app, the parent would get a notification on their phone and it's a one stop shop, they approve it, or they don't. And, again, it addresses parents' concerns that they don't know what's going on, and it puts them in the seat.
- Nicole Lopez
Person
It still requires all of the apps to do the work to create age appropriate experiences. That work is not done. It's work that we're still gonna be doing. I wanna close with this. I actually know the people at this table. I think industry wide, not just at Meta, we all care. We're all parents. We care about creating safe experiences. We wanna make sure we support teens, who, we've been told by an expert today, wanna be online.
- Nicole Lopez
Person
My experience of AOL chatrooms (I did get on when I was 16) is not the experience of my kids today. It is here to stay. We need to support them, and we need to do so in a way where we're part of the solution and we're empowering apps to continue doing the work that they're doing, but also making sure that parents are in the loop, that parents have visibility and can support their teens, while continuing to require that we develop protective experiences for teens as a baseline. Thank you.
- Rebecca Bauer-Kahan
Legislator
Thank you. And I'll say, as a kid that was in those AOL chat rooms, there was filth in there too.
- Nicole Lopez
Person
My first experience, which I shared with another person here, not safe.
- Rebecca Bauer-Kahan
Legislator
Yes. No. I would agree with that through lived experience. Okay. And now we will turn to Emily Cashman Kristen? Kirstein.
- Emily Kirstein
Person
So I'm Emily Cashman Kirstein. I lead child safety public policy at Google, and I'd also like to thank mister and missus Hinks for being here, for sharing your story, and for your advocacy.
- Emily Kirstein
Person
I come to this job from industry today, but I've also worked on the NGO side.
- Emily Kirstein
Person
I led public policy work at Thorn, the nonprofit combating child sexual abuse material online, and worked on the government side in the US Senate. I appreciate the opportunity to be with you all today to talk through parental tools, but also how Google frames them in a larger context: how we're thinking about building for kids and families overall. I think you all have slides, and we have them up here. As Assemblymember Wicks said, there have been a lot of updates, and I wanted to put those in front of you all today. So our overarching mission at Google is to organize the world's information and make it universally accessible and useful.
- Emily Kirstein
Person
And when it comes to youth, we wanna be doing that in a way that offers them the benefits and the utility of the online world with the appropriate safeguards in place. And that last part: bolded, underlined, underscored, all of that. Meaning, of course, we want to protect kids in, not from, the digital world. And how we're doing that is based on these three pillars here. The first is protect.
- Emily Kirstein
Person
This refers to everything from baseline protections for all users, including our industry leading efforts to combat child sexual abuse material and exploitation online, to default settings that we have for under-18 users that are backed by age assurance. Respect is, you know, the core of what we're talking about today, which is parental tools and knowing that each family has a different relationship with technology, and respecting that. And third, the empower pillar is how we're building enriching, not just okay, activities: how we're building enriching educational experiences for youth online and building the digital skills of the future, learning to use the latest technologies, again, in that safeguarded environment.
- Emily Kirstein
Person
And so starting with protect: we have default settings for under-18 users even before we get to parental tools, and I'm gonna go through a little bit of these here. So on Search, for example, we have SafeSearch on by default, which helps filter explicit content. Location sharing is off by default. 18 plus apps are blocked on Play. We'll get into YouTube and Gemini in a bit more depth, but I do wanna emphasize here that Google does not serve personalized ads to minors.
- Emily Kirstein
Person
And on YouTube, regardless of parental tools (again, for all under-18 users), we've built protections into our personalized recommendation systems to ensure that teens aren't overly exposed to specific kinds of content that, while not violating our policy guidelines, may be innocuous in a single view but could potentially become problematic if recommended repeatedly. And we worked with independent experts, YouTube's youth and family advisory council, to develop these content categories, and we continue updating them. We also have take a break and bedtime reminders on by default. The take a break reminder is a full screen takeover; the default setting is for an hour, but parents can also adjust that as needed.
- Emily Kirstein
Person
And to properly ensure that those under-18 default settings are getting to the right users, we have rolled out age assurance on our own first party platforms, and we're also working toward compliance, of course, with AB 1043, the approach to responsibly sharing signals across the broader app ecosystem. So how do we do that? First, of course, we're starting with declared age, starting from somewhere, and then we run an inference model.
- Emily Kirstein
Person
So without taking more information from the user, we're looking at things like: has this account been around for twenty years? Probably not a minor. If they're searching for mortgage rates and tax assistance, again, probably not a minor. That goes into how the inference model works. If the model is unsure that this is an adult and that user tries to access, say, a music video on YouTube that has explicit lyrics, something that would otherwise be age gated, they will be prompted to confirm their age, whether that's through an ID or another method.
- Emily Kirstein
Person
We know not everyone wants to offer an ID, so we also offer selfie verification, email lookup, credit card verification, things like that. So, getting into the parental tools themselves: all the protections I was speaking about before are defaults for under-18 users before we even get to parental tools. And the premise is that no one family and no one child is the same. Of course, we've talked about this.
- Emily Kirstein
Person
We've heard about it, and we have to build with that reality. So one of the things, you know, we've had Family Link since 2017. That's our flagship parental tool for Google. But we have, of course, heard as we've heard today, parents are overwhelmed. They want quick and easy setup.
- Emily Kirstein
Person
They want options that fit their families best. So in addition to Family Link, this past year we announced parental device controls right on the device. So at the point of a parent handing over the device, they can set up things like screen time, web filters, and approving and blocking apps, and that's all backed by a PIN that the parent knows, right there on the phone. If the parent would like a more robust experience with parental tools, that's where Family Link comes in. The ones before are just on the device.
- Emily Kirstein
Person
This is an app that the parents can have on their phone that is a more robust experience, managed remotely. Right? So, as it stands now, they can block apps and approve apps right through Family Link. They can block or approve websites, set screen time settings, and set school time, which will make the phone not work during the school day. All of those exist right now through Family Link.
- Emily Kirstein
Person
Yep. And child accounts, I should say, remain in a supervised state after they turn 13 unless the parent approves removing that supervision. This helps make sure that those decisions are made as a family. And, again, in talking through all of the ways that we're incorporating parents' feedback: we heard that it took too long to set up YouTube accounts and things like that in the YouTube app. So we rolled out within the past couple months an easier way for parents to set up YouTube accounts and, just as important, to toggle back and forth between a parent's account and a kid's account.
- Emily Kirstein
Person
It's incredibly important, as we know, for minors to be on their own account to be able to take advantage of the default settings we talked about and of the parental tools that their parent has set up. And another piece to this is we just rolled out a YouTube Shorts timer. This allows the parent to decide how much time may be appropriate for their child to watch YouTube Shorts. And an important piece to note is that the timer can go down to zero, and parents can decide if they don't want Shorts on at all for their child. And the last pillar here is empower; I'll wrap up.
- Emily Kirstein
Person
This is about using technology to help young people learn and create and explore. One of the most important things, top of everyone's mind, of course, is generative AI. We want youth to have access to the benefits and the opportunities that come with it, but, again, as we said before, with those appropriate safeguards in place. And so a bit on those safeguards themselves. Before rolling out the youth experience on Gemini in 2023, we worked with our in house team of researchers, cognitive psychologists, and child development experts, in addition to an independent youth advisory council that we have at Google, to develop policies and protections for youth.
- Emily Kirstein
Person
And recognizing that youth could be more vulnerable to developing an emotional connection with AI, we built persona protections for youth into Gemini from day one, since 2023. So for younger users, Gemini is designed not to say I love you, not to say I need you, and not to make any explicit claims of humanness or that it feels emotions. We've additionally built protections against sexually explicit content, dangerous activities, age restricted substances, violence and gore, medical advice, and unhealthy behaviors. Again, those are all baseline protections in Gemini. Many of these apply to all users, but especially to under-18s.
- Emily Kirstein
Person
And our suicide and self harm protocols refer users to crisis service providers and encourage them to seek real world support and help from someone they trust. So we're committed, again, to empowering both parents and youth to explore Gemini responsibly. We've heard a lot about parents wanting more resources. And I should say, for Gemini, parents are in control to decide if it's right for their child or not.
- Emily Kirstein
Person
But if they would like to get more information, we offer AI literacy guides, some designed specifically for teens for their developmental stage, and family conversation guides to have this conversation as a family about how to use AI. Both of these help reinforce the importance of knowing the limitations of AI, how to think critically about responses, and to double-check answers as needed. And we also offer things like podcasts for parents and a video series on how to use AI with your children. We're always looking for new ways to make Gemini usable and useful for youth.
- Emily Kirstein
Person
As an example, we recently announced a partnership with the Princeton Review to make free on demand SAT prep available within Gemini. So I know I've gone on a little bit.
- Emily Kirstein
Person
There's a lot to go through, you know, and this is really complex, but I hope that we're able to show the many different layers we're thinking about this in, of which parental tools is just one layer, all under the umbrella of, you know, the premise of wanting youth to have the benefits of this technology with those appropriate safeguards in place. Thanks.
- Rebecca Bauer-Kahan
Legislator
Thank you. And I will say the only one of these products that my kid has is YouTube, and I didn't know about a lot of this, so I learned myself. So I think the education piece is really important. I didn't know I could turn shorts off. He will not be happy when I get home, and that's the next thing I do. And then is Family Link available even if you're on an Apple device or do you have to be?
- Rebecca Bauer-Kahan
Legislator
I was curious about that. Okay. We will obviously have more questions, but I just wanted a baseline. Now we will turn it over to Lauren Haber Jonas, head of youth well-being and families at OpenAI.
- Lauren Jonas
Person
Thank you so much. First, like my colleagues, I wanna thank Victoria and Paul for their time and for the testimony here today. As a parent, I cannot imagine the experience that you've had. Good afternoon, Chair Bauer-Kahan and members of the committee. Thank you for the opportunity to be here and to testify, and for your leadership on youth safety.
- Lauren Jonas
Person
My name is Lauren Haber Jonas. I lead youth well-being and families at OpenAI. In particular, I come at this as a builder. So I lead product and engineering. I do not lead only policy for OpenAI. My teams are the ones building these things. We build parental controls. We build age assurance technologies. We build age verification. So we understand deeply the technical requirements, how difficult it is to do this well, what the opportunity is, and what any limitations might be.
- Lauren Jonas
Person
I have been doing this for ten years, so this is very much my life's work. I have been building in youth safety, on both the product and the engineering side, at large companies, at small companies, and at my own companies as an entrepreneur, for ten years. So our goal, when I got to OpenAI two years ago, from nearly the moment that ChatGPT launched, was to build this with youth safety at the start, from the moment that this was in the hands of teens. Again, this is my life's work and core to the mission of the company. I'm also the mother of three young children.
- Lauren Jonas
Person
I have three, seven and under. I don't sleep a lot. You see the bags under the eyes, as many have said. So I think about this both professionally and personally. We appreciate the committee's focus on parental controls as AI becomes more integrated into how young people learn, create, and explore information.
- Lauren Jonas
Person
The companies that are developing these technologies have a responsibility to build the protections in from the start and also give families meaningful tools. At the same time, it's important to recognize that generative AI systems like ChatGPT operate differently than social media platforms. ChatGPT does not have feeds. We do not have engagement algorithms or public posting. We have only been available since November 2022.
- Lauren Jonas
Person
But precisely because this technology is new and so powerful, we have focused on building these strong protections and learning from the lessons of platforms that have come before us. I'll talk a little bit today about the approach we're taking, the partnerships that guide our work, and the multilayered approach (again, not just relying on parents and parental controls, as some of my peers have stated) to guide families on how best to make sure that their teens are using these tools responsibly. So, fundamentally at OpenAI, our belief is that young people should be able to benefit from these tools, whether that means learning, exploring ideas, or developing new skills. Learning is one of the most common use cases on ChatGPT today. One in three US students use it to study.
- Lauren Jonas
Person
Many use it as a learning support tool, to create practice quizzes and study plans and to review drafts of assignments. It is a tool that helps them test their knowledge and clarify difficult concepts, and for many students, this kind of personalized support was previously only available through one-on-one tutoring. These benefits are immense, but they must be paired with intentional safeguards and responsible design, as we've said. One of the things that we have said publicly from the start is that our approach prioritizes safety ahead of privacy and freedom for teens, full stop. This is a new technology.
- Lauren Jonas
Person
It is a powerful technology, and we believe minors need significant protection. We have said this. Our CEO has said this a number of times before. This is a very serious responsibility that we take, both for our teen users and to their parents, to have a layered set of protections. I want to talk a little bit about how we partner with experts.
- Lauren Jonas
Person
One of the things that we have learned from companies that have come before us is that we cannot solve youth safety challenges on our own. We have built two external, third-party organizations that we partner with, the first being an expert council on well-being and AI. These are the folks on that council. These are researchers that study youth development, mental health, and the effects of technology. They come from Boston Children's Hospital, Georgia Tech, Northwestern, and the University of Oxford.
- Lauren Jonas
Person
We have also built a global physicians network. This is a network of 250 clinicians and physicians across over 60 countries. So the goal here is a global lens, not purely a domestic lens, that helps evaluate how our systems respond and helps guide our policies, our principles, and the content restrictions we have in place. Beyond that, we work closely with organizations that have long been leaders in the space. We work with Common Sense Media, the American Psychological Association, AFT, and Connect Safely.
- Lauren Jonas
Person
Today, in fact, I am here and not there, but we're hosting a convening of a cross-sector group of leaders, CEOs of the nation's leading mental health organizations, the American Psychological Association and others, at our San Francisco headquarters, a particularly unique convening to help guide our work in mental health for youth and for adults. There we go. Building on the input that we get from third parties, we introduced what we call our teen safety blueprint. The blueprint is meant to serve as both an internal framework for every team building within OpenAI and as a starting point for broader policy conversations about responsible AI and young people. And it has a number of pillars.
- Lauren Jonas
Person
The first, as we've talked about, is identifying users under 18, and age estimation is the initial approach we've taken. The second is a default safety layer of protections once those teens are identified. The third is a layer on top of that that offers parents the ability to have control, as we've talked about and heard quite a bit today. The fourth is designing systems that are not just a safety floor, but support well-being. What does that mean?
- Lauren Jonas
Person
How do we support the well-being of teens, not just the baseline safety for teens? And then the last is transparency. The goal here is to be as transparent as possible about our approach. The moral of the story here is that no single safeguard is sufficient on its own. We have taken a multilayered approach here, all working together: product design, behavioral policies, parental tools, consultation with experts. And most importantly, we work in the open.
- Lauren Jonas
Person
So we have published what we call our model spec, which is the set of principles that guide how our AI systems behave. So this guides how the model is built and how the model should be steered when interacting with teens. There is a specific section of the model spec that is dedicated to teens and to teen safety, which has been published, and we're happy to share it with the committee. Oh, backwards one. I wanna talk a little bit about the content restrictions that we have in place for teens.
- Lauren Jonas
Person
And, again, these are default on for teens when a teen is identified. Our systems should not romanticize self-harm or suicide. They should not engage in immersive role play with minors. They should avoid reinforcing harmful body ideals. They should encourage young people to seek support from trusted adults outside of the technology when facing difficult situations.
- Lauren Jonas
Person
Again, these are behavioral guardrails, these are content guardrails, that are a foundation. They are not the only mitigation, but they are the foundation on which everything is built. So now I wanna turn a little bit to parental controls. We introduced a set of parental controls in the fall, and our overarching goal as a product and engineering team was not just to build a new settings page, but to lead the industry and to pull the industry with us.
- Lauren Jonas
Person
And we'll talk a little bit about how we did that and how we feel we've done that. I wanna talk a little bit about our parental controls and how we feel that this is empowering families and educators. The protections reduce exposure to the types of content that research shows may be harmful for adolescents. So this is based on teen developmental psychology. Parents link their account to their teen's account and manage settings from a single dashboard.
- Lauren Jonas
Person
It allows parents to tailor the experience. But in particular, the setup process is very straightforward and works in both directions. A teen can invite their parents to parental controls. A parent can invite their teen to parental controls. It goes both ways.
- Lauren Jonas
Person
If a teen later unlinks their account, a parent is notified. If a teen asks to change a setting, a parent is notified. Teens cannot change settings on their own; that is only available to parents. So the goal here was to design a system that encourages communication between parents and teens and is transparent on both sides.
- Lauren Jonas
Person
A teen can't do anything in terms of editing these controls that their parents don't know about, and vice versa. Once accounts are linked, there are a number of different controls that a parent has. So the goal here is to get as granular as possible. A parent should be able to turn image generation on and off, turn voice mode on and off, and turn the sensitive content restrictions on and off. Maybe for their family, they're comfortable with their child seeing more adult content.
- Lauren Jonas
Person
They can also choose to receive alerts if the system detects possible signs of suicide and distress, and we'll talk about that in a little more detail, and to opt out of model training. The goal is to give parents options that are as granular and flexible as possible, in as simple a way as possible. All of these parental controls are default on. A parent does not have to opt in. And I wanna talk a little bit about safety notifications and how we built this.
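The opt-out design described, where protections are on from the start and a parent loosens rather than enables them, might be modeled roughly like this. Every field name below is hypothetical, and whether each individual toggle defaults on or off is an assumption; the testimony only states that the controls as a whole are on by default.

```python
from dataclasses import dataclass

@dataclass
class TeenParentalControls:
    """Hypothetical sketch of default-on teen settings like those described.

    Every field defaults to the protective value, so a parent opts *out* of a
    restriction rather than opting in to it. Names and defaults are
    illustrative, not OpenAI's actual settings schema.
    """
    image_generation_enabled: bool = False   # parent may turn on
    voice_mode_enabled: bool = False         # parent may turn on
    sensitive_content_restricted: bool = True
    distress_alerts_enabled: bool = True     # safety notifications on by default
    training_opt_out: bool = True            # teen chats excluded from training

# A freshly linked teen account is protected with no parent action required.
controls = TeenParentalControls()
```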
- Lauren Jonas
Person
This launched last fall. We were the first in the industry to build this, and we're heartened to see some of our peers, you know, follow us in that regard. What this is is the following: it is a safety notification system. It's an industry first, and it doesn't require an opt-in.
- Lauren Jonas
Person
So if you are in parental controls, you do not have to raise your hand as a parent and say, I wanna receive safety notifications. It is on by default, and we will notify you in three ways: in ChatGPT, via text, and via email. If we could do it via carrier pigeon, we would. But the goal is to get to a parent and to share that a teen is prompting for distressing content. The content that a teen is prompting for is never shared with the parent.
- Lauren Jonas
Person
So we understand and value the privacy of teens. We are not sharing the specific prompt and generation text that a teen is prompting, but the goal is for a parent to have enough information to take action. One thing that is important to note is that when a teen is prompting for distressing content, before a parent notification is triggered, that content goes to trained, full-time employees inside OpenAI for review, to make sure that we haven't had a false positive and haven't done this in an incorrect way before we send a notification to parents.
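The review-then-notify flow just described, where a flagged prompt goes to a trained human before any alert and the parent then receives only a category over multiple channels, can be sketched roughly as follows. All function names here are hypothetical, not OpenAI's code.

```python
# Hypothetical sketch of the described flow: classifier flag -> trained human
# review -> category-only alert to the parent over several channels.
def handle_distress_flag(prompt_text, classify, human_confirms, send):
    category = classify(prompt_text)             # e.g. "possible self-harm risk"
    if category is None:
        return False                             # nothing concerning detected
    if not human_confirms(prompt_text, category):
        return False                             # reviewer ruled it a false positive
    # The alert names only a category; the teen's actual prompt text is
    # never shared with the parent.
    for channel in ("in_app", "sms", "email"):
        send(channel, f"Safety alert: {category}")
    return True
```

The key design property the testimony emphasizes is visible in the sketch: the parent-facing message is built from the category alone, so the teen's prompt never leaves the review step.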
- Lauren Jonas
Person
We love that this has become an industry norm, and this is one of the ways that we hoped to pull the industry along in the parental controls space. As some of my colleagues have noted, our work is not done here. We are continuing to learn. We are continuing to improve. We partner with some of our friends over at Common Sense Media.
- Lauren Jonas
Person
We believe these are first steps. This is not the end. Additionally, because we know that parents need additional support and guidance, and teens need support and guidance on how to use our tools, we have family guides on how to use AI responsibly and a set of conversation starters for parents. These resources were developed with input from safety experts and organizations like Connect Safely and Common Sense Media. I wanna end with recognizing that protecting young people online is an ongoing responsibility.
- Lauren Jonas
Person
No single company, product feature, or law will solve these challenges on its own. We believe that progress comes from thoughtful guardrails, transparency, collaboration with experts, and empowering families. In fact, today we joined a group of kids' safety advocates, community groups, and other organizations as part of the Parents and Kids Safe AI Coalition to pass what we hope will be the nation's strongest child safety AI law. We appreciate the committee's work in this area. We look forward to continuing to partner with you, and thank you for the opportunity to testify.
- Rebecca Bauer-Kahan
Legislator
Thank you. I just want to clarify one question. You said that when parents get that notification, it doesn't say what the prompt was. It just gives them a category?
- Lauren Jonas
Person
It'll say your teen is prompting for suicidal content. Okay.
- Rebecca Bauer-Kahan
Legislator
Now we're gonna turn to Eliza Jacobs who is not sitting here, but her assistant and very talented government relations colleague is. So Eliza should be online. Eliza, do we have you?
- Eliza Jacobs
Person
Hi. Thank you so much for having us today, and thank you to all the previous speakers. I think it's just a testament to how much this needs to be a group effort for all of these different components to come together and talk about this important issue. And also, thank you so much for letting me testify remotely. It lets me be home with my kiddo for dinner tonight.
- Eliza Jacobs
Person
So I really, really appreciate it. As Chair Bauer-Kahan said, my name is Eliza Jacobs, and I lead policy at Roblox. First of all, I don't know how many people know what Roblox is, but Roblox is an immersive gaming platform. People can connect with their friends and family and play and explore. Molly, you can go to the next slide.
- Eliza Jacobs
Person
We have over 150,000,000 daily active users all across the world. About 66% of them are over 13, but that means there's a significant portion of our users that are under 13. And we have always been an all-ages platform, which has really informed our approach to safety over our twenty-year history. Next slide. Did we miss the slide there?
- Eliza Jacobs
Person
No? Okay. Yeah. So Roblox has been around for a while. We've always been an all-ages platform, and as a result, we've always built with safety at our core. We have a multi-tier, multi-level approach to safety. As many people have noted today, there is no one tool that is the silver bullet for safety. You have to have many layers and many tools to keep your community safe. And that's what we do at Roblox. So we start with robust policies.
- Eliza Jacobs
Person
Can we go back, Molly? Yeah. We start with robust policies. Our policies are purposefully more restrictive than most of the Internet. Again, because we're an all ages platform, we don't allow profanity, for example, on the platform.
- Eliza Jacobs
Person
We don't allow any references to drugs or alcohol on the platform. We are optimizing for the safety of our youngest users in our policies. We also have robust automated moderation systems. At our scale, you need to have AI working in partnership with humans to moderate the content on the platform. We then have teams of human experts doing human moderation for more complex cases.
- Eliza Jacobs
Person
We have a team of deep subject matter experts on all manner of child safety issues: grooming, suicide and self-harm, terrorist content, all of that. We have a team of internal investigators that work on those more complex issues. And we also have a wide variety of safety partnerships with NGOs, with Common Sense Media. You know, we work with all the organizations that people have spoken about earlier today. And I also want to highlight that we have a teen council and a global parent council.
- Eliza Jacobs
Person
And those are groups of users and parents that engage with the platform, where we're constantly talking to them about what they want to see and what would be helpful for them. We think it's really important to value the teen voice and the parent voice in all of these conversations. Next. So there are, as I said, many layers of safety on the platform. And to start with communication safety, we do not encrypt any of our communications.
- Eliza Jacobs
Person
So all of our communication can be monitored. We have AI models running in the background constantly to monitor for grooming and other critical harms behavior. We have internal experts that are looking at that communication and reaching out to law enforcement where necessary. We think it's really important when we're talking about kids that we're not encrypting communication. We also have a text filter that operates on communication on the platform.
- Eliza Jacobs
Person
So we're filtering inappropriate communication before it can be sent to other users. And specifically, it's designed to block the sharing of personal identifying information. So that could be sharing phone numbers, addresses, Instagram handles, anything that would make it easier for people to meet up with them offline, or online but on another platform. Next slide. And we know that it's important to design, again, with kids and teens in mind and to have additional protections for our younger users.
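The filtering step described, screening a message for personal identifying information before it is delivered, can be sketched as a simple redaction pass. The patterns below are illustrative only, not Roblox's actual filter, which would cover far more formats and use models as well as patterns.

```python
import re

# Illustrative sketch of a pre-send PII filter like the one described:
# messages are screened before delivery, and phone numbers, street addresses,
# and social handles are redacted so contact can't move off-platform.
PII_PATTERNS = [
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),       # US-style phone number
    re.compile(r"@\w{2,}"),                                  # social media handle
    re.compile(r"\b\d+\s+\w+\s+(street|st|ave|avenue|rd|road)\b", re.I),
]

def filter_outgoing(message: str) -> str:
    """Redact recognizable PII before the message reaches another user."""
    for pattern in PII_PATTERNS:
        message = pattern.sub("####", message)
    return message

filter_outgoing("text me at 555-123-4567")  # → "text me at ####"
```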
- Eliza Jacobs
Person
There are real challenges here, as many people have noted. As kids grow up and become teenagers, they have growing independence. They often have their own devices. Maybe they're alone in their bedrooms on those devices. They're moving between apps.
- Eliza Jacobs
Person
You know, everyone that has spoken today, our users are on their platforms as well. We can only control what they do on our platform. Once they leave the platform, we just don't have visibility into that. So a few things we've built into the product as safety features. First of all, there's no image or video sharing in chat.
- Eliza Jacobs
Person
So you cannot share a photo from your camera roll in chat on Roblox. You can't forward a video. As I said, we don't encrypt communication, so we're constantly monitoring all communication between users for potential harms. We also, and I'll talk about this a little bit more later, require age checks to access any communication features on the platform. That is a facial age estimation process that we rolled out starting in the fall and that is globally required as of January.
- Eliza Jacobs
Person
And we've open-sourced many of our safety models. You know, the companies that are testifying today are some of the bigger players, but there are lots of apps that just don't have the resources to build the kinds of systems that we're talking about today. And so we think it's really important to share this technology in an open-source way with the whole industry to keep everybody safe. We want kids to be safe not just on Roblox but everywhere. And we're constantly engaging with policymakers like yourselves and child safety experts and child development experts to understand what is necessary and what we need to build in the next generation.
- Eliza Jacobs
Person
Next slide. So, specifically talking about parental controls. And just to reiterate, all of those things that I just talked about come as a factory setting out of the box. You don't need to engage with parental controls to have any of that be true on the platform. And we think it's really important that you're starting from a place of default safety and that parental controls are just another layer in the arsenal, another tool so that parents and families can personalize their Roblox experience.
- Eliza Jacobs
Person
But by all means, we don't think that they're the end-all be-all, and we don't think that they should be necessary for kids to be safe on our platform. That being said, our parental controls were the result of extensive partnership and consultation with experts. We work with a variety of ratings boards. So in the gaming space, similar to movies, there are lots of different international ratings boards that rate content. Some of them are here.
- Eliza Jacobs
Person
We are working on integrating with IARC, which is the International Age Rating Coalition, so that sometime in the next year, our users will get localized ratings. Right now, and I'll talk about this a little later, you get our standard Roblox platform ratings. But in the future, kids in the US will get ESRB ratings. For those who have gamers in your life, you'll recognize those as things like E for Everyone and T for Teen. But in, for example, Germany, there are USK ratings.
- Eliza Jacobs
Person
In the UK, there are PEGI ratings. So those will be familiar to parents and will be displayed for their kids when they're accessing Roblox games. Next slide. So how do our parental controls work? Similar to what other people have spoken about, we have a sort of parent-link approach where parents create their own Roblox account.
- Eliza Jacobs
Person
They link their account to their child's account, and then their phone becomes sort of a remote control for their kids' Roblox experience. As a parent myself, I know that often you're making these choices, like, late at night when you finally sit down after doing the dishes. And, so it's really important, we think, that you have the ability to have asynchronous control over these things. You will also get notifications if your kids request a settings change. So if they are at a friend's house and they want to play a game that you haven't allowed them in their settings to do, they will send a request and you will get that request on your phone and be able to approve or deny it from your phone.
- Eliza Jacobs
Person
You don't necessarily have to be on their device to make that choice. In order to link your account as a parent, you have to verify your age. You can do that either with a credit card or with an ID. And once you've done that, you'll have access to the full suite of parental controls. One thing I want to note here is that another advantage of the parent-link approach is that it encourages parents to get in the game themselves.
- Eliza Jacobs
Person
We really believe that the more you're opening a dialogue with your kids and talking to them about their Roblox experience, or any online experience, the easier it will be to hear from them their honest experience. And if they believe that you care about what they're doing on Roblox and that your instinct isn't just to ban it, the more likely they are to be open with you about what's happening. So, you know, create a fun avatar, play a game with your kids. We think that's a big component of parental controls and parental involvement. Next slide.
- Eliza Jacobs
Person
So as I said, parent accounts must be age verified with government issued ID or a credit card. We only use this information to verify your age and so it's not an identity marker in an ongoing way. Next slide. And then once you do that, you'll have access to this user friendly dashboard with the controls that we heard from teens and parents that parents most want. Something that we've heard a lot today is that parents are overwhelmed and I can totally understand that as a parent myself.
- Eliza Jacobs
Person
I think what's most important is that we're giving parents the tools that they most want, and not a million controls and a million radio buttons that are overwhelming and that sort of become like an eye chart for parents to have to review. So we really focus on the things that parents have told us they want the most. And in general, those fall into a few categories: content restrictions (what your kid can play), communication (who your kid can talk to), spending (what they can purchase), and screen time, as well as a few other key controls. Next slide. So parents can see who their kids' friends are, and they can set daily screen time limits.
- Eliza Jacobs
Person
They can block individual connections, which means that your kids won't be able to talk to those users. And once that connection is blocked, kids can't go in and change that setting. Parents can also set a daily screen time limit within the app. One thing to note is that this might change day to day. You know?
- Eliza Jacobs
Person
Like, my daughter was home on Friday sick, and she got a lot more screen time that day than she would normally get. And so, again, we want this to be really easy for parents to do from their phones to be able to, like, quickly make adjustments if it's, you know, a sick day or a snow day as we've had many of here this year, and they they wanna let their kids have a little more screen time that day. Next slide. Parents can also set spending restrictions. It should be noted that parents are setting spending restrictions by sort of loading the money into their account in the first place, but they can also set additional restrictions and also notifications.
- Eliza Jacobs
Person
So if you want to get a notification every time your kid buys something on Roblox, you can do that. You can also get a notification just when the spend hits a certain limit, and you can set an overall limit as well. Next slide. Content maturity limits. So this is where the ratings come in.
- Eliza Jacobs
Person
We currently maintain a sort of universal Roblox standard of content maturity limits. Think of this like movie ratings. By default, users under nine only have access to minimal or mild content. Users over the age of nine will have access to moderate content. Restricted content requires that users be 18 or older. But, again, just to reiterate, our content policies are just much more restrictive than the rest of the internet.
- Eliza Jacobs
Person
So, again, no profanity, no drugs and alcohol, no sexual content. All of those things are just flat-out prohibited on the platform. And so these buckets are actually much more restrictive than traditional G, PG, and PG-13 kinds of ratings. Next slide. In terms of content restrictions, parents can block individual experiences that they don't want their kids to play.
- Eliza Jacobs
Person
And something that we took directly from the research: first, we show parents what their kids' 20 most-played experiences are so that they know where their kids are actually spending time. And then they can go into those, explore them, and decide whether those are appropriate, on top of the ratings-level restrictions. And this really, we think, surfaces the information parents need so they can make choices about what they want their kids to be able to play. Next slide.
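The maturity buckets described a moment ago, minimal or mild for under-nines, moderate from around age nine, and restricted content gated to users 18 or older, can be sketched as an ordered mapping. The exact cutoffs and the assumption that the 18+ gate requires verification are illustrations of the described policy, not Roblox's implementation.

```python
# Illustrative sketch of age-bucketed content maturity limits as described.
MATURITY_LEVELS = ["minimal", "mild", "moderate", "restricted"]

def default_max_maturity(age: int, verified_adult: bool = False) -> str:
    if verified_adult and age >= 18:
        return "restricted"      # assumption: the 18+ gate requires verification
    if age >= 9:
        return "moderate"
    return "mild"                # under nine: minimal or mild only

def can_view(age: int, content_level: str, verified_adult: bool = False) -> bool:
    """A user may view content at or below their bucket's maturity ceiling."""
    allowed = MATURITY_LEVELS.index(default_max_maturity(age, verified_adult))
    return MATURITY_LEVELS.index(content_level) <= allowed
```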
- Eliza Jacobs
Person
So, as I said, we started rolling this out in November, and as of January it is globally required that all users who wish to access communication features on the platform complete a facial age estimation process. Once they do so, they will be able to access communication features. They'll only be able to chat with other kids in their peer group. We're very optimistic that this step, though not required by anybody, will become the gold standard for age verification on the internet and for child safety. For a long time, knowing how old kids were was just incredibly difficult.
- Eliza Jacobs
Person
Right? For adults, we have IDs. But for kids, it was very difficult to know. So we're very excited to launch this globally. And we also have continual age estimation running in the background. I think Google talked about this as well.
- Eliza Jacobs
Person
But if we have any reason to believe that the age estimated on your account is not the age of the person using that account, for example, based on the nature of the games you are playing or the types of folks that you're friends with on the platform, if there seems to be a mismatch, we will introduce additional friction and ask you to verify again. Next slide. That's it from us, but I look forward to hearing your questions. We're very passionate about safety at Roblox and appreciate California's leadership on this issue.
- Rebecca Bauer-Kahan
Legislator
Thank you so much. We are gonna open questions with Assemblymember McKinnor.
- Tina McKinnor
Legislator
Thank you guys so much, and I'm rushing, and I'm sorry, but I have another meeting. This has been such an important topic today, and I thank you, Chairwoman, for bringing this forward. I wanna start with the settings. Is there any way we can make these settings, to protect ourselves and our kids better, more user-friendly?
- Tina McKinnor
Legislator
I just started, like, trying to protect myself by not allowing people to know my location and, you know, just privacy things on my own iPhone. And it has taken hours to go through there and try and figure out what to turn off, what to turn on, what to keep on, because I'm nervous about being followed and stuff myself, for privacy. And so is there any way you guys can make these settings more user-friendly?
- Emily Kirstein
Person
Oh, I think from the Google perspective, we're always looking to improve. This is an ongoing process. For parents in particular, we have a variety of resources; families.google is where parents can go to get instructions and more information on the different settings, in addition to, you know, the setup in Family Link. But I think what we believe is that this is a process that is going to evolve, right? As different technological tools evolve, so will protections, and so will the settings. And it's also why we prioritize working with our independent advisory groups.
- Emily Kirstein
Person
We have them on both the Google and the YouTube side, and also, you know, with civil society, with NGOs, with government, having that back and forth, and this will be an ongoing discussion.
- Lauren Jonas
Person
I think the single biggest thing that we're doing at OpenAI, and the easiest thing to do, is to have all the default settings on, so you don't have to figure out which ones are right, but the baseline safe, private experience is on. And that's the approach that we've taken for our parental controls, at the very least for parents and teens. We know that parents often don't know what the settings are. As my Google colleague has mentioned, we have literacy resources and in-person consultation, and we can get better at education, and I think we should, as we've noted. But by default, the controls should be on, and a parent shouldn't have to turn them on and figure out what they are.
- Tina McKinnor
Legislator
So when we purchase the phone and first get it, it should just be on by default already, and then you go from there.
- Lauren Jonas
Person
Speaking to the OpenAI, you know, ChatGPT experience in particular, that's the approach we've taken.
- Rebecca Bauer-Kahan
Legislator
And remind me, because now I've probably conflated all of these different safety programs, I apologize. So OpenAI, it's on by default for under 18. And then is that self attestation? How are you determining a ChatGPT user's age?
- Lauren Jonas
Person
Similar to our Google colleagues, three ways. Okay. Self declaration of age first. Okay. Age estimation that runs in the background and will determine whether a user is over or under the age of 18.
- Lauren Jonas
Person
And then, if we are not certain of a user's age, over or under 18, using age estimation, we default that user down to the under-18 experience. If we get it wrong and we have defaulted you down to the under-18 experience, you can use age verification, either via selfie or government-issued ID, to rectify that.
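The layered approach described in this exchange (a self-declared age, a background estimator, a conservative default down to the under-18 experience, and verification to correct mistakes) can be sketched as a simple decision flow. This is a hypothetical illustration, not any company's actual system; the `AgeSignals` fields and the `experience_tier` function are invented for the sketch.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class AgeSignals:
    declared_age: int                       # self-declared at signup
    estimated_adult: Optional[bool]         # background model: True/False, None if uncertain
    verified_adult: Optional[bool] = None   # selfie / government-ID check, if performed


def experience_tier(signals: AgeSignals) -> str:
    """Return 'adult' or 'under_18' for the account experience.

    Mirrors the testimony: when estimation is uncertain, default DOWN to
    the under-18 experience; verification can later correct a wrong default.
    """
    # A completed verification overrides everything else.
    if signals.verified_adult is not None:
        return "adult" if signals.verified_adult else "under_18"
    # Self-declared minors always get the under-18 experience.
    if signals.declared_age < 18:
        return "under_18"
    # Declared adult: trust only if the estimator agrees; otherwise default down.
    if signals.estimated_adult is True:
        return "adult"
    return "under_18"
```

For example, a user who declares age 21 but whose age the estimator cannot confirm lands in the under-18 experience until they verify.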
- Rebecca Bauer-Kahan
Legislator
Got it. And then you will, I assume, be complying with Assemblymember Wicks' bill when the time comes, which I know is not yet. Although I will say, before I
- Rebecca Bauer-Kahan
Legislator
turn it back over to Assemblymember McKinnor, my device manufacturer has now turned on age signaling by its own choice; it's not legally required yet. And I downloaded an app that was choosing to limit itself to 18 and up. My device then warned me I was downloading an 18-plus app and asked me if I wanted to change my age prior to sending the age signal to get the app. So even the device manufacturers are doing this, and it's not technically against the law. We didn't think of that.
- Rebecca Bauer-Kahan
Legislator
We didn't think the device manufacturers would be inviting people to change their age. So we'll be cleaning that up. But I just feel like every time we try to do these things, somewhere there's an end run around it. But we're gonna keep fighting the fight and closing the loopholes.
- Tina McKinnor
Legislator
Keep pushing. Given the subject matter of this hearing, I would like the panelists to comment on how we protect vulnerable youth who may not have active caregivers, but rather may be neglected or have experienced trauma at home, given that research shows that children who have experienced abuse or maltreatment are at heightened risk for suicidal ideation.
- Nicole Lopez
Person
I can start. We've been talking about defaults, and in a lot of the conversations I've had with policymakers, this has come up. And, again, not every parent is gonna be involved. A lot of parents can't be involved. They're working multiple jobs.
- Nicole Lopez
Person
I used to do domestic violence cases. There may be home situations where teens don't want their parents involved. But again, as we said earlier today, it's an outlet for teens to connect, to get educated, to find their passions, to communicate with their friends, and that's why we were the first to roll out teen defaults with teen accounts. We understood how important it was that even if a parent can't get involved, we need to have the strictest settings in place. And, again, we default all teens under 18 into them.
- Nicole Lopez
Person
I will say separately, we did work with three or four different expert advisory councils, and they drew a differentiating line at age 16. And so if you're under 16, between 13 and 15, you cannot get out of those protective defaults without a parent relaxing them. Older teens can drive, they are maybe studying, they have jobs; executive-functioning-wise, they're further along.
- Nicole Lopez
Person
Again, every teen is different, but there is a line between them, and we still default everybody into the protections. I think what's really important is not just the default experience itself; it's, substantively, what are we protecting against? And we wanna make sure that, again, the content teens are seeing is age appropriate.
- Nicole Lopez
Person
That goes to your question about, you know, sometimes vulnerable teens are looking for content that maybe they shouldn't see. So it's really important not only to have policies on it, but to enforce on it and to make sure that we are keeping that content away from vulnerable teens, especially if their parents aren't involved and cannot have conversations with them. I think who you talk to is really important. You wanna make sure that teens are not getting randomly messaged by people and that they are in a protected experience when it comes to messaging restrictions, so we default them into that. And so I think the point here is everything should be automatic without the teen even having to hit anything and try to get out of it.
- Nicole Lopez
Person
And if they do wanna get out of it, that's when they go to a parent or guardian.
- Emily Kirstein
Person
Well, when we're talking through how complex it can be, building for every type of child, every type of family, their unique experiences, that is why this can be hard and why we want the ability to have different settings. And, you know, from the YouTube perspective in particular, we're talking about access to a video library and the way that can help teens or users in vulnerable situations: finding authoritative content, finding content that is validating some of what they may be feeling in a certain family situation or what have you. I think this is also about not cutting off access for some of those teens who may need that information. From a YouTube perspective, teens are using this to listen to music while they're doing homework. For younger users, this is the largest video library of Sesame Street, for example.
- Emily Kirstein
Person
So this is a video sharing platform. And on top of that, there are digital well-being pieces built in: if someone is searching for suicide, self harm, or disordered eating, there are going to be protections defaulted in place, screen takeovers encouraging them to seek authoritative content and to take a beat, and elevating content about self compassion, about grounding exercises, things like that. So there's a variety of different ways that it can be supportive as well.
- Rebecca Bauer-Kahan
Legislator
And are yours on by default? The which? I'm sorry. So these teen protections you're mentioning.
- Emily Kirstein
Person
So we have age assurance. We rolled that out on our first party platforms, and it goes through that inference model that will say whether we think this user is above or below the age of 18, taking into account things like, again, how long the account has been in place, and are they looking for different kinds of content.
- Rebecca Bauer-Kahan
Legislator
That's fascinating. I just will say, again, I don't wanna put you in the hot seat, because my kids are on YouTube. And part of the reason they're on YouTube is because my son has learned to play chess on YouTube. He became a magician on YouTube. I actually think YouTube has really
- Rebecca Bauer-Kahan
Legislator
good content that my kids have grown from. And at the same time, I will say, my son does have his computer in the kitchen, so I see what he's seeing. He's also getting fed incredibly disturbing content every single day. And so I'm surprised by some of these answers, because it's all great, but it's not playing out in my household. So
- Tina McKinnor
Legislator
Very last question. And it is good to see you guys coming up with great ideas; that's good to see, because this is my second year in privacy. With no visible representation of people of color among your leadership here today, why should Black communities trust that your platforms are safe for their youth, for our youth? What measurable actions have you taken to eliminate systemic racism in your systems, and how are you being held accountable for those outcomes?
- Nicole Lopez
Person
I know. Mic drop there. I can address it and say, I think we need to do better. The fact that you pointed out that there aren't enough Black leaders at companies across the board, not just our companies here today, is something that we all need to work on. It's important. I can only speak to my experience. I will say that when I used to lead e-safety policy, which I did for two and a half years, we brought a lot
- Nicole Lopez
Person
of different perspectives into the group of people who were advising on how we built the products. And it was across race, across gender, across socioeconomic status.
- Nicole Lopez
Person
It was across lots of different countries, and also different kinds of families and different types of teens and parenting. And I believe wholeheartedly that the way you best design these experiences is making sure that you're getting all sorts of viewpoints in the room, that you're accounting for them, and that if you don't feel like you have enough diversity in the room, you have to try harder. So that I can speak to in terms of how our team worked with experts, parents, and policymakers. It was a very, very diverse group of voices.
- Lauren Jonas
Person
A couple of things to note here. I agree; I think we can all do better here. I am Latin, of Mexican descent, and I don't think that there is enough representation, just writ large, in the technology industry, so I'm in full support of that. More broadly, as we work with third parties, in the mental health space in particular, the CEOs of the major mental health organizations we work with are people of color.
- Lauren Jonas
Person
We ensure that on our Well-being Advisory Council there are people of color, and that the Global Physicians Network is globally representative, so that we are not taking a very particular approach in the decisions that we're making. I also wanna address the prior question, because it dovetails together. As we are building some of these systems, for example the parental notification piece that we've talked about, we understand that even when a parent is involved, that parent might not always have the best intentions. And this is something that has come through with some of the third party organizations we've worked with on mental health: before we send a notification, we are assessing for risk at home. That is to say, what else is that teen prompting about, to be sure that they're not prompting about suicidal content because there is risk at home. Right?
- Lauren Jonas
Person
And so I think a lot of this dovetails together, and the representative viewpoints from our well-being council on AI, our Global Physicians Network, and this broader representative dataset have really guided our approach here.
- Tina McKinnor
Legislator
And thank you for that. And to the companies you guys work for: in leadership and decision making, we need to see a more diverse group of people so that they can give their input, because this is affecting all of our children. It's great to see women sitting here, very good to see women sitting here, but we do need a more diverse perspective. So in the coming years, that's what I'll be looking at: where you guys are with AI, with online tools, how you are making sure that all kids are gonna be safe, because this affects all children. Thank you.
- Chris Ward
Legislator
Yeah. Thank you for the presentations. Obviously, this is a key interest of the committee for some of the work that's coming before us, and certainly a discussion out in the community, with parents, schools, and anybody who cares about our kids, myself included; I have an 11 year old and a seven year old. And I sympathize as well. The seven year old, you know, is loving YouTube, but maybe a little too much.
- Chris Ward
Legislator
And that raises a question, because I'm still educating myself on how to set things up well, and maybe we don't have enough education, right, when you're creating a new account. I kinda wanna ask two things of any companies that are creating accounts, or pointing you in the direction of making sure the good controls are in place. One, how do I even know to access these controls, or what options are available to me? We're learning things here just today that we never even knew. And two, if you are creating accounts, you said the technology is sort of screening that this viewer might be a youth, might be a teen.
- Chris Ward
Legislator
Are there proactive ways to prompt the teen, or any other viewers there, hopefully a parent in the room, to know about the options that are there, so they can start to avail themselves of parental controls or other systems?
- Lauren Jonas
Person
Yeah. I'll speak to OpenAI in particular. At every possible point, we are attempting to surface the concept of parental controls. It's available in our settings pages.
- Lauren Jonas
Person
We point users constantly to our help center, to our notification systems. The goal is to drive as many parents to this as possible. I think industry wide we can do better at education, as we've said here today, but the goal in the product is to surface as many notification moments as possible, both to parents and to teens: to our literacy resources, to our help centers, to the settings page, to engage with these parental controls.
- Chris Ward
Legislator
Yeah. I think that could certainly be a takeaway that we need to work on more immediately: making sure there's a lot more opportunity for all of the software and product that you're developing to help be a part of the solution here, that the information is getting out there so it can be availed of. And maybe related to this: in my case, with a seven year old, we sort of get him on there and he wants to watch a little bit, and I literally am typing in the search bar, you know, educational videos for seven year olds. And there's a lot of great options out there.
- Chris Ward
Legislator
Right? And so he starts going on those, and he's kinda clicking around, and I'm out of the room for ten minutes. The next thing, I come back in there, and he's watching hyper-graphic stuff, like, you know, war scenes and gunplay. And it's like, how did I get from here to here? Right?
- Chris Ward
Legislator
And if I was typing in educational videos for seven year olds, well, one, hopefully you're realizing that a seven year old is watching, and so it would have self-corrected, but that wasn't happening in this case. And two, why would algorithms even link these two? Kind of an open question there. And it really raises questions, because we're having that challenge right now. Next thing I know, literally this week, I got a call from the principal about gunplay at school.
- Chris Ward
Legislator
You know? And it's like, okay. Well, yeah, I guess he can't watch YouTube.
- Chris Ward
Legislator
And I don't want that prohibition, because I recognize the positive benefit of it. But something is just not actively working in practice right now, or there's not enough of a check in there. Fortunately, there wasn't a real problem, right? It didn't really have a serious outcome. But left unchecked, I can see more and more real problems surfacing.
- Emily Kirstein
Person
Well, I'm happy to take that. To kind of fuse the two questions, if I'm understanding them correctly: when a user starts a Google account, if they're telling us they're under 13, they're automatically going to a flow that says you need a parent, and they're getting that parent involved; they can't access anything until they connect with the parent. And so they would go through that Family Link flow, which has all of those settings we talked about. But if a user says they're above 18 and our age assurance comes in and isn't sure, then before they try to access any age-restricted material, they have to confirm their age. And in that default setting, if we're seeing that it is indeed a seven year old, we would send them to Family Link.
- Emily Kirstein
Person
But say it's a teen: we're putting those default settings in place. For the parts of YouTube, I think one of the things that's really important, and I certainly can't speak to any specific incident, is that what we're trying to do is elevate high-quality content and limit low-quality content. So for the teen experience, we have principles that we've worked through with third party experts, for both kids under 13 and for teens, to figure out what high quality looks like, what low quality looks like, and how to adjust those personalization recommendations accordingly. And the other thing, not to say that this is the case here, but one of the things we hear a lot is the importance of children being on their own accounts. If a child is on a parent's account, they're not necessarily gonna have those default settings, with those personalization and high-quality principles elevated in the feed the way they would be for a child, and they can't take advantage of not only the under-18 default settings that we talked about but also whatever parental tools are in place.
- Emily Kirstein
Person
And so we're making it easier for parents to go back and forth and, you know, wanting to show the importance of kids being on their own account.
- Chris Ward
Legislator
Thank you for that. I wanted to switch, because I'm overdue for a 4:00 meeting, madam chair, but I did wanna make sure that we were at least able to work on another community issue. I'm the chair of our LGBT caucus, and this comes up often as we're thinking about how to manage social media. We do have concerns sometimes, because we recognize both the positive benefits and the negative challenges around social media use. You can imagine a number of scenarios where a youth might be identifying or questioning themselves, but they might not be in a supportive environment, or they really just wanna go to more constructive, proactive things.
- Chris Ward
Legislator
Think Trevor Project, think your local LGBT teen center, a support group, just positive information. And with parental controls, with the ability to manage all that, things get a little dicey, right? Because parents are watching what their kids are accessing, and that might be getting into their space of privacy a little too much when they're not ready to come out, or they may not be coming out in a very supportive environment, or worse, right?
- Chris Ward
Legislator
Like a very hostile environment. And so that's something that comes up in this committee's conversations as well, as we're thinking about these regulations. And I guess, and I know the surgeon general is looking at studies of both the positive and negative effects, what do you see as the kind of lens that you're thinking through when it comes to LGBTQ youth, to make sure that they're protected overall, but that privacy considerations are embedded as well, and positive benefits are directed to them?
- Emily Kirstein
Person
I think that's incredibly important, and we do talk about it when we talk about how parental tools should work at different levels. That's why this is a difficult conversation: we need to balance the fact that teens do have an increased developmental capacity for autonomy, wanting to make sure they have, of course, all of those default settings, but knowing there are really good reasons why they should have a more autonomous experience. And it's really important to think through those exact kinds of examples as we're thinking through what public policy looks like, and why it's important not to completely cut off access, but to allow access within a safeguarded environment.
- Lauren Jonas
Person
On the ChatGPT side, and again, we're not social media, so it's a little bit of a different game here. But on the parental control side, one of the core tenets and principles in the way that we built this is that a parent will never have access to, and will never see, the exact prompt and generation text that a teen is putting into ChatGPT. It's why, as we built parental notifications, the general topic of the distressing content, its being suicide specific, is shared, but the exact prompt and generation text is not. Because why that teen is suicidal, and everything that surrounds that, is their privacy. But we want to give parents enough to have the ability to take an action.
- Lauren Jonas
Person
So the goal is to preserve the privacy of the teen while allowing a parent to have enough information to do something about it. We recognize this tension, and we have thought extensively and worked with our third party experts and councils and the APA on this exact question, so I appreciate it.
- Rebecca Bauer-Kahan
Legislator
And is there a difference, and I appreciate, I think it was Instagram, that you mentioned there's a difference in your programs between under 16 and 16 to 18, for example. Do you see any distinction between any age groups, or is everything under 18 privacy protected like a seven year old?
- Nicole Lopez
Person
You're right in everything that you said. I think it depends on what the experience is. So just to elaborate: when we launched the new expanded version of teen accounts, we took a different approach when it came to content, and we identified that teens, whether you're 13, 14, 16, or 17, should not see content that's 18-plus. So depending on the type of experience, we actually sometimes delineate at under 16 versus 16 and up, and then there are other experiences that squarely fit the rule that this is an experience teens should have, and they should not be accessing adult, inappropriate content. So I just wanted to... Okay.
- Rebecca Bauer-Kahan
Legislator
No, I appreciate that clarification; didn't wanna misspeak for you. So do you have any distinction under 18, or is everyone under 18 as privacy protected as you just mentioned?
- Eliza Jacobs
Person
We at Roblox, just to jump in there... Go ahead. Kids grow up in a variety of ages and stages, right? So as you age on the platform, you have access to an expanded set of products, features, and content. And we think of that as a training wheels approach.
- Eliza Jacobs
Person
We want to teach kids good digital habits. And we know that at Roblox, for many kids, we're the first account they ever have on the Internet, and we take that really seriously. So we make distinctions. Under nine, for example, there is no access to direct messaging on the platform.
- Eliza Jacobs
Person
And as you age up, you have expanded access to communication features, after that age check, and to different kinds of content. And then at 18-plus, you have access to restricted content on the platform. But I do think one thing that would be incredibly helpful, and this is from a couple of questions ago: we're all talking about safe by default and then layering on parental controls, but none of us necessarily uses identical language or identical terms for settings and buttons and tools. And that makes it really hard for parents to navigate across platforms.
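The "training wheels" tiers described here (no direct messaging under nine, communication features expanding after an age check, restricted content only at 18-plus) could be modeled as a simple age-gated feature lookup. This is a hypothetical sketch: only the under-nine direct messaging restriction and the 18-plus restricted-content gate come from the testimony, and the `FEATURE_MIN_AGE` table and `allowed_features` function are invented for illustration.

```python
# Feature gates keyed by minimum checked age. Only the two entries below are
# stated in the testimony; a real platform would have many more tiers.
FEATURE_MIN_AGE = {
    "direct_messaging": 9,
    "restricted_content": 18,
}


def allowed_features(checked_age: int) -> set[str]:
    """Return the set of features unlocked at a given (age-checked) age."""
    return {
        feature
        for feature, min_age in FEATURE_MIN_AGE.items()
        if checked_age >= min_age
    }
```

The design choice this illustrates is that safety is the default: a feature absent from the table, or an age below its threshold, simply never unlocks, rather than requiring a parent to switch it off.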
- Eliza Jacobs
Person
You know, I think the stat is most kids are on upwards of 40 different apps. And so to the extent that regulation, that legislation can standardize some of that language to make the cognitive load easier on parents, I think we would welcome that as an industry to say, like, this is what this word means. Everyone use this word when you're talking about this control. That would be incredibly helpful. We're all engaging with experts and teens and parents and NGOs and, you know, pediatricians and all of that, but we're all landing in slightly different places even though we're all trying to get to this outcome.
- Eliza Jacobs
Person
So the more we could standardize that language, I think the better and safer everyone is.
- Rebecca Bauer-Kahan
Legislator
Yeah. And I think that leads me to my next question. You know, Sally Roberts, who had to leave, passed the age appropriate design code, which was really intended to get at how we design these systems to be safe for children. And some of what I'm hearing today is unclear to me: are you changing the algorithms or the recommendation engines, or are you just shielding content?
- Rebecca Bauer-Kahan
Legislator
I don't know; that's a little unclear, if you wanna answer that. The age appropriate design code was then litigated, and according to the Ninth Circuit it is now very minimally lawful, mostly not lawful. So I guess I'm a little bit lost. Okay, great, we're here, you're talking about all these things. We had an Assemblymember who has led in the space for a long time; we tried to put that forward, and it was then sued over by industry.
- Rebecca Bauer-Kahan
Legislator
So is that the gold standard? Should we be saying what's safe for kids online? Is that something we as a legislature are allowed to do, I guess, is the question. I don't know if I said that well. But
- Eliza Jacobs
Person
I would say Roblox supported the California age appropriate design code for precisely the reason I just discussed. And, you know, I can't speak to the legality necessarily and what those arguments were. But I do think industry standards that people can align on would be incredibly valuable.
- Rebecca Bauer-Kahan
Legislator
Anyone else wanna weigh in on age appropriate design in concept?
- Emily Kirstein
Person
Well, speaking to the purpose of age appropriate design: we are in favor, and have been. We had a legislative framework to protect children and teens, which I think we released back in 2023, with things like requiring companies to take the best interest of the child into account and requiring companies to have offerings that prioritize mental health and well-being. And with regard to age appropriate design, there's a lot we have in place: age assurance was part of that, privacy by design was part of that. And more broadly, as we've talked about, and maybe this is unsatisfying in some ways, I think my colleague from Meta said this isn't static.
- Emily Kirstein
Person
This is an ongoing conversation, an ongoing way that we want to be meeting the moment for both parents and for minors.
- Nicole Lopez
Person
I mean, I was just gonna jump in; I agree with Eliza. I think standards are good. And you've heard it: we all have different versions of default settings, different versions of parental controls, different versions of content ratings, I guess, if that's what you're gonna call it. So we're all solving for the same root issues, we're trying to put in mitigations, and we're all working with experts and parents.
- Nicole Lopez
Person
I mean, we're all facing the same things. What we heard earlier today, and I know you're gonna have another panel on this too, is that, as a parent, it's a lot of hurdles to jump through; Eliza cited the same University of Michigan and Common Sense Media research study that showed teens are on an average of 40 apps per week. And frankly, parents have said time and time again, and teens have said, that the digital world is not going away, and there's a lot of good in everything that everybody has said today. It's not going away. But parents should be able to support their teens when they're online.
- Nicole Lopez
Person
And if a parent doesn't want their teen on 40 apps per week, they should be able to pick the apps and approve them. I mean, you like YouTube; if you want your teen or kid to be on YouTube, that's your choice. It doesn't remove the obligation on all of our companies to build those age appropriate experiences. Those still have to happen.
- Nicole Lopez
Person
But I think we need to make it easier on parents, because every person here has described a different version of the hoops parents are jumping through, streamlined or not, to support their teens. And we've pushed with federal legislation and with state legislation to get parental consent at the OS and app store level. If you can make it easy on parents, and the apps continue to build these safeguards as technology changes, you're supporting not only teens but also their parents. So it's everything that we've been discussing, and then some.
- Rebecca Bauer-Kahan
Legislator
No. And it's funny, you say I like YouTube; I actually have a love-hate relationship with it. I think there are
- Rebecca Bauer-Kahan
Legislator
issues with it, as with most technology, frankly. So not to pick on YouTube again, but it's so complicated. And look, this is my life's work, and I didn't know about the parental controls on YouTube. So if I don't know about them, that really says something.
- Nicole Lopez
Person
But that's the point. It can be easier for everybody at the OS app store level, where it's the same thing, the same standards, and then parents can decide: I'm okay with this app. Maybe my 12-year-old or 13-year-old is fine with YouTube, but maybe I have a kid with ADHD who's not okay with it. Right? You as a parent should be able to decide, and if you change your mind, you change your mind. But that's a parent's decision.
- Rebecca Bauer-Kahan
Legislator
I also think, look, I love the training wheels analogy, because I actually truly believe, and this is why the computer's in the kitchen, that in my family, my kid will leave home, and he will have these devices, and he will have access to these things. And it's my job while he is in my home and living under my roof to help him learn to navigate these spaces. You know, I went to college long before these spaces existed, but we knew the kids who were sheltered a little bit too much and got to college and, with other things, went a little bit, you know, wild, because they hadn't been taught how to manage things that are exciting.
- Rebecca Bauer-Kahan
Legislator
And so I struggle, because I think kids should be in these spaces with their parents, learning how to navigate them. How do we think critically about content on YouTube when you're being fed something that is maybe toxic or problematic or not factually based? You know, how do you ask questions and look up sources? That is something people have to learn. But at the same time, I sit and watch, you know, my daughter be fed content, frankly, that is different than my son's and that is incredibly disturbing from a body image perspective.
- Rebecca Bauer-Kahan
Legislator
I'm like, should I be allowing this at all? So I think that if we can create spaces where they can learn and grow and start to get these critical thinking skills, we are better for it. And the problem is, I think we're not there right now. So, Assemblymember Wicks wanted
- Rebecca Bauer-Kahan
Legislator
me to ask you her questions. I think we've answered the first one. She said: for users under 18, are the default settings the strictest? I think the answer was yes for everybody. Mhmm. Correct me if I'm wrong.
- Rebecca Bauer-Kahan
Legislator
Yes. Okay. And then, who can change them? Can kids override them? I think I heard you say that only at 16 to 18 can kids override them, in some contexts.
- Nicole Lopez
Person
Unless they're in parent supervision. Some 16- and 17-year-olds may wanna be in parent supervision. If they're not in parent supervision, they can undo some of the settings. Not all.
- Emily Kirstein
Person
And for Google, a supervised user would remain on supervision after the age of 13. With YouTube, there's a voluntary teen experience that I'm happy to get more information for you on. But I also wanna go back to the point that was made earlier and just clarify that parents right now, both on Android and through Family Link, have the ability to approve or block apps, and I wanna make sure that that's very clear.
- Rebecca Bauer-Kahan
Legislator
And that's true on Apple too, I think. Yeah. I have Apple devices in our house. So, yeah.
- Rebecca Bauer-Kahan
Legislator
Yes. Yes. You said nobody can override. Right? Okay. Yeah. And then, I think, Roblox, I heard you say it depends on the age, but cannot
- Eliza Jacobs
Person
Depends on the age. It's sort of, as I said, a training wheels approach. We have parental visibility through, I think, 18, but it might be 16. I will double-check. And then we also have youth mental health tools, again, to the sort of digital literacy point that you were making.
- Eliza Jacobs
Person
We work with our teen council to ask them what would be most valuable to them. And so at 13, they have a series of youth mental health tools available to them in their own dashboard to make choices for themselves.
- Rebecca Bauer-Kahan
Legislator
Got it. Okay. And then her next question was about our prior panel, and this is kind of a tough one.
- Rebecca Bauer-Kahan
Legislator
I'm giving you her question; she asked tough questions. Would our prior panel believe that the strictest settings, presuming they keep it on the strictest settings and don't make different choices, as they can in some of these programs, are good enough? And she gave an example. I'll read her example of what she meant. She said: for example, Google said you have a bedtime reminder. Mhmm. Can the kids just close that window and keep scrolling?
- Emily Kirstein
Person
Well, I think there are bedtime reminders, and I wanna make sure I get it right, so let me make sure to follow up after. But I think there are bedtime reminders, and then there's downtime. And I think those are all available through Family Link for parents to completely shut down the phone, whether it's, you know, the reminder itself, but also, you know, having the phone itself be off.
- Rebecca Bauer-Kahan
Legislator
Okay. So, I mean, I think it's a challenging question. Would they think that these are sufficient? The answer I heard them say themselves was no. So I don't know if you wanna speak for them, if that feels appropriate.
- Rebecca Bauer-Kahan
Legislator
But I guess the last question, which is her question, but one I actually share with her, is: you're all, you know, sitting here saying you're trying, yet kids are dying. Right? I mean, kids are being harmed. Kids are having eating disorder behavior because they're being fed too much content of that nature. I think that's the lived experience of me and my peers, and it sounds like every single one of us is a mom, I think. Right? All moms of the same vintage, for sure.
- Rebecca Bauer-Kahan
Legislator
So you're probably getting the same questions and comments at the soccer games. I am. You know, it's not working. Right? We see our teens and our younger children, I mean, addicted, wanting that device so badly, not wanting to go out and play because the iPad is sitting there, even if it's turned off. And so I guess the question is: if you're doing all of these things and you think they're best in class, why are we continuing to see harms?
- Lauren Jonas
Person
I'll sort of answer from an OpenAI, ChatGPT perspective. And I think we've all said this: this is a marathon, not a sprint. The way that teens engage with ChatGPT in particular changes over time as they grow, as they age. It's a learning source.
- Lauren Jonas
Person
It's a teach-me, quiz-me source. It evolves. The product is so new and so early, at least for us; it's only been around since November 2022. The mitigations and the controls and the content restrictions are constantly changing, and we're evolving them because of the way that teens are using the tool. In the ChatGPT case in particular, it is so new.
- Lauren Jonas
Person
It is such new technology, and the technology changes over time. So for us, the approach of iterative deployment is how we think about this, which is to say we restrict, and then learn and evaluate. To one of our prior panelists' points: we learn. We look at metrics.
- Lauren Jonas
Person
We have dashboards. We look at the individual user level and at the aggregate level to understand how our mitigations are working. And so I think, at least for ChatGPT, this is such a new technology that this will be a process. It's a marathon, not a sprint.
- Rebecca Bauer-Kahan
Legislator
And have you pulled back models because they were harmful?
- Rebecca Bauer-Kahan
Legislator
And that was, as I understood it, mostly a sycophancy problem? Is that right, or am I mistaken?
- Lauren Jonas
Person
There were a number of reasons that model was deprecated, but it's no longer in production or available to users.
- Eliza Jacobs
Person
I totally agree. Look, I think it is a marathon, not a sprint, and the technology is constantly changing. You know, we only launched facial age estimation a couple of months ago, when we felt the models were accurate enough to give us an accurate age signal. We did not have that tool before. And so as the technology improves and becomes available, we will use it.
- Eliza Jacobs
Person
And as our platforms grow and change, we will need to add more tools on top of them. I think the other thing about this is that all of these platforms are a little bit different. They have a little bit of a different offering, and all of our kids are a little bit different, and what they need is a little bit different. It's not one-size-fits-all, at the platform level or at the user level.
- Eliza Jacobs
Person
We're not talking about car safety. Right? A seat belt protects all of us the same. An airbag protects all of us.
- Eliza Jacobs
Person
But when you're talking about different populations, as was talked about earlier, for some kids, parental controls are incredibly important. And for some, that same parental control might actually expose them to harm, because their parent now knows something about their private internal life that might cause the parent to harm them. So it's just so complex and so multilayered that there isn't one solution, because every kid is different and every platform is different. And that's why it's a never-ending problem to
- Rebecca Bauer-Kahan
Legislator
solve. No, I appreciate that. And I think what I struggle with, and I've told this story before: when my kids were born, I had a vibrating chair. It was the only place my babies would sleep. It was my favorite thing in the world, because it got me a nap and a shower most days. I believe it was five babies who flipped over and suffocated in the chair. The chair was recalled because the United States of America wouldn't accept five deaths
- Rebecca Bauer-Kahan
Legislator
in children. And so I get that this is hard, but we have accepted far too many deaths of children through online harms. I hear you; I think it's hard. I understand that. But I just get to a point where I think we allow these tools to be in our children's hands when we know we haven't gotten it right.
- Rebecca Bauer-Kahan
Legislator
And we don't allow that for a drop-side crib or a vibrating chair, but for some reason, we do for online spaces. So I think we need to do better, because I don't think we should be accepting the death of even one child, never mind the numbers we've seen across these products, and not just the products represented here, just to be clear. We would have invited everybody, but it would have taken us seventeen hours. So I do, again, appreciate you all being here and having this conversation, because we've struggled in this committee, and I think many of you know that, to try to figure out a way to protect children and allow them to grow and experience online spaces.
- Rebecca Bauer-Kahan
Legislator
And I think that is everything, Alyssa. Anyone else want to answer that question that I asked? I don't know; I think I heard from everyone. Okay. Well, thank you. I do really appreciate it. I think your willingness to come here and engage with us, and talk about what you're doing, and iterate and do better and devote your lives to protecting children, really matters. So I appreciate that. Thank you all.
- Rebecca Bauer-Kahan
Legislator
Thank you. And we're gonna move to our last panel, which is the solutions panel; we like to end with solutions in this space. We're gonna have Holly Grosshans, and I'm probably butchering that even though I've said your name so many times, from Common Sense Media, senior counsel for tech policy. Sunny Liu, director of research for the Stanford Media Lab, will be back, and Anneke Buffone, founder and CEO of Clara, will join us again too. So thank you all for coming back. Give us the answers.
- Holly Grosshans
Person
Mhmm. Alright. Thank you. Can you hear me? Is this on? Yes? Okay. Great. Thank you, Chair Bauer-Kahan, for having me here today to speak to you about this really important topic. As you have said, my name is Holly Grosshans, and
- Holly Grosshans
Person
I am senior tech policy counsel for Common Sense Media. And more importantly for my testimony today, I'm the mother of two small boys. Despite spending nearly my entire career working to protect kids online and holding those responsible accountable when kids are harmed, I still struggle to keep my own kids safe online. Because of those concerns, I, like you, don't allow my children to use most of the platforms you just heard from here today. Common Sense Media works to improve the lives of kids and families by providing research-backed information, education, and an independent voice for parents navigating the age of apps, algorithms, and AI.
- Holly Grosshans
Person
In California alone, more than 100,000 educators are registered to teach Common Sense's digital literacy and well-being materials to their students in 13,553 schools across every legislative district in the state. As has already been identified this afternoon, raising children today involves navigating a digital environment significantly more complex than what previous generations encountered. Thousands of apps, games, and online services compete for young people's attention, and this landscape is being fundamentally transformed by the unprecedented power and associated risks of AI products. Parental controls can help families manage this environment, but as we have already heard here today, it is important to understand both what these tools can and cannot do. There we go.
- Holly Grosshans
Person
As you have already heard, there are so many types of parental controls available today. We agree and adopt the comments from the opening panel as to the laundry list of limitations of these tools. Even when parental controls are set up correctly, they have clear limitations. There are some points about these limitations that I think are worth reiterating, particularly after hearing from the industry panel that just pointed out all the controls that are available. So I'm gonna go into a couple of the limitations.
- Holly Grosshans
Person
Content filters offer limited protection for younger children and can be ineffective with tech savvy teens, giving parents a false sense of security. Screen time limits are insufficient when platforms are designed for maximum engagement, creating a no win situation for parents. Activity monitoring can raise privacy and trust issues for older teens that need some independence, and research has shown that surveillance can lead to unhealthy self censorship. It is important to remember that no single solution is going to fit all families. Product design increasingly shifts the impossible burden of risk management onto parents.
- Holly Grosshans
Person
Furthermore, parents face a fragmented system where they are expected to master different controls across every device, app, and platform their child uses. This can mean more than 40 different devices, apps, and platforms, which is why some families have turned to third-party monitoring services like Bark, BrightCanary, or Aura, with which Common Sense Media is also a partner, to help manage this complex landscape. These third-party monitoring tools can provide additional visibility and alerts to parents across devices and apps. But even these third-party tools have limitations. They can help prevent harms, but they can't fully prevent them, because they're reactive to what kids are doing or searching for online.
- Holly Grosshans
Person
And it is hard for these companies to keep pace with the rapid evolution of platform features and design choices. There are also equity and accessibility challenges with parental controls. As we've already pointed out during this hearing, not all families have the same time, technical expertise, or financial resources to navigate parental controls or third-party tools. Many parental control tools require complicated setup that is time-consuming, I think we heard over a hundred and twenty hours a year, and requires ongoing monitoring. Additionally, third-party tools require paid subscriptions that also require setup and maintenance in order to remain effective.
- Holly Grosshans
Person
This creates an equity gap. Families who can afford monitoring tools and have the time to manage them often have stronger protections set up than families who cannot. Safety should not depend on how much time, money, or technical expertise a parent or other caregiver has. The strongest privacy and safety protections should be built into products by default, not hidden behind complex settings or paid services. This is a matter of public policy when companies will not do this voluntarily.
- Holly Grosshans
Person
An additional equity issue is how well these tools work in languages other than English. Although many parental control tools at the app, device, operating system, and third-party levels are offered in languages other than English, safety alerts and AI moderation appear to work best in English. All of this leads to a simple conclusion: parents need more support, not more to do. Parental controls alone are not sufficient to protect kids online.
- Holly Grosshans
Person
They are often complicated and ineffective even when successfully set up. Research consistently shows advertised safety features don't work as promised, or have disappeared and are no longer available. Internal company data confirms these tools are insufficient, yet companies promote them as safeguards. Many controls rely on accurate age verification to work, but current age verification methods on platforms are
- Holly Grosshans
Person
inconsistent or can easily be bypassed. Minors are able to create multiple accounts, ultimately evading parents' efforts to use these parental control tools, without parents even knowing that their kids have unmonitored accounts. Therefore, platforms should be required to set the safest, most private settings by default, and must be required to determine the actual age of all of their users, an area where progress has definitely been made, but not all the platforms you just heard from are doing that. Platforms have a responsibility to help children navigate the digital world. Parents have that responsibility as well, but parents should not have to shoulder that burden alone.
- Holly Grosshans
Person
Parenting, as you know personally, is already hard and exhausting. Families should not have to search through endless menus or toggle dozens of settings just to prevent predators from contacting their children in private chat rooms. What parents want, and what they deserve, is safety by design, with the strongest privacy and safety protections turned on by default. At Common Sense Media, helping families navigate the current digital environment is a core part of our mission. We provide resources to help parents and educators understand how platforms work, what parental controls are available at the device, operating system, and app levels, and the limitations of these tools.
- Holly Grosshans
Person
We heard from Children Now earlier, in that first panel, that kids want their parents to be able to talk to them about their use. Our Parents' Ultimate Guides, ratings and reviews, and safety assessments help families start conversations with their children about online safety. But even with these resources, keeping up with platforms can feel like playing whack-a-mole: new features, engagement tactics, and design changes appear constantly. These guides are valuable, but they are not a solution for the underlying problem.
- Holly Grosshans
Person
Parents cannot and should not have to solve for the structural safety issues created by intentional product design choices. Schools and educators also play a critical role in helping young people navigate the digital world. For over a decade, Common Sense Media's digital literacy and well-being curriculum has been a trusted resource in schools statewide and nationally, undergoing regular updates to keep pace with technological advancements. We strongly advocate for digital literacy education in all classrooms, and we are seeing significant success. For instance, in Ventura and Los Angeles Counties alone, more than 51,000 educators across 3,456 public and private schools have implemented our digital literacy and well-being curriculum.
- Holly Grosshans
Person
Our updated, age-appropriate, and research-backed curriculum includes more than 140 lessons for students in grades K through eight, and we are on schedule to release another 50 to 60 lessons for ninth through twelfth graders by back-to-school this fall. These lessons cover topics such as privacy, safety, digital footprints, relationships and communication, cyberbullying, cybersecurity, and media literacy. We also have professional development resources for teachers and administrators to help build digital learning environments. Education is essential. But make no mistake: education alone is not enough in an environment where the products children use are unsafe.
- Holly Grosshans
Person
We can and must teach families how to navigate technology, but we, and this committee in particular, must also ensure the technology itself is designed safely. Policy guardrails matter, and this is why Common Sense Media views online safety as a three-pronged approach. First, we educate families and young people. Second, we push the industry to design products with children's safety in mind. And third, we advocate to powerful lawmakers like yourself, who have passed laws that establish the required guardrails to make the Internet healthier and safer for all kids.
- Holly Grosshans
Person
California has been a leader in this, but there's more work to do. Laws like SB 976, social media warning labels, and the Age-Appropriate Design Code are all guardrails, but they are not substitutes for parents' engagement in their kids' digital lives. They are needed to support parents and to ensure that parents are not expected to manage risks that are created by product design. In closing, parental controls can be valuable tools for families, but they are insufficient, and sometimes misleading, relative to the risks children are facing online. Safer product design protects kids.
- Holly Grosshans
Person
If we want young people to be safer in digital spaces, platforms must be designed with safety, privacy, and well-being in mind from the beginning. And companies must be held accountable if they evade these requirements. Protecting kids online requires a shared responsibility. Parents, educators, the companies, and policymakers all have a role to play. Parents should have tools to help guide their children.
- Holly Grosshans
Person
Schools should help students develop digital literacy and resilience, and ultimately, companies must be required to build safer products and be held accountable when they do not. They cannot be given a free pass on liability by offering limited parental controls. As a committee, you have already begun to do this work in holding these companies accountable. And as a mom and an advocate, I hope that you continue to do so. Thank you.
- Rebecca Bauer-Kahan
Legislator
Thank you so much. And it's interesting: one of the questions I forgot to ask Google was, they mentioned all the parental controls on Gemini, but Gemini is turned on by default in Google Classroom, and I assume parents have no control over what is happening in Gemini in that environment. And I know that Google Classroom and classroom tools are a way that kids obviously get around parental controls writ large, but it does sort of lead to another gaping problem in this ecosystem. And so I really appreciate the safe-by-design articulation, for sure.
- Sunny Liu
Person
So in the last panel, I talked about challenges. Now I want to focus on solutions. Some data from research will give us ideas, but I also want to talk about personal ideas based on perspectives I learned from my research with parents and children. Again, the views presented here are my own and should not be interpreted as representing the views of my university. I will quickly go over five ideas, based on research from communication, psychology, human-computer interaction, and tech policy, that can make parental controls work better.
- Sunny Liu
Person
Awareness, transparency, standards, streamlining, and accountability. Those fall into two categories: the first three build understanding, and the last two drive action. In the first panel, we learned that parents and kids are aligned on the goal they want: a healthy, safe online space. The question is really how we can build that. I think we should start with awareness. Where parents mostly struggle is that either they don't know what's going on, or they are overly restrictive. Can we build a digital window?
- Sunny Liu
Person
Can we build a digital window that creates awareness, giving parents and caregivers awareness of children's online world without excessive intrusion or total blindness? And to really have awareness, we need real, useful data. Can we have real transparency and meaningful data? Specifically, we want data on children's experiences at both the device and platform level, so parents, caregivers, and kids themselves can see the content, encounters, experiences, and harms they face.
- Sunny Liu
Person
And as policy, platforms must share this with parents, researchers, and policymakers. When I talk about transparency, I'm not talking about those platform agreements that are several pages long and that most people don't read. I'm talking about easy standards, standards that can be followed. In the real world, we have nutrition facts. We also have BMI.
- Sunny Liu
Person
My 11-year-old son loves candy, all kinds of candy. Usually, I think I will not have to fight with him over each candy he eats, or quarrel about that, if his BMI is okay. So can we build a digital BMI and digital nutrition facts, so that we can understand what information diet my kid has, or all kids have? Can we really create those kinds of benchmarks and measures?
- Sunny Liu
Person
Can we require platforms to calculate and report those indexes via dashboards that empower parents to make decisions? Researchers and policymakers would access that data for guideline refinement as well. And kids should be able to read those benchmarks too, so they can make their own decisions and feel that they have agency. Now, when we talk about parental controls today, we have those hundred and twenty hours and those 40 apps. Currently, all this parental control is like doing detective work.
- Sunny Liu
Person
We have to hunt around all those platforms and all those apps, all those buttons. On the other hand, if we want to buy a thousand-dollar phone, all we need to do is one click. Protecting our kids is far, far more important. Why can't we make it easier? Can we really have an easy-to-use, one-stop interface?
- Sunny Liu
Person
What if setting up parental controls were as effortless as online shopping? Just imagine one-click experiences: simple, intuitive, and seamless. Can platforms provide a single dashboard that really allows parents to enable settings that are age-appropriate and family-appropriate? And lastly, I think the most important thing to make all of this work is accountability. Can we incentivize platforms to really implement those principles?
- Sunny Liu
Person
Once we have awareness, transparency, standards, and streamlining, platforms can really be incentivized to follow those principles. And third parties will be important here. We should have third-party audits, so they can audit those systems and help platforms reach compliance. So, how can we improve parental controls to make them work for entire families? Fundamentally: build understanding and drive action, through awareness, transparency, standards, streamlining, and accountability. Those are things policymakers can consider when they discuss these complicated issues.
- Rebecca Bauer-Kahan
Legislator
Thank you. I really appreciate that, and I liked when you said "incentivize the companies," because one of the things, as there are, of course, discussions here and abroad about whether we should be banning children under 16 from social media, is: can we use such a policy to actually incentivize safe spaces? That's actually, I think, the real vision of those bans, that we say, fine, you can't be on these, but we've created incentives to create safe spaces online. I don't know if that would work, but it's a dream.
- Anneke Buffone
Person
Alright. I'll try to make it fast. Sorry, room, you're awake? You guys got it? Okay. Cool. Alright. Okay.
- Anneke Buffone
Person
Yeah. I definitely can't have people on their phones while I'm telling you all the cool things. So, no. Okay. So one thing that we haven't talked as much about is that teens really don't like it when they get watered-down experiences that don't work.
- Anneke Buffone
Person
So if it's like, oh yeah, here's this thing, but it's just not as cool and it's super safe, that loses the kiddos. So we need to make sure that whatever we're building is still, you know, interesting to them. And right now, we talked a lot about blind spots, but the biggest blind spot I think we have is that parents just can't tell what their kid is seeing and doing all day, can't get a really strong overview, and dashboards really don't cut it.
- Anneke Buffone
Person
Dashboards are great for engineers. They're really not that great for parents. So, I wanna have everyone here leave with a lot of hope, because I think this is absolutely solvable. This is not a problem that we can't solve. The capability is 100% within the tech companies today.
- Anneke Buffone
Person
We have really solid avenues now for age estimation, and they're getting better and better. We have more and more ability to sort content and to suppress content that is not age-appropriate. And I do have a question, which is: if a company is able to sort content, to give kids different experiences by age group, and have defaults, or certain things that parents can suppress, why can't there then be more choices? Like, why can't I then say, okay,
- Anneke Buffone
Person
But my child is more mature or less mature, or more sensitive, so I don't want them in that experience. They may be 16, but my child can't handle it. So if you've built the thing, why can't we have that choice built on top of it? I mean, that's one of my open questions.
- Anneke Buffone
Person
That, actually, is one of the reasons companies sometimes don't wanna build these things: they call it the open door, or the one-way door. So, anyway, we should have safety defaults. I'm not gonna repeat that, but, you know, it's just important for all the kids who don't have as much oversight. And, you know, I'm a quantitative researcher. I've been doing that for fifteen years.
- Anneke Buffone
Person
I really believe in evidence-based solutions. So what are the gaps? Open access: for some of the companies here today, if you go to any browser, you can get TikTok, you can get ChatGPT, with really no youth protections, and it's just important to call that out. So if, as a company, you have an open-access, just-get-on-the-browser policy, then I have questions about what your actual youth policies are.
- Anneke Buffone
Person
End-to-end encryption: there's really no business, I think, for children to be on anything with end-to-end encryption. It just makes it untraceable. If a child is in a chat with a predator and it has end-to-end encryption, not even law enforcement can help your child. So we should not allow that. And AI chatbot guardrails, again, we talked about this.
- Anneke Buffone
Person
A lot of them don't have the right controls. That is really dangerous. We all know this, right? We're all telling these things our lives. You tell them everything, right? Oh, I have a problem with my girlfriend. My mom's mad at me. I don't know, this isn't working for me.
- Anneke Buffone
Person
My boss is mean. All these things. So these apps know so much about us. It's scary. And now we have children, kids who have developing brains, who don't really have maturity yet, who don't have impulse control in the same way. And they start trusting. Would we put a child in a room with a total stranger who doesn't have a background check, tell them to share their whole lives, and have them take that stranger's advice? No.
- Anneke Buffone
Person
We wouldn't do that, right? So, ad targeting, data use, deepfake protections: there's a lot of things that we haven't really figured out yet. And the number one things we need are the age checks we talked about; minimal standards to make sure that kids are protected if the parents do nothing; then standardized controls, which the companies thankfully brought up themselves, and I think that's exactly right: the same symbol, the same functionality for the same type of apps, the same number of clicks, no account required; and then independent testing.
- Anneke Buffone
Person
This is a little too small, so I'm not gonna bore you with the details, but there is now a new technology where the way you swipe, the apps you download, all these different signals that are sort of unobtrusive, can be collected. What I really like about this is that you can do it at the device level, and that means you can do it every session, which helps with device sharing. In some lower-income families, the kids go online with shared iPhones, with shared devices, and we can't really keep them safe because the device might be registered to the parent, or to the grandparent, or to somebody else.
- Anneke Buffone
Person
And so if you do it every session, every time someone logs on, you have a much better chance to keep kids safe. And I think we absolutely should think about these solutions. Then, I'm very excited about the device law that came out, that Assemblymember Wicks has been pushing through. For that one, I think we should consider having the apps get the birth year and having the kids graduate in cohorts, because the problem is that it's very easy for these systems to still end up leaking the age to the apps, and it could be a privacy concern for some parents. But if the kids go in graduating cohorts, that problem could be fixed.
- Anneke Buffone
Person
And then, minimal standards. Very important. By default, there should be no public posting, no end-to-end encryption, no beauty filters. I don't think beauty filters should be there for anyone, personally, but there's no good thing that comes from a child being on an app with a beauty filter. It's just really terrible and toxic, for women and for men, frankly.
- Anneke Buffone
Person
We just shouldn't have those for children. Parental controls should be on by default, and standards should be on by default. Again, these things are possible. Platforms are doing it. They're using it for growth.
- Anneke Buffone
Person
They can use it for safety too. And then, my favorite: you know those year-end review things that you get from Spotify and Instagram every year? I think parents should get that every week for their child, just to get a broad overview. Not some creepy, privacy-invasive thing, but just an overview, just so that you can answer the question of what they are doing all day. It's already being built.
- Anneke Buffone
Person
Right? They build it every year. They can build it for this. So give parents the insights into what the kid's actually doing. And you can do these things in a way that still preserves maturing children's privacy.
- Anneke Buffone
Person
There can be certain carve-outs, like if there are certain things that the child is looking for. And, actually, I wrote up some ideas for what legislation could look like to have these carve-outs, so that we're keeping kids safe when they're looking for certain health things or whatever, where they might want and expect privacy, especially as they're growing older. And standardized parental controls, we talked about: the same pathing is important, the same icons. This is absolutely possible, and companies do need help, because it is very hard for companies to get into a conference room and decide on something they're all gonna do together. So the regulation helps.
- Anneke Buffone
Person
And then independent testing and verification, I can't emphasize this enough. I think we need to ask companies for the right data so that we know what's actually going on. One of the things I think is really important is what kinds of reports are actioned on versus not, and why not. Independent verification: really making sure that we have protected red teaming and safety testing, and that we have research that's protected.
- Anneke Buffone
Person
A lot of times, companies can actually shut down our research when we try to safety test, sort of pretending to be children to see if these systems work. As long as we're in that situation, we're not gonna be making meaningful progress, and that's really important to understand. And then intervention research as well. Ideally, we can test things out. Like the question we had today.
- Anneke Buffone
Person
Maybe we can test whether it's better to have a minimum age, period, or a minimum age plus a parental opt-out, where the parent really signs something, agrees, and assumes the oversight they're then taking on. And so, really, as long as we can get to a place where we can say, this is the baseline, this is what we're doing, is it getting better? I think we can. And I really like the idea of the standards body here.
- Anneke Buffone
Person
I think standards help everyone. They help us know that we're actually moving in the right direction, making sure fewer kids are harmed, and making standards better. I think for industry it's also better, because right now it's very easy to have one country's law saying that parents have to be able to read every message, and another country's law, like in Europe, saying that kids need privacy. And companies are global. So the more laws there are that conflict with each other, the harder it can get.
- Anneke Buffone
Person
And so if we can get to these standards, that are minimal, that we can sort of agree on, that are sound and reasonable, then industry might also appreciate that. But it might also actually help us get faster to a better standard for everyone that really protects families and kids. So, five actions. One: pass privacy-preserving age assurance. Then minimal standards by default that are on for all minors, and, very importantly, parental controls being on by default. It is such a broken system if you have to decide that you want to put controls on for every single app that you use. In a day, for myself, with four children between 17 and eight, there's no way I can be babysitting every single app someone is downloading and always remember, did I turn their controls on or not?
- Anneke Buffone
Person
It's just an impossible situation; they have to be on by default. Then the highlights reel, giving parents insight into what's actually happening on these different apps. Then standardized parental controls. With the device-level age assurance you already have, you can potentially use that law to make the apps ask the device's API what the controls should be. And then, if there are more controls, the app might be required to just set them on their app, which removes another action parents have to take. And funding independent testing, or even just allowing it and making the path for it even if there's no funding.
- Anneke Buffone
Person
That will help a lot of us doing the research. It'll help Holly, it'll help Sunny, it'll help me, to do the research that we need to do to keep supporting your work. And we really appreciate all of you, especially those of you that I've now woken up multiple times in the back of the room. Thanks for staying with us through all these different talks.
- Rebecca Bauer-Kahan
Legislator
And when you say safety standards on by default, how did you feel about the ability of kids and parents to turn them off, as described by the prior panel? Or no, they should just be on, no turning them off by anybody? Is that the answer?
- Anneke Buffone
Person
I mean, the thing is, for teens to unilaterally turn these things off, if the parent gets notified, that's one thing. But really, what's the point of oversight if someone can just turn it off? It doesn't really make that much sense to me. I think there is something to be said about why it's even designed that way. Because what other parenting decision works where kids could just opt out of it?
- Anneke Buffone
Person
I just don't know. The whole concept's a bit funny to me, honestly. I don't understand it. I don't know what you're...
- Sunny Liu
Person
Yeah. I think that, usually, we will not turn off seat belts in the car. But if kids have some medical issues and need further protections, we can make adjustments and have other devices to further protect children. I think kids and parents are all aligned: they want a healthy and safe environment. So if they have turned it off, does that really make kids safer, or even riskier?
- Sunny Liu
Person
I was thinking of building more ways to protect them.
- Rebecca Bauer-Kahan
Legislator
And I will tell you, with some of my older nieces, their parents said, okay, I'll turn them off now, you're old enough, I'm gonna give you a chance. And they said no. They wanted those parental safeguards, because they kinda knew they weren't ready to manage it on their own. And to your point, I have controls on my own phone to stop me, and I just click through them all the time. So I'm as bad as
- Rebecca Bauer-Kahan
Legislator
Anybody else. I really appreciate that; that was really helpful. I sort of made the point in the last panel that it all sounded great, but it isn't my lived experience. Anything you want to say regarding some of the parental controls we saw earlier, what you would do differently, or how they are or aren't working?
- Holly Grosshans
Person
Well, I think your point was made: clearly, they're not working. Clearly, there are loopholes or gaps, which is why so many parents are concerned about this and so many parents are not using these apps and these tools. I've had that same experience. The one app that my kids are able to use is YouTube, and we go down a deep rabbit hole every time they're on for more than thirty minutes. And I trust that Google is working on this.
- Holly Grosshans
Person
Right, I just don't think that they're there yet. As we heard from each of these groups, these platforms, they're working on this; I just don't think that it's there. I think there are holes and loopholes that the kids are still getting around.
- Rebecca Bauer-Kahan
Legislator
I really appreciated the point made also about independent auditing of the requirements. I've been a huge advocate of independent auditing for artificial intelligence, but we have not had the conversation around social media, in part because there are no enforceable standards, and so what do you audit to? But I think the question is a really good one: how do we get to a place where we have minimal standards and we're checking them, so that we know they're working?
- Holly Grosshans
Person
And I think the point that you made about the independent body, the regulator for the state of California, would be so helpful. We actually work with a survivor parent who tells the story of another survivor parent who reports to TikTok every single day, or maybe once a week, that the TikTok challenge is still online. She's constantly reporting that this video is still there, and it just keeps going back up. Things like that really need to be regulated. There needs to be a body for people to go to and report those things, because they're not being taken down right now.
- Rebecca Bauer-Kahan
Legislator
Yeah. Awesome. Well, thank you so much. We are actually at time, so I should adjourn us, but I really appreciate all of the expertise at this table and here today. I wanna reiterate my gratitude for the Hinxes, who had to leave, but who, like the many survivor parents I have spent time with, are what keeps me going every day. Last year, it was every single week: a different family walking into this room, telling us tragic stories about their children, and it was like, how do we continue to let this happen?
- Rebecca Bauer-Kahan
Legislator
And so we really wanted to take today to do a deep dive into what's happening, how it's failing, and how we can do better. I hope that's what we've accomplished today, and I wish I could fund all the research of independent experts, because I know we need more expertise in this space. And it's interesting: I don't know that this is a factually accurate statement, but I think this is the most bipartisan committee in this legislature.
- Rebecca Bauer-Kahan
Legislator
Okay, the Republican consultant. You know, parents of all walks of life care deeply about this issue and keeping their kids safe: Republicans, Democrats. And so it's a privilege to be able to lead these conversations, and hopefully it can make the bills that we move forward in the future, and the policy we make, better. So thank you all for being here.
- Rebecca Bauer-Kahan
Legislator
Yeah. And with that, we will turn to public comment. I don't know if anyone here wants to make public comment, but we will open it up for a minute each. If anybody does, come on up. We outlasted everybody, so there's only one.
- Evy Christian
Person
Hey there. It's Evy Christian, California policy director with ENCODE.AI. Kinda sad that our industry fellows left. But thank you so much, first of all, for facilitating this hearing today, and so good to see so many of you. ENCODE is a nonprofit advocacy organization dedicated to advancing AI policy that benefits the public interest.
- Evy Christian
Person
We've been very supportive of various approaches on the issue of online child safety. These approaches come from both Republicans and Democrats because, to the chair's point, this is a bipartisan issue. They represent a combination of different transparency requirements, third-party verification requirements, and specific deployment restrictions. To this end, ENCODE is among many advocacy organizations that have had significant concerns with recent industry-backed proposals here in California on this issue, including the Parents and Kids Safe AI Act. ENCODE believes that any child-safety legislation on AI chatbots or companion bots should maintain all avenues for legal recourse when children experience harm from AI, account for the real risks we know are currently impacting children, be proactive in preventing harm before it happens, and ensure that the public can be confident that developers have child safety in mind.
- Evy Christian
Person
The proposal mentioned, however, falls drastically short of those goals. ENCODE is among many organizations that have formally outlined a litany of concerns with that proposal, and we are also heartened that the legislature, namely the chair of this committee along with Assemblymember Wicks and Senator Padilla, is treating this policy issue with the rigor that it deserves. As evidenced by the hearing today, we really are encouraged that the legislature is continuing to work to get this right. Our children in the state of California deserve nothing less. Thank you so much.
- Rebecca Bauer-Kahan
Legislator
Thank you. Seeing and hearing no further public comment, we will adjourn the hearing. Thank you all.
No Bills Identified