Hearings

Assembly Standing Committee on Privacy and Consumer Protection

March 3, 2026
  • Rebecca Bauer-Kahan

    Legislator

I'll know exactly where you are. Good afternoon. This is a hearing of the Assembly Privacy and Consumer Protection Committee. The hearing is on California's privacy in the age of mass surveillance. And I want to start by thanking our panelists who are here today to participate in the hearing.

  • Rebecca Bauer-Kahan

    Legislator

    We're fortunate to have several leading experts in the field, including from four of our UC campuses, which always makes us proud, including Berkeley, Irvine, Davis and UCSF. So I want to thank Committee staff for their work, the Rules Committee, Sergeant's Office, Speaker's Office, and everyone who supports these hearings.

  • Rebecca Bauer-Kahan

    Legislator

This is an interesting time to be having this hearing. It was obviously scheduled before the news of last week around how generative AI and our large language models should be used as it relates to mass surveillance.

  • Rebecca Bauer-Kahan

    Legislator

And so this has been a hot topic and a real question of how government should be using our technology that is here now, today. And so I'm glad we're here to have this conversation.

  • Rebecca Bauer-Kahan

    Legislator

I, and I imagine everybody who is listening to this, frankly give away a lot of data every time we engage on our phones. The backgrounder, I think, did a really good job of this.

  • Rebecca Bauer-Kahan

    Legislator

We engage with smart devices, our phones; we walk past cameras that are in our neighbors' homes; and we give away information about ourselves that is now being aggregated in ways that are really, really powerful. There are currently over 4,000 data brokers profiting from the personal information they're aggregating.

  • Rebecca Bauer-Kahan

    Legislator

And they have collected data on over 98% of the population. One of the largest data brokers claims to have over 10,000 data attributes on over 2.5 billion people in more than 60 countries. People wanting information about us is not new; surveillance is not new.

  • Rebecca Bauer-Kahan

    Legislator

But the ability to gather the information and then put it all together using AI really puts us in a moment where the amount of information being collected on us and on our constituents is incredibly powerful, and frankly something we should be paying attention to as we decide how we want to proceed.

  • Rebecca Bauer-Kahan

    Legislator

As the backgrounder noted, California has some of the strongest privacy laws in the nation.

  • Rebecca Bauer-Kahan

    Legislator

And yet we should still be looking at ourselves and our policies and asking if they go far enough in today's day and age to protect privacy, which, and hopefully some of my Republican colleagues will join us, has been a bipartisan value on this Committee.

  • Rebecca Bauer-Kahan

    Legislator

So I want to note that today we're joined by Assemblymember Ortega, who just walked in. Perfect timing. It's like I made that happen. She is obviously a Member of the Privacy and Consumer Protection Committee, but I also want to note that she has many hats she'll be wearing today. She is the chair of the Labor Committee.

  • Rebecca Bauer-Kahan

    Legislator

And there is a small piece of this, as noted in the backgrounder, where we will take up the question of surveillance not just in the public sphere, but also in our workplaces. And she is also the vice chair of the Latino Caucus.

  • Rebecca Bauer-Kahan

    Legislator

    And I know that part of what many of us are concerned about as it relates to the question of surveillance is how it is affecting our undocumented communities in today's environment. And so I'm glad she's here to serve in all of those capacities. Thank you, Assemblymember.

  • Rebecca Bauer-Kahan

    Legislator

And then I'll note that Assemblymember Schultz and Assemblymember Carrillo were invited to join us and hopefully will: Assemblymember Schultz, who is chair of our Public Safety Committee, because this obviously touches on constitutional protections unique to public safety, and Assemblymember Carrillo, who's the other vice chair of our Latino Caucus. Today we have three panels.

  • Rebecca Bauer-Kahan

    Legislator

    We will save questions till the end of each panel, but would like to invite Assemblymember Ortega to make any opening remarks if you'd like.

  • Liz Ortega

    Legislator

    Thank you. First of all, thank you, Chair, for hosting this very important hearing today. As you mentioned, I do have a lot of hats in this building.

  • Liz Ortega

    Legislator

    And this issue is particularly important during this time where we're seeing additional surveillance of our Latino communities, immigrant communities, and how that's impacting the laws that we're passing to protect everyone in the state.

  • Liz Ortega

    Legislator

    So I am looking forward to today's testimony and looking forward to what comes out of this Committee and how we can use it to further protect everyone in the state of California and the country, given the state we're in with this federal Administration. So thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you, Assemblymember. And I don't know if Assembly Member Pellerin or Lowenthal would like to make any remarks.

  • Gail Pellerin

    Legislator

Just want to thank the folks who wrote the background paper. I read it last night. It was very haunting and kept me up at night. I know there's a lot of work that went into that, and I really appreciate it. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Mr. Lowenthal.

  • Josh Lowenthal

    Legislator

Just very briefly, I just want to say, you know, this is sadly becoming a kitchen table topic. It's ever present in the news right now. So I want to thank all of today's participants for doing this on behalf of all Californians, because people want to learn more, and so we need to be talking about it.

  • Josh Lowenthal

    Legislator

    Thank you.

  • Rebecca Bauer-Kahan

    Legislator

Thank you both. And with that we will turn to our first witness, Nicole Ozer, who's the Executive Director of the Center for Constitutional Democracy at UC Law San Francisco, which maybe is not a part of UCSF. I said UCSF. Are you technically a part of UCSF? No. You're separate. Okay, that's what I thought.

  • Rebecca Bauer-Kahan

    Legislator

The name change confuses things. She will be providing the historical context for our topic today. Ms. Ozer, when you're ready.

  • Nicole Ozer

    Person

Bring this closer. All right. Thank you so much for the opportunity to talk today about the California constitutional right to privacy. I'm Nicole Ozer, the Executive Director of the Center for Constitutional Democracy at UC Law San Francisco. I'm also a board member of the California Privacy Protection Agency, but I'm speaking today entirely in my personal capacity.

  • Nicole Ozer

    Person

I've been engaging in academic research on the foundations of California privacy law, specifically the history and intent of the California constitutional right to privacy, enacted in 1972, what I call our Golden State sword.

  • Nicole Ozer

    Person

I embarked on this research to inform how we can more powerfully utilize this fundamental right to privacy to protect against surveillance and advance privacy rights, justice, and democracy in the AI age. The California constitutional right to privacy was enacted more than 50 years ago, but we face some very similar circumstances today.

  • Nicole Ozer

    Person

    Both then and now, surveillance has been running rampant. Both then and now, we are at the cusp of massive technological change. For them, it was the rise of computerization. For us, it's the next stage of artificial intelligence.

  • Nicole Ozer

    Person

Both then and now, fights for the future of this country are happening across movement issues, from racial justice to reproductive justice, LGBTQ rights, and more. Both then and now, there are deep divides, and the fabric of American democracy and the rule of law is under assault.

  • Nicole Ozer

    Person

Whether technology helps or hinders depends on decisions about why and whether to build technology and how it can be used, decisions that are being made every day in companies, in our communities, in the courts, and by you as state leaders.

  • Nicole Ozer

    Person

    When we were last at this similar crossroads, in the early 1970s, Assembly Member Ken Cory spearheaded the passage of the California constitutional right to privacy. First, just to be clear, I'm going to make the slides work because they're behind you. We'll give it another try. Let's see. The tech is always the biggest problem in these privacy hearings.

  • Nicole Ozer

    Person

    There we go. That's the next slide. That's great. So first, just to be clear, we have multiple foundational privacy rights in our state constitution. We have the search and seizure clause of Article 1, Section 13, which is our state corollary to the Fourth Amendment.

  • Nicole Ozer

    Person

But then in 1972, the California Constitution was amended by the voters to add an additional explicit right to privacy to the inalienable rights of Article 1, Section 1. The two words "and privacy" were added at the end, and the word "men" was changed to "people."

  • Nicole Ozer

    Person

    This change of three words was a small but very mighty change, because with it, the inalienable rights in California were guaranteed to equally extend to all people of all ages. And California passed the first explicit right to privacy in the nation that applied to both government and business, not in any penumbras, but a clear and inalienable right.

  • Nicole Ozer

    Person

    This passage was motivated by the context of the times. It was a time of critical actions and social movements and a time of extremely heavy government surveillance. And many Californians knew it.

  • Nicole Ozer

    Person

In March 1971, just a few months before the constitutional amendment was introduced, a group of activists was able to expose the widespread secret surveillance of the FBI's COINTELPRO program. And there was also rampant surveillance of activists at the local level.

  • Nicole Ozer

    Person

    SFPD alone had 100,000 intelligence files by the end of 1973, at a time when San Francisco's population was just over 700,000 people. And many of these movement leaders and activists personally understood the dangers of surveillance and how advances in technology could exacerbate threats to rights and safety.

  • Nicole Ozer

    Person

It was for good reason that by 1972, the Black Panther Party's Ten-Point Program included an explicit provision, right there in point number 10, about people's community control of modern technology, in the same point as land, bread, housing, education, clothing, justice, and peace. Activists knew what was at stake with modern technology.

  • Nicole Ozer

    Person

    California was also the epicenter for computerization and the exponential growth of electronic technology. Californians knew better than anyone what was coming with technology and what it could mean for information collection and use.

  • Nicole Ozer

    Person

    The path to passage of the California Constitutional right to privacy started as Assembly Constitutional Amendment 51, authored by Assembly Member Ken Cory, a Democrat representing Orange County.

  • Nicole Ozer

    Person

    He was personally motivated to address privacy issues, including because, according to his daughter, he was aware he himself was under heavy surveillance by local law enforcement who wanted him out of office. He apparently would often remark that he must be the cleanest politician around because they never found anything they could use against him.

  • Nicole Ozer

    Person

And Assembly Member Cory included clear intent language in the legislative file. As you'll see here, this is actually his language from the legislative file talking about the intent of the constitutional right to privacy. And I will turn to that.

  • Nicole Ozer

    Person

    In the face of a cybernetics revolution and the increasingly pervasive amount of information being compiled, it would be highly desirable that our Constitution state in clear terms that each person has a fundamental right to privacy. This amendment will put the state and private firms on notice that the people have this fundamental right.

  • Nicole Ozer

    Person

And it can only be abridged when the public concern is an overriding concern, such as in court-ordered wiretapping. There was also the staff report of the Assembly Constitutional Amendments Committee, which also made it clear that this was about the technological revolution.

  • Nicole Ozer

    Person

ACA 51 moved forward with the required two-thirds vote of the Legislature, which put it on the ballot for the voters in November 1972. This is actually what voters saw when they went to the ballot that November to vote on Proposition 11.

  • Nicole Ozer

    Person

    The ballot argument itself is a document of legal beauty and elegance, explaining its objectives, purpose and scope with both urgency and precision.

  • Nicole Ozer

    Person

Officially authored by Assemblymember Ken Cory and Senator George Moscone, it actually had behind-the-scenes drafting support from law professor Anthony Amsterdam, then at Stanford before a long career at NYU, who at the time was the foremost Fourth Amendment expert in the country.

  • Nicole Ozer

    Person

It recognized that the combination of government, corporate, and new technological power was going to stack the deck against people's privacy and other rights in the digital age. It specifically noted that "the proliferation of government snooping and data collecting is threatening to destroy our traditional freedoms."

  • Nicole Ozer

    Person

"Government agencies seem to be competing to compile the most extensive sets of dossiers of American citizens. Computerization of records makes it possible to create cradle-to-grave profiles on every American." This was written in 1972; it's pretty amazing. And: "At present there are no effective restraints on the information activities of government and business."

  • Nicole Ozer

    Person

    This amendment creates a legal and enforceable right of privacy for every Californian. And it also made clear what the right to privacy meant. Here in California it said the right to privacy is the right to be left alone. It is fundamental and compelling.

  • Nicole Ozer

    Person

It protects our homes, our families, our thoughts, our emotions, our personalities, our freedom to associate. It prevents government and business from collecting and stockpiling unnecessary information about us and from misusing information gathered for one purpose in order to serve other purposes.

  • Nicole Ozer

    Person

And this is underlined in the original Prop 11: fundamental to our privacy is the ability to control circulation of personal information, which is essential to social relationships and all of our other rights. And it said that this right should be abridged only where there is a compelling public need.

  • Nicole Ozer

    Person

It created an inalienable, fundamental right to privacy for every person. Its scope safeguards both autonomy privacy, privacy of the body, and informational privacy. Its reach was to protect against both government and private parties, and to protect personal information broadly.

  • Nicole Ozer

    Person

And it gave power to the people: the burden is on the privacy invader to show the justification, not on the individual to have to show a reasonable expectation. Again, this was written by the foremost Fourth Amendment expert in the country. He was very familiar with a reasonable expectation of privacy.

  • Nicole Ozer

    Person

    And this was written in a way to make sure that they did not include that jurisprudence and really gave power to the people for it to be the burden on the privacy invader and not on the individual to try to protect their privacy.

  • Nicole Ozer

    Person

The California Supreme Court, when it first interpreted the constitutional right to privacy in 1975, also clearly understood it. It found that its moving force was a focused privacy concern relating to the accelerating encroachment on personal freedom and security caused by increased surveillance and data collection activity in contemporary society.

  • Nicole Ozer

    Person

And when the court heard this first case in 1975 on the reach of the right to privacy, it found a prima facie violation of the constitutional right when the LAPD surveilled UCLA students and professors.

  • Nicole Ozer

    Person

And White v. Davis also took care to highlight some principal mischiefs that the constitutional right was enacted to prevent: government snooping and secret gathering of personal information; overbroad collection and retention of unnecessary personal information; a lack of reasonable checks on the accuracy of existing records; and improper use of information properly obtained for one purpose, such as the disclosure of data to some third party.

  • Nicole Ozer

    Person

There were 20 years of robust work in both the Legislature and the courts to make these California rights real in practice: laws like the Information Practices Act of 1977 and many other privacy laws, and dozens of cases that moved through the California courts where privacy rights were enforced.

  • Nicole Ozer

    Person

    But social and political forces, including the war on drugs narrative, dealt profound blows to social justice generally and the trajectory of California privacy law. In the mid-1980s, the California Supreme Court went through political changes somewhat similar to what we're seeing in the United States Supreme Court now.

  • Nicole Ozer

    Person

In 1986, three California Supreme Court justices lost their retention elections after an aggressive campaign portraying them as soft on crime in the war-on-drugs era. These are the only three justices ever to have lost their retention elections in the history of the court.

  • Nicole Ozer

    Person

And after these election losses, the Governor, George Deukmejian, appointed three new justices. And it was by this time that this case, Hill v. NCAA, arrived. This was the case of first impression applying the constitutional right to privacy to a private defendant, brought in 1987 over random, suspicionless drug testing of NCAA athletes.

  • Nicole Ozer

    Person

This is an ACLU News article, because this was an ACLU case brought in 1987, and it was not decided by the California Supreme Court until 1994.

  • Nicole Ozer

    Person

It was this far different set of justices that deviated from 20 years of privacy precedent on the California constitutional right to privacy and kneecapped the ability of people to use this constitutional right to seek redress in court. They put a much higher burden on the plaintiff.

  • Nicole Ozer

    Person

    They created a different standard for being able to prevail on an autonomy and informational privacy claim. And there was an absolutely blistering dissent from one of the few justices that had long been on the court.

  • Nicole Ozer

    Person

Justice Stanley Mosk characterized the majority opinion as abrogating an express right that was made clear by the voters, and said it was not mere revisionism but actually creation out of nothing. And it was based on assumptions.

  • Nicole Ozer

    Person

Assumptions like that company power could not be that overreaching, or that people would not be locked into the use of a particular company, premises that were already wrong then and are so terribly wrong in the AI age. But the California constitutional right to privacy has been stuck in this Hill framework ever since 1994, right before the rise of surveillance capitalism.

  • Nicole Ozer

    Person

And many privacy claims have not been able to move forward in the courts. There are some new cases now moving in the California courts, and I hope that there is an opportunity for Hill to be properly reconsidered.

  • Nicole Ozer

    Person

But regardless of whether someone can move their constitutional right to privacy claim forward in the courts, as lawmakers, you can actually ensure that the fundamental privacy rights guaranteed to all people are honored. That law is all still good.

  • Nicole Ozer

    Person

And you can make sure that those rights that are guaranteed to all people in California are made real in practice through legislation that actually operationalizes the full extent of this constitutional right.

  • Nicole Ozer

    Person

Laws that truly operationalize the constitutional right to privacy are consistent with what is promised and guaranteed to every Californian in that right, and honor it as fundamental and inalienable: laws that actually prevent privacy violations, that protect all personal information, that protect all people, and that can be enforced by the people.

  • Nicole Ozer

    Person

When you're considering legislation, there are some questions you can ask about whether or not it truly operationalizes the right as intended. Does it allow a privacy violation to still happen, with just some guardrails on the edges? Does it put the burden on the individual rather than on the privacy invader?

  • Nicole Ozer

    Person

An individual who may have to take very difficult steps to try to protect their privacy? Does it perhaps only protect a narrow category of personal information rather than all personal information? Does it include exemptions that limit which Californians get this privacy protection?

  • Nicole Ozer

    Person

Does it allow people to be penalized for trying to use their privacy rights, like charging them more or downgrading their service? Or does it not enable people to actually enforce these rights on their own? The California constitutional right to privacy was both prescient and powerful.

  • Nicole Ozer

    Person

    And I hope that we can use it to its full extent as intended to protect people in these critical times. Thank you so much.

  • Rebecca Bauer-Kahan

    Legislator

Thank you so much. That was super helpful. And actually it reminds me of the book 1984, which was obviously incredibly prescient, as were our predecessors. Anyone have any questions for Professor Ozer? No, but I know you're going to hang out because more questions may come up.

  • Nicole Ozer

    Person

    Going through this history, it is just amazing that it was drafted and created in 1972. You know, I dug through the legislative history of both the passage of ACA 51 through the California Legislature and then the actual ballot measure as it moved, you know, onto the ballot.

  • Nicole Ozer

    Person

And it really just is an amazing conception of what could be, and a real attempt to make sure that people were protected. And, you know, many places in the country and in the world don't actually have the foundation that we have, and we've been able to build such important law on top of it.

  • Nicole Ozer

    Person

    But I think particularly in this moment, there's so much more potential to use it more extensively and to really, you know, focus on the full extent to which we could have legislation really operationalize those rights for people.

  • Rebecca Bauer-Kahan

    Legislator

And what was the composition of the Legislature at the time between the parties? They got a two-thirds vote. I imagine it was a much more divided Legislature than we see today.

  • Nicole Ozer

    Person

It was. And actually, Ken Cory was very popular on both sides of the aisle. When he passed away, there were quotes from both Republicans and Democrats. So as I say in my "Golden State Sword," he was definitely a legislator with some juice who was able to get that through.

  • Nicole Ozer

    Person

And when this was moving through the Legislature, there was support from both the right and the left, from sort of all segments. It passed with over 63% of the vote. And even in the legislative history, there's actually support from the city of Carmel for its passage. There's support from women's rights activists.

  • Nicole Ozer

    Person

There's support from civil rights organizations. So there was a widespread, diverse coalition that supported this, that felt these issues were important regardless of political stance and socioeconomic background. And when I read this material, it really just makes me proud to be in California.

  • Nicole Ozer

    Person

This isn't a new thing, California really being focused on making sure we balance the potential of technology with really protecting rights. This has been in the DNA of our entire state on these issues since the dawn of computerization.

  • Nicole Ozer

    Person

    I think it's something we can all be proud of, and it's also something that we can rely on more to be able to really understand what is guaranteed to all Californians and make sure that there's opportunities when we're drafting legislation to really operationalize that and support that and make those rights real for people.

  • Nicole Ozer

    Person

Obviously, people can go to court, but it's much more difficult now. Hopefully, that may change. One can hope. But, you know, as lawmakers, you don't have to wait for that. You can really take action now to see the opportunities and to really make those rights real in practice.

  • Rebecca Bauer-Kahan

    Legislator

Thank you. Yeah. And I think that part of what you're highlighting, which I appreciate, is that this has been a long-held value of the state and of the people, and it transcends all differences.

  • Rebecca Bauer-Kahan

    Legislator

You know, we sort of have this gut desire to maintain certain privacy in our lives, and that's something that we, as representatives of the people, need to be honest about and work hard to preserve. So I appreciate that. Okay, so we have no questions. We may come back to you, but we'll move to our second panel. And on that panel, we have three individuals.

  • Rebecca Bauer-Kahan

    Legislator

We have Deirdre Mulligan, professor in the School of Information at UC Berkeley, faculty director of the Berkeley Center for Law and Technology, and author of the book Privacy on the Ground: Driving Corporate Behavior in the United States and Europe; Josh Black, who is here and is a worker and organizer with the Amazon Teamsters; and Ari Ezra Waldman, professor of law and, by courtesy, professor of sociology at the University of California, Irvine School of Law, and author of several books, including Privacy as Trust: Information Privacy for an Information Age and Industry Unbound: The Inside Story of Privacy, Data, and Corporate Power.

  • Rebecca Bauer-Kahan

    Legislator

And we will start with Ms. Mulligan.

  • Deirdre Mulligan

    Person

    Thank you. Is this on? Yeah, yeah. You good? Great. Chair, Members of the Committee, thank you so much for the opportunity to testify today and for turning your attention to this

  • Deirdre Mulligan

    Person

increasingly important issue. In 2012, right, a lifetime ago now, Target's data science team built a model that predicted a teenage customer's pregnancy from purchasing patterns before the customer had told anyone, including her own family.

  • Deirdre Mulligan

    Person

    Her father received a targeted maternity coupon addressed to his teenage daughter, complained to the store, only later to learn that his daughter was in fact pregnant.

  • Rebecca Bauer-Kahan

    Legislator

    Can you move the mic just slightly closer so they're picking you up online? You want the whole world to hear what you say? Okay, yes, thank you.

  • Deirdre Mulligan

    Person

That example, now more than a decade old, involved a relatively simple computational model applied to a relatively narrow set of purchasing data.

  • Deirdre Mulligan

    Person

Three key changes over the past two decades have produced a new economic order, surveillance capitalism, in which experience itself has become the raw material for extraction, production, prediction, manipulation, and sale by and for private profit. First, the surveillance infrastructure has expanded.

  • Deirdre Mulligan

    Person

    In the 1990s, the birth of the commercial Internet gave rise to unique concerns about the consumer protection, privacy and security implications of the pervasive and persistent digital footprints created by every online interaction. Today, that pervasive data collection, once associated only with the Internet, permeates our physical environment. Our physical world is becoming instrumented.

  • Deirdre Mulligan

    Person

Brick-and-mortar stores, no longer satisfied with register data, deploy technology including eye-tracking software, facial recognition technology, and many others to monitor customers.

  • Deirdre Mulligan

    Person

During COVID, many stores moved away from accepting cash and the privacy-protecting features it affords, encouraging, and in some instances requiring, customers to pay with credit cards or touchless apps that reveal the purchaser's identity and store transaction data.

  • Deirdre Mulligan

    Person

    Workplaces and even workers are increasingly instrumented with sensors from key cards to video surveillance to eye tracking software to wearables that generate digital data about their every move.

  • Deirdre Mulligan

    Person

And the rise of AI-enabled digital assistants and connected devices, from watches to cars, provides companies with windows into homes across the country, offering them the opportunity for unprecedented access to information about both the mundane and the intimate domestic activities and concerns.

  • Deirdre Mulligan

    Person

    In addition, the near ubiquity of cell phones and smartphones combined with the rise of location based services has created a continuous stream of geospatial data, revealing people's movements through both public and private space. Surveillance is no longer episodic or confined to particular contexts. It is ambient and embedded in the architecture of our daily lives.

  • Deirdre Mulligan

    Person

Second, the variety and amount of data collected on individuals has grown. Companies that once collected purchase data and, in the digital environment, behavioral data now collect biometric data: fingerprints, facial geometry, retina scans, voice prints, gait, and DNA. The rise of social media has exposed individuals' networks of friends, co-workers, acquaintances, fans, and followers.

  • Deirdre Mulligan

    Person

These social graphs, both explicitly and implicitly, reveal information about the individual's interests, associations, beliefs, and facets of their identity. The rise of large language model chatbots has turbocharged the quantity and changed the quality of information individuals share with AI companies.

  • Deirdre Mulligan

    Person

    Short search queries have been replaced by long form questions and extended conversations, vastly increasing the amount of information shared with the private sector.

  • Deirdre Mulligan

    Person

    And as individuals turn to chatbots for emotional support, spiritual guidance, relationship counseling, legal advice, business advice, and, in a growing number of instances, intimacy, they are turning over reams of information about their follies, fantasies, neuroses, desires, hopes, and fears in ways that have no precedent in the history of commercial data collection.

  • Deirdre Mulligan

    Person

    Third, powerful advances in computation expand what private companies can glean from these troves of personal data. The machine learning and AI models available today can extract patterns and draw inferences about a wide range of topics from seemingly innocuous data. For example, health conditions or propensities can be inferred from non medical data generated far outside the medical context.

  • Deirdre Mulligan

    Person

    In the Target story I opened with, the company inferred pregnancy from the purchase of a few non pregnancy specific items.

  • Deirdre Mulligan

    Person

    But researchers have shown that more powerful computational techniques can make more startling and category-jumping inferences, including those that reveal attributes or conditions an individual has specifically withheld from others or is perhaps not yet aware of themselves. These advances in inference make it difficult for individuals to reason about the risks of disclosing information.

  • Deirdre Mulligan

    Person

    Taken together, these three changes challenge core assumptions of most US privacy laws. Existing privacy laws built around notice and consent assume individuals are aware of and have a say over the collection of their personal data. The infrastructuring of physical spaces has pushed data collection behind the scenes and often outside of individuals' control.

  • Deirdre Mulligan

    Person

    A person's presence in a physical space, whether a workplace, a public street, or a commercial store, routinely subjects them to surveillance, often without their knowledge and almost always without meaningful consent. In personal homes and on public streets, the Internet of other people's things extracts data from individuals based on other people's preferences and needs.

  • Deirdre Mulligan

    Person

    Surveillance has become the background condition of everyday life rather than an episodic event brought to an individual's attention and over which they might be afforded some control.

  • Deirdre Mulligan

    Person

    Existing privacy laws built around notice and consent at the moment of data collection assume individuals understand the risks posed by disclosing different kinds of personal information because they understand what it reveals about them. In effect, they understand the meaning. But individuals today don't understand what can be gleaned from their data.

  • Deirdre Mulligan

    Person

    The semantics of the data they disclose are not obvious to them and often not obvious until we put complicated computational processes on top of them.

  • Deirdre Mulligan

    Person

    Armed with sophisticated and powerful machine learning algorithms, companies and governments can draw powerful and compromising inferences from seemingly benign data, making it increasingly difficult for individuals to understand the meaning, let alone the risk of disclosing any piece of data.

  • Deirdre Mulligan

    Person

    In this asymmetric environment, individuals' ability to control who knows what about them cannot be fully protected by notice and consent mechanisms focused on data collection. Existing privacy laws generally assume that a person's data has implications for them, but not others. Privacy often operates at the individual level, but surveillance operates at the collective level.

  • Deirdre Mulligan

    Person

    It leverages individuals data to classify, make inferences and predictions, and manipulate not only them, but others like them or others related to them. Addressing the risks of surveillance capitalism requires strategies that operate and protect against risks posed by tranches of personal data, not just our own.

  • Deirdre Mulligan

    Person

    The rise of surveillance capitalism also lays bare the limits of viewing the stakes of sweeping population surveillance as solely a problem for individual privacy. Privacy is both an individual right and a public good.

  • Deirdre Mulligan

    Person

    It affords individuals the space and freedom to learn and grow, form intimate relationships, worship, and develop the independent opinions and perspectives essential to thriving democracies. It is a necessary bulwark against government and corporate power.

  • Deirdre Mulligan

    Person

    But as the speakers on these two panels will describe in detail, the harms flowing from surveillance capitalism go well beyond intrusions on individuals' privacy. Surveillance capitalism is fueling unfair practices in the marketplace and the workplace.

  • Deirdre Mulligan

    Person

    Companies use the surveillance infrastructure to generate individualized prices for goods and services aimed at maximizing the revenue they can eke out of each individual consumer. They are weaponizing the public's personal information to make them poorer, extracting California consumers' hard-earned wealth.

  • Deirdre Mulligan

    Person

    Companies use the surveillance infrastructure as a living lab, seeking new methods to shape consumers' desires and behaviors, stealing our attention, influencing our health and well-being and that of our children. Gone are the days when you were paid to be a Nielsen family or part of brand research.

  • Deirdre Mulligan

    Person

    Today the digital infrastructure allows companies to run continuous experiments in the wild without notice, consent or even your choice to participate.

  • Deirdre Mulligan

    Person

    Employers use the surveillance infrastructure to classify and manage workers, creating risks for workers' jobs, physical and mental health, privacy, and civil rights, and at times interfering with employees' ability to engage in collective bargaining and other protected activities.

  • Deirdre Mulligan

    Person

    And while this first panel is focused on surveillance capitalism, the private sector infrastructure and the pools of data and inferences it creates are of great interest to state actors both at home and abroad. And the walls between the public and private sector have become increasingly thin.

  • Deirdre Mulligan

    Person

    Governments, particularly those acting lawlessly, share the private sector's interest in keeping individuals' and populations' behaviors, associations, transactions, and identities articulated, measurable, mineable, and programmable. At this moment, as the public's data is being weaponized against them, we cannot ignore the risk private sector surveillance capitalism poses for the people as polity, not just as consumers.

  • Deirdre Mulligan

    Person

    Surveillance capitalism imperils our shared commitment to constraining our government from overreaching surveillance of the US population, and it also poses threats to our national security. Other panelists will describe the risk to individuals, physical and mental health and security posed by surveillance capitalism. But the rampant collection, sale and use of personal data also poses collective risks.

  • Deirdre Mulligan

    Person

    The connection between the availability of the American public's personal data and national security is a growing concern.

  • Deirdre Mulligan

    Person

    Access to Americans' personal data from data brokers and other sources increases the ability of countries and non-state actors to engage in a wide range of malicious activities that threaten national security, from coercion and manipulation of specific employees, to hacks into high-value government systems, to influence campaigns seeking to alter elections or sow domestic turmoil.

  • Deirdre Mulligan

    Person

    In addition, personal data linked to populations and locations associated with the Federal Government and even the state, including the military, can be used to reveal insights about those populations and locations of important assets that threaten national security. Surveillance capitalism, its infrastructure, the data it produces, the inferences it gleans, can provide strategic advantage to our enemies.

  • Deirdre Mulligan

    Person

    Put simply, surveillance capitalism can threaten not only individual privacy and security, but our collective safety and security. Protecting the collective public and social value of privacy requires policymakers to refocus on addressing the collective harms, as well as the individual ones, that flow from these surveillance practices.

  • Deirdre Mulligan

    Person

    These harms extend beyond those construed as privacy, too often reduced to information control, as they rest in the power of personal data, in identifiable, aggregate, and even de-identified forms, to shape markets, preferences, and societies.

  • Deirdre Mulligan

    Person

    Public policy has to contend with the role data and algorithms play in actively mediating and normalizing the discourses and social conditions against which decisions about distributions of power, resources and opportunities take place. I offer a few brief recommendations to move us in this direction. First, focus on institutions and professionals, not just new rules.

  • Deirdre Mulligan

    Person

    Our existing regulatory and enforcement agencies are understaffed and under resourced to meet the moment.

  • Deirdre Mulligan

    Person

    Consumers are facing a raft of data extraction and surveillance practices, and our regulatory and oversight agencies need influxes of sophisticated technical, economic, and social science experts to ensure that agencies have the cross-disciplinary expertise to understand the threats and the best ways to address them.

  • Deirdre Mulligan

    Person

    Second, legislators must approach the regulation of personal data with the full range of human rights, consumer protection, competition, health, safety and security issues in view.

  • Deirdre Mulligan

    Person

    It has to extend beyond notice and consent at collection towards setting real limits on collection, setting real limits on flows across contexts and uses that undermine the health, well being and economic position of the public.

  • Deirdre Mulligan

    Person

    Third, proposals should have concrete and forward minded definitions and clear pathways for enforcement, especially given how the asymmetry between individuals and industry and government challenges individuals understanding and capacity to address the harms they face.

  • Deirdre Mulligan

    Person

    Finally, regulatory approaches that place the burden on consumers are impractical given the scale, intangibility and ubiquity of the surveillance practices in the current economy.

  • Deirdre Mulligan

    Person

    As Principal Deputy Chief Technology Officer in the White House Office of Science and Technology Policy during the Biden Harris Administration, I had the honor of helping to ensure that how we used and regulated technology centered people's rights and freedoms, advanced equity, and adhered to our fundamental obligations to ensure fair and impartial justice for all.

  • Deirdre Mulligan

    Person

    Today, the world is looking at California. They're looking for the Legislature, the Governor to lead the way. How the state uses, refuses and regulates technology, including the massive infrastructure of surveillance, is a key way the state will continue to manifest its values, including its commitment to privacy. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you so much, Professor Mulligan. And I want to note that we've had two more Committee Members join us. So Assemblymember McKinnor is here and I actually think the diversity of this Committee is one of its strengths.

  • Rebecca Bauer-Kahan

    Legislator

    Assembly Member McKinnor has been a leader on criminal justice reform, and I know this touches deeply on communities she cares about and how they will be surveilled and impacted. So thank you for being here to represent. And also Assemblymember Ward, who is the Chair of the LGBTQ Caucus.

  • Rebecca Bauer-Kahan

    Legislator

    I always get Chair and Vice Chair wrong; he is Chair of the LGBTQ Caucus as well as a Member of the Committee. And I know that through the hearing we will be touching on the impact of surveillance on the LGBTQ community, especially at this moment in history. So I'm glad he's here to join us as well.

  • Rebecca Bauer-Kahan

    Legislator

    And with that, we will turn it to Mr. Black. Thank you.

  • Josh Black

    Person

    Thank you. Good afternoon. First, I'd like to say thank you to the Chair and the other Members of the Committee for inviting me to speak this afternoon. My name is Josh Black and I have worked for Amazon at a delivery station warehouse in San Francisco called DCK6 for the past three years.

  • Josh Black

    Person

    About two years ago, I joined with my DCK6 coworkers in coordination with the International Brotherhood of Teamsters and began organizing for better pay and benefits, safer conditions and respect in our workplace. In October of 2024, we went public and demanded union recognition from management.

  • Josh Black

    Person

    After a supermajority of workers at our warehouse signed cards saying they wanted representation as Teamsters, management has so far refused to negotiate with us.

  • Josh Black

    Person

    And a few months after we went public, we participated in a nationwide strike against Amazon just before Christmas of 2024, which was the largest strike in Amazon's history to date. At our warehouse, we held a continuous picket line in front of the warehouse for 48 hours straight.

  • Josh Black

    Person

    But today I'm here to talk to you about how Amazon uses surveillance of its employees for profit, both in general terms and more specifically in our case, as a way to counter our organizing efforts so they can continue to pay starvation wages and increase productivity to dangerous levels and grind more revenue out of our labor.

  • Josh Black

    Person

    This is an image of a Zebra mobile computer. It's just a little bit larger than a modern smartphone that you have in your pocket right now, probably. Most associates are required to wear this device on their arm on any given day while they're working.

  • Josh Black

    Person

    It lets us scan packages, bags and carts as they move through the sorting and staging process.

  • Josh Black

    Person

    It also tracks our locations precisely throughout the warehouse so Amazon Management knows exactly when we decide to go use the restroom or when we go to get a drink of water or stray too far from our assigned location in the warehouse for any other reason.

  • Josh Black

    Person

    In addition to this, this device is continually sending what they call a TOT report to management, which stands for time off task, so they are alerted whenever an associate takes too long between their previous scan and their current scan.

  • Josh Black

    Person

    Despite the fact that there are multiple valid reasons we might need to take a short gap in our work, management verbally tells us that we can take short breaks when we need to use the restroom, get a drink of water, or stretch.

  • Josh Black

    Person

    That's because one of the most common injuries in the warehouse is repetitive stress injury, which occurs when people work too long without stretching. In reality, we find that triggering the time off task report often results in receiving a visit from a manager, even if we were doing it for a valid reason.

  • Josh Black

    Person

    So we often delay or avoid entirely these important health maintenance activities like bathroom water, stretching out of a sense that we are being monitored and wanting to stay off the radar of management on these devices. We also regularly receive digital trainings that we're required to complete.

  • Josh Black

    Person

    They'll pop up in the middle of working, and we need to complete them before we can scan our next package. These often cover important policy updates, safety procedures, and other knowledge we need to properly do our jobs.

  • Josh Black

    Person

    But because we live with this fear that spending too long on a training is going to trigger this time off task report, associates routinely rush through these trainings as quickly as possible, just paging through the trainings without even reading the text.

  • Josh Black

    Person

    This, I believe, leads to a less safe working environment with associates lacking the basic training on safety, and it gives management cover to discipline associates for safety violations since they now have the digital proof that we've completed the pertinent trainings even if they were done in a hurry. These devices are also frequently old and in disrepair.

  • Josh Black

    Person

    One associate recently told me about an instance when she walked across the warehouse to get a replacement for a device that wasn't working and then got back to her aisle and found the second device wasn't working.

  • Josh Black

    Person

    She repeated this three times, and after she finally found a working device and returned to her aisle, a manager found her and reprimanded her, of course, for TOT, time off task. So instead of ensuring that up-to-date and functional devices are available to employees, management has consistently put the blame on us for their shortcomings.

  • Josh Black

    Person

    In 2022, the state of California enacted Assembly Bill 701 which stated that an employee shall not be required to meet a quota that prevents compliance with meal or rest periods, use of bathroom facilities, including reasonable travel time to and from bathroom facilities or occupational health and safety laws in the Labor Code or division standards.

  • Josh Black

    Person

    And although Amazon claims to not rely on any quotas within California, they have used this time off task enforcement to clearly circumvent the spirit of this law.

  • Josh Black

    Person

    They have harassed and disciplined employees for reasonable time away from scanning, and they have also enforced this TOT policy selectively, allowing management to target slower-performing employees while letting top performers' time off task slide, essentially creating a de facto quota system in everything but name.

  • Josh Black

    Person

    This surveillance and intimidation, I should add, also tends to target people with disabilities and older workers more than others. In the late 18th century, Jeremy Bentham invented the concept of the Panopticon prison.

  • Josh Black

    Person

    This layout later became standard around the world, with a circular watchtower in the center that prisoners could not see into and the cells arranged in a circle looking inward toward the tower.

  • Josh Black

    Person

    This design drastically changed the prisoners behavior because they were never sure when or if they were being watched and thus had to behave at all times as though they were. Amazon uses the principles of the Panopticon both inside and outside the warehouse.

  • Josh Black

    Person

    Although we are aware cameras are hidden throughout the warehouse, including in break rooms and non work areas, we don't know where they are or when somebody might be using them to watch us.

  • Josh Black

    Person

    When attempting to discuss union issues on the shop floor or even in break rooms, my coworkers have frequently expressed a fear not just that management may be watching us on hidden cameras, but they may be listening to us on hidden microphones, either in those zebra devices or in the walls of the warehouse itself.

  • Josh Black

    Person

    Most, if not all employees in our warehouse have also downloaded the Amazon A to Z employee app to their smartphones. Though not technically required, Amazon has made it the only reasonable way to stay up to date on scheduling changes, overtime and time off requests and opportunities, leave of absence and accommodation requests, et cetera.

  • Josh Black

    Person

    Many employees also believe the app listens to our conversations and/or may be tracking our locations via our phones 24 hours a day.

  • Josh Black

    Person

    And I want to be clear that I have no hard evidence to suggest Amazon is recording audio either on or off site, or monitoring our location outside of the warehouse, which would of course be illegal under California law.

  • Josh Black

    Person

    But the power of the Panopticon is that it gives people reason to believe they are being monitored at all times, whether or not that is actually true.

  • Josh Black

    Person

    This, combined with Amazon pushing strong anti union messaging in the warehouse and even firing two of our union supporters, has contributed to a fear of meeting with union supporters and discussing organizing even outside of the warehouse. So in addition to all of this high tech surveillance, we're also subject to very low tech surveillance.

  • Josh Black

    Person

    It's the same way management has been surveilling workers for years. Since we went public, management has started finding excuses to hang out in our employee break room during breaks, clearly monitoring us for any union conversations and activity.

  • Josh Black

    Person

    When we set up a Teamsters information table on public property right outside the warehouse, management will, each time we do this, find an excuse to come outside the warehouse and monitor who is staffing the table and which employees are stopping to talk to us.

  • Josh Black

    Person

    For example, last time we were tabling out there, two managers came to take trash out to a nearby dumpster. To be clear, our warehouse employs a full janitorial staff of full-time workers who work very hard to keep the warehouse extremely clean, including regularly emptying the trash cans.

  • Josh Black

    Person

    In no reality would it ever be a manager's responsibility to empty the trash cans in the warehouse, much less for two of them to do a job that could easily be done by one person.

  • Josh Black

    Person

    So it seemed very obvious that they wanted us to see them there so anyone at the table would be intimidated and know that they were being watched.

  • Josh Black

    Person

    And as oppressive as the surveillance of workers inside the warehouse is, I need to touch on the extreme surveillance that Amazon drivers are subjected to in their daily work, which I think is even worse. Unlike UPS drivers, who have a union contract requiring that their cameras only face outward, Amazon drivers have a camera pointed at their faces all day while driving.

  • Josh Black

    Person

    The AI enabled driver eye cameras, which are produced by Netradyne, allegedly encourage safe driving by monitoring drivers for distraction and fatigue. However, drivers frequently report the app appears to randomly report them as distracted, even when they are staring alertly straight ahead at the road.

  • Josh Black

    Person

    And particularly in urban environments like San Francisco, stop and go traffic and merging from a parking spot into heavy traffic are often reported as unsafe driving. One driver reported they were called into two separate meetings with two separate managers due to a single yawn caught on camera.

  • Josh Black

    Person

    Although yawning is traditionally thought of as a sign of sleepiness or fatigue, scientists have found yawning, of course, has many causes and actually increases our heart rate and level of alertness.

  • Josh Black

    Person

    Yawning is an involuntary reflex, just like a sneeze or cough, and up to 20 yawns per day for even a well rested person is considered normal by doctors.

  • Josh Black

    Person

    But unfortunately, Amazon has repeatedly shown through their use of AI and surveillance that they are looking for employees to be closer to robots than human beings and even our basic bodily functions are disciplined.

  • Josh Black

    Person

    I should also point out that despite this extreme level of surveillance and control exercised on drivers by Amazon, Amazon continues to claim that these drivers are not in fact Amazon employees at all, but employees of third party delivery service partners, all while they wear Amazon uniforms, drive Amazon branded trucks, and can be disciplined under Amazon's intense surveillance and monitoring policies.

  • Josh Black

    Person

    So, getting back to the theme of this panel, all the forms of surveillance I've discussed have one ultimate goal for Amazon: increasing profits for their shareholders and top executives by increasing the productivity of us workers without increasing pay.

  • Josh Black

    Person

    While Amazon's profits soared to $77.7 billion in 2025 and their market cap is now over $2.25 trillion, Amazon's greed seems to prevent them from negotiating with us for fair wages, even while many of their employees are on public assistance or even homeless.

  • Josh Black

    Person

    Instead, they use intimidation and surveillance tactics to push for unsafe levels of productivity and suppress our organizing efforts. Despite all of their efforts, employees across the country are stepping up and demanding recognition and accountability. And we believe we're on the right path to force Amazon to come to the bargaining table soon.

  • Josh Black

    Person

    And thank you again for your time and for your interest in this important topic.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. Mr. Black, you got some snaps. And again, we will have questions at the end of our third panelist. Mr. Waldman, your turn.

  • Ari Waldman

    Person

    Great, thank you.

  • Ari Waldman

    Person

    Chairperson Bauer Kahan, Members of the Committee and with apologies to the people who are at my back, today's information hearing is a crucial step toward building privacy law in true California fashion, namely, in ways that recognize, account for, and address the fact that some people, particularly those most marginalized in our society, may have both a special need for privacy protection and are more likely to face disproportionate harm from surveillance.

  • Ari Waldman

    Person

    My name is Ari Waldman and I'm a professor of law at the University of California at Irvine. Much of my research is at the intersection of information technology, privacy, and queer civil rights. So thank you for inviting me here today to discuss these matters with you.

  • Ari Waldman

    Person

    Thank you also for including in this hearing people like Mr. Black, whose direct experience of over surveillance in service of someone else's profit in an oligarchic, informational capitalistic economy hammers home both the need for stronger privacy protections for our workers and the point that I want to make today, which is that the burdens of surveillance are not shared equally.

  • Ari Waldman

    Person

    Those of us who study privacy all know a few famous stories about how tech companies' voracious and endless appetite for engagement and advertising dollars has undermined the privacy of the most vulnerable among us. The Target pregnancy example is just one, but a really great example, so I'm glad Professor Mulligan offered it earlier in this hearing.

  • Ari Waldman

    Person

    In addition to the Target example, there are innumerable stories of Facebook outing its users as queer to their friends and families and anyone else in their networks. As much as Meta's executives protest in public that they fixed this problem, the number of calls I get each year about this suggests that they haven't fixed much of anything.

  • Ari Waldman

    Person

    Facebook still likes to get users to engage by showing them how their friends have engaged. I wrote part of my doctoral dissertation on this point. Doing so manufactures a kind of social trust that triggers a cascade of engagement for others. If I know my friend did something or liked something, I'm more likely to do it as well.

  • Ari Waldman

    Person

    In any case, Facebook routinely does this for affinity groups or products or businesses or many other things that indicate queerness, essentially outing many queer adolescents to the people who follow them: their parents, their grandparents, their teachers, whomever else. And there are darker examples of data-driven harms.

  • Ari Waldman

    Person

    Online harassment, particularly of women, is facilitated by a business model that prioritizes maximal data collection for maximal engagement. Data-driven algorithms feed users dangerous messages that push some people over the edge to severe depression and suicide. Of course, this can happen to anyone, and certainly it does.

  • Ari Waldman

    Person

    But queer kids, adolescent girls, marginalized boys, those living with disabilities, and many more are simply targeted more often.

  • Ari Waldman

    Person

    Then there are stories we hear less about in the media, and maybe that's because they happen so often that people don't think they're news, or maybe because they don't fit the narrative that a newspaper owned by Amazon's oligarchic boss wants us to hear.

  • Ari Waldman

    Person

    These are stories about, say, Joan, which is a pseudonym, a 19 year old whose data was used by a dating website to target her with men who abused her.

  • Ari Waldman

    Person

    There's Matthew, who was told by a dating app that there was nothing that the company could do to stop someone from impersonating him and sending strangers to his home and place of business looking for sex.

  • Ari Waldman

    Person

    In several of those incidents, the impersonator assured the person they were talking to that when they came over for sex, they shouldn't stop even if they (meaning Matthew) objected or protested or said no, because they wanted to explore fantasies about sexual assault. And the company said that it could do nothing about this.

  • Ari Waldman

    Person

    And then there's Stephanie, who was murdered after being stalked by someone who got all the information he needed about her from a data broker. And Mary, whose geolocation history was sent by Google to a state Attorney General investigating her and her mother for going out of state to get an abortion. So let's drill home on this point.

  • Ari Waldman

    Person

    As Professor Mulligan said, the data collected by private companies for capitalistic surveillance does not simply remain on the servers of those private companies, they share it with other companies and they also share it with the government.

  • Ari Waldman

    Person

    Allowing data extraction behavior in the for-profit world gives hostile governments access to the data they need to punish people they simply do not like. This is what's happening to transgender people and their families in Texas and elsewhere.

  • Ari Waldman

    Person

    This is what's happening to some women seeking to exercise the rights they still have when they travel to pro-choice states like California to terminate their unwanted or dangerous pregnancies. Just today, Joseph Cox at 404 Media reported that ICE and CBP used location data sourced from online advertisers to track the phone locations of people they wanted to arrest.

  • Ari Waldman

    Person

    The ease with which the government can access privately collected data, and the eagerness so many oligarchs have shown to partner with this particular regime in Washington, demonstrate that data collection by an OpenAI or a Meta can sometimes be little different from data collection by the Trump Administration.

  • Ari Waldman

    Person

    The harms of this kind of data extraction may seem obvious to us, but allow me to highlight just a few. Data extraction, driven by a pathological need for engagement, strips us of our autonomy. It treats us as merely means to someone else's end.

  • Ari Waldman

    Person

    It reduces us to numbers and metrics that see humans as merely fitting into categories like romance novel-reading urbanites or Christian serial daters, or susceptible to splurge purchases. All of which, by the way, are real categories.

  • Ari Waldman

    Person

    These harms metastasize for the most marginalized among us, who have already been told by society that they are less than, or less deserving of protection. We see it every day: deaths by suicide in the queer community, the harassment, the microtargeting, the misinformation, and other real social harms. California needs to take a different approach.

  • Ari Waldman

    Person

    We need an aggressive government regulator that is going to do more than require companies to draft an impact assessment and file it away in some cabinet, digital or otherwise, or pay a meager fine and simply offer consumers an opt-out.

  • Ari Waldman

    Person

    CalMatters just reported today on the company PlayOn, which lied to students and parents and forced them to provide personal information before they could buy tickets to their high school and college sports events.

  • Ari Waldman

    Person

    And the only thing the regulator could do, for various reasons, in part because the law hamstrung them and in part because they have insufficient resources, was fine the company $1,000,000, a tiny fraction of its annual revenue, and require an opt-out. We know opt-outs don't work.

  • Ari Waldman

    Person

    We know that there's very little that's going to change as a result. Instead we need things like ongoing monitoring of corporate adherence to the civil right of privacy, where data extractive products are limited in what they can collect, where personal data is not used to turn users into depressed, isolated and suicidal addicts.

  • Ari Waldman

    Person

    We need to make it harder for government to gain access to privately collected data. We need to hold companies accountable when their products have disparate impacts on minoritized populations.

  • Ari Waldman

    Person

    So as my time comes to an end here, I look forward to discussing even more specific steps in the Q and A that we can take to be a leader at protecting the privacy in California of its most marginalized citizens. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. I think you've overwhelmed all of us, which is just the reality of the world we live in. But I don't know, Assembly Member Pellerin, do you want to start or would you like me to start? Okay, Assembly Member McKinnor wants to start, but your mic.

  • Tina McKinnor

    Legislator

    Hello. Thank you all so much for coming and speaking with us today, and thank you to the chair for bringing this forward. This is some really scary stuff. I want to ask the fellow who works for Amazon: are the managers surveilled? Do they do surveillance on the managers, or just the workers?

  • Josh Black

    Person

    Do they do surveillance of the managers? No, for the most part. The managers would never be carrying these same devices. So the managers are the ones doing the surveillance for the most part. I mean, there might be higher level stuff I'm not familiar with, but to my knowledge it's mostly coming from them.

  • Unidentified Speaker

    Person

    They probably yawn a lot.

  • Josh Black

    Person

    Yeah.

  • Tina McKinnor

    Legislator

    And do we know, like, how long do they keep this data that they're collecting?

  • Josh Black

    Person

    No, I'm not sure and I don't know if they're able, if they're willing to share that with us.

  • Tina McKinnor

    Legislator

    Last question about the employees. Are you guys seeing more accidents, like because they're working faster or they're just, they're scared to go to the bathroom?

  • Josh Black

    Person

    Yes, there's definitely lots of injuries. The other thing I didn't really have a chance to touch on is that I believe Amazon severely under reports the injuries they have.

  • Josh Black

    Person

    Often if you go to report an injury to a manager, they'll say, oh, well, why don't you go home and sleep it off, and basically tell you you won't get in trouble for missing work if you just go home right now rather than reporting it through the proper channels.

  • Josh Black

    Person

    And for example, there's a required posting that they put up on the wall that's supposed to show all of the injuries that were reported to OSHA in the last year. It specifically said chemical burns zero.

  • Josh Black

    Person

    And I know a woman I work with who had chemical burns on her hands working in our hazardous materials Department, repacking, you know, leaking hazardous materials. So.

  • Tina McKinnor

    Legislator

    And then for the young ladies sitting next to you. You mentioned cash and cards. All of a sudden we run into companies that say, we don't take cash. So do you think requiring companies to give customers the option to use cash at all times would help?

  • Deirdre Mulligan

    Person

    I think there are incredibly important equity reasons as well as privacy reasons to make sure that businesses accept legal tender. And certainly cash is fungible. It does not tie you to a location, it does not tie you to a particular purchase. So it undermines the surveillance ecosystem.

  • Deirdre Mulligan

    Person

    Cameras, et cetera, might still put you there, but it keeps certain bits of information outside the data broker's hands, outside the surveillance infrastructure. But I also think requiring businesses to accept cash is incredibly important for people who are unbanked, which often includes people who are lower income, people who might be unhoused.

  • Deirdre Mulligan

    Person

    And so I think that there are many reasons why ensuring that people can go into a store and continue to use cash is an important component of making sure that people don't fall further through the safety net.

  • Tina McKinnor

    Legislator

    And is there a difference? When you write a check, are they still tracking you if you wrote a check? Because people don't even write checks anymore. But.

  • Deirdre Mulligan

    Person

    Certainly banks and the clearing infrastructure see it. But it's not the same sort of data collection, the network of data collection, that happens when you use different kinds of digital payment apps. Checks may not be feeding into exactly the same infrastructure, and there are greater privacy protections for that transactional data. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    I will say, just for the Members here. So Professor Mulligan has a hard stop at 3 o'clock, I believe. So if we want to ask her questions first. No, you're fine. But I just thought we could address questions to her first since she has to leave if you want to.

  • Gail Pellerin

    Legislator

    I have a question for you, Professor Mulligan, is that right? So can you explain the notice and consent approach and why that's inadequate for protecting privacy? Sure.

  • Deirdre Mulligan

    Person

    So notice and consent typically operates either at the point when somebody is signing up for a service or if we're lucky, at the time that somebody's making a decision to disclose a particular piece of information.

  • Deirdre Mulligan

    Person

    There are three reasons why that doesn't address some of the ills that flow from the surveillance capitalism people live within today. First, the goal is to provide individuals with control over, the term is, who knows what about them. But data that is collected

  • Deirdre Mulligan

    Person

    from you might feel quite innocuous, while on the back end a company may be appending lots of other information, because it has information about transactions you've made in lots of other places. It also has powerful computational techniques that it

  • Deirdre Mulligan

    Person

    uses to mine that information. So, for example, in that simple Target pregnancy context, my understanding of the meaning of the information I'm revealing and what it actually reveals about me may be starkly different. And so people's ability to assess the impact that data is going to have on their lives is really limited.

  • Deirdre Mulligan

    Person

    Second, with notice and consent, people often make decisions before they really understand the totality of data that might be collected about them. So they often don't have the information. And it's not that notice and consent isn't important; it's that it's insufficient.

  • Deirdre Mulligan

    Person

    And then finally, much of our physical world is instrumented. How many of you have gone to a friend's house and noticed that they have Alexas and all sorts of other devices collecting information in the background? Our physical environments are increasingly instrumented by other people.

  • Deirdre Mulligan

    Person

    And so they are collecting information about us that we are not aware of, let alone have any ability to control. And so given that extraction that's happening without visibility or our control, really relying solely on that kind of individual notice and consent model just doesn't address the totality of the challenges that we face.

  • Gail Pellerin

    Legislator

    Oh, that makes perfect sense. And how would you respond to the argument that if they opt into sharing, it's the consumer's choice?

  • Deirdre Mulligan

    Person

    So certainly individuals do need to have control over decisions about to whom they disclose their information. In many contexts, that's appropriate. Right. But often individuals don't really understand the implications. Right. So there's an assumption that there's a whole lot of knowledge that people have about how that information is going to circulate and be used.

  • Deirdre Mulligan

    Person

    That might not really pan out in practice. I think Professor Waldman provided us with a whole host of examples of how individuals were choosing to share information with their friends on social media, but it comes back and haunts them in other ways; they probably would have limited sharing to certain audiences if they actually had that sort of functionality.

  • Gail Pellerin

    Legislator

    No, it makes sense. Yeah. I appreciate it. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    And I have one follow up to that, which is... So I do think that the data broker piece of this is really critical and that the ability to amass information from all these places and put it together to create this profile of us is actually somewhat at the heart of what is concerning. And then again to share.

  • Rebecca Bauer-Kahan

    Legislator

    That's, I think, where a lot of this sharing is probably happening for the government. But correct me if I'm wrong. And so, you know, how do you think our laws around the right to delete, the DROP program that we've rolled out, the data broker laws. How far do you think they go in helping to address that part of the ecosystem?

  • Deirdre Mulligan

    Person

    So I think they are crucially important steps. And I also believe that, for example, the CFPB, may it come back to life, was using the Fair Credit Reporting Act to try to bring the activities of data brokers under control in ways that were more consistent with our privacy and civil rights commitments.

  • Deirdre Mulligan

    Person

    So I think that there are many different things that we need to do. Professor Waldman mentioned increasing constraints on government access to certain kinds of data. That can be very important when we think about the repurposing of data, particularly in an environment where people's data is being weaponized against them in really unexpected and undemocratic ways.

  • Deirdre Mulligan

    Person

    We can also think about, and when I was in the administration we did, the national security risks, and try to set some limits on the kinds of data that are trafficked and sold about the population because of the security risks that data poses. So this is like an all hands...

  • Rebecca Bauer-Kahan

    Legislator

    You mean to our foreign adversaries?

  • Deirdre Mulligan

    Person

    I do mean to foreign adversaries, but not only.

  • Rebecca Bauer-Kahan

    Legislator

    Okay, interesting. Any other questions for Professor Mulligan before she has to leave? Yeah. And then we can of course have more questions for the panelists.

  • Chris Ward

    Legislator

    Thank you. Obviously we're absorbing some pretty alarming information, though not unlike what we've already discussed in this committee. And we've done a lot of our own research as we're developing our bill interests. I guess this is for Professor Mulligan, since you do have to run.

  • Chris Ward

    Legislator

    Are you aware of any decent repository of information, either national or state, or maybe through a nonprofit or an academic setting, that is trying to consolidate a lot of this nefarious activity? And the reason I ask is because, yes.

  • Chris Ward

    Legislator

    I mean we hear all the time horrible one off examples and we would believe that these could be reflective of a much greater use. And it would just be helpful to start to try to understand. But we don't know what we don't know. Right. They know. Right.

  • Chris Ward

    Legislator

    They know what they're using data for, but we just don't even know where to start. Consumers don't even know that they're having their data used against them. How can we start to sort of understand the magnitude, like maybe a quantitative magnitude of many of these types of examples, whether it's surveillance pricing or AI in the workplace or issues with regard to one's safety.

  • Deirdre Mulligan

    Person

    I think it's incredibly challenging because of the opacity of these activities. Right. That the asymmetry between managers who are free of surveillance and workers who are constantly being observed and managed creates real challenges.

  • Deirdre Mulligan

    Person

    Whether you're in a unionized workplace and you're trying to understand the ways in which data is coming back and shaping your lived experiences, shaping your contracts, et cetera. Right. That can be really, really complicated.

  • Deirdre Mulligan

    Person

    So I do think forcing mechanisms, and Chair Bauer-Kahan mentioned some, like the right to delete, the ability to find out what companies know about you, can be incredibly important. But they don't help us make the connections between particular kinds of collections, tranches of data, and particular impacts on individuals out in the world.

  • Deirdre Mulligan

    Person

    That's why rights to explanation, how was this decision reached, matter. You can think about the General Data Protection Regulation, and some of the privacy laws that have been adopted here in California, as creating some understanding among the public of what kind of information is being leveraged and how it is shaping their lived experiences.

  • Deirdre Mulligan

    Person

    And that's really important. But I also think we need civil society organizations to continue this work. I know we have 404 Media that's going to be speaking. Professor Waldman mentioned CalMatters. Investigative journalism, Public Records Act requests about state activities,

  • Deirdre Mulligan

    Person

    these continue to be incredibly important ways in which we learn about not just what's happening, but the connection between surveillance capitalism and those impacts on the ground. And certainly there are also ways in which we can leverage changes in European Union law.

  • Deirdre Mulligan

    Person

    For example, using some of the data infrastructure that has been created to allow us to better understand the ways in which social media data is shaping experiences can be very powerful.

  • Deirdre Mulligan

    Person

    And then of course the Federal Trade Commission and here in California. We get complaints, and those complaints are a really useful body of evidence to understand, particularly in the commercial context, the way in which data is affecting people's lives and opportunities.

  • Chris Ward

    Legislator

    That's exactly where I think we started working. Last year this committee passed out a bill to help to set some guardrails and regulation around surveillance pricing. Thank you for bringing up the topic as well. And that really, the genesis of that was some of the work by the FTC which of course was neutered by this administration.

  • Chris Ward

    Legislator

    And we don't have much hope that there's going to be effective work that they would need to do. What are your sort of overarching thoughts? Because now we've got a series of states, and we're among them, that are tackling these issues at this level.

  • Chris Ward

    Legislator

    And of course then you've got federal administration that is trying first on AI regulation and possibly in other areas with regard to technology and activities of daily living to have a moratorium or a preemption. Right. That certainly is something that could come at any point. So what is your maybe candid advice about what the states can do to maintain their sovereignty for their constituencies on these issues?

  • Deirdre Mulligan

    Person

    I think states should continue to act to protect the public. And I think that the administration has made claims that perhaps will exceed their authorities. And I think that the states are in a good position to continue to take the bold actions that are required to protect the public, both in terms of the corporate sector and in terms of the government. Don't be shy.

  • Chris Ward

    Legislator

    Yeah, no, thank you. One of the difficulties that we've had on that topic with surveillance pricing is really establishing that definition and then making sure that you don't have arguments that are allegedly trying to prevent pro consumer, pro business activities. The issue of discounts. Right. And loyalty programs came up last year, and we agree. Right.

  • Chris Ward

    Legislator

    But you also don't want to craft something in a way that is going to, on a run around, sort of back end, you know, negate the intent, the effectiveness of the guardrail that you're trying to put in here. And so we welcome everybody that's got really good thoughts, partnership in this to try to make sure you get that language right.

  • Chris Ward

    Legislator

    Because one of the things that's most fascinating about this committee, as opposed to probably the other 31 standing policy committees. Right. Is that this is a new frontier. This isn't the water code or the, you know, the public safety code, you know. What's that?

  • Chris Ward

    Legislator

    Well, I love... Well, I love those issues. And our issues adapt to, you know, the kind of current issues. You know, this is, this is new. This is a new thing and you have to define it and you have to then like, decide what we're doing as a matter of public policy around that. And it is fascinating, scary, and also imperative that we get this right.

  • Deirdre Mulligan

    Person

    Yeah, I do need to run. The one thing I'm going to say is definitions are incredibly important. I know you know. And it is super important to have not just lawyers in the room, but also technical people who can really help inform those definitions.

  • Deirdre Mulligan

    Person

    And the other thing I want to say is if we look at the Federal Trade Commission, just an example, the flexible, broad standards that they have to address unfairness and deception in the commercial marketplace have given them great flexibility to go where the action is. And as technology changes, to continue to make sure that consumers are protected, that marketplaces are fair, et cetera.

  • Deirdre Mulligan

    Person

    And so often I think people get hung up on narrow definitions rather than thinking about the general principles we need: robust competition in the marketplace, and robust protections for consumers against unfair, deceptive, and manipulative practices.

  • Deirdre Mulligan

    Person

    And so I do think that there's a reason to think about empowering regulators with more staff, more technical staff to use their existing authorities in ways that address the moment. And too often we get stuck on the idea that we need to address a very specific technology in some narrow way.

  • Deirdre Mulligan

    Person

    And the issues, the rights of Californians, they're what is enduring. And figuring out how we have a good regulatory and oversight regime that can continue to protect the public and increase our capacity to innovate in a way that is aligned with our values. You know, that's an evergreen activity.

  • Chris Ward

    Legislator

    Yeah, thank you for that. I know you've got to run, so feel free. I just wanted to keep it, you know, kind of quick.

  • Rebecca Bauer-Kahan

    Legislator

    No, thank you so much. And we are lucky to have Professor Mulligan at UC Berkeley. So I know she is available for all of us. And yes, as Michelle said back there, go Bears. Yes, Assembly Member Ward.

  • Chris Ward

    Legislator

    A closing thought on the matter: it's about trying to educate more and more people, because the only tool we have right now is to let people know that this is happening.

  • Chris Ward

    Legislator

    And in absence of, you know, guardrails, laws, to be able to have that, you know, sort of be the rules of the road. People just have to become educated. And they again, don't know what they don't know. So you are trying to make sure that people realize that this is going on. And if you're doing something of high value purchase especially.

  • Chris Ward

    Legislator

    Or, you know, you're buying things frequently that you might want to be able to sort of check, you know, through a friend, through somebody else that lives in a different zip code, whether or not you're seeing a differential in pricing because there are other ways that you could probably still make sure that you're securing the best price or try to get something price matched or something.

  • Chris Ward

    Legislator

    I don't know. It's a new frontier. And I think it motivates us because, you know, there's just incredibly broad support from the public. I mean, talking to my own constituents, you know, and we poll tested this question. You know, more than 80% popular, to be able to realize that like, you know, no, wait, that's not fair.

  • Chris Ward

    Legislator

    And this spanned across partisanship, too. And people want to make sure that they're not being taken advantage of specifically because of their own information. You know, with regard to our workers, it is, you know, speaking for me and I know we've got a lot of, you know, like minded colleagues as well.

  • Chris Ward

    Legislator

    It's beyond frustrating that you have the experiences that you do, for you and your peers and so many like you across other companies as well. Because the profitability and the success of many of these businesses is always built on the productivity of our workers.

  • Chris Ward

    Legislator

    And we always want to make sure that they are rewarded and successful in the productivity of something bigger that they are a part of. Not taken advantage of, certainly not under-compensated, but more importantly, not surveilled to the point where it is an intrusion on their going about their daily lives.

  • Chris Ward

    Legislator

    Workers work for themselves, for their family, for their betterment, but also for a sense of pride. It's great to have good work and to be proud of the work that you're doing. And to have Big Brother or Big Company staring over your shoulder really is belittling to that.

  • Chris Ward

    Legislator

    And so we have to think about ways that we are properly defining what is and is not proper in this space. Because you can, as history has shown, be incredibly profitable, incredibly successful, when you have a good product and good business practices that do not demean the value of the work and the workers producing that productivity.

  • Chris Ward

    Legislator

    So I just wanted to thank you, because you took the time to be here to represent many like yourself. You're welcome. But I think it underscores why we take this so seriously: again, new frontier, new information, new definitions, and trying to balance the issues before us right now as a Legislature on behalf of all the other principles that people hold very dear for themselves.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you, Assembly Member Ward. Assembly Member Ortega.

  • Liz Ortega

    Legislator

    Yeah. I apologize for missing your testimony, but I can imagine. I've met with many other workers from different sectors who talk about their privacy, or lack thereof. So I had a question, and I don't know if it's for you or, you know, we can answer it later.

  • Liz Ortega

    Legislator

    But just curious to know, in terms of some of this technology and how it's moving toward gathering biometric data, for example, my temperature, how often do I sweat, when do I go to the restroom. Is it something that's part of, like, the employment package, where you're just told you're going to be surveilled?

  • Liz Ortega

    Legislator

    Or you're told you're going to use this handheld tool, and it's part of your employment requirement, but you're not told about all the other data that's being collected. How are employers talking to you about this new technology and how it's being used? And are you being asked to basically sign your rights away?

  • Josh Black

    Person

    That's a great question. You know, it's funny, it's been three years since I started and I can't remember, when I filled out the original paperwork, if I actually signed something saying that I was willing to use... I mean, you basically have no choice. If you want to work there, you have to use this device.

  • Josh Black

    Person

    So I don't think it's really like an opt in or opt out thing. In our workplaces right now, we are basically told if you want this job, this is what you sign up for. And whether or not there was a box that I checked saying that I'm okay with that, I don't remember. But I think it was a condition of accepting the job.

  • Ari Waldman

    Person

    If I may, there's a broader problem that you're speaking about, not just about employees and workers. That's certainly a big deal. But just because we consent to something at one time doesn't mean we consent to it forever. We are allowed to change our minds even if it is...

  • Ari Waldman

    Person

    Even if we assume, and it is definitely not true, that people understand what they're consenting to at the moment they're given a product, even if we assume that they understand. Just because someone consented to sex on Sunday doesn't mean that same person consents to sex the next month. Right.

  • Ari Waldman

    Person

    Like we're allowed to change our minds. We're allowed to say that something that was okay at one point is not okay at another point when we have maybe more knowledge about the effects or how it's actually going to pan out.

  • Ari Waldman

    Person

    So it isn't enough, when the California privacy regulator enforces state privacy law, to say, well, you didn't offer someone an opt-out at the beginning.

  • Ari Waldman

    Person

    Well, fine, but that's not really helpful when people didn't have the necessary information to make the choice at the beginning, and were never given the chance, in all the years they were using the product, to change their minds.

  • Rebecca Bauer-Kahan

    Legislator

    And I will say that I am fascinated, to that point, by these new wearable health devices that are an ever-growing market and are collecting health data on us as extensive as our blood pressure and our sleep patterns, all information that I think a decade ago everybody in this room would have thought was information you would share with your physician under the US's strongest privacy laws.

  • Rebecca Bauer-Kahan

    Legislator

    And now we hand over to for profit companies with zero privacy protections. And there's just been this shift in understanding of how we give away information that I think many of us inherently know is some of the most private information. What is happening inside my body, my blood pressure, my heart rate. Right.

  • Rebecca Bauer-Kahan

    Legislator

    I mean, all of that, I think we realize my weight goes up and goes down, and yet we've shifted the way we share that data and then where it goes beyond that even further. So I think it is, it's interesting and it's, I will say that in my.

  • Rebecca Bauer-Kahan

    Legislator

    I have the privilege of working with the European Union, which has an office in San Francisco focused on privacy and technology.

  • Rebecca Bauer-Kahan

    Legislator

    And when the team that is there got here, within a few months they said to me, we're fascinated by Americans and how little you care about your privacy, and that you're willing to give it up for anything that's free. And it is sort of our ethos.

  • Rebecca Bauer-Kahan

    Legislator

    But I think that it's really important that our constituents both understand the implications of that and also have rights around it. So I don't know if.

  • Rebecca Bauer-Kahan

    Legislator

    And so we did, as I mentioned at the beginning, we'd invited Assembly Member Schultz, who is the chair of our Public Safety Committee, because this touches deeply on the public safety regime. So I will turn it over to him.

  • Nick Schultz

    Legislator

    Well, thank you, Madam Chair. I am so sorry that I missed the presentation, but I promise I'm going to go back and watch the tape because I want to hear everything. But because the Public Safety Committee was invoked... I'm kidding, Assemblymember Ward.

  • Nick Schultz

    Legislator

    But I did want to say there is one thing that is incredibly interesting and potentially concerning for all of us, and that is the Fourth Amendment. We're all very well aware that if law enforcement wants access to our data, they generally have to get a warrant.

  • Nick Schultz

    Legislator

    You have to comply with the parameters of the Fourth Amendment. And we're not talking nearly enough about the role of data brokers and the ability of law enforcement to purchase information in circumstances where, in many instances, it's questionable whether the Fourth Amendment even applies.

  • Nick Schultz

    Legislator

    Carpenter v. United States said that those parameters can apply if you're trying to get information from a cell site provider. But does that apply to third party applications? Does that apply to marketing analytics companies? A lot of this is yet to be litigated. And so I just bring that up to say I find all this fascinating.

  • Nick Schultz

    Legislator

    I'm excited to hear from the next panel too, but I think that's something I would just encourage all of us to think about, and especially our experts, to educate Members of the Legislature. I see, in the next decade, if I were in law enforcement again, and maybe I will be after this career, there's a lot of information that is very helpful to an investigation that you can purchase and, arguably, not have to comply with the requirements of the Fourth Amendment.

  • Nick Schultz

    Legislator

    And as a citizen of this state and this country, that concerns the hell out of me. And I think we should all be very concerned about that. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you, Mr. Chair. And I will say that what's interesting about that, and now you're going to see two legal nerds have a very nerdy conversation, is that, as we learned, the Fourth Amendment is based on the reasonable expectation of privacy. And I think that our current regime is eroding that expectation of privacy, which then erodes the constitutional protections.

  • Rebecca Bauer-Kahan

    Legislator

    And so there's sort of this chicken and egg problem that is fascinating and concerning that I think will probably come up in the next panel. So I appreciate that. Anyone else? So I have one more question for Mr. Waldman. So, I mean, you really started to touch on the scary and vast implications of the lack of privacy.

  • Rebecca Bauer-Kahan

    Legislator

    I think that one of the things that was striking in the backgrounder was the conversation around survivors of domestic violence and the implications this can have when anybody can go on and buy from these websites, or even receive for free, frankly, everywhere you are and what you're doing. I mean, it really makes people less safe.

  • Rebecca Bauer-Kahan

    Legislator

    And so I just wanted to give you an opportunity to sort of dive a little bit deeper into that and sort of what do you think the things are that are most concerning for us from an implication perspective?

  • Ari Waldman

    Person

    Sure. Well, thanks for the opportunity. There are, of course, disparate impacts when it comes to intimate partner violence. Most often the victims are women and the perpetrators are most often men. These kinds of intimate partner violence situations also happen quite often in queer families and those stories often get missed. So that's one important part.

  • Ari Waldman

    Person

    It's also critical to see the vast array of tools that harassers have in order to keep their intimate partners in line.

  • Ari Waldman

    Person

    So California, a couple of years ago, as a result in part of work by my colleague at UCI, Jane Stoever, and her husband Dave Min, passed a law about remote vehicle technology that provided protections for victims of intimate partner violence when partners were using remote control over their cars to continue to harass and harm.

  • Ari Waldman

    Person

    So cars are not something we would normally think of as a tool of intimate partner violence, but all of these tools can be. You mentioned a few minutes ago about health data tracking apps.

  • Ari Waldman

    Person

    Not only do people like us wear them to track our steps or our heart rate for positive reasons, and not only does that normalize the use of that kind of tracking surveillance for workers and for over-surveilled populations like communities of color, but that data is often accessible when families or couples have accounts together.

  • Ari Waldman

    Person

    Right. So whenever a tool encourages you, takes away friction, and says, oh, bring in your friends, bring in your family onto all of these apps and all of these tools, those are easy tools, easy weapons for someone to use against you.

  • Ari Waldman

    Person

    So it could be a car or an Apple smartwatch, and it's also, of course, you know, text messages and social media and all of these things.

  • Ari Waldman

    Person

    So whenever you're writing a Bill that is meant to protect privacy, all of the technologies that fit under the definition that you're thinking about can and will be used by the worst among us to harm the most vulnerable among us.

  • Rebecca Bauer-Kahan

    Legislator

    I appreciate that. Oh, yeah.

  • Gail Pellerin

    Legislator

    So this just sort of sparks a thought. I mean, it's definitely so important that individuals understand how their data is being used. And like, I'm just thinking we need to do like a webinar for our constituents so they really could understand all the different apps and the impacts they have and how that data is getting used.

  • Gail Pellerin

    Legislator

    And my coworker who's here in the audience tells me about how a secret military base was revealed a couple of years ago because soldiers were logging their workouts on Strava, and that data got leaked and sold, and they were able to determine the internal layout of the facility from these jogging platforms.

  • Gail Pellerin

    Legislator

    So there's just so much we don't know, and it's so incredibly scary. And protecting our constituents, protecting the people of California, protecting our most vulnerable, protecting women and LGBTQ people, is so critically important.

  • Gail Pellerin

    Legislator

    So I'd love to learn more about your ideas on how we can tighten up legislation and really make sure that we are doing everything possible in California.

  • Ari Waldman

    Person

    Sure. If I may, I think you're right to a point. I mean, the way you state the problem is certainly true, and public education is definitely important. I mean, we can start teaching kids about the effects of data collection in schools pretty young, especially since so many of our schools are highly surveillant.

  • Ari Waldman

    Person

    We have lots of schools that are hiring Google and other companies, AI and facial recognition companies, to evaluate how much students sweat, when they blink, and where their eyes are going. Right. This is happening in schools.

  • Ari Waldman

    Person

    So yes, sure, we should educate people at a young age and everyone else; assemblies and public meetings are great. But that's insufficient, because even if we did understand all of the implications and all of the effects of what happens with our data once it's collected, even if that were possible, which I don't think it is, we are not in the best position to effectively navigate our privacy, because we have to do so on platforms that are designed for us by companies who don't want us to navigate our privacy.

  • Ari Waldman

    Person

    So it's a matter of holding the companies that are doing this to us responsible, even if we don't understand what's happening. Because it's often like when I come and teach torts to my first year law students, I go into that classroom expecting that they did the reading, but knowing that they didn't always do the reading. Right.

  • Ari Waldman

    Person

    So I think we need to pass laws in the privacy space, too, that give people the tools to educate themselves, but that expect they're not actually going to educate themselves.

  • Rebecca Bauer-Kahan

    Legislator

    So my professors knew. Is that what you're telling me? Well, thank you so much, both of you, for being here. And I just want to reiterate, Mr. Black, our gratitude for you speaking on behalf of the workers that you work with every day and telling your story.

  • Rebecca Bauer-Kahan

    Legislator

    We thought it was really important that we have someone with lived experience around surveillance here to talk about that and how it impacts people. So I appreciate you doing that. Thank you Madam Chair. And then we will move to the next panel.

  • Rebecca Bauer-Kahan

    Legislator

    We have Jason Koebler, who is here virtually, an investigative journalist and co-founder of 404 Media, and Nila Bala, acting professor of law at UC Davis, who will speak directly to what the Chair of Public Safety was talking about in just a few minutes. So we will start with Mr. Koebler.

  • Jason Koebler

    Person

    Hey, can you hear me?

  • Rebecca Bauer-Kahan

    Legislator

    We can. Thank you.

  • Jason Koebler

    Person

    Amazing. Well, thank you so much for having me. I'm Jason Koebler, I'm a co-founder of 404 Media, which is a journalist-owned investigative tech publication. I'm also a resident of LA, so thank you so much for having me.

  • Jason Koebler

    Person

    I have a lot vested in this, I guess, and I'm joining you from a hotel room in San Francisco where I'm actually at a conference about AI surveillance right now. So I have a good reason, but I wish I could be there with you today.

  • Jason Koebler

    Person

    For the last 10 years or so, I've been focusing specifically on reporting about commercial surveillance and privacy, and specifically on how local, state and the Federal Government have begun to rely on commercially available products for policing and surveillance and the incentives and privacy invasions that this creates.

  • Jason Koebler

    Person

    I've noticed this pattern kind of over and over again, because I've now reported on so many companies that do this work. But I basically have repeatedly watched surveillance startups leverage relationships with local police to pilot their hardware or software. Usually this starts with free or steeply discounted pilot programs in one town or one city.

  • Jason Koebler

    Person

    And then by doing this at first, they can usually sidestep public oversight via City Council meetings and prevent a robust conversation about the privacy impacts of the technology that they're building. After this, companies usually find a few police officers that they can then use to champion their technology.

  • Jason Koebler

    Person

    So then those police officers sort of talk about how great this technology is, their sort of new surveillance toys, in police listservs, at conferences, and then through word of mouth. And then this basically turns local law enforcement into de facto salespeople of this powerful surveillance technology. And this is stuff that I've reported on over the years.

  • Jason Koebler

    Person

    I've looked at thousands and thousands of pages of public records, documents, emails, you know, presentations to police. And it's something I've seen over and over again. And the result is that local citizens become guinea pigs for things like autonomous drones, license plate reader technology, AI powered cameras, phone location data tracking, social media scraping, and so much more.

  • Jason Koebler

    Person

    As an example, the deployment of so many of these tools happened with little or no public debate. In my neighborhood in Los Angeles, I just learned that my council Member is using $450,000 of discretionary funds to buy 39 Flock license plate reader cameras. There was no public debate or vote on this purchase.

  • Jason Koebler

    Person

    And that's the story throughout a lot of the country and a lot of California, because so many of these technologies that use AI rely on scale and network effects of being, you know, having a lot of cameras or a lot of sensors in different neighborhoods, in different cities, all over the country.

  • Jason Koebler

    Person

    This city-by-city approach, which happens really slowly at first, you know, a couple of cities, and then suddenly very fast, is really critical to the success of these technologies.

  • Jason Koebler

    Person

    As I've reported on this pattern, it's become clear to me that the police, politicians, and the local population are all affected by these technologies. And, you know, as other panelists have mentioned, they are disproportionately deployed against people of color, against immigrants, against women, against LGBTQ people.

  • Jason Koebler

    Person

    But very often the people who are the targets of this, and often even the police, don't understand the true capabilities of this new AI powered surveillance. And law enforcement often doesn't understand how surveillance tools are increasingly networked together to form these statewide and nationwide surveillance networks that in some cases can filter up to the Federal Government.

  • Jason Koebler

    Person

    I think they maybe conceptually understand this, but as I'll describe, there are some cases where local police end up violating state law in California without even knowing it.

  • Jason Koebler

    Person

    And then because of the ways that these technologies work, you know, you're usually using a commercial company's dashboard or software under that company's contract or that company's terms of use. Police and politicians often don't know what's happening with their constituents data or under what circumstances other police departments might be accessing it.

  • Jason Koebler

    Person

    So to get specific about that, for the last two years I've been reporting on a company called Flock, which makes automated license plate reader cameras that track the movements of cars and, by extension, people, as well as something called Condor cameras, which are pan-tilt-zoom video cameras that are designed to track people as they walk in public.

  • Jason Koebler

    Person

    And the general power of Flock is that it's a huge network. And so in order for one police department to run searches for license plates, they need to give access to their city's cameras to other law enforcement across the state or across the country.

  • Jason Koebler

    Person

    Flock's Network now has upwards of at least 80,000 cameras in at least 8,000 communities around the country.

  • Jason Koebler

    Person

    And then last May, my reporting, which again was based on public records and audit reports about how this technology was being used, showed that, through informal police partnerships with ICE, local police did thousands of searches of Flock's national network for immigration-related enforcement. Basically, they were doing these informal searches for ICE.

  • Jason Koebler

    Person

    All of this was done without a warrant and was done with very little oversight. And what was happening was police in Texas or Georgia or Nebraska were searching specific license plates on behalf of ICE. But they weren't just searching the Flock cameras in their own state.

  • Jason Koebler

    Person

    They were searching license plate cameras across the country, including in Illinois, where it's illegal. There was actually a Secretary of State investigation after our reporting; there were quite a lot of slaps on the wrist, but it basically determined that this was illegal. And this also happened with California cameras as well.

  • Jason Koebler

    Person

    Both states have laws that say using automated license plate reader cameras for immigration enforcement is illegal. And yet that still happened. We also reported that a police Department in Texas searched tens of thousands of cameras nationwide, including cameras in California for a woman who had an at home abortion.

  • Jason Koebler

    Person

    And you know, when we acquired the police report, we learned that they considered pressing charges against this woman. The woman was also the victim of domestic abuse and the search was done at the behest of her abuser.

  • Jason Koebler

    Person

    This too, of course, is illegal in California, but it happened nonetheless, because for the most part, police have been using Flock for whatever they want without a warrant. And I mentioned that police maybe don't understand that this is happening.

  • Jason Koebler

    Person

    So throughout this reporting, I spoke to police departments about the fact that their cameras were being used for these illegal searches. And to a T, they literally did not understand what they had opted their residents into. A lot of police actually fought me and said, you know, this couldn't possibly be happening.

  • Jason Koebler

    Person

    I showed them the documents, I explained how this entire system worked, and they had no idea that, you know, police in Texas were searching their cameras in California for these illegal purposes. And so, I mean, I took them at their word on that.

  • Jason Koebler

    Person

    I think that this is because when police use commercial products like this, they believe that there's not that many rules. Police, as a rule, do not get a warrant to search flock. I'm unaware of any police officer really ever getting a warrant to search Flock.

  • Jason Koebler

    Person

    And they often don't even really have to explain what they're using it for. After some of our reporting that revealed these illegal and embarrassing activities, we actually obtained FBI guidance that was sent via fusion centers to police who are using Flock.

  • Jason Koebler

    Person

    They basically warned police to be, quote, as vague as permissible about why they were doing any given search. And so these search audit reports, which are basically huge spreadsheets of why police are using Flock, you have to put a reason in there, and the reason column is now full of people just writing "investigation, investigation."

  • Jason Koebler

    Person

    They don't really explain what they're using it for, but sort of in the course of this reporting, we've also learned, and local journalists around the country have learned, that there's been numerous cases of police using this technology to stalk people, to surveil protests, to do informal favors for federal agencies who aren't actually supposed to have access to this technology or don't have contracts for it, and so on.

  • Jason Koebler

    Person

    And so forth. Other reporting that I've done has shown serious security problems with some of these cameras. I was able to track myself walking across the street in Bakersfield, because Flock had misconfigured some of its cameras to stream in real time on the open Internet to anyone, with no password required.

  • Jason Koebler

    Person

    And there's the very nature of Flock, you know, the fact that these cameras are linked together and can be searched by any police department who has access, for any reason.

  • Jason Koebler

    Person

    And the fact that these are public records has resulted in the inadvertent leaking of the license plates of millions of surveillance targets around the country, sort of just via failed redactions on public records requests.

  • Jason Koebler

    Person

    Crucially, companies like Flock, as well as the doorbell camera company Ring, which is owned by Amazon; Palantir, which powers a lot of ICE's surveillance; and other surveillance companies will say that they're simply building technology and that it's up to the police to use it lawfully in our democratic society.

  • Jason Koebler

    Person

    These companies have drawn very few lines about what their tech can be used for and have basically put the onus on lawmakers to regulate them. But our privacy laws, by and large, are pretty outdated. And they don't reckon with the fact that, in the age of AI, these are not disparate, disconnected systems.

  • Jason Koebler

    Person

    A license plate camera is not simply sitting on a street corner and taking photos of cars that pass by so that a police officer can later manually go and check the images. They're all networked together and they're actively monitored by AI.

  • Jason Koebler

    Person

    They are or can be or will be connected to facial recognition systems and other forms of surveillance. We've seen lots of companies propose this. Lots of privacy experts are very concerned about this. So the end result is you don't just have pictures of someone's license plate at one given place, at one given time.

  • Jason Koebler

    Person

    You have a detailed database of everywhere that a car goes and when it went there. And it can be set up to automatically alert the police whenever a car passes by. This is not, by and large, targeted and thoughtful surveillance. It's mass surveillance of everyone who merely exists in society.

  • Jason Koebler

    Person

    And this has, again, all essentially happened with very little public oversight. Since my reporting on Flock, dozens of cities around the country have cited it in canceling their contracts, because lawmakers and cities, including Mountain View, Santa Cruz, and a few others in California, basically didn't understand what they were opting their residents into when they bought this technology.

  • Jason Koebler

    Person

    And I guess I'd also just mention that people really, truly care about this. I've been a journalist for a very long time, well, 15 years or so. And not since I covered the Edward Snowden revelations have we gotten so much attention for surveillance and privacy based reporting.

  • Jason Koebler

    Person

    A YouTube video that we made about Flock security problems has over a million views. And we saw huge backlash to a Ring camera feature called Search Party that proposes to network Ring cameras together to look for lost dogs. Ring's Super Bowl ad about this was a huge fiasco, essentially, and it was instantly and widely criticized.

  • Jason Koebler

    Person

    Our reporting has since shown that Ring plans to extend similar technology to be used for policing. I know a lot of this testimony has been about Flock, because it's the company I've reported on most recently and most extensively, and also because it has been used to violate California law.

  • Jason Koebler

    Person

    But this same principle can be extended to Ring doorbell cameras, drone-as-first-responder programs, which are really popular in California, facial recognition apps made for and used by ICE, social media surveillance, location-based tracking of cell phones, and so many other surveillance technologies that are, by and large, developed and sold by companies at this point.

  • Jason Koebler

    Person

    Many of the most powerful surveillance tools are now owned and operated by private companies who are competing to give cops the most powerful tools they possibly can. And they're increasingly interested in linking these technologies together with information that's purchased from data brokers or pulled from federal databases.

  • Jason Koebler

    Person

    And in the case of Ring, we have a company that's interested in using consumer devices sold to ordinary people to create networked surveillance that is monitored by AI and can be seamlessly shared with police. There are certainly some likely positive use cases of these technologies, and they can in some cases be used to solve serious crimes.

  • Jason Koebler

    Person

    But again, they've all just been deployed with extremely little oversight, little public debate, and little understanding about the negative effects of this mass surveillance. Thank you so much for your time.

  • Rebecca Bauer-Kahan

    Legislator

    Can I ask one quick question before we move to our next panelist? So have you found any jurisdictions that are engaging with Flock in a way that ensures their cameras are not a part of the network? Or whenever a jurisdiction engages with Flock, is it networked?

  • Jason Koebler

    Person

    It is possible to kind of limit the types of networks that you're on. A lot of these changes are quite recent. After our reporting, Flock essentially opted California cameras out of the national network, because it was in violation of some of these laws, and it did something similar in Illinois.

  • Jason Koebler

    Person

    It is possible to control which networks you're a part of, but there's sort of a carrot and a stick here, because if you opt your city's cameras out of that network, then you're not allowed to search other people's cameras.

  • Jason Koebler

    Person

    And so, you know, often police want to be able to search one town over or all throughout a state. And so it's hard to say exactly what any individual police Department is doing. But we haven't seen that many police officers who say just, we're happy to have four cameras in our town and that's all we want.

  • Rebecca Bauer-Kahan

    Legislator

    Interesting, because I know San Francisco is extensively engaging in this type of networked policing, if you will, and I'm curious to know if those cameras are being accessed by folks outside of the jurisdiction or not. Okay, thank you. Now we will turn it over to Nila Bala.

  • Nila Bala

    Person

    Thank you, Chair Bauer-Kahan, Members of the Committee and the Assembly, for the opportunity to testify today on the urgent need to consider privacy and legislative protections in light of modern surveillance technology. My name is Nila Bala.

  • Nila Bala

    Person

    I'm a professor at the UC Davis School of Law, and I research criminal law, evidence, and emerging technologies, and how they impact children and families. I'm here today with a fairly simple claim, which is that the Fourth Amendment, as currently interpreted, is structurally incapable of protecting Californians in the age of mass digital surveillance.

  • Nila Bala

    Person

    And because of that, the responsibility has shifted to you. I will begin with how the Fourth Amendment was designed for a different world, one that didn't anticipate mass surveillance or the widespread use of third party data collection. From there, I'll turn to children's data briefly, because that's my area of focus.

  • Nila Bala

    Person

    And I'll close with some thoughts on how we might move forward. I'll start with a little bit of history, because where the Fourth Amendment comes from tells us a lot about where it falls short today.

  • Nila Bala

    Person

    The Fourth Amendment, which protects our right to be secure in our persons, houses and property against unreasonable searches and seizures, was written in response to physical intrusions. British agents breaking down doors, rifling through desks, and seizing papers. Today, surveillance is continuous, mediated by private companies and digital rather than physical. Our constitutional framework has not kept pace.

  • Nila Bala

    Person

    Modern policing increasingly relies not on individualized suspicion, but on mass data collection. As Jason told us, automated license plate readers, facial recognition systems, social media monitoring, among other systems. These systems collect information about millions of people, most of whom are not suspected of any crime, and retain the data for months or years.

  • Nila Bala

    Person

    That data is often pooled and shared across agencies. The Fourth Amendment was built around particularity, this idea that you could have a warrant naming a specific person, place and items to be seized. But mass data retention inverts that model. Information is collected first and queried later.

  • Nila Bala

    Person

    Instead of a search being a moment in time, surveillance becomes infrastructure. And in the digital age, law enforcement often does not need to search directly at all. It can obtain data from private companies that have already collected it. That move is legally possible because of what is known as the third party doctrine.

  • Nila Bala

    Person

    So under current Fourth Amendment doctrine, information you voluntarily share with a third party, your phone company, a bank, a tech platform, an app, generally loses constitutional protection. Law enforcement can obtain all of this without a warrant.

  • Nila Bala

    Person

    But this rule makes no sense in a world where our location is constantly tracked through smartphones, our browsing history reveals our medical and reproductive information, and nearly every digital action necessarily passes through an intermediary. So today, participation in society requires sharing data with third parties. Calling that voluntary is a legal fiction.

  • Nila Bala

    Person

    Now, as Assemblymember Schultz alluded to, the Supreme Court has recognized that some digital data, such as long-term cell site location data, is highly sensitive. But that decision in the case Carpenter v. United States, which was a 2018 opinion, was narrow.

  • Nila Bala

    Person

    So one professor has actually analyzed 800 decisions that came after Carpenter and found that, for the most part, that decision has been read very narrowly, as specific to just cell site location data. And most decisions after Carpenter were pro-government.

  • Nila Bala

    Person

    So the broader third party doctrine remains in place even after Carpenter, meaning most information shared with companies can be accessed by law enforcement with little constitutional protection.

  • Nila Bala

    Person

    I want to say one more thing about all this, which is my area of interest, the one that keeps me up at night and causes me the greatest alarm, which is how little the Fourth Amendment does to protect our children's data. So well-meaning parents might be installing monitoring software that logs a child's location and communications.

  • Nila Bala

    Person

    Schools likewise require students to use devices loaded with edtech (educational technology) platforms that monitor and track activity both in and outside of schools. And most families and children cannot meaningfully opt out of any of this. Now, here's what we may not realize.

  • Nila Bala

    Person

    Once this data is shared with a third party, whether a platform, a school vendor, or through a data broker, that data can be obtained by law enforcement without any protection. Police can also rely on the fact that a parent agreed to an app's terms of service or school technology policies to justify acquiring that same data from companies.

  • Nila Bala

    Person

    I know that we all want to solve crimes and seek justice for victims, and data is evidence. So balancing personal privacy with the need for evidence can be a challenge. But we have to recognize the very real harm that comes from leaving these data streams unregulated.

  • Nila Bala

    Person

    When law enforcement has unchecked access to data, it builds a wall of distrust in our communities, particularly, as Professor Waldman explained, among marginalized groups. When people feel like they're constantly being databased and watched, it has chilling effects. It changes how they speak, where they go, how they express dissent.

  • Nila Bala

    Person

    And we have to remember that a digital footprint can be permanent. California is a national leader in privacy, but many of our laws include carve outs for law enforcement. And we've seen that procedural safeguards alone, like warrants, are not enough if the data ecosystem itself is unbounded.

  • Nila Bala

    Person

    So the Fourth Amendment and California's Electronic Communications Privacy Act generally prohibit government entities from obtaining information without a legal order, AKA a warrant. But given the ease with which law enforcement entities can use commercial third parties like Flock to surveil individuals, the Fourth Amendment and California's own privacy laws are not enough.

  • Nila Bala

    Person

    If we're serious about reform, I think we do have to both regulate how police access data and actually begin setting boundaries on whether certain categories of data should be used at all. So some categories, legislators might decide, warrant absolute protection, or rather, prohibition.

  • Nila Bala

    Person

    We might decide, for example, that health data generated by smart medical devices or reproductive information can never be used in a criminal investigation. We might have special protections for children's educational data. A second proposal I want to plant in your minds is pursuing a digital evidentiary privilege.

  • Nila Bala

    Person

    So just as the law recognizes spousal privilege and attorney-client privilege, for social policy reasons that we deem fundamental, a digital privilege would bar certain categories of data from being introduced as evidence in court altogether. It would operate differently from a warrant requirement.

  • Nila Bala

    Person

    It wouldn't just regulate access, which is insufficient when police can just get data from Apps or data brokers. A privilege would remove some data from the evidentiary marketplace altogether. Legislators could define the scope, how much it covers, what the exceptions could be.

  • Nila Bala

    Person

    Maybe there's narrow exceptions for serious violent crime, but you can accommodate all of that within a privilege. And finally, I want to mention the children, since that's my research area. We should protect them as a legislative imperative, right? Data that well-meaning parents and schools collect should serve a specific, limited purpose.

  • Nila Bala

    Person

    Police should not have carte blanche access to that same information. So without affirmative use restrictions and deletion mandates, information gathered in elementary school or through a parent-installed safety app can persist indefinitely, searchable and repurposed, long after its original context is forgotten. So in conclusion, the Fourth Amendment was built for a world of physical spaces.

  • Nila Bala

    Person

    We live in a world of databases and data brokers. And California has long led the nation in privacy innovation. So the question before you is whether we'll continue to lead not just in protecting consumers from corporate misuse of data, but in setting principled limits on government use of that data. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. And I appreciate you focusing in part on California's children because this is something this Committee often focuses on. We think they're amongst our most vulnerable, especially when it comes to privacy. Thank you for that, for your very direct guidance on where you think we should go. That's super helpful. Sure.

  • Rebecca Bauer-Kahan

    Legislator

    Ms. Ozer, if you want to join, you can. You're welcome to join for any questions. I look to you, Mr. Chair, to start, if you want.

  • Nick Schultz

    Legislator

    I'll just say digital evidentiary privilege.

  • Unidentified Speaker

    Person

    That is a... I wrote that down.

  • Nick Schultz

    Legislator

    I think the four of us are going to be talking about that. No, but I think just one question to put a finer point on it. Yeah, yeah. I mean, just make sure I'm understanding correctly.

  • Nick Schultz

    Legislator

    We are in this different world now because, you know, 50 years ago, if you wanted to tail someone and you wanted to know where they were going, I mean, you could physically tail, like, you know, they get in their car, they're driving around the community, and you can make the argument that, hey, when you're driving out in public, you don't have a reasonable expectation of privacy.

  • Nick Schultz

    Legislator

    You're out in the public. The real challenge is we also have a digital presence online. I mean, so, okay, you know, in theory, if law enforcement wants to look at my Instagram search history and see all the cat videos I like, I like cat videos. Cats doing crazy fun things like that, for example, they could just get it.

  • Nick Schultz

    Legislator

    But the point I'm trying to make is there is a digital presence there, you know, where you go online or on our apps, you know, that isn't necessarily protected by the Fourth Amendment in the same way because we do not really recognize a right to privacy in that sense. I mean, it's arguable.

  • Nick Schultz

    Legislator

    But I guess the point I'm making is, you know, your comments about Carpenter are very true.

  • Nila Bala

    Person

    Yeah.

  • Nick Schultz

    Legislator

    A lot of folks pointed to that case as this counterbalance, this guardrail on the intrusion into the Fourth Amendment. And yet the subsequent case law, I believe, is really narrowing the scope of that ruling.

  • Nick Schultz

    Legislator

    And, you know, just, frankly, with the current composition of the court, I don't have a great degree of confidence that they're going to defend the Fourth Amendment. So, I don't know. Not so much a question, but a comment. And I just think it's a really fascinating area of law.

  • Nick Schultz

    Legislator

    And I just so appreciate you, Madam Chair, for bringing us all together to talk about it.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. No, and I thought the point. And then I'll turn it over to somebody. The point you made, that I think was given an exclamation point by Mr. Chair, around the chilling aspects is really, really important at this moment in time, because I do think there's a fear to say what you believe.

  • Rebecca Bauer-Kahan

    Legislator

    And we saw the news break recently about administrative subpoenas going after people's social media content around immigrants and immigration enforcement. That is, frankly, just chilling. And so I think I appreciate the comments. Assembly Member Ortega.

  • Liz Ortega

    Legislator

    yeah, I mean, this entire Committee is something that.

  • Liz Ortega

    Legislator

    You know. Right. We're in the nature of here both in terms of.

  • Liz Ortega

    Legislator

    Oh, sorry. In terms of privacy and not having any privacy, and how it's being used and how the information is being gathered. But now we also have another layer of a Federal Government that's using this data and information against its own citizens.

  • Liz Ortega

    Legislator

    And so, how do we address this issue simultaneously? I know during the first Trump Administration, the state passed a couple of laws to protect some of this information, SB 34, SB 54.

  • Liz Ortega

    Legislator

    But clearly, there's holes in there that need to be addressed, particularly when it comes to our local law enforcement sharing and collaborating with ICE, when we have laws that prevent them from doing so. And so my question is, what are your thoughts on that?

  • Liz Ortega

    Legislator

    What are your suggestions for us as legislators who are trying to do our best to protect all of our constituents in this new era of data collection and data sharing?

  • Angela Short

    Person

    Yeah, Nicole, feel free to chime in as well. Well, I just want to connect the dots a little bit between the first presentation and this last one.

  • Angela Short

    Person

    So, you know, even back in 1972, at the dawn of computerization, the California Legislature and the California voters recognized that the Fourth Amendment was not up to the task of dealing with a situation of technology and digital surveillance.

  • Angela Short

    Person

    And so that was the real intent of creating the right to privacy in Article 1, Section 1, to be broader than the Fourth Amendment. So our Fourth Amendment corollary in California is Article 1, Section 13. That's the Fourth Amendment of California.

  • Angela Short

    Person

    But Article 1, Section 1, the Privacy Amendment, was passed as a modern right to privacy to address the rise of computerization. It's why it is broader than the Fourth Amendment in that it applies to both government and private parties.

  • Angela Short

    Person

    They understood this connection, that it wasn't just that you had to protect against government intrusion or private intrusion, but also the connection between the two and how private companies, how the government could reach into private parties. The second is that it rejected the reasonable expectation of privacy.

  • Angela Short

    Person

    And under the California constitutional right to privacy, California does not recognize the third party doctrine. So under California jurisprudence, the third party doctrine does not exist here. We do not lose our right to privacy just because we share it with a third party.

  • Angela Short

    Person

    I think it's really important to understand that there are limitations to the Fourth Amendment, and those limitations were recognized 50 years ago. And that's why we have the privacy right in California. And we also have built statutory law on top of that. The California Electronic Communications Privacy Act is one that was passed in 2015.

  • Angela Short

    Person

    Also with bipartisan support, a law that I worked on with Mark Leno and Joel Anderson. It requires heightened protections for government access to information. So I think it's really important to recognize that we actually have a greater constitutional foundation here, that we then also can pass statutory law on top of. We have much more that we need to do legislatively.

  • Angela Short

    Person

    But our sort of radar should not just be the Fourth Amendment. Our radar is the constitutional right to privacy. And that's what we need to hold the state to account, law enforcement to account, and private companies to account.

  • Rebecca Bauer-Kahan

    Legislator

    And I don't know if Ms. Bala, we wanted to answer Ms. Ortega's question around.

  • Nila Bala

    Person

    Oh, just briefly, I just want to echo that. The federal Constitution is a floor right? It's not a ceiling. And we have more here in California through our Constitution. We can do more through statutes. I totally hear your concern. I wish I had, you know, the magic bullet answer for you.

  • Nila Bala

    Person

    It sounds like even when we are regulating this and we're telling local law enforcement not to share this information, sometimes it's getting shared. But I just want to say we should still try.

  • Nila Bala

    Person

    It's not a reason not to, you know, pass the regulations and pass the legislation. And just to sort of weave together, hopefully, everything we've heard today: I think the, I guess it'll officially be the second panel, so the panel after yours, really addressed surveillance infrastructure and surveillance capitalism.

  • Nila Bala

    Person

    There's so much we can do with just the data we're sharing with commercial entities. And then this panel, which I hope I conveyed, we can also regulate what law enforcement can do with that commercial data.

  • Nila Bala

    Person

    So there's so many things I think that could come out of today's conversation, and even if not all of them work or work perfectly all the time, it's not a reason not to try.

  • Nila Bala

    Person

    And what we have seen is when states like California, or Illinois comes to mind, and a few others, have passed stronger privacy regulations, it's improved things federally as well, because these companies have to make their policies follow those laws in order to operate.

  • Nila Bala

    Person

    And so we become, you know, sort of a new standard that we can set for everybody in a way. So I think we should keep trying even when there are sort of imperfect results. And I would say the other.

  • Jason Koebler

    Person

    I would just echo that. Yes, this data sharing did happen illegally, but the situation now in California is better than it is in any other state because of these laws that you passed over the last few years.

  • Jason Koebler

    Person

    And so the people of California are protected more than, I think, any other state from this type of surveillance because of these laws.

  • Angela Short

    Person

    I would just say those are incredibly important laws. I was formerly at the ACLU for 20 years before being at UC Law San Francisco. So these are laws whose passage I was engaged in. We've also, the ACLU has tried to enforce those laws, and some of the enforcement mechanisms have not actually been strong enough.

  • Angela Short

    Person

    And so they have been wonderful laws on paper, but in order to actually enforce them effectively, you know, having private rights of action, having really robust enforcement is incredibly important, because we saw through several years that, you know, the ALPR law was not actually being followed, and the Attorney General has been active in that.

  • Angela Short

    Person

    But it's been difficult sometimes to enforce some of those laws. And there's also been efforts. There is actually litigation moving right now on license plate readers against the city of San Jose using an Article 1, Section 1 claim. There's a current case moving against Clearview AI as well, using an Article 1, Section 1 claim.

  • Angela Short

    Person

    So there's efforts in the courts, but having really robust enforcement, including private rights of action, is incredibly important for these surveillance laws.

  • Rebecca Bauer-Kahan

    Legislator

    I will say that Professor Mulligan did mention a couple things. One was making sure we had the enforcement mechanisms we needed, including in our own agencies, which are often underfunded, as you know, in the realm you work in every day.

  • Rebecca Bauer-Kahan

    Legislator

    And then the other thing was actually journalism and supporting the journalists in our ecosystem that do the work that we heard is so critical to change. So I think I'll highlight those two as well, because I think that's. I always like to just ensure we're creating an ecosystem for good, honest journalism. And with that, Assembly Member Pellerin.

  • Gail Pellerin

    Legislator

    Well, I just want to thank the chair for holding this informational hearing today. It's been very enlightening, and thank you to all the speakers.

  • Gail Pellerin

    Legislator

    And I'm clearly seeing that there's a lot more work for us to do in this space, and to pass the statutes that are necessary to protect our children, to protect our constituents, the people in California. And we're touching on the Constitution. Is it strong enough the way it's written?

  • Gail Pellerin

    Legislator

    Are there more changes we need to make to our Constitution, or are we just looking at statutory changes?

  • Angela Short

    Person

    You know, the constitutional right to privacy is incredibly strong, as I mentioned. You know, there's been some impediments to adequately enforcing it in the courts that I hope might change. I think that it's incredibly strong. It's just really about drafting legislation that fully operationalizes that with strong definitions, strong enforcement, and strong scope.

  • Angela Short

    Person

    And I think there's so much. I mean, there's been so much important privacy work that has happened in California in the last 50 years. I mean, literally almost every privacy law in the country started here or exists here and doesn't exist other places.

  • Angela Short

    Person

    But there's so much more that can be done and I would just say strong definitions, as Professor Mulligan also said. And you know, the enforcement is absolutely necessary for people to actually be able to utilize that law effectively.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. Yeah, thank you, Assembly Member. Mr. Chair.

  • Nick Schultz

    Legislator

    Thank you. Last comment and question. For me, the one comment I would make is I totally agree with you both. I think that California should do more. Obviously we have concerns about access and use of this information in other contexts like immigration proceedings.

  • Nick Schultz

    Legislator

    And a lot of that is beyond our purview; the Code of Federal Regulations governs the admissibility of evidence in those proceedings. But in terms of state court, what's allowed in, what isn't, I think there's certainly more we could do.

  • Nick Schultz

    Legislator

    So my last question, and it's a very nerdy one, and if you don't know the answer, if either, I would just love to continue the conversation offline. So if I'm remembering correctly, 1982 voters passed the Truth in Evidence Act, Prop 8. And I'm not, I'm oversimplifying it. Right.

  • Nick Schultz

    Legislator

    But the general proposition was even, you know, illegally obtained evidence, if it's relevant, should be permitted in. It was part of the Victims' Bill of Rights. Now, if I remember correctly, the court has said, you know, privileges are different and they should nonetheless be, you know, maintained, and those should still apply.

  • Nick Schultz

    Legislator

    I guess that's the question: if the Legislature were to explore a new privilege. I mean, have we litigated that? Many of the privileges in our statutes were on the books before the passage of Prop 8. How do we think the courts would treat it today if we were to add a new privilege?

  • Nila Bala

    Person

    You can add specific language when you're crafting the privilege to make it clear that Prop 8 doesn't, you know, bar this, or if anything, bar the evidence from coming in. And in fact, there have been subsequent court opinions on many issues, whether it's character evidence or hearsay, to say Prop 8 doesn't affect this admissibility rule in California.

  • Nila Bala

    Person

    So it's just about crafting your intent very clearly that this sort of lives beyond Prop 8 and Prop 8 does not affect it.

  • Angela Short

    Person

    Yeah. So CalECPA has a suppression remedy. It requires a 2/3 majority of both houses of the Legislature in order to prevail over, you know, Truth in Evidence. But you can. And that was another thing: you can have private rights of action. You can also have suppression remedies. It's a very high standard.

  • Angela Short

    Person

    But in CalECPA we were able to reach that standard, a 2/3 majority. It requires suppression for any violation of that law.

  • Rebecca Bauer-Kahan

    Legislator

    We brought the right experts. They didn't even need to go home and do homework.

  • Nick Schultz

    Legislator

    I got a boatload of stuff to work with you on. This is great.

  • Rebecca Bauer-Kahan

    Legislator

    Well, thank you both for being here. And I want to thank all of our panelists today and again Committee staff for putting together a really insightful hearing.

  • Rebecca Bauer-Kahan

    Legislator

    I think that one of the reasons we were drawn to doing this is we've seen many bills, as you heard, on pricing, on, you know, protecting Californians from surveillance, license plate readers, et cetera. It was mentioned, the intimate partner violence law that was passed a few years ago.

  • Rebecca Bauer-Kahan

    Legislator

    And yet we'd never taken a deep dive on what the full ecosystem looks like. And we thought that was really important for us to do as we look to how we continue to evolve and protect Californians' privacy. So I just want to thank everyone for engaging in this conversation.

  • Rebecca Bauer-Kahan

    Legislator

    And the work is only just beginning, so thank you all. With that, we will turn it over to public comment. Is that what you were going to say? Keeping me on task. So come on up if you'd like to join us for public comment. I think we'll do a minute each. Should work.

  • Yvonne Fernandez

    Person

    Perfect.

  • Rebecca Bauer-Kahan

    Legislator

    The mic's on. Go ahead.

  • Yvonne Fernandez

    Person

    Hi. Thank you Madam Chair and staff for putting on such an important hearing and for the inclusion of the impact of surveillance in the workplace. My colleague Josh perfectly encapsulated how surveillance exists in Amazon warehouses.

  • Yvonne Fernandez

    Person

    And, and I would like to echo all those points made and also highlight the fact that this exists, unfortunately, across all industries, impacting workers all across the board.

  • Yvonne Fernandez

    Person

    Because whether or not you are a white collar worker sitting in front of a computer, or if you're a farm worker working on a field, you are susceptible to all forms of surveillance technology. And we have gone far too long without the establishment of true worker centered privacy protections.

  • Yvonne Fernandez

    Person

    And because of this, employers now have a deep interconnected surveillance network and have taken complete control of the workplace with surveillance tools developed by the very tech executives who are aligning themselves with the Current Federal Administration.

  • Yvonne Fernandez

    Person

    Employers now have unimaginable troves of sensitive worker data at their disposal to sell for profit, to train LLMs, or to provide to unscrupulous actors such as ICE. Workers like Josh deserve protections. Surveillance for the sake of control and extraction does not protect workers. And I very much appreciate the opportunity to provide public comment today. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    And I don't think you. I know who you're here on behalf of, but if you want to put it into the record, my apologies.

  • Yvonne Fernandez

    Person

    Yvonne Fernandez with the California Labor Federation.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. Thank you.

  • Andrea Lynch

    Person

    Good afternoon, honorable Committee Members. My name is Andrea Lynch and I'm a policy advocate here on behalf of the California Chamber of Commerce. Technologies in the workplace are often used for safety purposes and risk management. Technology can determine when an incident has occurred. It can capture violent, unsafe, or illegal behavior like theft, and more.

  • Andrea Lynch

    Person

    We respectfully ask that when the Legislature is considering potential regulations on technology, it is critical that these uses are considered and that we are not creating unintended consequences. Thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you.

  • Sarah Bridges

    Person

    Well, hi there. Hi. Sarah Bridges, on behalf of the California Manufacturers and Technology Association. In the vein of Benjamin Franklin, everyone must be aware of the strained and tenuous relationship between liberty and privacy, or rather, safety and security.

  • Sarah Bridges

    Person

    But as the representative of nearly 15% of California's GDP and over 8 million employees, it's the daily non-delegable duty of my members to ensure the safety of everyone and their employees.

  • Sarah Bridges

    Person

    And that is the purpose of many of these AI tools: to ensure these husbands, mothers, sons, sisters, and citizens of California go home safely every day so that they can continue to be productive members of society. No one usually opposes targeted, tailored, solution-oriented legislation and regulation that clearly identifies problems and prescribes specific remedies.

  • Sarah Bridges

    Person

    What is problematic is overly broad, overly inclusive, and vague definitions that result in the inclusion of beneficial, helpful, or neutral systems that are not only assets to employers, but are advantageous to employees.

  • Sarah Bridges

    Person

    CMTA looks forward to continuing to work with all parties in this space to ensure that the burdens and the equities of all parties are represented, but that this legislation is ultimately targeted and does not include unintended consequences, and that the result is viable and practical laws that both employers and employees can reasonably implement and follow.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you so much.

  • Jp Hanna

    Person

    Good afternoon, Chair, Members. JP Hanna with the California Nurses Association, representing over 100,000 registered nurses in California. So our hospitals must be centers of care, not laboratories for corporate surveillance. Employers are installing ambient systems that track movements, facial expressions, and conversations, even requiring nurses to place monitoring apps on their personal phones.

  • Jp Hanna

    Person

    This bossware is not about patient care. It's about profits, speed ups, quotas and silencing workers. When patients feel watched or lose trust, they are less likely to speak honestly or seek care at all. Companies like Palantir are embedding themselves in our health system, partnering with hospital chains to collect vast amounts of patient and worker data.

  • Jp Hanna

    Person

    Palantir also contracts with agencies like ICE that use their data infrastructure to power deadly immigration raids that stalk, target, and detain people. Our health information should never be weaponized. We need strong, enforceable privacy protections. AI and surveillance tools must be independently tested, validated, and publicly accountable before and after deployment.

  • Jp Hanna

    Person

    And lastly, surveillance corporations like Palantir have no place in our health care or our government. Thank you very much.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you. Hello.

  • David Bolog

    Person

    Hi. David Bolog, SFB, as in Victory Alliance. Just some low-hanging fruit that I think you can address that was talked about in the presentations. School tablets. You can control the programs, the facial cameras on them, the microphones, when they turn on, when they turn off.

  • David Bolog

    Person

    You can also go back to the age of books and pencils, like it's said that Silicon Valley billionaires are doing. You can set a culture of privacy for students by having them take an education class on privacy, à la the Healthy Youth Act. You can prohibit mandatory facial recognition with employers.

  • David Bolog

    Person

    Right now, to sign on to my email, I have to use a two-factor identification process where they send a code to my phone, but they have the ability to use facial recognition. It's not mandatory yet, but they are pushing that. You can also pass legislation that does not allow. Warm Island Time. zero, you've got.

  • David Bolog

    Person

    Okay, don't worry. Okay. You can pass legislation that does not allow localities and municipalities to do contracts that violate California's privacy laws. You can also mandate warning signs for places that have Flock cameras.

  • David Bolog

    Person

    And just in regards to Flock cameras in Los Angeles, the first districts that had those were the richer areas that were having burglaries. They were the ones that pushed city council members to purchase those.

  • David Bolog

    Person

    As I was looking here at the Cheviot Hills area of Los Angeles, the one neighborhood, they spent $200,000 of their own money to have those installed. And Professor Mulligan did talk about cash. You can pass legislation that requires businesses with California licenses to accept cash. Okay, thank you.

  • Rebecca Bauer-Kahan

    Legislator

    Thank you so much. I let everyone go over a little bit, but I did it for everybody, so I think that's fair. Thank you all for being here and thanks for the public comment. And with that, we are adjourned.
