Tracking Real Algorithmic Harms in California as CCPA Obligations Near

October 31, 2023 | Ben Winters, Senior Counsel

As part of our project Assessing the Assessments, supported by the Rose Foundation, EPIC will track instances of specific AI harms relevant to enforcement of the California Consumer Privacy Act. Each entry below lists the setting and application of the system, a real example of that harm occurring, and the types of harm done.

This tracker draws on two taxonomies of AI harms to describe the types of harm done:

  1. Danielle Citron’s and Daniel Solove’s Typology of Privacy Harms, comprising physical, economic, reputational, psychological, autonomy, discrimination, and relationship harms; and 
  2. Joy Buolamwini’s Taxonomy of Algorithmic Harms, comprising loss of opportunity, economic loss, and social stigmatization, including loss of liberty, increased surveillance, stereotype reinforcement, and other dignitary harms. 

These taxonomies do not necessarily cover all potential AI harms, and our use of these taxonomies is meant to help readers visualize and contextualize AI harms without limiting the types and variety of AI harms that readers consider.

While these examples are focused on California, examples of the effects of an AI system in one jurisdiction will often apply to other states and countries as well. 
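The incident table below pairs each example with harm tags from both taxonomies. As a minimal illustrative sketch of how such tracked incidents could be represented as structured records (all class and field names here are hypothetical, not part of EPIC's actual tooling):

```python
from dataclasses import dataclass, field

@dataclass
class Incident:
    """One tracked AI-harm incident, tagged per the two taxonomies above."""
    setting: str                      # where/how the AI system was deployed
    example: str                      # what actually happened
    citation: str                     # reporting that documents the harm
    individual_harms: list[str] = field(default_factory=list)
    societal_harms: list[str] = field(default_factory=list)

# Example record built from one row of the table below.
tesla_crash = Incident(
    setting="Autopilot in autonomous vehicles",
    example="Tesla Model 3 on Autopilot crashed and killed a 15-year-old on a CA freeway",
    citation="Tesla Says Autopilot Makes Its Cars Safer. Crash Victims Say It Kills. - The New York Times",
    individual_harms=["Financial", "Privacy & Autonomy"],
    societal_harms=["Equality"],
)
```

Keeping individual and societal harms as separate lists mirrors the table's "Individual:/Societal:" split and makes it easy to filter incidents by harm type.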

| Example | Explanation | Citation | Types of Harm |
| --- | --- | --- | --- |
| Management of Vaccine Distribution in 2021 | California used zip codes, rather than a census-based health score index, to target vaccine distribution to the areas most in need. Because zip codes are a poor proxy for health outcomes, lower-income areas were eclipsed by wealthier neighborhoods in the same zip code, producing inequitable access. | California's "Equity" Algorithm Could Leave 2 Million Struggling Californians Without Additional Vaccine Supply | Individual: Financial, Discrimination; Societal: Democracy & Rule of Law |
| Autopilot in autonomous vehicles | Tesla Model 3 on Autopilot crashed and killed a 15-year-old on a CA freeway; Autopilot made no attempt to slow down until just before the crash. | Tesla Says Autopilot Makes Its Cars Safer. Crash Victims Say It Kills. – The New York Times | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Tesla on Autopilot collided with a parked firetruck on a CA freeway. | Tesla was on Autopilot when it hit firetruck, NTSB finds – Los Angeles Times | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Tesla in "Full Self-Driving" beta mode drove into the wrong lane. | Tesla vehicle in 'Full Self-Driving' beta mode 'severely damaged' after crash in California – The Verge | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Two Cruise autonomous vehicles collided with each other in CA. | A Cruise-on-Cruise Crash Reveals the Hardest Thing About Self-Driving Tech – WIRED | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Autonomous vehicle crashed into a divider and traffic sign in San Francisco. | California halts driverless testing permit after accident – Reuters | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Tesla on Autopilot hit a parked police car in CA. | Tesla in Autopilot mode crashes into parked Laguna Beach police cruiser – Los Angeles Times | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Autopilot in autonomous vehicles | Tesla driver died after crashing into a freeway barrier; Autopilot is suspected of causing the driver's death. | Tesla defends Autopilot record amid investigation into crash that killed driver – The Independent | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Test-taking software | Automated proctoring software flagged over a third of California bar exam applicants as cheating. | California Bar Exam Flagged A THIRD Of Applicants As Cheating – Above the Law | Individual: Psychological, Privacy & Autonomy, Political & Community Participation; Societal: Democracy & Rule of Law |
| Generative AI | ChatGPT fabricated sexual harassment claims, citing fake sources, about a CA law professor. | What happens when ChatGPT lies about real people? – The Washington Post | Individual: Privacy & Autonomy; Societal: Environmental, Economic |
| Navigation | Google Maps and Waze directed CA drivers into wildfires. | Waze, Google Maps Send California Residents Straight Into Wildfires | Individual: Financial, Privacy & Autonomy; Societal: Equality |
| Policing | Security robot failed to send an emergency signal after an onlooker pressed its emergency alert button. | A RoboCop, a park and a fight: How expectations about robots are clashing with reality | Individual: Financial, Privacy & Autonomy; Societal: Equality, Environmental |
| Identity Verification | Facial recognition technology used for stadium entry was used to expel lawyers representing the opposing side in a legal dispute involving the stadium owners. | Here Are the Stadiums That Are Keeping Track of Your Face | Individual: Psychological, Privacy & Autonomy, Political & Community Participation, Safety; Societal: Democracy & Rule of Law, Environmental |
| Speech recognition | Stanford researchers found that speech recognition tools from Apple, Amazon, Google, IBM, and Microsoft fare worse at recognizing speech by Black people. | Personal voice assistants struggle with black voices, new study shows | Individual: Psychological, Safety; Societal: Equality, Democracy & Rule of Law |
| Stanford vaccine algorithm | Algorithm left out frontline doctors and instead prioritized administrators and doctors seeing patients remotely. | This is the Vaccine Algorithm that Left Out Frontline Stanford Doctors | Individual: Financial, Psychological, Privacy & Autonomy; Societal: Equality, Democracy & Rule of Law |
| Identity Verification | Man arrested for sock theft based on a false facial recognition match, despite having an alibi. | Study: Face Recognition Often Used As Sole Basis for Arrest | Individual: Financial, Discrimination, Privacy & Autonomy; Societal: Democracy & Rule of Law, Environmental |
