Draft California Risk Assessment Regulations Are a Promising Start
September 7, 2023
By: Ben Winters, Senior Counsel
Earlier this year, EPIC, the Center for Digital Democracy, and the Consumer Federation of America submitted comments to the California Privacy Protection Agency (CPPA) to recommend strong regulations implementing key provisions of the California Consumer Privacy Act. The comments include proposals on cybersecurity audits, risk assessments, and automated decision-making systems and urge the agency to protect Californians by drawing on strong existing frameworks and ensuring that consumers’ rights to opt out and receive information are easy to exercise.
EPIC also provided extensive input on CCPA regulations in November 2021, May 2022, August 2022, and November 2022, arguing for consumer-friendly interpretations of the CCPA to guard against exploitative commercial data practices.
Last week, the CPPA published draft regulatory text implementing the cybersecurity audits and risk assessment sections of the CCPA. This blog highlights and evaluates the draft risk assessment provisions. It’s important to remember this is an early draft, and also that the agency is limited to creating regulations pursuant to the drafted text. (Occasional inline commentary by EPIC in parentheses.)
When do risk assessments have to be submitted? When processing of information presents significant risk to consumers’ privacy. The agency is proposing that this be any one of the following seven circumstances:
- “Selling or sharing personal information” (which means many businesses may have to complete these risk assessments);
- Processing sensitive personal information (information processed for routine employment purposes like payroll, health insurance, and wage reporting is excluded);
- Using Automated Decisionmaking Technology in furtherance of a sensitive decision (i.e., one “that results in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment, healthcare, or access to essential goods/services”);
- Processing personal information of consumers the business knows are younger than 16;
- Processing in ways that constitute workplace or school surveillance (e.g., recording, speech/face detection, location trackers, keystroke loggers, and productivity monitors);
- Processing personal information of consumers in publicly accessible places using technology to monitor behavior, location, movements or actions (places that serve or are open to the public);
- Processing to train AI or ADT (particularly relevant to generative AI).
What must risk assessments include? Risk assessments are instruments of accountability—and the required process should reflect that. As EPIC wrote to the CPPA in March 2023, “When implemented properly, risk assessments force institutions to carefully evaluate the full spectrum of privacy and data-driven risks of a contemplated processing activity, to identify and implement measures to mitigate those risks, and to determine whether the processing activity can be justified considering any risks that cannot be fully mitigated. A risk assessment can also provide regulators and the public with vital information about processing activities that may pose a threat to privacy and civil rights. A risk assessment should not be a simple box-checking exercise or a static, one-off undertaking.” The CPPA’s draft reflects this perspective, requiring a whole-of-organization approach to completing risk assessments and a wide breadth of required considerations and disclosures.
The regulations call for a disclosure of the following to the agency:
- A short summary of the processing (how business will process the personal information, including how the business will collect, use, disclose, and retain personal information);
- The categories of personal information to be processed and whether they are sensitive;
- The context of the processing activity;
- The consumers’ reasonable expectations concerning the purpose for processing their data, or the purpose’s compatibility with the context in which their personal information was collected (this reflects the bill’s data minimization provision);
- The ways the business plans to collect, use, disclose, or retain personal information, and how the information is obtained;
- How the business is complying with the data minimization requirements elsewhere in the law;
- How long the business will retain personal data;
- The approximate total number of consumers whose information the business plans to process;
- What technology is being used to process personal information;
- The purpose/“why” of processing—which must be stated more clearly than generic phrases like “to improve our services” or “security purposes”;
- The benefits of the processing to the business, consumers, other stakeholders, and the public (with specificity as to how these benefits were determined);
- The negative impacts of the processing on privacy, with explanations of how those impacts were identified (at a minimum considering constitutional harms, security harms, discrimination harms, privacy harms, economic harms, autonomy harms, physical harms, reputational harms, and psychological harms);
- The safeguards the business will implement to address the negative impacts; and
- The business’s assessment of whether the identified negative impacts, once mitigated by those safeguards, outweigh the benefits.
These requirements go beyond other currently required or proposed algorithmic risk assessments, both in the range of triggers requiring an assessment and the scope of content within the risk assessment. The largest outstanding questions are (1) to what extent these assessments and the information they contain may be accessible to the public, (2) whether the final rules will be as strong as the current draft, and (3) how these rules, once adopted, will be enforced to protect consumers.
EPIC will continue to advocate before the CPPA and elsewhere for strong regulations to address algorithmic and data-driven harms. The public has an opportunity to hear discussion on these regulations (and more) and to provide comment at the agency’s September 8 meeting.