Comments Of The Electronic Privacy Information Center, CALPIRG Education Fund, Center For Digital Democracy, Consumer Action, Consumer Federation Of America, Ranking Digital Rights, And U.S. Public Interest Research Group to the California Privacy Protection Agency
(Proceeding No. 01-21)
The Electronic Privacy Information Center (EPIC), the California Public Interest Research Group (CALPIRG) Education Fund, the Center for Digital Democracy (CDD), Consumer Action, the Consumer Federation of America (CFA), Ranking Digital Rights, and the U.S. Public Interest Research Group (U.S. PIRG) submit these comments in response to the California Privacy Protection Agency’s (CPPA) invitation for public input on the agency’s development of regulations under the California Privacy Rights Act of 2020 (CPRA) and the California Consumer Privacy Act of 2018 (CCPA). We commend the agency for its work to establish data privacy protections for Californians and urge the agency to include more use cases and more detail in the regulations to give consumers and businesses clear guidance on their rights and obligations.
Our Organizations
EPIC is a public interest research center based in Washington, D.C. that was established in 1994 to focus public attention on emerging privacy and related human rights issues and to protect privacy, the First Amendment, and constitutional values.[1] EPIC has a long history of promoting transparency and accountability for information technology.[2]
The California Public Interest Research Group (CALPIRG) Education Fund is an advocate for the public interest. CALPIRG Education Fund speaks out for the public and stands up to special interests on problems that affect the public’s health, safety, and wellbeing in California.
The Center for Digital Democracy’s mission is to ensure that digital technologies serve and strengthen democratic values, institutions and processes. CDD strives to safeguard privacy and civil and human rights, as well as to advance equity, fairness, and community.
Consumer Action has been a champion of underrepresented consumers since 1971. A national, nonprofit 501(c)(3) organization, Consumer Action focuses on financial education that empowers low- to moderate-income and limited-English-speaking consumers to financially prosper. It also advocates for consumers in the media and before lawmakers and regulators to advance consumer rights and promote industry-wide change, particularly in the fields of consumer protection, credit, banking, housing, privacy, insurance, and utilities.
The Consumer Federation of America (CFA) is an association of non-profit consumer organizations that was established in 1968 to advance the consumer interest through research, advocacy, and education.
Ranking Digital Rights (RDR) is a non-profit research and advocacy program at New America that works to advance freedom of expression and privacy on the internet by establishing global standards and incentives for companies to respect and protect the human rights of internet users and their communities.
The U.S. Public Interest Research Group (U.S. PIRG) is a nationwide citizen advocacy group committed to serving the public interest. U.S. PIRG works for common sense solutions that make the future healthier, safer and more secure for everyone.
Our feedback on the proposed regulations appears below. The Appendix contains specific line edits for certain provisions, including:
- § 7002 – Restrictions on the Collection and Use of Personal Information (A-1)
- § 7011 – Privacy Policy (A-2)
- § 7012 – Notice at Collection of Personal Information (A-3)
- § 7022 – Requests to Delete (A-3)
- § 7023 – Requests to Correct (A-4)
- § 7025 – Opt-Out Preference Signals (A-4)
- § 7026 – Requests to Opt-Out of Sale/Sharing (A-7)
- § 7027 – Prohibition Against the Use and Disclosure of Sensitive Personal Information (A-8)
- § 7050 – Service Providers and Contractors (A-12)
- § 7052 – Third Parties (A-13)
I. GENERAL PROVISIONS (Article 1)
a. Request to Opt-In to Sale/Sharing – § 7001(y)
We recommend that the definition of “request to opt-in to sale/sharing” in § 7001(y) include an illustrative example of what type of action sufficiently demonstrates “that the consumer has consented to the business’s sale or sharing of personal information about the consumer by a parent or guardian of a consumer less than 13 years of age or by a consumer at least 13 years of age[.]” This action should require more than simply checking a box with little to no information.
b. Data Minimization – § 7002
The CPPA should not include in § 7002 an exception to the consumer-expectation standard that would degrade user privacy and experience. We urge the CPPA to amend the draft regulation implementing § 1798.100(c) of the CPRA to fully implement the law, which prohibits businesses from processing personal information in a way that is incompatible with the context in which that personal information was collected. Section 1798.100(c) reads in full:
(c) A business’ collection, use, retention, and sharing of a consumer’s personal information shall be reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed, or for another disclosed purpose that is compatible with the context in which the personal information was collected, and not further processed in a manner that is incompatible with those purposes.
The proposed CPPA regulations provide a useful mechanism to determine the scope of what is “reasonably necessary and proportionate” through the “reasonable consumer” standard. However, the proposed regulations include an exception that would allow businesses to collect data for reasons beyond what a reasonable consumer expects and beyond the context in which the data was collected. Specifically, § 7002 of the draft regulations provides that:
A business shall obtain the consumer’s explicit consent in accordance with section 7004 before collecting, using, retaining, and/or sharing the consumer’s personal information for any purpose that is unrelated or incompatible with the purpose(s) for which the personal information [was] collected or processed.
We recommend that the CPPA delete this exception. It would incentivize data uses that are inconsistent with the data minimization restriction in § 1798.100(c) and would likely lead to a constant barrage of consent requests, increasing consumer consent fatigue and undermining the consumer rights created by the CCPA.[3] Please see page A-1 for our recommended line edits to § 7002.
II. REQUIRED DISCLOSURES TO CONSUMERS (Article 2)
a. Disclosures to Consumers – §§ 7010 – 7012
We support the proposal for clear and understandable notice requirements and encourage the agency to adopt language that provides consumers more than a notice-and-choice privacy regime. Specifically, the disclosures required by the regulations provide sufficient notice to consumers of their rights, including the collection notice, opt-out notice, right-to-limit notice, and financial incentive notice requirements. We support the requirements that privacy policies and notices be clearly labeled, easily understandable, and conspicuous. Please see pages A-2 to A-3 for our recommended edits to §§ 7011 and 7012.
III. BUSINESS PRACTICES FOR HANDLING CONSUMER REQUESTS (Article 3)
Please see pages A-3 to A-12 for our recommended line edits to §§ 7022, 7023, 7025, 7026, and 7027.
a. User Rights – §§ 7020 – 7024
The rules should make clearer that businesses are obligated to forward deletion and correction requests to the relevant third parties and that those third parties are obligated to comply with them. The rules should also make clear whether written permission must be given on paper or may be provided electronically.
b. Opt-Out Preference Signals – § 7025
We urge the agency to revise § 7025(c)(7) of the proposed regulations to make it clear that a business which has received an opt-out preference signal may not prompt a consumer to confirm that preference or otherwise collect additional personal information in connection with such signal. An opt-out preference signal is by itself sufficient confirmation and authentication of the consumer’s intent to opt out, which the business must honor. Absent this clarification, businesses may attempt to undermine the efficacy of opt-out preference signals by barraging consumers with confirmatory pop-ups and fomenting consent fatigue.
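For illustration, consider the Global Privacy Control (GPC), one widely deployed opt-out preference signal, which is transmitted as the “Sec-GPC: 1” request header. The minimal sketch below (in TypeScript, assuming an Express-style server and a hypothetical recordOptOut helper, neither of which the regulations prescribe) shows how a business could honor the signal as received, without serving a confirmation prompt or collecting any additional personal information:

```typescript
// Minimal sketch, not a prescribed implementation: honor an incoming
// Global Privacy Control signal as a complete opt-out of sale/sharing,
// with no confirmation prompt and no collection of additional data.
// Assumes an Express-style server; recordOptOut() is a hypothetical
// stand-in for the business's own opt-out bookkeeping.
import express, { Request, Response, NextFunction } from "express";

const app = express();

async function recordOptOut(requestContext: string): Promise<void> {
  // Hypothetical: flag the consumer associated with this request as
  // opted out of sale/sharing in the business's own systems.
}

app.use(async (req: Request, _res: Response, next: NextFunction) => {
  // The GPC proposal transmits the opt-out preference as "Sec-GPC: 1".
  if (req.header("Sec-GPC") === "1") {
    await recordOptOut(req.ip ?? "unknown"); // treat the signal itself as the opt-out request
    // Deliberately no confirmation pop-up and no request for further
    // personal information: the signal is honored as received.
  }
  next();
});

app.listen(3000);
```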
c. Limiting Use and Disclosure of Sensitive Information – § 7027
We recommend that the agency amend the proposed regulations in § 7027, which implement Cal. Civ. Code § 1798.121, to prohibit companies from using or disclosing sensitive data for any purpose, with limited exceptions. The proposed regulations wrongly place the responsibility on the consumer to enforce data minimization and to limit the use and disclosure of sensitive personal information. Companies, not consumers, should bear the affirmative duty to limit the collection and use of sensitive personal information, and the regulations implementing the CPRA and CCPA should impose that duty, subject to limited exceptions.
Section 7027 expressly acknowledges the heightened risk of consumer harm from the unauthorized use or disclosure of sensitive personal information, and the proposed regulations should adequately address this risk. Overbroad data collection and retention pose a significant risk to consumer privacy.[4] In a recent white paper, EPIC and Consumer Reports explained that excessive data collection “necessarily subjects consumers to the risk of data breaches, employee misuses, unwanted secondary uses, inappropriate government access, and can have a chilling effect on consumers’ willingness to adopt new technologies, and to engage in free expression.”[5]
Excessive data collection and retention provide companies with massive amounts of personal information that they can use, share, and disclose with few limitations. This practice is particularly harmful when it implicates sensitive personal information. A recent Future of Tech Commission survey reflects the severity of this problem: 68% of respondents agreed that “it should be illegal for private companies to sell or share information about people no matter what,” while only 46% agreed that it would be okay for companies to “sell consumers’ data as long as they are transparent about how the data is used and make it clear to consumers.”[6] Personal information collected online can reveal sensitive consumer information, including sexual orientation, gender identity, sexual activities, political affiliation, and health conditions.[7] Often this data is collected without the consumer’s knowledge and shared with data brokers or other third parties. Californians’ most urgent need is not for more notices about their rights; it is for substantive, meaningful limitations on the use and disclosure of their sensitive personal information.
Worse yet, the proposed regulations further extend the failed “notice-and-choice” regime, under which consumers are expected to read vague and expansive privacy policies, understand them, and make decisions to protect their own privacy. This onerous system prevents consumers from meaningfully participating in the market while protecting their privacy. Overcollection of data also poses data security risks, as security incidents and breaches are common.[8] As written, the proposed regulations treat sensitive data the same as non-sensitive data from the consumer’s perspective. Because the CCPA and the proposed regulations recognize the heightened risk associated with the use and disclosure of sensitive personal information, the regulations should provide heightened protections for such data. The current proposal for § 7027 does not address this significant consumer harm.
Consumers should be protected from the harms associated with the collection, use, and disclosure of their sensitive personal information regardless of whether they have taken steps to prevent those harms; companies should be prohibited from engaging in this behavior in the first place. Placing the burden of action onto the consumer is not a workable solution to the problems that the CCPA and the proposed regulations seek to address. Even with constant and aggressive regulation of notice, defaults, and choice architecture, the proposed § 7027 places too much of a burden on consumers to vet and understand the nature of internet services and the data being collected as they navigate their everyday lives. Our proposed additions and changes in the Appendix reflect the goal of protecting consumers’ sensitive personal information.
IV. SERVICE PROVIDERS, CONTRACTORS, AND THIRD PARTIES (Article 4)
a. Service Providers – §§ 7050 – 7052
We believe § 7050 should clearly reflect that some companies are both service providers and third parties, depending on the purposes for which they collect information. The regulations should also include additional protections to ensure that companies, including service providers and contractors, cannot retain personal information for the purpose of improving their services. To that end, we recommend that the agency specify in § 7050(b)(4) that service providers and contractors may not “retain the personal information longer than necessary.”
Further, § 7051 contains the language “unless expressly permitted by the CCPA or these regulations[,]” which is too broad. Consumers’ rights under the CCPA apply even when a business contracts with service providers, secondary service providers, or tertiary service providers. The regulations therefore should enumerate the specific circumstances under which service providers and contractors may retain personal information.
We also recommend that § 7052 be updated to clarify that third parties must comply not only with consumers’ deletion and opt-out requests but also with their correction and access requests.
Please see pages A-12 to A-13 for our recommended line edits to §§ 7050 and 7052.
b. Contract Requirements for Third Parties – § 7053
We emphasize the importance of § 7053 and support its adoption. This section is important to ensure that the rules and rights under the CCPA are adequately enforced and truly limit the flow of information to entities beyond the business with which the consumer directly interacts. Consumers may understand the scope of their relationships with the businesses they interact with directly, but much can happen to their personal information outside of those relationships through data transfers and sales. Section 7053 is crucial for reining in unregulated data collection and use in the data ecosystem.
V. VERIFICATION REQUESTS (Article 5)
a. Verification Requests – § 7060
We request that the agency provide illustrative examples for § 7060(d) to demonstrate how and under what circumstances a business can request additional information to verify the identity of the requestor. With respect to § 7060(f), verification is important in certain contexts to ensure that a party who seeks to delete, correct, or access personal information is entitled and authorized to do so. We further agree with the rules in § 7060(b) that businesses may not require a consumer to verify their identity before processing opt-out requests, that businesses may collect only the limited information necessary to complete such requests, and that businesses must delete that information once it is no longer needed for that limited purpose. As noted above, we request that the agency clarify whether the “signed permission” referenced in § 7063 must be provided on paper or may be electronic.
VI. NON-DISCRIMINATION (Article 7)
a. Discriminatory Practices and Calculating the Value of Consumer Data – §§ 7080 – 7081
We commend the CPPA for its inclusion of Article 7, which protects not only consumers’ right to privacy but also their ability to exercise that right. The non-discrimination provisions explicitly protect consumers who exercise their privacy rights from facing discriminatory pricing or differential service, leaving consumers free to choose privacy. The CCPA’s guardrails ensuring that financial incentive practices may not be “unjust, unreasonable, coercive, or usurious in nature” are critical to ensuring that incentive programs do not provide a backdoor for businesses to coerce individuals into waiving their privacy rights. The examples in this section are particularly useful and clarify for both businesses and consumers which practices are allowed under the law. Additionally, the examples make clear that loyalty programs, coupons, and discounts can continue even if consumers exercise their right to delete or to opt out of the sale or sharing of their information. This clarification is useful because consumers may worry about losing these popular programs, and explaining that such programs can coexist with privacy rights is important.
However, we have some concerns about how the regulations instruct businesses to calculate the value of consumer data, particularly the inclusion of a good-faith exception. Allowing businesses to create their own method of calculating the value of consumer data, so long as it is done in good faith, can result in undervaluing consumer data or valuing some consumers’ data more than others. We recommend deleting clause (8) from § 7081(a).
VII. TRAINING AND RECORDKEEPING (Article 8)
a. Training and Recordkeeping – §§ 7100 – 7101
We commend the agency for mandating training and recordkeeping in the regulations. These measures are essential to ensure that employees who handle consumers’ personal data are trained in how to keep that data private and secure. Specifically, we support the regulations’ requirement that businesses train employees not only on the provisions of the CCPA but also on how to direct consumers to exercise their rights under the law. The recordkeeping requirements are particularly strong, and the agency should adopt them. Requiring businesses to record consumer requests and their responses is a vital step toward ensuring that businesses comply with the requirements of the CCPA. Importantly, the recordkeeping provision also requires that businesses not use this data for any purpose other than CCPA compliance and that the data not be shared with third parties.

Regarding the requirements for businesses that collect large amounts of personal data, we recommend revising one of the metrics businesses are required to disclose. Instead of allowing businesses to report either the mean or the median number of days it took to substantively respond to consumer requests, the regulations should require one or the other. Requiring businesses to report this information using the same metric will make it easier to compare across businesses, identify trends in responses to consumer requests, and ensure compliance with the regulations.
VIII. INVESTIGATIONS AND ENFORCEMENT (Article 9)
a. Investigations and Enforcement – §§ 7300 – 7304
We support the investigation and enforcement regulations and urge the agency to adopt Article 9. We commend the inclusion of multiple methods for investigation, including sworn complaints, anonymous complaints, referrals, and agency-initiated investigations. To ensure these enforcement mechanisms operate as intended, however, we recommend adding a provision outlining who has standing to file a sworn complaint. Given California’s public interest standing doctrine, standing can be fairly broad. Specifying who has standing would eliminate confusion and ensure that public interest organizations and watchdog groups can file complaints in addition to individuals. A useful way to indicate who has standing to file complaints would be to provide a few examples in the regulations, consistent with the examples given in other articles.
Conclusion
EPIC, CALPIRG Education Fund, CDD, Consumer Action, CFA, Ranking Digital Rights, and U.S. PIRG applaud the agency’s open and robust rulemaking process to protect consumers in accordance with the California Consumer Privacy Act. We will continue to be available to discuss our recommendations and how the agency can best protect Californians under the CCPA.
Respectfully submitted,
Electronic Privacy Information Center
CALPIRG Education Fund
Center for Digital Democracy
Consumer Action
Consumer Federation of America
Ranking Digital Rights
U.S. Public Interest Research Group
[1] EPIC, About EPIC (2022), https://epic.org/about/.
[2] See Comments of EPIC et al. to Cal. Priv. Protection Agency (June 8, 2022), https://epic.org/wp-content/uploads/2022/06/GlobalOptOut-Coalition-Letter.pdf; Comments of EPIC and Coalition to Cal. Priv. Protection Agency (Nov. 8, 2021) https://epic.org/documents/comments-of-epic-and-three-organizations-on-regulations-under-the-california-privacy-rights-act-of-2020/; Comments of EPIC to Cal. Office Att’y Gen. (Feb. 25, 2020), https://epic.org/wp-content/uploads/apa/comments/EPIC-CCPA-Feb2020.pdf; Comments of EPIC to Cal. Office of the Att’y Gen. (Dec. 6, 2019), https://epic.org/wp-content/uploads/apa/comments/EPIC-CCPA-Dec2019.pdf; see also Comments of EPIC (Mar. 25, 2022), https://epic.org/epic-recommends-cfpb-strengthen-buy-now-pay-later-bnpl-market-inquiry-on-customer-acquisition-and-data-practices/; Comments of EPIC to White House Office of Sci. and Tech. Policy, Implementation Plan for a National Artificial Intelligence Research Resource (Oct. 1, 2021), https://epic.org/documents/request-for-information-rfi-on-an-implementation-plan-for-a-national-artificial-intelligence-research-resource/; EPIC, AI & Human Rights (2022), https://www.epic.org/issues/ai/; EPIC, AI in the Criminal Justice System (2022), https://epic.org/issues/ai/ai-in-the-criminal-justice-system/.
[3] Cameron Kormylo & Idris Adjerid, Reconsidering Privacy Choices: The Impact of Defaults, Reversibility, and Repetition, Pamplin College of Business (2021), https://www.ftc.gov/system/files/documents/public_events/1582978/reconsidering_privacy_choices_the_impact_of_defaults_reversibility_and_repetition.pdf (“Repetition of choices can introduce new decision biases; for example, (Schaub et al. 2015) find that habituation in repeated choice contexts prevents the retrieval of new information. Past literature has shown that individuals exhibit what has been termed “privacy fatigue,” where they disclose more information over time when faced with increasing complexity and less usability in privacy controls (Keith et al. 2014). Choi et al. (2018) show how privacy fatigue leads to a perceived loss of control and a sense of futility with protecting one’s privacy that results in less informed privacy decision making. This theory has also been applied to privacy and security notices (Schaub et al. 2015).”).
[4] See, e.g., Letter from Access Now et al., to Chair Khan and Commissioners Chopra, Slaughter, Phillips, and Wilson (Aug. 4, 2021), https://www.lawyerscommittee.org/wp-content/uploads/2021/08/FTC-civilrights-and-privacy-letter-Final-1.pdf.
[5] EPIC & Consumer Reports, How the FTC Can Mandate Data Minimization Through a Section 5 Unfairness Rulemaking 6 (Jan. 2022), https://epic.org/wp-content/uploads/2022/01/CR_Epic_FTCDataMinimization_012522_VF_.pdf, citing Justin Brookman & G.S. Hans, Why Collection Matters: Surveillance as a De Facto Privacy Harm, https://cdt.org/wp-content/uploads/2018/08/September-2013-Brookman-Hans-Why-Collection-Matters.pdf.
[6] Benenson Strategy Group, Future of Tech Commission: Tech Attitudes Survey (July 2021), https://d2e111jq13me73.cloudfront.net/sites/default/files/uploads/pdfs/bsg_future_of_technology_topline_c1-1.pdf.
[7] EPIC & Consumer Reports, How the FTC Can Mandate Data Minimization Through a Section 5 Unfairness Rulemaking (Jan. 2022), https://epic.org/wp-content/uploads/2022/01/CR_Epic_FTCDataMinimization_012522_VF_.pdf.
[8] See Mahmood Sher-Jan, Is it an incident or a breach? How to tell and why it matters, IAPP (Feb. 28, 2017), https://iapp.org/news/a/is-it-an-incident-or-a-breach-how-to-tell-and-why-it-matters (“In today’s threat-filled world, sensitive customer information is constantly at risk for exposure. Cyberattacks, ransomware, spear phishing, malware, system & process failure, employee mistakes, lost or stolen devices — the list of dangers continues to expand. Indeed, it’s a near certainty that your organization’s customer data will be — or already has been — exposed.”).