On July 16, 2024, the California Privacy Protection Agency (CPPA) Board met to discuss advancing its over 200-page draft rulemaking package to formal proceedings.[1] The proposed regulations include 37 pages of significant new obligations spanning cybersecurity audits, automated decision-making technology (e.g., artificial intelligence (AI)), and privacy risk assessments, along with 72 pages of other updates to existing regulations. Together, these regulations would create new compliance obligations for tens of thousands of California businesses and are preliminarily estimated to generate a staggering $4.2 billion in compliance costs for those businesses in their first year alone. Critically, these estimates do not include the many businesses that are based outside of California yet subject to the California Consumer Privacy Act (CCPA) because they do business in the state, meaning the real economic burden is likely to be far more significant.

In its July meeting, the CPPA Board signaled it might reconvene in September to initiate formal rulemaking after receiving requested updates to the proposed regulations from CPPA staff and additional information on the required Standardized Regulatory Impact Assessment (i.e., anticipated economic impact analysis). Once this happens, members of the public will have the opportunity to formally comment on the proposed regulations and urge the CPPA Board to make changes. Entities subject to the CCPA should familiarize themselves with the draft regulations now so that they are prepared to comment when the regulations enter formal rulemaking.

Below is an overview of the key provisions of the draft regulations and observations about the compliance efforts they would necessitate.

I. Cybersecurity Audit Regulations

The proposed regulations would require qualifying businesses to complete annual cybersecurity audits conducted by independent auditors and to certify completion of those audits to the CPPA each year. As drafted, over 25,000 businesses in California alone are estimated to be subject to these costly and resource-intensive audits. To meet these requirements, businesses would need to implement robust cybersecurity programs addressing at least the 17 program “components” outlined in the proposed regulations. Economists engaged by the CPPA estimate that the proposed cybersecurity audit regulations would cost California businesses a whopping $2.06 billion, or between $40,000 and $363,000 per business, in the first year of implementation alone.

Applicability: Businesses subject to the CCPA would be required to conduct an annual independent cybersecurity audit when their processing of personal information meets either of two proposed thresholds for presenting a “significant risk” to consumers’ security (a simplified applicability sketch follows the list below):

  • 50 percent of revenues from “sale” or “sharing”: The first threshold would be met by any business that derives 50 percent or more of its annual revenues from “selling” or “sharing” consumers’ personal information (as those terms are defined in the CCPA) in the preceding calendar year, regardless of the total amount of revenues.
  • Revenue plus amount of personal information processed: The second threshold would be met by any business that, in the preceding calendar year, met the revenue threshold to qualify as a “business” under the CCPA and either a) processed the personal information of 250,000 or more consumers or households; or b) processed the sensitive personal information of 50,000 or more consumers. Notably, the CCPA’s definition of sensitive personal information includes commonly processed types of information such as “account log-in” information and precise geolocation information. This means that businesses meeting the revenue threshold with 50,000 Californian users are likely to be required to conduct annual cybersecurity audits under the draft regulations.
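
To make these two thresholds concrete, below is a minimal Python sketch of how a business might check its own figures against them. All inputs are hypothetical, and the gross revenue figure used for the “business” threshold assumes the proposed $27.975 million adjustment discussed in Section IV below.

```python
# Illustrative sketch only: checking hypothetical figures against the two
# proposed "significant risk" thresholds described above. The CCPA revenue
# threshold below assumes the proposed $27.975 million adjustment (see
# Section IV); all other inputs are hypothetical.
CCPA_BUSINESS_REVENUE_THRESHOLD = 27_975_000

def audit_required(
    annual_gross_revenue: float,
    revenue_from_sale_or_sharing: float,
    consumers_or_households_processed: int,
    consumers_sensitive_pi_processed: int,
) -> bool:
    # Threshold 1: 50 percent or more of annual revenues from "selling" or
    # "sharing" personal information, regardless of total revenue.
    if annual_gross_revenue > 0 and (
        revenue_from_sale_or_sharing / annual_gross_revenue >= 0.50
    ):
        return True
    # Threshold 2: meets the CCPA revenue threshold AND processed the personal
    # information of 250,000+ consumers/households OR the sensitive personal
    # information of 50,000+ consumers.
    return annual_gross_revenue >= CCPA_BUSINESS_REVENUE_THRESHOLD and (
        consumers_or_households_processed >= 250_000
        or consumers_sensitive_pi_processed >= 50_000
    )

# Hypothetical: a business over the revenue threshold that processes account
# log-in data (sensitive personal information) for 60,000 California users.
print(audit_required(30_000_000, 1_000_000, 100_000, 60_000))  # True
```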

Cybersecurity Program Components: The proposed text outlines 17 cybersecurity program components that auditors would need to specifically identify, assess, and document. This means that, prior to their first audit, businesses would need to implement (if they have not already) a cybersecurity program with at least the following components (a brief tracking sketch follows the list):

  1. The establishment, implementation, and maintenance of a cybersecurity program appropriate to the business’s size, complexity, and the nature and scope of its processing activities
  2. Authentication, including multi-factor authentication and strong passwords for workforce members
  3. Encryption of personal information at rest and in transit
  4. Zero trust architecture
  5. Account management and access controls (including specific sub-controls)
  6. Inventory and management of personal information and the business’s information system, including personal data inventories or data flows, and hardware and software inventories and approval processes
  7. Secure configuration of hardware and software, including masking stored sensitive personal information, security patch management, and change management
  8. Vulnerability scans and penetration testing
  9. Audit-log management
  10. Network monitoring and defenses, including bot detection, intrusion detection/prevention, and data loss prevention
  11. Antivirus and antimalware protections
  12. Segmentation of information systems
  13. Limitation and control of ports, services, and protocols
  14. Oversight of service providers, contractors, and third parties
  15. Retention schedules and disposal of personal information
  16. Incident response management
  17. Business continuity and disaster recovery plans
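
As a practical matter, a business may want to track each component, along with any justification for treating a component as inapplicable (a requirement discussed under the audit scope below), in a structured way. The following is a minimal, hypothetical sketch of such a record; the field names, statuses, and example justification are illustrative assumptions, not regulatory text.

```python
# Illustrative sketch only: a minimal record for documenting each of the 17
# draft program components. The draft would require audits to justify any
# component treated as inapplicable; statuses and the example below are
# hypothetical, not regulatory text.
from dataclasses import dataclass

@dataclass
class ComponentRecord:
    name: str
    implemented: bool        # assessed as established and maintained?
    applicable: bool = True  # if False, a justification is mandatory
    justification: str = ""  # why existing safeguards provide equivalent security

    def validate(self) -> None:
        if not self.applicable and not self.justification:
            raise ValueError(f"{self.name}: inapplicable components require a justification")

records = [
    ComponentRecord("Multi-factor authentication", implemented=True),
    ComponentRecord(
        "Zero trust architecture",
        implemented=False,
        applicable=False,
        justification="Hypothetical: segmentation and access controls provide equivalent security.",
    ),
]
for record in records:
    record.validate()
```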

Annual Cybersecurity Audit Scope and Requirements: Each year, independent auditors would be required to conduct an assessment of the business’s cybersecurity program, including how it protects personal information from unauthorized access, destruction, use, modification, disclosure, or loss of availability from unauthorized acts. The audit would need to outline its scope and criteria, identify the specific evidence that the auditors examined, and explain why the evidence examined is appropriate for the business’s cybersecurity program in light of its size and complexity and justifies the auditor’s findings. Audit findings would need to be premised primarily upon specific evidence, not assertions from the business’s management. Under the draft regulations, the annual cybersecurity audits must also address the following:

  • Cybersecurity Program Establishment and Components: The audit would need to identify, assess, and document the business’s establishment, implementation, and maintenance of its cybersecurity program, including that it is appropriate to the business’s size and complexity and the nature and scope of its processing activities. Further, the audit would be required to assess, document, and summarize each component of the business’s cybersecurity program, including how each component is appropriate given the business’s size and complexity. If a component is not applicable to the business’s cybersecurity program, the audit must explain why the component is not necessary to protect personal information and how the business’s existing safeguards are sufficient to provide at least equivalent security.
  • Gaps and Weaknesses: The audit would be required to identify and describe the components’ gaps and weaknesses, as well as the business’s plans to address them. In subsequent assessments, auditors would be required to report on the status of previously identified gaps and weaknesses.
  • Breach Notifications: Audits would need to include samples or descriptions of breach notifications provided to certain agencies and consumers. If the business was required to notify any agency with jurisdiction over privacy laws or other data processing authority (in California or otherwise) of unauthorized access, destruction, use, modification, or disclosure of personal information or of unauthorized activity resulting in the loss of availability of personal information, the cybersecurity audit would include a sample or description of the notification and details of the activity giving rise to the notification. This includes related remediation measures taken by the business. Likewise, if the business provided notifications to affected consumers pursuant to relevant California law, the audit would be required to include a sample or description of those notifications.
  • Board Certification: A member of the business’s board or governing body would need to sign a statement included in the audit that certifies that the business did not influence or attempt to influence the auditor’s assessments and that the signer has reviewed and understands the audit findings. If the business does not have a board of directors or governing body, then the audit would be certified by the highest-ranking executive responsible for the cybersecurity program.
  • Audit Reporting to Board: Cybersecurity audits would need to be reported to the business’s board of directors or governing body, or if neither exists, the highest-ranking executive responsible for the cybersecurity program. The date that the cybersecurity program and evaluations of it were reported to the applicable governing body or executive would need to be included in the audit.

Auditor Requirements: The cybersecurity audits would need to be conducted by an internal or external auditor who is a qualified, objective, independent professional using procedures and standards generally accepted in the auditing profession. The auditor would be barred from participating in business activities they may assess, including developing, implementing, maintaining, or providing recommendations regarding the business’s cybersecurity program. If a business uses an internal auditor, the auditor would be specifically required to report issues regarding the cybersecurity audit directly to the business’s board of directors or governing body, rather than to business management with direct responsibility for the business’s cybersecurity program. If the business does not have a board or equivalent body, the internal auditor would report to the business’s highest-ranking executive who does not have direct responsibility for the cybersecurity program. That same board, governing body, or executive would also need to determine the internal auditor’s compensation and conduct the auditor’s performance evaluation.

Certification of Completion Requirement: Each year, businesses required to conduct cybersecurity audits would also need to certify completion of the audit to the CPPA through the agency’s website. The certification would need to be signed and dated by a member of the board or governing body, or if neither exists, the highest-ranking executive responsible for oversight of cybersecurity audit compliance. The certification must include a statement verifying that the signer has reviewed and understands the findings of the audit, along with the signer’s name and title.

Duplicative Audit Not Required: Businesses that have engaged in a cybersecurity audit, assessment, or evaluation that meets all of the final regulations’ requirements would not be required to complete a duplicative cybersecurity audit. They would, however, be required to explain how the previous audit meets all of the requirements set forth in the final regulations or supplement the previous audit to cover any additional requirements.

Timeline for Cybersecurity Audit Implementation: Businesses covered by the cybersecurity audit requirement would have 24 months from the effective date of the regulations to complete the first cybersecurity audit. Cybersecurity audits would be required annually thereafter.

II. Automated Decision-Making Technology Regulations

The draft regulations also include rules regarding the use of automated decision-making technology (ADMT) that apply to “businesses” subject to the CCPA (i.e., entities that determine the purposes of processing personal information, do business in California, and meet certain thresholds) and that use ADMT for covered purposes (“covered businesses”). The draft ADMT regulations would require covered businesses to provide pre-use notices that inform consumers about the business’s use of ADMT as well as offer consumers the rights to opt out of the use of ADMT and access information about how the business used ADMT with respect to that consumer. The draft regulations employ broad definitions and cover a broad array of uses, and in the July 16, 2024, CPPA Board meeting, some Board members criticized the breadth of the draft ADMT regulations as well as the onerous burdens they would impose on covered businesses. It remains to be seen whether and how the CPPA staff will address these concerns in the next draft. According to the CPPA’s preliminary economic assessment, the draft ADMT regulations would cover over 50,000 California businesses and cost them, in aggregate, a staggering $1.4 billion in the first year alone.

Two-Prong Test for Applicability: The draft regulations contemplate two threshold questions regarding ADMT for businesses covered under the CCPA to consider. First, is the business using ADMT? Second, if so, is that use a “covered use” under the regulations? Businesses would need to analyze both, as detailed below, to determine whether they must comply with the substantive ADMT requirements, such as providing pre-use notice, opt-out rights, and information access rights.

ADMT Defined: The draft regulations define ADMT to cover “any technology that processes personal information and uses computation to execute a decision, replace human decisionmaking, or substantially facilitate human decisionmaking.” The draft regulations also state that ADMT includes “profiling,” which is defined as “any form of automated processing of personal information to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning that natural person’s intelligence, ability, aptitude, performance at work, economic situation; health, including mental health; personal preferences, interests, reliability, predispositions, behavior, location, or movements.”

  • The technology must be used as a “key” factor, not merely a factor, in human decision-making. The draft regulations define “substantially facilitate” as using the output of the technology as a key factor in human decision-making; in other words, the technology must play a significant role in a human decision to be categorized as ADMT. However, this leaves open the question of what exactly constitutes a “key” factor. While the CPPA offers an example, in which a technology creates a consumer score that a human reviewer considers as the “primary factor” in making a decision, it is unclear what non-primary uses would still count as key factors. It also raises the question of how businesses would assess whether a human reviewer used the technology as a key factor.
  • Examples of excluded technologies. The draft regulations list technologies that do not constitute ADMT unless they are used to replace or act as a key factor in human decision-making: spreadsheets, web hosting, caching, calculators, and databases, among others. The draft regulations also illustrate uses of spreadsheets that do, and do not, constitute the use of ADMT. For example, running regression analyses using a spreadsheet on top-performing managers’ personal information to identify common characteristics, and subsequently using that data to find similar traits in junior employees for promotion decisions, qualifies as ADMT usage. In contrast, using a spreadsheet to input performance evaluation scores collected from employees’ managers and colleagues and calculate the final scores of candidates for a promotion decision does not, as the spreadsheet is used “merely to organize human decisionmakers.” Despite these examples, it remains unclear when the use of a spreadsheet or other listed technology crosses the line into ADMT, given the wide range of use cases for these tools.
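
To illustrate the regulators’ spreadsheet example, the following sketch performs the kind of ordinary least squares regression one could run in a spreadsheet, using entirely synthetic data; the features, scores, and scenario are hypothetical.

```python
# Illustrative sketch only: the draft's spreadsheet-regression example, on
# entirely synthetic data. Fitting a model to top-performing managers' traits
# and then scoring junior employees for promotion decisions is the kind of
# spreadsheet use the draft would treat as ADMT.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical manager features: [tenure_years, projects_led]
managers_X = rng.uniform([1, 1], [15, 30], size=(50, 2))
managers_y = (
    0.3 * managers_X[:, 0] + 0.1 * managers_X[:, 1] + rng.normal(0, 0.5, 50)
)  # synthetic "performance" score

# Ordinary least squares: the kind of regression one could run in a spreadsheet.
design = np.column_stack([managers_X, np.ones(len(managers_X))])
coef, *_ = np.linalg.lstsq(design, managers_y, rcond=None)

# Scoring junior employees with the fitted model, and letting those scores
# drive promotion decisions, would be a "key factor" use of the output.
juniors_X = np.array([[3.0, 4.0], [7.0, 12.0]])
scores = np.column_stack([juniors_X, np.ones(len(juniors_X))]) @ coef
print(scores)
```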

Covered Uses: Businesses that use ADMT for the following covered use cases must comply with the substantive ADMT requirements: 1) for a significant decision concerning a consumer, 2) for extensive profiling of a consumer, and 3) for training uses of ADMT.

  • For a significant decision concerning a consumer. Under the draft regulations, “a significant decision concerning a consumer” means “a decision that results in access to, or the provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services” (emphasis added). In contrast to other state privacy laws, the draft regulations cover decisions that result in “access” to the enumerated opportunities, not solely their “provision or denial.” In the CPPA Board meeting held on July 16, 2024, Board member Alastair Mactaggart criticized the ambiguity of “access to” and requested clarification.
  • For “extensive profiling” of a consumer. Extensive profiling means three types of profiling: 1) “work or educational profiling” (i.e., profiling a consumer through systematic observation when they are acting in their capacity as an applicant for work or education); 2) “public profiling” (i.e., profiling a consumer through systematic observation of a publicly accessible place); or 3) profiling a consumer for behavioral advertising (i.e., any targeting of advertising to a consumer based on the consumer’s personal information obtained from the consumer’s activity, whether across different services or within the business’s own services).
  • For training uses of ADMT. Training uses of ADMT refer to processing consumers’ personal information to train ADMT “that is capable of being used” for 1) a significant decision concerning a consumer; 2) establishing individual identity; 3) physical or biological identification or profiling; or 4) generating a deepfake. Notably, the threshold question is not whether businesses intend for, or even permit, the ADMT being trained to be used for such purposes, but whether that ADMT is capable of being used that way. It remains uncertain how businesses will evaluate these capabilities and to what extent they should consider any unforeseen capabilities.

Pre-Use Notice Requirements: The draft regulations would require CCPA-covered businesses that use ADMT for a covered purpose to provide a “pre-use notice” that informs consumers about the business’s use of ADMT and the consumer’s right to opt out and to access further information. Importantly, the notice would have to be provided before the business processes the consumer’s personal information using the ADMT. The pre-use notice must:

  • explain the ADMT’s purpose in “plain language” and not use “generic terms” such as “to improve our services”;
  • provide “[a] simple and easy-to-use method (e.g., a layered notice or hyperlink) by which the consumer can obtain additional information about the business’s use of the [ADMT]”; and
  • include plain language explanations of the technology’s logic, including the key parameters affecting the output of the ADMT, the intended output, and how the business plans to use that output.

The draft regulations also include a number of exceptions to the notice requirements. For example, a business relying on the “security, fraud prevention, and safety” exception to the opt-out requirement would not need to provide information that would compromise its use of ADMT for those purposes.

Opt-Out Rights: The draft regulations would give consumers the right to opt out of a business’s use of ADMT but also would include many exceptions to this right. Notably, the draft regulations provide no exception for using ADMT for profiling for behavioral advertising or training ADMT.

  • “Security, fraud prevention, and safety” exception. The draft regulations provide an exception to the opt-out requirements if the business’s use of the ADMT is necessary to achieve, and is used solely for, the following security, fraud prevention, and safety purposes: 1) preventing, detecting, and investigating security incidents; 2) resisting malicious, deceptive, fraudulent, or illegal actions directed at the business; or 3) ensuring the physical safety of natural persons.
  • Human appeal exception. If a business is using ADMT for any “significant decision concerning a consumer,” instead of providing consumers with an opt-out, the business may rely on the “human appeal” exception if it designates a qualified human reviewer, describes to the consumer how to submit an appeal, and enables the consumer to provide information for the human reviewer to consider as part of the appeal. During the July 16, 2024, CPPA Board meeting, Board member Alastair Mactaggart heavily criticized this structure, arguing that it imposes a considerable burden on businesses because they must either offer an opt-out or set up a process for a human reviewer to reconsider every decision. He suggested that the CPPA model its regulations after the Colorado Privacy Act’s flipped structure, which by default exempts businesses from offering an opt-out if a human being is meaningfully involved in a decision.
  • Education and work exception. The draft regulations would add exceptions for using ADMT for 1) admission, acceptance, or hiring decisions and 2) allocation/assignment of work and compensation decisions, as well as for “work or educational profiling,” if certain conditions are met. To rely on this exception, the ADMT would need to be necessary for, and used solely for, those purposes. Additionally, the business would need to have conducted an evaluation of the ADMT and implemented accuracy and nondiscrimination safeguards or, alternatively, if the business obtained the ADMT from another person, to have reviewed that person’s evaluation of the technology. These new exceptions reflect Board members’ feedback in the December 2023 CPPA Board meeting, during which some Board members raised concerns about the negative impact that opt-outs might have on business operations and performance monitoring and suggested striking opt-out rights for workers.
  • Opt-out methods. A business using ADMT for these covered purposes would have to provide at least two methods for submitting opt-out requests. Among other considerations, those methods would need to reflect the manner in which the business primarily interacts with the consumer and be easy for consumers to execute. Notably, a notification or tool regarding cookies, such as a cookie banner or cookie controls, would not (by itself) be an acceptable method for submitting opt-out requests. The rule would also require businesses to instruct service providers to honor the consumer’s opt-out. A business would be permitted to deny an opt-out request or require further identity verification when it suspects fraud.

Access Rights: Under the draft regulations, consumers would have the right to access information about a business’s use of ADMT for a covered purpose with respect to that consumer. In responding to the information access request and subject to certain exemptions, a business would have to provide the consumer with, among other things, a plain language explanation of: 1) the purpose for which the business used ADMT; 2) the output of the ADMT with respect to the consumer; 3) how the business used the output to make a decision with respect to the consumer; 4) how the ADMT worked with respect to the consumer; and 5) instructions for how the consumer can exercise their other CCPA rights. Despite the CCPA’s specific statutory directive that the CPPA establish exceptions for not disclosing trade secrets in response to consumer requests, the draft regulations once again fail to explain how these requirements are intended to interact with trade secret protections, which are likely to be highly relevant to this particular access right.

III. Risk Assessment Regulations

The draft regulations would bar businesses from processing personal information for activities whose risks to consumers’ privacy outweigh the benefits to the consumer, the business, other stakeholders, and the public. To make that determination, the draft regulations would require businesses to undergo detailed risk assessments before initiating high-risk processing activities and to report the results of those assessments annually to the CPPA. The CPPA’s preliminary economic assessment predicts that over 50,000 businesses in California could be required to undertake these risk assessments, collectively costing them over $354 million in the first year alone.

Applicability: The draft regulations would require businesses subject to the CCPA to assess whether their processing of consumers’ personal information presents a “significant risk” to consumers’ privacy prior to such processing. Processing that presents a significant risk to consumer privacy would specifically include:

  • “selling” or “sharing” personal information (as defined under the CCPA);
  • processing sensitive personal information (as defined by the draft regulations);
  • using ADMT for a significant decision concerning a consumer or for “extensive profiling” (covered above); and
  • processing consumers’ personal information to train ADMT or AI “that is capable of being used” for 1) a significant decision concerning a consumer; 2) establishing individual identity; 3) physical or biological identification or profiling; 4) generating a deepfake; or 5) the operation of generative models, such as large language models. Note that this is broader than the similar ADMT covered training uses discussed above in two ways: 1) it adds training for AI, which is defined in the draft regulations to mean “a machine-based system that infers, from the input it receives, how to generate outputs that can influence physical or virtual environments,” and 2) it adds training for ADMT or AI that is capable of being used for operating generative models.

The draft regulations also provide a number of examples that illustrate when a business must conduct a risk assessment (i.e., processing that presents a significant risk to consumers’ privacy). One example describes a rideshare provider that seeks to use ADMT to allocate rides and determine fares and bonuses for drivers. Because these uses of ADMT would be significant decisions concerning a consumer, and accordingly present a significant risk to consumers’ privacy, a risk assessment would be required. Another example covers a mobile dating application seeking to disclose its users’ precise geolocation, ethnicity, and medical information to its analytics service provider. In this instance, a risk assessment would be required because the business seeks to process consumers’ sensitive personal information, which presents a significant risk to consumers’ privacy. Businesses should familiarize themselves with the draft regulations’ list of processing activities that present a significant risk to privacy to ensure that they know when a risk assessment is required.

Risk Assessment Requirements: Per the CPPA’s Initial Statement of Reasons, the purpose of a risk assessment is to identify, assess, and mitigate privacy risks stemming from certain processing activities and determine whether those risks outweigh the benefits to consumers, the business, other stakeholders, and the public. Businesses would need to ensure that relevant internal and external stakeholders are involved in completing the assessment and document who reviewed and approved the assessment.

In the assessment, businesses would be required to identify the specific purpose(s) for processing, the minimum personal information necessary to achieve said purpose, and the categories of information to be processed. For uses of ADMT or AI, businesses would be required to identify actions taken to maintain the quality of personal information to be processed.

Operational Information to Be Provided: Businesses would also need to identify specific operational information relevant to the processing activity, including the following (a brief documentation sketch follows the list):

  • the planned method of collecting, using, disclosing, retaining, or otherwise processing personal information;
  • source of personal information;
  • retention period for each category of personal information and criteria used to make that determination;
  • relationship between the business and consumer, including whether the consumer interacts with the business (e.g., via a website) and the nature of the interaction (if any) (e.g., buying a product from the business);
  • approximate number of consumers whose personal information the business seeks to process;
  • disclosures the business has made (or plans to make) to the consumer about the processing, how disclosures were made, and what actions the business has taken to make disclosures specific, explicit, prominent, and clear to the consumer;
  • names or categories of the service providers, contractors, or third parties to whom the business discloses personal information, the purpose for disclosure, and what actions the business has taken or plans to take to inform consumers that these third parties are involved in the processing; and
  • the technology used for processing.
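
A business might capture these items in a single structured record per processing activity. The sketch below is a hypothetical illustration; the field names are paraphrases of the list above, not regulatory terms of art, and the example values borrow the rideshare and analytics scenarios described earlier.

```python
# Illustrative sketch only: one structured record per processing activity,
# capturing the operational items listed above. Field names are paraphrases
# of the draft's requirements; the example values borrow the hypothetical
# rideshare/analytics scenarios described earlier.
from dataclasses import dataclass

@dataclass
class OperationalInfo:
    processing_method: str           # planned method of processing
    sources: list[str]               # sources of the personal information
    retention: dict[str, str]        # category -> retention period and criteria
    consumer_relationship: str       # nature of interaction with the consumer
    approximate_consumers: int       # approximate number of consumers affected
    disclosures_to_consumers: str    # what was disclosed and how
    recipients: list[str]            # service providers, contractors, third parties
    technology_used: str             # technology used for the processing

assessment_record = OperationalInfo(
    processing_method="Collect in-app driver and rider data; disclose to analytics provider",
    sources=["consumer directly", "consumer's device"],
    retention={"precise geolocation": "30 days; retained only for ride matching"},
    consumer_relationship="App user requesting or providing rides",
    approximate_consumers=250_000,
    disclosures_to_consumers="In-app privacy notice presented at signup",
    recipients=["analytics service provider"],
    technology_used="ADMT used to allocate rides and set fares and bonuses",
)
print(assessment_record.approximate_consumers)
```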

Identifying Benefits: Businesses must identify the benefits to consumers, the business, other stakeholders, and the public. Notably, businesses must disclose whether they stand to profit monetarily from the processing activity and must share the amount of estimated profit if possible.

Identifying and Mitigating Risks: The draft regulations would require businesses to identify the negative impacts to consumers’ privacy associated with the processing activity, including the sources and causes of these negative impacts and any criteria used to make these determinations. Businesses would also need to identify the safeguards (e.g., security controls, encryption, and privacy-enhancing technologies) that they plan to implement to address identified negative impacts. The following are some examples of negative impacts that businesses may consider:

  • unauthorized access, destruction, use, modification, or disclosure of personal information;
  • discrimination on the basis of protected classes that would violate federal or state antidiscrimination law;
  • impairing consumers’ control over their personal information or coercing or compelling them into allowing the processing of their personal information; and
  • economic, physical, psychological, and reputational harms.

Finally, the risk assessment would need to clearly state which individuals reviewed the assessment, the date of review (and approval, if applicable), and whether the business will proceed with processing the personal information (i.e., whether the benefits outweigh the risks).

Special Considerations for Processing Personal Information with ADMT: The draft regulations include additional requirements for assessing processing activities that utilize ADMT or use personal information to train ADMT. In assessing ADMT-related processing, businesses would be required to:

  • state whether they evaluated the ADMT to ensure it works as intended and does not discriminate based upon protected classes;
  • identify the policies, procedures, and training that the business has implemented or plans to implement to ensure the ADMT works as intended and does not discriminate;
  • identify the logic of the ADMT, including any assumptions or limitations; and
  • identify the output of the ADMT and how the business will use it.

Businesses that have obtained ADMT from another person would need to specify whether they reviewed that person’s evaluation of the ADMT and whether it included any requirements or limitations. These businesses would also be required to state whether any accuracy or nondiscrimination safeguards have been or will be implemented.

The draft regulations would require businesses that process personal information to train ADMT and provide that ADMT to another business to provide all facts necessary for the “recipient-business” to conduct its own risk assessment, as well as a plain language explanation of any requirements or limitations it identifies as relevant to the permitted use of the ADMT.

Timing and Retention Requirements: Required risk assessments would need to take place before any processing is initiated. Businesses would be required to review their risk assessments at least once every three years to ensure they remain accurate, and update assessments immediately whenever there is a material change relating to the covered processing activity. The draft regulations explain that a change is material if it diminishes the benefits of the processing activity, creates new negative impacts or increases the magnitude or likelihood of previously identified negative impacts, or diminishes the effectiveness of the safeguards. Businesses would need to retain their risk assessments for as long as the activity continues, or for five years after completion of the risk assessment, whichever is later.
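
The “whichever is later” retention rule and the three-year review cadence reduce to simple date arithmetic. The following sketch illustrates both with purely hypothetical dates.

```python
# Illustrative sketch only: the draft's retention rule (keep each assessment
# for as long as the processing continues or for five years after completing
# the assessment, whichever is later) and the three-year review cadence.
# All dates below are hypothetical.
from datetime import date

def add_years(d: date, years: int) -> date:
    try:
        return d.replace(year=d.year + years)
    except ValueError:  # handle Feb 29 in non-leap target years
        return d.replace(year=d.year + years, day=28)

def retention_end(assessment_completed: date, processing_ended: date | None) -> date | None:
    five_years_out = add_years(assessment_completed, 5)
    if processing_ended is None:
        return None  # processing is ongoing, so retention continues
    return max(processing_ended, five_years_out)

def next_review_due(last_review: date) -> date:
    return add_years(last_review, 3)  # review at least once every three years

print(retention_end(date(2026, 1, 15), date(2027, 6, 1)))  # 2031-01-15 (five-year floor wins)
print(next_review_due(date(2026, 1, 15)))                  # 2029-01-15
```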

Avoiding Duplicative Assessments: The draft regulations permit businesses to conduct a single risk assessment for a set of comparable processing activities that present similar risks to consumer privacy. Similarly, businesses that have conducted risk assessments for compliance with other laws need not duplicate their efforts. If the other law does not meet all of the final CCPA regulations’ requirements, however, the business would be required to supplement the existing risk assessment with the additional required information.

Submission to the CPPA: If the draft regulations become effective, businesses would have 24 months to conduct and submit initial risk assessment materials to the CPPA, including for processing activities that are ongoing when the regulation takes effect. After the first submission, subsequent risk assessment materials would need to be submitted to the CPPA every calendar year through its website.

  • Certification: The submission would be required to include a certification, signed and dated by the business’s highest-ranking executive responsible for compliance with these regulations, stating that the risk assessments were conducted in accordance with the regulations. The certification would also need to identify the time period and number of assessments covered by the submission and include an attestation that the executive has reviewed, understood, and approved the included assessments and that no processing took place before the associated assessment was completed.
  • Abridged Risk Assessments: The submission would need to include the risk assessments conducted or updated during the submission period in an abridged form that addresses: the processing activity; categories of personal information and whether they are sensitive; and a plain language explanation of any safeguards in place. Notably, businesses would not be required to divulge information that would compromise their ability to handle security incidents, guard against fraud or illegal activity, prosecute their legal rights, or ensure the physical safety of natural persons. Additionally, businesses would be required to submit unabridged versions of their assessments to the CPPA or California Attorney General upon request within 10 days of such a request. The submission requirements in the draft regulations do not address how issues regarding trade secrets should be handled, despite the fact that the statutory authority upon which the CPPA is relying to issue the risk assessment regulations, Civ. Code § 1798.185(a)(15)(B), specifically states that nothing in that section shall require a business to divulge trade secrets.
  • Exemptions: Businesses would not be required to submit risk assessments for processing activities they do not implement, or when a review or update to an existing risk assessment does not result in any material changes to the abridged risk assessment already on file with the CPPA.

IV. Notable Updates to Existing Regulations

The draft regulations also propose notable updates to existing regulations, including:

Revenue Threshold to Qualify As a “Business”: The proposed regulations would increase the gross revenue threshold for when an entity qualifies as a “business” under the CCPA from $25 million to $27.975 million. Other monetary thresholds, including fines and penalties, would also increase to reflect increases in the Consumer Price Index, as is required under the CCPA.
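
The proposed figure implies roughly an 11.9 percent cumulative adjustment to the original $25 million threshold, as the back-of-the-envelope arithmetic below shows; the implied ratio is derived from the proposal itself, not from an official CPI series.

```python
# Illustrative arithmetic only: the implied cumulative CPI adjustment behind
# the proposed threshold. The ratio is back-derived from the proposal itself,
# not taken from an official CPI series.
original_threshold = 25_000_000
proposed_threshold = 27_975_000
implied_increase = proposed_threshold / original_threshold - 1
print(f"Implied cumulative CPI adjustment: {implied_increase:.3%}")  # 11.900%
```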

“Sensitive Personal Information” Definition: The draft regulations would expand the definition of “sensitive personal information” to include the personal information of consumers the business has actual knowledge are less than 16 years of age.

Denying Consumer Requests: When businesses deny consumers’ requests to exercise their rights to delete, correct, know, opt out of sale or sharing, or limit the use or disclosure of sensitive personal information, they would need to notify consumers that they can file a complaint with the CPPA and California Attorney General. To comply, businesses would need to update their processes for responding to consumer rights requests.

Right to Delete: Businesses, service providers, and contractors would be expected to implement measures to ensure that personal information that consumers have requested to delete “remains deleted, deidentified or aggregated,” even if the personal information is provided by third parties (e.g., data brokers). This appears to exceed the text of the CCPA, which gives consumers the right to request deletion for personal information “collected from the consumer,” not collected from third parties.

Right to Correct: Businesses that deny a consumer’s request to correct would be required to inform the consumer that, upon the consumer’s request, it will note both internally and to any person to whom it discloses, shares, or sells the personal information that the accuracy is contested by the consumer (unless the request is fraudulent or abusive). Similarly, if a business denies a consumer’s request to correct personal information collected and analyzed concerning a consumer’s health, it would need to, upon the consumer’s request, make available a written statement from the consumer to any person to whom it discloses, shares, or sells the personal information. Where the business is not the source of information that the consumer contends is inaccurate, the business would be required to provide the name of the source or inform the source that the information provided was incorrect and must be corrected. Finally, in response to requests to correct or to know, businesses would need to provide a way for the consumer to confirm that certain sensitive personal information (e.g., Social Security numbers, driver’s license numbers, account passwords, etc.) the business maintains is the same as what the consumer has provided.

Right to Know: If a business maintains personal information for longer than 12 months, then it would need to provide a way for consumers to request personal information collected more than 12 months earlier when they make a request to know. This could include, for example, offering consumers a date range for their request or an option to request all information the business has collected about them.

Symmetry in Choice: Businesses would not be able to obtain consumer consent where the “yes” option is more visually prominent than the “no” option, because this would not be a “symmetrical choice.” Setting off consent buttons in a contrasting color or shading is a common practice, and many businesses would likely need to update their user interfaces to comply with this change.

Opt-Outs on Connected Devices and AR / VR: Businesses would need to notify consumers of their right to opt out of sale or sharing and right to limit the use of sensitive personal information before a connected device (such as a smart television or smart watch) begins collecting personal information for these purposes. Similarly, businesses would need to provide these notices to consumers before they enter an augmented or virtual environment where their personal information would be collected for these uses.

Opt-Out Preference Signals: Businesses would be required to display on their websites whether they have processed a consumer’s opt-out preference signal as a valid request, for example via a phrase such as “Opt-Out Request Honored” or a toggle or radio button showing that the consumer has opted out of the sale or sharing of their personal information. Many businesses would need to update their websites’ user interfaces to comply.
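
For context, Global Privacy Control (GPC) is one widely used opt-out preference signal; participating browsers send it as a “Sec-GPC: 1” request header. The following is a minimal, hypothetical sketch of a server surfacing whether such a signal was detected and honored; the route, framework choice (Flask), and display text are illustrative assumptions, not anything prescribed by the draft regulations.

```python
# Illustrative sketch only: surfacing whether an opt-out preference signal was
# detected and honored. Global Privacy Control (GPC) is one common signal,
# sent by participating browsers as the "Sec-GPC: 1" request header. The
# route, framework (Flask), and display text are hypothetical choices, not
# anything prescribed by the draft regulations.
from flask import Flask, request

app = Flask(__name__)

@app.route("/privacy-status")
def privacy_status() -> str:
    if request.headers.get("Sec-GPC") == "1":
        # A real implementation would also record the opt-out and instruct
        # service providers to honor it, as the draft regulations describe.
        return "Opt-Out Request Honored"
    return "No opt-out preference signal detected"

if __name__ == "__main__":
    app.run()
```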

Security Exception to the Right to Limit Use of Sensitive Personal Information: Under the existing regulations, businesses are not required to offer consumers the ability to limit the use or disclosure of their sensitive personal information for certain purposes, including to “prevent, detect, and investigate security incidents.” The draft regulations explain that businesses can rely on this exception to scan employees’ outgoing emails to prevent them from leaking sensitive personal information, but that scanning employee emails for other purposes would not fall within the exception. This change seemingly narrows how businesses may rely on the exemption when implementing data loss prevention tools and is likely to lead to confusion and create security risks, particularly in environments where the use of such tools is advisable or required (e.g., for preventing leaks of sensitive commercial information).

V. Conclusion

As drafted, the CPPA regulations package contains an extensive set of detailed and complex requirements for cybersecurity programs and audits, the use of ADMT, privacy risk assessments, and more. Though the draft regulations will likely undergo more revisions, they are shaping up to significantly impact businesses both inside and outside of California. Businesses that are subject to the CCPA should pay close attention to these draft regulations and consider submitting comments to the CPPA when the regulations advance to formal rulemaking. Wilson Sonsini Goodrich & Rosati routinely helps companies navigate complex privacy and data security issues. For more information or advice concerning your CCPA compliance efforts, or preparing a comment regarding these draft regulations, please contact Eddie Holman, Maneesha Mithal, Tracy Shapiro, Erin Delaney, Yeji Kim, Boniface Echols, or any member of the firm’s data, privacy, and cybersecurity practice. For more information or advice concerning your compliance efforts related to ADMT or AI, please contact Scott McKinney, Eddie Holman, Maneesha Mithal, or any member of the firm’s artificial intelligence and machine learning working group.


[1] CPPA, Proposed Text of Regulations (July 2024), https://cppa.ca.gov/meetings/materials/20240716_item8_draft_text.pdf; CPPA, Draft Initial Statement of Reasons (July 2024), https://cppa.ca.gov/meetings/materials/20240716_item8_draft_omnibus_isor.pdf.