California Privacy Protection Agency Proposes Draft Rules for Automated Decision Making, Including Artificial Intelligence

By Eric Vicente Flores and Michael Stortz

Executive Summary: The California Privacy Protection Agency has proposed a new set of draft regulations aimed at the use of artificial intelligence and automated decision-making technology. These regulations will be discussed alongside two other sets of draft regulations the agency has previously proposed regarding risk assessments and cybersecurity audits. All three sets of draft regulations will be discussed at the agency’s meeting on 8 December.

On 27 November 2023, the California Privacy Protection Agency (CPPA) published draft regulations aimed at regulating automated decision-making technology (ADMT), including artificial intelligence (AI) applications. They join two other sets of draft regulations the CPPA has proposed regarding data risk assessments and cybersecurity audits. The ADMT regulations continue California’s forward-thinking attitude toward automated systems, reflected as well in Governor Newsom’s executive order on the ethical and responsible use of generative AI. Businesses that utilize ADMTs must consider the use of these technologies within the risk assessments mandated under the California Consumer Privacy Act, which we previously discussed on Cyber Law Watch.

The scope of the draft ADMT regulations is quite broad—they define ADMTs as any system or process that processes personal information and uses computations to make decisions or facilitate human decision-making. ADMTs also include profiling, which is defined as a form of automated processing of personal information used to evaluate an individual’s personal aspects in order to make analyses or predictions about the individual.

The current draft of the regulations will require businesses utilizing ADMTs to give consumers notice of their use of the technology and the right to opt out of certain uses of the technology. Consumers will be allowed to opt out of: decisions that significantly affect them; any profiling of a consumer acting as an employee, independent contractor, job applicant, or student; and profiling that occurs when a consumer is in a publicly accessible place. Certain uses of ADMTs will be exempt from consumers’ opt-out rights, including automated decisions used to prevent, detect, and investigate security incidents that compromise personal information, or to provide services requested by the consumer.

The ADMT regulations are of particular importance to businesses that employ California residents, as they specifically cover any decisions that significantly affect employment and contracting opportunities, compensation, and healthcare services. More broadly, consumers must be notified of decisions that significantly affect them, with that notification including an explanation of the decision and information on how a complaint can be filed with the CPPA and the Attorney General.

Special rules will apply to ADMT used with underage consumers. Consumers less than 13 years of age must opt in to the use of technologies used to profile for behavioral advertising. Parental or guardian consent must be verified, and this requirement is in addition to any verifiable consent required under the Children’s Online Privacy Protection Act. Parental consent is not required for consumers at least 13 years old but less than 16 years of age, though businesses will be required to establish an opt-in process for those consumers.

The risk assessment regulations and the cybersecurity audit regulations will be discussed alongside the ADMT regulations at the CPPA’s upcoming meeting on 8 December. The meeting will be used to facilitate public discussion and participation, but the regulations have yet to enter California’s formal rulemaking process.

Copyright © 2024, K&L Gates LLP. All Rights Reserved.