California Finalizes AI Hiring Rules: Key Takeaways For Employers
California is leading the way in regulating the use of AI and automation in employment decisions. New rules approved by the California Civil Rights Council aim to ensure these technologies align with existing anti-discrimination laws under the Fair Employment and Housing Act. If enacted, they could take effect as soon as July 1, 2025.

California has taken a historic step in regulating artificial intelligence and automated decision-making in the workplace. The California Civil Rights Council has voted to approve the final version of its Employment Regulations Regarding Automated-Decision Systems. These rules underscore California’s role as a national leader in shaping how AI can be used in employment decisions. The regulations are now under review by the Office of Administrative Law and, if approved, may take effect by July 1, 2025.
While California employers are no strangers to evolving compliance mandates, these rules require a renewed focus on how artificial intelligence, machine learning, and other forms of automation are used across the employment lifecycle. From resume screening to criminal history adjudication, many digital tools that influence employment decisions may now fall within the reach of California’s Fair Employment and Housing Act (FEHA). The rules don’t necessarily create new prohibitions, but they frame existing anti-discrimination protections in the context of emerging technologies.
What Is an Automated-Decision System?
The regulation defines an Automated-Decision System (ADS) as a computational process that makes a decision or facilitates human decision-making regarding an employment benefit. This includes systems that use AI, machine learning, algorithms, statistics, or similar data processing techniques.
The regulation outlines a broad range of use cases that qualify as ADS, including resume screening, applicant ranking, directing targeted job ads, analyzing facial expressions or tone in interviews, and evaluating third-party data. Tools that assess personality, aptitude, reaction time, or cultural fit are also included. While general-use technologies like spreadsheets, spellcheckers, and cybersecurity software are excluded, they lose that exemption when used to make or facilitate employment-related decisions.
The term "facilitates" is not defined, which may leave room for interpretation. For example, a tool that flags inconsistencies in an applicant’s stated education may influence a hiring decision, even if it doesn’t reject the candidate outright. Employers must assess whether such tools support decisions in ways that trigger compliance obligations.
Clarifying “Agent” and “Employment Agency”
The regulation clarifies that agents acting on behalf of an employer are subject to the same obligations under FEHA. An “agent” includes any person acting on behalf of an employer to perform a function traditionally exercised by the employer, including through the use of an ADS. This could include third-party vendors involved in recruitment, promotion, pay decisions, or other personnel functions.
Because most employers do not traditionally conduct background checks in-house, it’s possible that background screening vendors fall outside the definition of an agent. However, employers should evaluate whether such services influence employment decisions in ways that could be deemed facilitative. Even if not agents, background screening providers are unlikely to meet the definition of an “employment agency,” which is now defined as any entity that, for compensation, procures job applicants, employees, or work opportunities, including through the use of ADS.
Impact on Criminal History Screening
The final regulation reinforces California’s Fair Chance Act. Employers may not use an ADS to consider an applicant’s criminal record until after making a conditional offer of employment. Any decision to withdraw that offer must be based on an individualized assessment, regardless of whether it is made by a human or through an automated tool.
Earlier versions of the rule would have required employers to provide a copy of any ADS-generated report used in the assessment. While this requirement was removed from the final text, employers should still be prepared to explain how an ADS influenced a hiring decision and ensure compliance with procedural safeguards.
Recordkeeping and Anti-Bias Testing
The regulation significantly expands recordkeeping requirements. Employers must retain personnel records and ADS data, including inputs, outputs, and data used to develop or customize the system, for at least four years. This includes data about individual applicants or employees and information used to support employment decisions.
The regulation does not mandate anti-bias testing. However, it notes that evidence, or the lack of evidence, of testing or similar efforts will be relevant in determining liability. Employers who conduct and document regular testing may be better positioned to defend against claims of discrimination.
Vendor and Third-Party Considerations
While the regulation expands the definition of employer to include agents, it does not directly impose liability on third-party vendors or developers of ADS tools. Still, employers may be liable for discrimination resulting from the use of a vendor’s tool.
Employers should take steps to understand how ADS tools are built, trained, and used. This includes obtaining documentation on the purpose, design, and testing of the system and clearly defining the vendor’s responsibilities through contractual terms. If a tool makes or facilitates employment decisions, employers must be able to demonstrate that its use complies with California law.
What Employers Should Do Now
To prepare for the regulation’s likely effective date, employers should:
Inventory all tools used in hiring, promotion, compensation, and other employment decisions.
Assess whether those tools meet the definition of ADS.
Review vendor relationships to determine whether service providers qualify as agents and whether their tools create legal exposure.
Develop an internal governance program that includes documentation, anti-bias testing protocols, and a recordkeeping process that meets the four-year requirement.
Evaluate high-risk use cases, including tools that rely on personality assessments, facial recognition, or criminal history scoring.
Parting Thoughts
California’s ADS regulations establish a new framework for regulating AI and algorithmic tools in employment. The burden now falls on employers to evaluate how their systems work, ensure those systems don’t discriminate, and preserve evidence to support that conclusion.
These rules are part of a broader national trend to regulate artificial intelligence, but they offer the most specific guidance to date on what compliance looks like in practice. Employers that prepare now will be better positioned to meet California’s expectations and respond to future regulatory developments across the country.
Release Date: June 11, 2025

Alonzo Martinez
Alonzo Martinez is Associate General Counsel at HireRight, where he supports the company’s compliance, legal research, and thought leadership initiatives in the background screening industry. As a senior contributor at Forbes, Alonzo writes on employment legislation, criminal history reform, pay equity, AI discrimination laws, and the impact of legalized cannabis on employers. Recognized as an industry influencer, he shares insights through his weekly video updates, media appearances, podcasts, and HireRight's compliance webinar series. Alonzo's commitment to advancing industry knowledge ensures HireRight remains at the forefront of creating actionable compliance content.