
You’re Hired! Navigating the Use of Artificial Intelligence in Recruiting and Hiring Practices

On Behalf of Berenzweig Leonard, LLP | July 6, 2023 | Employment & Labor Law

The EEOC recently released valuable guidance for employers on the use of Artificial Intelligence (AI) during the hiring process. The EEOC guidance document, titled “Assessing Adverse Impact in Software, Algorithms, and Artificial Intelligence Used in Employment Selection Procedures Under Title VII of the Civil Rights Act of 1964,”[1] outlines various steps an employer can take to avoid disparate impact discrimination when utilizing AI in employment decisions. Disparate impact refers to policies or processes that are not discriminatory on their face but, when put into practice, have a discriminatory effect on protected employees. For example, in 2022, the EEOC brought an enforcement action against iTutorGroup, Inc. because the company discriminated against candidates in violation of the Age Discrimination in Employment Act (ADEA) when its AI hiring platform rejected applicants whose birth dates indicated they were over a certain age.[2]

However, not all bias and disparate impact caused by AI technologies is so blatant. In most cases, the risk is far less obvious and easy to miss if employers are not specifically looking for it. Amazon ultimately decided to scrap a hiring algorithm when it discovered the program inadvertently undervalued applications submitted by women, even when gender was not explicitly identified.[3] Even when completely unintentional, AI can lead to discrimination and disparate impact claims against employers. Fortunately, the EEOC’s recommendations provide useful guidance for navigating these hurdles and ensuring that employers comply with federal law under Title VII when incorporating AI into hiring practices.

A prominent rule-of-thumb referenced in the EEOC’s guidance document is the four-fifths rule. The four-fifths rule, or 80% rule, determines whether the selection rate for one group is “substantially” different than the selection rate for another group. In other words, the selection rate of a protected group of applicants should be at least 80% of the selection rate for a non-protected group of applicants. It is important to note, however, that compliance with the four-fifths rule does not guarantee that a particular employment procedure does not have an adverse impact for purposes of Title VII. Nevertheless, it is a “practical and easy-to-administer” test that may be used to draw a preliminary inference that the selection rates for two groups may be substantially different and prompts employers to inquire further about the hiring procedure in question.[4] 
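The four-fifths comparison described above can be illustrated in a few lines of Python. This is a simplified sketch of the rule of thumb only, not an EEOC-sanctioned tool, and the applicant and selection counts below are hypothetical:

```python
# Illustrative sketch of the EEOC's four-fifths (80%) rule of thumb.
# The group labels and counts below are invented for demonstration only.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_ratio(rate_a: float, rate_b: float) -> float:
    """Ratio of the lower selection rate to the higher one."""
    lower, higher = sorted([rate_a, rate_b])
    return lower / higher

# Hypothetical example: group A has 48 of 80 applicants selected (60%);
# group B has 12 of 40 applicants selected (30%).
rate_a = selection_rate(48, 80)            # 0.60
rate_b = selection_rate(12, 40)            # 0.30
ratio = four_fifths_ratio(rate_a, rate_b)  # 0.30 / 0.60 = 0.50

# A ratio below 0.80 suggests the selection rates may be "substantially"
# different, prompting further inquiry into the procedure in question.
flagged = ratio < 0.80
print(f"ratio = {ratio:.2f}, flagged for further review: {flagged}")
```

In this hypothetical, the ratio of 0.50 falls well below the 80% threshold, so the procedure would warrant further inquiry. As the EEOC cautions, passing this test does not by itself establish the absence of adverse impact under Title VII.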

Employers must recognize that, although AI cannot completely replace human-centered hiring and recruitment practices, it can assist in them. Here are a few tips from the EEOC that may be useful in taking a hands-on approach to auditing AI usage in your business:

  • Ensure automated hiring processes are subject to consistent review. Check that the systems in place are not reflecting existing implicit biases of individuals building and maintaining the tools.
  • Invest in due diligence around AI tools and inquire about testing and audit practices with a focus on disparate impact.
  • Engage in implicit and explicit bias training to identify potentially discriminatory hiring practices.

The recent EEOC guidance is non-binding; however, it signals the potential liability an employer may face for disparate impact claims arising from the use and implementation of AI software in the workplace. The EEOC’s overarching standards for liability, coupled with the rapid pace of AI development, call for employers to be vigilant when employing AI software in their hiring practices. Please contact us if you have any questions about how using AI in the hiring process could impact your business.

Isabel Wadsworth is a law clerk at Berenzweig Leonard LLP and will soon begin her third year at the University of Richmond Law School.


[1] https://www.eeoc.gov/select-issues-assessing-adverse-impact-software-algorithms-and-artificial-intelligence-used

[2] https://www.eeoc.gov/newsroom/eeoc-sues-itutorgroup-age-discrimination

[3] https://www.aclu.org/news/womens-rights/why-amazons-automated-hiring-tool-discriminated-against

[4] https://www.eeoc.gov/select-issues-assessing-adverse-impact-software-algorithms-and-artificial-intelligence-used