As more organizations embrace artificial intelligence (AI), the EEOC and U.S. Department of Justice (DOJ) have become increasingly attentive to the discriminatory risks associated with these tools.
In particular, the EEOC published a technical assistance document, and the DOJ published guidance, laying out the issues with AI. Employers need to be particularly mindful, as a number of companies will try to sell AI-embedded recruiting solutions, and these solutions will likely be reasonably priced. As attractive as these tools may appear, they may lead to discriminatory actions against applicants.
The agencies especially want employers to ensure any new AI tools do not discriminate against applicants with disabilities. Why start with individuals with disabilities? Because some AI tools can effectively screen them out. For example, a chatbot may be inaccessible to applicants who rely on screen-reader software like JAWS to read its questions and responses, and video-interview tools that attempt to identify an applicant's thoughts and interests through facial expressions may misread candidates with certain disabilities.
In particular, the EEOC identifies three ways that AI could discriminate against the applicant who is disabled:
- “The employer does not provide a ‘reasonable accommodation’ that is necessary for a job applicant or employee to be rated fairly and accurately by the algorithm.
- The employer relies on an algorithmic decision-making tool that intentionally or unintentionally ‘screens out’ an individual with a disability, even though that individual is able to do the job with a reasonable accommodation. ‘Screen out’ occurs when a disability prevents a job applicant or employee from meeting—or lowers their performance on—a selection criterion, and the applicant or employee loses a job opportunity as a result. A disability could have this effect by, for example, reducing the accuracy of the assessment, creating special circumstances that have not been taken into account, or preventing the individual from participating in the assessment altogether.
- The employer adopts an algorithmic decision-making tool for use with its job applicants or employees that violates the ADA’s restrictions on disability-related inquiries and medical examinations.”
More specifically, the EEOC reminds employers that they must provide a reasonable accommodation to applicants when AI tools are being used. Importantly, the EEOC also requires employers to listen to applicants' requests: applicants do not have to use specific words to request an accommodation, but could simply state, for example, that they have a medical condition that could cause them issues using the employer's AI tools. In this situation the employer can still ask for medical documentation to support the applicant's statement, but once it is provided, the employer must provide a reasonable accommodation as long as there is no undue hardship.
When vetting AI tools, employers should review what kind of workarounds are available for applicants with dexterity or low-vision limitations, or other conditions affecting abilities the AI tool may be assessing. The EEOC states that if an AI tool screens out an applicant, it is unlawful “if the individual who is screened out is able to perform the essential functions of the job with a reasonable accommodation.” One example the EEOC provides is:
“An example of screen out might involve a chatbot, which is software designed to engage in communications online and through texts and emails. A chatbot might be programmed with a simple algorithm that rejects all applicants who, during the course of their ‘conversation’ with the chatbot, indicate that they have significant gaps in their employment history. If a particular applicant had a gap in employment, and if the gap had been caused by a disability (for example, if the individual needed to stop working to undergo treatment), then the chatbot may function to screen out that person because of the disability.”
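The screen-out mechanism the EEOC describes can be illustrated with a minimal sketch (the threshold, field names, and function are hypothetical, not drawn from any actual vendor product): a hard-coded rule on employment gaps rejects an applicant whose gap was caused by a disability, without ever assessing the applicant's ability to perform the job.

```python
# Hypothetical sketch of the chatbot rule the EEOC describes:
# a hard cutoff on employment-history gaps "screens out" applicants
# regardless of whether the gap was disability-related.

MAX_GAP_MONTHS = 6  # illustrative threshold, not from any real product


def chatbot_screen(applicant: dict) -> bool:
    """Return True if the applicant advances; False if screened out."""
    return applicant["employment_gap_months"] <= MAX_GAP_MONTHS


# An applicant whose 12-month gap was for medical treatment is rejected,
# even though they may be able to perform the job's essential functions.
applicant = {"name": "A. Candidate", "employment_gap_months": 12}
print(chatbot_screen(applicant))  # False: screened out by the gap rule
```

The point of the sketch is that the rule never considers *why* the gap exists, which is exactly how a facially neutral criterion can operate as a disability-based screen-out.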
The issue of employment gaps as a hiring hurdle and potential EEO issue has been growing since the pandemic. The EEOC describes another example involving
“video interviewing software that analyzes applicants’ speech patterns in order to reach conclusions about their ability to solve problems is not likely to score an applicant fairly if the applicant has a speech impediment that causes significant differences in speech patterns. If such an applicant is rejected because the applicant’s speech impediment resulted in a low or unacceptable rating, the applicant may effectively have been screened out because of the speech impediment.”
This issue would affect not only applicants with speech-related disabilities but also those with accents, which could give rise to national origin discrimination. The DOJ guidance largely tracks the EEOC's technical assistance.
This issue cannot be ignored. EEOC Commissioner Keith Sonderling has been raising the issue of AI discrimination for the past two years. However appealing and cost-effective these AI tools may seem, make sure to review them and ask the vendor about accommodations and workarounds for individuals with disabilities.
Source: Law360 5/13/22, Phelps Dunbar LLP 5/12/22