
The Hidden Legal Risks of AI in Hiring

Companies strive to do things better, cheaper, and faster, but sometimes it is better for business to take a step back and look at the bigger picture. Take the time to consider what unintended consequences could arise when putting policies and practices into place. Many employment decisions become legal decisions, yet how many employers view their decisions that way when enacting or updating their policies and procedures?

Everywhere we look, AI is there, and it can be a good tool for increasing productivity. However, using AI in hiring practices is an area ripe with opportunities for those unintended consequences. Look at your recruiting process. Are you relying heavily on AI to do the work for you? Are you using AI to screen resumes?

What about your background checks? Do you have AI ranking or scoring applicants based on the results of an employment background check? Using AI to automate these systems while neglecting human review of the information could mean that those AI tools “may be functioning as consumer reporting agencies under federal law.”

Employers are most likely not looking at AI as a third-party resource and therefore are not considering the Fair Credit Reporting Act (FCRA) requirements for disclosure and authorization forms, accuracy, and pre-adverse/adverse action processes. There are no AI exemptions in the FCRA or in California law.

Just this past January, a lawsuit was filed in California against the AI recruiting platform Eightfold AI Inc. for violating the FCRA. At issue is how Eightfold uses hidden AI technology to collect “sensitive and often inaccurate” information, then scores applicants on a scale of 0–5 designed to inform employers of each applicant’s “likelihood of success” at the position for which they applied.

The Eightfold technology collects data from places like social media profiles, internet use and searches, and tracking cookies. That information is then used to build a profile of characteristics for each applicant beyond what is listed on the application. The software runs in the background, so applicants are not aware this is happening, and they are not given the chance to obtain a copy of the profile and dispute any information in the report they believe is inaccurate.

Accuracy problems include information attributed to an applicant that actually belongs to someone else; think of how many common names you see on resumes and applications. California, Michigan, and many other jurisdictions have enacted expungement laws, yet once something is out on the internet it exists forever. AI can pick up records that an employer should not have access to or consider when making employment decisions.

While using AI can help companies to be more efficient, relying on AI to review, score, and filter applicants without any human interaction in the process puts companies at risk for violating federal, state, and local laws.

Sources: news.bloomberglaw.com; reuters.com; aboutlaw.com
