New York City Enforcement of AI Bias Audit Law Starts July 5th

By Anthony Kaylin, American Society of Employers

New York City is deferring enforcement of its first-in-the-nation regulation of AI-driven hiring tools (Local Law 144 of 2021), which was initially slated to go into effect on January 1, 2023. The law is technically in effect now, but enforcement has been deferred to July 5, 2023. The city’s Department of Consumer and Worker Protection announced the delay in an update accompanying the publication of a final rule implementing Local Law 144.

The law requires employers to audit automated employment decision tools for bias and to notify candidates about their use. Michigan employers should take note: if the legislature goes down this path, it could serve as a model law for Michigan.

NYC’s law defines an “automated employment decision tool,” or AEDT, as “any computational process, derived from machine learning, statistical modeling, data analytics or artificial intelligence, that issues simplified output, including a score, classification or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” 

The rule provides further clarity around the phrase “substantially assist or replace discretionary decision making,” which covers tools that:

  • Rely solely on a simplified output, such as a score, tag, classification, or ranking, with no other factors considered.
  • Use a simplified output as one of a set of criteria where the simplified output is weighted more than any other criterion in the set.
  • Use a simplified output to overrule conclusions derived from other factors, including human decision making.

A separate segment of the rule adds two criteria that define machine learning, statistical modeling, data analytics, and AI tools as “mathematical, computer-based techniques” that generate a prediction — i.e., an expected outcome for an observation — or that generate a classification — i.e., an assignment of an observation to a particular group.

For example, an AEDT could predict a job candidate’s fit or likelihood of success, or it could assign a classification to candidates or groups of candidates based on their skill sets or aptitudes, per the rule. In addition, a computer must, at least in part, identify or assign weight to the inputs and other parameters that improve the accuracy of the tool’s predictions or classifications.

Employers are not prohibited from using AEDTs but must audit their use of the tools for bias against any protected category. Essentially, as with other testing products, employers will likely have to run an adverse impact analysis to determine whether there is a statistically significant disparity (two or more standard deviations, with a shortfall in selections) for any race or gender group. A minimal sketch of such a test appears below.
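As an illustration only (neither the law nor the rule prescribes a particular statistical test), a common way to run this check is the two-standard-deviation test, which compares each group’s actual selections against the number expected at the pooled selection rate. The group names and counts below are hypothetical:

```python
# Illustrative two-standard-deviation test for adverse impact.
# A Z value at or below -2 for a group indicates a selection shortfall
# of two or more standard deviations, i.e., a statistically significant
# disparity under the conventional threshold.
from math import sqrt

def two_sd_test(selected: dict[str, int], applicants: dict[str, int]) -> dict[str, float]:
    total_sel = sum(selected.values())
    total_app = sum(applicants.values())
    pooled = total_sel / total_app  # overall selection rate
    z = {}
    for group, n in applicants.items():
        expected = n * pooled
        # Hypergeometric standard deviation of the group's selection count
        sd = sqrt(n * pooled * (1 - pooled) * (total_app - n) / (total_app - 1))
        z[group] = (selected[group] - expected) / sd
    return z

# Hypothetical applicant-flow data
applicants = {"Group A": 400, "Group B": 250}
selected = {"Group A": 120, "Group B": 45}
print(two_sd_test(selected, applicants))
# Group B comes out near -3.4 standard deviations: a shortfall
# that would warrant closer review.
```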

This requirement assumes that employers are collecting applicant demographics as a matter of course. However, as more and more applicants decline to report race, there is a question of how that affects the audit results. Employers will also have to report on unknowns as a separate category.

If there is a scoring aspect to the AEDT, the audit must calculate:

  • The median score for the full sample of applicants
  • The scoring rate for individuals in each category
  • The impact ratio for each category, and
  • The number of individuals not included in these calculations because they fall into an unknown category.

A category may be excluded from the analysis if it represents less than 2% of the data, but a justification for the exclusion must be provided. It is unclear what that justification should look like. A rough sketch of these calculations follows.
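For illustration, this sketch takes the “scoring rate” to be the share of a category scoring above the full-sample median and the “impact ratio” to be each category’s scoring rate divided by the highest category’s rate, consistent with the final rule’s definitions. The data and category names below are hypothetical:

```python
# Sketch of the audit metrics for a scoring AEDT: sample median,
# per-category scoring rates, impact ratios, small-category flags,
# and the count of individuals in the unknown category.
from statistics import median

def bias_audit(scores_by_category: dict[str, list[float]], unknown_key: str = "unknown"):
    known = {c: s for c, s in scores_by_category.items() if c != unknown_key}
    all_scores = [s for scores in known.values() for s in scores]
    sample_median = median(all_scores)  # median score for the full sample
    total = len(all_scores)

    # Scoring rate: share of each category scoring above the sample median
    rates = {c: sum(s > sample_median for s in scores) / len(scores)
             for c, scores in known.items()}

    # Impact ratio: each category's rate relative to the best-scoring category
    best = max(rates.values())
    impact_ratios = {c: r / best for c, r in rates.items()}

    # Categories under 2% of the data may be excluded, with justification
    small = [c for c, scores in known.items() if len(scores) / total < 0.02]

    # Individuals excluded because their category is unknown
    excluded_unknown = len(scores_by_category.get(unknown_key, []))
    return sample_median, rates, impact_ratios, small, excluded_unknown

scores = {
    "female": [55, 61, 70, 48, 90],
    "male": [65, 72, 80, 59, 95],
    "unknown": [50, 66],
}
print(bias_audit(scores))
# Impact ratios well below 0.8 are a conventional (four-fifths rule)
# signal that a category may be adversely affected.
```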

Data for bias audits is based on the AEDT’s historical data. The data may be collected from one or more employers or employment agencies that use the same AEDT, but an individual employer or agency may rely on such an audit only if it provided historical data from its own use of the AEDT to the independent auditor conducting the audit, or if it has never used the AEDT.

If there is insufficient historical data to conduct an audit, an employer may rely on an audit that uses test data.

Employers and agencies must publicly disclose the date of the most recent bias audit as well as a summary of its results. These results should be posted on the employer’s career site.

Employers must also notify candidates and employees at least 10 business days before using an AEDT. The notice must include the job qualifications and characteristics that the AEDT will use to assess individuals. Candidates must be permitted to request an alternative selection process or accommodation if one is available.

Employers also should consider obtaining consents from applicants and employees regarding the use of AI because use of the tech may give rise to disparate impact claims under federal and state laws, says Nicholas Pappas, partner at Dorsey & Whitney.

When using an AEDT, employers should be prepared to conduct this testing. The vendor may provide the tools, but more likely will not. Because the underlying data is arguably subject to privilege and privacy concerns, employers could engage AAP consultants who routinely conduct these analyses, or law firms that may conduct them.

Although employers outside NYC may choose to say this law does not impact them, if successful it is likely to become the benchmark and model law used throughout the country, including in Michigan in today’s environment.


Sources: HR Dive 4/11/23; Littler 12/12/22; Morgan Lewis 1/9/23
