Beware the Pitfalls of Automated Technology in Employment

October 31, 2022

The increased use of automated software, algorithms, and artificial intelligence in hiring and employment has garnered the attention of government agencies and state legislatures seeking to guard against potential discrimination. 

The Equal Employment Opportunity Commission (“EEOC”) and the Department of Justice (“DOJ”) issued guidance that employers must consider before using automated tools.  The guidance focuses on disability discrimination under the Americans with Disabilities Act of 1990 (“ADA”), but employers should consider that other federal, state, and local anti-discrimination laws might apply to computer-assisted employment practices.  Several states and localities have enacted laws, or have legislation pending, that specifically address the use of automated tools in hiring and employment.

LAWS AND REGULATIONS TO CONSIDER

EEOC Guidance

The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees (the “EEOC Guidance”).

In the fall of 2021, the EEOC launched an initiative to analyze employers’ use of artificial intelligence and algorithmic decision-making tools.  As a result of that analysis, the EEOC issued the EEOC Guidance, focusing on the use of these tools in the context of the ADA.  The EEOC Guidance explains how using these technologies might violate the ADA and provides several “Promising Practices” to avoid violations. 

Types of Automated Computer Processes Identified by the EEOC Guidance:

  • Software – Programs that provide instructions to a computer on how to perform a task (e.g., resume-screening software, chatbot software, video interviewing software, employee monitoring software)
  • Algorithms – Sets of instructions computers follow to accomplish a task (e.g., formulas that evaluate, rate, and make decisions about applicants and employees)
  • Artificial Intelligence – Machine-based systems that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions (e.g., machine learning, computer vision, and language processing programs)

EEOC’s Identified Ways to Violate the ADA with Automation:

The EEOC identified several ways the use of automated tools may violate the ADA, including using technology that fails to offer an accommodation process, screens out individuals with disabilities, or makes impermissible medical inquiries.  The EEOC Guidance focuses on automated processes that may disadvantage a disabled applicant or employee in some way.  To avoid such a result, the EEOC offered “Promising Practices” for employers.

EEOC’s “Promising Practices:”

  • All algorithms must start with the right data
  • Monitor the results to assess for discriminatory outcomes
  • Inform applicants / employees about the use of automated processes
  • Provide clear instructions for accommodation process
  • Remain open to all requests for reasonable accommodation
  • Limit assessment to job-related criteria
  • Train staff to recognize requests for an accommodation

DOJ Guidance: 

Algorithms, Artificial Intelligence, and Disability Discrimination in Hiring (the “DOJ Guidance”)

The DOJ Guidance addresses the use of algorithms and artificial intelligence by state and local government employers.  Like the EEOC Guidance, the DOJ Guidance cautions about potential ADA violations, reminds that automated processes must evaluate job skills (not disabilities), and identifies promising practices to avoid discrimination.

State and Municipal Laws:

Illinois, Maryland, and New York City have enacted laws regulating the use of software, algorithms, and artificial intelligence in employment.  While each requires some form of advance notice to applicants of the tools in use, the laws take varying approaches to the issue. 

Illinois – Artificial Intelligence Video Interview Act (enacted 2019)

820 ILCS 42

  • Regulates the use of artificial intelligence to analyze video interviews
  • Applicants must be notified in advance of use of the tool and what characteristics it will evaluate
  • Limits the distribution and use of the video
  • Requires that the video be deleted within 30 days of a request from the applicant
  • Employers relying only on AI video analysis to select applicants for in-person interviews must annually collect and report data about how the technology filters applicants

Maryland – Rules for use of facial recognition technology (enacted 2020)

Md. Code, Lab. & Empl. § 3-717

  • Prohibits the use of facial recognition technology during pre-employment job interviews unless applicants are informed of the use of the technology and sign a written consent and waiver in advance

New York City – Rules for “Automated Employment Decision Tools” (goes into effect January 1, 2023)

Amendment to NYC Admin. Code § 20-870 et seq.

Prohibits use of “automated employment decision tool” unless:

  • Tool undergoes annual bias audit by an independent auditor
  • Audit results are posted on the employer’s website
  • 10 or more days before use of the tool, all applicants must receive notice regarding:
      • The tool to be used;
      • Job qualifications to be assessed; and
      • Data to be collected

POSSIBLE FUTURE LAWS

Legislatures in more than 15 other states and localities have proposed or debated bills to regulate the use of automation in employment.  Of note, two bills proposed in New York and one in the District of Columbia would establish civil penalties and create possible private causes of action against employers for the discriminatory use of automated tools.  Proposed laws in California, New York, and the District of Columbia also would require advance notice, public disclosures to government agencies concerning the use of automated tools, and mandatory annual bias audits with results made publicly available.

TAKEAWAYS

Employers should consider all applicable rules (federal, state, and local) before automating any employment process.  Best practices dictate the development of written policies governing the use of automation and the maintenance of detailed records of outcomes.  Automated tools should be (1) designed to test for abilities necessary for the advertised position, (2) developed with individuals with disabilities in mind, and (3) audited regularly for biased outcomes.  Employers should consider using outside vendors for these assessments if the company lacks the know-how to perform an adequate audit.  Employers also should regularly monitor state and local laws across the country because this area of the law is developing quickly.

This Labor & Employment Alert is intended to keep readers current on developments in the law. It is not intended to be legal advice. If you have any questions, please contact Annemarie DiNardo Cleary (acleary@eckertseamans.com) at 804.778.7768, a member of Eckert Seamans’ Labor & Employment team, or any other attorney at Eckert Seamans with whom you have been working.