The EEOC recently released long-awaited guidance on how the Commission will treat an employer’s use of software, algorithms, and artificial intelligence under the Americans with Disabilities Act (ADA). The guidance focuses on employers that use software relying on algorithmic decision-making or artificial intelligence to make employment decisions, both before and during employment. The Commission outlines three general areas in which the use of such technology may violate the ADA: (1) failing to provide the reasonable accommodations necessary for an applicant or employee to be evaluated fairly and accurately by the technology; (2) using the technology to “screen out” job applicants; and (3) using the technology to conduct disability-related inquiries and medical examinations.
Reasonable accommodations. Generally speaking, under the ADA, an employer must provide a candidate or employee with reasonable accommodations that would enable the individual to perform the essential functions of the job. On this front, the EEOC flagged an employer’s use of algorithmic decision-making tools to assess job applicants or employees as a potential area of concern under the ADA. If a candidate or employee explains to the employer that a medical condition would make it difficult to use the software, the individual has requested a reasonable accommodation, which requires the employer to engage in the interactive process and find a suitable accommodation for the individual. For example, where documentation shows that a disability could make a test more difficult to take or reduce the accuracy of the assessment, the employer may be required, as a reasonable accommodation, to provide an alternate test format or another type of assessment that more accurately judges the skills of the candidate or employee. Under the Commission’s guidance, an employer may be held liable under the ADA for the actions of software vendors acting on the employer’s behalf.
“Screening out” individuals. Another scenario in which the ADA could apply to software using algorithmic decision-making or artificial intelligence is when the technology “screens out” individuals. “Screening out” occurs when a disability prevents a candidate or employee from meeting a selection criterion, or lowers their performance on it, and the candidate or employee loses a job opportunity as a result. For example, “screening out” can occur when a chatbot programmed to reject all candidates with a gap in work history excludes a candidate whose employment gap is due to a disability. In warning against such practices, the EEOC explicitly cautions employers against relying on “bias-free” software as a shield, because such software may not be designed to detect discrimination under the ADA.
Disability-related and medical inquiries. Finally, the EEOC focused on the circumstances in which a candidate has not yet received a job offer. At this point in the employment lifecycle, an employer may violate the ADA by using software that seeks information about an applicant’s physical or mental impairments or health, a practice generally prohibited by the ADA.
All of this raises the question of what employers should be doing when using software to make employment decisions before and during employment. Fortunately, the EEOC has provided a non-exhaustive list of “promising practices.” These include notifying all candidates assessed with the software that reasonable accommodations are available, using algorithmic decision-making tools that measure only the abilities or qualifications actually needed for the job, and confirming with a vendor, before using its software, that the tool does not ask candidates or employees questions likely to elicit disability-related or medical information. Yet even with the EEOC’s guidance, the use of software relying on algorithms and artificial intelligence remains a new and developing area of law. Therefore, employers will want to consult an employment attorney before developing, or hiring a vendor to administer, software that uses algorithms or artificial intelligence to make employment decisions.