The Equal Employment Opportunity Commission (EEOC) is warning employers to proceed with caution when using artificial intelligence (AI) and algorithmic decision-making software to make hiring and personnel decisions. Employers who fail to take proper precautions risk inadvertently discriminating against employees with disabilities—and violating federal civil rights laws in the process. Here’s what you need to know:
The EEOC is concerned about AI and ADA compliance
Software programs that use AI and algorithmic decision-making have become increasingly popular among employers seeking to speed up their recruitment process, monitor employee performance, and determine eligibility for internal promotions.
From automated chatbots that evaluate candidate skills and personality traits to software that tracks and assesses employee keystrokes and job activity—AI and algorithm-based tools are touted as being able to help employers get a clearer view of their workforce and make more objective hiring decisions. Unfortunately, many of these programs can disadvantage applicants and employees with disabilities—and in some cases may violate the Americans with Disabilities Act (ADA).
In May 2022, the EEOC issued a technical assistance document providing updated policy and practical tips to help employers reduce their chances of discriminating against disabled people when using algorithmic decision-making software.
While the guidelines in the EEOC’s technical assistance document aren’t enforceable by law, they were created to help ensure ADA compliance and inform disabled employees of their rights when interacting with AI software. As AI programs continue to change the hiring and general employment landscape, employers should expect ramped-up regulation efforts and ongoing EEOC guidance in the coming months and years.
How do employers use AI?
Employers may use a range of AI tools to meet various needs. Common software examples noted by the EEOC include:
- Resume scanners that search for and prioritize applications that use specific keywords.
- Employee monitoring software that rates employees based on their keystrokes or activity.
- Video interviewing software that utilizes AI to assess candidates based on facial recognition, language, intonation, or speech patterns.
- Job applicant “testing software” that scores candidates on their personalities, skills, or “cultural fit,” sometimes based on their performance in a game.
When might using algorithmic decision-making software result in an ADA violation?
Under the ADA, employers are barred from discriminating against applicants or employees who have a disability—defined by the ADA as a physical or mental impairment that substantially limits one or more major life activities or major bodily functions.
Unfortunately, but perhaps unsurprisingly, virtual technologies that rely on AI can make it harder for disabled people to get jobs that they’re otherwise qualified for. In some cases, these programs can even “screen out” qualified applicants entirely. For example:
- A job applicant with arthritis who cannot type quickly may be unable to score highly on a timed math test taken using their computer keyboard if not given a reasonable accommodation.
- A job applicant who is blind may be unable to score well on a computer-based visual memory test, even if that person’s memory is proficient enough to perform the job duties.
- A job applicant who uses a wheelchair may be “screened out” of a job that they could easily complete with accommodations if they answer “no” when a chatbot asks if they’re able to stand for three hours straight.
According to the EEOC, employers can be held responsible for ADA violations made by algorithmic decision-making tools even if those tools were designed and administered by an outside vendor.
“Employers need to take note: if allegations of discrimination are made, they are not going to be allowed to claim ‘it wasn’t me; the algorithm did it,’” former EEOC commissioner Victoria Lipnic said in a statement to Bloomberg Law.
How can employers avoid violating the ADA when using AI and algorithmic decision-making tools?
Reasonable accommodations have always been key to setting employees with disabilities up for success—and staying in compliance with the ADA. EEOC guidance describes a reasonable accommodation as “any change that helps an employee with a disability apply for a job, perform a job, or enjoy equal benefits or privileges of employment.”
If a job applicant or employee tells an employer that a medical condition may disadvantage them or make it difficult to take an assessment, that person has effectively requested a reasonable accommodation—even if they never said the words “reasonable accommodation.”
After a reasonable accommodation request has been made, employers are responsible for promptly responding and requesting supporting medical documentation, if needed. If documentation shows that a person’s condition “may reduce the accuracy of an assessment or make it more difficult to take,” the employer must then provide a reasonable accommodation in the form of an alternative testing format or a different assessment of the applicant’s skills (unless doing so would create an “undue hardship”).
Examples of reasonable accommodations for online assessments may include:
- providing an oral or written version of an assessment instead of a virtual version
- allotting extended time to complete an assessment
- providing assessments that are compatible with accessible technology
Employers can and should take further precautions to reduce the chances that algorithmic decision-making tools “screen out” applicants because of their disability. According to the EEOC, “screen out” happens when a person’s disability “lowers or prevents them from meeting selection criteria, resulting in them losing the job opportunity.” Screening someone out is illegal if that person could perform the essential functions of the job with a reasonable accommodation, where one is legally required.
To avoid “screening out” disabled applicants, the EEOC recommends that employers ask algorithmic decision-making software developers specific questions to determine if their programs were developed with individuals with disabilities in mind. Inquiries may include:
- Asking if the tool’s user interface is accessible to as many disabled people as possible.
- Asking if the materials presented to users are available in alternative formats.
- Asking if the vendor has attempted to determine whether their algorithm disadvantages people with disabilities.
Additional EEOC recommendations to avoid screening out qualified applicants include:
- Providing all applicants and employees with as much information as possible about the algorithmic decision-making tool, including information about testing methods and which characteristics or traits the tool is designed to measure.
- Clearly informing applicants and employees that reasonable accommodations, including alternative formats and tests, are available to people with disabilities.
- Providing clear instructions for requesting reasonable accommodations.
Finally, the EEOC urges employers who use algorithmic decision-making tools to narrow the list of abilities and qualifications being evaluated as much as possible, so that applicants are assessed only on the skills that are truly necessary to perform the job’s core duties.
In doing so, employers can reduce the chance of a candidate being screened out for failing to meet criteria that are not actually relevant to their ability to perform the job.
“New technologies should not become new ways to discriminate,” EEOC Chair Charlotte A. Burrows said in a press release. “If employers are aware of the ways AI and other technologies can discriminate against persons with disabilities, they can take steps to prevent it.”
501 members and subscribers have unlimited access to HR Services. Contact us anytime regarding this subject or any other HR challenge you may be facing.
Need HR help for a low monthly fee? Contact us today.
The information contained in this article is not a substitute for legal advice or counsel and has been pulled from multiple sources.