Disability rights advocates worry about discrimination in AI recruitment tools

Making recruitment technology accessible means ensuring both that candidates can use the technology and that the skills it measures do not unfairly exclude candidates with disabilities, said Alexandra Givens, CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.

She said artificial intelligence-driven hiring tools often fail to include people with disabilities in the data used to train them. Because these people have long been excluded from the workforce, an algorithm modeled on a company’s previous hires will not reflect their potential.

Even if a model could account for outliers, disability manifests differently from person to person. Two people with autism, for example, may have very different strengths and challenges.

“As we automate these systems, and employers push for the fastest and most efficient way, they are losing the opportunity for people to actually demonstrate their qualifications and ability to do their jobs,” Givens said. “This is a huge loss.”

A hands-off approach

Government regulators have found it difficult to monitor AI hiring tools. In December 2020, 11 senators wrote a letter to the U.S. Equal Employment Opportunity Commission expressing concern about the use of hiring technologies after the COVID-19 pandemic. The letter asked whether the agency has the authority to investigate whether these tools discriminate, particularly against people with disabilities.

The EEOC responded with a letter that was leaked to MIT Technology Review in January. In it, the commission stated that it cannot investigate AI hiring tools without a specific claim of discrimination. The letter also outlined concerns about the industry’s reluctance to share data and said that variation in software between companies would prevent the EEOC from formulating any broad policies.

“When I saw the response, I was surprised and disappointed,” said Roland Behm, a lawyer and advocate for people with behavioral health conditions. “The whole tenor of the letter seems to make the EEOC more of a passive bystander than an enforcement agency.”

The agency typically begins an investigation once an individual files a discrimination claim. With AI hiring technology, however, most candidates do not know why they were rejected. “I think one of the reasons we haven’t seen more enforcement action or private litigation in this area is that candidates don’t know they are being scored or assessed by a computer,” said Keith Sonderling, an EEOC commissioner.

Sonderling said he believes artificial intelligence will improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He said he would welcome oversight from Congress.
