Disability advocates worried about discrimination in AI

Making hiring technology accessible means ensuring both that a candidate can use the technology and that the skills it measures do not unfairly exclude candidates with disabilities, says Alexandra Givens, CEO of the Center for Democracy and Technology, an organization focused on civil rights in the digital age.

AI-powered hiring tools often fail to include people with disabilities in the data used to train them, she says. Such people have long been excluded from the workforce, so algorithms modeled on a company’s previous hires won’t reflect their potential.

Even if models could account for outliers, the way a disability manifests itself varies widely from person to person. Two people with autism, for example, may have very different strengths and challenges.

“As we automate these systems and employers turn to the fastest and most efficient, people lose their chance to truly demonstrate their qualifications and ability to get the job done,” Givens says. “And that’s a huge loss.”

A hands-off approach

Government regulators have found it difficult to oversee AI hiring tools. In December 2020, 11 senators wrote a letter to the US Equal Employment Opportunity Commission voicing concerns about the use of hiring technologies after the covid-19 pandemic. The letter asked about the agency’s authority to investigate whether these tools discriminate, particularly against people with disabilities.

The EEOC responded with a letter, leaked to MIT Technology Review, in January. In it, the commission said it cannot investigate AI hiring tools without a specific allegation of discrimination. The letter also outlined concerns about the industry’s hesitancy to share data and said that variation between different companies’ software would prevent the EEOC from establishing any broad policy.

“I was surprised and disappointed when I saw the response,” says Roland Behm, a lawyer and advocate for people with behavioral health issues. “The whole tenor of that letter seemed to make the EEOC seem like a passive bystander rather than an enforcement agency.”

The agency typically initiates an investigation once a person alleges discrimination. But with AI hiring technology, most candidates don’t know why they were rejected for the job. “I believe one reason we’re not seeing more enforcement action or specific litigation in this area is because candidates don’t know they’re being rated or evaluated by a computer,” says Keith Sonderling, an EEOC commissioner.

Sonderling says he believes AI can improve the hiring process, and he hopes the agency will issue guidance for employers on how best to implement it. He says he welcomes oversight from Congress.
