Recruitment AI risks discriminating against people with disabilities • The Register

The Biden administration and the Justice Department have warned employers using AI software for recruiting purposes to take extra steps to support applicants with disabilities or risk violating the Americans with Disabilities Act (ADA).

Under the ADA, employers must provide adequate accommodations to all qualified job applicants with disabilities so that they can participate fairly in the application process. But the increasing deployment of machine learning algorithms by companies in their hiring processes opens up new possibilities that can put candidates with disabilities at a disadvantage.

The Equal Employment Opportunity Commission (EEOC) and the DoJ released a new document this week, providing technical guidance to ensure that companies do not violate the ADA when using AI technology for recruitment purposes.

“New technologies should not become new means of discrimination. If employers are aware of how AI and other technologies can discriminate against people with disabilities, they can take steps to prevent it,” said EEOC Chair Charlotte Burrows.

“As a nation, we can come together to create workplaces where all employees are treated fairly. This new technical assistance document will help ensure that people with disabilities are included in employment opportunities of the future.”

Companies that use automated natural language processing tools to filter resumes, for example, may reject candidates with gaps in their work history. People with disabilities may have had to take time off work for health reasons and therefore risk being automatically turned down at the start of the hiring process despite being well qualified.
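To make the failure mode concrete, here is a minimal, hypothetical sketch of the kind of screening rule described above: a filter that automatically rejects any candidate whose work history contains a gap longer than six months. The threshold, function name, and data layout are all illustrative assumptions, not anything from the EEOC/DoJ document.

```python
from datetime import date

# Hypothetical illustration: a naive screening rule that rejects any
# candidate with an employment gap longer than roughly six months.
# Each stint is a (start, end) date pair, sorted by start date.

MAX_GAP_DAYS = 183  # assumed cutoff, ~six months

def has_long_gap(stints):
    """Return True if any gap between consecutive stints exceeds the limit."""
    for (_, prev_end), (next_start, _) in zip(stints, stints[1:]):
        if (next_start - prev_end).days > MAX_GAP_DAYS:
            return True
    return False

# A candidate who took about 14 months off for medical reasons is
# screened out regardless of qualifications.
history = [
    (date(2015, 1, 1), date(2018, 6, 30)),
    (date(2019, 9, 1), date(2022, 3, 1)),
]
print(has_long_gap(history))  # True
```

The rule never sees *why* the gap exists, which is exactly how a well-qualified applicant who took medical leave gets rejected before a human ever reads the resume.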

AI can discriminate against people with disabilities in other ways. Computer vision software that analyzes a candidate’s gaze, facial expressions or tone is not appropriate for those who are speech impaired, blind or paralyzed. Employers should take extra care when using AI in their hiring decisions, the document says.

Companies should ask the software vendors providing the tools whether they designed them with people with disabilities in mind. “Has the provider attempted to determine whether use of the algorithm disadvantages people with disabilities? For example, has the provider determined whether any of the traits or characteristics measured by the tool are correlated with certain disabilities?” it said.

Employers should consider how best to support people with disabilities, such as informing them of how their algorithms assess candidates or giving them more time to complete the tests.

If algorithms are used to rank applicants, employers might consider adjusting scores for people with disabilities. “If the average scores of one demographic group are less favorable than those of another (for example, if the average scores of individuals of a particular race are less favorable than the average scores of individuals of a different race), the tool can be modified to reduce or eliminate the difference,” according to the document.
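One simple way to picture the adjustment the document describes is a mean shift: raise each group's scores so that every group's average matches the highest group average. This is only an illustrative sketch of the general idea, not the method the guidance prescribes; the function name and data are invented for the example.

```python
from statistics import mean

# Hypothetical illustration of the adjustment described in the guidance:
# shift each group's scores so all group averages match the highest one.

def equalize_means(scores_by_group):
    """Return a copy of the scores with every group's mean shifted
    up to the largest group mean."""
    target = max(mean(s) for s in scores_by_group.values())
    return {
        group: [s + (target - mean(scores)) for s in scores]
        for group, scores in scores_by_group.items()
    }

scores = {"group_a": [70, 80, 90], "group_b": [60, 70, 80]}
adjusted = equalize_means(scores)
# group_b's average (70) is raised to match group_a's (80);
# relative ranking within each group is unchanged.
```

A mean shift preserves the ordering of candidates within each group while removing the between-group difference in averages, which is the outcome the quoted passage points at.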

“Algorithmic tools shouldn’t be a barrier for people with disabilities seeking access to jobs,” concluded Kristen Clarke, assistant attorney general in the Justice Department’s Civil Rights Division. “These tips will help the public understand how an employer’s use of these tools may violate the Americans with Disabilities Act, so people with disabilities know their rights and employers can take steps to avoid discrimination.” ®
