ICO launches investigation into discriminatory AI recruitment

The UK's data watchdog has launched an investigation into potential bias in the use of artificial intelligence (AI) during the recruitment process. The audit will focus on alleged racial discrimination by automated recruitment systems.

The Information Commissioner's Office (ICO) investigation is in response to allegations that automated recruitment software is unfairly excluding candidates who are members of minority groups.

AI is often used in recruitment to reduce the workload on managers and to eliminate potential human bias from hiring decisions. However, there has long been concern that it may be doing the opposite.

In 2018, the ecommerce giant Amazon scrapped its AI recruitment tool after it was discovered to be discriminating against female candidates.

“We will be raising concerns about the use of algorithms to filter recruitment applications, which could negatively impact the employment opportunities of people from different backgrounds,” said a spokesperson from the ICO.

“We will also set out our expectations through renewed guidance for AI developers on ensuring that algorithms treat people and their information fairly.”

AI recruitment bias: algorithms ‘left to their own devices’

Natalie Cramp, CEO of data science firm Profusion, said the investigation was “very welcome and overdue”.

Cramp said that “there have been a number of recent incidents where organizations have used algorithms for functions such as recruitment, resulting in racial or sexual discrimination”.

According to Cramp, the issue may lie in the biased programming of these algorithms, or in the datasets they use.

“These algorithms have essentially been left to their own devices, leaving thousands of people to have their opportunities negatively impacted,” Cramp said.

Joe Aiston, senior counsel at law firm Taylor Wessing, said AI recruitment tools may filter out candidates who, for example, do not have English as their main language.

“A decision by AI to reject such a candidate for a role on that basis alone, even where no human has made that decision, may result in a discrimination claim against the employer.”

Peter van der Putten, director of the AI Lab at US software company Pegasystems, said that errors can slip into machine learning models “even if their designers have the best of intentions”.

Van der Putten added: “Therefore, organizations need to ensure that the data, models and logic used to create their algorithms are as free from bias as possible, that AI-powered decisions are constantly monitored for bias, and that relevant automated decisions come with complementary explanatory aids.”
