How To Use AI For Recruiting Without Getting Sued

May 8, 2023


Flaws in the AI systems used for recruitment and hiring could reshape the workforce in unpredictable ways, according to a Tilleke & Gibbins paper, and could lead to discriminatory hiring and, ultimately, litigation. Ideally, users of AI recruitment tools would be able to explain each algorithmic decision by deconstructing the tool's decision-making process. As AI grows more complex, however, it is becoming increasingly difficult, and sometimes impossible, to reverse-engineer algorithms built on machine learning. The most feasible way to determine whether an algorithm is biased appears to be running sample data sets through the system before it is used for recruitment. New York City recently passed a law requiring a bias audit of automated employment decision tools before they are put into use. The paper also advises keeping human oversight in the recruitment process and treating the output of AI recruitment tools critically. “If employers blindly follow AI outcomes without a deeper examination of how the algorithmic decision is reached, hiring outcomes may be not only ridiculous but also discriminatory.”
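The kind of pre-deployment sample run described above can start with something as simple as comparing the tool's selection rates across demographic groups. The Python sketch below is illustrative only and assumes you already have the tool's advance/reject outcomes paired with candidate demographic categories; the group labels, sample data, and the four-fifths threshold mentioned in the comments are assumptions for the example, not requirements of the paper or of the New York City law.

```python
from collections import defaultdict

def impact_ratios(records):
    """Compute each group's selection rate and its ratio to the highest
    group's rate. Ratios well below 0.8 are a common red flag under the
    EEOC's four-fifths rule of thumb (used here only as an illustration)."""
    selected = defaultdict(int)
    total = defaultdict(int)
    for group, was_selected in records:
        total[group] += 1
        if was_selected:
            selected[group] += 1

    rates = {g: selected[g] / total[g] for g in total}
    best = max(rates.values())
    return {g: (rate, rate / best) for g, rate in rates.items()}

# Hypothetical sample: (demographic group, did the AI tool advance the candidate?)
sample = [("A", True), ("A", True), ("A", False),
          ("B", True), ("B", False), ("B", False)]

for group, (rate, ratio) in impact_ratios(sample).items():
    print(f"group {group}: selection rate {rate:.2f}, impact ratio {ratio:.2f}")
```

A check like this does not substitute for the independent bias audit the New York City law contemplates, but it gives counsel and HR teams a concrete number to question before relying on a tool's recommendations.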
