AI in the Hiring Process

Artificial Intelligence (“AI”) has been expanding rapidly in our modern world, from the creepy robot vacuum that blocks the cereal aisle at the supermarket to self-driving Teslas.  Amid concerns that AI is taking jobs away from human workers, some employers are now using AI to place human workers in jobs.  At some companies, recruiters use AI to screen resumes, questionnaires, interview transcripts, and other introductory paperwork to find qualified candidates whose resumes suggest they may perform well in a specific position.  While many advocates for this use of AI like to think that it removes human bias from the equation, this is not always the case.

AI only knows what humans program it to know, through the datasets used to train it.  Often inherent in these datasets are human biases, many of which we may not realize are there.  For example, consider a resume-scanning AI program that is fed data about successful employees who happen to be young.  The program may conclude that in a particular job, young people are better candidates.  While this may be unintentional on the part of the employer, the AI screening tool may weed out those with experience dating back decades, thereby discriminating against applicants based on their age, in violation of the Age Discrimination in Employment Act (“ADEA”).

Using AI screening software can lead to unlawful discrimination, even if the employer does not intend it.  In May 2022, the Equal Employment Opportunity Commission (“EEOC”) and the Department of Justice both published guidance to help employers using AI screening programs avoid discriminating against those with disabilities, in violation of the Americans with Disabilities Act.  The material discusses how applicants with disabilities may be unintentionally discriminated against when AI fails to consider that their resumes may look different because of their disability.  The EEOC gives the example of applicants being screened out by AI trained to ignore those with resume gaps.  While screening out applicants with gaps in their resumes may seem like an innocent criterion, when that resume gap is the result of a disability, this could constitute disability discrimination.  Many employers use AI to try to remove bias from the hiring process, but unlawful discrimination can occur even with the best of intentions.

In December 2021, the New York City Council enacted Local Law 144 (“the AI Law”).  Pursuant to the AI Law, the New York City Department of Consumer and Worker Protection proposed rules that would require employers to give notice to applicants when using an “Automated Employment Decision Tool,” conduct an annual bias audit, and publish the results.  On April 15, 2023, after a delay due to the high volume of public comments on the proposed rules, the AI Law will take effect, and hopefully the results of these annual bias audits will help programmers remove more bias from the hiring process.

Despite their flaws, AI programs can be an effective tool in resume screening.  If employers and programmers continue to identify problems with AI screening and work to correct and avoid bias, these programs have great potential.

While the New York City AI Law is not yet enforceable, if you believe you may have been a victim of discrimination in the workplace, you should speak to an experienced New York employment law attorney.  To learn more or to schedule a free consultation to discuss your situation, contact Borrelli & Associates, P.L.L.C.

910 Franklin Avenue
Suite 205
Garden City, NY 11530
Tel: 516-248-5550
Fax: 516-248-6027

655 Third Avenue
Suite 1821
New York, NY 10017
Tel: 212-679-5000
Fax: 212-679-5005