Illinois employers that use artificial intelligence (AI) to analyze job interviews face restrictions on the use of the technology as of January 1, 2020
A new Illinois law places requirements on employers who video record interviews with job applicants and then use an artificial intelligence system to analyze the responses, demeanor, and mannerisms of the prospective employees.
The law went into effect January 1, 2020. It’s the first time that a state has acted to regulate the use of AI in the hiring process.
The “Illinois Artificial Intelligence Video Interview Act” (also called the “Video Interview Act”) was signed into law by Democratic Gov. J.B. Pritzker on August 9, 2019.
Legal experts say the measure is aimed at providing job applicants information about and some control over the process when an AI system is used.
Employers use AI systems to scan resumes, schedule interviews, and conduct first-round job interviews. Relying on video recordings of prospective employees, AI systems employ algorithms to evaluate an applicant’s facial expressions, word choice, body language, and tone of voice, among other factors, to determine whether the applicant is qualified for the job and to rank the applicants in a job pool.
While the use of AI in hiring might sound like something out of a science fiction novel, it’s already in place at a number of U.S. and international companies. According to consulting firm Mercer’s Global Talent Trends 2019 report, as covered by SHRM, 88% of companies globally say they use AI in some way for HR, and 83% of U.S. employers say they rely on the technology in some form.
Many of the U.S. companies surveyed also plan to increase their use of the technology: 59% said they intend to boost their use of workplace automation this year.
In response to the use of AI in hiring, Illinois legislators approved the workplace technology measure.
The Prairie State is developing a reputation as a leader in laws at the intersection of the workplace and technology. Illinois lawmakers also approved the Biometric Information Privacy Act in 2008, widely regarded as “first of its kind” legislation regulating the collection and possession of biometric information, and the Personal Information Protection Act in 2005, viewed as one of the more expansive data breach notification laws in the nation.
The new legal requirements
The law covers “positions based in Illinois.” The statute does not define what that means.
An employer that asks job applicants to record their interviews and then relies on an AI system’s analysis when considering the prospective employees must take certain steps before the video job interview can proceed:
- Each applicant must be notified before the interview that artificial intelligence may be used to analyze the applicant’s video interview and to consider the applicant’s fitness for the position. The law does not require that the pre-interview disclosure regarding AI use be in writing.
- Each applicant must be provided with information before the interview explaining how the artificial intelligence works and what general types of characteristics it uses to evaluate applicants. The law does not specify that the pre-interview explanation be in writing.
- Employers must obtain consent from the applicant before the interview. An employer may not use artificial intelligence to evaluate applicants who have not consented to the use of artificial intelligence analysis. The law does not specify that the consent be in writing, although some legal experts have suggested obtaining written consent as a best practice.
Sharing of videos is allowed, but only on a limited basis. An employer can only share job applicant videos with those whose expertise or technology is necessary to evaluate an applicant’s fitness for a position.
If an applicant requests the destruction of the video, it must be done within 30 days after the request is received. In addition, anyone who received copies of the applicant video interviews must also delete the videos, including all electronically generated backup copies.
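The pre-interview steps and the 30-day deletion window described above can be sketched as a simple compliance-tracking record. This is a hypothetical illustration only (the names `InterviewRecord`, `may_run_ai_analysis`, and so on are my own, not anything prescribed by the statute), assuming an employer tracks notice, explanation, and consent per applicant:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class InterviewRecord:
    """Hypothetical per-applicant record for Video Interview Act compliance."""
    applicant_id: str
    notified_of_ai: bool = False        # pre-interview notice that AI may be used
    explanation_provided: bool = False  # how the AI works, what traits it evaluates
    consent_obtained: bool = False      # written consent suggested as best practice
    deletion_requested_on: Optional[date] = None
    video_deleted: bool = False

    def may_run_ai_analysis(self) -> bool:
        # All three pre-interview steps must be complete before AI evaluation.
        return (self.notified_of_ai
                and self.explanation_provided
                and self.consent_obtained)

    def deletion_deadline(self) -> Optional[date]:
        # Video (and all backup copies) must be destroyed within 30 days
        # of the applicant's request.
        if self.deletion_requested_on is None:
            return None
        return self.deletion_requested_on + timedelta(days=30)

rec = InterviewRecord("A-001", notified_of_ai=True, explanation_provided=True)
assert not rec.may_run_ai_analysis()   # no consent yet, so no AI evaluation
rec.consent_obtained = True
assert rec.may_run_ai_analysis()
rec.deletion_requested_on = date(2020, 3, 2)
print(rec.deletion_deadline())  # 2020-04-01
```

The sketch treats each requirement as a boolean gate, which mirrors the statute’s structure: consent is not a formality layered on top of the analysis but a precondition without which the AI evaluation may not occur at all.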
New law raises several questions
The short statute leaves several questions open.
Although the bill as originally introduced specified that the pre-interview disclosure and consent be in writing and that a pre-interview information sheet be provided, those writing requirements were removed via a floor amendment while the bill was under consideration in the Illinois legislature.
The new law does not define what a job “based in Illinois” means. More than likely, it applies to employers seeking to fill a position located in Illinois, including when the job seeker is outside the state.
The statute has no definitions section, so key terms such as “artificial intelligence” are left undefined. Experts have noted that there are many types of AI.
The statute does not provide guidance on what information an employer must provide in meeting the requirement that the employer explain how artificial intelligence works.
The legislation does not provide an enforcement mechanism. It does not specify which state agency will handle alleged violations and it does not provide a private right of action or specify penalties for violations.
Bills in Congress
Companion bills that would regulate the use of AI platforms are under consideration in Congress: the “Algorithmic Accountability Act of 2019” was introduced in April 2019 in both the U.S. House of Representatives and the U.S. Senate. The proposed law would require users of AI and similar platforms to audit them for their impact on accuracy, fairness, bias, discrimination, privacy, and security, and to correct any issues.
Neither bill has progressed much beyond its introduction by Democratic lawmakers.
AI concerns in the employment context
While the Illinois measure was supported by the Illinois Chamber of Commerce, concerns remain over the use of AI in the employment context.
The U.S. Equal Employment Opportunity Commission (EEOC) does not yet have an official policy on AI-based tools in the workplace, but it has emphasized that employers must assess the benefits of AI-based tools against increased exposure and risk of privacy and security issues, according to Joseph J. Lazzarotti and Maya Atrakchi, attorneys for employment law firm Jackson Lewis.
In 2017, the EEOC issued a strategic enforcement plan for fiscal years 2017 to 2021 identifying the “increasing use of data-driven selection devices” as one of several areas of concern in eliminating barriers to recruitment and hiring.
The Office of Federal Contract Compliance Programs (OFCCP), an agency within the U.S. Department of Labor that monitors federal contractors and subcontractors for discriminatory employment practices, has issued guidance explaining that the use of AI to screen employment candidates could trigger obligations under the “Uniform Guidelines on Employee Selection Procedures,” Kwabena A. Appenteng, Philip L. Gordon, and Garry G. Mathiason, attorneys for Littler Mendelson, have explained.
The Littler Mendelson attorneys say that, according to the OFCCP, if an employer’s use of an AI-based selection procedure results in an adverse impact on a racial or ethnic group or sex, the evaluation procedure may require further OFCCP scrutiny.
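One common screen under the Uniform Guidelines is the “four-fifths rule”: a selection rate for any race, sex, or ethnic group that is less than 80% of the rate for the highest-selected group is generally regarded as evidence of adverse impact. A minimal sketch of that arithmetic, using made-up numbers for a hypothetical AI screening tool:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who were selected."""
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Flag groups whose selection rate falls below 80% of the highest
    group's rate -- generally treated as evidence of adverse impact
    under the Uniform Guidelines' four-fifths rule."""
    highest = max(rates.values())
    return {group: rate < 0.8 * highest for group, rate in rates.items()}

# Hypothetical outcomes from an AI-based selection procedure:
rates = {
    "group_a": selection_rate(48, 80),   # 0.60 selection rate
    "group_b": selection_rate(12, 40),   # 0.30 selection rate
}
# group_b's 0.30 rate is below 0.8 * 0.60 = 0.48, so it is flagged.
print(four_fifths_check(rates))  # {'group_a': False, 'group_b': True}
```

The rule of thumb is only a threshold screen; a flagged disparity invites further statistical and legal analysis rather than establishing a violation by itself.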
Writing for Forbes, employment law attorney Steven Pearlman said “some fear that AI run amok” could lead to disparate impact claims – claims that a seemingly neutral policy or practice disproportionately and negatively affects legally protected individuals, such as members of minority groups. For example, Pearlman wrote, there is a fear that because AI may involve “machine learning,” it may develop assumptions about the suitability of certain groups of people for employment based on previous interviews with members of the same group.