Legal Developments Impacting Employers’ Use of Artificial Intelligence and Electronic Monitoring
As we reported in previous alerts here and here, New York City law imposes requirements on employers who use automated employment decision tools (“AEDTs”) to make employment decisions. Following suit, the NYS Senate introduced a bill that, if passed, would restrict employers’ use of not only AEDTs, but also electronic monitoring. While there is no analogous federal requirement relating to the use of AEDTs or electronic monitoring, the EEOC has also indicated that AI is an area of interest, recently releasing guidance on the application of Title VII of the Civil Rights Act to employers’ use of AI tools.
New York State Proposed Bill
Automated Employment Decision Tools
Similar to the NYC law, the NYS bill would prohibit employers from using an AEDT unless the AEDT has been the subject of a bias audit in the past year and the results of the bias audit are made public. Under both the NYC law and the NYS bill, an AEDT is defined as “any computational process, derived from machine learning, statistical modeling, data analytics, or artificial intelligence, that issues simplified output, including a score, classification, or recommendation, that is used to substantially assist or replace discretionary decision making for making employment decisions that impact natural persons.” As under the NYC law, the NYS bill would require that employers provide notice of their use of an AEDT to applicants and employees.
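For readers curious about what such a bias audit typically measures, the minimal sketch below computes selection rates and impact ratios for hypothetical applicant pools. The category names, counts, and choice of metric are assumptions made purely for illustration; neither the NYC law nor the NYS bill reduces a bias audit to this single calculation.

```python
# Hypothetical sketch of one metric commonly reported in bias audits of
# automated employment decision tools: a selection-rate "impact ratio."
# All category names and counts below are invented for illustration.

screened = {"Category A": 400, "Category B": 250, "Category C": 150}
selected = {"Category A": 120, "Category B": 60, "Category C": 30}

# Selection rate: share of screened candidates in each category the tool selected.
rates = {cat: selected[cat] / screened[cat] for cat in screened}

# Impact ratio: each category's selection rate relative to the highest rate.
highest_rate = max(rates.values())
impact_ratios = {cat: rate / highest_rate for cat, rate in rates.items()}

for cat in screened:
    print(f"{cat}: selection rate {rates[cat]:.2f}, impact ratio {impact_ratios[cat]:.2f}")
```

An auditor would typically report figures of this kind for each demographic category assessed; which ratios, if any, warrant concern is a question for counsel and the applicable rules, not the arithmetic itself.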
There are some key differences between the NYC law and the proposed NYS bill. First, the proposed NYS legislation would apply to an employer’s use of an AEDT to evaluate independent contractors as well as applicants and employees. Second, the NYS bill contains a broader set of prohibitions and requirements related to the use of an AEDT than the NYC law, as discussed in more detail below.
Electronic Monitoring
The proposed NYS legislation would impose restrictions on employers’ use of “electronic monitoring tools,” which the bill defines as “any system that facilitates the collection of data concerning worker activities or communications by any means other than direct observation, including the use of a computer, telephone, wire, radio, camera, electromagnetic, photoelectronic, or photo-optical system.” The bill would make it unlawful for employers to surveil employees and independent contractors residing in New York using an electronic monitoring tool unless the tool is primarily intended to accomplish one of the allowable purposes specifically enumerated in the bill. The allowable purposes under the proposed bill include: (1) allowing workers to accomplish essential job functions; (2) monitoring production processes and quality; (3) assessing worker performance; (4) ensuring compliance with employment, labor, or other relevant laws; (5) protecting the health, safety, or security of workers; (6) administering wages and benefits; and (7) additional purposes to enable business operations as determined by the Department of Labor.
Furthermore, an employer would not be permitted to use an electronic monitoring tool to surveil employees unless: (1) the specific type of electronic monitoring tool is strictly necessary to accomplish the allowable purpose and is the means least invasive to the employee that could reasonably be used to accomplish that purpose; and (2) the specific form of electronic monitoring is limited to the smallest number of workers and collects the least amount of data necessary to accomplish the allowable purpose.
Employers would also be required to provide individuals residing in New York State who would be subject to electronic monitoring with specific and detailed notice about the tool’s use. Such notice would be required to include: (1) a description of the allowable purpose for which the electronic monitoring tool is used; (2) a description of the specific employee data to be collected, and the activities, locations, communications, and job roles that will be monitored by the tool; (3) the dates, times, and frequency with which monitoring will occur; (4) whether any data collected by electronic monitoring will be used as an input in an AEDT; (5) whether the data collected will be used (alone or in conjunction with an AEDT) to inform an employment decision; (6) whether any collected data will be used to assess employees’ productivity or performance or to set productivity standards, and if so, how; (7) a description of where the collected data will be stored and for how long it will be retained; and (8) an explanation of how the electronic monitoring practice is the least invasive means available to accomplish the allowable monitoring purpose. Precise notice will be important, as the bill would prohibit employers from using employee data collected via an electronic monitoring tool for purposes other than those specified in the notice.
If the bill is enacted, employers would also not be able to require employees to install applications on personal devices that collect or transmit employee data, or to wear, embed, or implant such devices, unless strictly necessary to accomplish essential job functions and only when narrowly limited to the activities and times necessary to accomplish those functions. Location-tracking applications and devices would need to be disabled outside the activities and times necessary to accomplish essential job functions. Employers would furthermore be prohibited from disciplining or terminating employees solely based on their refusal to comply with the collection of their data. Employers with unionized employees also could not refuse to bargain over the use of an electronic monitoring tool.
Restrictions on the Use of AEDTs and Electronic Monitoring
The proposed bill would place significant restrictions on employers seeking to rely on the output of AEDTs and electronic monitoring tools when making hiring, promotion, termination, disciplinary, or compensation decisions. Specifically, employers could not rely solely on the results of an AEDT or the data collected by an electronic monitoring tool when making employment decisions, but rather would be required to conduct their own evaluation of the candidate or employee independent of the output from the AEDT or electronic monitoring data. The bill specifies that this would include “establishing meaningful human oversight by a designated internal reviewer” to corroborate the results by other means. Such methods of corroboration would include review of supervisory or managerial documentation, review of personnel files, or consultation with coworkers. If an employer could not independently corroborate the data from an AEDT or electronic monitoring, the employer could not rely on the data in making employment decisions.
When an employer can corroborate the results of an AEDT or electronic monitoring tool and wishes to make an employment decision based on the results, the proposed bill would require specific notice to affected employees prior to implementing the decision. If an employer relied on an AEDT, the notice would need to contain (1) the specific decision for which the AEDT was used; (2) any information or judgments used in addition to the AEDT’s results in making the decision; (3) the specific employee data that the AEDT used; (4) the individual, vendor, or entity who created the AEDT; (5) the individual or entity that executed and interpreted the results of the AEDT; and (6) a copy of any bias audits of the AEDT. Similarly, an employer relying on data from electronic monitoring to make a decision would first need to provide notice to the affected employees regarding the electronic monitoring and information and judgments used in the employer’s corroboration of the data.
Lastly, the bill would prohibit the use of AEDTs and electronic monitoring tools in certain circumstances, including but not limited to: (1) where their use would result in a violation of labor or employment laws; (2) in such a manner as to unduly or extremely intensify the conditions of work or to harm the health and safety of employees; (3) to make predictions or obtain information about religious or political beliefs, health or disability status, or immigration status; (4) to predict or interfere with (including to coerce or restrain), or to identify, punish, or obtain information about, employees engaging in activity protected under labor and employment laws; (5) to implement or effect a dynamic wage-setting system that pays employees different wages for the same work; (6) to subtract from an employee’s wages for time spent exercising their legal rights; and (7) to draw on facial recognition, gait, or emotion recognition technologies. Employers would also be prohibited from using electronic monitoring tools while employees are off duty and not performing work-related tasks, or to conduct audio-visual monitoring of bathrooms or other similarly private areas (e.g., locker rooms, changing areas, breakrooms, smoking areas, employee cafeterias, lounges, lactation rooms, and areas designated for prayer or religious activity), including collecting data on the frequency of use of such private areas.
We will continue to monitor the status of the pending NYS legislation. If enacted, the bill would become effective 180 days after it is signed by the governor.
EEOC Guidance
The EEOC, as part of its Artificial Intelligence and Algorithmic Fairness Initiative, released a technical assistance document explaining the application of Title VII to an employer’s use of algorithmic decision-making tools and considerations for employers who utilize these tools in employment decisions. Algorithmic decision-making tools include resume scanners that prioritize applications using keywords; testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test; employee keystroke monitoring software; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and virtual assistants or chatbots that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements.
As employers are aware, Title VII prohibits employers from using selection procedures that have a disparate impact on the basis of a protected characteristic, such as race, sex, or national origin, unless the procedure is job related and consistent with business necessity. The EEOC’s guidance confirms that if an algorithmic decision-making tool used as a basis for an employment decision has a disparate impact on individuals of a particular race, color, religion, sex, or national origin, or a combination of such characteristics, then an employer’s use of the tool will violate Title VII unless the employer can show the tool is job related and consistent with business necessity. While employers can typically meet this burden by showing the selection procedure is necessary for the safe and efficient performance of the particular job in question, the guidance does not address how employers should assess whether an algorithmic decision-making tool is a valid measure of job-related traits or characteristics, or how employers should evaluate whether the use of such a tool is consistent with business necessity.
The guidance also states that an employer will typically be responsible under Title VII for its use of such tools even if the tools are designed or administered by another entity (e.g., a vendor or recruitment agency), and may even be held responsible for the actions of its agents if the employer gave them authority to act on the employer’s behalf and the agent utilized an AI tool with a disparate impact on members of a protected category. For example, if a recruiting firm responsible for hiring an employer’s employees utilizes software that creates a disparate impact on applicants belonging to a protected category, the employer could be held responsible under Title VII. As such, the guidance advises employers considering whether to rely on a software vendor to develop or administer an algorithmic decision-making tool to, at a minimum, ask the vendor whether steps have been taken to evaluate whether use of the tool causes a substantially lower selection rate for individuals with a protected characteristic. If the tool is expected to result in a substantially lower selection rate for individuals of a particular race, color, religion, sex, or national origin, the guidance states that the employer should consider whether use of the tool is job related and consistent with business necessity, and whether any alternatives exist that would meet the employer’s needs with less potential for disparate impact.
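To make the concept of a “substantially lower selection rate” concrete, the sketch below applies the EEOC’s long-standing four-fifths rule of thumb to hypothetical numbers. The figures are invented, and the guidance itself treats the four-fifths comparison as a rough screening aid rather than a definitive test of adverse impact.

```python
# Hypothetical sketch of the "four-fifths" rule of thumb sometimes used as a
# first screen for a "substantially lower selection rate." All numbers are
# invented; the heuristic is a screening aid, not a legal determination.

def selection_rate(selected: int, applicants: int) -> float:
    """Share of applicants in a group that the tool selected."""
    return selected / applicants

rate_group_a = selection_rate(48, 80)   # 0.60
rate_group_b = selection_rate(24, 60)   # 0.40

ratio = min(rate_group_a, rate_group_b) / max(rate_group_a, rate_group_b)
print(f"Selection-rate ratio: {ratio:.2f}")  # prints 0.67

# A ratio below 0.80 (four-fifths) is often treated as a flag for closer review.
if ratio < 0.8:
    print("Ratio below four-fifths: potential adverse impact worth closer review.")
```

In practice, an employer or vendor would run this comparison on the tool’s actual selection data and, if a group’s rate falls substantially below the highest group’s rate, examine whether the tool is job related and consistent with business necessity or whether a less discriminatory alternative exists.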
If an employer is developing an algorithmic decision-making tool in-house and discovers that use of the tool would have an adverse impact on individuals belonging to a protected category, the guidance explains that the employer can take steps to reduce the adverse impact or select a different tool in order to avoid engaging in a selection procedure that would violate Title VII. Furthermore, according to the guidance, an employer’s failure to adopt a less discriminatory algorithm that was considered during the development process may give rise to liability under Title VII.
Employers should continue to take affirmative steps to ensure that their use of automated employment decision tools and employee monitoring technologies complies with federal, state, and local laws. New York employers in particular should be vigilant about this developing area of law in light of the current NYC law and pending statewide legislation. Employers with questions about complying with these laws, including how to determine if a selection tool is covered by the law and how to comply with bias audit and notice obligations, can contact Kate Townley at ktownley@fglawllc.com or any other attorney at the firm.
DISCLAIMER: This alert is provided to clients and friends of the firm for informational purposes only and the distribution of this alert is not intended to, and does not, establish an attorney-client relationship. This alert also does not provide or offer legal advice or opinions on any specific factual situations or matters. This communication may be considered Attorney Advertising. Prior results do not guarantee a similar outcome.