Given the variety and complexity of tasks associated with operations management, businesses are increasingly deploying automated systems, including those utilizing artificial intelligence (AI), to improve overall efficiency. As part of this effort, the use of AI or related automated systems to track and monitor production, including employee activities, is becoming widespread. A 2022 New York Times survey revealed that eight of the 10 largest private U.S. employers track individual workers, many in real time, to assess their productivity. Any process that relies on network-connected electronic devices becomes a trove of data points that can be used to monitor or improve the process—or the employees engaged in the process.
Federal agencies have taken notice of these innovations and voiced concerns over potential abuses of the new technologies. In October 2022, Jennifer Abruzzo, the General Counsel for the National Labor Relations Board (NLRB), issued a memorandum addressing how electronic monitoring and algorithmic management of employees can interfere with the exercise of rights under the National Labor Relations Act (NLRA). Specifically, Abruzzo advocates that the NLRB adopt a presumption of illegality for employer use of these technologies where the use has “a tendency to interfere with [NLRA] rights.”
Moreover, several federal agencies have decided to work together in identifying and monitoring potential abuses of the technology. The NLRB has signed memoranda of understanding with the Federal Trade Commission, the Department of Justice, and the Department of Labor, all of which aim to protect employees from these monitoring practices.
In March 2023, the NLRB also announced a partnership with the Consumer Financial Protection Bureau (CFPB) “to better protect American workers and address practices of employer surveillance, monitoring and data collection.” In that announcement, the CFPB asserted that the Fair Credit Reporting Act (FCRA) applies to automated worker surveillance tools, adding another wrinkle to the potential layers of oversight that companies must handle when implementing and utilizing AI tools in manufacturing operations.
The CFPB also prepared a joint statement with the Department of Justice, the Equal Employment Opportunity Commission, and the Federal Trade Commission regarding enforcement efforts to protect the public from automated systems and AI. The agencies broadly define “automated systems” to include “software and algorithmic processes—including AI—that are used to automate workflows and help people complete tasks or make decisions.” The joint statement provides that the use of automated systems has the “potential to perpetuate unlawful bias, automate unlawful discrimination, and produce other harmful outcomes.” It further concludes that “existing legal authorities apply to the use of automated systems and innovative new technologies just as they apply to other practices.”
In May 2023, the White House Office of Science and Technology Policy (OSTP) announced that it would release a public request for information to learn more about the automated tools employers use to surveil, monitor, evaluate, and manage workers. While the OSTP found that automated systems could be beneficial, it focused on a host of potential risks, including pushing workers to move too fast on the job, jeopardizing safety, deterring workers from exercising their rights to organize and collectively bargain, and subjecting workers to potentially discriminatory treatment.
The agencies have consistently stated that potential harm can arise both from the design of the technology and from the actions taken based upon the information gained. An employer implementing new automated technologies must safeguard itself by ensuring both that the technologies do not have built-in bias and that decision-making based upon the resulting data avoids biased outcomes.
Businesses must be cognizant that the various federal agencies will work in concert to monitor and regulate an employer’s use of automated systems, including AI-based systems. These efforts range from traditional employment concerns arising under the NLRA and the discrimination protections of federal law to potentially new applications of existing statutes like the FCRA. This does not even take into consideration the state-by-state laws in place to protect workers. It is imperative that employers take a comprehensive approach to notifying employees of potential surveillance activities built into their automated systems and remain cognizant of the potential pitfalls that using those systems creates.
Adding further uncertainty to employer concerns in this area is the pending change in presidential administrations. Although employers can expect changes at all of the federal agencies referenced above, whether new leadership appointed by President-Elect Donald Trump will focus on these technologies remains to be seen. Employers should stay tuned and remain nimble enough to adjust to any changes affecting employee monitoring technologies.
For Further Reading
“Workplace Productivity: Are You Being Tracked?” New York Times, August 14, 2022 (subscription may be required)
Joint Statement on Enforcement Efforts against Discrimination and Bias in Automated Systems (CFPB, DOJ, EEOC, and FTC)