DOL Warns Employers of AI Risks Under Federal Employment Laws
Artificial intelligence (AI) and other automated systems are streamlining many workplace HR tasks. But the U.S. Department of Labor (DOL) warns employers that the new technology does not relieve them of their statutory compliance obligations under the Fair Labor Standards Act (FLSA), the Family and Medical Leave Act (FMLA), and other federal employment laws. The DOL’s latest guidance instructs employers to maintain ongoing human oversight of the automated systems taking over HR tasks.
The DOL guidance defines AI as “a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments. Artificial intelligence systems use machine and human-based inputs to (a) perceive real and virtual environments; (b) abstract such perceptions into models through analysis in an automated manner; and (c) use model inference to formulate options for information or action.”
The guidance comes in response to President Biden’s 2023 Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. The order called for a coordinated U.S. government approach to ensuring the responsible and safe development and use of AI, and it specifically instructed the Secretary of Labor to issue guidance protecting workers’ rights as employers deploy AI in the workplace.
FLSA
Under the FLSA, non-exempt employees are entitled to overtime pay at one and one-half times their regular rate of pay for all hours worked over 40 in a workweek. DOL regulations impose strict requirements on employers for proper timekeeping and accurate records. The guidance stresses that:
Reliance on automated timekeeping and monitoring systems without proper human oversight, however, can create potential compliance challenges with respect to determining hours worked for purposes of federal wage and hour laws. An AI program that incorrectly categorizes time as noncompensable work hours based on its analysis of worker activity, productivity, or performance could result in a failure to pay wages for all hours worked.
Specifically, the guidance points to AI tools that record time worked and that measure and analyze metrics of worker productivity or activity. These systems analyze worker activity in real time, such as computer keystrokes and mouse clicks, website browsing, presence and activity in front of a web camera, or other data, to determine when an employee is “active” or “idle.” If such a system miscategorizes compensable time as “idle” or otherwise noncompensable based on its analysis of worker activity, productivity, or performance, the employer may fail to pay wages for all hours worked and face liability under the FLSA.
The guidance also warns that AI systems that automatically deduct meal breaks and other longer break periods from an employee’s compensable work hours, even if the employee is not completely relieved from duty, may result in FLSA violations. For example, an employee usually takes a 30-minute unpaid meal break but skips the break on a particular day due to their workload. Without appropriate human oversight, a system that automatically deducts the break from the employee’s work hours based on the employee’s past time entries could result in the employer failing to properly record and pay the employee for hours worked.
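To make the meal-break risk concrete, the following is a minimal, hypothetical sketch of the two design patterns at issue; the field names, the 30-minute break, and the review flag are illustrative assumptions, not anything prescribed by the DOL. The first function silently deducts the scheduled break regardless of whether it was taken; the second deducts only when the break was actually taken and otherwise routes the record to a person.

```python
from dataclasses import dataclass

@dataclass
class Shift:
    employee_id: str
    hours_logged: float          # raw clock-in to clock-out time for the day
    took_scheduled_break: bool   # whether the 30-minute meal break was actually taken

SCHEDULED_BREAK_HOURS = 0.5  # illustrative 30-minute unpaid meal break

def auto_deduct(shift: Shift) -> float:
    """Risky pattern: always deducts the scheduled break, even on days the
    employee worked through it, so compensable time is undercounted."""
    return shift.hours_logged - SCHEDULED_BREAK_HOURS

def deduct_with_review(shift: Shift) -> tuple[float, bool]:
    """Safer pattern: deducts only when the break was actually taken; otherwise
    keeps the hours intact and flags the record for human review."""
    if shift.took_scheduled_break:
        return shift.hours_logged - SCHEDULED_BREAK_HOURS, False
    return shift.hours_logged, True  # True = needs human review

if __name__ == "__main__":
    busy_day = Shift("E-1001", hours_logged=8.5, took_scheduled_break=False)
    print(auto_deduct(busy_day))         # 8.0 -- half an hour of work goes unpaid
    print(deduct_with_review(busy_day))  # (8.5, True) -- paid in full, flagged for review
```

The difference between the two is the human-oversight step the guidance calls for: exceptions surface to a reviewer rather than quietly reducing paid time.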
FMLA
The FMLA provides eligible employees with up to 12 weeks of unpaid, job-protected leave in a 12-month period for qualifying family and medical reasons and requires continuation of their group health benefits during the leave. Employees must be restored to the same or a virtually identical position when they return to work after FMLA leave. Employees are generally entitled to FMLA leave if they have worked for a covered employer for at least 12 months, have at least 1,250 hours of service with the employer during the 12 months before their FMLA leave starts, and properly submit leave requests and any required medical certification.
The guidance warns that employers must oversee AI and other automated technologies to avoid widespread violations of FMLA rights where eligibility, certification, anti-retaliation, or anti-interference requirements are not met. For example, an automated system that applies certification rules requiring an employee to disclose more medical information than the FMLA allows would violate the statute. Similarly, a system that triggers penalties when an employee misses a certification deadline could violate the FMLA if the deadline should not have been imposed or the system failed to account for circumstances that permit extra time for submission.
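As a hedged illustration only (the field names and rules below are simplified assumptions, not the statute’s full eligibility test), the numeric FMLA thresholds are easy to automate, but certification deadlines call for judgment a rules engine cannot supply on its own, so a missed deadline is better flagged for HR review than automatically penalized.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class LeaveRequest:
    months_employed: int          # tenure with the employer
    hours_last_12_months: float   # hours of service in the 12 months before leave starts
    certification_due: date
    certification_received: Optional[date]

def meets_basic_thresholds(req: LeaveRequest) -> bool:
    """Simplified numeric screen: 12 months of employment and 1,250 hours of
    service. The real FMLA test has additional conditions not modeled here."""
    return req.months_employed >= 12 and req.hours_last_12_months >= 1250

def certification_status(req: LeaveRequest, today: date) -> str:
    """A missed certification deadline is routed to a person rather than
    triggering an automatic penalty, since the FMLA can permit extra time
    in some circumstances."""
    if req.certification_received is not None:
        return "received"
    if today > req.certification_due:
        return "overdue - route to HR for review, do not auto-deny"
    return "pending"

if __name__ == "__main__":
    req = LeaveRequest(14, 1400.0, date(2024, 6, 1), None)
    print(meets_basic_thresholds(req))                   # True
    print(certification_status(req, date(2024, 6, 10)))  # overdue - route to HR for review
```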
PUMP Act
The guidance also addresses potential violations of the recently enacted Providing Urgent Maternal Protections for Nursing Mothers Act (PUMP Act). This act provides nursing employees the right to reasonable break time and space to express breast milk while at work. An employee and employer may agree to a certain schedule based on the nursing employee’s need to pump, but an employer cannot require an employee to adhere to a fixed schedule that does not meet the employee’s need for break time each time the employee needs to pump. Additionally, any agreed-upon schedule may need to be adjusted over time if the nursing employee’s pumping needs change.
The DOL notes that AI or automated scheduling or timekeeping systems that limit the length, frequency or timing of a nursing employee’s breaks to pump would violate the reasonable break time requirement. Additionally, automated productivity scoring and monitoring systems that penalize a worker for failing to meet productivity standards or quotas due to pump breaks would violate the law. Likewise, an automated scheduling system that requires an employee to work more hours to make up for time spent taking pump breaks or reduces the number of hours a worker is scheduled in the future because they took pump breaks would also be unlawful retaliation under the PUMP Act.
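As a hedged sketch (the metric and field names are assumptions, not anything in the guidance), the compliance point can be expressed as a design rule: exclude pump-break time from the denominator of any productivity metric, so protected breaks neither depress a worker’s score nor feed back into reduced future scheduling.

```python
def productivity_score(units_completed: int,
                       hours_worked: float,
                       pump_break_hours: float) -> float:
    """Illustrative hourly productivity metric that excludes pump-break time,
    so taking protected breaks does not depress the score."""
    productive_hours = hours_worked - pump_break_hours
    if productive_hours <= 0:
        return 0.0
    return units_completed / productive_hours

# A worker who spends 0.5 hours on pump breaks is scored on the same
# per-productive-hour basis as one who takes no such breaks.
print(productivity_score(40, 8.0, 0.5))  # ~5.33
print(productivity_score(40, 7.5, 0.0))  # ~5.33
```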
The DOL guidance acknowledges that AI has the potential to help improve compliance with the law. But without proper human supervision, these technologies can pose risks to workers, create a serious risk of systemic violations across the workforce, and expose employers to liability.
Contact Mark Fijman or any member of Phelps’ Labor and Employment team if you have questions or need advice and guidance.