The U.S. Consumer Financial Protection Bureau is cracking down on workplace surveillance after employees from various industries reported companies deploying monitoring technology. The CFPB claimed organizations are using third-party technology to measure employee performance while sometimes collecting personal and biometric information without consent.

Recent guidance released by the CFPB indicated workers are being monitored by companies through artificial intelligence-driven technologies, including "black box" algorithmic devices, which collect employees' personal information to score them on overall "effectiveness." The CFPB asserted jurisdiction over the matter, noting the Fair Credit Reporting Act applies to organizational use of this technology because the law's protections guard against employees being unfairly profiled.

The CFPB held a joint field hearing with the Department of Labor on the same day the guidelines were released, offering an opportunity to unpack why the guidance is necessary. CFPB Director Rohit Chopra said the guidance aims to advance protections for employees by applying existing laws "as new technology shapes the workplace."

Regulatory compliance around workplace monitoring

The FCRA's requirements aim to "protect people from the abuse and misuse of background dossiers and scores," according to Chopra.

"The Fair Credit Reporting Act doesn't just cover reports assembled and scored for banking and lending. The same law protects us when it comes to all sorts of background reports, checks and scores including in employment," he added.

Organizations must comply with the FCRA by providing transparency about any monitoring technologies or techniques, obtaining employees' consent, and ensuring employees can dispute inaccurate information collected about them. Inaccurate information must also be deleted or corrected so it does not affect the employee's performance metrics.

Contractors and third-party employees are also entitled to protections from potentially invasive monitoring technology.

Department of Labor Acting Secretary Julie Su said contractors and third-party employees who are "controlled" or monitored by workplace surveillance are technically considered employees and are therefore granted the same employment and privacy rights. She said employers assume "control" of contractors when tracking is assigned.

"And once they are your employee, they are entitled to a century's worth of labor law protections," Su added.

The FCRA requires organizations to provide consumers, upon request, with a report detailing the information collected about them. This obligation also extends to employees whose data has been collected by workplace devices. National Employment Law Project Senior Staff Attorney Sally Dworak-Fisher said organizations that use tracking technology often do not share the specific metrics being measured or inform workers of their data collection and deletion standards.

"Corporations are far from forthcoming about their use of these tools or how workers can correct errors that may cost them their jobs," Dworak-Fisher said. "They effectively say, 'pay no attention to the man behind the curtain' as they force workers to fight for good jobs with one hand tied behind their backs."

Surveillance technology's impact on employment

Organizations are implementing AI as a tool to help employees with tasks. The risks of incorporating AI into facets of an employee's everyday work are still taking shape.

Michigan Nurses Association President Jamie Brown said AI surveillance tools could negatively impact a nurse's ability to provide quality patient care. Her hospital previously used tracking technology embedded in nurses' badges, which sometimes impacted nurses' ability to protect patients' privacy.

With an increase in cyber incidents impacting health care organizations, Brown said health care privacy and trust is "critical to effective patient care." She also stressed that nurses would embrace health care organizations' use of monitoring technology that promotes worker-centric ideals, complements bedside skills, and improves the quality of care for patients.

However, Brown added she is "very concerned about certain technologies that are being implemented in hospital settings that do neither and instead can cause harm to our patients and our workforce."

Coworker.org Director of Policy and Research Wilneida Negrón indicated her organization has documented more than 500 cases of different forms of workplace monitoring, and that number is only rising as AI is incorporated. Negrón said the terms of agreement for many employee monitoring devices include the right to collect sensitive information, including stress levels, location data, health-related data, body temperature, respiratory rates, and behavioral and productivity data.

Monitoring technologies can make employees feel like they are "under a microscope," Amazon Delivery Station Teamster Shannon Kowalski said. She claimed Amazon regularly implements disciplinary actions based on data collected by its mandatory electronic tracking tools for employees. Amazon's use of surveillance creates "fear and anxiety which creates a dangerous work environment," she added.

Despite the crackdown, the DOL, the CFPB and other agencies tasked with addressing potentially harmful data collection practices remain committed to allowing for innovation. Su said enforcing workers' protections does not "live in contrast to innovation," but rights and protections must come first.

"What we are all talking about is responsibility," Su said. "What we’re talking about is recognizing the harms and the damages, and frankly the labor law violations and making sure that we continue to breathe life into the existing protections."

Lexie White is a staff writer for the IAPP.