Editor's note: The IAPP is policy neutral. We publish contributed opinion and analysis pieces to enable our members to hear a broad spectrum of views in our domains.

Over the past decade, wearable activity trackers have become deeply embedded in daily life, providing individuals with real-time insights into physical activity, sleep patterns, heart rate, stress levels and more.

While these devices offer potential benefits in supporting wellness and preventative health, they also pose significant risks to privacy and security. As personal health information and devices shift from clinical settings to commercial platforms, the limits of current legal and policy frameworks are becoming increasingly apparent.

Data collection

Wearable activity trackers collect sensitive and granular personal information through continuous sensor monitoring. Inputs include accelerometer and gyroscope readings, heart rate, GPS location and other biometric signals.

When combined with machine learning and big data analytics, these datasets can be used to infer deeply private information like mood, stress levels and behavioral patterns — well beyond what users knowingly disclose.

These inferences raise serious surveillance, discrimination and profiling concerns, especially when data flows across platforms and third-party applications without clear or transparent processes.

The illusion of anonymization

Despite claims of anonymization, sensor data often contains unique and persistent fingerprints that make true anonymization difficult, if not impossible. The promise of anonymization can therefore offer a false sense of privacy.

Studies have demonstrated that deidentified activity and location data can be reidentified with high accuracy. This exposes both a practical risk — reidentification — and a regulatory gap, since information treated as anonymized often falls outside the scope of privacy regulation altogether.
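As a rough illustration of why only a handful of observed data points can be identifying, consider the following sketch on synthetic data. The dataset sizes, the grid of location cells and the matching procedure here are illustrative assumptions, not drawn from any particular study:

```python
# Illustrative sketch (synthetic data): even a few coarse location points
# can uniquely match a "de-identified" trace back to one individual.
import random

random.seed(0)

# Synthetic dataset: 1,000 pseudonymous users, each with up to 50
# (hour, cell) visits drawn from 24 hours x 100 location cells.
users = {
    uid: {(random.randrange(24), random.randrange(100)) for _ in range(50)}
    for uid in range(1000)
}

def candidates(points):
    """Return the users whose trace contains every observed point."""
    return [uid for uid, visits in users.items() if points <= visits]

def unique_fraction(k, trials=200):
    """Fraction of trials where k observed points single out one user.

    Models an adversary who learns k points from a target's trace
    (e.g., via an app check-in) and matches them against the dataset.
    """
    hits = 0
    for _ in range(trials):
        target = random.randrange(1000)
        points = set(random.sample(sorted(users[target]), k))
        if candidates(points) == [target]:
            hits += 1
    return hits / trials

for k in (1, 2, 3, 4):
    print(f"{k} observed point(s): {unique_fraction(k):.0%} uniquely identified")
```

On this synthetic population, one observed point almost never isolates a user, but three or four points typically do — a toy version of the uniqueness effect the reidentification literature documents for real mobility traces.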

Consent and the design problem

A foundational challenge in this ecosystem is the reliance on informed consent. In practice, wearable technologies are not designed to support meaningful consent. These devices often lack screens, keyboards or other interfaces capable of conveying complex privacy terms.

As part of the broader Internet of Things, wearable activity trackers operate in the background, continuously collecting data with limited user interaction. This passive and persistent collection makes it even harder for the devices to meaningfully inform users.

Additionally, users typically underestimate the extent and nature of data collection and rarely understand how their information is stored, processed or shared. Consent, then, functions less as a mechanism for autonomy and more as a tool for symbolic compliance.

Unequal access to information and hidden data value

Information asymmetry adds to these challenges. Companies that develop or operate wearable devices have far greater insight into the data lifecycle and the logic of the algorithms that process it.

Users, by contrast, are left navigating vague and lengthy privacy policies with little understanding of how their information is handled.

In this context, personal information and personal health information increasingly hold economic value, not only in supporting health-related services and personalized wellness initiatives, but also in generating data-driven insights.

These insights may, in some cases, inform insurance assessments or workplace wellness programs, raising further concerns about the scope and purpose of data use beyond the individual's awareness.

Security vulnerabilities

Privacy and security risks are closely intertwined. Many wearable devices have well-documented vulnerabilities, including weak encryption, insecure Bluetooth protocols and limited capacity for regular security updates.

These shortcomings increase the risk of unauthorized access, including hacking, which can expose sensitive personal data or enable the manipulation of device functions.

Legal gaps and fair information practice principles

From a legal standpoint, wearable activity trackers expose gaps in existing frameworks. Much of the data collected by these devices is not classified as "health information" under conventional legislation. Even where privacy laws apply, they often fail to address how data is inferred, aggregated or repurposed. Moreover, prevailing legal models tend to emphasize notice and choice, overlooking the realities of device design, data fusion and user misunderstanding.

The limitations of the Fair Information Practice Principles, particularly the focus on individual control, highlight the need for a broader regulatory shift. Rather than placing the burden on individuals to manage complex data ecosystems, future frameworks should focus on institutional accountability, data minimization and privacy-by-design. Regulatory approaches should move beyond procedural oversight to address the design, promotion and commercialization of emerging technologies.

Building a collaborative governance model

Lawmakers must also consider setting baseline standards for security design, including authentication protocols and encryption, especially given the shared technical architecture across many devices.

In addition to formal legislation, modern regulatory models may benefit from a networked approach — one that involves collaboration among governments, developers, researchers and civil society. Such models can better balance rules and standards, accommodate technical complexity and promote adaptive governance.

It is also important to acknowledge that technology is not neutral. Devices can influence how people behave, how information is shared and what types of monitoring become normalized in everyday life.

A call for systemic change

Ultimately, protecting privacy in the age of wearable trackers requires more than updating old laws. It demands a shift in how privacy is conceptualized, not as an individual choice, but as a condition shaped by design and systemic practices. As these technologies continue to reshape the boundaries between health, data and everyday life, privacy professionals must respond with more than compliance tools.

A proactive approach grounded in privacy-by-design, strong technical safeguards and clear regulatory standards is essential to ensure the use of wearable health technologies aligns with the expectations and rights of users.

Paula Pizzotti, CIPP/C, is manager, privacy reviews and consulting at Fraser Health Authority in British Columbia. She is currently pursuing an LLM in privacy and cybersecurity at York University.