New Zealand's Office of the Privacy Commissioner recently released an exposure draft of a biometrics privacy code, indicating biometrics are regulated under the country's Privacy Act 2020. But to what extent does the Privacy Act's fundamental purpose — that of protecting an identifiable individual — truly defend against today's new potential harms?
Privacy safeguards attach to someone knowable: an identifiable individual. Can police identify the perpetrator from this footage? Should the salesclerk copy that driver's license? Generally, harm is assessed in terms of the potential "other" in receipt of personal information, be it a mild-mannered member of the public or a devious hacker.
However, many technological solutions can now target individuals without the need for traditional forms of identification. The fictional mall cop, doughnut in hand, feet up before a bank of screens, is not required.
So, no identity, no harm? The impact on an anonymous individual can still involve a real sense of intrusion. The essential assumption of privacy protection needs to be reviewed in line with the ways our identities exist in virtual spaces.
Responsive billboards
A 2023 article discussed advertisers using biometric billboards, called SmartScreens, in malls. The screens were equipped with cameras that could estimate patrons' age, gender and mood while they shopped, and advertisers could change the display based on the viewer.
For example, images and messaging for a Samsung campaign swapped out based on the onlooker. Women 44 and under viewed the message "With enough data for your insta-habit"; women 45 and older were told they'd have enough data to "shop till you drop"; and men of all ages received the message "stream every 2018/19 Premier League match live."
Of interest is the software provider's view that the technology involved not "facial recognition" but "facial detection": advertisers reaching anonymous individuals based on an estimation of demographic characteristics rather than identification. On that view, no identifiable individual is involved, so the Privacy Act protections applicable to shoppers' personal information should not be triggered. But does this theory of anonymization detract from the original intent of privacy protection?
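To make the distinction concrete, the following is a minimal, hypothetical sketch of how such a system might work. The class name, function and targeting rules are illustrative assumptions drawn from the campaign described above; they are not the SmartScreens software or any vendor's actual code.

```python
# Illustrative sketch only: the names and rules below are assumptions,
# not the actual SmartScreens implementation or any real vendor API.
from dataclasses import dataclass


@dataclass
class DetectedFace:
    # "Facial detection": the system holds only estimated attributes,
    # with no link to a named or identifiable individual.
    estimated_age: int
    estimated_gender: str  # the software's guess, not self-identification


def choose_advert(face: DetectedFace) -> str:
    """Pick a campaign message from estimated demographics alone."""
    if face.estimated_gender == "female":
        return ("With enough data for your insta-habit"
                if face.estimated_age <= 44
                else "Enough data to shop till you drop")
    return "Stream every 2018/19 Premier League match live"


# "Facial recognition", by contrast, would add a step this system omits:
# matching the face against a gallery of known people to return an identity.
print(choose_advert(DetectedFace(estimated_age=38, estimated_gender="female")))
```

The targeting decision depends only on the demographic estimate, which is why the provider argues no one is "identified"; the intrusion the article describes happens all the same.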
Historical context
While the Universal Declaration of Human Rights does not define privacy or set out its scope, Article 12 protects it from arbitrary interference, stating: "No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks."
Identifying characteristics — particularly involving sensitive attributes such as religion, politics, sexual orientation and gender — are broadly considered a matter of self-identification.
In the billboard scenario, gender stereotypes are clearly at play. Is there potential harm in an advertisement inaccurately identifying a viewer's gender? Perhaps. There is a subtle coercive power involved. Not only does the viewer see a reflection of what they might want, but they are also presented — without asking — with what the advertising agency wants them to want. In other words, what they should want based on some perspective of societal norms.
As Warren and Brandeis famously stated in "The Right to Privacy," an 1890 Harvard Law Review article, privacy is the "right of the individual to be let alone." Although it is sometimes argued that certain online participation, for instance on social media platforms, reflects a choice to give up such a right, surrendering personal information has become so ingrained in the digital economy that it is less a choice and more an entry fee.
But you're anonymous, right?
Isn't that billboard, reflecting its view of a stranger back to that stranger, the same as a passer-by or store clerk sharing an observation? Not quite.
Unlike the passer-by, the billboard — at least in this scenario — cannot be disabused of a misconception.
Similarly, targeted online ad messages generated via unreachable algorithms cannot be easily reasoned with. Life, particularly post-pandemic, is regularly experienced via a screen. It can be difficult to avoid an experience tailored to "your" desires without simultaneously wiping data such as login details.
Don't let me get me
In a 2020 article, "The right to be let alone by oneself: narrative and identity in a data-driven environment," Bart van der Sloot argued that the right to privacy, as in keeping information from others, should be extended to keeping information from oneself.
"Being frequently confronted with information about one's past, present and future fundamentally challenges an individual's capacity to form and maintain an identity, which depends on her ability to select and prioritise information about herself. … the current privacy paradigm could be ameliorated by treating privacy not only as the right to be let alone by others, but in addition, as the right to be let alone by oneself."
The article goes on to give examples such as wanting to avoid pop-up advertising for pregnancy products when a pregnancy was not planned, or avoiding personalized health-related predictions derived from fitness-tracker data. Essentially, even before delving into more sophisticated use cases, harm can and does occur when data is received without consent.
Privacy playing catch-up
The "Right to Privacy" article was triggered by brave new technology in 1890, namely photographs that could be taken surreptitiously without requiring a "sitting." Today, privacy protections continue to play catch-up with the intrusions associated with new technology and our relationship to it.
Reassessing when "identification," and thereby protection, begins will be part of the story in handing control back to individuals — anonymous or otherwise.