
Asia Pacific Dashboard Digest | Notes from the Asia-Pacific region, 12 April 2024


Kia ora koutou,

Back in August 2023, I wrote about the release of a consultation paper on the regulation of biometrics by New Zealand's Office of the Privacy Commissioner. In the paper, the OPC indicated its intention to issue a code of practice regulating the use of biometric technologies. In the IAPP Global Legislative Predictions 2024, we predicted "a new biometrics code of practice will create additional obligations for organizations looking to implement biometric solutions, such as facial recognition technologies." We're now a step closer to this reality, with the OPC's 10 April release of an exposure draft of the Biometric Processing Privacy Code.

For those unfamiliar with New Zealand's privacy regulatory regime, the Privacy Act gives the OPC the relatively rare power to make law, in the form of codes of practice that have the force of law. This allows the OPC to develop codes of practice in relation to a class of organizations, a class of personal information, a class of activity, or a specified industry. For example, the OPC has issued codes of practice related to health information, credit reporting, telecommunications and information sharing in civil emergencies. The OPC is permitted to use codes of practice to modify the application of the information privacy principles by prescribing more or less stringent standards, exempting any actions from a principle, or prescribing how one or more of the principles are to be applied in a given situation.

The OPC is proposing to use its law-making powers to regulate a class of personal information (biometric information) and a class of activity (biometric processing). This is an initial commentary on some of the more interesting features of the exposure draft, based on a quick review. The exposure draft was released the same morning these notes were written, so no doubt fuller commentary will follow in the coming months.

Helpfully, the draft Biometric Processing Privacy Code provides examples of "privacy risks" that could be created by biometric processing, including over-collection, over-retention, inaccuracy, bias, security vulnerability, lack of transparency, chilling effect (such as where biometric processing results in adverse actions or deters individuals from exercising protected rights) and scope creep. These are familiar risks for privacy professionals, and the list seems comprehensive.

The draft code provides for a balancing exercise in which the "benefit" of an organization achieving its lawful purpose will outweigh the privacy risk of biometric processing if the public benefit outweighs the privacy risk, a clear benefit to the individual outweighs the privacy risk, or the private benefit to the organization outweighs the privacy risk to a substantial degree. This is similar, in effect, to the legitimate interests balancing exercise provided for in the EU General Data Protection Regulation. It will be interesting to observe how this balancing exercise is applied by organizations. Like many existing Privacy Act provisions, it allows for a relatively subjective assessment on the part of the organization, particularly in relation to private benefit.

The draft code also includes a set of recommended "privacy safeguards" that broadly mirror the safeguards found in overseas laws on responsible artificial intelligence. Safeguards include informed consent (note, though, the code does not require that biometric information only be collected or processed with consent), transparency, testing of and assurance over the biometric system, security measures to protect biometric information, human oversight and monitoring, regular review and auditing, user training, and clear protocols and policies.

The code would implement a set of 13 "biometric processing privacy rules." These rules reflect the information privacy principles but prescribe how they should be applied in relation to biometric processing. Importantly, however, several new or expanded obligations related to biometric processing are introduced, which are summarized below:

  • Rule 1 introduces a proportionality test, requiring the organization to reasonably believe that the biometric processing is not disproportionate in the circumstances. For the purposes of this proportionality test, the organization must consider the following factors (which very much reflect the OPC's expectations in relation to Foodstuffs North Island's recent facial recognition technology trial): the effectiveness of the biometric processing; the degree of privacy risk; whether the organization's purposes could be achieved by alternative means; whether the benefit of the biometric processing outweighs the privacy risk; and the cultural impacts of the biometric processing on Māori or any other New Zealand demographic group. It's worth noting here that the limitation to New Zealand demographic groups could leave an organization's overseas data subjects at some risk.
  • Rule 3 significantly expands on the Privacy Act's existing transparency obligations, requiring organizations to provide more information about the collection and processing of biometric information. It also specifies that privacy notices must be accessible, including being presented separately from an organization's privacy statement, and conspicuous, that is, readily noticed by an individual before their biometric information is collected.
  • Rule 4 prohibits certain high-risk biometric processing. It prohibits an organization from using biometric classification to collect or infer health information, information about an individual's inner state or physical state (both defined terms), or information intended to categorize the individual according to their age or any prohibited ground of discrimination under section 21(1) of the Human Rights Act 1993. There are some exceptions to this prohibition, including where the processing is necessary to assist with accessibility.

The OPC has asked for feedback on the exposure draft by 8 May. In particular, the OPC wants feedback on the proposed proportionality test required by Rule 1, the requirement for transparency about biometric processing in Rule 3, and the proposed prohibition on high-risk uses of biometric information in Rule 4.

There is no doubt the Biometric Processing Privacy Code will be a topic of conversation during Privacy Week, taking place in New Zealand 13-17 May. Keep an eye out for information on IAPP events in New Zealand and Australia that week, including a keynote presentation from New Zealand Privacy Commissioner Michael Webster in Wellington on 14 May. Finally, there is still time to submit a proposal to speak at the upcoming IAPP ANZ Summit in Melbourne in November. The call for proposals closes 5 May.

Ngā mihi.

