Dear privacy pros,
If being grounded in your home country is driving you as stir-crazy as it is driving me, recent news about the impending reopening of borders or the potential establishment of travel bubbles between countries may sound like music to your ears.
Personally, I am a little paranoid about certain countries pushing forward with these plans for economic reasons before the situation stabilizes. However, there is no doubt that digital vaccine passports or certificates proving a negative COVID-19 test result would be a key plank in any strategy toward the safe resumption of travel. Singapore is already using the open-source OpenAttestation framework to cryptographically endorse and verify pre-departure test results under its HealthCerts program, and this is likely to be extended to provide proof of vaccination in the future. Several other initiatives worldwide are also leveraging blockchain technology to ensure the trustworthiness of similar certifications.
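For readers curious about the mechanics, the general pattern behind such certification frameworks is simple: the issuer publishes a verifiable endorsement of a hash of the document, and any checkpoint can recompute the hash and validate the endorsement, so that any tampering with the certificate is detectable. The following is only a toy sketch of that pattern using a shared-secret HMAC; it is not OpenAttestation's actual API, and all names in it are hypothetical.

```python
# Toy illustration of hash-then-endorse certificate verification.
# NOT OpenAttestation's API; ISSUER_KEY, endorse and verify are
# hypothetical names for illustration only.
import hashlib
import hmac
import json

ISSUER_KEY = b"demo-issuer-secret"  # stand-in for the issuer's signing key

def endorse(certificate: dict) -> str:
    """Issuer side: bind the certificate's exact contents to a tag."""
    payload = json.dumps(certificate, sort_keys=True).encode()
    digest = hashlib.sha256(payload).digest()
    return hmac.new(ISSUER_KEY, digest, hashlib.sha256).hexdigest()

def verify(certificate: dict, tag: str) -> bool:
    """Verifier side: recompute the tag; any edit to the data changes it."""
    return hmac.compare_digest(endorse(certificate), tag)

cert = {"name": "Jane Doe", "test": "PCR", "result": "negative"}
tag = endorse(cert)
print(verify(cert, tag))                             # genuine certificate
print(verify(dict(cert, result="positive"), tag))    # tampered certificate
```

Real-world schemes replace the shared secret with public-key signatures (and, in OpenAttestation's case, blockchain anchoring), so that anyone can verify without being able to forge.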
You may wish to refer to this article from IAPP Staff Writer Jennifer Bryant for a more detailed analysis of the privacy implications arising from such initiatives.
Turning now to the recent articles on the Asia-Pacific Digest, one that caught my eye is the report from Reuters alleging China is developing technical standards for the use of facial recognition that place an unusually heavy emphasis on tracking ethnicity and other racial traits. Being unable to establish the veracity of these claims independently, I can only say I concur with the statement from the author of the IPVM report cited in the article that such a system, if established, would be “ripe for abuse.”
There is no question that, when used in the right way, mass surveillance technology has the potential to bring about great public good, as this example from India proves. As with any technology that relies on the collection and processing of personal data at scale, safeguards must be built into the system to ensure compliance not only with data privacy principles, such as data minimization, but also with data ethics principles, such as preventing algorithmic bias or discrimination. In this regard, it is heartening to note that some major companies are paying heed to such risks and proactively limiting the use of biometric data.
I am almost at my word count limit, so I will end with three small footnotes:
- You may be interested in reviewing the latest version of the IAPP "Privacy Vendor Report," which has grown in comprehensiveness and gained significant traction since its inception in 2017.
- The EU and South Korea have recently concluded their adequacy discussions, with a final decision likely to be adopted in the coming months.
- The Singapore Personal Data Protection Commission updated its data breach management guide to highlight the mandatory data breach notification requirement under the Personal Data Protection Act (the guide is now known as the "Guide on Managing and Notifying Data Breaches"). It has also updated its "Guide on Active Enforcement" with further details regarding voluntary undertakings and expedited breach decisions under the PDPA.
And with that, I wish you happy reading.