Dear privacy pros,
As I write this introduction on the morning of 5 June, I would like to wish our Muslim colleagues "Selamat Hari Raya Aidilfitri" (even though the greeting will be slightly belated by the time of publication).
In recent news, a number of big tech companies have heeded the clarion call and embraced privacy more fervently than ever. Among the various announcements made at its annual Worldwide Developers Conference, Apple revealed details of how the latest version of its iOS mobile operating system will restrict the ways third-party apps collect users’ data. Using the "Sign In With Apple" feature in iOS 13, for example, users will be able to log in to third-party apps without permitting the tracking that comes with logging in with other credentials, such as their Facebook or Google accounts.
Google also recently announced two major changes that will affect how developers of Chrome extensions are able to collect information from users of the popular browser. Extension developers will be allowed to request access to data only where required to implement the feature they are providing, and more extension developers will be required to post their privacy policies online. Further changes are also expected on how third-party developers can use the Google Drive API to provide their users access to files stored on Google Drive.
These initiatives contrast with so-called “dark patterns” — user interfaces designed to coax users of tech or social media platforms into taking actions that allow greater access to their data, such as prompts to grant access to the users’ contacts in order to keep using the platform. Such clever design tactics (e.g., bright colors, big buttons, misleading deals) are targeted under proposed legislation that would create a new regulator within the U.S. Federal Trade Commission to police best practices.
Other news that caught my eye recently includes a number of articles showcasing the growing importance of safeguarding the security of our biodata as DNA sequencing becomes increasingly commonplace. Given that an estimated 26 million people have taken an in-home DNA or genetic screening test (and this number is only set to grow), the importance of protecting genetic databases cannot be overstated.
The collection of and access to such data no doubt confers great value on the individual, as well as on society in general — for example, in tracking diseases and improving medical treatment. There are even instances in which such data has been used to track down serial killers. However, the potential for abuse is similarly significant. A recent study of the common open-source DNA sequencing software used by many of these companies revealed that the DNA-testing process is extremely vulnerable to hacking. In fact, in a bizarre example of how the future might throw up unimaginable scenarios for cybersecurity experts, researchers from the University of Washington managed to encode a malicious exploit into strands of synthetic DNA in order to gain control of the system on which the sample was being analyzed.
For our friends in New Zealand, an upcoming event organized by the privacy commissioner might help to shed some light on this area.
Finally, in other news, the Singapore Personal Data Protection Commission and the Hong Kong Office of the Privacy Commissioner for Personal Data have signed a memorandum of understanding that will foster greater cooperation and collaboration between the two jurisdictions on the data protection front. I would expect to see more sharing of information and best practices between the two authorities, as well as more joint research projects and investigations. Hopefully, I will be able to share more on these outcomes in a future note.
Until then, I wish you happy reading.