Dear privacy professionals,
I trust that you had a great Data Protection (or Data Privacy, in America) Day on 28 Jan. It was certainly an interesting day for us in Singapore.
Barely six months after the SingHealth cyberattack, the health care sector was again rocked by news that confidential data relating to 14,200 HIV-positive individuals and 2,400 of their contacts had been illegally disclosed online. While this appears to have been an isolated incident involving unauthorized disclosure by a doctor who had access in his capacity as the head of the Ministry of Health’s National Public Health Unit, questions may legitimately be raised about the scope of his access, as well as whether security measures should have been put in place to prevent the information from being downloaded.
The fact that the MOH had known the information was compromised since at least 2016 has also raised some eyebrows. Given the sensitivity of this information, it seems perfectly reasonable for the MOH, having concluded (wrongly, as it turns out) that the leak had been contained, to avoid going public so as not to alarm the affected individuals. However, this is unlikely to be enough to convince skeptics that there was no cover-up.
On a more positive note, Singapore released a framework on how artificial intelligence can be ethically and responsibly used at the World Economic Forum meeting in Davos. Intended to be a “living” document providing readily implementable guidance to private-sector organizations around the world using AI, the framework is expected to be further fine-tuned as more views are gathered from the industry.
The timing could not have been better. While AI has been on the WEF agenda for a number of years now, it seems that the tone of discussions has turned decidedly darker. Thought leaders have shone a spotlight on a number of potential issues arising from the use of AI, including how it may replace human workers, foment global power struggles and result in algorithmic discrimination.
These discussions are strangely reminiscent of some of the prevailing themes in a series of books I am currently reading. In "Sapiens," "Homo Deus" and "21 Lessons for the 21st Century," Israeli historian Yuval Noah Harari provides a number of thought-provoking perspectives on humanity’s past and future, as well as critical issues we are facing in the here and now. Given the enormous breadth and importance of the topics covered in these books, ranging from climate change to terrorism to "fake news," it is interesting that Harari feels ownership of data is the most pressing concern humankind faces today.
In an interview that he gave in 2018, Harari said:
“Those who control the data control the future of life. Because today data is the most important asset in the world. In ancient times land was the most important asset, politics was a struggle to control land, and if too much land became concentrated in too few hands, society split into aristocrats and commoners. In the last two centuries machines and factories became more important than land, political struggles focused on controlling machinery. If ownership of the machines became concentrated in too few hands, society split into capitalists and proletariats. In the 21st century, however, data will eclipse both land and machinery as the most important asset, and politics will be a struggle to control the flow of data. If data becomes concentrated in too few hands, humankind might split not into classes, but into different species.”
This resonated deeply with me, as I have always felt that this is one area where government regulation is sorely needed and severely lacking. Unlike land and machinery, data can flow freely and instantaneously across borders, and multiple copies can be made at no cost. As the number of sensors around us increases, each individual is generating more and more information every day. For example, I can tell you exactly what workout I did on any given day in the past few years and what my heart rate was at that point in time. I can even tell you whether I slept better that night, or whether I had to wake up to pee because I drank too much water during the workout.
Naturally, I realize this information would not be very interesting or valuable to you. As such, we think nothing of giving it away to the companies that provided us with a desired product or service. Yet, when concentrated in the hands of a few innovative companies, the data generates enormous economic value and translates to significant market power, often leading to dominance and further concentration.
Notionally, the information belongs to me, but I did not receive a single cent from the coffers of these wealthy companies. Perhaps instead of trying to tax the “FAANGs” of the future to fund a universal basic income in a post-work society, governments should lay the groundwork now to mandate equal and open access to data and ensure that data owners are duly rewarded for the information these companies collect. That might mean that one day in the not-too-distant future, we would be paid not to work, but to work out and to live our lives out on social media.
Now, wouldn’t that be grand?