With new technologies, new types of data and new methods of collection defining our current reality, privacy cannot merely be an afterthought. Language models are fueled by our personal data, artificial intelligence art generators sexualize people without consent and the metaverse embodies "data collection on steroids." Beyond the rifts these technological changes have opened in privacy, cracks have also appeared in the legal foundations protecting long-established privacy rights. New privacy risks, it seems, are everywhere.
With all this in mind, it is no surprise people have become increasingly worried about their privacy in recent years. Indeed, several recent studies — including ones by Pew Research Center, Cisco, McKinsey & Company, and KPMG — have shown most consumers do care about the privacy of their personal data. In line with this research, the IAPP’s first Privacy and Consumer Trust Report, which surveyed 4,750 individuals across 19 countries, found 68% of consumers globally are either somewhat or very concerned about their privacy online.
Studies such as these challenge the conventional wisdom that has long assumed people don’t care about their privacy. At some point, most of us have probably heard privacy declared dead or dying. Even lawmakers involved in passing privacy legislation have echoed this belief. Sen. Dave Marsden, D-Va., the chief sponsor of Virginia’s Consumer Data Protection Act, told The Markup, "I think the public is largely indifferent to data privacy things. It’s just an annoyance that a lot of people are willing to put up with."
Yet, the argument that most people are agnostic, or worse, apathetic, about their privacy is on weak footing. George Washington University professor Daniel Solove convincingly argued the so-called privacy paradox — the idea that people say they care about privacy but then fail to protect it — is nothing more than "a myth created by faulty logic." Put simply, people can both value their privacy and fail to manage it effectively.
Indeed, the world does little to help us be good stewards of our own privacy. Privacy notices and policies have been the main vehicles to inform consumers about what data is collected and how it is used. Yet, most of these are "dense, lengthy texts," written at reading comprehension levels (some even on par with Immanuel Kant’s Critique of Pure Reason) that most literate people struggle to digest. Not to mention the impractical amount of time and effort required for a person to keep track of their data via such notices. According to one estimate from a well-known study by Carnegie Mellon University professors Aleecia M. McDonald and Lorrie Faith Cranor published about 15 years ago, a person would need to spend about 40 minutes per day just to get through the privacy notices they encounter. No doubt, the amount of data collected and processed — and, thereby, the number of privacy notices consented to — has grown considerably since then.
Despite the difficulty of comprehending all the ways their data is collected and used, consumers do want to understand these processes. For example, in the IAPP Privacy and Consumer Trust Report, consumers were asked what steps companies could take to increase trust in their data processing. The top response was for companies to provide clear privacy information to help consumers "really understand" how their personal data is being processed. Yet, about four in 10 reported finding it difficult to understand a company’s data privacy practices. Only a minority of consumers, about three in 10, said it is easy to determine whether a company follows good privacy practices.
Limited consumer understanding of what happens to their personal data erodes their confidence in companies across every sector of the economy. And the consequences of this trust gap are anything but trivial for businesses. When consumers lack trust in a company’s data collection and processing activities, they become more likely to take actions that resemble what Georgetown Law professor Julie Cohen called privacy self-defense. These include withholding, obfuscating or fabricating personal data, as well as avoiding commercial transactions that involve data or gravitating toward more privacy-friendly competitors. Over the past 12 months, 85% of consumers said they had deleted a phone app, 82% had opted out of sharing personal data, 78% had avoided a particular website and 67% had decided against making an online purchase, all because of privacy concerns.
Trust is a crucial feature of every relationship. The word traces back to the Old Norse "traust," meaning confidence, security and safe abode, from a root meaning "to be firm or solid." Feelings of trust thus provide a sturdy basis for the laws and norms that guide our behavior. Indeed, if trust does not exist between companies and consumers, employees and employers, and governments and people, the foundation of society rests on shaky ground. To nurture this trust, protecting the privacy of personal data needs to be at the center of every relationship.
Privacy and trust are intertwined. Those who respect our privacy thereby earn our trust. And, with those we trust, we share what is private. We should remain hopeful that a deeper understanding by companies and governments of the connection between privacy and trust will enhance both for consumers around the world.