One of my favorite William Faulkner passages is the rumination in “Light in August” on the strength of belief, despite the frailty of human knowledge. It starts, “Memory believes before knowing remembers. Believes longer than recollects, longer than knowing even wonders.” The chapter is the beating heart of a novel that stands as a tragic reminder that the stories we tell ourselves, our deepest beliefs about fundamental things like who and what we are, can be completely divorced from true facts, from verifiable knowledge.
Privacy law has a strained relationship with knowledge and truth. Knowing a fact about a person is the same as holding a piece of their privacy. But does it matter whether the fact is verifiably true? Does it matter how the fact came to be known? Does it matter whether the person, themselves, believes the fact?
It turns out that sometimes these things matter and sometimes they don’t. To process someone’s personal data means, in some noncognitive technological way, to “know” a “fact” about them. Yet one of the most common personal rights over data that laws provide is the right to correction. From one perspective, correction is my right to increase the veracity of the knowledge someone holds about me. That sounds almost like the opposite of a privacy right. Yet, at the same time, correction is a right rooted in autonomy and control over information. It is a right to conform others’ knowledge about me to my own beliefs, or, taken a step further, even to limit their knowledge to only that version of myself I wish them to know.
Just like their personal data, knowledge about someone can be gleaned from all sorts of sources: directly from the person, from other individuals, from public materials, or even via inferences derived from watching their behavior. At first glance, inferences may be the least likely source from which to glean true information about a person. Human behavior is messy and context dependent. My browser history often includes more noise than signal about my varied and shifting interests.
But when taken over time, and compared against other individuals, inferences about me can serve as powerful pieces of knowledge — even if they don’t conform to the story I believe about myself. I may not identify as a “shoe maven,” but if I fall into the top quartile of shoe purchasers, or linger over new shoes in shop windows, or talk a lot about shoes with my friends, does my belief about myself matter? The knowledge gained by someone observing these trends is just as powerful, whether or not it conforms to my belief.
So long as a piece of information is linked — or even, perhaps, linkable — to an individual, privacy protections apply. In this way, privacy may be divorced from truth, whether objective or subjective. Maybe this makes privacy more powerful even than knowledge.
Here’s what else I’m thinking about:
- A preview of future privacy priorities at the National Institute of Standards and Technology. President Joe Biden signed the CHIPS and Science Act this week. Though the focus of the bill is on new incentives for microchip manufacturing, it also includes funds for research and testing at NIST, along with fresh authorizations related to standards-setting for data privacy operations and technology, similar to NIST’s prior work such as the Privacy Framework. In addition to general privacy standards, the new law directs NIST to undertake new projects related to biometrics, artificial intelligence, identity management and privacy-enhancing technologies.
- What would a Bureau of Privacy mean for the Federal Trade Commission? Marketing Brew explored this possibility, which is included as a provision in the proposed American Data Privacy and Protection Act.
Under scrutiny
- The federal courts’ cybersecurity practices are the subject of a letter from U.S. Sen. Ron Wyden, D-Ore., to U.S. Supreme Court Chief Justice John Roberts.
- The use of surveillance drones at the U.S. border, among other tools deployed by U.S. Customs and Border Protection, is detailed in a report from The Verge titled “The Most Surveilled Place in America.”
- TikTok’s “manipulative” user interface and “addictive” algorithm are profiled in an approachable and detailed article by Luiza Jarovsky. TikTok recently settled a U.S. District Court case based on allegations that it violated U.S. privacy laws.
- Social media safety for LGBTQ+ users is the subject of a report from GLAAD, which includes detailed recommendations for improvements on privacy practices for the top social media platforms.
Upcoming happenings
- Aug. 10 at 2 p.m. EDT, professor Daniel Solove hosts an all-star LinkedIn Live discussion on the American Data Privacy and Protection Act titled A Federal Comprehensive Privacy Law.
Please send feedback, updates and thoughts on the nature of knowing to cobun@iapp.org.