
Privacy Perspectives | Consent and Forgetting: What Privacy Pros Can Learn from One Family’s Unexpected Experience


A recent article in Vox raised some fascinating questions about the ethics of choice and the value of forgetting.

Writing under the pseudonym “George Doe,” the author of the article recounts how he bought a personalized genetic test for himself and, consequently, for his parents through the popular service 23andMe. A PhD-trained stem cell and reproductive biologist, George Doe sought the test to demonstrate DNA collection to a class he was teaching on the human genome. But there was clearly a personal motive as well. George Doe describes his excitement at learning his personal genetic history and his fascination with the mysteries it could unlock. His enthusiasm was so great that he bought the same test as a gift for each of his parents.

As part of their service, 23andMe also offers a social media-type function that connects people based on their genomic tests. Through this feature, you can be connected with people who share similar genetic histories, suggesting that you have common ancestry. However, the 23andMe feature goes even further: You can be connected to people (also customers of 23andMe) who are directly related to you.

The dramatic twist to George Doe’s story is that this feature in the 23andMe service exposed the existence of his previously unknown half-brother. George Doe’s father had had another son and this revelation brought great emotional pain to the entire family, leading to the divorce of George Doe’s parents.

From a strict compliance perspective, George Doe’s story could be rather uninteresting. At the time, 23andMe required an opt-in to participate in the “close relatives finder program.” According to the article, the 23andMe service stated: “check this box if you want to see close family members in this search program.” George and his father both checked the box, indicating that they wanted to share their genetic results and see any resulting correlations.

The privacy concerns should end there, right? Perhaps not.

The consequences of their opt-in to this program were clearly more extreme than either man expected. George Doe is a trained biologist, teaching a class on the human genome, and he did not foresee that providing his consent to sharing this data (and, presumably, encouraging his father to do the same) would result in “really advanced paternity tests.”

This story raises a fascinating issue about the idea of consent in the privacy world. Not all consent is equal. As privacy pros, we know this well; we manage compliance and respond to consumer expectations around consent and data practices every day. Too often, the public policy debate in the privacy world exists within the spectrum of “no choice at all” to “informed consent,” with many gradients—opt out, implied consent, etc.—between them.

But do we rely too heavily on the power of consent as absolution for all privacy concerns?

This article suggests that some choices may be more important than checking a box or applying your (digital) signature to an agreement. George Doe, a PhD and reproductive biologist, did not foresee the significance of his choice. And yet, 23andMe could certainly make an argument that consent—even affirmative consent—had been obtained.

So should there be higher levels of consent, beyond affirmative consent, for decisions related to data that may have enormous consequences for individuals? Do organizations need to assess the degree of decisional understanding of their data subjects prior to even presenting a data choice to them? Ryan Calo, a leading privacy thinker at the University of Washington School of Law, has argued convincingly that we need to move toward “visceral notice” and consider experiential indicators that ensure data subject engagement and understanding when significant data decisions are presented.

George Doe seems to agree (emphasis added):

When you check that box it should have a bunch of stars and bells and whistles around it. Because there are plenty of people who click boxes. Nobody reads their iTunes agreement. That's how I feel about the family finder thing: You just check all the boxes, just keep doing it, and never put a whole lot of thought into the possibilities.

I would want a warning saying, "Check this box and FYI: people discover their parents aren't their parents, they have siblings they didn't know about. If you check this box, these are the things you'll find."

At a minimum, George Doe’s story is a cautionary tale for privacy pros. We should be cautious of relying on mere compliance as a complete answer to the privacy issues that emerge in today’s information economy—even when compliance demands the explicit consent of your customer. We may even need to frame this in ethical terms: Do organizations have an ethical duty to ensure that their data subjects clearly understand the foreseeable consequences when significant data decisions are being presented? I think George would answer “yes.”

There is another theme that emerges in George Doe’s story that is relevant to the broad debate over privacy: the value of forgetting.

The facts revealed by the genetic tests in the story are deeply personal and relevant to many of those involved. Thomas (the half-brother revealed by the tests) was adopted and had no knowledge of his parentage or family medical history. We can presume that such information was extremely important to him. George initially thought the revelation was fascinating—a discovery enabled by the power of genetics.

But what about George’s father and his ability to move on with his life after the birth and adoption of Thomas? Does he have a right to expunge that information from history and rebuild? Does he have a right to deny Thomas access to his personal medical and family history?

These questions were eloquently explored by Viktor Mayer-Schönberger in his book, Delete, where he argues that humanity has always needed a mechanism to forget the past. In forgetting, argues Mayer-Schönberger, we empower forgiveness, reduce emotional pain and allow for reinvention and redemption. In the George Doe story, however, we see a countervailing interest: Thomas’s interest in understanding his genetic heritage. It is cliché to say “your right to swing your fist ends at the tip of my nose,” but does George’s father’s right to forget interfere with Thomas’s right to know?

The value of forgetting in the article is clear, given that George Doe’s story ends in the divorce of his parents, the alienation of his father, and his own emotional pain and turmoil. In the end, George questions the idea that more information, more knowledge, is better than the alternative of not knowing:

One of my favorite phrases is sunlight is the best disinfectant. I still think that's true. But this has challenged that worldview. This is an example where having more information has had a negative emotional and psychological impact on me and family relationships.

The complexity of this story, and of the millions of possible stories that genetic testing may create, is mind-boggling. And across our data-driven society, the need for sophisticated privacy practices that untangle such complexity is abundantly clear. Consent and forgetting are just two vectors that privacy pros need to balance as we seek better solutions to protect and manage privacy. And we must remain ever-vigilant to the reality that, often, robust compliance will not be enough to resolve all privacy concerns.

Note: The article states that, subsequent to George Doe’s use of the service, 23andMe changed its consent standard for the family finder service from an opt-in to an opt-out. The 23andMe privacy statement on its website appears to confirm this.
