
The Privacy Advisor | Levy details our intimate relationship with surveillance


Privacy professionals often talk about the datafication of our lives as if it were impersonal. In fact, data is incredibly personal and intimate, and the technologies we employ affect us in incredibly intimate ways. Given that, it's important for privacy professionals to work beyond compliance and closely examine the ways technologies are going to be used, particularly within asymmetric power dynamics. 

That was the message from keynoter Karen Levy at the IAPP's Privacy. Security. Risk conference here in San Diego. Levy is a sociologist at Cornell University, and she's currently working on the ways in which surveillance is bound up with our intimate lives. She illustrated her point with three examples of the way we see this playing out in the Digital Age. 

First, there's this relationship unfolding between surveillance and care. Think products like baby monitors, security systems and other hidden cameras that allow us to care for ourselves or others by surveilling them. 

"Surveillance is becoming a normalized way of taking care of one another," Levy said. "But there's less emphasis on how we're all becoming watchers." 

Besides babies and the elderly, another group we're increasingly surveilling is truck drivers. Many may have heard the familiar trope that AI will take over the truck-driving industry, given the safety appeal of self-driving trucks: They can work through the 24-hour long haul without getting tired the way a human driver does. Despite the panic over the potential economic crisis if 2 million drivers are forced out of a job by a technology they biologically can't compete with, Levy thinks those concerns are misplaced. Instead of AI replacing truck drivers wholesale, she sees a much more gradual curve, in which truck drivers interact with AI in intimate ways. That's because being a truck driver is more than just driving; it also involves customer interaction and securing loads, for example. 

"The rise of AI in trucking right now isn't displacement, but physical intrusion into their work and bodies," she said. 

What's happening is that technology companies are creating biometric monitors to measure when drivers are getting tired. One company, for example, created driver-facing cameras to detect fatigue. If the driver's eyes close, the technology automatically sends a note to his boss that he's exhibiting signs of exhaustion, and the driver's seat vibrates to jolt him back to attentiveness. These kinds of interactions are "very intimate and very much about the trucker's body. It's this forced hybridization that really changes the work that remains," Levy said. 

Finally, when we think about surveillance technologies and the ways they interact intimately with their subjects, we can look to domestic partner violence. Many of the privacy solutions we might ordinarily turn to really don't work if you're in an adversarial relationship. That's Levy's current focus of study at Cornell. Technology plays a very big role in domestic partner violence, according to research based at intimate partner violence shelters in New York City. She's looking at ways victims — 10 percent of men and 25 percent of women in the U.S. — can seek the help they might need, but on the sly. Victims seeking to disconnect from their partners struggle to do so because of how intertwined our devices are with our relationships. Often, partners have access to each other's devices via apps like "Find My Phone," for example. In addition, a variety of spyware apps are often employed to surveil victims without their knowledge.

And, often, completely disconnecting can make the situation worse: If you change your phone number, for example, then in-person stalking may begin. 

So how can we develop tools that help victims, but quietly? One way is to capitalize on already-deployed technologies that were perhaps developed for less dire situations but nonetheless help with operating discreetly. For example, one technology, built in response to the number of Americans watching March Madness on the sly at work, contains a "boss" button, which viewers can push when the boss walks by to replace the basketball game on screen with generic, productivity-looking graphs. 

"So this idea actually has applicability for people in some dangerous situations, where someone else might similarly have visual access to what you're doing," Levy said. We need a sort of "escape button," like the March Madness example, to help those who really need it.

While it's true our norms are shifting, it's important we develop the kinds of tools that appropriately react to those norms, to make sure, even as surveillance increases, so does our care for each other. 
