Privacy Perspectives | Privacy professionals need to be aware of tech abuse

Features designed to improve privacy and protect children in online services, apps and networked devices can also make it easier for abusers to maintain control in abusive relationships.

"Ever since caller ID and GPS became part of our lives, we've known that digital technologies can be used by abusers to harm or track their victims, and that's only become more complicated and more prevalent as technology has," Clinic to End Tech Abuse Director of Operations Lana Ramjit told an audience of cybersecurity professionals and academics at the the USENIX Association's Enigma 2023 Conference in January.

Ramjit's clinic, one of three in the U.S. dedicated to helping people in abusive relationships where technology plays an important role, recently published a toolkit for others seeking to set up their own tech abuse clinics.

Few technologists or privacy professionals have first-hand experience or training in tech abuse, Ramjit said. As a result, well-intentioned privacy and security features can backfire when put into use. To avoid such traps, privacy pros need to be familiar with the tactics abusers use and work with design teams to build defenses into products and services.

For example, abusers frequently use family telephone plans to maintain control. Such plans typically allow the account "owner" to enable location tracking and monitor phone numbers called, with the hope of protecting children. These same features let an abuser maintain control in a relationship, for example by turning off a partner's phone service for a few days as "punishment" if they do something the abuser finds displeasing. 

The Safe Connections Act of 2022 gives people experiencing abuse the right to have their phone lines removed from family plans. But things can go wrong when a person attempts to exercise this right. For example, phone companies might contact the account owner when they receive the disconnect request, which might alert the abuser that their partner is making plans to leave. This might trigger physical retaliation or worse.

"Traditional privacy and data security practices are designed to keep out strangers attempting to defraud the victim," Electronic Privacy Information Center Fellow Chris Frascella said. He authored EPIC's comments on the U.S. Federal Communications Commission regulations to implement the act. Such approaches frequently fail in the context of tech abuse.

Most abusers aren't sophisticated, but some are

Much of today's information economy is predicated on the idea of sharing information. Systems like Apple's iCloud automatically synchronize data between multiple devices, while services like Google Maps can share information between accounts. Many of these services, enabled when two people are in a healthy relationship, become tools for abuse and control if the relationship turns toxic. 

Today's systems are so complex it can be difficult for someone trying to escape abuse to find all the ways information might leak. For example, an abuser who uses Apple equipment might track another person's location through the "Find My" app's location sharing, hide an Apple AirTag in their car, monitor their movements from a laptop logged into the target's iCloud account, use Google Maps' location sharing or install a covert spyware, or "stalkerware," app. The abuser can then leverage this information for psychological dominance.

"They will take little pieces of information — the fact that you can see what someone is watching on the Netflix account — and say 'how are you liking the new season? I can see what you are watching. I know everything you are doing,'" Ramjit explained.

When people experiencing abuse go to authorities, their claims can seem so outlandish the authorities think the person must be suffering from mental illness. "Being treated as if you are in a mental health crisis, as opposed to describing abuse, is something that survivors have described repeatedly," said Natalie Dolci, a licensed clinical social worker and co-founder of the Technology-Enabled Coercive Control Clinic (TECC). She has personally worked with more than a hundred clients on these issues.

That's what happened to Katherine. After reporting her husband had hacked her phones and Wi-Fi, she was told she would receive help but was instead escorted by police to a hospital and subjected to an involuntary mental health evaluation. (Katherine asked that her last name not be used.)

Domestic violence laws that do not recognize tech abuse as coercive control can also work against victims. Although Katherine could prove her husband's tech abuse in court, the judge refused to grant her a protective order since there was no physical violence involved.

 "A perpetrator-partner can literally commit felony wiretapping against his wife and not get (an order of protection from domestic violence or stalking) for it, as I experienced," she said in an interview. 

A widespread problem

Technology-facilitated abuse, or "tech abuse," is thought to be a widespread problem, although researchers do not have good data regarding its prevalence. 

Emma Pickering is the Senior Operations Manager for the Technology Facilitated Abuse and Economic Empowerment Team at Refuge, the U.K.'s largest provider of services for domestic violence victims. Pickering estimates that tech abuse is present in 95% of the domestic violence cases her organization encounters.

Refuge has developed a Tech Abuse Traffic Light System to categorize the types of tech abuse present:

  • Green covers abuse like constant calls or an insecure dating app. These cases are now addressed in training that many of the organization's front-line workers receive. 
  • Amber covers more technical problems, such as a compromised Wi-Fi router or a hacked email account. These cases are supported by key workers and "tech champions." 
  • Red covers abuse involving physical tracking devices and spyware; these cases are referred to the tech team. 

Pickering said her team has responded to "over 3000 survivors with complex concerns" to date.

Unlike the U.K., the U.S. lacks a comprehensive approach to combating domestic violence and abuse. Instead, there is a patchwork of efforts at the state, county and local levels, which makes tech abuse especially difficult to address.

"I've worked with over 100 survivors," Ramjit said. "Most of the help that we give is not very sophisticated. It is often things that are considered relatively mundane, like reviewing account settings, changing account passwords, looking at login histories. It's a lot less of the spyware that people tend to imagine."

"When I tell people this," Ramjit said, "I often hear the same question: 'Is helping people rest their passwords really the best use of your time?'"

Ramjit insists it is, because helping people gives her the hands-on experience needed to design better services.

For example, being able to delete information from a device irrevocably is frequently seen as a pro-privacy feature. But at the conference, Ramjit quoted an interview showing this might not always be the case: "I wrote down things that were happening, altercations, and I had evidence. He told me that he needed to update the devices. He took the phone, forced the password out of me, and he deleted my journal."

Without such evidence, a person being abused might have trouble getting a restraining order or being successful in a child custody case. 

Since 2020, the clinic has received 550 direct referrals, Ramjit said in a follow-up interview. The organization now helps roughly 160 people in New York each year, with one full-time staff member, four part-time staff members and roughly a dozen volunteers. 

Turning to the providers

Practitioners say it would be most helpful if technology and service providers set up channels for vetted tech-abuse professionals to speak directly with platform owners about specific cases. But such opportunities for resolving individual cases are rare.

"Keeping users safe is our top priority and we have a dedicated policy prohibiting stalkerware in Google Play," a Google spokesperson contacted for this article said. "If we find an app that violates our policies, we take the appropriate action. Additionally, we believe it's important to collaborate across the industry to explore best practices in user protections and safety initiatives. We will continue to engage in these conversations and contribute to solutions."

An Apple spokesperson declined to answer questions for this article. Instead, Apple referred to the company's 112-page Personal Safety User Guide, which advises users to take measures such as deleting "unknown fingerprints from iPhone or iPad" and adding a "recovery contact" to prevent people from being locked out. The guide doesn't explicitly say an abuser might add their own fingerprint to an iPhone so they can access it without the owner's permission. There is no checklist for people who are experiencing abuse. Instead, they are advised to "partner with local law enforcement or courts." 

Apple's approach is consistent with that of other technology and platform providers, TECC's Dolci said.

"The only time I have had successful interactions with platforms is if I happen to know someone who works there who can escalate a request," she said. "Otherwise, in many cases, I am in the same position as survivors: knocking on doors that don't get answered. I've heard the same thing from people in criminal justice."



2 Comments


  • Oliver Kindzorra • Oct 10, 2023
    Thanks a lot for bringing this to my attention. As a privacy engineer, I'm constantly working on implementing privacy by design and by default, but I wasn't aware that such a problem exists. I wish lawmakers had included software and hardware manufacturers in their respective privacy laws, but they usually shy away from it because they don't want to go toe to toe with the heavyweights of the industry. Even European regulators admit the GDPR has a "design flaw" in this respect. That needs to change, either through changes in regulation or through proper privacy engineering. But the first step is accepting the problem and making it public. And that you did, for which I'm very grateful.
  • CHUA Teck Leong • Oct 23, 2023
    My understanding is that in some jurisdictions, laws have been passed to protect victims. As to whether this is a matter for the privacy professional, much more should be discussed so that a clear scope can be mapped out and introduced at the privacy-by-default stage.