The U.S. Federal Trade Commission’s fifth annual PrivacyCon, held virtually for the first time due to the pandemic, brought together privacy researchers from a variety of disciplines to share their latest work on issues ranging from health apps to the Internet of Things to bias in artificial intelligence algorithms.

The papers presented at a session on emerging issues around international privacy looked at the fundamental tension between consumers’ desire for control over their personal data and the increasing reliance of firms on consumer-generated data as the fuel that powers everything in the digital economy — from targeted advertising to machine-learning technologies. Others focused on privacy and security issues, including data deletion and opt-out choices on websites, persistent identification of users, security certification in the payment card industry, and the adoption and abandonment of security, privacy and identity theft protection practices.

The research provided insight into previously underexplored or unexplored territory, such as how much consumers around the globe are willing to pay for privacy, what effect the EU General Data Protection Regulation and the exercise of privacy rights are having on consumer welfare and competition, and how websites are implementing requirements to obtain consent, offer opt-outs and allow consumers to request data deletion.

Paying for privacy

Indiana University Professor Jeff Prince’s study, "How Much Is Privacy Worth Around the World and Across Platforms?," tried to uncover the monetary value that consumers place on their privacy. It did so by asking consumers how much they would need to be paid to allow a platform to share certain types of data about them with third parties.

He found that individuals value the privacy of their financial and biometric information the most. The next two most valued types of data were text messages and contact information, followed by browsing history and voiceprint data. The two data types participants put the lowest value on were data about their network and location data.

This ranking was remarkably consistent from country to country, as respondents came from Argentina, Brazil, Colombia, Mexico, Germany and the U.S. Interestingly, though, German participants put a higher premium on their privacy than participants in Latin America and the U.S. Female and older participants also placed higher premiums on privacy than male and younger participants, respectively.

Notably, the study raises particularly interesting questions as businesses continue to grapple with the California Consumer Privacy Act’s “financial incentives” provisions and its non-discrimination requirements.

GDPR’s effects on consumer welfare and competition

A study by economists at the Massachusetts Institute of Technology and Columbia University, "The Effect of Privacy Regulation on the Data Industry: Empirical Evidence from GDPR," sought to examine the effect that the implementation of the GDPR had on the tracking of consumers.

Specifically, it found that when the GDPR came into effect, there was a significant drop in the number of users that firms could track, as users exercised their right to opt out. This had surprising effects on revenue and data quality, though. On the one hand, companies lost revenue because they could no longer collect data about as many users as before. On the other hand, the average value of the remaining consumers increased: the data retained about consumers who had not opted out was cleaner, and it became easier to track those who continued to opt in.

Thus, as the researchers explained, the opt-out consumer who takes advantage of the privacy protections afforded by the GDPR “makes the opt-in consumers who share their data more trackable and possibly more predictable to the firms with which they share data.”

The study suggests that privacy-conscious consumers, described as the “clear winners,” do indeed benefit from laws such as the GDPR and the CCPA.

Yet, some panelists agreed that companies are not adequately incentivized to comply with the GDPR because “enforcement is lacking.” One panelist speculated that this may be due to a lack of funding or personnel at data protection authorities, a point also made in the European Commission’s report issued following the second anniversary of the GDPR’s application. A study presented by Boston University Professor Garrett Johnson on the effect of the GDPR on market concentration found evidence that in countries with stricter regulators, companies were more compliant with data minimization requirements.

This highlights the fact that GDPR enforcement is not a monolith, but is carried out at the national level, which allows for fragmentation and diverging approaches.

More specifically, the study, "Privacy & market concentration: Intended & unintended consequences of the GDPR," found that websites reduced their use of web technology vendors by 15% immediately after May 25, 2018, in response to the GDPR’s data minimization mandate. At the same time, however, the GDPR increased the industry’s concentration, that is, the extent to which the market is dominated by a small number of firms.

In other words, more small vendors were dropped, while larger ones were retained. As the authors point out, “As policymakers wrestle with how to protect individual privacy, they should balance the risk of increasing concentration of personal data ownership and increasing market power.”

Privacy choices: Consent, opt-out and deletion requests

Christine Utz, a systems security researcher at Ruhr-Universität Bochum, presented on how websites’ choices about where and how to display cookie consent notices affect the way users engage with and understand them. The study, "(Un)informed Consent: Studying GDPR Consent Notices in the Field," showed that users were most likely to interact with cookie consent notices displayed in the lower-left corner of the screen. The reason for this, the researchers explain, is that the most important, denser information tends to be displayed lower on the page, and on the left-hand side because the Latin alphabet reads from left to right.

The study also found that, given a binary accept/decline choice, users were more likely to consent to tracking than if they were presented with an array of options to allow cookie use for various categories or companies individually. The practice of “nudging,” whereby the interface design of a website steers a user toward the choice preferred by the website’s owner, usually a “privacy-unfriendly option,” was shown to have a significant effect on users’ choices. Websites that highlight the “accept” button while leaving the “decline” button unhighlighted, or that hide advanced settings behind hard-to-see links, are examples of nudges observed in the study.

A similar study presented by Hana Habib, a researcher at the Institute for Software Research at Carnegie Mellon University, "An Empirical Analysis of Data Deletion and Opt-Out Choices on 150 Websites," analyzed the choices websites present to users for opting out of email communications and targeted advertising and for requesting data deletion. The research comes out of the Usable Privacy Policy Project, which is led by principal investigators at Carnegie Mellon, Stanford and the University of Washington and aims to make privacy policies easier to digest so users can make informed privacy decisions as they interact with websites.

Specifically, it looked at how difficult it was for users to exercise these choices by counting the average number of actions, or clicks, a user would need to make to opt out or delete their data. Encouragingly, the researchers found that around 90% of websites offer some form of opt-out mechanism, while almost three-quarters offer a data deletion mechanism.

Yet, the study also found that most choices are offered through privacy policies, which makes them burdensome and confusing to exercise. In addition, some opt-out pages provided multiple links, making it hard for users to know which one to click. The formatting of websites and the way text was written also left users uncertain about what to do. Requiring users to submit written requests to opt out or delete their data was another source of difficulty, as consumers struggled to state what they wanted or to find the right language to formulate the request. To eliminate confusion and reduce the burden on consumers, the authors suggest that websites provide multiple, clear paths to the same opt-out or data deletion link. Wordpress.com and phys.org were among the sites in the study that were easy to use, while reactions to nytimes.com were mixed. The authors also suggest standardizing privacy policy section headings, analogous to the model forms used for compliance with the Financial Privacy Rule, 16 CFR Part 313.

The value of privacy research

The research presented at PrivacyCon 2020 is invaluable for those seeking to understand the real-world application of privacy rights, how companies and websites are complying with requirements to respect users’ privacy rights, and the broader impact that laws such as the GDPR and CCPA are having on the economy, society and the lives and experiences of people around the world.

All of the papers, speaker bios and contact information, and recordings of the sessions are available on the FTC’s website for PrivacyCon 2020.
