
Privacy Perspectives | Why US-based companies should care about the Norway DPA's interpretation of GDPR consent



U.S.-based companies and regulators should fully understand the impact of a decision from Norway’s data protection authority, Datatilsynet, regarding how consent is “done,” what constitutes special category data and what “manifestly made public” means. With the new U.S. privacy laws in California, Virginia and Colorado borrowing the definitions of “consent” and “sensitive data” verbatim from the EU General Data Protection Regulation, as well as adopting a consumer intent-based standard for determining what constitutes “publicly available information,” the devil will be in the details of how these terms get interpreted.

What did Norway's decision teach us?

In its decision, Norway’s DPA holds dating application provider Grindr responsible for the extensive data sharing carried out by the app in connection with targeted advertising tracking embedded in the app. The decision analyzes the four elements of consent under Article 4(11) of the GDPR and imposes a fairly high threshold for meeting them.

Freely given: Separate, non-bundled consent for each purpose is required, and you can’t condition the service on giving consent. Per the decision, if you don’t provide clear information on whether the service can still be accessed when consent is refused, the choice becomes illusory. Accepting an entire privacy notice is not compliant. Also, sharing personal data with advertising partners requires separate consent, as it is not necessary for providing the main services in the app, and the processing operations serve distinctly different purposes.

Specific: The consent needs to be granular for each purpose.

Informed: The information explaining the choice needs to be such that the individual really understands the consequences of their choice. It needs to be highlighted and not just be part of a general privacy notice. When asking for consent for a few things, each needs to be clearly distinguishable from the other matters.

Unambiguous: It must be obvious that consent was given. Clicking “I accept the Privacy Policy” is not enough.

Importantly, at its crux, the decision drives home the point that for any processing that is not necessary for the service (specifically, targeted advertising), true consent must be obtained, and consent can never be true consent if the provision of the service is conditioned upon it, i.e., a “take it or leave it” scenario.

Will the U.S. follow suit?

Other than showing us where the wind is blowing in the EU, regulators and practitioners in the U.S. should pay close attention to this decision because the new privacy laws — California's Privacy Rights Act, Virginia's Consumer Data Protection Act and the Colorado Privacy Act — all use a definition of consent lifted verbatim from Article 4(11) of the GDPR, requiring that consent be a “freely given, specific, informed, and unambiguous indication of the consumer’s wishes.”

There are indications U.S. regulators are thinking along the same lines as Datatilsynet, at least in some respects. For example, the CPRA specifically prohibits “burying” consent inside terms of use or a lengthy privacy policy. It also states that hovering over, muting or closing content is not sufficient for providing consent, and it prohibits consent acquired through the use of dark patterns.

However, what about the “take it or leave it” aspect?

Traditionally, interpretation of U.S. Federal Trade Commission requirements for consent, as well as consent as acquired for the purpose of U.S. wiretapping laws, did not prohibit conditioning the provision of a service on acquiring consent. Does the adoption of the new definition for consent signal that this aspect of GDPR consent will be adopted in the U.S. as well? We can find a hint in this direction in a CPRA provision that allows a consumer to limit the use of their sensitive information to that which is necessary for the provision of the services unless the individual provides consent.

An added complication is that unlike under the GDPR, consent in the new U.S. privacy laws is not a legal basis that could apply to all processing. Rather, consent is implicated only in specific scenarios, such as:

  • Under the CPRA: selling or sharing the information of a child; opting in to the sale or sharing of information after an opt-out; entering into a financial incentive program.
  • Under Virginia's CDPA: processing personal data for purposes not reasonably necessary to or compatible with the disclosed purposes for which such personal data is processed, or processing sensitive information.
  • Under the CPA: opting in to processing information for targeted advertising (e.g., after an opt-out).

If adopted, the EU interpretation will insert into the U.S. arena the age-old debate over what is “necessary for the performance of a service” (see the recent discussion in the Irish Data Protection Commissioner's Facebook case, which has been referred for consultation under the one-stop-shop mechanism) and what counts as an equivalent alternative service: Is a free online newspaper with cookies the same as an online newspaper without cookies but with a monthly paid subscription?

Sensitive information

In the Grindr decision, Datatilsynet also analyzed how the app processes data and found it processes information concerning a consumer’s sexual orientation. Per Datatilsynet, disclosure of a data subject’s particular sexual orientation is not required for data to be deemed “concerning” it. It may be enough to disclose generic keywords related to an app or a platform that itself targets data subjects based on a special category of data. The decision further states that sharing personal data concerning a natural person’s “sexual orientation” with advertising partners is sufficient to trigger Article 9, irrespective of how the data is further processed by the data controllers to whom it was disclosed.

The U.S. laws borrow their definitions of sensitive information from the GDPR, including, in all three laws, “personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.” If adopted in the U.S., this interpretation will affect how companies process sensitive information and the scope of information whose use will require consent. It may spark the discussion, addressed in the Grindr decision, of whether these limitations put providers of services built around sensitive information at a disadvantage compared with their peers.

Public information

And what does the Grindr decision tell us about public information? Under the U.S. laws, publicly available information is carved out from the definition of sensitive information and, thus, from the opt-in consent requirement where it applies. Publicly available information includes “information that a business has a reasonable basis to believe is lawfully made available to the general public by the consumer” or information the consumer has disclosed when “the consumer has not restricted the information to a specific audience.”

The Grindr decision discusses whether the information on the app has been “manifestly made public” by the data subject. Here, too, Datatilsynet takes a rather narrow view of what this means. Per Datatilsynet, it must be obvious the data subject has meant to make the information in question available to the public. A public profile is not sufficient for this standard, especially in services, like dating apps or potentially certain social media platforms, where you need to create an account and/or the profile is usually only shown to a limited number of users. Datatilsynet also says, “It goes beyond the reasonable expectations of the data subject, even if their profile is set to ‘public,’ that an app would disclose information concerning their sexual orientation to advertising partners.”

The new U.S. laws specifically address targeted advertising. The CPRA pulls cross-context behavioral advertising out of the realm of “business purpose,” calls out the opaque nature of advertising and states, as a goal, granting consumers a clear explanation of the uses of their personal information, including how it is used for advertising, as well as how to control, correct or delete it.

With this in mind, when coming to assess consumer intent for the use of their information, will the regulators lean toward this path set for them by the EU?

