Privacy by default in online services

Privacy as the default setting is one of the "seven foundational principles" of privacy by design (PbD), a concept developed back in the '90s. PbD has been widely recognized in the business environment and has subsequently become part of various legal frameworks. For example, it is expressly included in the Australian Privacy Principles.

More recently, data protection by design and by default was introduced by EU lawmakers as a prominent part of the GDPR. In this context, Article 25 of the GDPR calls for technical and organizational measures aimed at ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.

As the ePrivacy Regulation, which will also have a tremendous effect on the online environment, is being developed in the EU, the fundamental question remains: Which settings, exactly, are private by default?

This will be a very practical problem for many companies wishing to reap the benefits of the digital market, and some up-to-date guidance from regulators and industry standards would definitely help.

In the meantime, let’s try to analyze it ourselves. 

It is quite obvious that the online environment is built around collecting all kinds of information, including user-related data. Many big players on the internet have core activities and business models that revolve around gathering and sharing user-related data. Equally obvious is that, for a long time, these business models were not designed to be privacy-friendly, and various forms of user tracking remain a prevalent practice. So it generally looks more like "tracking by default" than "privacy by default."

Through consumer awareness and regulatory interventions, some changes have been introduced, and quite often users can now adjust their settings to make them more private. The right of choice and openness are supported by various tools, such as interactive dashboards, videos and pop-ups, through which users get a basic understanding of what happens with their data and what they can do if they are not happy about it. In other words, users are offered an opt-out supported by different software tools.

Although any improvements which strengthen consumer rights in the online environment are more than welcome, and are in fact necessary since users' trust is critical for the digital economy, this is still far from something we would call "privacy as the default setting."

Therefore, more adjustments seem to be necessary.

In particular, it seems online service providers should by default only use the data necessary for the core functionalities of the services requested by the consumer, e.g., when she created her account or downloaded an application. Making services better, enhancing a user's experience or personalizing services, justifications all too often used by service providers in their notices, would not meet this condition.

In such cases, or for such additional functionalities, opt-in instead of opt-out would be the more appropriate method. This would in fact be more logical altogether.
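To make the direction of such defaults concrete, here is a minimal, purely illustrative sketch in Python. The setting names are invented for the example; the point is simply that anything beyond the core service starts switched off and is enabled only by a deliberate user action.

```python
# Purely illustrative sketch: hypothetical default settings for a newly created account.
# Only the core service is on; every additional purpose is off until the user opts in.
from dataclasses import dataclass


@dataclass
class PrivacySettings:
    # The functionality the user actually asked for when creating the account.
    core_service_enabled: bool = True
    # Additional purposes, all disabled by default (opt-in, not opt-out).
    personalized_ads: bool = False
    usage_analytics: bool = False
    third_party_sharing: bool = False
    precise_geolocation: bool = False

    def opt_in(self, feature: str) -> None:
        """Record a deliberate, user-initiated choice to enable an extra purpose."""
        if not hasattr(self, feature):
            raise ValueError(f"Unknown setting: {feature}")
        setattr(self, feature, True)


# A new user starts with privacy-friendly defaults; only an explicit action changes them.
settings = PrivacySettings()
settings.opt_in("personalized_ads")  # happens only after a free, informed choice
```

Silence or inaction on the user's part then leaves the additional processing disabled, which is exactly what "privacy as the default setting" suggests.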

Why not convince people to switch on personalized advertisements instead of pushing them on consumers? Free and informed consent would no doubt have a more lasting effect, and if someone makes a conscious choice, it is more likely that she will in fact be interested in viewing some content, rather than fighting against pop-up windows. Perhaps this would call for modifying some business models so that quality comes before quantity, also in an online environment, where it can still be measured in many ways.

It should also be clear that no data traditionally viewed as sensitive or harmful should be gathered "by default" from consumers using online services. Under the GDPR, that kind of data includes:

  • data revealing racial or ethnic origin;
  • political opinions;
  • religious or philosophical beliefs;
  • trade union membership;
  • genetic data;
  • biometric data for the purpose of uniquely identifying a natural person;
  • data concerning health, and
  • data concerning a natural person's sex life or sexual orientation. 

It also includes data on criminal convictions and offenses or related security measures, which are not called sensitive data per se, but are treated similarly.

Processing such data would require, in most cases, explicit consent. In some situations, e.g., when such data were manifestly made public by the data subject, such consent would not be needed. However, even then, publicizing or further disseminating such data should not be the default setting.
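As a rough illustration only, and not as a legal compliance mechanism, a service could gate any processing of these special categories behind an explicit consent check. The category labels and the function below are hypothetical.

```python
# Minimal sketch with hypothetical names; not a legal compliance tool.
SPECIAL_CATEGORIES = {
    "racial_or_ethnic_origin",
    "political_opinions",
    "religious_or_philosophical_beliefs",
    "trade_union_membership",
    "genetic_data",
    "biometric_identification_data",
    "health_data",
    "sex_life_or_sexual_orientation",
}


def may_process(category: str, explicit_consent: bool, manifestly_made_public: bool = False) -> bool:
    """Allow processing of a special category only with explicit consent or a narrow exception."""
    if category not in SPECIAL_CATEGORIES:
        return True  # ordinary personal data still needs a lawful basis, just not this check
    return explicit_consent or manifestly_made_public


# Health data with no explicit consent is rejected by default.
assert may_process("health_data", explicit_consent=False) is False
```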

This would in fact be, to some extent, relevant to all categories of personal data, as they should not be made accessible to an indefinite number of natural persons without the individual's intervention.

Almost equally important is not to use default settings which would collect and further process, without the user's explicit will, data such as: financial information, in particular credit card numbers or other data that could be used to authorize financial transactions; government IDs, Social Security numbers and other identifiers having a similar role; or precise geolocation. The risks to individuals and the sheer probability of misuse are too great to allow such data to be gathered from the outset, without making sure that users are aware of the risks before providing them.

Similarly, this would be relevant to all kinds of automatic or semi-automatic profiling practices that consist of predicting aspects concerning performance at work, economic situation, health, personal preferences or interests, reliability or behavior, location or movements. No kind of tracking of individuals should be enabled by default. This should clearly be an opt-in, preceded by a sufficient description of the techniques used and the potential consequences for the individuals concerned. Instead, privacy-friendly algorithms based on the processing of aggregated and anonymous data could work well for defining the types of information or advertisements shown to particular groups of users.
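A sketch of that aggregated alternative could look like the following. The group and category names are made up; the essential point is that events are tallied per coarse, anonymous group and no per-user profile or identifier is retained.

```python
# Illustrative sketch: anonymous, aggregated counts per coarse group instead of per-user profiles.
from collections import Counter, defaultdict

# Events arrive as (coarse_group, content_category) pairs; no user identifier is ever stored.
events = [
    ("sports_section_visitors", "sports"),
    ("sports_section_visitors", "sports"),
    ("sports_section_visitors", "travel"),
    ("news_section_visitors", "politics"),
]

group_interest = defaultdict(Counter)
for group, category in events:
    group_interest[group][category] += 1  # only the group-level tally is kept

# Content or advertisements can be chosen for a whole group without profiling any individual.
print(group_interest["sports_section_visitors"].most_common(1))  # [('sports', 2)]
```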

It should be noted that the risks of creating detailed user profiles have been widely discussed and described in WP29 opinions, in particular in Opinion 02/2013 on apps on smart devices and in Opinion 8/2014 on Recent Developments on the Internet of Things. The former describes certain categories of data, typically processed in an online environment, as having a significant impact on the private lives of users and other individuals. This covers in particular such data as:

  • location;
  • contacts;
  • unique device and customer identifiers;
  • identity of the data subject;
  • identity of the phone;
  • credit card and payment data;
  • phone call logs;
  • SMS or instant messaging;
  • browsing history;
  • email;
  • information society service authentication credentials;
  • pictures and videos, and
  • biometrics.

It would be more than welcome if, when providing such data, the user were warned with some kind of additional prompt window and asked, as with online bank transactions, for a second confirmation of her intent to submit the data and allow for their processing. Such consent should, where possible, be given only for a limited period of time, or the user should at least be reminded, from time to time, of her right to withdraw it by changing the relevant settings. This could be less relevant and practical if the service itself is, for example, about providing an email account or storage for photos, inasmuch as it relates to content processed solely to deliver the core functionality to the user. However, it would be relevant in many other situations, and it would certainly be better to err on the side of privacy when engineering apps or services.
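Focusing on the time-limited aspect, a hedged sketch of such a consent record might look like this. The durations and names are arbitrary examples, and the second-confirmation prompt itself would live in the user interface rather than in this snippet.

```python
# Sketch of time-limited consent with a periodic reminder of the right to withdraw.
# All durations and names are hypothetical defaults, not requirements taken from any law.
from datetime import datetime, timedelta


class ConsentRecord:
    def __init__(self, purpose: str, valid_for_days: int = 180):
        self.purpose = purpose
        self.granted_at = datetime.utcnow()
        self.expires_at = self.granted_at + timedelta(days=valid_for_days)
        self.withdrawn = False

    def is_valid(self) -> bool:
        """Consent counts only while it is neither withdrawn nor expired."""
        return not self.withdrawn and datetime.utcnow() < self.expires_at

    def needs_reminder(self, every_days: int = 30) -> bool:
        """Every so often, remind the user that consent can be withdrawn in the settings."""
        elapsed = (datetime.utcnow() - self.granted_at).days
        return elapsed > 0 and elapsed % every_days == 0

    def withdraw(self) -> None:
        self.withdrawn = True


consent = ConsentRecord("precise_geolocation")
if not consent.is_valid():
    print("Consent has lapsed; processing for this purpose must stop.")
elif consent.needs_reminder():
    print("Reminder: you can withdraw this consent at any time in your privacy settings.")
```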

While it is still unclear what the final version of the ePrivacy Regulation will be, and which default settings will be good enough to be GDPR compliant, building consumer awareness and industry good practices also remains crucial. Ultimately, all these factors, laws, consumer expectations and industry standards, are strongly interrelated and will shape privacy standards in the online environment.

What would be recommended, as a good practice, is that only the data needed for the core functionalities of the app or service, which are apparent and transparent to the user, should be collected based on the default settings. For any other purposes, only anonymized and aggregated data should be gathered, unless the user herself modifies those settings so that personal data may be processed for specific purposes.

3 Comments


  • Jason Cronk • Jun 20, 2017
    Piotr, while I absolutely agree with you that informed consent should drive data usage and companies should only use data necessary for the core service, I think those concepts more appropriately fall under consent and data minimization. Under the original PbD formulation, Privacy by Default consisted of situations where "No action is required on the part of the individual to protect their privacy." If the risk were of communication surveillance, for instance, Privacy by Default would imply that the user need not make their own effort to encrypt their communications. Outside the security realm, consider the risk of aggregation of data (i.e., building dossiers). Privacy by Default might imply that identifiers be randomly assigned to reduce the risk of correlation across services. I, as the user, don't have to take the step of creating different usernames for every service I use to protect my privacy; the service provider has considered this risk and built in a control to help me.
    
    I often use the analogy of passwords. We don't give users the ability to choose "password" as their password, even though many would. Why? Because they don't have the security expertise to realize the risks of choosing a simple password (dictionary attacks, rainbow tables, etc.). Similarly, with privacy, we (privacy professionals) need to protect users by default because often users can't, won't or don't understand the risks involved to protect their own privacy.
  • Piotr Foitzik • Jun 20, 2017
    Hi Jason, I agree very much that data minimization is relevant here; what's more, I would call data minimization by default the very essence of settings which would be private by default. It seems that this is also how it needs to be interpreted under the GDPR, as it says 'The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed.' (cf. Article 25(2) of the GDPR). It may be, as you suggest, that the concepts of PbD and 'data protection by design and by default' under the GDPR, although close to each other, are not exactly identical.
  • Sayantan Dey • Feb 13, 2021
    Very well explained text