Key takeaways from CNIL's draft recommendation on smart cameras

Smart cameras are not a simple extension of existing video surveillance techniques; they raise new ethical and legal questions. Simply put, the main difference between smart cameras and a traditional video surveillance system is that smart cameras are coupled with software that allows real-time, continuous analysis of the captured images. Algorithms analyze those images and extract information through pattern recognition, movement analysis and object detection, without any human analysis being necessary. This implies that images are no longer only occasionally viewed by a limited number of persons behind a screen; instead, individuals are subject to an automated analysis of who they are and what they do.

When these devices are used in a public space, they allow users to track individuals, detect suspicious events or profile individuals according to, for example, their age or religious beliefs. When coupled with facial recognition technology, smart cameras even provide the ability to identify individuals by processing their biometric data.
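To make the difference with traditional CCTV concrete, below is a minimal, purely illustrative sketch of the kind of automated, continuous analysis loop described above, assuming the open-source OpenCV library and its bundled face detector: every frame is analyzed by an algorithm rather than viewed by a human operator. The camera source and detection parameters are hypothetical choices, not drawn from the CNIL text.

```python
# Illustrative sketch only: automated, frame-by-frame analysis of camera
# images with no human viewing, using OpenCV's bundled Haar-cascade detector.
import cv2

capture = cv2.VideoCapture(0)  # hypothetical camera; real systems use network streams
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # The algorithm, not a person behind a screen, decides what the frame contains.
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"{len(faces)} face(s) detected in this frame")

capture.release()
```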

France’s data protection authority, the Commission nationale de l'informatique et des libertés, observes that the deployment of smart cameras in the public space raises considerable challenges as they are increasingly used by a wide range of stakeholders. Examples include public authorities using them to safeguard public order (e.g., crowd control or public health restrictions) and retailers using them for security purposes as well as for commercial and marketing purposes (such as displaying targeted advertisements on billboards according to individuals’ mood, age and gender). The CNIL therefore identifies numerous risks to individuals’ privacy posed by smart cameras in its draft recommendation on the matter, published Jan. 14.

What are the privacy risks identified by the CNIL?

First, the CNIL states that a one-size-fits-all approach is not suitable to evaluate the privacy risks of smart cameras. The degree of risk will vary substantially according to:

  • The nature of the collected personal data.
  • The degree of intrusiveness of the processing activities.
  • The level of control that individuals have over these processing activities.

Therefore, the CNIL considers that risks should be evaluated on a case-by-case basis. However, from a general perspective, the CNIL states the following:

  • Smart cameras considerably increase the quantity of data collected.
  • Individuals may be subject to decisions that affect them without their knowledge and through an “invisible” process, and thereby lose anonymity in the public space.
  • Sensitive data could be systematically processed on a large scale (for example, if smart cameras are placed next to hospitals or places of worship).

Interestingly, the CNIL observes that these risks exist even if the captured images are anonymized or destroyed immediately, insofar as the EU General Data Protection Regulation is applicable to all processing of personal data, including anonymization processing operations.

How to be GDPR-compliant when using smart cameras?

First, the CNIL recalls that every organization intending to use smart cameras must have an explicit and legitimate purpose. More importantly, this purpose cannot be reduced to the operational objective pursued (e.g., the classification and analysis of individuals in the public space) but must be defined by the broader goal of the processing operation (e.g., road and public space management).

Second, the CNIL calls upon controllers to justify the need to use smart cameras by assessing whether there are less intrusive means of achieving the purposes envisaged (e.g., infrared sensors, security guards, police officers), and by demonstrating that smart cameras successfully meet the objectives pursued.

Furthermore, the CNIL considers that the information provided to individuals should not be generic but must address the particular nature of smart cameras (i.e., their algorithmic nature and automated analysis capacity) and must be provided via suitable media (e.g., information panels, videos, QR codes, sound announcements).

Finally, the CNIL considers:

  • A data protection impact assessment should be performed because of the innovative character of smart cameras and the systematic, large-scale monitoring they entail.
  • The appointment of a data protection officer may be mandatory, especially if an organization's core activities are based on the large-scale use of smart cameras.
  • Privacy-by-design considerations must be implemented. For example, organizations should consider reducing the quality of the images (by lowering the definition) or the number of images processed. They should also consider integrating automated processing mechanisms that allow almost immediate deletion of the source images and the production of anonymous information (a minimal illustration follows this list).
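The following is a minimal, purely illustrative sketch of those privacy-by-design ideas, again assuming the open-source OpenCV library: each frame is downscaled before analysis, only an anonymous count is derived, and the source image is discarded immediately. The camera source, resolution and detector are hypothetical choices, not requirements from the CNIL's recommendation.

```python
# Illustrative privacy-by-design sketch: lower the image definition, derive
# only anonymous aggregate information, and delete the source frame at once.
import cv2

capture = cv2.VideoCapture(0)  # hypothetical camera source
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_fullbody.xml"
)
anonymous_counts = []  # aggregate information only; no image is retained

while len(anonymous_counts) < 1000:
    ok, frame = capture.read()
    if not ok:
        break
    # Reduce the definition so that individuals are harder to identify.
    small = cv2.resize(frame, (320, 180))
    gray = cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)
    people = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3)
    anonymous_counts.append(len(people))
    # "Almost immediate deletion" of the source image: drop every reference.
    del frame, small, gray

capture.release()
average = sum(anonymous_counts) / max(len(anonymous_counts), 1)
print("average people per frame:", average)
```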

Two challenges for GDPR compliance: the right to opt-out and the use of the 'legitimate interest' lawful basis

First, while not categorical, the CNIL’s message is quite clear: most of the time, the “legitimate interest” basis would not be applicable because the balance between the rights and freedoms of individuals and the interests of the controller would not be struck, especially in the absence of reasonable expectations on the part of individuals. More precisely, the CNIL excludes the use of “legitimate interests” for smart cameras that analyze and segment people to display or send targeted ads based on age, gender, emotions detected through gestures and expressions, sensitive data, or interactions with an object. In these contexts, the CNIL even goes one step further: it observes that it cannot identify another appropriate lawful basis and concludes that the use of smart cameras for these purposes would not be GDPR-compliant.

Second, the CNIL observes that smart cameras can automatically capture the image of someone passing by, without that person being able to avoid it or opt out. Stakeholders have put forward some solutions to alleviate this issue, such as requiring individuals to adopt a particular gesture to show that they opt out, or to wear a distinctive sign or clothing, but the CNIL considers these methods impose an excessive constraint on individuals.

The only possibility to exclude the right to opt out: the use of smart cameras for statistical purposes

The CNIL observes that organizations may use smart cameras without having to grant a prior right of opting out to individuals if two conditions are fulfilled:   

  • The statistical results obtained from the data processing must be aggregated and anonymized.
  • The statistical results must not be used to take a decision concerning the individuals whose images have been captured, which implies that the controller must first process the statistical data and then apply any resulting measures to a group of people necessarily different from the individuals whose images have been captured.

If these conditions are met, the CNIL considers that the right to opt out may be excluded if it prevents organizations from obtaining reliable statistical results.
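As a rough, hypothetical illustration of those two conditions, the sketch below keeps only aggregated, anonymous hourly counts and uses them to plan a future measure (staffing levels) applied to a group different from the people whose images were captured. The hourly bucketing and the staffing threshold are assumptions made for the example, not requirements stated by the CNIL.

```python
# Illustrative sketch of the "statistical purposes" conditions: results are
# aggregated and anonymous, and no decision targets the individuals filmed.
from collections import defaultdict
from datetime import datetime

hourly_counts = defaultdict(int)  # anonymous aggregates only; no identifiers


def record_detection(event_time: datetime, people_detected: int) -> None:
    """Add a detection count to the hourly aggregate; nothing per-person is stored."""
    hourly_counts[event_time.hour] += people_detected


def plan_staffing(threshold: int = 50) -> dict:
    """Use the aggregated statistics to plan staffing for future visitors,
    a group necessarily different from the individuals whose images were captured."""
    return {hour: ("extra staff" if count > threshold else "normal staffing")
            for hour, count in sorted(hourly_counts.items())}
```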

The CNIL’s analysis and conclusions could be similar in other jurisdictions

In light of this difficulty and the other challenges of being GDPR-compliant when using smart cameras, the CNIL calls upon organizations, particularly public authorities, to draw the line between what is “technically possible” and what is legally, socially and ethically acceptable.

It appears that these conclusions may be transposable to other jurisdictions. The principles of proportionality, data minimization, privacy risk mitigation and individual rights are core principles recognized under most privacy regulations. Therefore, the CNIL’s public consultation, open through March 11, is a first step in what will likely be a global reflection.


