09 Sep 2019

Response to EDPB consultation on video devices

DIGITALEUROPE is pleased to provide its comments on the European Data Protection Board’s (EDPB) draft Guidelines on the processing of personal data through video devices. The use of technology in this area has tremendous benefits for individuals and organisations, but also generates concerns in the public and uncertainty for businesses, particularly smaller ones. We therefore welcome the EDPB’s work to clarify how the General Data Protection Regulation (GDPR) applies to various types of processing by means of video devices. 

In our response, we highlight areas where we find that the application of the relevant GDPR provisions can be simplified in light of the letter of the text as well as existing case law. In particular: 

  • The household exemption should be more expansively interpreted in light of the clarifications brought about by the GDPR; 
  • Reliance on legitimate interest should be more clearly recognised, in particular when the purpose of processing is to protect property and physical integrity; 
  • Stricter requirements for special categories of data should be confined to processing that meets the relevant GDPR definitions, in particular the definition of biometric data as it applies to facial recognition. 

While our comments encourage the EDPB to avoid Guidelines that are unduly restrictive, DIGITALEUROPE does not support the unmitigated deployment of video devices and associated technologies across all use cases. Video technologies will only gain consumer support if they are trusted; in many scenarios, privacy mitigations are necessary and ethical. Our goal, therefore, is to strike a balance that makes it possible to continue to innovate and benefit from these technologies while also protecting data subjects’ rights and interests. 

We hope our suggestions, read in light of the GDPR’s substantive obligations to preserve data subjects’ fundamental rights, can contribute to a positive deployment of present and future video technologies in Europe. 

 

Household exemption 

In illustrating a narrow interpretation of the household exemption under Art. 2(2)(c), the draft Guidelines refer to two pre-GDPR CJEU judgments that exclude the exemption in cases of publication on the internet and where the processing covers, even partially, public spaces. Accordingly, the three examples of exempted activities listed in the draft Guidelines all refer to situations where individuals never publish their recordings online and/or are either in a remote area or in a purely private space.

However, the household exemption has changed in important ways under the GDPR compared to Directive 95/46/EC. In particular, Recital 18 now specifies that the exemption applies when there is ‘no connection to a professional or commercial activity’ and explicitly mentions ‘social networking and online activity’ in the context of an individual’s personal or household sphere as being exempted. 

These important changes should be reflected in the final Guidelines. The Article 29 Working Party (WP29) previously recognised that in an increasingly digitised society it would be illogical and impractical to subject individuals to full or even partial compliance with data protection law. As highlighted by the WP29, doing so could not only inhibit other fundamental rights, such as freedom of speech and association, but also jeopardise a long tradition of respect for individuals’ private lives, which should not be open to ‘official’ regulatory scrutiny.

The WP29 also recognised that other laws than data protection are available and might be more suitable to protect individuals against damaging material posted online under the household exemption. These include laws on libel, harassment, malicious communications, threatening behaviour, incitement, persecution or discrimination. The role of these laws in defending individual rights should also be reflected in the final Guidelines. 

 

Legitimate interest 

The draft Guidelines require a ‘real-life situation of distress’ for the legitimate interest legal basis to be available where video data are processed to protect property.

This means, based on the draft Guidelines’ example, that a shop owner will not be able to install cameras unless she has suffered previous incidents of burglary, theft or vandalism (of which the draft Guidelines require the owner to keep detailed proof) or, at the very least, is able to present empirical and localised statistics showing a high likelihood of such incidents in the area. 

In fact, there is no basis in the GDPR for such rigorous limits to controllers’ reliance on the legitimate interest legal basis in situations where there are such clear interests at stake. 

Any owner has a direct interest in protecting her property, irrespective of whether offences have already occurred or bear a high probability of occurring. While legitimate interest does require a balancing test, we disagree that a case-by-case determination is necessary to establish the validity of legitimate interest for such uses. More certainty is needed lest unnecessary and disproportionate effort be imposed, in particular on SMEs. 

On the other hand, we do agree that more complex determinations are required in relation to less straightforward cases. In practical terms, however, we believe these cases will be limited to those requiring a data protection impact assessment pursuant to Art. 35(3), particularly when monitoring on a large scale is concerned. 

This does not detract from the need for processing in each specific case to be adequate, relevant and limited to what is necessary, nor does it impact the need for the controller to implement technical and organisational measures minimising the impact on privacy and other fundamental rights (see next section of our response). 

 

Necessity of processing 

As stated in the Guidelines, there are various ways to protect property other than processing of video data. These include hiring security personnel and adopting physical protection measures such as fencing, locks or lighting. This, however, does not imply that processing of video data is not a relevant and suitable measure. 

With reference to necessity and the data minimisation principle (Art. 5(1)(c)), the draft Guidelines provide that video surveillance measures can only be deployed when and where ‘strictly necessary.’ ‘Necessity,’ however, should be construed more expansively. The CJEU has held that processing that ‘contributes to the more effective application’ of legislation pursuant to Art. 6(1)(e) – and, by extension, to the more effective pursuit of the controller’s legitimate interest under Art. 6(1)(f) – could be considered as necessary.

Video devices can be valuable instruments, alone or in combination with other measures, to more effectively protect property and physical integrity. To the extent processing is in line and objectively linked with this purpose, we see no reason why the use of video devices should be precluded in itself, or made conditional on a detailed justification that no other means could serve the same purpose. 

This does not detract from the applicability of other GDPR provisions that require the controller to implement appropriate technical and organisational measures representing valid safeguards for the specific processing circumstances. Some of these measures are mentioned in the draft Guidelines, including: blocking out or pixelating non-relevant areas; limiting use to more sensitive hours; or restricting access to the data to when an incident has occurred. 
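By way of illustration, the sketch below (in Python, assuming frames are handled as NumPy arrays; the function name, coordinates and block size are ours and purely illustrative) shows how a non-relevant area of each frame could be irreversibly pixelated before footage is stored or viewed:

    import numpy as np

    def pixelate_region(frame: np.ndarray, x0: int, y0: int, x1: int, y1: int,
                        block: int = 16) -> np.ndarray:
        """Coarsen a rectangular, non-relevant area of a frame by block averaging,
        so that detail in that area is not retained."""
        out = frame.copy()
        region = out[y0:y1, x0:x1]
        height, width = region.shape[:2]
        for by in range(0, height, block):
            for bx in range(0, width, block):
                patch = region[by:by + block, bx:bx + block]
                patch[:] = patch.mean(axis=(0, 1))  # replace each block with its mean colour
        return out

    # Hypothetical usage: mask a strip of the frame overlooking a neighbouring property.
    # masked = pixelate_region(frame, x0=0, y0=0, x1=200, y1=frame.shape[0])
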

 

Data subjects’ reasonable expectations 

The draft Guidelines adopt a presumption that, based on an ‘objective third party’ test, monitoring within public areas is likely not to meet the data subject’s ‘reasonable expectations.’

It must be noted that people’s expectations as to reasonable uses of video have changed since the introduction of the movie camera and can change in the future. A bank customer in the 1960s may not have reasonably expected that the bank would be fitted with cameras, yet the draft Guidelines now list this example as acceptable.

This does not necessarily imply acquiescence to increased, intrusive processing of personal data, but may reflect a genuine shift in societal expectations and preferences. We suggest that a more circumstantial, case-by-case assessment is needed. For instance, video surveillance of parks, in particular where such surveillance is properly signalled to data subjects, can be considered as reasonable in that it can substantially improve the safety and security of public spaces and of those who want to use them lawfully, similar to the parking area example. Similarly, employees working in controlled environments such as warehouses or factories nowadays can reasonably expect that their workplace will be secured and equipped with video devices, in particular where clear information and notice are provided to them. 

Again, this does not impinge on the application of broader GDPR provisions requiring the controller to minimise the impact of the specific processing operations on privacy and other fundamental rights. 

 

Data subject rights 

We are particularly concerned with the draft Guidelines’ statement that, when a data subject objects to processing, the controller should be obliged to switch off the camera unless the controller demonstrates compelling legitimate grounds for not doing so. 

In line with our comments above, we believe that the protection of property and physical integrity should always be considered compelling grounds, without the need for additional special circumstances or case-by-case assessments. Were this not the case, shoplifters or their accomplices could simply ask shopworkers to stop the processing and subsequently carry out their crimes. 

The need to protect property, staff and customers as well as to prevent crime must be considered a ‘compelling legitimate ground’ despite any objections. 

 

Special categories of data 

We welcome the draft Guidelines’ clarification that it is only when video is processed to deduce special categories of data that Art. 9 applies. 

This implies that controllers actively pursue such deductions from the footage; by contrast, the fact that special categories can hypothetically be inferred does not suffice to trigger application of Art. 9. While this is clear in the example at para. 61, the first example at para. 63 suggests that the mere fact of observing a crowd would trigger Art. 9 because it is in theory possible to infer sensitive data. This should be rectified in the final Guidelines. 

We are concerned with the draft Guidelines’ blanket statement that facial recognition functionality will in most cases require explicit consent. As explained at para. 75 of the draft Guidelines, the qualification of personal data as biometric data will require a case-by-case analysis to verify that the three cumulative criteria of Art. 4(14) are met. If one of the three criteria is missing, the processing of data will not be subject to Art. 9. 

In particular, it must be stressed that the definition of biometric data implies processing for the specific purpose of uniquely identifying a given natural person. If processing does not result in such unique identification, the definition does not apply. 

Facial recognition technology can in fact process data without uniquely identifying a person. Templates do not necessarily enable unique identification of an individual. The individual is uniquely identified only when the template is correlated with a pre-existing template connected to identifying information held by the controller. In the absence of this other template and information, the individual cannot be uniquely identified from the newly acquired template. 

Consistent with paras 79 and 80 of the draft Guidelines, processing of a face template that is used only to detect matching faces should not fall under Art. 9. By contrast, Art. 9 does apply if biometric templates linked to uniquely identifying information are built and stored. 

In some cases, facial recognition may be deployed with the aim to simply determine that two face templates are the same and others are different, with no interest in identifying who is behind each template. Examples of such uses include: 

  • Counting how many people enter a controller’s premises, while ensuring the same person is not counted twice; 
  • Queue measurement; 
  • Calculation of how long it takes to move from the start to the end of a queue. 

In other cases, facial recognition may be deployed to uniquely identify a person for whom the controller holds a biometric face template connected to identifying information. In such cases, the system neither seeks nor is able to uniquely identify every other person who steps in front of the camera merely by capturing her face in intermediate templates. 
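
To make this distinction concrete, the simplified sketch below (in Python, assuming face templates are embedding vectors compared by cosine similarity; the threshold and function names are ours, not the Guidelines’) contrasts on-the-fly de-duplication, where templates are never linked to identifying information, with identification against templates enrolled by the controller:

    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Case 1: on-the-fly de-duplication, e.g. for footfall counting or queue
    # measurement. Fresh templates are compared only with other transient
    # templates and are never linked to identifying information.
    def count_unique_visitors(templates, threshold=0.8):
        seen = []
        for template in templates:
            if not any(cosine_similarity(template, s) >= threshold for s in seen):
                seen.append(template)
        return len(seen)

    # Case 2: identification against templates enrolled by the controller and
    # connected to identifying information; only here is the live template
    # linked to a known individual.
    def identify(template, enrolled, threshold=0.8):
        for person_id, stored in enrolled.items():
            if cosine_similarity(template, stored) >= threshold:
                return person_id
        return None  # cannot be uniquely identified from the template alone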

In light of the above, we disagree with the draft Guidelines’ assertion that on-the-fly processing falls under Art. 9, thus requiring explicit consent from anyone captured by the camera. As a consequence, we urge the EDPB to reconsider the two examples contained in para. 84 of the draft Guidelines. 

 

Suggested measures 

Section 5.2 of the draft Guidelines, entitled ‘Suggested measures to minimise the risks when processing biometric data,’ appears to impose obligations on data controllers rather than suggestions. For instance, para. 87 states that in a controlled environment such as delimited hallways or checkpoints, templates ‘shall be’ stored in certain ways; likewise, para. 89 requires that controllers ‘must’ explore noise-additive measures. We urge that these and similar references be softened. Requirements to follow unvarying prescriptive steps can actually undermine privacy by deterring the use of protections that may be more appropriate to a given set of circumstances. They also run counter to the GDPR’s risk-based approach, which allows both controllers and processors to adapt their security measures to the actual risks. 

 

Transparency and information 

We believe that the first-layer information described in the draft Guidelines contains unfeasible requirements. The sample warning sign at para. 114, in particular, is overly crowded and disregards the many signs that have already been deployed in compliance with Directive 95/46/EC. 

We believe that a feasible layered approach to informing data subjects must necessarily imply a very basic first layer of information that clearly signals the video processing, provided by the sign, and additional information available in the second layer, in digital and/or physical format. In this context, we urge the EDPB to reconsider its position that the second-layer information should always also be provided in a non-digital format, as in most cases this will only generate increased burden without improving data subjects’ access to the relevant information. 

 

Use of alternative solutions to access services 

The draft Guidelines require controllers to always provide an alternative to biometric solutions, in particular for authentication for the use of a service. While we agree with this approach in principle, we believe consideration should be given to situations where biometric processing may be inherent to a given service. For example, the video technology used in a frictionless store is integral to the convenience of the service, which simply cannot be provided without it. 

In addition, when facial recognition is used for security, authentication and identification purposes, having to provide alternative solutions may undermine the whole purpose of the processing. Requiring the controller to always offer an alternative way to access the building such as through badges or keys could lower the building’s level of security. This is particularly problematic in the financial sector, which uses video ID verification products for know-your-customer and anti-money laundering purposes. 

For more information please contact:
Alberto Di Felice
Policy and Legal Counsel