Response to EDPB draft Guidelines on connected vehicles and mobility-related applications
Connected vehicles and mobility-as-a-service solutions are key to Europe’s digital transformation, where the increased safety, ease and efficiency of mobility can contribute towards a safer and greener society. The correct application of the General Data Protection Regulation (GDPR) and of the ePrivacy framework is necessary to this end.
DIGITALEUROPE is therefore pleased to submit its comments to the European Data Protection Board’s (EDPB) draft Guidelines 1/2020 on the processing of personal data in the context of connected vehicles and mobility-related applications.
The draft Guidelines contain appropriate references to best practices on privacy by design and data protection impact assessments (DPIAs), appropriate exemptions, and the use of interfaces and applications on devices to manage drivers’ preferences.
However, we find that in many parts the draft Guidelines do not expand beyond the key GDPR concepts, which are not sufficiently contextualised except in the final section on use cases.
In our response, we highlight elements of the draft Guidelines that we believe to be of particular importance. We hope our observations will be useful for an improved final version of the Guidelines.
We agree with the EDPB’s focus on personal data processing in relation to the non-professional use of connected vehicles by data subjects. This being said, we believe the final Guidelines could be improved by more explicitly confirming in paras 19 and 30 the professional use cases that are out of scope.
On the other hand, we find that some key use cases that apply to connected cars, particularly with regard to maintenance, safety and vehicle control, are not duly considered in the draft Guidelines. Such use cases will be widespread, and issues such as the use of location data or the relationship between local and cloud processing will be central. Similarly, ‘ownership takeover management’ functions will require the processing of legal identification documents, vehicle registration licences, biometric data (e.g. a selfie to be compared with an ID card) and other data for the purpose of preventing motor vehicle theft.
In setting out a range of practices to help mitigate risks to rights and freedoms of individuals, the draft Guidelines explicitly recognise the importance of both anonymisation and pseudonymisation. However, the draft Guidelines fail to properly reflect the role of these techniques.
The draft Guidelines state that data from connected vehicles will most certainly be personal data and assume it will always be possible or desirable to link such data to a driver or passenger. However, not all processing of vehicle data will aim at, or even be capable of, identifying individuals, and this position fails to recognise a wide range of such scenarios.
The final Guidelines should more clearly elaborate on the fact that for data to be considered personal, thought should be given to ‘all means reasonably likely to be used’ to identify an individual and recognise scenarios whereby a service provider will not be able to identify an individual from the data received.
It is not sufficient for data to be theoretically traceable to an individual for it to be considered personal. Rather, this determination should centre around the reasonable likelihood that identification will be possible, which should take into account the cost and time required for identification by those who are reasonably likely to access and use the information at hand.
It cannot be excluded in theory, for instance, that braking patterns could be so specific to an individual that technical brake data could be traced back to them. However, identification will be extremely difficult and therefore unlikely, particularly in situations where precise location, device or online identifiers are not being processed.
In particular, technical data can fruitfully be anonymised or indeed be anonymous from the start. Such data can include, for instance, performance statistics relating to a car’s braking system, where any identifying pieces of information either are not collected in the first place or are stripped out immediately after collection, ensuring that only the non-personal elements are processed. This also applies to other examples provided in the draft Guidelines, such as engine temperature or tyre pressure.
Similarly, in the context of location data and alerts emitted from devices installed in connected vehicles, anonymisation plays a key role in protecting users. Anonymisation can be achieved through various techniques that remove single vehicle identifiers and full journey details. These techniques provide effective safeguards to prevent identification and/or surveillance of individuals as well as potential misuse of data.
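The anonymisation techniques described above can be illustrated with a minimal sketch. The field names (`vin`, `driver_id`, `lat`, `lon`) and the grid size are hypothetical assumptions for illustration only; they do not come from the draft Guidelines. The sketch shows the two steps mentioned: removing single-vehicle identifiers and coarsening location so that full journey details cannot be reconstructed.

```python
def anonymise_record(record, grid=0.05):
    """Return a copy of a telemetry record with single-vehicle
    identifiers removed and coordinates snapped to a coarse grid,
    so individual vehicles cannot be singled out and precise
    journeys cannot be reconstructed."""
    # Step 1: strip direct identifiers before any off-vehicle processing.
    cleaned = {k: v for k, v in record.items()
               if k not in {"vin", "driver_id"}}
    # Step 2: coarsen location by snapping each coordinate to a grid cell
    # rather than keeping the precise GPS fix.
    cleaned["lat"] = round(record["lat"] / grid) * grid
    cleaned["lon"] = round(record["lon"] / grid) * grid
    return cleaned

sample = {"vin": "WVWZZZ123", "driver_id": 42,
          "lat": 48.85661, "lon": 2.35222, "tyre_pressure_bar": 2.4}
anonymised = anonymise_record(sample)
```

Technical values such as tyre pressure survive this step unchanged, which reflects the point above: the useful non-personal elements can be processed while the identifying elements are discarded.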
The draft Guidelines are the most direct articulation to date of the relationship between the GDPR and the ePrivacy Directive in Internet of Things (IoT) scenarios. While this relationship had already been explored in previous opinions from both the Article 29 Working Party (WP29) and the EDPB, the draft Guidelines are clearer as to the EDPB’s interpretation and its impact in real terms.
In essence, the draft Guidelines argue that it is ePrivacy that will provide the legal bases applicable to the processing of connected vehicle data given the applicability of Art. 5(3) of the ePrivacy Directive to terminal equipment, whose definition covers connected vehicles.
Any residual application of the legal bases under the GDPR will be limited to consent and contractual necessity, which merely mirror the legal bases already authorised by the ePrivacy Directive, i.e. the general consent rule and its two exemptions. Similarly, any further processing on the basis of Art. 6(4) GDPR will be forbidden.
In practical terms, this means that ePrivacy supplants, rather than complements, the GDPR when it comes to the legal bases for processing terminal equipment data.
We would like to stress that such a combined interpretation, if confirmed under the text currently being discussed for an ePrivacy Regulation, would de facto exclude reliance on the GDPR for use cases where the impact on individuals’ privacy would be minimal or non-existent, such as the use of anonymous data highlighted in the previous section.
We urge policymakers to carefully consider the implications of this approach for Europe’s competitiveness in the development and deployment of beneficial, non-invasive IoT and artificial intelligence (AI) applications in the mobility sector. An overreliance on consent, combined with a restrictive interpretation of contractual necessity, will make data processing in these scenarios exceedingly difficult or impossible – what is worse, with no benefits for data subjects.
The draft Guidelines themselves recognise there may be inherent difficulties in obtaining consent and providing notice to drivers of connected vehicles, without offering any real solutions or guidance on how to address these difficulties.
Recognising that in some scenarios the provision of notice may be impossible or disproportionate, as outlined in Art. 14(5)(b) GDPR – for example because a service provider has no direct relationship with the driver of a connected vehicle – suggestions of alternative approaches would be highly beneficial.
Similarly, the final Guidelines should provide more practical guidance on how to apply alternative legal bases under the GDPR (and potentially the future ePrivacy Regulation) such as contract and legitimate interest.
The draft Guidelines stress the importance of local processing, for example in relation to biometric templates. Correctly, the draft Guidelines find that such local processing falls outside the scope of the GDPR altogether.
However, contrary to the draft Guidelines, such local processing is not subject to the GDPR not so much because it falls under the household exemption but because no controller or processor is actually processing the data at hand.
From this perspective, we are puzzled by para. 73 of the draft Guidelines, which states that the GDPR in such cases anyway ‘does apply to controllers or processors, which provide the means for processing personal data for such personal or household activities (car manufacturers, service provider, etc.).’ The examples provided in Recital 18 GDPR, on which this paragraph of the draft Guidelines is modelled, all imply an actual processing activity performed by a ‘natural or legal person, public authority, agency or other body,’ which the local device itself clearly is not. This being said, the reference to Recital 78 GDPR, in the part relating to producers, is correct.
Similarly, we believe the last use case on rental car dashboards should consider that any data processing in this context happens locally in the car. To the extent that the rental company – or, for that matter, any other entity – does not process such data, it cannot be regarded as a data controller or processor. Again, this does not eliminate the need for them as well as producers to take into account data protection when developing and designing their products, services and applications.
Section 2.1.3 of the draft Guidelines classifies as personal data relating to criminal convictions and offences – for which the draft Guidelines coin the expression ‘offence-related data,’ which is not found in the GDPR – any data from which infractions can in theory be inferred. This, however, is not justified by Art. 10 GDPR, where this category of data refers explicitly to the criminal convictions and offences themselves or the related security measures. The mere processing of speed and location data does not amount to the processing of data about a conviction or offence, unless it is linked to the existence of such a conviction or offence as registered by competent authorities.
As such, we find no basis for the draft Guidelines’ assertion that ‘external processing of data revealing criminal offences or other infractions [i.e. from which such offences could only in theory be inferred] is forbidden.’ If such an interpretation were to be followed, the processing of basic data, such as an IP address, that may in theory be linked to criminal activity would not be authorised.
The draft Guidelines describe in-vehicle Wi-Fi connectivity provided by ‘road professionals,’ such as taxi drivers or companies, as an internet access service. This is incorrect. The taxi driver or company merely provides access to a communications network provided by a telecoms operator and is to be considered as the provider of an information society service, not of the underlying internet access service.
The draft Guidelines also find that the household exemption is not applicable in cases where an individual allows other passengers to connect to the in-vehicle Wi-Fi network. In addition to being perplexed by this extremely limited interpretation of the household exemption, which makes ordinary people data controllers, we find the reference to the exemption misleading: a vehicle’s driver or owner will, as a rule, not themselves be processing any data relating to such a Wi-Fi connection – let alone determining the purposes and means of such processing – and hence they cannot be considered a controller in the first place.
Contrary to the draft Guidelines, we submit that cars are not a totally private area where people can act ‘without encountering any external interferences.’ Cars are a means of transport that, on the contrary, must necessarily interact with the surrounding environment – other vehicles, pedestrians and infrastructure (respecting traffic lights, road signs, etc.) – and constantly react to it.
From this perspective, digital services and connectivity do not fundamentally change the nature of transportation, but can provide useful tools to improve the safety, convenience and environmental footprint of mobility.
While data processing from connected cars also generates a corresponding need to protect the related personal data, as well as the security of the related non-personal data, it is wrong to assume that such protection starts from a situation where cars do not face interference from the outside world.