The European Data Protection Board (EDPB) recently issued guidelines on the requirements for Virtual Voice Assistants (VVAs) to be GDPR-compliant (the Guidelines).
What is a VVA?
A VVA is a service that understands voice commands and executes the request either independently or by working together with other systems and applications. VVAs are commonly found in smartphones, perhaps the most notable being Siri.
VVAs often act as intermediaries between the user and other applications and online services. As a result, VVAs need, and have, access to large amounts of intimate personal data in order to deliver the services users request. If not properly managed, such access may harm individuals' rights to data protection and privacy.
Range of Intermediaries
The Guidelines identify five potential intermediaries and stakeholders throughout the execution chain, which may bear different combinations of roles depending on the VVA, the personal data processed, the data subject's request and the business model. These include the following:
VVA application developer
The Guidelines do not make absolute determinations as to which stakeholder will be a data controller or a data processor; instead, they state that the stakeholders of a VVA should decide among themselves which of them will carry out which acts and comply with the corresponding obligations under the GDPR. Stakeholders therefore need to be aware of their potential obligations both when dealing directly with data and, indirectly, when interacting with other stakeholders.
Following the recent cases of Google Spain and Fashion ID, courts may quite readily find that joint controllership has arisen: joint controllership can be inferred from the parties' actions and technical configurations, rather than having to be declared in a legal contract between them.
Since VVAs process personal data (e.g. users' voices, locations or the content of their communications), they must comply with, among other obligations, the GDPR's transparency requirements in particular. Data controllers' privacy policies must be transparent and clear so that data subjects can control how their data is processed and know their data rights. Failure to provide the necessary information is a breach of GDPR obligations that may affect the legitimacy of the data processing.
Complying with the transparency requirements can be particularly difficult for the VVA service provider or any other entity acting as data controller. Considering the complex nature of VVAs, data controllers face obstacles in complying with the GDPR's transparency requirements, such as:
Multiple users: data controllers should inform all users (registered, non-registered and accidental users), not only the user setting up the VVA.
Ecosystem complexity: the identities and roles of those processing personal data when using a VVA are far from clear.
Specificities of the vocal interface: digital systems are not yet well suited to voice-only interactions. However, being able to inform users/data subjects clearly and correctly through means other than writing is a necessity.
The Guidelines also advise against bundling the VVA service with other services such as e-mails, due to the complex privacy policies that might accompany it.
The GDPR states that data must be collected for specified, legitimate purposes and not processed beyond those purposes. One interesting point raised by the Guidelines is how data processing for user profiling (for personalized content and advertising) interacts with the purpose limitation principle. The Guidelines decline to make an absolute determination as to whether such profiling constitutes an expected element of a VVA that is 'necessary for the performance of a contract' (in which case no consent would be required). Data controllers should therefore take extra precautions and ensure that users are notified of such data processing or given the option to opt out of that specific aspect of the VVA.
The amount of data collected, whether directly or indirectly obtained through processing and analysis, should be minimized. For example, a VVA should not analyze the user's voice or other audible information to derive information about their mental state, possible illnesses or life circumstances.
Depending on the location, context and microphone sensitivity, a VVA could collect third parties' voice data as part of the background noise when capturing the user's voice. Even where the background noise does not include voice data, it can still include situational data that could be processed to derive information about the data subject (e.g. their location).
The Guidelines therefore recommend that VVA designers consider technologies that filter out background noise, so as to avoid recording and processing background voices and situational information.
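To make this recommendation concrete, the following is a minimal, purely illustrative sketch of one way a designer might suppress quiet background audio before anything is stored or transmitted: an energy-based gate that keeps only frames loud enough to plausibly be the foreground speaker. The frame size and threshold are assumptions for illustration; production VVAs would use proper voice activity detection and noise-suppression models rather than this simple gate.

```python
import math

FRAME_SIZE = 160          # samples per frame (10 ms at 16 kHz) - illustrative assumption
ENERGY_THRESHOLD = 0.01   # assumed RMS cut-off separating "foreground" from background


def rms(frame):
    """Root-mean-square energy of one frame of samples in [-1.0, 1.0]."""
    return math.sqrt(sum(s * s for s in frame) / len(frame))


def gate_background(samples):
    """Zero out frames below the energy threshold so quiet background audio
    (e.g. bystanders' voices, ambient situational sound) is never retained."""
    out = []
    for i in range(0, len(samples), FRAME_SIZE):
        frame = samples[i:i + FRAME_SIZE]
        if rms(frame) >= ENERGY_THRESHOLD:
            out.extend(frame)                # keep likely foreground speech
        else:
            out.extend([0.0] * len(frame))   # suppress background content
    return out


# A loud "speech" frame followed by a quiet "background" frame.
audio = [0.5] * FRAME_SIZE + [0.001] * FRAME_SIZE
cleaned = gate_background(audio)
```

The point of the sketch is the ordering: filtering happens at the edge, before storage, so third-party voices in the background are never processed at all.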
The data minimization principle is closely related to the storage limitation principle: data controllers need to limit not only the period for which data is stored, but also the type and quantity of data stored.
As previously mentioned, VVAs have access to information of an intimate nature concerning both data subjects and third parties, which is protected under the GDPR. It is imperative that businesses and users of VVAs are aware of their roles and corresponding obligations under the GDPR (particularly given the increasingly loose definition of joint controllership in recent cases), and it is crucial that the principles discussed above are examined and implemented thoroughly to ensure full compliance with the GDPR.
 Case C-131/12 Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González OJ C 212/4.
 Case C-40/17 Fashion ID GmbH & Co. KG v Verbraucherzentrale NRW eV.