Not even a pandemic can stop the GDPR

Dutch DPA continues to hand out fines

20 May 2020

It's May 2020, and lockdown continues to be a reality for many whilst the COVID-19 virus spreads across the globe. Readers are probably all getting a bit tired of reading and hearing endlessly about the pandemic, and many are longing for something different. In these distressing times, it may feel good to turn to things that can provide some sense of familiarity, a feeling that everything is going to be normal again someday.

Reading about the latest activities of the Dutch Data Protection Authority (Dutch DPA) could deliver this feeling, as it would seem that the Dutch DPA has not been slowed down by current circumstances, but has rather picked up the pace and continued to hand out fines as if it were business as usual. Readers can therefore rest assured that this article will not be about COVID-19, but instead about that other favourite topic – the GDPR.

In this article, we are going to explore two recent Dutch DPA decisions. These decisions may be of interest to organisations more broadly, as they shed some light on the Dutch DPA's approach to the interpretation of the GDPR and the Dutch Implementation Act.

The decisions at hand

The first case concerns an investigation into the Royal Dutch Lawn Tennis Association (Koninklijke Nederlandse Lawn Tennisbond, KNLTB), and gives some background on the Dutch DPA's views on commercial needs vis-à-vis the legitimate interest of an organisation.

The second case concerns the use of biometric data, more precisely fingerprint authentication, and provides some insight into the permitted use cases for such sensitive data and the boundaries of the Dutch Implementation Act.

The curious case of KNLTB

In March 2020, the Dutch DPA imposed a fine of EUR 525,000 on the KNLTB for sharing the personal data of KNLTB members with two of KNLTB's sponsors in consideration for payment.

Background

The investigation was started back in 2018, when the Dutch DPA received complaints against KNLTB's new data sharing policy. The investigation revealed that KNLTB was sharing certain personal data of its members, such as name, gender and contact details, with two of its sponsors. The sponsors then used this data to approach the affected individuals by telephone or by post with offers concerning tennis-related products and services. Approximately 300,000 individuals were affected.

KNLTB claimed that sharing such personal data with the sponsors was justified as it was based on the legal basis of legitimate interest as set out in the GDPR, and thus KNLTB was not required to obtain consent from the individuals affected by the data sharing. The Dutch DPA decided otherwise, and established that KNLTB could not rely on its legitimate interest, and thus the sharing of personal data with the sponsors lacked appropriate legal grounds.

Legitimate interest is one of the six legal grounds laid down in the GDPR upon which organisations may process personal data. In order to rely on legitimate interest, organisations must carry out a so-called legitimate interest test to assess whether: (i) the interest claimed is legitimate, i.e. it can be considered lawful, (ii) the envisaged processing is necessary to fulfil that interest, and (iii) such interest overrides the interests or fundamental rights and freedoms of the individuals concerned.

Conclusions of the Dutch DPA

In its decision, the Dutch DPA determined that KNLTB's interest in sharing data with its sponsors was purely commercial in nature, and concluded that such a purely commercial interest could not be considered legitimate or lawful. The Dutch DPA's reasoning for this conclusion was that KNLTB's data sharing for commercial purposes does not follow from a legal norm, and hence could not be considered lawful.

Having decided that KNLTB's interest was not legitimate, the Dutch DPA did not need to assess if the other requirements of a legitimate interest test (i.e. necessity and overriding nature) had been met.

The Takeaway

The most important conclusion of this decision is that the Dutch DPA stated outright that purely commercial interests cannot be considered legitimate interests under the GDPR. This view is also reiterated in the Dutch DPA's guidance on legitimate interest, available (only in Dutch) on its website.

Wider implications: Impact on advertising

This interpretation by the Dutch DPA could have a significant impact on the advertising industry and on organisations in general, as it deviates significantly from existing case law and regulatory guidance.

In its opinion on legitimate interests, for example, the Article 29 Working Party (WP29) stated that legitimate interest could include a broad range of interests that are 'acceptable under the law', and that only in the second step, when it comes to balancing these interests against the interests and fundamental rights of individuals, should a more restrictive approach and more substantive analysis be taken. In other words, the lawfulness of an interest must be interpreted in the broadest sense, and the real question is therefore whether such interest can be used as a basis for data processing, which would be determined during the second and third steps of a legitimate interest test. The WP29 also provided a (non-exhaustive) list of possible legitimate interests, which included, among others, conventional direct marketing and other forms of marketing or advertisement.

A similar approach may be derived from Recital 47 GDPR, which states that "[t]he processing of personal data for direct marketing purposes may be regarded as carried out for a legitimate interest."

KNLTB has appealed the decision, so it is not yet final. Based on the above, however, it will be interesting to follow the appeal, as the decision, if it stands, has the potential to significantly change the commercial and marketing strategies of Dutch organisations, and in particular how direct marketing as we know it is conducted in the Netherlands (or, possibly, in the EU).

Welcome to Gattaca

The second decision explored in this article concerns an unnamed company and its use of biometric data to log employee attendance and working time. For the sake of simplicity, and to keep readers engaged, we will refer to this company as Gattaca – inspired by the popular 1997 sci-fi film of the same title.

Background

On 30 April 2020 the Dutch DPA published its decision to fine Gattaca. In its decision, the Dutch DPA imposed a fine of EUR 725,000 on Gattaca for using fingerprint authentication in relation to its employee attendance and working time logging system. This is the highest fine imposed by the Dutch DPA under the GDPR to date.

Fingerprints are biometric data, which constitute a special category of data (otherwise referred to as sensitive data) under the GDPR and are subject to enhanced protection compared to 'regular' personal data. The GDPR's provisions concerning sensitive data may also be supplemented by national legislation, which is exactly the case in the Netherlands, where the Dutch Implementation Act (Uitvoeringswet Algemene Verordening Gegevensbescherming) sets out several additional rules and restrictions applicable to the processing of sensitive data. The use of several types of sensitive data, such as biometric data, also features on the Dutch DPA's DPIA blacklist.

The processing of biometric data is allowed only if one of the exceptions under Article 9(2) of the GDPR is met. In the case of Gattaca, the possible exceptions were (i) explicit consent, and (ii) substantial public interest on the basis of Union or Member State law. The latter is supplemented by the Dutch Implementation Act, which provides that the processing of biometric data may be permitted if it is necessary for security or authentication purposes.

Conclusions of the Dutch DPA

In its decision, the Dutch DPA explored whether Gattaca's use of biometric data could be based on one of the exceptions explained above. We will not summarise the part of the decision concerning explicit consent, as consent can usually not be viewed as an appropriate legal ground in the employment context, given that employee consent is rarely 'freely given'. This was the case with Gattaca as well: the Dutch DPA found that nothing in Gattaca's documentation indicated that the use of the fingerprint scanner was optional, and the employees were also under the impression that it was mandatory.

This left the Dutch Implementation Act's exception for processing 'necessary for security or authentication purposes' as the only possible legal ground. Gattaca argued that requiring employees to scan their fingerprints when clocking in and out was lawful, since this eliminated the possibility of abusing the logging system and contributed to a more reliable attendance register (and was thus necessary for authentication purposes), and was also necessary for security reasons.

However, the Dutch DPA arrived at a different conclusion and gave a very narrow interpretation of the relevant provisions of the Dutch Implementation Act. The Dutch DPA stated that when biometric data are used for security or authentication purposes, an assessment must be made as to whether identification through biometrics is necessary for and proportionate to the purposes at hand. The Dutch DPA established that ensuring a proper attendance register and combating the abuse of time registration are not proportionate to the use of biometric data, as such purposes may be achieved by other, less intrusive, means. As regards security, the Dutch DPA reached a similar conclusion, stating that for organisations such as Gattaca the need for security is not so high that the use of biometric authentication could be considered necessary and proportionate (as the published decision is anonymised, the type of business conducted by Gattaca is unfortunately unknown).

The Takeaway

The above decision provides organisations with further guidance on the use of biometric data on the basis of the legal ground provided for in the Dutch Implementation Act. It confirms that this legal ground must be interpreted narrowly and may be invoked only where the use of biometric data is necessary and proportionate, and where no other, less intrusive measures are available.