
PSD2-innovation and GDPR-protection: a fintech balancing act?

Part 2: automated decision-making (including 'profiling')

15 January 2020

This article is the second in a series of four, in which we address key issues in balancing the potentially competing requirements of the Payment Services Directive (Directive (EU) 2015/2366) (PSD2) and the General Data Protection Regulation (Regulation (EU) 2016/679) (GDPR). This article focuses on automated decision-making (including profiling) under the GDPR, and its adoption by payment service providers following the implementation of PSD2.

Who are BOLT?

Our series (read part one: consent) focusses on the fictional fintech company 'BOLT'. BOLT is licensed in the Netherlands as a provider of the two new payment services introduced by PSD2: (i) account information services (AIS); and (ii) payment initiation services (PIS). As a recap:

An 'account information service' is an online service which provides consolidated information on a user's payment account(s) held with payment service providers such as banks.

A 'payment initiation service' is an online service which initiates payment orders at the request of a user with respect to a payment account held at another payment service provider, such as a bank.

BOLT provides these services through a smartphone app. This app collects data from bank, savings and credit accounts and provides its users with consolidated information on the status of their personal finances. In addition, BOLT categorises and analyses its users' income and spending data and provides them, at their request, with a personalised credit score. Users can also initiate payments from their designated payment accounts directly from the BOLT app.

What is automated decision-making and profiling?

The GDPR specifically addresses profiling and automated individual decision-making, including profiling.

Under the GDPR, 'profiling' is defined as any form of automated processing of personal data for evaluating personal aspects, in particular to analyse or make predictions about individuals. Profiling can be used in different ways: (i) general profiling (without decision-making); (ii) decision-making based on profiling; and (iii) automated individual decision-making based on profiling.

The difference between (ii) and (iii) is best demonstrated by the following two examples where an individual applies for a loan online:

  • a human decides whether to agree the loan based on a profile produced by purely automated means (ii); and
  • an algorithm decides whether the loan is agreed and the decision is automatically delivered to the individual, without any prior and meaningful assessment by a human (iii).

The practical distinction, therefore, is between profiling that involves human review and decision-making that has no human input and is based solely on profiling.

Are profiling and automated decision-making the same?

No. The GDPR restrictions on profiling and on automated decision-making differ slightly. BOLT and other organisations must therefore carefully consider how data is being used in order to understand the commercial and regulatory implications of a proposed data use case. To expand further:

  • Profiling assumes an analysis has been carried out regarding various aspects of an individual's personality, behaviour, interests and habits to make predictions or decisions about them. The information is analysed to classify people into different groups or sectors. This analysis identifies correlations between different behaviours and characteristics to create profiles for individuals. These profiles will themselves be new personal data about the individual. The use of the word 'evaluating' suggests that profiling involves some form of assessment or judgement about an individual (and not a simple classification based on personal characteristics). This means that profiling can be useful, for example, for credit scoring or for evaluating individuals' disposable income in order to match this with a tailored offering of goods or services.
  • Automated decision-making means decision-making without human involvement, and can take place with or without profiling. Automated processing means that the use of the personal data is carried out by computers with little or no human interference. A decision from an automated process involving humans who have no significant influence on the outcome of the decision-making process would generally still be considered an automated decision within the meaning of the GDPR. An example of automated processing (which does not involve profiling) is a website on which an individual can apply for a loan. The website uses algorithms and automated credit searching to provide an immediate yes/no decision on the application.

Profiling can be a useful way to provide value-added services. The benefits of profiling and automated decisions include quicker and more consistent decisions, particularly where very large volumes of data need to be analysed and decisions must be made quickly. These benefits, however, carry risks for individuals unless the activities are carefully controlled. From this perspective, the GDPR therefore requires organisations to obtain individuals' explicit consent for certain types of profiling and automated decision-making activities.

What are the practical implications?

Let us consider the example where users of BOLT can apply for a loan through the BOLT app. BOLT can decide to use algorithms and automated credit searching to provide an immediate yes/no decision on the application. In that event, the credit application could be automatically refused, without any human involvement.
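By way of illustration only, a purely automated yes/no decision of the kind described above might be sketched as follows. The fields, rule and thresholds are all invented for this example — a real credit model would be far more involved — but the sketch shows what it means for a decision to be 'based solely on automated processing' with no human review:

```python
from dataclasses import dataclass

@dataclass
class LoanApplication:
    # Hypothetical inputs an automated credit check might draw
    # from categorised account data and an automated credit search.
    monthly_income: float
    monthly_outgoings: float
    requested_monthly_repayment: float
    existing_defaults: int

def automated_decision(app: LoanApplication) -> bool:
    """Return an immediate yes/no with no human involvement.

    Under the GDPR this would be a decision based solely on
    automated processing if no human meaningfully reviews it.
    """
    if app.existing_defaults > 0:
        return False
    disposable = app.monthly_income - app.monthly_outgoings
    # Illustrative affordability rule: the repayment must not exceed
    # 40% of disposable income (threshold invented for this sketch).
    return app.requested_monthly_repayment <= 0.4 * disposable
```

Because the outcome is delivered to the applicant with no prior, meaningful human assessment, a refusal produced by a function like this would fall squarely within the GDPR's restrictions on solely automated decisions discussed below.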

Banks are increasingly using automation to provide such value-added services. With the implementation of PSD2, payment service providers that provide AIS and PIS (such as BOLT) can also benefit from profiling and/or automated decision-making as they can, under specific circumstances and with the relevant consents/permissions, gain access to, and derive insights from, consumer payment data.

BOLT now wants to know: (i) under which circumstances it is permitted to carry out automated individual decision-making (including profiling); and (ii) what else it needs to consider when doing so. BOLT thereby needs to take into consideration that certain EU member states may impose restrictions additional to the GDPR that apply specifically to automated decision-making and profiling. Hence, the legal team at BOLT would sensibly confirm the specific rules in each jurisdiction in which BOLT has an establishment and/or would like to provide its payment services, to: (i) understand potentially competing legal and compliance requirements; and (ii) rationalise those requirements in a manner that allows the services to comply whilst also ensuring an attractive consumer proposition.

(i) Legal basis

According to the GDPR, BOLT can carry out profiling and automated decision-making as long as it meets all the principles and has a lawful basis for the processing as set out below.

Solely automated individual decision-making (including profiling) with legal or similarly significant effects on an individual is prohibited, although this prohibition can be lifted when appropriate safeguards are in place. In the financial services context, a solely automated decision on the creditworthiness of a consumer could very plausibly have a serious effect on an individual (e.g. their ability to afford housing, quality of life). An automated rejection of credit can also impact credit scoring and the ability of the consumer to obtain finance in the future.

For BOLT, this means that it can only carry out solely automated decision-making with legal or similarly significant effect if the decision is: (i) necessary for entering into or performance of a contract between BOLT and the individual; (ii) authorised by law; or (iii) based on the individual's explicit consent. BOLT will therefore need to focus closely on transparency with consumers and on obtaining the appropriate permissions for this data use case.

Furthermore, when BOLT is using sensitive personal data (such as data concerning health), stricter rules apply under the GDPR. In such a case, it can only carry out solely automated decision-making if: (i) it has the individual's explicit consent; or (ii) the processing is necessary for reasons of substantial public interest. Demonstrating that an activity is "necessary" for a substantial public interest is not a low bar, particularly as the payment service is for the commercial benefit of BOLT (with consumer gains being a general benefit rather than, for example, a public good). Therefore, in practice explicit consent would need to be relied on if sensitive personal data is to be processed.

The GDPR rules that apply to sensitive personal data and to automated decision-making (including profiling) apply fully where such information is handled in the PSD2 context. As such, a careful evaluation of whether any sensitive personal information will be processed as part of the services is necessary (as, without another ground, explicit consent would be required).

(ii) What else does BOLT need to consider when it carries out automated individual decision-making (including profiling)?  

Because of the risks associated with automated decision-making and profiling (such as the lack of visibility to individuals), BOLT may need to: (i) carry out a Data Protection Impact Assessment (DPIA); (ii) designate a data protection officer (DPO); and (iii) put in place other appropriate safeguards, such as providing its users with specific information about the processing and rights available to them. These aspects are considered in turn below.

Data Protection Impact Assessment

The GDPR highlights the need for the controller (i.e. BOLT) to carry out a data protection impact assessment in the case of 'a systematic and extensive evaluation of personal aspects relating to individuals which is based on automated processing, including profiling, and on which decisions are based that produce legal effects concerning the individual or similarly significantly affect the individual'. The European Data Protection Board (EDPB) has interpreted this to mean that the requirement applies to decision-making, including profiling, with legal or similarly significant effects that is not wholly automated, as well as to solely automated decision-making as defined above.

In some circumstances, local law requires that a DPIA is performed, so it is best practice to assess the risks in any event.

Data Protection by Design and Default

Payment service providers must adhere to the principles of data protection by design and by default. These principles require service providers to think about the impact their services will have on data protection before delivering them. For example, if the service is to be offered to individuals under 18, BOLT should consider age-appropriate design controls (e.g. simplified privacy information). Appropriate measures should be taken to achieve GDPR compliance and to minimise the processing of data.

It is increasingly clear that what is "appropriate" will also factor in the reputation and size of BOLT, and the nature and volume of data handled. Moreover, payment-related information includes financial information which, if compromised, may very plausibly cause distress to individuals. BOLT would sensibly work with its technical teams to understand what technical controls are appropriate to mitigate risk, and whether existing systems are sufficient to comply with data protection requirements.

Provision of Information

As a 'controller' under the GDPR, BOLT is required to provide specific, easily accessible information about automated decision-making, based solely on automated processing, including profiling, that produces legal or similarly significant effects. If BOLT is making such automated decisions, it must: (i) tell the individual that it is engaging in this type of activity; (ii) provide meaningful information about the logic involved; and (iii) explain the significance and envisaged consequences of the processing.

This may include information relating to:

  • which factors are weighted in the decisions and what their weighting is; 
  • the origin of the data (e.g. data provided by the data subject, the data subject's payment history or public data files);
  • that the rating methods are verified on a regular basis in order to ensure their fairness, efficiency and equality; and
  • contact details for requesting a review of a decision.

Before initiating profiling or automated decision-making, it is essential that the profiling model is clearly documented and explained. A real focus for this compliance workstream will involve understanding how to provide this information "meaningfully". The notices should not be buried in long terms and conditions, and a careful balance should be struck in explaining the more technical aspects of any automated decision-making. It may, for example, be useful to provide a short video which explains how decisions are made, or very high-level guidance points.
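To illustrate what "meaningful information about the logic involved" might look like in practice, the sketch below surfaces the weighted factors behind a decision in plain language. The factors, weights and wording are invented for this example and do not reflect any real scoring model:

```python
# Hypothetical factor weights used by an automated credit decision;
# invented purely for illustration.
FACTOR_WEIGHTS = {
    "payment history": 0.40,
    "disposable income": 0.35,
    "existing credit commitments": 0.25,
}

def explain_decision(outcome: str) -> str:
    """Produce a short, plain-language notice of the kind the GDPR
    transparency obligations contemplate: what was decided, which
    factors were weighed and how heavily, and the right to review."""
    lines = [f"Your application was {outcome} by an automated process."]
    # List the factors from most to least heavily weighted.
    for factor, weight in sorted(FACTOR_WEIGHTS.items(), key=lambda kv: -kv[1]):
        lines.append(f"- {factor}: {weight:.0%} of the decision")
    lines.append("You may request human review of this decision.")
    return "\n".join(lines)
```

The point of the sketch is the register, not the code: a few short, ranked statements of this kind are far closer to "meaningful" than a technical description of the model buried in terms and conditions.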

Right of Access by Individuals

Under the GDPR, payment service providers (such as BOLT) must be able to justify every automated decision if asked by a consumer. By exercising their data rights (including the right of access), individuals can become aware of a decision made concerning them, including one based on profiling. If the automated decision has a material impact on the individual's interests, BOLT will need to understand the logic of the decision and also ensure that it has a team in place that is trained in responding to such requests from individuals.

An individual has a right to review of the decision made. It should be transparent how and why a decision has been made, and it must be possible to verify the decision. BOLT should have appropriate measures in place to safeguard the data subject's interests, including the data subject's right to demand human intervention, to express a point of view and to contest the decision. A process should be in place for individuals to challenge or appeal a decision, together with the grounds on which they can appeal. BOLT should also ensure that any review is carried out by someone who is suitably qualified and authorised to change the decision. The reviewer should take into account the original facts on which the decision was based, as well as any additional evidence the individual can provide to support their challenge. In the Netherlands, the data controller (i.e. BOLT) is considered to have taken suitable measures if the right to human intervention, the right of the individual to express their point of view and the right of appeal have been secured.

Profiling of 'silent party data'?

When BOLT shares consumers' transaction data under PSD2, this data may also contain information relating to individuals who have not given their explicit consent to the third party. This is so-called 'silent party data'.

Let us consider the following example.

  • Two friends, Adam and Nina, transact with each other on a regular basis. At some point, Adam decides to start making use of certain services provided by BOLT and consents to BOLT's access to his payment account data.
  • However, Nina's name and account number are also included in Adam's payment account data. This means that while Adam consented to the sharing of his information (as required by PSD2 – see the first article of the BOLT series), Nina did not.

The question, then, is whether BOLT is allowed to disclose this silent party data.

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens) has indicated in its advice on the implementation of PSD2 that explicit consent for the access to personal data only relates to the personal data of the individual that has given consent. For the processing of the personal data of third parties (e.g. Nina's data), a separate legitimate basis is required.

It has not yet been made entirely clear by supervisory authorities on what legitimate basis such third-party data may be processed. We would expect that data controllers (banks and third party providers) will try to base the processing of third-party data on the processing ground of 'legitimate interest'. This is in line with the opinion of the European Data Protection Board, which states that a lawful basis for the processing of silent party data by payment service providers providing AIS and PIS could be the legitimate interest of a controller or a third party to perform the contract with the service user. That legitimate interest is, however, limited and determined by the reasonable expectations of data subjects.

However, it should be noted that BOLT cannot simply apply the legitimate interest basis to process personal data. Under the GDPR, all entities relying on this legal basis are required to carry out a balancing test to assess whether their legitimate interest is overridden by the interests or fundamental rights of the individual. Factors that play an important role in demonstrating an actual legitimate interest include the nature and source of the data, the impact on the data subject and the potential risks.

If legitimate interest is relied upon, the rationale for the outcome should be documented. Typically, organisations document key decisions in this area in a 'legitimate interest assessment', which should also include legal analysis of the interaction between PSD2 and the GDPR.

Practical conclusions / next steps

In this article, we discussed the possible use of automated decision-making (including profiling) by providers of the new payment services introduced by PSD2 (such as BOLT). Automated decision-making and profiling are regulated by the GDPR. Therefore, BOLT will have to take into account the various GDPR requirements that apply when it carries out automated decision-making as part of its payment services. In summary, practical steps include:

1. Undertake a legitimate interests assessment: consider whether legitimate interest is an appropriate legal basis under the GDPR – if so, document it.

2. Understand data types: is sensitive personal information involved?

3. Update privacy notices to users/individuals: these should confirm any automated decision-making / profiling.

4. Conduct a DPIA, as appropriate: this helps in understanding risks and relevant mitigation steps, helping to lower the overall risk profile for individuals.

Sophie Wijdeveld, Advocaat-stagiaire, Amsterdam, wrote this article with the authors below

The third article in the BOLT-series will discuss the issues of data portability under PSD2 and GDPR.