
Credit scoring: AI-based scoring must be transparent and not biased

New code of conduct for scoring-related data processing in Italy

31 October 2019

On 12 September 2019 the Italian Data Protection Authority (DPA) approved the new "Code of conduct for credit reporting systems operated by private entities regarding consumer credit, creditworthiness and punctuality in payments" proposed by the trade associations (CR Code).

The CR Code stems from the need to review the former Code of Ethics, which was rendered obsolete by the changes introduced by the GDPR.

Main points of the CR Code

Instant messaging as a way to communicate with data subjects

To simplify communications with data subjects, new forms of contact have been introduced, such as instant messaging systems used on smartphones. Subject to agreement with the data subjects, and provided that such messaging systems ensure traceability of delivery, (i) prior notices informing data subjects of their registration in a credit reporting system, as well as (ii) alert notices informing them that they are to be included in the "bad debtors" list, may be delivered via instant messaging.

Transparency in decisions

In the event of a denial of credit based on automated analysis of financial risk, data subjects may now request to know the information that led to the denial, as well as the logic underlying the functioning and operation of the algorithms.

Monitoring body

The CR Code envisages the establishment of an independent body – autonomous from sector-related companies – to oversee the operation of credit reporting systems. The monitoring body is composed of three members: one designated by the National Council of Consumers and Users (Consiglio Nazionale Consumatori e Utenti), one by the members of the CR Code, and the third, who acts as chairman, designated by the first two.

Data processing

To facilitate the proper functioning of the market, the records – i.e. only data that are necessary, relevant and not excessive in relation to credit risk assessment purposes – may be processed without the data subjects' consent, on the basis of the legitimate interest of the companies participating in the credit reporting systems (while guaranteeing the wider rights set out in the GDPR). Those legitimate interests include the correct measurement of credit risk, the correct evaluation of the reliability and punctuality of payments, and the prevention of the risk of evasion.

New credit categories

In order to keep pace with the digital economy, the scope of registered data is now extended to include not only loans and mortgages, but also various forms of leasing, long-term rental and loans between private entities (so-called "peer-to-peer" lending).

Data security

To ensure the reliability of the systems and to protect data from unlawful access, several security measures are envisaged, including the pseudonymisation of data, the ability to ensure on a permanent basis the confidentiality and integrity of processing systems, and a procedure to regularly verify the effectiveness of the technical security measures.
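The CR Code does not prescribe a particular pseudonymisation technique. Purely as an illustration, one common approach is keyed hashing (HMAC), which replaces a direct identifier with a stable token that cannot be reversed without a secret key held separately from the credit records. The function and key names below are hypothetical, not drawn from the CR Code:

```python
import hashlib
import hmac

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed, non-reversible token.

    The secret key must be stored separately from the credit records,
    so the token cannot be linked back to the person without it.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input and key always yield the same token, so records can
# still be matched across the reporting system without exposing the
# underlying identity.
key = b"keep-this-key-in-a-separate-secure-store"  # hypothetical key
token = pseudonymise("example-fiscal-code", key)
assert token == pseudonymise("example-fiscal-code", key)
assert token != pseudonymise("another-fiscal-code", key)
```

Because the tokens are deterministic per key, the reporting system retains its matching and scoring functionality while the raw identifiers stay out of the shared records.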

Data retention periods

Data retention periods vary depending on the type of data:

  • data relating to loan applications may be stored for 180 days;
  • data relating to late payments that were subsequently remedied may be stored for one year, if the delay does not exceed two months/instalments, and for two years in the case of longer delays;
  • data relating to late payments that are not remedied may be stored for three years;
  • historical data regarding timely payments may be stored for five years.

What next

The members of the CR Code have committed to comply with the rules and principles contained therein, even though the CR Code will become fully effective and enforceable only upon completion of the accreditation procedure for the aforementioned monitoring body, which requires the favorable opinion of the European Data Protection Board.

Overall, the CR Code raises the bar for the processing of data such as income, banking details, litigation, and existing indebtedness for scoring purposes. Although this kind of data is not expressly contemplated among the special categories of data set out in Article 9 of the GDPR, it is apparent that its processing may negatively affect individuals applying for a loan, especially where automated, AI-based scoring is not transparent.

The establishment of an independent monitoring body increases the likelihood of regulatory scrutiny, especially considering that consumers' associations will be represented in the body. We also expect that the monitoring body will (either directly or indirectly) report to the Italian Data Protection Authority, possibly resulting in investigation risks for clearing systems and lender banks. Adequate AI governance – e.g. policies addressing dataset quality, transparency rules and ethics – should help mitigate this risk by ensuring that the decision-making process is transparent and unbiased.

Giulia Ricci, Trainee Lawyer, contributed to the writing of this article.