21st December 2023
“The European Court of Justice has ruled that credit scoring, which is commonly carried out by credit reference agencies, can in certain circumstances constitute automated decision-making under the GDPR. A second point coming out of the decision is that credit reference agencies should only retain personal data about individuals from insolvency proceedings for as long as these are published on the relevant public register.
Credit reference agencies and their customers who rely on their information (such as banks) in the UK should take note. You might need to revisit your business model and/or data retention periods – although the CJEU decision isn’t automatically binding on UK courts, it could well be persuasive.”
Jessica Padget, Associate, Regulatory and Compliance
On 7 December 2023, the European Court of Justice (the CJEU) gave its decision on related cases involving a German credit reference agency (CRA).
The CJEU ruled that in certain circumstances, credit scoring carried out by CRAs can constitute automated decision-making under Article 22(1) of the GDPR. The CJEU also ruled that where a CRA retains personal data about individuals from insolvency proceedings, this information should only be kept for as long as it appears on the relevant public register.
In this briefing, we give a summary of the cases and findings, and explain how this could affect businesses in the UK.
The CJEU made a ruling in related cases OQ v Land Hessen and SCHUFA Holding AG (as intervener) (Case C-634/21) EU:C:2023:957 and UF and AB v Land Hessen and SCHUFA Holding AG (as intervener) (Cases C-26/22 and C-64/22) EU:C:2023:958.
SCHUFA is a German CRA – it provides information on the creditworthiness of consumers to its customers, such as banks. SCHUFA establishes the probability of a future behaviour (such as the repayment of a loan) and allocates the consumer a ‘score’, based on their specific characteristics, using mathematical and statistical procedures. The claimant in the first case was denied a loan by a loan provider on the basis of the ‘score’ SCHUFA supplied to that lender.
SCHUFA also records and stores information from public registers in its own databases. This includes information from the public insolvency register where an individual has been formally granted release from their debts. Such information is deleted from the public insolvency register after six months, but SCHUFA kept this information for three years. The claimants applied to SCHUFA for this information about them to be deleted.
The CJEU determined that automated decision-making will occur if the following is satisfied: (1) there must be a decision; (2) that decision must be based solely on automated processing; and (3) it must produce legal effects concerning the individual or similarly significantly affect them.
The CJEU found that SCHUFA’s scoring satisfied this test. On point (1), the ‘decision’ was the automatic refusal of the claimant’s online credit application without human intervention. Point (2) was satisfied because SCHUFA created an automated probability score relating to the individual’s ability to repay a loan in the future based on their personal data. On point (3), the lender’s action on the credit application relied heavily on the score received from SCHUFA. For example, where a consumer applied for a bank loan and the SCHUFA score was inadequate, the application was almost always denied.
The CJEU found that where the official publication of information about an individual’s discharged debts has ended, the CRA shouldn’t continue to retain this data. At this point, an individual’s right to have their data erased outweighs the interest of the CRA in having access to that information. The CJEU stressed the importance of granting individuals a clean slate when applying for credit – an insolvency should not negatively affect their ability to take out credit indefinitely.
The ruling suggests that any company retaining and using data from public registers for internal purposes, when this information has been removed from the public register itself, contravenes the GDPR.
Following Brexit, CJEU decisions are not binding on any UK courts. However, there remains enough common ground between EU and UK data protection law for a UK court to sensibly draw insight and interpretation from the CJEU.
The UK’s system relating to data protection still largely mirrors the EU system under the GDPR, and UK courts are expressly permitted to consider post-exit CJEU decisions when interpreting the UK GDPR (as set out in the European Union (Withdrawal) Act 2018).
The UK’s data protection regulator, the Information Commissioner’s Office, advises that loan providers will carry out automated decision-making under Article 22 of the UK GDPR if their website uses algorithms and automated credit searching to provide an immediate yes/no decision on the loan application. Based on the CJEU’s ruling, the scope of automated decision-making may now be applied to CRAs in the UK, even where the ultimate ‘decision’ to approve or deny the loan is taken by the lender. Indeed, the CJEU’s broad analysis of automated decision-making means that it may be applied more broadly, including in relation to companies using AI to make decisions (for example, across the insurance or healthcare industries).
If CRAs rely on information that is published on public registers, the CJEU decision also indicates that such information should be deleted once it’s removed from the register – this would certainly reduce the extent of information that CRAs are able to provide to customers.
Although the CJEU decision isn’t binding in the UK, if you are a CRA in the UK and you provide scoring or other profiling data on individual loan applicants to a customer, and if your customer uses this data as a determining factor in the granting of credit, you are likely carrying out automated decision-making. Other companies should treat this ruling as a prompt to reassess their data processing activities and establish whether they are carrying out automated decision-making.
Automated decision-making must be lawful under the UK GDPR – it is only permitted in certain circumstances: where the decision is necessary for entering into, or performance of, a contract between the data subject and a data controller; where it is authorised by UK law which also lays down suitable measures to safeguard the data subject’s rights, freedoms and legitimate interests; or where it is based on the data subject’s explicit consent.
With automated decision-making come additional information requirements. You have to inform the data subject about the automated decision-making, and give meaningful information about the logic involved, the significance and the consequences of processing their data in that way. This also applies to any businesses relying on the automated decision-making (such as customers of CRAs).
Additionally, you should consider mapping out what data you have from public registers stored in parallel databases within your company. You will need to ensure that your internal retention periods do not exceed those of the public register. If they do, you will be expected to record clear justification for this.
We can provide tailored advice and assistance relating to your business model and data processing activities to ensure compliance in this regard – please contact us.