10 May 2019 

Fairness in consumer credit markets

Whether you are treated fairly when applying for credit increasingly depends on data that shows what you do, rather than who you are, writes Amanda Cooper

As a borrower, you are only as good as your credit score. But what makes up a credit score and what can influence it is often an opaque matter. With the proliferation of machine learning and big data, algorithms are playing a larger role in decisions by financial institutions to grant a mortgage or loan. In theory, this should increase transparency and remove biases associated with things such as gender, race, or even postal code. But does it?

This was the subject of a discussion among experts at a Think Forward Initiative event in Sweden on Thursday. Stefania Albanesi, Professor of Economics at the University of Pittsburgh, opened the discussion by noting that, on the surface, certain characteristics such as gender and religion shouldn't affect access to credit. However, simply excluding these factors is no guarantee that they won't ultimately affect access to credit, because the data behind the models may implicitly encode them. Models therefore need to be tested to ensure this is not happening.
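As a rough illustration of the kind of test Albanesi has in mind, the sketch below (Python, synthetic data, not drawn from any panellist's actual model) trains a toy credit model that never sees a protected attribute, then checks whether the model's scores still carry information about that attribute. The feature names, numbers and thresholds are all invented for the example.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)
    n = 10_000

    # Protected attribute the lender deliberately excludes from its model.
    group = rng.integers(0, 2, size=n)

    # "Neutral" features that nevertheless correlate with the group,
    # e.g. income level or credit-card utilisation.
    income = rng.normal(40_000 + 8_000 * group, 10_000, size=n)
    utilisation = np.clip(rng.uniform(0, 1, size=n) - 0.1 * group, 0, 1)

    # Synthetic default outcome: higher income means lower default risk.
    default = rng.binomial(1, 1 / (1 + np.exp((income - 40_000) / 10_000)))

    X = np.column_stack([income / 10_000, utilisation])
    X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
        X, default, group, test_size=0.3, random_state=0)

    # 1. Credit model trained without the protected attribute.
    scores = LogisticRegression().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

    # 2. Leakage test: can group membership be read off the credit score?
    #    An AUC well above 0.5 means the score implicitly encodes the group.
    print(f"score vs protected group, AUC: {roc_auc_score(g_te, scores):.2f}")

    # 3. Outcome test: compare average predicted default risk by group.
    for g in (0, 1):
        print(f"group {g}: mean predicted risk {scores[g_te == g].mean():.3f}")

If the leakage AUC sits well above 0.5, the excluded characteristic is finding its way back in through correlated features, which is precisely the situation such tests are meant to flag.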

This is especially important as evidence suggests several groups face difficulty accessing credit. The young, those on low incomes and minority groups seem particularly disadvantaged. The panel generally agreed that a more data-intensive approach to credit scoring can play a role in reducing the disadvantage that these groups face.

But Chuck Robida from consumer credit reporting company Experian said consumers won't be fully protected from a biased system, as there can be inherent bias in the data and models behind those algorithms. His organisation looks at what he terms “credit behaviour” to determine credit performance, which requires data that goes beyond merely a name, since a name alone could reveal a borrower’s likely ethnicity or location, for example.

How you spend and save

Jan Dodion, head of risk at ING, agrees. In a traditional credit report, a consumer’s socio-demographic information is usually the most relevant, followed by information from a credit bureau, which typically reflects only negative events, such as a default, rather than positive ones, such as regularity and reliability in paying household bills. “The moment we start incorporating transactional data … the way in which you use your daily banking products, not credit, but your savings account, your current account and sometimes your credit cards, it starts ruling out the socio-demographic drivers,” he told the audience. A financial institution’s models can then base decisions on what you do with your money, rather than who you are.
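To make the idea concrete, here is a purely illustrative sketch of the sort of current-account behaviour a transactions-based score could summarise instead of socio-demographic fields. It assumes a hypothetical per-customer transaction table with columns customer_id, date, amount and category; it is not ING’s model, and the feature names are invented.

    import pandas as pd

    def behavioural_features(transactions: pd.DataFrame) -> pd.DataFrame:
        """Summarise current-account behaviour per customer (illustrative)."""
        tx = transactions.copy()
        tx["month"] = tx["date"].dt.to_period("M")

        # Net flow on the account per customer per month.
        monthly = tx.groupby(["customer_id", "month"])["amount"].sum()

        features = pd.DataFrame({
            # Does money regularly come in (salary-like inflows)?
            "avg_monthly_net_flow": monthly.groupby("customer_id").mean(),
            # How volatile is the account from month to month?
            "net_flow_volatility": monthly.groupby("customer_id").std(),
            # Share of months in which household bills were paid.
            "bill_payment_rate": (
                tx[tx["category"] == "household_bill"]
                .groupby("customer_id")["month"].nunique()
                / tx.groupby("customer_id")["month"].nunique()
            ),
        })
        return features.fillna(0.0)

Features like these describe behaviour rather than identity, which is the shift Dodion is pointing to.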

But is it fair?

Brian Bucks, from the Consumer Financial Protection Bureau in the United States, says deciding what criteria to use when establishing fairness in lending is an important choice. One of the challenges a regulator faces is the availability of data, even so-called alternative, behavioural data. If a particular type of data is not universally available, but rather available only to a certain section of the population, then fairness becomes even harder to ensure. In lending terms, many consumers are “invisible”, meaning that they have no credit record, which in turn makes them more likely to be turned down for a loan or excluded altogether. Bucks said some of the bureau’s research has shown that access to high-speed internet has a far tighter link to a borrower’s “credit visibility” than their physical proximity to a bank. So if a lender’s algorithm were to incorporate digital, or behavioural, data into traditional scoring models to determine a borrower’s creditworthiness, anyone without a mobile phone could potentially find themselves rejected.

Digital footprints

Bucks sees cases of lenders using data such as education or employment history as part of their models. By looking at an organisation’s credit applications and associated outcomes, Bucks can see how alternative data is working “in the real world”. Tarun Ramadorai, from Imperial College Business School and the Centre for Economic Policy Research, a think tank, has similar concerns about how inclusive this newer data is. A consumer is more likely to want to hand over their “digital footprint” to their lender if that footprint is good, which in turn creates what Ramadorai calls a “massive censoring issue”.
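A toy simulation, using made-up numbers, shows why this censoring matters: if consumers are more likely to share a digital footprint when it flatters them, the sample a lender sees is not the population it will lend to.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000

    footprint_quality = rng.normal(0, 1, size=n)        # latent footprint quality
    repay_prob = 1 / (1 + np.exp(-footprint_quality))   # better footprint, better repayment
    repaid = rng.binomial(1, repay_prob)

    # Consumers opt in to share data mostly when the footprint looks good.
    opted_in = rng.random(n) < 1 / (1 + np.exp(-3 * footprint_quality))

    print(f"repayment rate, opt-in sample: {repaid[opted_in].mean():.2%}")
    print(f"repayment rate, everyone else: {repaid[~opted_in].mean():.2%}")

The gap between the two repayment rates is the selection bias that a model trained only on volunteered footprints would inherit.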

No surprises

ING’s Dodion says the application of new data sources, modelling methods and technologies is crucial to developing better risk models, which holds clear potential advantages for consumers. He explained that a greater focus on transactions-based models can result in less bias and broader access to credit. The more a bank automates, the more efficient it becomes, which in turn allows it to offer more affordable terms to a wider range of customers, to price its products correctly and to remove the scope for nasty surprises, such as a default. “Surprises are never good,” he says, “not for a bank, not for a consumer and not for society.”

The discussion was part of the Think Forward Initiative event “Fairness In Consumer Credit Markets” held on 9 May in Lund, Sweden. This was organised by Stefania Albanesi from the University of Pittsburgh and Tarun Ramadorai from Imperial College Business School.

The Think Forward Initiative seeks to bring together global experts to find out how and why people make financial decisions. Stefan van Woelderen, research lead of TFI, gave the audience an update on its latest achievements and expectations for the coming months. The Think Forward Initiative's goal is to understand how people make choices as individuals in their social and macro context, and to help them make better ones. Its main partners are ING, the Centre for Economic Policy Research (CEPR), Deloitte, Dell EMC, Dimension Data and Amazon Web Services. The wider TFI network consists of more than 200 organisations (including universities, NGOs and companies) and more than 1,500 individuals, of whom almost half are researchers.

Content Disclaimer
This publication has been prepared by ING solely for information purposes, irrespective of a particular user's means, financial situation or investment objectives. The information does not constitute an investment recommendation, nor is it investment, legal or tax advice or an offer or solicitation to purchase or sell any financial instrument.