The Prejudice of Algorithms

Statistical discrimination has replaced human prejudice in lending, Haas study finds

Face-to-face meetings between mortgage officers and homebuyers have been rapidly replaced by online applications, but lending discrimination hasn’t disappeared. It’s simply morphed from human bias to algorithmic bias.

That’s the conclusion of a groundbreaking new Berkeley Haas study, which found that both online and traditional lenders charge higher interest rates to African-American and Latino borrowers, earning 11% to 17% higher profits on such loans. All told, minority homebuyers pay up to half a billion dollars more in interest every year than white borrowers with comparable credit scores do, the researchers found.

The findings raise legal questions about the rise of statistical discrimination in the fintech era and point to potentially widespread violations of U.S. fair lending laws, the researchers say. While lending discrimination has historically been caused by human prejudice, pricing disparities are increasingly the result of algorithms that use machine learning to target applicants who might shop around less and hit them with higher-priced loans.

“Even if the people writing the algorithms intend to create a fair system, their programming is having a disparate impact on minority borrowers—in other words, discriminating under the law,” says study co-author Adair Morse, an associate professor of finance.

A key challenge in studying lending discrimination has been that the only large data source that includes race and ethnicity is the Home Mortgage Disclosure Act (HMDA), which covers 90% of residential mortgages but lacks information on loan structure and property type. Using machine learning techniques, the researchers merged HMDA data with three other large datasets—ATTOM, McDash, and Equifax—connecting, for the first time, details on interest rates, loan terms and performance, property location, and borrowers’ credit with race and ethnicity.
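The study itself doesn’t publish its matching code, but conceptually the merge resembles linking loan records that agree on fields all four datasets share. A minimal sketch in Python, assuming pandas DataFrames and purely illustrative column names (not the actual schemas):

```python
import pandas as pd

# Hypothetical loan-level tables; column names are illustrative, not the study's schema.
hmda = pd.read_csv("hmda.csv")        # race/ethnicity, lender, census tract, loan amount, year
attom = pd.read_csv("attom.csv")      # property location and type
mcdash = pd.read_csv("mcdash.csv")    # interest rate, loan terms, performance
equifax = pd.read_csv("equifax.csv")  # borrower credit information

# Link records that agree on fields the datasets share (a simplified stand-in
# for the study's machine-learning matching of near-duplicate loan records).
keys = ["census_tract", "origination_year", "loan_amount", "lender_id"]
merged = (
    hmda.merge(attom, on=keys, how="inner")
        .merge(mcdash, on=keys, how="inner")
        .merge(equifax, on=keys, how="inner")
)

print(merged[["race_ethnicity", "interest_rate", "credit_score"]].head())
```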

The researchers—including Haas professors Nancy Wallace and Richard Stanton and Prof. Robert Bartlett of Berkeley Law—focused on 30-year, fixed-rate, single-family residential loans issued from 2008 to 2015 and guaranteed by Fannie Mae and Freddie Mac.

This ensured that all the loans in the pool were backed by the U.S. government and priced under the same rigorous process put in place after the financial crisis, which is based solely on a grid of loan-to-value ratios and credit scores. Because the government guarantee protects private lenders from default, any additional variation in loan pricing reflects the lenders’ competitive decisions. This methodology allowed the researchers, for the first time, to isolate pricing differences that correlate with race and ethnicity rather than with credit risk.
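That pricing process is, in essence, a lookup table keyed by loan-to-value ratio and credit score: two borrowers who land in the same cell get the same risk-based price. A simplified, hypothetical illustration (the buckets and fees below are invented, not Fannie Mae’s or Freddie Mac’s actual schedule):

```python
# Hypothetical risk-based pricing grid: keys are (credit-score bucket, LTV bucket),
# values are fee adjustments in basis points. Numbers are illustrative only.
GRID = {
    ("740+",    "<=80%"): 0,
    ("740+",    ">80%"):  25,
    ("680-739", "<=80%"): 50,
    ("680-739", ">80%"):  100,
    ("<680",    "<=80%"): 150,
    ("<680",    ">80%"):  225,
}

def grid_fee(credit_score: int, ltv: float) -> int:
    """Return the grid-based fee (in bps) for a borrower's credit score and LTV."""
    score_bucket = "740+" if credit_score >= 740 else "680-739" if credit_score >= 680 else "<680"
    ltv_bucket = "<=80%" if ltv <= 0.80 else ">80%"
    return GRID[(score_bucket, ltv_bucket)]

# Two borrowers with identical credit risk get the same grid fee; any remaining
# difference in the rate they are charged reflects the lender's own pricing choices.
print(grid_fee(720, 0.75), grid_fee(720, 0.75))  # -> 50 50
```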

The analysis found significant discrimination by both face-to-face and algorithmic lenders:

  • Black and Latino borrowers pay interest rates 5.6 to 8.6 basis points higher on purchase loans than white and Asian borrowers do, and 3 basis points more on refinance loans.
  • These disparities cost borrowers $250 million to $500 million annually.
  • For lenders, this amounts to 11%–17% higher profits on purchase loans to minority borrowers, based on the industry-average 50-basis-point profit on loan issuance (a quick check of this arithmetic follows the list).
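The profit estimate is a straightforward ratio of the rate disparity to the industry baseline. A quick check of the arithmetic, assuming each extra basis point of rate translates one-for-one into issuance profit (a simplification):

```python
# Back-of-the-envelope check: extra basis points relative to the
# industry-average 50 bps of profit per loan issued.
baseline_profit_bps = 50
extra_rate_bps = (5.6, 8.6)  # disparity range on purchase loans

for extra in extra_rate_bps:
    print(f"{extra} bps extra = {extra / baseline_profit_bps:.0%} higher profit")
# -> 5.6 bps extra = 11% higher profit
# -> 8.6 bps extra = 17% higher profit
```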

Morse says the results are consistent with lenders using big data variables and machine learning to infer the extent of competition for customers and price loans accordingly. This pricing might be based on geography—such as targeting areas with fewer financial services—or on characteristics of applicants. If AI can figure out which applicants might do less comparison shopping and accept higher-priced offerings, the lender has created what Morse calls “algorithmic strategic pricing.”
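The study doesn’t disclose any lender’s actual model, but the mechanism Morse describes can be sketched with a toy example: a model estimates how likely each applicant is to comparison shop, and the quoted markup shrinks as that likelihood rises. Everything below is hypothetical.

```python
# Toy illustration of "algorithmic strategic pricing" (hypothetical, not any
# lender's actual model): quote a higher markup to applicants the model
# predicts are unlikely to shop around.
from dataclasses import dataclass

@dataclass
class Applicant:
    base_rate: float          # grid/credit-risk rate, in percent
    shop_probability: float   # model's estimate that the applicant compares offers (0-1)

def quoted_rate(a: Applicant, max_markup_bps: float = 10.0) -> float:
    """Add a markup that shrinks as the predicted shopping propensity rises."""
    markup_bps = max_markup_bps * (1.0 - a.shop_probability)
    return a.base_rate + markup_bps / 100.0

# Two applicants with identical credit risk but different predicted shopping behavior.
print(quoted_rate(Applicant(base_rate=4.00, shop_probability=0.9)))  # -> 4.01
print(quoted_rate(Applicant(base_rate=4.00, shop_probability=0.2)))  # -> 4.08
```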

“There are a number of reasons that ethnic minority groups may shop around less—it could be because they live in financial deserts with less access to a range of products and more monopoly pricing, or it could be that the financial system creates an unfriendly atmosphere for some borrowers,” says Morse, noting that her research on the topic is ongoing. “The lenders may not be specifically targeting minorities in their pricing schemes, but by profiling non-shopping applicants they end up targeting them.”

This is the type of price discrimination that U.S. fair lending laws are designed to prohibit, Bartlett notes. Several U.S. courts have held that loan pricing differences that vary by race or ethnicity can only be legally justified if they are based on borrowers’ creditworthiness. “The novelty of our empirical design is that we can rule out the possibility that these pricing differences are due to differences in credit risk among borrowers,” he says.

The data did reveal some good news: Lending discrimination overall has been on a steady decline, suggesting that the rise of new fintech platforms and simpler online application processes for traditional lenders has boosted competition and made it easier for people to comparison shop—which bodes well for underserved homebuyers.

The research also showed that fintech lenders did not discriminate in accepting or rejecting minority applicants. Traditional face-to-face lenders, however, were still 5% more likely to reject them.