
Apple Card’s Fair Lending Fiasco

Nov 13, 2019

While those tracking the mortgage industry have been digging through the new, expanded Home Mortgage Disclosure Act (HMDA) data, other types of Fair Lending data have been making news.

This time, instead of race or geographic location, it's an issue of gender.

The New York State Department of Financial Services is investigating whether the Apple Card discriminates against female applicants, offering them substantially less credit than male applicants.

On Saturday, tech entrepreneur David Heinemeier Hansson published a series of tweets (NSFW language) complaining about the amount of credit offered by Apple Card.

He wrote that even though he and his wife live in a community-property state and filed joint income tax returns, Apple Card offered his wife a card with just 1/20th the amount of credit that the company offered him—even though her credit score was higher than his.

Hansson also said there was no way to appeal the decision, and that across conversations with six Apple representatives, no explanation was offered beyond blaming the algorithm. (The Apple Card is a joint effort between Apple and Goldman Sachs, which issues the card.)

Other Twitter users joined in with their own complaints. For example, one man reported that his wife made more money than he did, had a higher credit score, and was offered just one-third the credit he was.

Even Steve Wozniak, Apple co-founder and current employee, and his wife encountered the problem. Wozniak wrote: “The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It's big tech in 2019.”

Intent Doesn’t Matter When It Comes to Discriminatory Lending

Did Goldman Sachs set out to discriminate against women when Apple Card was introduced with great fanfare this spring? The answer is almost certainly no.

And yet, it appears that’s what ended up happening, and regulators aren’t having it.  

“The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex,” an NYDFS spokesperson told Bloomberg News. “Any algorithm that intentionally or not results in discriminatory treatment of women or any other protected class of people violates New York law.”

Bias is bias, whether it comes from a live human or an algorithm. And in a prime example of third-party risk, Apple is taking the brunt of this public-relations headache even though Goldman Sachs was responsible for the lending decisions.

Oops, It Happened Again – Fair Lending Edition

This isn’t the first time an algorithm has unintentionally made biased decisions. Earlier this month, NYDFS began investigating a biased algorithm sold by a UnitedHealth Group subsidiary that resulted in white patients receiving better healthcare than black patients. Criminal sentencing algorithms have been shown to recommend harsher sentences for black defendants than for white defendants.

In 2017, Amazon abandoned an experimental hiring tool that was biased against women. Using artificial intelligence, the tool learned to look for terms common in the resumes of previously successful applicants. The problem: because the tech industry is dominated by men, so was the pool of resumes it learned from. As a Reuters report noted, “In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word ‘women’s,’ as in ‘women’s chess club captain.’ And it downgraded graduates of two all-women’s colleges.”

The list goes on.

As the financial services industry continues to integrate artificial intelligence and machine learning into credit decisions, it’s important to remember that even algorithms discriminate. Lending decisions still require some degree of human oversight, even if it’s just testing the results to ensure there are no patterns of discriminatory lending.
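For illustration, here is a minimal sketch of the kind of outcome test such oversight might include: comparing approval rates and credit limits across a protected class, screened against the common four-fifths heuristic. The column names and sample data are hypothetical, and the four-fifths rule is a screening convention, not a legal standard.

```python
# A minimal sketch of an outcome-disparity check on lending decisions.
# All column names and the sample data below are hypothetical; the
# four-fifths (80%) rule is a common screening heuristic, not a legal test.
import pandas as pd

# Hypothetical decision log: one row per applicant.
decisions = pd.DataFrame({
    "gender":       ["F", "F", "F", "F", "M", "M", "M", "M"],
    "approved":     [1,    0,   1,   0,   1,   1,   1,   0],
    "credit_limit": [2000, 0, 2500, 0, 9000, 7000, 12000, 0],
})

# Approval rate for each group.
approval_rates = decisions.groupby("gender")["approved"].mean()
print(f"Approval rates:\n{approval_rates}\n")

# Adverse impact ratio: least-favored group's rate / most-favored group's rate.
air = approval_rates.min() / approval_rates.max()
print(f"Adverse impact ratio: {air:.2f}")

# Four-fifths rule screen: ratios below 0.80 warrant further review.
if air < 0.80:
    print("Potential disparate impact -- escalate for fair lending review.")

# Median credit limit among approved applicants, by group.
limits = (decisions[decisions["approved"] == 1]
          .groupby("gender")["credit_limit"].median())
print(f"\nMedian approved credit limit:\n{limits}")
```

A real fair lending program would run tests like this across every protected class, on production decision data, and pair the statistics with human review of any flagged disparity.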

Speaking of Fair Lending Data…

We have released a HMDA Benchmark Report, built using the 2018 public HMDA LAR data from the Consumer Financial Protection Bureau (CFPB). Generated using TRUPOINT Analytics, it reflects national-level benchmarks.
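For readers who want to explore the underlying data themselves, here is a hedged sketch of computing one such national-level benchmark, the denial rate by derived sex, from the CFPB's public 2018 LAR file. The file path is a placeholder, and the column names ("action_taken", "derived_sex") and action codes should be verified against the CFPB's HMDA data dictionary before use.

```python
# A sketch of one national-level HMDA benchmark: denial rate by derived sex,
# computed from the CFPB's public 2018 LAR file. The path is a placeholder;
# verify column names and action_taken codes against the CFPB data dictionary.
import pandas as pd

lar = pd.read_csv(
    "2018_public_lar.csv",  # placeholder path to the public LAR extract
    usecols=["action_taken", "derived_sex"],
    dtype={"action_taken": "Int64", "derived_sex": "string"},
)

# Keep records where a credit decision was made: originated (1),
# approved but not accepted (2), and denied (3) applications.
decided = lar[lar["action_taken"].isin([1, 2, 3])]

# National denial rate by derived sex.
denial_rate = (decided.assign(denied=decided["action_taken"].eq(3))
                      .groupby("derived_sex")["denied"].mean())
print(denial_rate.sort_values(ascending=False))
```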

Download your copy here.


Check out our tips on building a Lending Compliance Management System.
