Regulator probing Goldman over Apple Card: Gender bias must be rooted out of process

Companies that deploy biased algorithms, even unknowingly, are still responsible for potential discriminatory outcomes, the Wall Street regulator probing Goldman Sachs’ Apple Card told CNBC on Monday.

“Algorithms don’t get immunity from discrimination,” said Linda Lacewell, superintendent of New York’s Department of Financial Services, which is investigating claims that Goldman Sachs’ Apple Card discriminated against women when determining credit limits.

“Whether the intent is there or not, disparate impact is illegal,” Lacewell added on “Squawk Alley.”

The inquiry follows viral allegations from tech entrepreneur David Heinemeier Hansson, who said on Twitter that Apple Card gave him a credit limit 20 times higher than his wife’s, even though she has a higher credit score and the couple files joint tax returns.

Hansson called Apple Card, which Goldman built in partnership with the iPhone maker, an “[expletive] sexist program.”

Apple co-founder Steve Wozniak later claimed that Apple Card gave him 10 times the credit limit that his wife received.

In a statement released Sunday, Goldman said it does not consider gender in credit decisions and evaluates all applications independently. Goldman also said it is looking into ways for family members to share a single Apple Card account.

Lacewell said that her agency, which regulates banks in New York, has been in contact with representatives from Goldman and could sit down with them as soon as Tuesday.

When asked whether DFS was investigating both Goldman and Apple, Lacewell responded that it was looking into “the practice.”

“Goldman is the bank that stands behind the Apple Card,” she continued. “We actually license Goldman … We’ve asked the company to begin explaining what the algorithm is.”

In New York, discrimination in financial services based on gender or any other protected class is illegal, Lacewell noted.

“It is the person who uses the algorithm, the company that uses the algorithm, that is responsible to make sure it is not being used with discriminatory impact against protected classes,” Lacewell said, arguing that developers and sellers of algorithms need to do “appropriate testing” to ensure there is no bias.
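
To make the idea of “appropriate testing” concrete, one common fair-lending screen is a disparate impact ratio: compare approval rates across groups and flag ratios below the “four-fifths” threshold. The sketch below is purely illustrative; the data, the 0.8 threshold, and the function names are assumptions for the example, not anything DFS or Goldman has described.

```python
# Minimal sketch of a disparate impact check on hypothetical credit decisions.
# The data, the 0.8 "four-fifths" threshold, and all names here are
# illustrative assumptions, not anything DFS or Goldman has described.

def approval_rate(decisions):
    """Fraction of applications approved in a list of booleans."""
    return sum(decisions) / len(decisions)

def disparate_impact_ratio(protected_group, reference_group):
    """Ratio of the protected group's approval rate to the reference group's.

    Under the commonly cited four-fifths rule, a ratio below 0.8 is often
    treated as preliminary evidence of adverse (disparate) impact.
    """
    return approval_rate(protected_group) / approval_rate(reference_group)

# Hypothetical outcomes: True = approved, False = denied.
women = [True, False, False, True, False, False, True, False]
men = [True, True, False, True, True, False, True, True]

ratio = disparate_impact_ratio(women, men)
print(f"Approval rate (women): {approval_rate(women):.2f}")  # 0.38
print(f"Approval rate (men):   {approval_rate(men):.2f}")    # 0.75
print(f"Disparate impact ratio: {ratio:.2f}")                # 0.50
if ratio < 0.8:
    print("Below the four-fifths threshold: flag the model for review.")
```

Note that a check like this examines outcomes, not intent, which is why Lacewell’s point that “disparate impact is illegal” applies even when gender is never an explicit input.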

“There is no such thing as, ‘the company didn’t do it, the algorithm did,'” she said.

Consumers across all industries deserve transparency into the “black box” of algorithms, Lacewell said. In this context, algorithms are sets of rules programmed into computers to produce desired outcomes or actions.

“Consumers are entitled to know how these decisions are being made that affect their daily lives,” she said. “Your credit rating agency sends you a letter and says why you’ve been denied credit. This should really be no different.”
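
Lacewell’s comparison to adverse-action letters suggests what algorithmic transparency could look like in practice: a decision rule that records the factors behind each outcome so they can be reported back to the applicant. The sketch below is hypothetical; the factors, thresholds, and base limit are invented for illustration and are not Apple Card’s actual criteria.

```python
# Illustrative sketch of an explainable credit-limit decision, loosely modeled
# on the "reason codes" in adverse-action notices. Every factor, threshold,
# and dollar figure is invented for the example.

def decide_credit_limit(income, credit_score, utilization):
    """Return a (limit, reasons) pair so the decision can be explained."""
    reasons = []
    limit = 10_000  # hypothetical base limit

    if credit_score < 650:
        limit //= 2
        reasons.append("credit score below 650")
    if utilization > 0.5:
        limit //= 2
        reasons.append("credit utilization above 50%")
    if income < 40_000:
        limit //= 2
        reasons.append("income below $40,000")

    return limit, reasons

limit, reasons = decide_credit_limit(income=55_000, credit_score=620, utilization=0.6)
print(f"Limit: ${limit:,}")                       # Limit: $2,500
print("Factors:", "; ".join(reasons) or "none")   # the explanation to disclose
```

A rule set structured this way can answer the question Lacewell raises, since each reduction in the limit is tied to a factor that can be disclosed, much like the reasons listed when credit is denied.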

Apple Card became available to all U.S. consumers in August, following a limited preview earlier in the month.

On a conference call in October, Goldman CEO David Solomon said the bank’s rollout of the Apple Card had been met with strong demand.

“From an operational and risk perspective, we’ve handled the inflows smoothly and without compromising our credit underwriting standards,” Solomon said, adding that Goldman believed it was “the most successful credit card launch ever.”

The question of bias in algorithms is not limited to the allegations against Goldman, Lacewell noted, pointing to a recent study that found a medical algorithm favored white patients over black patients who were sicker.

Because of that study, Lacewell said her agency also sent an inquiry to UnitedHealth Group, whose subsidiary, Optum, sold a tool with that algorithm.

“These types of issues and allegations, if these are true, it’s very corrosive,” she said.
