Apple Card never intended to offend
Apple Card never intended to offend. It just wanted to provide enhanced security features and a minimalist design. In 2019, however, it set off a Twitter uproar and an investigation by the New York Department of Financial Services, all because some applicants received inexplicably low credit limits.

Jamie Heinemeier Hansson wanted the added privacy and security of the Apple Card. The application process was fast, completed entirely from an iPhone, and the result was immediate: Jamie was granted a credit limit twenty times lower than her husband's. Her husband took to Twitter: "The @AppleCard is such a … sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple's black box algorithm thinks I deserve 20x the credit limit" (1). Steve Wozniak, the co-founder of Apple, shared that he had similarly received a limit ten times higher than his wife's (2). Others reported comparable experiences. The online community turned to Apple for an explanation. Apple deferred inquiries to Goldman Sachs, which managed the financial side of the card, and Goldman Sachs blamed its algorithm (2). The New York Department of Financial Services then launched an official investigation into Goldman Sachs' credit card practices (3).
Apple Card never intended to be sexist. But neither was it intentional about preventing sexism. The credit industry has historically been biased in favor of men (4), and the Apple Card amplified that bias through a powerful algorithm. The algorithm itself was a black box that neither Apple nor Goldman Sachs fully understood. Employees deferred to it as a God-like entity that could not possibly be biased, despite the abundant evidence in front of them. This reverent approach to algorithmic decision making is deeply misguided. Algorithms are tools, and the companies that create and use a tool are responsible for its consequences. Yet both Apple and Goldman Sachs avoided legal repercussions: the NYDFS found no evidence of bias (5) but, citing trade secrets, did not release specifics. So we are left to wonder: if the Apple Card wasn't sexist, how do we explain the dozens of women who appear to have been discriminated against? And in an era of black box algorithms, what recourse will we have the next time an algorithm discriminates?
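To make concrete how a "gender-blind" tool can still produce gender-skewed outcomes, here is a minimal, purely illustrative Python sketch on synthetic data. Nothing in it reflects Apple's or Goldman Sachs' actual model; the feature names, numbers, and limit formula are invented for illustration. The rule never sees gender, yet the limits it assigns differ by gender because one of its inputs is historically correlated with gender.

```python
# Illustrative sketch only: synthetic data and an invented limit formula,
# not Apple's or Goldman Sachs' model.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Synthetic applicants: gender is recorded but never given to the model.
is_woman = rng.random(n) < 0.5

# Assumption for illustration: individually reported income is lower for
# women on average, standing in for any gender-correlated input.
income = np.where(is_woman,
                  rng.normal(55_000, 12_000, n),
                  rng.normal(75_000, 12_000, n))
credit_score = rng.normal(720, 40, n)  # drawn independently of gender here

# A "gender-blind" limit rule: only income and credit score go in.
limit = 0.2 * income + 50 * (credit_score - 600)

print(f"median limit, women: ${np.median(limit[is_woman]):,.0f}")
print(f"median limit, men:   ${np.median(limit[~is_woman]):,.0f}")
# The gap appears even though gender never enters the formula:
# the historical bias rides in on the correlated feature.
```

Running this prints a clearly lower median limit for the women in the synthetic pool, which is why "the algorithm does not use gender" does not, by itself, settle the question of discrimination.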
References
1. Hansson, David Heinemeier. Twitter. [Online] November 8, 2019. https://www.apple.com/apple-card/.
2. Hamilton, Isobel Asher. Apple cofounder Steve Wozniak says Apple Card offered his wife a lower credit limit. Business Insider. [Online] November 11, 2019. https://www.businessinsider.com/apple-card-sexism-steve-wozniak-2019-11.
3. Newburger, Emma. Wall Street regulator probes Goldman over allegations of sexist credit decisions at Apple Card. CNBC. [Online] November 10, 2019. https://www.cnbc.com/2019/11/10/wall-street-regulator-probes-goldman-over-allegations-of-sexist-credit-decisions-at-apple-card.html.
4. Eveleth, Rose. Forty Years Ago, Women Had a Hard Time Getting Credit Cards. Smithsonian Magazine. [Online] January 8, 2014. https://www.smithsonianmag.com/smart-news/forty-years-ago-women-had-a-hard-time-getting-credit-cards-180949289/.
5. Ennis, Dan. Goldman cleared of bias claims in NYDFS's Apple Card probe. Banking Dive. [Online] March 24, 2021. https://www.bankingdive.com/news/goldman-sachs-gender-bias-claims-apple-card-women-new-york-dfs/597273/.
6. Li, Geng. Gender-Related Differences in Credit Use and Credit Scores. Federal Reserve. [Online] June 22, 2018. https://www.federalreserve.gov/econres/notes/feds-notes/gender-related-differences-in-credit-use-and-credit-scores-20180622.htm.