Apple Card and Goldman Sachs Under Investigation Following Allegations of Gender Bias

Apple Card and Goldman Sachs are under investigation by the New York Department of Financial Services. The probe comes shortly after allegations that the Apple Card is biased against women.

The issue was first raised on Twitter by Ruby on Rails creator David Heinemeier Hansson. He and his wife file joint tax returns, and despite her having a better credit score than him, Apple Card gave him a credit limit 20 times higher than hers.



Apple was quick to address the problem as soon as it became a PR issue, giving Hansson’s wife a higher credit limit. But that didn’t stop the New York Department of Financial Services from launching a probe into Goldman Sachs’s credit card practices, as the bank is the credit provider behind the Apple Card.

“The department will be conducting an investigation to determine whether New York law was violated and ensure all consumers are treated equally regardless of sex. Any algorithm, that intentionally or not results in discriminatory treatment of women or any other protected class of people violates New York law,” said a spokesman for Linda Lacewell, the superintendent of the New York Department of Financial Services.

According to Bloomberg, a Goldman Sachs spokesperson said that it’s possible for two Apple Card family members to receive significantly different credit limits, but “in all cases, we have not and will not make decisions based on factors like gender.”

That’s obviously not what happened here, and Goldman Sachs is now facing an investigation into potential gender bias in its algorithms. As Hansson put it, “it’s not a gender-discrimination intent but it is a gender-discrimination outcome,” which is the main problem in this case.


Conversation (24 comments)

  • wright_is

    Premium Member
    11 November, 2019 - 4:24 am

    There was a big discussion about this over on the TWIT Community over the weekend.

  • jbinaz

    11 November, 2019 - 8:27 am

    "That’s obviously not what has happened here" is a big statement. Just because her credit score is higher doesn't mean her income is higher, and income will affect the credit limit, even in a community property state. I don't *think* I saw a mention of their relative incomes. If I did, I missed it. I'm too lazy to go back and look.

    I can have a $30K income and a great credit score, but I'm sure as heck not going to be approved for as much as someone making $100K.

    Is it possible it's sexist? Yes. Do any of us know enough specifics to say that's why in this case (and similar cases mentioned in the tweets)? No.

    And, while it's stupid that I have to say this, if it is being based on gender, all other things equal, it's wrong.

    Sadly, we'll never get an answer.

  • red.radar

    Premium Member
    11 November, 2019 - 9:02 am

    It has nothing to do with gender discrimination. It’s the fact that the Apple Card doesn’t support multiple users like every other financial product.

    So when the partner signs up, their creditworthiness is judged based on their income. People who stay home and support the family unit are more likely to earn less income.

    • jimchamplin

      Premium Member
      11 November, 2019 - 12:14 pm

      In reply to red.radar:

      Who said anything about her being someone who stays home? It’s antiquated ideas about women that create biased algorithms.

      • red.radar

        Premium Member
        11 November, 2019 - 7:11 pm

        In reply to jimchamplin:

        Never implied the victim was not employed. I wasn’t talking about the victim. I was referring to the limitations of the Apple Card and provided an example of a class of people it affects.

  • Patrick3D

    11 November, 2019 - 10:26 am

    Laws were changed in 2009 & 2013 to require credit card companies to consider a person's ability (over the age of 21) to pay via any income they have access to, rather than their individual income. Unless some people filed their application wrong, it may be that Goldman Sachs was basing decisions on information provided by Apple, which could have been individual-specific, failing to take into account joint income. There is obviously something wrong in the pipeline that led to this, but I don't expect it to have been intentional; simply something overlooked that needs to be corrected, and a fine against Goldman Sachs for missing it.

    • jimchamplin

      Premium Member
      11 November, 2019 - 12:17 pm

      In reply to Patrick3D:

      It’s possible that GS "*accidentally* didn’t get the memo."

      • wright_is

        Premium Member
        11 November, 2019 - 1:49 pm

        In reply to jimchamplin:

        Yes. Based on the discussion over on TWIT, I bought the Audible version of Weapons of Math Destruction.

      • jbinaz

        11 November, 2019 - 1:56 pm

        In reply to jimchamplin:

        Why would a company whose main objective is to make money deny a higher credit limit to someone based on gender? A higher credit limit would allow for more transactions, which means more fees, which is more income, which can (but doesn't always) lead to more profit.

    • jbinaz

      11 November, 2019 - 2:00 pm

      In reply to Patrick3D:

      When you say "filed their application wrong", are you saying that the individual would have to (and in this case didn't) disclose the other income (i.e. their spouse's) for it to be considered? And if they don't, then it shouldn't be considered? (Not arguing with you at all, just curious as to what the law says and how it works.)

  • Daekar

    11 November, 2019 - 10:57 am

    As long as men are a protected class of people too, I don't have a problem with this. The original situation makes no sense… If they're going to split things up like this, gender certainly shouldn't play a part.

    • jimchamplin

      Premium Member
      11 November, 2019 - 12:21 pm

      In reply to Daekar:

      Men are a protected class of people?

      Clarify please.

      • Daekar

        11 November, 2019 - 2:01 pm

        In reply to jimchamplin:

        The New York Department of Financial Services was quoted in the article saying "Any algorithm, that intentionally or not results in discriminatory treatment of women or any other protected class of people violates New York law." I'll be happy as long as everybody gets equal protection. If not… well, they're a right bunch of hypocrites.

  • Chris_Kez

    Premium Member
    11 November, 2019 - 11:48 am

    I think the broader issue here is lack of transparency. Apple or Goldman Sachs should be able to clearly explain exactly why his wife got a lower credit limit. An analyst there should have been able to pull up the application as soon as it became a PR issue, review the numbers, and clearly explain the apparent discrepancy. That would have been the end of the PR crisis at least. Is it possible they can't explain why because the algorithm is impenetrable? This is going to be a big problem in all kinds of areas.

    • jimchamplin

      Premium Member
      11 November, 2019 - 12:07 pm

      In reply to Chris_Kez:

      But what does that analyst say if the algorithm is actually designed to be biased?

    • wright_is

      Premium Member
      11 November, 2019 - 1:52 pm

      In reply to Chris_Kez:

      The problem is the GS algorithm is proprietary and a trade secret; even the analysts don't get to see how it works.

      • Chris_Kez

        Premium Member
        11 November, 2019 - 2:57 pm

        In reply to wright_is:

        And that is increasingly going to be a problem that people want fixed. As we run into (or uncover) more instances of bias and discrimination in algorithms all over the place, it will be harder for companies to stick with their default responses of a shrug or silence. At least I hope that is the case. I'm counting on you guys in Europe to get this rolling.

      • red.radar

        Premium Member
        11 November, 2019 - 7:29 pm

        In reply to wright_is:

        It is an interesting paradox. The algorithm is secret so the general public cannot alter their behavior to game it, which would sour its effectiveness. Also, one's algorithm is a competitive advantage. However, it's an obvious shadow where things can lurk.

  • jimchamplin

    Premium Member
    11 November, 2019 - 12:19 pm

    I was honestly really surprised that Apple went with Goldman Sachs. They don’t have a great image with people who make less than a million a year. But I guess when you live in a world where everyone makes millions a year your worldview can’t be trusted by everyone else.

  • rosyna

    11 November, 2019 - 1:01 pm

    FWIW, the man who complained is worth $30 million, sold a house for $1.75 million (he doesn’t live in the US but has a green card), and tax returns have *nothing* to do with creditworthiness. (Banks aren’t legally allowed access to tax returns.)

  • Thom77

    11 November, 2019 - 1:47 pm

    This is a non-story that is being blown up because it involved Goldman Sachs, which is heavily represented in the Trump administration. Don't worry though, New York, who has been trying to take Trump down for years now, is on the case … to protect the people, of course. And Apple is caught in the middle.

    The credit limits are done on an individualized basis, which literally explains the outcome, but let's not let facts stand in the way of a good "sexist" story that can get click$ all over the internet.

    I would also even go so far as to wonder if Apple's choice of Goldman Sachs was …. influenced … by the Trump administration.

    This is political. Make no mistake. If we were under the Obama administration, and Apple joined up with Citibank, which basically picked Obama's administration picks, then this same story would never be allowed to gain traction.

    The real story here is how Apple increased the credit limit of a rich millionaire's wife, while millions of everyday people just have to live with theirs because they aren't rich or powerful enough to get Apple's attention.

    • Greg Green

      16 November, 2019 - 12:20 pm

      In reply to Thom77:

      While I don’t doubt there’s too much politics and not enough ethics in every NYC prosecutor's office, the credit disparity is alarming. And indisputable.


© 2024 Thurrott LLC