Microsoft Will Not Provide Facial Recognition Software to California

Citing human rights concerns, Microsoft announced that it will not provide facial recognition software to the state of California.

The announcement was made by Microsoft president Brad Smith, speaking at a Stanford University conference on human-centered artificial intelligence. It was first reported by Reuters, which notes that Microsoft concluded that use of the software would lead to innocent women and minorities being disproportionately held for questioning, because the AI had been trained mostly on pictures of white men.


“Anytime [a police officer] pulled anyone over, they wanted to run a face scan” against a database of suspects, Smith said at the event. After thinking through the uneven impact, “we said this technology is not your answer.”

Smith also said that Microsoft rejected a deal in which it would have supplied facial recognition software for cameras installed all over an unnamed city (most likely in China, given that the nonprofit Freedom House had deemed that country “not free”). Smith said the use of this software would have “suppressed freedom of assembly” in that city.

Microsoft is, of course, interested in selling its technology. And Smith pointed to one example of its facial recognition software being used in an acceptable way: It is being deployed in an unnamed U.S.-based prison because “the environment would be limited and … it would improve safety inside the” prison.

For the past year, Microsoft has been pushing for ethics in AI, and it more recently revealed that, because of shortcomings in its facial recognition capabilities, it requires potential customers to be transparent about how they will use the technology. Microsoft would like this technology to be regulated.


Conversation (19 comments)

  • Daekar

    17 April, 2019 - 9:45 am

    I think the major tech companies are incapable of seeing the world through any lens except identity politics anymore; it's insane. Microsoft has made the right choice for the wrong reason. Widespread government facial recognition would be bad for ALL humans, gender or skin color be damned.

    I'm glad to see that they appreciate how abusable the technology is and would like to see it regulated.

  • Stooks

    17 April, 2019 - 9:52 am

    How long will that actually last? If CA is taking this step, then they should not use fingerprints or DNA either.

    • MachineGunJohn

      17 April, 2019 - 10:31 am

      In reply to Stooks:

      That's just pure stupidity. It's not like they weren't going to physically take a long, careful look at the person to double-check the match. If anything, this would have helped them avoid detaining people mistakenly. Brad and the other ultra libs need to be forced out for financial malfeasance. If the board doesn't get this crap in check, there will be a shareholder lawsuit coming.

      • skane2600

        17 April, 2019 - 1:09 pm

        In reply to MachineGunJohn:

        You're dreaming. Shareholders' rights are a legit issue, but the political leanings of the board wouldn't even make the top 10 list of their concerns. If they didn't sue because of the Windows 8 fiasco or the ill-fated acquisition of Nokia, they wouldn't sue over such a relatively minor business opportunity.

      • lvthunder

        Premium Member
        17 April, 2019 - 2:08 pm

        In reply to MachineGunJohn:

        I believe they did this because they knew the software would cause California to sue them for selling software that discriminates against women and minorities. That lawsuit wouldn't be pretty.

        • MachineGunJohn

          18 April, 2019 - 7:21 am

          In reply to lvthunder:

          This isn't just about California; that's just the tip of the iceberg. And notice that nowhere was there any transparency about what the alleged discrepancy in accuracy is. This issue was raised years ago. It doesn't take that long to add another couple hundred thousand diverse photos to your database and retrain your AI. I doubt it's off by any statistically significant amount at this point. If I was an MS employee working on this I be timing. Again, California wasn't going to detain anyone based on an AI match here, and this was clearly more likely to prevent a mistaken identity than cause one. This was an entirely political decision, not a technical one, and it has a clear negative impact on MS shareholders. Others will now avoid MS to avoid any false public misconstruing of their policies. Letting your political bias outweigh facts is intellectually dishonest. Doing it to the point where it negatively affects revenue is reason to boot him out.

  • lwetzel

    Premium Member
    17 April, 2019 - 10:02 am

    "Microsoft would like this technology to be regulated."

    And they are correct, and it needed to be done 10 years ago. However, it won't stop governments from creating their own software. Things are getting 1984ish!

    • chrisrut

      Premium Member
      18 April, 2019 - 9:53 pm

      In reply to lwetzel:

      I brought the positive interpretation of the term "cutting edge" into the vernacular with a marketing campaign for Epson back in the 1980s, but I've never forgotten the sting of its original meaning. Unintended consequences, baby, unintended consequences…

      Social policy ALWAYS lags technology. 'Twas so in the age of Morse, when electricity first emerged from Leyden jars, and it's true now as we seek ways to extend intelligence beyond the limits of our skulls. To set policy, the unintended (and largely unforeseeable) consequences of technology must first be felt.

  • glenn8878

    17 April, 2019 - 11:17 am

    Microsoft doesn't own the market for facial recognition software. Governments will just go elsewhere to buy it. It's awfully odd to admit that your own software is faulty enough to cause arrests of people because they are not white and male. Usually, pictures are matched against a database, so if a picture doesn't exist, there's no match. Governments are already on the hook for disproportionate arrests of minorities. Until the bridge is crossed, no one will know the impact, and no improvements can be made. Technology will still march ahead without Microsoft.

    • codymesh

      17 April, 2019 - 4:50 pm

      In reply to glenn8878:

      "Governments are already on the hook for disproportionate arrests of minorities."

      You have got to be shitting me.

  • igor engelen

    17 April, 2019 - 12:36 pm

    What bothers me most is that they seem to be very aware of the fact that they are training their AI with incomplete data. Why on earth would you do that?

    • lvthunder

      Premium Member
      17 April, 2019 - 2:05 pm

      In reply to Igor Engelen:

      Why would you do what? Tell the truth? Imagine if they had gone through with this deal and California determined it was discriminating against women and minorities. I would bet the liberal politicians in California would sue.

    • wright_is

      Premium Member
      18 April, 2019 - 4:03 am

      In reply to Igor Engelen:

      Because you need live people to test this on, and you need to get them to sign a waiver to have their faces used for recognition. You can slam public domain photos into the database, but you still need real people to test on. So you start with your development team and people on campus, and work outwards.

      If your team is largely white males, then the bias starts straight away. And then they pull in their mates for further tests…

      They should be setting out the test plan before they start and see to it that they get a wide spread of "minorities". In the sample database, there shouldn't be any "minorities"; there should be equal numbers of people from all ethnic roots.

      But, generally, people just take what is at hand, and if you are working and living in a white-dominated area, you will have a dominance of white candidates to test against. It is just easier to grab people in the corridor or off the street than it is to search for a balanced test group.

      If they were doing the research in China or Japan, you'd probably find that it has a good rate of success with oriental faces, but a poor one with African, Hispanic, Caucasian, etc. faces.

      I live in a small town in rural Germany: I'd guess 70% German, 20% East European, 9% Turkish, and 1% "other", and me, a white Englishman, I fall into the "other" category. Doing the testing here, it would be very difficult to get a good racial spread.

      Obviously, I have no idea about the actual make-up of the population around the research facility where MS developed this, so it is only a hypothesis. (A toy simulation of this sampling effect appears after the comment thread below.)

  • Pierre Masse

    17 April, 2019 - 12:52 pm

    <p>It means that they will provide it as soon as the software has improved.</p>

  • Mark from CO

    17 April, 2019 - 2:03 pm

    Paul:

    I agree with the other comments below. I would also add that this is an admission that this AI product is unfinished and doesn't work well. This leads to a couple of other obvious questions:

    1. Why is Microsoft selling the software to others, given that it contains the same significant flaw/bug?
    2. Are all Microsoft's AI products equally flawed?

  • AnOldAmigaUser

    Premium Member
    17 April, 2019 - 4:12 pm

    Anyone complaining about this being a PC move is likely to be someone who stands next to no chance of being affected by a false positive and put through the wringer. If you think there is no bias in law enforcement, drive I-95 down the east coast from NY to Miami, and pay attention to the cars that are pulled over.

    The real problem here is that another company will sell the technology, and it will be equally flawed. It is not called "artificial" intelligence for no reason. People tend to think that any information coming from a computer is accurate, despite the fact that much of it is entered or collected by people and manipulated by algorithms created by people; all of them at least as stupid as us.

  • chrisrut

    Premium Member
    18 April, 2019 - 10:01 pm

    I think MS's point is that they are selling a programmable/configurable/teachable system, which is therefore no better than the entity doing the programming, configuration, and teaching.

    The mantra bears repeating here: you will know when AI has "arrived" when computers start doing things that are really stupid.

  • jim_may

    19 April, 2019 - 12:29 am

    Microsoft won't give it to California, but they did give it to Commie China. Look it up!

    • waethorn

      22 April, 2019 - 11:53 am

      In reply to Jim_MAY:

      Either one is bad.

      Just FYI: the UK is as much a surveillance state as China now.
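
One way to see the sampling problem wright_is describes above is with a toy simulation. The sketch below is purely illustrative and assumes an invented relationship in which a group's share of the training data controls how noisy its match scores are; the group names, the 85/15 split, and the 0.8 match threshold are all made up for this example and say nothing about how Microsoft's actual system behaves.

    import random

    random.seed(42)

    # Assumed (invented) skew: group_a dominates the training data.
    TRAIN_SHARE = {"group_a": 0.85, "group_b": 0.15}

    def false_match_rate(train_share, trials=100_000):
        """Toy model: the less training data a group has, the noisier its
        match scores, so more innocent faces clear the match threshold."""
        spread = 0.5 * (1.0 - train_share)  # less data -> wider score spread
        threshold = 0.8                     # invented match cut-off
        hits = 0
        for _ in range(trials):
            score = random.gauss(0.6, spread)  # score for a NON-matching pair
            if score >= threshold:
                hits += 1                      # innocent person flagged
        return hits / trials

    for group, share in TRAIN_SHARE.items():
        print(f"{group}: false-match rate ~ {false_match_rate(share):.3%}")

Under these made-up numbers, the under-represented group's false-match rate comes out vastly higher than the dominant group's, which is the disparity Smith described: the same threshold that works for the well-sampled group flags far too many innocent people in the poorly sampled one.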
