This New Windows App Lets Anyone Make Deepfake Nude Images of Women

Update: The developers have now pulled the app and shut it down, admitting that too many people would misuse it. Original story follows.

Deepfakes have been a major talking point over the last year or so. For those unfamiliar, deepfakes are fake images or videos created with artificial intelligence, most notoriously fake nude imagery of women made by mapping one person's likeness onto another's body. These neural networks let anyone fabricate a convincing image within minutes. And yes, they are being dangerously abused.


That abuse hasn't stopped men from taking advantage of the algorithms. An anonymous developer has created DeepNude, a new app for Windows and Linux that lets anyone create a fake nude image of a woman within seconds. The app has a free version, and if you pay $99 for the premium version, you get these fake nudes without any watermarks, reports Motherboard.

The app only works with pictures of women for now, though the developer is apparently planning a version that works with men, too. The app does stamp a “fake” label on every image to let people know the nude isn't real, but that label is very easy to crop out. The original deepfakes introduced the world to fake nude images and videos but required a lot of expertise to produce; this new app makes the process as easy as using a simple photo editor. It's scary.

What's infuriating about this is that any creep out there can take your photos, make a fake nude image of you, and then use it to blackmail you. A careful look usually reveals that these deepfakes are not real nude images, but they could still fool a lot of people, and the algorithms are only going to get better. Worse, there isn't much legislation around deepfakes, and creepy men online are already abusing these algorithms and apps to the best of their abilities.


Conversation (46 comments)

  • MadsM

    27 June, 2019 - 7:26 am

    I wonder if similar pearl-clutching has been written about Photoshop.

    • jbinaz

      27 June, 2019 - 7:42 am

      <blockquote><em><a href="#438022">In reply to MadsM:</a></em></blockquote><p>Pearl-clutching? Over degrading, humiliating, and embarrassing women? Seems like a pretty legit concern to me. </p><p><br></p><p>And there's a huge barrier: photoshop takes time and effort, which many people won't expend. This seems pretty easy, give it a photo and it's done.</p>

      • MadsM

        27 June, 2019 - 9:13 am

        <blockquote><em><a href="#438029">In reply to jbinaz:</a></em></blockquote><p>It could just as easily affect men. I don't understand the need to paint women as eternal victims.</p>

        • garumphul

          Premium Member
          27 June, 2019 - 9:21 am

          <blockquote><em><a href="#438044">In reply to MadsM:</a></em></blockquote><p>I agree in principle, but let's be absolutely honest – the <em>vast</em> majority of the time it's men victimizing women with fake nudes. </p>

        • spullum

          Premium Member
          27 June, 2019 - 9:55 am

          <blockquote><em><a href="#438044">In reply to MadsM:</a></em></blockquote><p>Then you are extremely tone deaf at best. How many men are going to be realistically impacted by this? This is a script-kiddie-like tool for anyone who gets mad at a woman or girl in middle school-adulthood. An ex, rival, co-worker, someone who is pissed off, whatever can now find a normal photo and made a nude of it and convince others it’s real to shame someone else.</p>

          • lvthunder

            Premium Member
            27 June, 2019 - 10:40 am

            <blockquote><em><a href="#438052">In reply to spullum:</a></em></blockquote><p>Society will just evolve to where it's not shameful to be naked.</p>

            • AnOldAmigaUser

              Premium Member
              27 June, 2019 - 11:05 am

              <blockquote><em><a href="#438070">In reply to lvthunder:</a></em></blockquote><p>The way things are going here, it is more likely to devolve into a Handmaid's Tale scenario. There is a big difference between faking nude photos of someone, and people feeling unashamed of nudity.</p><p>The only reason for this sort of application is to create images meant to shame an individual or to pander to the stunted imaginations of those unlikely to have a meaningful relation with another human.</p>

              • lvthunder

                Premium Member
                27 June, 2019 - 11:18 am

                <blockquote><em><a href="#438080">In reply to AnOldAmigaUser:</a></em></blockquote><p>I've never seen a Handmaid's Tale so I don't get the reference. But you are right the app is meant to shame a person. I think it's going to be abused that after a while people will get used to it and the shame will go away.</p>

                • AnOldAmigaUser

                  Premium Member
                  27 June, 2019 - 9:30 pm

                  <blockquote><em><a href="#438086">In reply to lvthunder:</a></em></blockquote><p>If you are interested, here is the <a href="https://en.wikipedia.org/wiki/The_Handmaid%27s_Tale&quot; target="_blank">Wikipedia </a>entry for it.</p><p>tl;dr, It is a dystopian novel about a theonomy that takes over the US government. Women have no rights under this government.</p>

            • phuor

              27 June, 2019 - 11:32 am

              <blockquote><em><a href="#438070">In reply to lvthunder:</a></em></blockquote><p>There will still be a problem if the fake nudes look better than the real body.</p>

            • mmcewan

              29 June, 2019 - 12:20 pm

              <blockquote><em><a href="#438070">In reply to lvthunder:</a></em></blockquote><blockquote><em>Yes, I'm surprised that our so-called modern society is still so prudish about nudity. I think most people are already aware that nudes can be faked. Perhaps it's the result of all this 'You are so special' crap that is espoused everywhere. In reality, nobody's body is all that special. If you're a homosapien, well, that comes in two models. </em></blockquote><blockquote><br></blockquote><blockquote><em>The sole power of a 'nude photo scandal' is in the reaction of the offended party. Exhibit no sense of shame and make yourself bulletproof against this sort of petty attack. You can photoshop anyone's face onto any sort of body, plant, statue, robot, so what?</em></blockquote><blockquote><em>Perhaps we will all have to copyright our own nudes and sue for loss of perceived value when fakes are distributed.</em></blockquote><p><br></p>

              • skane2600

                30 June, 2019 - 1:22 am

                <blockquote><em><a href="#438511">In reply to mmcewan:</a></em></blockquote><p>It's amazing how some people can twist a discussion on any topic into a complaint about the idea of valuing every individual or any other humanistic belief.</p>

    • jimchamplin

      Premium Member
      27 June, 2019 - 8:29 am

      <blockquote><em><a href="#438022">In reply to MadsM:</a></em></blockquote><p> Wow. Just wow!</p>

  • Thom77

    27 June, 2019 - 8:09 am

    I'm just here for the Streisand Effect.

  • jimchamplin

    Premium Member
    27 June, 2019 - 8:31 am

    The scope of the “deepfake” problem goes deeper. Think literally “putting words into someone’s mouth” level chicanery.

    • wright_is

      Premium Member
      27 June, 2019 - 8:47 am

      <blockquote><em><a href="#438037">In reply to jimchamplin:</a></em></blockquote><p>Yes, deepfake is more than just naked women and has gone on to audio and video in recent months.</p>

    • christian.hvid

      27 June, 2019 - 9:35 am

      <blockquote><em><a href="#438037">In reply to jimchamplin:</a></em></blockquote><p>The flip side of deepfakes is that they provide plausible deniability for those who need it – if someone publishes a video of you doing or saying something inappropriate or illegal, you can always claim it's a deepfake. I can imagine a certain president howling "DEEPFAKE NEWS!" whenever confronted with actual footage of his behavior.</p>

      • lvthunder

        Premium Member
        27 June, 2019 - 10:36 am

        <blockquote><em><a href="#438048">In reply to christian.hvid:</a></em></blockquote><p>I can also see the people who don't like that president (or any politician) creating a ton of deep fakes of him saying all kinds of vile stuff.</p>

        • christian.hvid

          27 June, 2019 - 10:50 am

          <blockquote><em><a href="#438066">In reply to lvthunder:</a></em></blockquote><p>Agreed. Deepfakes will be a major political weapon going forward, and it's going to be used by all sides. My point is that they work both ways: turning lies into truth, but also turning truth into lies. We're increasingly getting trapped in a twilight zone where nothing is true and nothing is false.</p>

    • garumphul

      Premium Member
      27 June, 2019 - 11:24 am

      <blockquote><em><a href="#438037">In reply to jimchamplin:</a></em></blockquote><p>Inevitably we will end up where a fake video is claimed to be real, but detailed analysis will show that it's a fake. </p><p>The bigger problem (perhaps) is when a real video is claimed to be fake, and the lack of any evidence that it's a fake will be brushed off as the video being a <em>really good</em> fake. </p><p>The speed at which the technology is progressing is both amazing and terrifying, but I suspect the damage will outweigh any benefit by a very, very large factor.</p>

  • Pbike908

    27 June, 2019 - 8:40 am

    Pretty wild… Not sure if there is anything that can really be done about this.

    Imagine the implications when it becomes technologically feasible to create deepfake videos of innocent folks committing crimes. Especially given all the internet-connected cameras out there…

    • lvthunder

      Premium Member
      27 June, 2019 - 10:38 am

      <blockquote><em><a href="#438038">In reply to Pbike908:</a></em></blockquote><p>They can already do it with video.</p>

    • irfaanwahid

      28 June, 2019 - 1:26 am

      <blockquote><em><a href="#438038">In reply to Pbike908:</a></em></blockquote><p>Like Lincoln from Prison Break!</p>

  • skramer49

    Premium Member
    27 June, 2019 - 8:40 am

    Just curious. Why are you giving it free publicity? Seems a creepy thing to do.

    • spullum

      Premium Member
      27 June, 2019 - 9:52 am

      <blockquote><em><a href="#438039">In reply to skramer49:</a></em></blockquote><p>Agreed. Why is this summary of another article on thurrott.com exactly? The tech is interesting I guess but it doesn’t really fit. There’s not a lot of positive use for this software. It’s problematic in many ways. At first I thought this was posted because it somehow got into the Windows Store which would be a bad idea.</p>

      • lvthunder

        Premium Member
        27 June, 2019 - 10:33 am

        <blockquote><em><a href="#438049">In reply to spullum:</a></em></blockquote><p>People should know just how easy it is to create fake images.</p>

    • mattemt294

      27 June, 2019 - 12:45 pm

      In reply to skramer49: I agree. Why is this trash even on here?

    • jimchamplin

      Premium Member
      27 June, 2019 - 1:23 pm

      <blockquote><em><a href="#438039">In reply to skramer49:</a></em></blockquote><p>Because this is a major problem. It’s a frankly chilling and terrifying ability to generate incredibly believable falsehoods, and the lack of effort required means that it will become a trivial thing. </p><p><br></p><p>Not could. <em>Will</em>.</p><p><br></p><p>The “lulz culture” of internet bros will use it to troll and annoy and degrade people. </p>

  • gerardt

    Premium Member
    27 June, 2019 - 8:57 am

    Prime example of how technological advances are abused and used in a malicious, vile manner.

  • VMax

    Premium Member
    27 June, 2019 - 9:04 am

    <p>"For those unfamiliar" – that seems to include you, Mehedi. This article isn't really accurate about what a "deepfake" is (as pointed out by others), and it's long been possible to put someone's face on someone else's body so it's not a new risk. Just because an app exists to make it easier doesn't excuse pointlessly advertising that app as is being done here. IMO this article would be better not having been published.</p><p><br></p><p>Edit – You know what, the downvotes are right, and my comment was uncalled for. I apologise for that, but I'll leave it up because it's fair to judge me on it. If I were to try again, I'd say that this article perhaps could do with an adjustment to the definition of deepfake, and maybe it would've been better not to name the app, as naming it doesn't help anyone except bad actors.</p>

    • DaveHelps

      Premium Member
      27 June, 2019 - 2:11 pm

      <blockquote><em><a href="#438043">In reply to VMax:</a></em></blockquote><p><br></p><p>Hey, that’s not how the internet works, You can’t just say something, respond to feedback, change your mind and apologise. I’m afraid you’re legally bound by IETF RFC 1149 to maintain your previous view. What kind of world would it be if everyone suddenly started acting like a rational, mature adult? ?</p>

      • jboman32768

        Premium Member
        27 June, 2019 - 9:31 pm

        <blockquote><a href="#438123"><em>In reply to DaveHelps:</em></a><em> </em>&nbsp;I’m afraid you’re legally bound by IETF RFC 1149</blockquote><p><br></p><p>However one might argue that due to the IETF RFC 2795 an infinite number of monkeys might also be responsible.</p>

    • RonH

      Premium Member
      27 June, 2019 - 2:25 pm

      <blockquote><em><a href="#438043">In reply to VMax:</a></em></blockquote><p>Good on you.</p>

  • dontbe evil

    27 June, 2019 - 9:57 am

    Is it available on the Store? Oh wait, you wanted Win32 apps that can be installed from anywhere.

  • Thom77

    27 June, 2019 - 10:11 am

    The solution here is to force this to become exclusively a PWA so nobody knows it exists.

    • lvthunder

      Premium Member
      27 June, 2019 - 10:38 am

      <blockquote><em><a href="#438060">In reply to Thom77:</a></em></blockquote><p>No. The solution is to make sure everyone knows about it so people can be skeptical of what they see. Those that want to use this type of technology will find and use it regardless.</p>

      • Thom77

        27 June, 2019 - 10:39 am

        <blockquote><em><a href="#438067">In reply to lvthunder:</a></em></blockquote><p>I was being sarcastic while taking a shot at the PWA is the future people.</p>

  • RobCannon

    27 June, 2019 - 10:23 am

    I think the blackmail angle goes away. Since it is so easy to create a fake, it is easy to claim any photo is fake now.

  • Daekar

    27 June, 2019 - 12:25 pm

    Wow, I don't think I can remember more sensationalistic reporting on this site. Not only does it make it seem like women are the only ones who could be shamed with this sort of technology, it ignores the fact that women can and do shame each other. This isn't the Victorian era, where women were still the paragons of innocence and virtue.

    Amazing that anyone is surprised by this. The technology is only going to get better, so get used to it now. Very soon, whether it's reported on or not, deepfake nudes are going to be among the more innocuous uses of this tech… if somebody posts an auto-photoshopped nude of someone else, that doesn't lead to war or political disasters.

    • skane2600

      27 June, 2019 - 1:10 pm

      <blockquote><em><a href="#438101">In reply to Daekar:</a></em></blockquote><p>Your politics are showing. Obviously the claim that women aren't the only ones shamed isn't supported by the claim that women shame each other. You seem more interested in commenting on women's character than on the subject at hand.</p>

      • Daekar

        28 June, 2019 - 4:56 am

        <blockquote><em><a href="#438109">In reply to skane2600:</a></em></blockquote><p>I think you should go back to school and study logic. Those two statements are not dependent on each other for validity. </p>

        • skane2600

          28 June, 2019 - 10:44 am

          <blockquote><em><a href="#438249">In reply to Daekar:</a></em></blockquote><p>Perhaps you should review the use of a comma. It isn't used to separate independent clauses. </p>

  • hrlngrv

    Premium Member
    27 June, 2019 - 4:45 pm

    Useful to post about this on this site, was it? Would it be equally useful to post about apps for designing bombs?

  • Patrick3D

    27 June, 2019 - 5:03 pm

    Might as well ban pens, pencils, paints, and scissors. Where does the abandonment of freedom of speech end?

  • fbman

    28 June, 2019 - 1:53 am

    But the Windows Store is dead, so no one will download it…

    Sorry, I could not resist. P.S. I do use the Windows Store to purchase games (Forza).

    The idea is now out there, so this app might be pulled, but another 100 copycats will replace it. Pandora's box is now open.

  • Singingwolf

    28 June, 2019 - 5:25 am

    Just me, or is this the first clickbait article on Thurrott.com?

    Mehedi, don't lower yourself to the drudge of other sites. Just write articles that are on topic for this site.
