This New Windows App Lets Anyone Make Deepfake Nude Images of Women

Posted on June 27, 2019 by Mehedi Hassan in Uncategorized with 46 Comments

Update: The developers have now pulled the app and shut it down, admitting that the likelihood of people misusing it is too high. Original story follows.

Deepfakes have been a major talking point over the last year or so. For those unfamiliar, deepfakes are fake images or videos created using artificial intelligence, typically by mapping one person's face onto another's body; the technique is most notoriously used to produce fake nude imagery of women. These neural networks let anyone create a convincing fake of almost anyone within minutes. And yes, they are being dangerously abused.

Now that abuse has become even easier. An anonymous developer has created DeepNude, a new app for Windows and Linux that lets anyone create a fake nude image of a woman within seconds. The app has a free version, and if you pay $99 for the premium version, you get the fake nudes without any watermarks, reports Motherboard.

For now, the app only works with pictures of women, though the developer apparently plans to build a version that works with men, too. The app even stamps a "fake" label on the image to let people know the nude isn't real, but it's very easy to crop out. The original deepfake tools introduced the world to fake nude images and videos but required real expertise; this new app makes the process as easy as using a simple photo editor. It's scary.

What's infuriating about this is that any creep out there can take your photos, generate a fake nude of you, and then use it to blackmail you. It's fairly easy to spot that these deepfakes aren't real nude images, but they could still fool a lot of people, and the algorithm is only going to get better. Worse, there isn't much legislation around deepfakes yet, and creepy men online are abusing these algorithms and apps to the best of their abilities.


Comments (46)

  1. MadsM

    I wonder if similar pearl-clutching has been written about Photoshop.

    • jbinaz

      In reply to MadsM:

      Pearl-clutching? Over degrading, humiliating, and embarrassing women? Seems like a pretty legit concern to me.


And there's a huge barrier: Photoshop takes time and effort, which many people won't expend. This seems pretty easy: give it a photo and it's done.

      • MadsM

        In reply to jbinaz:

        It could just as easily affect men. I don't understand the need to paint women as eternal victims.

        • garumphul

          In reply to MadsM:

          I agree in principle, but let's be absolutely honest - the vast majority of the time it's men victimizing women with fake nudes.

        • spullum

          In reply to MadsM:

Then you are extremely tone deaf at best. How many men are going to be realistically impacted by this? This is a script-kiddie-like tool for anyone who gets mad at a woman or girl, from middle school through adulthood. An ex, rival, co-worker, someone who is pissed off, whoever, can now find a normal photo, make a nude of it, and convince others it's real to shame someone.

          • lvthunder

            In reply to spullum:

            Society will just evolve to where it's not shameful to be naked.

            • AnOldAmigaUser

              In reply to lvthunder:

              The way things are going here, it is more likely to devolve into a Handmaid's Tale scenario. There is a big difference between faking nude photos of someone, and people feeling unashamed of nudity.

The only reason for this sort of application is to create images meant to shame an individual, or to pander to the stunted imaginations of those unlikely to have a meaningful relationship with another human.

            • phuor

              In reply to lvthunder:

              There will still be a problem if the fake nudes look better than the real body.

            • mmcewan

              In reply to lvthunder:
Yes, I'm surprised that our so-called modern society is still so prudish about nudity. I think most people are already aware that nudes can be faked. Perhaps it's the result of all this "You are so special" crap that is espoused everywhere. In reality, nobody's body is all that special. If you're a Homo sapiens, well, that comes in two models.

              The sole power of a 'nude photo scandal' is in the reaction of the offended party. Exhibit no sense of shame and make yourself bulletproof against this sort of petty attack. You can photoshop anyone's face onto any sort of body, plant, statue, robot, so what?
              Perhaps we will all have to copyright our own nudes and sue for loss of perceived value when fakes are distributed.


  2. Thom77

    The solution here is to force this to become exclusively a PWA so nobody knows it exists.

  3. gerardt

    Prime example of how technology advances are abused and used in a malicious, vile manner.

  4. VMax

    "For those unfamiliar" - that seems to include you, Mehedi. This article isn't really accurate about what a "deepfake" is (as pointed out by others), and it's long been possible to put someone's face on someone else's body so it's not a new risk. Just because an app exists to make it easier doesn't excuse pointlessly advertising that app as is being done here. IMO this article would be better not having been published.


    Edit - You know what, the downvotes are right, and my comment was uncalled for. I apologise for that, but I'll leave it up because it's fair to judge me on it. If I were to try again, I'd say that this article perhaps could do with an adjustment to the definition of deepfake, and maybe it would've been better not to name the app, as naming it doesn't help anyone except bad actors.

  5. dontbe evil

Is it available on the Store? Oh wait, you wanted win32 apps that can be installed from anywhere.

  6. RobCannon

    I think the blackmail angle goes away. Since it is so easy to create a fake, it is easy to claim any photo is fake now.

  7. Daekar

Wow, I can't remember more sensationalistic reporting on this site. Not only does it make it seem like women are the only ones who could be shamed with this sort of technology, it ignores the fact that women can and do shame each other. This isn't the Victorian era, where women were still the paragons of innocence and virtue.


Amazing that anyone is surprised by this. The technology is only going to get better, so get used to it now. Very soon, whether it's reported on or not, deepfake nudes are going to be among the more innocuous uses of this tech... if somebody posts an auto-photoshopped nude of someone else, that doesn't lead to war or political disasters.

  8. hrlngrv

    Useful to post about this on this site, was it? Would it be equally useful to post about apps for designing bombs?

  9. Patrick3D

    Might as well ban pens, pencils, paints, scissors, where does the abandonment of freedom of speech end?

  10. fbman

But the Windows Store is dead... so no one will download it...


    Sorry, I could not resist. P.S. I do use the Windows Store to purchase games (Forza).


    The idea is now out there, so this app might be pulled, but another 100 copycats will replace it. Pandora's box is now open.



  11. skramer49

    Just curious. Why are you giving it free publicity? Seems a creepy thing to do.

  12. Thom77

I'm just here for the Streisand Effect.

  13. jimchamplin

    The scope of the “Deepfake” problem goes deeper. Think literally “putting words into someone’s mouth” level chicanery.

    • wright_is

      In reply to jimchamplin:

Yes, deepfakes are about more than just naked women; the technique has moved on to audio and video in recent months.

    • christian.hvid

      In reply to jimchamplin:

      The flip side of deepfakes is that they provide plausible deniability for those who need it - if someone publishes a video of you doing or saying something inappropriate or illegal, you can always claim it's a deepfake. I can imagine a certain president howling "DEEPFAKE NEWS!" whenever confronted with actual footage of his behavior.

    • garumphul

      In reply to jimchamplin:

Inevitably we will end up in situations where a fake video is claimed to be real, but detailed analysis will show that it's a fake.

      The bigger problem (perhaps) is when a real video is claimed to be fake, and the lack of any evidence that it's a fake will be brushed off as the video being a really good fake.

      The speed at which the technology is progressing is both amazing and terrifying, but I suspect the damage will outweigh any benefit by a very, very large factor.

  14. Pbike908

    Pretty wild...Not sure if there is anything that can really be done about this.


Imagine the implications when it becomes technologically feasible to create deepfake videos of innocent folks committing crimes. Especially given all the internet-connected cameras out there...

  15. Singingwolf

Is it just me, or is this the first clickbait article on Thurrott.com?


    Mehedi, don't lower yourself to the drudge of other sites. Just write articles that are on topic for this site.

