Microsoft Designer Adds New Safeguards to Block AI-Generated Porn Deepfakes

Microsoft Designer

Microsoft has added new safeguards to its Designer tool after reports surfaced that the text-to-image generation tool was being used to create porn deepfakes of celebrities such as Taylor Swift. 404 Media, which first reported last week that fake nude images of the singer were being shared on X, discovered that people on 4chan and a Telegram channel were discussing how to use Microsoft Designer to create these Taylor Swift deepfakes.

Microsoft Designer uses the same technology as OpenAI’s popular DALL·E app, which allows users to create images from a simple text prompt. Just like Microsoft Copilot, Microsoft Designer is currently free to use, though the app still has a “preview” tag.

Windows Intelligence In Your Inbox

Sign up for our new free newsletter to get three time-saving tips each Friday — and get free copies of Paul Thurrott's Windows 11 and Windows 10 Field Guides (normally $9.99) as a special welcome gift!

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

In an interview with NBC News on Friday, Microsoft CEO Satya Nadella said that the company has to “move fast” to prevent users of its AI tools from creating non-consensual intimate content. “I go back to what I think’s our responsibility, which is all of the guardrails that we need to place around the technology so that there’s more safe content that’s being produced,” Nadella said. “And there’s a lot to be done and a lot being done there.”

Microsoft also said in a statement last week that it’s been investigating reports about Taylor Swift deepfakes, but the company has been unable to reproduce such images using its AI tools. “Our content safety filters for explicit content were running and we have found no evidence that they were bypassed so far. Out of an abundance of caution, we have taken steps to strengthen our text filtering prompts and address the misuse of our services,” the company said.

Today, 404 Media independently confirmed that the Microsoft Designer loopholes that could be used to create porn deepfakes of celebrities were no longer working. Last week, people could reportedly try to misspell the name of celebrities and use prompts with no explicitly sexual language to get around the safeguards.

Overall, this is definitely a bad look for Microsoft’s AI technology, but this should also be a serious warning for any company working on generative AI technology while promising to do that in a “responsible” manner. While Microsoft acted pretty fast to fix its Designer tool, X, the platform formerly known as Twitter, also temporarily blocked Taylor Swift searches to prevent the porn deepfakes from spreading.

Tagged with

Share post

Please check our Community Guidelines before commenting

Windows Intelligence In Your Inbox

Sign up for our new free newsletter to get three time-saving tips each Friday

"*" indicates required fields

This field is for validation purposes and should be left unchanged.

Thurrott © 2024 Thurrott LLC