A guy made a deepfake app to turn photos of women into nudes. It didn’t go well.

The quick rise and quicker fall of an app that would’ve made any woman a potential victim.

An anonymous programmer created a new app called DeepNude that uses AI to create nonconsensual porn. If you feed it a picture of a clothed woman, it removes her clothes so that she appears naked. The result is pretty realistic — and blatantly unethical.

This app is the latest evolution of AI-powered deepfake technology, which makes it disturbingly easy to doctor images to make it look like someone said or did something they never actually said or did. Even if you’ve never posed naked for a photo in your life, anyone who downloads DeepNude can make it look as if you did.

So it was really great news when the programmer shut the app down on Thursday, just days after releasing it.

After Motherboard first reported on the “horrifying” new app, other news outlets followed suit with critical coverage. In an excellent example of how public scrutiny can bring unethical AI-powered tech to a halt, the developer then said he realized that “the probability that people will misuse it is too high.”

— deepnudeapp (@deepnudeapp) June 27, 2019

Damn straight. The short-lived app was free, easy to use, and fast — the digital disrobing only took 30 seconds.

In other words, it had all the ingredients necessary to turn an unsuspecting woman’s existence into a living hell. You don’t need a whole lot of imagination to realize how it could be used to produce revenge porn that would be all the more devastating to the target’s life because of how realistic some of the nude photos look. (For ethical reasons, I’m not going to include examples of the photos in this article.)

Although the programmer’s decision to pull his product is welcome, he capped off the announcement with a conclusion that seems naive at best and disingenuous at worst: “The world is not yet ready for DeepNude.”

How could the world ever be ready for an app whose sole and explicit purpose is to transform regular photos of women into nudes within seconds? Or, put another way, of course people were going to “misuse” the app. There was no way to use it other than for that very sort of “misuse.”

Note that the app only worked on women (if you tried to use it on a photo of a man, it just added female genitalia to him) — which right away should make you dubious about the programmer’s original intentions.

Deepfakes are most often discussed as a threat in the political realm, because of their potential to sow misinformation and fake news. You may have heard them discussed in relation to the doctored video of House Speaker Nancy Pelosi that went viral in May, or the fake video of Facebook CEO Mark Zuckerberg that made the rounds this month. (In both cases, Facebook refused to remove the videos from its platform.) Congress is holding hearings about the technology, and Rep. Yvette Clarke (D-NY) has put forward a bill known as the Deepfakes Accountability Act.

Less discussed is the unique danger that deepfake technology poses to women. It’s time we realized that when our AI systems are not aligned with our ethical values, women are particularly likely to suffer.

In the deepfake era, every woman is a potential victim

Since the public started playing around with deepfake technology a couple of years ago, it’s been used to harm women.

In 2017, fake celebrity porn videos started appearing in the subreddit r/deepfakes. People had altered videos of actual porn actors, swapping out their faces for those of celebrities. In February 2018, Reddit banned the community for distributing “involuntary pornography.”

Motherboard’s Samantha Cole, who first reported on the DeepNude app, noted that it “dispenses with the idea that deepfakes were about anything besides claiming ownership over women’s bodies.” She wrote:

In an email, the anonymous creator of DeepNude, who requested to go by the name Alberto, told Motherboard that the software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses generative adversarial networks (GANs), which work by training an algorithm on a huge dataset of images — in the case of DeepNude, more than 10,000 nude photos of women, the programmer said — and then trying to improve against itself.
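To make the "improve against itself" idea concrete: in a GAN, a generator network produces fakes while a discriminator network tries to tell fakes from real data, and each is trained against the other. The toy sketch below illustrates that adversarial loop on trivial 1-D data with single linear layers — it is an invented illustration of the general GAN technique, not code from pix2pix or DeepNude, and every name and number in it is made up for the example.

```python
# Toy sketch of a GAN's adversarial training loop (illustrative only; the
# real pix2pix model uses deep convolutional networks on images).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real" data the generator must learn to imitate: samples near 4.0.
def real_batch(n):
    return rng.normal(loc=4.0, scale=0.5, size=(n, 1))

# Generator: one linear layer mapping random noise -> a fake sample.
g_w, g_b = rng.normal(size=(1, 1)), np.zeros((1, 1))
# Discriminator: logistic regression estimating P(sample is real).
d_w, d_b = rng.normal(size=(1, 1)), np.zeros((1, 1))

lr, batch = 0.02, 64
for step in range(3000):
    z = rng.normal(size=(batch, 1))
    fake = z @ g_w + g_b
    real = real_batch(batch)

    # Discriminator update: push D(real) toward 1 and D(fake) toward 0.
    for x, label in ((real, 1.0), (fake, 0.0)):
        p = sigmoid(x @ d_w + d_b)
        grad = p - label                       # BCE gradient w.r.t. the logit
        d_w -= lr * (x.T @ grad) / batch
        d_b -= lr * grad.mean(axis=0, keepdims=True)

    # Generator update: adjust g_w, g_b so the discriminator is fooled
    # into labeling fakes as real (target label 1).
    fake = z @ g_w + g_b
    p = sigmoid(fake @ d_w + d_b)
    grad_logit = p - 1.0
    grad_fake = grad_logit @ d_w.T             # backprop through D's layer
    g_w -= lr * (z.T @ grad_fake) / batch
    g_b -= lr * grad_fake.mean(axis=0, keepdims=True)

# Generated samples should drift toward the real data's mean (4.0).
samples = rng.normal(size=(1000, 1)) @ g_w + g_b
```

Scale this pattern up to deep networks trained on thousands of photos and you get systems like the one Motherboard describes, which is why the size and content of the training dataset matters so much.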

The programmer, Alberto, said it was “fun and enthusiasm” for new technologies that drove him to create the app. “I’m not a voyeur, I’m a technology enthusiast,” he said. “Recently, also due to previous failures (other startups) and economic problems, I asked myself if I could have an economic return from this algorithm. That’s why I created DeepNude.”

Translation: He realized he could profit off women’s bodies and the invasion of their sexual privacy. Although he offered a basic version of the app for free, the nudes it churned out came partly occluded with a large watermark that said “FAKE.” For $50, though, you could get a version of the app that only put “FAKE” in the upper-left corner, making it very easy to crop out.

What’s more, as late as 1 pm on Thursday, the DeepNude website read: “Are you interested in investing or buying our startup? We are evaluating offers.”

Alberto told Motherboard that he’d asked himself whether his app should have been made in the first place. “Is this right? Can it hurt someone?” he mused. But he dismissed his misgivings.

“I think that what you can do with DeepNude, you can do it very well with Photoshop (after a few hours of tutorial),” he said. “I also said to myself: the technology is ready (within everyone’s reach). So if someone has bad intentions, having DeepNude doesn’t change much. … If I don’t do it, someone else will do it in a year.”

This is a rationale I’ve often heard trotted out in relation to new technologies. Some people working on cutting-edge AI systems tell me there will always be bad actors; if they don’t create these systems, the bad actors will. Sometimes, they argue it’s better that well-intentioned people release the new AIs while making their process fully transparent and warning about the ethical dangers inherent in them.

But releasing new AIs that are misaligned with ethical values inevitably contributes to changing the norms in the AI community. If one person releases an app like DeepNude, they open the gate for others, who will arguably find it easier to justify charging into the space.

As the threat from deepfakes becomes more obvious, politicians, lawmakers, AI experts, and even CEOs like Zuckerberg are beginning to realize we need a harsher approach to them. It’ll probably take some time for new legislation to be put in place, though. In the meantime, the threat from deepfake technology is something every member of the public should be aware of. With free apps like DeepNude entering the market, we’re all at risk of being the next victim.

The good news is that, as the DeepNude episode showed, journalistic scrutiny and public outcry can sometimes stem the tide.


