On Wednesday, a Vice article alerted the world to the creation of DeepNude, a computer program that uses neural networks to transform an image of a clothed woman into a realistic rendering of what she might look like naked.
The software attracted widespread condemnation. This is an “invasion of sexual privacy,” legal scholar Danielle Citron told Vice.
The software’s anonymous creator explained to Vice’s Samantha Cole how it worked:
The software is based on pix2pix, an open-source algorithm developed by University of California, Berkeley researchers in 2017. Pix2pix uses generative adversarial networks (GANs), which work by training an algorithm on a huge dataset of images—in the case of DeepNude, more than 10,000 nude photos of women, the programmer said—and then trying to improve against itself.
The software focused on women because it was easier to find pictures of nude women, the programmer said. If you run the program on a clothed picture of a man, it will depict him naked with female private parts.
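The adversarial setup described above—a generator producing fakes while a discriminator learns to tell them from real data, each improving against the other—can be sketched in miniature. The toy below is purely illustrative and assumes nothing about DeepNude itself: it uses one-parameter "networks" on 1D Gaussian data rather than pix2pix's conditional image-to-image architecture, but the training loop has the same adversarial shape.

```python
import numpy as np

# Toy GAN: a generator learns to mimic samples drawn from N(4, 1.25).
# Illustrative only; pix2pix is a conditional image-to-image GAN with
# deep convolutional networks, not the linear maps used here.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Generator:      g(z) = z * g_scale + g_shift   (noise -> fake sample)
# Discriminator:  d(x) = sigmoid(x * d_w + d_b)  (probability x is real)
g_scale, g_shift = 1.0, 0.0
d_w, d_b = 0.1, 0.0
lr = 0.01

for step in range(2000):
    real = rng.normal(4.0, 1.25, size=32)   # samples from the real data
    z = rng.normal(0.0, 1.0, size=32)       # noise input
    fake = z * g_scale + g_shift            # generator output

    # Discriminator step: push d(real) toward 1 and d(fake) toward 0.
    # err_* are the binary cross-entropy gradients w.r.t. the logits.
    err_real = sigmoid(real * d_w + d_b) - 1.0
    err_fake = sigmoid(fake * d_w + d_b) - 0.0
    d_w -= lr * (err_real * real + err_fake * fake).mean()
    d_b -= lr * (err_real + err_fake).mean()

    # Generator step: push d(fake) toward 1 by adjusting its parameters.
    # Chain rule through the discriminator: dlogit/dg_shift = d_w,
    # dlogit/dg_scale = d_w * z.
    err_g = sigmoid(fake * d_w + d_b) - 1.0
    g_scale -= lr * (err_g * d_w * z).mean()
    g_shift -= lr * (err_g * d_w).mean()

# After training, generated samples should have drifted toward the
# real distribution (mean 4.0), since that is where the discriminator
# can no longer tell them apart.
samples = rng.normal(0.0, 1.0, size=1000) * g_scale + g_shift
```

The "improve against itself" dynamic the quote mentions is exactly the alternation above: each discriminator update sharpens the critic, and each generator update exploits the current critic, so neither side can stop improving without losing ground.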
The DeepNude website was quickly swamped with downloads and was soon taken down. On Thursday morning, the team behind DeepNude promised to upgrade the website, fix some bugs in the software, and bring it online in a few days.
But the public backlash against the software proved too powerful. By Thursday afternoon, the software’s author announced that it was being taken down permanently.
In a statement announcing the decision, the anonymous creator noted that the program included watermarks in generated photographs as a “safety measure” to prevent them from being misused—presumably referring to would-be users presenting a fake image as if it were real. However, he said, “if 500,000 people use it, the probability that people will misuse it is too high. The world is not yet ready for DeepNude.”
Critics questioned whether there was any way to use the software that wasn’t a “misuse.” In any event, the software is no longer available, at least from official sources. Pirated copies continue to circulate online.