An app called DeepNude, which let users create fake nude pictures of women, has been taken down by its developers, who finally realised its potential for misuse. The anonymous team of programmers behind DeepNude used artificial intelligence to remove clothes from pictures of women and generate fake nude images in their place.
The developers announced on Twitter that they were taking down the app, which had been made available on Windows and Linux. The app was first highlighted by Motherboard, which also noted that it only produced nudes of women. Feeding it an image of a man would result in the app adding a vulva to the picture, which clearly shows the thinking that went into creating the app.
According to the report, DeepNude had a free version in which the nude image created carried a watermark saying it was a fake. A premium version would place the ‘fake’ watermark lower down and make it smaller. As the website points out, watermarks can be easily removed.
I’m glad DeepNude is dead. As a person and as a father, I thought this was one of the most disgusting applications of AI. To the AI Community: You have superpowers, and what you build matters. Please use your powers on worthy projects that move the world forward.
— Andrew Ng (@AndrewYNg) June 28, 2019
The potential for misuse and harassment that DeepNude would enable was clearly lost on the developer team. Their tweet notes that they created the app only for users’ entertainment. The idea that turning photos of dressed women into nudes without their consent qualifies as entertainment is appalling in itself.
Further, the app’s Twitter account carries the tagline, ‘The superpower you always wanted.’ Clearly, creating non-consensual nudes of women is considered a superpower.
In their takedown notice, the developers further defend themselves, saying they had only planned to sell the app in a controlled manner. They have also said the app is not that great and works accurately only in some situations. Should women be relieved that only a few of their photos can be converted into nudes? We’re not sure.
Further, the letter goes on to acknowledge that if too many people were to use it (around 500,000 is mentioned, which is not a small number at all), there is a probability that it will be misused. This raises the question of why the developers think it is a problem only if too many people use the app. Be it 100 people or half a million, the idea behind DeepNude is deeply problematic.
Coming to the deepfake technology used to achieve this: it relies on sophisticated tools afforded by artificial intelligence and machine learning, which can fake someone’s voice, face, even body language, and add them to a video or an image. Deepfakes are not the crude photoshopped images or badly edited videos that are easy to spot. Deepfakes can accurately copy the mannerisms and voice of the person they are supposed to be imitating.
The developers have also said they don’t want to make money this way, adding that they will not release the licence to anyone else and will not activate the premium versions either. However, they acknowledge that other developers could sell copies of DeepNude on the web, but insisted they themselves would not be selling the app.
Check out their tweet below
— deepnudeapp (@deepnudeapp) June 27, 2019
The letter ends saying, ‘People who have not yet upgraded will receive a refund. The world is not yet ready for DeepNude.’ The world should ideally never be ready for an app like DeepNude, considering that it represents a violation of privacy and objectifies women.
Deepfakes pose a serious problem for how content on the internet will be perceived and could further aggravate the spread of misinformation. Recently, a deepfake video of Mark Zuckerberg went viral, showcasing exactly why this is a serious problem. One of the biggest problems with deepfake AI is, of course, the potential for misuse and the harassment of women.
A recent article on HuffPost US revealed that ordinary women are already being harassed through deepfake technology, which is deployed to create fake pornographic videos of them. Some of the women have tried, unsuccessfully, to get these videos removed from the pornographic websites where they are hosted. And with an app like DeepNude on the scene, such harassment would have been even easier to carry out.