Fake news? No, fake nudes!
8th July 2019, 21:51
Deepfake technology has been up and coming for the past few years. From facial synthesis to cloned voices to video manipulation, the possibilities for abuse are terrifying. For instance, there may come a day when audio and video evidence can no longer be submitted in court because fake media is indistinguishable from the real thing.
Terrifying, undoubtedly. But also fascinating.
From a technical point of view, this application of technology is interesting. It's art and science in one heady package. The possibilities for actual practical use are myriad, ranging from the mundane (test profile photos for social media simulations) to the awesome (fully computer-generated action movies where actors aren't at risk of getting injured).
But sometimes people go too far. And a couple weeks ago, the development team behind the now-defunct app DeepNude found out the hard way.
DeepNude is an app that creates simulated nudity. It purports to digitally remove the clothing from any picture of a woman, and display her naked. When this app was released, the outrage went viral, and while I'm not a big fan of online outrage in general, it's hard to say this was wholly undeserved. Women, in particular, were horrified at the implications regarding enabling perverted fantasies and revenge porn.
The tech, in layman's terms
Of course, DeepNude doesn't actually digitally remove clothes to reveal the person's naked body beneath. Unless its creators had actual footage of that person's naked body (certain celebrities in the adult entertainment industry, for example), that would be impossible. The person might have tattoos, birthmarks, or a third nipple. There's no way to tell.

What this app actually does is refer to its database of body parts and replace the clothed body in the photo with a computer-generated nude one. Kind of like what people have been doing to celebrity photos for years (painstakingly, using Photoshop or the like), but now automated via software.
Apology and shutdown
The creators of DeepNude shut down their operation after the app crashed under overwhelming traffic, and issued an apology.
Here is the brief history, and the end of DeepNude. We created this project for users' entertainment a few months ago. We thought we were selling a few sales a month in a very controlled manner. Honestly, the app is not that great, it only works with particular photos. We never thought it would become viral and we would not be able to control the traffic. We greatly underestimated the request.
Despite the safety measures adopted (watermarks) if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way. Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use. Not even the licenses to activate the Premium version.
People who have not yet upgraded will receive a refund.
The world is not yet ready for DeepNude.
If you ask me, this seems too little, too late.
This app is deeply sexist not because of misogyny. Not because it shows women nude. But because it doesn't do the same for men. If you feed the app a man's image, it's going to show a naked woman's body with a man's head.
[Image caption: This is only a simulated example using my sick Photoshop skillz!]
So if this app only allows users to "strip" women (and deny the ladies a similar pleasure), what's this for other than cheap male thrills? The creators didn't think it would be abused? They don't want to make money this way? That's cute.
The world is not yet ready for DeepNude? Bitch, please. As mentioned earlier, people have been doing this for years. The app simply made it easier. The world's been more than ready for something like DeepNude for a while now. Guess it was really a matter of time.
An interesting question
I've seen people question why "perverted geeks" like the creators would think this was a good idea in the first place. It seemed like a genuine question, so I thought I'd answer it.

You see, that's the thing about us developers. We don't think. Not much about the non-technical stuff, anyway.
Case in point: when I first heard about the app, you know what my first thought was? Not how horrifying and misogynistic it was, but how something like that was supposed to work.
Even after processing the entire spectacle, moral outrage is the last thing on my mind. (Not being a woman, I suspect any moral outrage I expressed would come off as disingenuous at best.) My predominant position is that deepfake technology has tremendous potential, and the creators of DeepNude using it as wank material strikes me as a huge waste.
Generally, when a developer is presented with a problem, our first instinct is to solve it, not to moralize about it. We spend more time thinking about whether or not it can be done than whether or not it should be done. That's why we need business people to tell us what's actually useful and commercially viable (and in good taste). Because, on this evidence, left to our own devices, we geeks would just waste our time churning out technically brilliant but utterly useless junk.
We automate stuff. And in this case, the choice of stuff to automate was unfortunate.
And that's the naked truth!