DeepNude Is Back, as Open Source: How Should We Respond?

Update, July 9, 7:55 p.m. EST: GitHub removed the DeepNude source code from its website.

After four days on the market, the creator(s) of DeepNude, the AI that "undressed" women, retired the app following a glut of backlash from individuals including leaders of the AI community. Although DeepNude's algorithm, which constructed a deepfake nude image of a woman (not a person: a woman) from a semi-clothed picture of her, wasn't sophisticated enough to pass forensic analysis, its output was passable to the human eye once the company's watermark over the constructed nude (in the free app) or the "FAKE" stamp in the image's corner (in the $50 version) was removed. (One common forensic technique is sketched at the end of this post.)

But retiring the app didn't make the algorithm disappear. Quite the opposite: it's back as an open source project on GitHub, making it more dangerous than it was as a standalone app. Read more here.

The upside for potential victims is that, so far, the algorithm is failing to meet expectations. The downside of DeepNude becoming open source is that the algorithm can now be trained on a larger dataset of nude images to increase ("improve") the accuracy of the resulting nude images.

If technology's ability to create fake images, including nudes, convincing enough to fool the human eye isn't new, why is this significant? Thanks to applications such as Photoshop and the media's coverage of deepfakes, if we don't already question the authenticity of digitally produced images, we're well on our way to doing so. In the example below, Photoshop is used to overlay Katy Perry's face onto Megan Fox's (clothed) body.

DeepNude effectively follows the same process. What's significant is that it does so very quickly, via automation. And instead of overlaying one person's face onto one other person's body, it uses a machine learning algorithm trained on a dataset of over 10,000 images of nude women, so reverse-engineering an output image into its component parts would be nearly impossible. (A rough sketch of this kind of pipeline also appears at the end of this post.)

All this raises the question: how should we respond? Can we prevent victimization by algorithms like these? If so, how?

What role does corporate responsibility play? Should GitHub, or Microsoft (its parent company), be responsible for taking down the DeepNude source code and for implementing controls to prevent it from reappearing, at least until victimization can be prevented? (A toy version of one such control closes this post.)

Should our response be social? Is it even possible to teach every person on the planet (including curious adolescents whose brains are still maturing and who may be tempted to use DeepNude indiscriminately) that consent must be asked for and given freely?

Should we respond legislatively? Legally, creating a DeepNude of someone who didn't provide consent could be treated as a felony similar to blackmail, independent of how the fake image is used. Just this month, Virginia passed an amendment expanding its ban on nonconsensual pornography to include deepfakes.
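A technical aside for curious readers. The post notes that DeepNude's output couldn't pass forensic analysis without naming a technique. One common, simple one is error level analysis (ELA): re-save the image as JPEG and amplify the pixel-level differences, since spliced or synthesized regions often re-compress differently from the rest of the photo. The sketch below is illustrative only (it uses the Pillow library; the file name and quality setting are arbitrary assumptions), and its output is meant for visual inspection rather than automated detection.

```python
# Illustrative error level analysis (ELA) sketch, not the forensic tooling
# the post refers to. Requires Pillow (pip install Pillow).
import io
from PIL import Image, ImageChops, ImageEnhance

def error_level_analysis(path: str, quality: int = 90) -> Image.Image:
    """Re-save an image as JPEG and brighten the per-pixel differences.

    Regions that were pasted in or synthesized often show a different
    error level than the rest of the photo when re-compressed.
    """
    original = Image.open(path).convert("RGB")
    buffer = io.BytesIO()
    original.save(buffer, "JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer).convert("RGB")

    diff = ImageChops.difference(original, resaved)
    max_diff = max(band_max for _, band_max in diff.getextrema()) or 1
    # Scale so the largest difference maps to full brightness.
    return ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)

# Usage (hypothetical file name):
# error_level_analysis("suspect_photo.jpg").show()
```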
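On the "same process, but automated" point: tools like DeepNude are generally image-to-image translation models (DeepNude was reportedly derived from the pix2pix architecture). The toy generator below is a minimal sketch of that idea in PyTorch, not DeepNude's code; the layer sizes, names, and random input are all illustrative assumptions. The takeaway is that inference is a single forward pass through learned weights, which is both why it is fast and why an output image cannot be traced back to particular source photos.

```python
# Minimal sketch of an image-to-image generator, in the spirit of
# pix2pix-style models. Architecture and sizes are illustrative only.
import torch
import torch.nn as nn

class TinyGenerator(nn.Module):
    """Toy encoder-decoder that maps one 256x256 RGB image to another."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1),    # 256x256 -> 128x128
            nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 4, stride=2, padding=1),  # 128x128 -> 64x64
            nn.LeakyReLU(0.2),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1),  # 64 -> 128
            nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1),    # 128 -> 256
            nn.Tanh(),  # output pixels in [-1, 1]
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Inference is one forward pass: no manual cutting, masking, or compositing
# as in the Photoshop workflow. That is the "automation" the post points at.
generator = TinyGenerator().eval()
with torch.no_grad():
    source = torch.randn(1, 3, 256, 256)  # stand-in for an input photo
    fake = generator(source)              # synthesized output image
print(fake.shape)  # torch.Size([1, 3, 256, 256])
```

Contrast this with the Photoshop workflow, where each fake is assembled by hand from identifiable source images; here the "source material" is dissolved into the trained weights.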
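Finally, on the corporate responsibility question: one building block a code host could use to keep a taken-down project from quietly reappearing is a digest blocklist checked when files are pushed. The sketch below assumes a hypothetical blocklist and helper names; it is not GitHub's actual moderation pipeline. Exact hashes are trivially evaded by editing a single byte, which is why real systems layer fuzzy matching and human review on top.

```python
# Toy upload-time check against a blocklist of known-bad file digests.
# Standard library only; requires Python 3.9+ for the dict[...] annotation.
import hashlib

BLOCKED_SHA256 = {
    # SHA-256 of b"test", standing in for the digest of a blocked file.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def blocked_paths(pushed_files: dict[str, bytes]) -> list[str]:
    """Return the paths in a pushed changeset whose digests are blocklisted."""
    return [path for path, blob in pushed_files.items()
            if sha256_of(blob) in BLOCKED_SHA256]

# Example: one file in the push matches the blocklist.
push = {"main.py": b"test", "README.md": b"hello"}
print(blocked_paths(push))  # ['main.py']
```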