#GitHub Is Banning Copies Of #deepfakes Porn App #DeepNude

This feels like Napster déjà vu. And it is exactly what happens to an open source community once it's taken over by Big Tech.

New York (The Verge) — GitHub is banning copies of “deepfakes” porn app DeepNude theverge.com/2019/7/9/20687….

Microsoft-owned software development platform GitHub has banned copies of the officially discontinued DeepNude app, which produced nonconsensual nude pictures of women using AI.

DeepNude was originally a paid app that created nonconsensual nude pictures of women using technology similar to AI “deepfakes.” The development team shut it down after Motherboard’s report, saying that “the probability that people will misuse it is too high.” However, as we noted last week, copies of the app were still accessible online — including on GitHub.

Late that week, the DeepNude team followed suit by uploading the core algorithm (but not the actual app interface) to the platform. “The reverse engineering of the app was already on GitHub. It no longer makes sense to hide the source code,” wrote the team on a now-deleted page. “DeepNude uses an interesting method to solve a typical AI problem, so it could be useful for researchers and developers working in other fields such as fashion, cinema, and visual effects.”

GitHub’s guidelines say that “non-pornographic sexual content may be a part of your project, or may be presented for educational or artistic purposes.” But the platform bans “pornographic” or “obscene” content.

DeepNude didn’t invent the concept of fake nude photos — they’ve been possible through Photoshop, among other methods, for decades. And its results were inconsistent, working best with photos where the subject was already wearing something like a bikini. But Motherboard called them “passably realistic” under these circumstances, and unlike Photoshop, they could be produced by anyone with no technical or artistic skill. (Adi Robertson, The Verge)

Source: The Verge (full story)

 
