A deepfake website that generates “nude” images of women using artificial intelligence is spreading its murky tentacles across the web, spawning look-alike services through partner agreements and recruiting new users through a referral system.

The expansion efforts have allowed the service to proliferate despite bans placed on its payment infrastructure. The website, which WIRED is not naming to limit its amplification, has existed since last year. It digitally “removes” clothing from non-nude photos to create nonconsensual pornographic deepfakes. Researchers say its output is “hyper-realistic,” and unlike similar abusive platforms, it can generate pornographic images even when the person in the original photo is fully clothed. Previously, similar technologies worked only with partially clothed photographs.

In recent months the website has expanded its services, earning its creator potentially thousands of dollars. The website has made its algorithms available to “partners” through access to its APIs, and two spin-off websites have been created by other people. The original website has been reported on previously, but the extent of its partner programs has not. The website’s startup-like growth tactics signal a maturity in abusive “nudifying” deepfake technologies, which overwhelmingly target and harm women.

Since the first AI-generated fake porn was created by a Redditor at the end of 2017, these systems have become more sophisticated. The technology was turned into its first app, dubbed DeepNude, in 2019; although its creator took the app down, its code still circulates. Since then, this kind of technology has become as easy to use as selecting a photo and clicking upload. Recent horrifying developments have also included easy-to-use video production. With the increased ease of use, targets of harassment have moved from high-profile celebrities and influencers to members of the public.

The expansion of this recent site and its partnerships commoditizes those intrusions even further. “The quality is much higher,” says Henry Ajder, an adviser on deepfakes and head of policy and partnerships at synthetic media company Metaphysic. “The people behind it have done something which hasn't really been done since the original DeepNude tool … that's trying to build a strong community around it.”

The inclusion of partner programs and payment services across the website and its two spin-offs indicates that this kind of technology is at a tipping point, says Sophie Maddocks, a researcher at the University of Pennsylvania’s Annenberg School for Communication who specializes in studying online gender-based violence. “This harm is going to become part of the sex industry, it's going to become profitable … it's going to become normalized,” Maddocks says. Society, technology companies, and law enforcement need to have a “zero tolerance” approach to these deepfakes, she adds.

The websites are raking in money for their creators. All three charge people for processing the images, ranging from $10 for 100 photos to $260 for 2,000. They offer a limited number of free images, billed as trials of the technology, but visitors are pushed toward payment. At various points in their existence, they have accepted bank transfers, PayPal, Patreon, and multiple cryptocurrencies. Many of these providers, such as Coinbase, cut ties after previous media reports. All three sites still accept various cryptocurrencies for payment. Ivan Bravo, the creator of the spin-off website that claims to have more than 3,000 paying customers, says “it is not correct” morally that he makes money from a service that harms people.