A scary new AI app swaps women into porn videos in one click


Deepfakes, synthetic media generated by artificial intelligence, have been used primarily to create pornographic depictions of women, who often find the experience psychologically devastating, since the technology first emerged. The Redditor who originally popularized the technique swapped female celebrities' faces into porn videos. Research firm Sensity AI estimates that between 90% and 95% of all deepfake videos online to date are nonconsensual porn, and roughly 90% of those feature women.

As the technology has advanced, easy-to-use no-code tools have also emerged that let users "strip" the clothing from women's bodies in images. Many of these services have since been forced offline, but the code survives in open-source repositories and keeps resurfacing in new forms. The latest such site received more than 6.7 million visits in August, according to researcher Genevieve Oh, who discovered it. It has yet to be taken down.

There have been other single-photo face-swap apps, such as ZAO and ReFace, which place users into selected scenes from mainstream movies or popular videos. But as the first app dedicated to pornographic face swaps, Y takes this to a new level. Adam Dodge, founder of EndTAB, a nonprofit that educates people about technology-enabled abuse, says the app is purpose-built for creating pornographic images of people without their consent. That makes it easy for its creators to refine the technology for this particular use case, and it attracts people who would otherwise never consider making deepfake porn. "Every time you specialize like that, it creates a new corner of the internet that will draw in new users," Dodge says.

Y is incredibly easy to use. Once a user uploads a photo of a face, the site opens a library of porn videos. The vast majority feature women, though a small handful also feature men, mostly in gay porn. The user can then select any video to generate a preview of the face-swapped result within seconds, and pay to download the full version.

The results are far from perfect. Many of the face swaps are obviously fake, the faces shimmering and distorting as they turn through different angles. But to a casual observer, some are convincing enough to pass, and the trajectory of deepfakes has already shown how quickly they can become indistinguishable from reality. Some experts argue that the quality of the deepfake hardly matters anyway, because the psychological toll on victims can be the same either way. And much of the public remains unaware that such technology exists, so even low-quality face swaps can fool people.

"To date, I haven't been able to fully remove any of the photos. Forever, this will be there. Whatever I do."

Australian activist Noelle Martin

Y bills itself as a safe and responsible tool for exploring sexual fantasies. The language on the site encourages users to upload their own face. But nothing stops them from uploading other people's faces, and comments on online forums suggest that users are already doing so.

For the women and girls targeted by such activity, the consequences can be devastating. On a psychological level, these videos can feel just as violating as revenge porn: real intimate videos filmed or released without consent. "This type of abuse, where people misrepresent your identity, name, reputation, and alter it in such violating ways, shatters you to the core," says Australian activist Noelle Martin, herself the target of a deepfake porn campaign.

And the repercussions can stay with victims for life. The images and videos are difficult to remove from the internet, and new material can be created at any time. "It affects your interpersonal relationships; it affects you in finding a job. Every job interview you go to, this can come up. Potential romantic relationships," Martin says. "To date, I haven't been able to fully remove any of the photos. Forever, this will be there. Whatever I do."

In some ways, it is even more complicated than revenge porn. Because the content isn't real, Dodge says, women may question whether they deserve to feel traumatized and whether they should report it. "If someone is grappling with whether they're really a victim, it impairs their ability to heal," he says.

Nonconsensual deepfake porn can also carry economic and career consequences. Rana Ayyub, an Indian journalist who became the victim of a deepfake porn campaign, subsequently faced online harassment so intense that she had to minimize her online presence, even though her work requires a public profile. Helen Mort, a UK-based poet and broadcaster who previously shared her story with MIT Technology Review, said she felt pressure to do the same after discovering that her photos had been stolen from private social media accounts to create fake nudes.

Sophie Mortimer, who runs the UK government-funded Revenge Porn Helpline, says the service recently received a case from a teacher who lost her job after deepfake pornographic images of her circulated on social media and were brought to her school's attention. "It's getting worse, not better," Dodge says. "More women are being targeted this way."

Ajder says that while Y's option to create deepfake gay porn is limited, it poses an additional threat to men in countries where homosexuality is criminalized. That is the case in 71 jurisdictions globally, 11 of which punish the offense with death.

Ajder, who has discovered a number of deepfake porn apps over the past few years, says he has tried to contact Y's hosting service to force the site offline. But he is pessimistic about preventing similar tools from being created: already, another site has popped up that appears to be attempting the same thing. He believes a more sustainable solution would be to ban such content from social media platforms and perhaps even make its creation or consumption illegal. "That means these websites get treated in the same way as dark-web material," he says. "Even if it's driven underground, it at least keeps it out of the eyes of ordinary people."

Y did not respond to multiple requests for comment sent to the press email listed on its site. Registration information associated with the domain is also blocked by the privacy service Withheld for Privacy. On August 17, after MIT Technology Review made a third attempt to reach the site's creator, a notice appeared on its homepage saying it is no longer available to new users. As of September 12, the notice was still there.
