AI-generated fake celebrity porn takes over Reddit

App face-swaps movie stars and X-rated actors - but is it legal?

Creators of the mocked-up videos could face legal action

Artificial intelligence (AI) is being used to create fake celebrity pornography videos by placing the faces of movie stars onto the bodies of porn performers.

According to tech news site Motherboard, other Reddit users are now employing a growing range of “easy-to-use” editing software to create their own face-swapped sex films and are posting them to the chat page of a Reddit user known as “deepfakes”, which has more than 15,000 subscribers.

Wonder Woman star Gal Gadot, Taylor Swift, Scarlett Johansson, and Game of Thrones actor Maisie Williams are among those who have been featured in the X-rated clips.

Most of the editing apps employ machine learning, which uses photographs to create human masks that are then overlaid on top of adult film footage, says Motherboard.

“All the tools one needs to make these videos are free,” the website says. The apps also come with “instructions that walk novices through the process”.

Is it legal?

No, says The Sun, since the fake porn videos are created without the consent of the celebs featured in them.

Andrew Murray, a professor of law at the London School of Economics, told the newspaper: “To put the face of an identifiable person onto images of others, and then sharing them publicly, is a breach of Data Protection Law.”

Murray says that stars could sue the creators of fake porn for defamation if the videos are “received as genuine images and the celebrity, as a result, is viewed less favourably by members of society”.

The videos could also be seen as a form of harassment, he told The Sun, which celebrities could report to the police.

Questioning reality

The ease with which plausible fake videos can be made is causing widespread concern, with fears that it heralds an era when “even the basic reality of recorded film, image or sound can’t be trusted”, reports The Guardian.

Mandy Jenkins, from social news company Storyful, told the newspaper: “We already see it doesn’t even take doctored audio or video to make people believe something that isn’t true.”

Reddit user deepfakes has told Motherboard that the technology is still in its infancy.

Deepfakes said they intended to keep improving the porn-creation software so that users “can simply select a video on their computer” and swap the performer’s face with that of a different person “with the press of one button”.