AI-generated fake celebrity porn takes over Reddit
App face-swaps movie stars and X-rated actors - but is it legal?

Artificial intelligence (AI) is being used to create fake celebrity pornography videos by placing the faces of movie stars onto the bodies of porn performers.
The trend was kick-started in December when a Reddit user by the name of deepfakes posted mocked-up celeb porn videos made using AI-assisted editing software, reports The Verge.
According to the website, other Reddit users are now employing a growing range of “easy-to-use” editing software to create their own face-swapped sex films and are posting them to deepfakes’ chat page, which has more than 15,000 subscribers.
Wonder Woman star Gal Gadot, Taylor Swift, Scarlett Johansson, and Game of Thrones actor Maisie Williams are among those who have been featured in the X-rated clips.
Most of the editing apps employ machine learning, which uses photographs to create human masks that are then overlaid on top of adult film footage, says Motherboard.
“All the tools one needs to make these videos are free,” the website says. The apps also come with “instructions that walk novices through the process”.
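According to Motherboard, the heavy lifting is done by machine-learning models trained on large sets of photographs of the target celebrity. For readers curious about the mechanics of the final step the article describes, overlaying a face onto a frame of footage, the minimal sketch below gives the flavour. It uses OpenCV's bundled Haar-cascade face detector and invented file names purely for illustration; it is not the software the Reddit users employ, which generates a new, lighting-matched face rather than pasting one in.

```python
# Illustrative sketch only: detects a face in a source photo and pastes it
# over the first face found in a target frame. The real deepfakes tools use
# neural networks to synthesise the replacement face; this shows only the
# crude "overlay" idea. File names and the Haar cascade are assumptions.
import cv2


def swap_first_face(source_path: str, target_path: str, out_path: str) -> None:
    """Copy the first detected face in source_path onto the first detected
    face region in target_path and save the result to out_path."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    source = cv2.imread(source_path)
    target = cv2.imread(target_path)

    src_faces = detector.detectMultiScale(cv2.cvtColor(source, cv2.COLOR_BGR2GRAY))
    dst_faces = detector.detectMultiScale(cv2.cvtColor(target, cv2.COLOR_BGR2GRAY))
    if len(src_faces) == 0 or len(dst_faces) == 0:
        raise ValueError("No face found in one of the images")

    sx, sy, sw, sh = src_faces[0]
    dx, dy, dw, dh = dst_faces[0]

    # Resize the source face to fit the target face region, then paste it in.
    face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (dw, dh))
    target[dy:dy + dh, dx:dx + dw] = face
    cv2.imwrite(out_path, target)


if __name__ == "__main__":
    swap_first_face("celebrity.jpg", "frame.jpg", "swapped.jpg")
```

Repeating something like this frame by frame, with a learned model supplying a convincing face for each pose, is what makes the finished clips so plausible.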
Is it legal?
No, says The Sun, since the fake porn videos are created without the consent of the celebs featured in them.
Andrew Murray, a professor of law at the London School of Economics, told the newspaper: “To put the face of an identifiable person onto images of others, and then sharing them publicly, is a breach of Data Protection Law.”
Murray says that stars could sue the creators of fake porn for defamation if the videos are “received as genuine images and the celebrity, as a result, is viewed less favourably by members of society”.
The videos could also be seen as a form of harassment, he told The Sun, which celebrities could report to the police.
Questioning reality
The ease with which plausible fake videos can be made is causing widespread concern, with fears that it heralds an era when “even the basic reality of recorded film, image or sound can’t be trusted”, reports The Guardian.
Mandy Jenkins, from social news company Storyful, told the newspaper: “We already see it doesn’t even take doctored audio or video to make people believe something that isn’t true.”
Reddit user deepfakes has told Motherboard that the technology is still in its infancy.
Deepfakes said they intended to keep improving the porn-creation software so that users “can simply select a video on their computer” and swap the performer’s face with a different person “with the press of one button”.