AI-generated fake celebrity porn takes over Reddit
App face-swaps movie stars and X-rated actors - but is it legal?
Artificial intelligence (AI) is being used to create fake celebrity pornography videos by placing the faces of movie stars onto the bodies of porn performers.
The trend was kick-started in December when a Reddit user by the name of deepfakes posted mocked-up celeb porn videos made using AI-assisted editing software, reports The Verge.
According to the website, other Reddit users are now employing a growing range of “easy-to-use” editing software to create their own face-swapped sex films and are posting them to deepfakes’ subreddit, which has more than 15,000 subscribers.
Wonder Woman star Gal Gadot, Taylor Swift, Scarlett Johansson, and Game of Thrones actor Maisie Williams are among those who have been featured in the X-rated clips.
Most of the editing apps employ machine learning, using photographs of a celebrity to generate a digital mask of their face that is then overlaid on top of adult film footage, says Motherboard.
“All the tools one needs to make these videos are free,” the website says. The apps also come with “instructions that walk novices through the process”.
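The Reddit tools rely on neural networks trained on large sets of celebrity photographs, but the underlying idea of pasting one detected face onto another frame can be illustrated with a few lines of classical computer vision. The sketch below is a simplified illustration only, not the software described in the article: it uses OpenCV’s built-in face detector and Poisson blending to overlay a face from one image onto a single video frame, and the image file names are hypothetical placeholders.

```python
# Illustrative sketch of a naive single-frame face overlay with OpenCV.
# This is NOT the deepfakes tool, which trains a neural network instead.
import cv2
import numpy as np

# Frontal-face detector shipped with the opencv-python package
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

source = cv2.imread("celebrity_photo.jpg")   # hypothetical: face to transplant
target = cv2.imread("video_frame.jpg")       # hypothetical: frame to paste it onto

src_faces = cascade.detectMultiScale(cv2.cvtColor(source, cv2.COLOR_BGR2GRAY), 1.1, 5)
dst_faces = cascade.detectMultiScale(cv2.cvtColor(target, cv2.COLOR_BGR2GRAY), 1.1, 5)

if len(src_faces) and len(dst_faces):
    x, y, w, h = src_faces[0]
    face = source[y:y + h, x:x + w]

    X, Y, W, H = dst_faces[0]
    face = cv2.resize(face, (W, H))          # crude alignment: scale to the target box

    # Poisson blending hides the seam between the pasted face and the frame
    mask = 255 * np.ones(face.shape[:2], dtype=np.uint8)
    centre = (int(X + W // 2), int(Y + H // 2))
    result = cv2.seamlessClone(face, target, mask, centre, cv2.NORMAL_CLONE)
    cv2.imwrite("swapped_frame.jpg", result)
```

A crude overlay like this produces obvious artefacts; the appeal of the machine-learning approach is that, given enough photographs, it learns to reproduce the celebrity’s face under the lighting, angle and expression of each frame, which is what makes the resulting videos plausible.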
Is it legal?
No, says The Sun, since the fake porn videos are created without the consent of the celebs featured in them.
Andrew Murray, a professor of law at the London School of Economics, told the newspaper: “To put the face of an identifiable person onto images of others, and then sharing them publicly, is a breach of Data Protection Law.”
Murray says that stars could sue the creators of fake porn for defamation if the videos are “received as genuine images and the celebrity, as a result, is viewed less favourably by members of society”.
The videos could also be seen as a form of harassment, which celebrities could report to the police, he told The Sun.
Questioning reality
The ease with which plausible fake videos can be made is causing widespread concern, with fears that it heralds an era when “even the basic reality of recorded film, image or sound can’t be trusted”, reports The Guardian.
Mandy Jenkins, from social news company Storyful, told the newspaper: “We already see it doesn’t even take doctored audio or video to make people believe something that isn’t true.”
Reddit user deepfakes has told Motherboard that the technology is still in its infancy.
Deepfakes said they intended to keep improving the porn-creation software so that users “can simply select a video on their computer” and swap the performer’s face with a different person “with the press of one button”.