Does ‘Elsagate’ prove YouTube is too big to control?
Scandal of violent and sexual videos aimed at children exposes the difficulty of relying on algorithms
In February, YouTube announced it had hit a staggering milestone: visitors were now consuming the equivalent of a billion hours’ worth of video every day.
The sheer size of the platform's user base is equally astounding. "More than 1.5 billion people use YouTube," says Roger McNamee in The Guardian, giving it a global reach comparable to that of Islam.
But, as the saying goes, with great power comes great responsibility, and the recent "Elsagate" scandal, involving disturbing videos aimed at children, has led some to wonder whether YouTube has become a Frankenstein's monster, beyond the control of its creators.
What is Elsagate?
Some of YouTube’s most popular channels are aimed at children, with creators specialising in nursery rhymes, colourful cartoons and the all-powerful toy-unboxing videos gaining millions of subscribers and billions of views.
But there is a problem. YouTube is "absolutely flooded with extremely violent, inappropriate, sexually suggestive videos targeted at children," says the Inquisitr, and these videos are finding their way into autoplay lists alongside age-appropriate clips.
Journalist James Bridle delved into this unsettling phenomenon, dubbed Elsagate after the popular Frozen character who appears in many of the videos, for an article titled Something is Wrong on the Internet on Medium this month.
He found that as content creators chase viewers, successful - and originally harmless - formulas for garnering views are “endlessly repeated across the network in increasingly outlandish and distorted recombinations”.
At their most extreme, these include a legion of unsettling videos which appear to be produced, or in some cases automatically generated, in response to popular keywords. They often feature disturbing themes and sexual or violent content.
For instance, a search for “Peppa Pig dentist” returns a homemade clip in which the popular children’s character is “tortured, before turning into a series of Iron Man robots and performing the Learn Colours dance”.
Elsagate has generated a flood of responses from concerned parents, the media and internet sleuths dedicated to finding out who or what is making these disturbing videos, and why.
Has YouTube become too big to control?
At the heart of Elsagate and other controversies over YouTube's content, such as videos promoting terrorism or violence, is the power of the platform's algorithms.
Contrary to what some parents may believe, content on YouTube’s dedicated Kids app is not curated or even pre-screened by humans. Instead, suggested videos appear in its autoplay list automatically based on shared keywords or similar audiences.
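This keyword-driven matching is what lets knock-off content surface next to the genuine article. As a toy illustration (this is not YouTube's actual algorithm, and the video titles and keywords are invented), ranking candidates purely by keyword overlap rewards whichever clip packs in the most matching terms:

```python
# Toy illustration of keyword-overlap ranking — NOT YouTube's real system.
# A keyword-stuffed knock-off can outrank an official episode simply by
# sharing more terms with the video currently playing.

def keyword_overlap(a, b):
    """Jaccard similarity between two keyword sets."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b)

# Keywords of the video currently playing (hypothetical example).
current = {"peppa", "pig", "dentist", "kids"}

# Candidate videos for the autoplay list (hypothetical).
candidates = {
    "Official Peppa Pig episode": {"peppa", "pig", "kids", "cartoon"},
    "Disturbing knock-off":       {"peppa", "pig", "dentist", "kids", "scary"},
}

# Sort candidates by overlap with the current video, best match first.
ranked = sorted(candidates,
                key=lambda title: keyword_overlap(current, candidates[title]),
                reverse=True)
print(ranked)  # the keyword-stuffed knock-off ranks first
```

On this crude metric the knock-off scores 0.8 against the current video while the official episode scores 0.6, which is the dynamic Bridle describes: content tuned to the machine beats content tuned to the viewer.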
The sheer size of YouTube’s catalogue goes beyond the capabilities of human oversight. Content is uploaded to the platform at the equivalent of 400 hours of video every minute, according to Statista.
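The back-of-the-envelope arithmetic makes the point starkly. At 400 hours per minute, a single day's uploads amount to more than 65 years of continuous viewing:

```python
# Arithmetic on the upload rate cited above (400 hours of video per minute).
hours_per_minute = 400
hours_per_day = hours_per_minute * 60 * 24   # 576,000 hours uploaded per day
years_per_day = hours_per_day / (24 * 365)   # ~65.8 years of viewing per day
print(hours_per_day, round(years_per_day, 1))
```

No plausible team of human reviewers could watch even a fraction of that, which is why screening is left to algorithms in the first place.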
The incredible proliferation of videos that have clearly been produced in response to common search terms shows the extent to which YouTube is essentially run by machines, says Medium's Bridle.
On a platform where content visibility – and thus potential for ad revenue – is controlled by algorithms, “even if you’re a human, you have to end up impersonating the machine”.
Elsagate has exposed a long-standing truth about YouTube that can no longer be ignored, says Polygon: the filters designed to protect users of all ages from disturbing, violent or illegal content are not up to the job.
The algorithms are “ripped apart, analyzed and beaten up” by content creators, human or automated, who have become adept at gaming the system to make sure their output is seen.
In a particularly notorious example, pranksters on the 4chan message board “managed to splice pornography into children’s shows and game the algorithm to ensure the videos were monetized and seen”, says Polygon.
Can the problem be fixed?
YouTube has pledged to crack down on the plague of inappropriate children’s videos slipping past the filters on its Kids app.
Juniper Downs, YouTube's director of policy, said a new system, which predates the Elsagate revelations, will classify videos flagged by users as age-restricted, automatically keeping them out of the YouTube Kids app.
On a wider level, there are signs that YouTube is shifting towards a more hands-on approach to content curation, prompted by pressure from governments concerned about online extremism.
Multiple terrorist attacks in Europe and the US have been perpetrated by individuals believed to have been self-radicalised in part by watching YouTube propaganda videos made by groups such as Islamic State and their sympathisers.
Politicians have threatened to take action against tech firms that do not strengthen measures to remove videos containing extremist content. Alphabet, YouTube's parent company, seems to be taking the threats seriously.
Previously, “inflammatory” religious or political videos were permitted on YouTube if they did not violate the site’s rules on graphic content or promoting violence, although they were not eligible for ad revenue.
A YouTube spokesperson confirmed to Reuters that its policy has changed to prohibit any video featuring people or groups classified as "terrorist", including material such as lectures by al-Qaeda recruiter Anwar al-Awlaki.
However, even if YouTube does develop new strategies to close the loopholes, the implications of Elsagate remain disturbing on a more profound level, says TechCrunch's Natasha Lomas.
“The YouTube medium incentivises content factories to produce click fodder,” Lomas writes, exposing a generation to a tidal wave of “pop culture slurry” based around keywords rather than coherency. “It’s hard to imagine anything positive coming from something so intentionally base and bottom-feeding being systematically thrust in front of kids’ eyeballs.”