6 perspectives on social media's hate problem
The smartest insight and analysis, from all perspectives, rounded up from around the web:
The massacre at two New Zealand mosques last week was a first, said Kevin Roose at The New York Times: "an internet-native mass shooting." The accused gunman, Brenton Tarrant, broadcast the killings live on Facebook, with video designed to pander to the internet's white supremacist subcultures. It was shared across all the major platforms, and in the hours following the shooting, not only Facebook but also YouTube, Twitter, and Reddit scrambled to take it down. The shooter's actions beforehand suggest an acute awareness of his audience; he even paused in his broadcast to say "Subscribe to PewDiePie," a reference to a popular YouTube personality. The heinous acts were carried out with the knowledge that the platforms "create and reinforce extremist beliefs." Their algorithms "steer users toward edgier content," their policies for containing hate speech are weak, and they have barely addressed how to remove graphic videos. Extremists are exploiting this with increasing skill, said Joan Donovan at The Atlantic. The New Zealand attacker "knew that others would be recording and archiving" his video so that it could be re-uploaded in the wake of removals. In the first 24 hours after the attack, "Facebook alone removed 1.5 million postings of the video" and was still working around the clock days later.
Facebook did exactly what it's designed to do, said Peter Kafka at Recode. That is, "allow humans to share whatever they want, whenever they want, to as many people as they want." Of course, Facebook Live was never intended for white supremacists. But the company built tools that make uploading effortless, and the platform is "fundamentally built" to spread content with little friction. Facebook's recently announced shift toward private communication "wouldn't prevent that stuff from going up," and encryption might make such content even harder for Facebook to police. Facebook, Twitter, and Google (which owns YouTube) have invested heavily in artificial intelligence "designed to detect violence," said Jon Emont at The Wall Street Journal. Unfortunately, it's nearly impossible for AI to determine "which videos cross the line." It has difficulty recognizing a person holding a gun, for instance, because there are many different types of guns and stances for holding them. "Computers also struggle to distinguish real violence from fictional films."
Actually, there is technology that can flag "obvious indicators" of extremism and prevent hate speech from spreading, said Ben Goodale at The New Zealand Herald. After all, if I searched online for a suitcase, smart ad software would target me instantly. These platforms compose digital profiles indicating what "I'm interested in — my age range, gender, hobbies, reading preferences, sporting affiliations, you name it." All trackable, all within seconds. So why can't the platforms pick up "the obvious indicators" of fanaticism? Simply saying "we can't help it" or "that's not our job" is no longer acceptable, said Margaret Sullivan at The Washington Post. Facebook pours "tremendous resources and ingenuity" into maximizing clicks and advertising revenue, yet it relies on low-paid moderators and faulty algorithms to control content. Major news organizations have grappled with such questions for centuries; "editorial judgment" from the platforms is not merely possible. "It's necessary."