What are trolls?
They’re the anonymous provocateurs who flood the internet with inflammatory insults, threats, and profanity. The term originates from the fishing technique of dragging a baited hook behind a moving boat; someone who uses offensive language to provoke a response is said to be "trolling." The practice has existed since the earliest days of the internet, and was long considered harmless, if annoying. But in recent years, trolls have become a scourge. Reasoned political discussion is often so overwhelmed by venomous, tit-for-tat name-calling that websites have to shut down their comment boards as hundreds and even thousands of invective-filled responses pour in. On sites across the internet, liberals are regularly slammed as "libtards" and conservatives as "teabaggers"; comparisons to Auschwitz, Hitler, and the Nazis run rampant. Letting people comment about a racial controversy like the Trayvon Martin/George Zimmerman case, said Slate.com political reporter David Weigel, has become the equivalent of "putting out a freshly baked pie on the windowsill, smack dab in the middle of Racistville."
What motivates these people?
Trolling gives its anonymous practitioners the catharsis of venting forbidden feelings and ideas without suffering any consequences. On the internet, you can cuss out a stranger with even more vigor and impunity than you can a bad driver from the safety of your own car. "The enjoyment comes from finding a context in which you can let go, take a moral vacation," says psychologist Tom Postmes of Exeter University in the U.K. "Trolls aspire to violence, to the level of trouble they can cause in an environment." That prospect is particularly appealing to disaffected men in their late teens and 20s, but such men are hardly the only trolls: CNN tracked down a troll posting anti-Islamic screeds online and found that he was a 39-year-old father in Belgium. Rider University psychologist John Suler says an "online disinhibition effect" allows people who might never utter a hateful word in person to unleash withering vitriol on comment boards. Politics, race, gender, and religion all serve as lightning rods for troll rage, provoking such witty banter as "you n---er lover" and "you racist scumbag." But almost any topic can lead to outpourings of bile. When author Paul Carr recently wrote a column in The Wall Street Journal about quitting drinking without the help of Alcoholics Anonymous, he was greeted by an avalanche of furious commenters calling him a "narcissistic dry drunk" and predicting he would soon relapse and ruin his life.
Why have comments at all?
"Commenting is the secret sauce of social media," says Stanford social psychologist BJ Fogg. Creating a place for readers to debate issues makes them more likely to return, and that drives up website traffic and advertising revenue. Impassioned debate can be lucrative: The most engaged 1 percent of the audience on any given site can account for as much as 25 percent of its traffic. But editors who allow trolls to take over their comment sections risk undermining their sites in the long run. "Everyone is desperately chasing eyeballs as a way to increase advertising," said Rem Rieder, editor of American Journalism Review. "But rare is the advertiser who would want to be associated with the ugliness of many comment sections."
Could legislation deter the trolls?
Not in the U.S. While the U.K. has a law banning the posting of "grossly offensive" or "indecent, obscene, or menacing" messages online, our Constitution protects the right of trolls to be as rude or offensive as they like. In March, Arizona passed a bill banning the use of "any electronic or digital device" to "terrify, intimidate, threaten, harass, annoy, or offend a person." But legislators withdrew the bill after freedom-of-speech groups protested that it violated the First Amendment. UCLA law professor Eugene Volokh said the broad statute would have outlawed the use of such relatively tame insults as "this author is f---ing out of line."
So are sites powerless to halt personal attacks?
Some are calling for an end to online anonymity as a way to restrain trolls. Users of Facebook and Google+ must now use their real names and email addresses when creating accounts, and some comment boards use software from Facebook that requires commenters to identify themselves. But a total ban on anonymity would be almost impossible to enforce. Far better, say some Web activists, to let all comments stand, if only as a mirror of human depravity. "People are saying nasty, stupid things. So deal with it," says Rob Manuel, founder of digital community B3TA. "Shutting down free speech and stamping on people’s civil liberties is not a price worth paying." But more and more websites are taking a middle course by rigorously policing their own comment boards. "We’re still trying to find our way," says Paul Bass, editor of the New Haven, Conn., Independent, "between a free-flowing democratic discussion and a harsh, anonymous hate-fest."
Cleaning up after the trolls
The rise of the internet troll has created a booming new profession: comment moderator. Patrolling the endless reams of internet comments for abusive and incendiary language has become a massive task. HuffingtonPost.com, for example, which attracts more than 5 million comments every month, says each member of its in-house moderating team "reads the equivalent of Moby-Dick 18 times a month." Outside companies have spotted a business opportunity: Market leader ICUC Moderation Services generates annual revenues of some $10 million cleaning up comment boards for companies such as Starbucks, Chevron, and NPR. The job isn’t for everybody, says founder Keith Bilous, who employs some 200 moderators around the world. Many new hires quit within their first two weeks, and even after 10 years in the business, Bilous says he still isn’t completely inured to the vile stuff he has to read. "Some Fridays you feel like you need to spend two hours in the shower, it’s so disgusting," he says.