Google's instant search 'blacklist'
Google's new "Instant" function blocks search results for everyday words like 'are' and 'Latina.' Why?
A new Google feature allows users to see results instantly as they type letters into the search bar — but some mundane words and phrases don't seem to register. (Watch a Google user list some of those words.) Hackers have compiled a master list of the "blocked" terms, which they have dubbed the "Google Blacklist." Here's an instant guide to this tech mini-controversy:
What kind of words are on the blacklist?
Curse words and sexual terms, obviously, but also words like "lesbian," "Latina," "are" and "ecstasy" that Google thinks could bring up offensive material. For a full list, see 2600.com's crowdsourced collection of blacklisted words.
What happens when you type them in?
Rather than presenting the results, Google leaves the space where they would normally appear blank. The results stay hidden only until the user presses enter; then the full search results are shown as usual.
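To make that behaviour concrete, here is a minimal sketch in Python. It is not Google's actual code; the blacklist contents and the function names are illustrative assumptions. It simply shows the two paths: instant results are suppressed for a blacklisted query, while a full search (pressing enter) always returns results.

```python
# Illustrative sketch only; not Google's implementation.
BLACKLIST = {"lesbian", "latina", "are", "ecstasy"}  # tiny assumed subset


def instant_results(query: str, search) -> list:
    """Return instant results, or nothing if any word in the query is blacklisted."""
    if any(word in BLACKLIST for word in query.lower().split()):
        return []  # the results pane simply goes blank
    return search(query)


def full_results(query: str, search) -> list:
    """Pressing enter bypasses the blacklist, so results appear as usual."""
    return search(query)


# Example with a stand-in search function:
fake_search = lambda q: [f"result for {q!r}"]
print(instant_results("latina recipes", fake_search))  # [] (suppressed)
print(full_results("latina recipes", fake_search))     # ["result for 'latina recipes'"]
```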
Why does Google do this?
The search engine is undoubtedly "aiming to protect us from stumbling across a gallery of penises when searching for quality Italian meats," says Brian Ries at The Daily Beast. But it has set up an incompetent algorithm to filter out the filth. I don't know about that, says John Atchison at Helium. The algorithm cleverly blacklists on the basis of "historical data." Sure, you might search "lesbian" for perfectly inoffensive reasons. But if "the last thousand people" who typed it in "followed it up with sex or porn," then the computer figures children need to be shielded from that word. Besides, the results are still available if you press enter. "In the end, it all works out."
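Atchison's description amounts to a simple frequency heuristic. The sketch below is an assumption-laden illustration, not anything Google has published: the threshold, the marker words and the made-up query log are all hypothetical. It shows how such a rule might flag a term when most recent follow-up searches for it were explicit.

```python
# Hedged sketch of the "historical data" heuristic described above.
EXPLICIT_MARKERS = {"sex", "porn"}  # assumed markers, not Google's real signal


def should_blacklist(followup_queries: list[str], threshold: float = 0.5) -> bool:
    """Flag a term if the share of explicit follow-up searches exceeds the threshold."""
    if not followup_queries:
        return False
    explicit = sum(
        1 for q in followup_queries
        if any(marker in q.lower() for marker in EXPLICIT_MARKERS)
    )
    return explicit / len(followup_queries) > threshold


# If "the last thousand people" mostly followed the term with porn searches:
history = ["lesbian porn"] * 700 + ["lesbian rights groups"] * 300
print(should_blacklist(history))  # True
```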
So this blacklist could be viewed as a good thing?
Perhaps, if you're a concerned parent. But for those who work in sex education or counselling, this skewing of results may affect business. "People looking for me may feel that I'm 'violent' or 'pornographic' because I don't show up," sex educator Shanna Katz tells The Daily Beast. There are also concerns that the blacklist could hurt sites' search engine optimization, the practice of making a page rank well in Google's results, meaning a website may lose traffic simply because the wrong keywords are attached to it.
What does Google say?
Google said that it was working to remove some of the more innocuous words from the blacklist, but the process might take some time. "It's important to note that removing queries from autocomplete is a hard problem, and not as simple as blacklisting particular terms and phrases," said a spokesperson.
Sources: The Daily Beast, Helium, Business Insider