Microsoft’s Bing search engine serving up child sex abuse images, says report

Researchers claim even seemingly innocuous search terms brought up illegal porn

Microsoft launched Bing in 2009 (Image credit: 2017 Getty Images)

Microsoft’s Bing not only shows illegal images depicting child sexual abuse but also suggests search terms to help find them, according to a new report.

The study, reported by TechCrunch, found that searching for terms such as “porn kids” and “nude family kids” surfaced images of “illegal child exploitation”.


But the researchers also discovered that some seemingly innocuous terms could lead to illegal images.

Users searching for “Omegle Kids”, a reference to a video chat app popular among teenagers, were prompted to search for the term “Omegle Kids Girls 13”, which produced child abuse pictures, The Daily Telegraph says.

The researchers were “closely supervised by legal counsel” when conducting the study, adds TechCrunch, as searching for child pornography online is illegal.

Responding to the report, Microsoft’s vice president of Bing and AI products, Jordi Ribas, said: “Clearly, these results were unacceptable under our standards and policies and we appreciate TechCrunch making us aware.

“We’re focused on learning from this so we can make any other improvements needed.”

Microsoft isn’t the only tech giant struggling to tackle the problem.

In September, Israel-based safety groups Netivei Reshet and Screensaverz concluded that it was “easy” to find WhatsApp groups that posted and shared explicit images of children, according to CNET.

WhatsApp responded by saying it had a “zero-tolerance policy around child sexual abuse” and claimed it had removed 130,000 accounts in the space of ten days, the tech site adds.
