YouTube's algorithm has recommended videos of young, partially clothed kids to users who had previously watched sexual content, The New York Times reports.
Three researchers from Harvard's Berkman Klein Center for Internet and Society found that following YouTube's recommendations from "sexually themed videos," such as videos of women talking about sex, led them to videos that "placed greater emphasis on youth" until eventually, "YouTube would suddenly begin recommending videos of young and partially clothed children," the Times reports.
"So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children's clothes," the Times writes. "Eventually, some users might be presented with videos of girls as young as 5 or 6 wearing bathing suits, or getting dressed or doing a split."
The algorithm "curated" videos of kids by "at times plucking out the otherwise innocuous home movies of unwitting families," the report says, apparently "learning from users who sought out revealing or suggestive images of children." Some of these videos have racked up millions of views, with one parent saying she became "scared" after a video of her 10-year-old daughter playing in her backyard while wearing a bathing suit was viewed more than 400,000 times.
In a blog post on Monday, YouTube said that "responsibility is our number one priority, and chief among our areas of focus is protecting minors and families." The site also said that it has updated its policies so that "younger minors" will not be permitted to live stream unless they are "clearly accompanied by an adult." When asked if it would turn off recommendations for videos featuring minors, YouTube told the Times that doing so would hurt creators but said it would continue to limit recommendations of videos that put children at risk.
Previously, YouTube disabled comments on videos featuring minors following reports that pedophiles were sharing links to child pornography on the site.