Police to use AI to identify child abuse images
Plan would cut costs and help officers avoid psychological trauma
Police forces are planning to use artificial intelligence (AI) systems to identify images of child abuse, in a bid to prevent officers from suffering psychological trauma.
Image recognition software is already used by the Metropolitan Police’s forensics department, which last year searched more than 53,000 seized devices for incriminating evidence, The Daily Telegraph reports. But the systems are not “sophisticated enough to spot indecent images and video”.
However, plans are being developed to move sensitive data collected by police to cloud providers such as Google and Microsoft, according to the newspaper.
This would allow specialists to harness the tech giants’ massive computing power for analytics, without needing to invest in a multimillion-pound hardware infrastructure.
It would also reduce the risk of police officers suffering psychological trauma as a result of analysing the images, as they would largely be removed from the process.
The Metropolitan Police’s head of digital forensics, Mark Stokes, told The Daily Telegraph: “We have to grade indecent images for different sentencing, and that has to be done by human beings right now.
“You can imagine that doing that for year on year is very disturbing.”
With the help of Silicon Valley providers, AI could be trained to detect abusive images “within two to three years”, Stokes adds.
Image searching is not the only use of AI by the authorities. In May, The Verge reported that Durham Police were planning to use AI technology to determine whether arrested suspects should remain in custody.
The system, which was trialled over the summer, gauges a suspect’s risk to society based on a range of factors including the severity of their crime and whether they are a “flight risk”.