Durham police to use AI for custody decisions
System has an 89 per cent success rate in identifying suspects who are likely to offend
Police in Durham are preparing to use artificial intelligence (AI) to help officers decide whether or not to keep a suspect in custody, the BBC reports.
The system, developed using five years of criminal history data, categorises suspects as posing a "low, medium or high risk of offending".
Sheena Urwin, head of criminal justice at Durham Constabulary, told the BBC: "I imagine in the next two to three months we'll probably make it a live tool to support officers' decision making".
Police trialled the harm assessment risk tool (Hart) for a two-year period starting in 2013, says Alphr, during which researchers found it had a 98 per cent success rate in identifying low-risk suspects and an 89 per cent rate for high-risk suspects.
Its decisions are based on factors such as the "seriousness of alleged crime and previous criminal history".
Hart "leans towards a cautious outlook", says Alphr, so it is more likely to label a suspect as medium- or high-risk, reducing the danger of "releasing dangerous criminals".
Such technology is becoming a vital tool in helping police in their investigations.
Last month, a man was charged with murdering his wife after investigators were able to work out her final moments using her Fitbit health tracker.
Information on how many steps the victim had walked indicated she had been active for an hour after the time her husband said she died, says The Guardian.
It also suggested she had "traveled more than 1,200ft after arriving home", adds the paper, whereas her husband claimed she had been murdered by intruders immediately after arriving.