The UK's safeguarding minister has called for an overhaul of the main tool used to decide if a domestic abuse victim needs urgent support.
Jess Phillips told the BBC's File on 4 that the current Dash assessment "doesn't work", amid mounting evidence that it fails to correctly identify those at the highest risk of further harm.
How does Dash work? The Dash (Domestic Abuse, Stalking, Harassment and Honour-Based Violence) assessment is a checklist of 27 mainly yes or no questions to be put to victims – including "Is the abuse getting worse?" and "Has the current incident resulted in injury?"
The victim's answers produce a score that's meant to determine their risk of imminent harm or death. Since 2009, the Dash risk scores have been relied on by many police forces, social services and healthcare workers to determine what action is taken after a reported incident, although practitioners are encouraged to use their "professional judgement" to override low scores.
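The mechanics of a checklist-based grading like this can be sketched in a few lines: count the "yes" answers and map the total to a risk band. The sketch below is illustrative only – the 14-"yes" high-risk cut-off is the figure commonly cited for Dash, but the "medium" boundary and the function itself are assumptions, not the official scoring rules.

```python
# Illustrative sketch of a checklist-based risk grading (NOT the
# official Dash specification). One point per "yes" answer; the
# total maps to a risk band. The 14-yes high-risk threshold is the
# commonly cited Dash cut-off; the medium boundary is an assumption.

def grade_risk(answers: list[bool], high_threshold: int = 14) -> str:
    """Map 27 yes/no checklist answers to a risk band."""
    score = sum(answers)          # count of "yes" answers
    if score >= high_threshold:
        return "high"
    if score >= 10:               # assumed "medium" boundary
        return "medium"
    return "standard"

# Five "yes" answers out of 27 -> graded "standard"
print(grade_risk([True] * 5 + [False] * 22))
```

In practice, as the article notes, practitioners are encouraged to override a low score with professional judgement – the number alone does not decide the outcome.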
What's wrong with it? Academics, domestic abuse charities and bereaved families have long raised doubts over the accuracy of the Dash assessment.
In 2020, a London School of Economics study of Greater Manchester Police data found that, in nearly nine out of ten repeat cases of violence, the victim had been classed as standard or medium risk by Dash. Earlier this month, an investigation by The Telegraph identified at least 55 women who had been killed by their partner after being graded only standard or medium risk.
Pauline Jones, the mother of Bethany Fields, who was killed by her partner in 2019, a month after being graded a medium risk by Dash, put it more directly: "When you hear about the Dash, and you know your daughter's death was so easily preventable, it destroys not just your heart but your very soul."
Is there a better option? Some police forces have adopted Dara, a newer tool developed by the College of Policing, instead of Dash. Other forces and organisations, in the UK and abroad, are calling for a more radical overhaul, using new technology to assess future risk. "In certain contexts," said Forbes, "AI-enabled tools are making it easier to discreetly gather evidence, assess personal risk and document abuse – actions that were previously unsafe or more difficult to carry out."