Dash: the UK's 'flawed' domestic violence tool

Risk-assessment checklist relied on by police and social services deemed unfit for frontline use

Jess Phillips (top right), the minister for safeguarding, told the BBC that the current Dash assessment 'doesn't work' amid mounting evidence that it fails to correctly identify those at the highest risk of further harm
(Image credit: Gareth Fuller / WPA Pool / Getty Images)

The UK's safeguarding minister has called for an overhaul of the main tool used to decide if a domestic abuse victim needs urgent support.

Jess Phillips told the BBC's File on 4 that the current Dash assessment "doesn't work", amid mounting evidence that it fails to correctly identify those at the highest risk of further harm.

How does Dash work?

The Dash (Domestic Abuse, Stalking, Harassment and Honour-Based Violence) assessment is a checklist co-developed by the domestic-abuse charity SafeLives. It features 27 mainly yes-or-no questions put to victims – including "Is the abuse getting worse?" and "Has the current incident resulted in injury?"

The victim's answers produce a score that's meant to determine their risk of imminent harm or death. Answering "yes" to at least 14 questions classes a victim as "high risk" and guarantees intensive support and urgent protection. No specialist support is guaranteed to anyone who gets a "medium" or "standard" risk score.

Since 2009, Dash risk scores have been relied on by many police forces, social services and healthcare workers to determine what action is taken after a reported incident, although practitioners are encouraged to use their "professional judgement" to override low scores, or to escalate a case if there are multiple police callouts in a year.

What's wrong with it?

Academics, domestic abuse charities and bereaved families have long raised doubts over the accuracy of the Dash assessment.

As far back as 2016, a College of Policing review found "inconsistencies" in Dash and recommended a different tool for frontline police responders. And in 2020, a London School of Economics study of Greater Manchester Police data found that, in nearly nine out of ten repeat cases of violence, the victim had been classed as standard or medium risk by Dash.

In 2022, an analysis by researchers at the Universities of Manchester and Seville found that the Dash questions "contributed almost nothing" to its performance as a predictive tool. And earlier this month, an investigation by The Telegraph identified at least 55 women who had been killed by a partner after being graded only standard or medium risk.

"Too many have died without help, as the Dash system failed to recognise the true threat they faced," said Alicia Kearns MP, the shadow safeguarding minister.

Pauline Jones, the mother of Bethany Fields, who was killed by her partner in 2019, a month after being graded a medium risk by Dash, put it more directly: "When you hear about the Dash, and you know your daughter's death was so easily preventable, it destroys not just your heart, but your very soul."

Is there a better option?

Dash has "obvious problems", said Phillips. She is reviewing the entire system but "until I can replace it with something" that works better, "we have to make the very best of the system that we have." Any risk assessment tool is "only as good as the person who is using it".

Some police forces have adopted Dara, an alternative tool developed by the College of Policing, instead of Dash. Other forces and organisations, in the UK and abroad, are calling for a more radical overhaul, using new technology to assess future risk. "In certain contexts," Forbes reported, "AI-enabled tools are making it easier to discreetly gather evidence, assess personal risk and document abuse – actions that were previously unsafe or more difficult to carry out."