Police seek algorithms to predict domestic violence

Domestic violence is a widespread problem in the United States and around the world. Violence at the hands of an intimate partner has affected more than 600 million women worldwide, according to World Health Organization estimates, and the problem only got worse during the pandemic.

Law enforcement has turned to a variety of tools, from simple questionnaires to algorithms, to prioritize the riskiest cases. While some research has acknowledged the tools’ potential benefits, it has also left experts in the domestic violence community with questions about the ethics and efficacy of relying on technology to predict future acts of violence.

Matthew Bland, associate professor of evidence-based policing at the University of Cambridge, said it was widely recognized that something needed to be done to improve services for victims of domestic violence, but whether, and how, technology should be part of the solution is up for debate.

“We’re still pretty polarized, I think, as a domestic violence community, on the right way forward,” he said.

Range of techniques

Some of the tools used by police are in fact simple paper questionnaires. In the UK, police use a relatively basic tool called DASH, short for “Domestic Abuse, Stalking and Honour-Based Violence.” After an incident, police interview victims and tally the number of “yes” responses to produce a risk classification that guides their response.
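The checklist-and-tally approach can be sketched in a few lines of code. This is a hypothetical illustration: the questions, thresholds, and risk bands below are invented for the example and are not the actual DASH instrument.

```python
# Hypothetical sketch of a DASH-style checklist score.
# Questions and cut-offs are illustrative, not the real instrument.

DASH_QUESTIONS = [
    "Has the abuser ever used a weapon?",
    "Has the abuse become worse or more frequent?",
    "Is the victim afraid of further injury or violence?",
    "Has the abuser ever threatened to kill anyone?",
]

def classify_risk(yes_answers: int, total: int = len(DASH_QUESTIONS)) -> str:
    """Map a tally of 'yes' responses to a coarse risk band."""
    if yes_answers >= total * 0.75:
        return "high"
    if yes_answers >= total * 0.25:
        return "medium"
    return "standard"
```

The appeal of such a design is that any officer can administer it with pen and paper; the criticism, as the research below suggests, is that a flat tally weights every question equally regardless of how predictive it actually is.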

Although the idea has gained the most traction in Europe, some police forces in the United States also use a basic form of risk assessment similar to DASH.

Other systems are considerably more advanced. The Spanish government launched an ambitious project in 2007 to combat domestic violence through a system called VioGén. Its goal was to build a centralized system for domestic violence cases that could also predict future incidents.

VioGén is powered by an algorithm developed by researchers based on the factors of an incident that have been linked to high-risk cases in the past. The police record the details of a case, such as whether the attacker has made death threats or used drugs, and VioGén calculates a score based on the entries.

VioGén has since performed millions of “risk assessments”. The scale rates risk from lowest to highest and guides how police respond, including whether to lay charges or provide a victim with police protection.
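A system of this kind can be thought of as a weighted checklist: each recorded factor contributes to a score, and the score maps to an ordinal risk level. The sketch below is purely illustrative; the factor names, weights, and cut-offs are invented, not VioGén's actual model, which has not been fully published.

```python
# Illustrative sketch of a VioGén-style weighted risk score.
# Factors, weights, and thresholds are invented for this example.

WEIGHTS = {
    "death_threats": 3,
    "weapon_access": 3,
    "drug_use": 2,
    "prior_incidents": 2,
    "recent_separation": 1,
}

# Ordinal scale from lowest to highest risk, keyed by minimum score.
LEVELS = [(0, "negligible"), (3, "low"), (5, "medium"),
          (8, "high"), (10, "extreme")]

def risk_level(case: dict) -> str:
    """Sum the weights of the factors present, then map to a level."""
    score = sum(w for factor, w in WEIGHTS.items() if case.get(factor))
    level = LEVELS[0][1]
    for threshold, name in LEVELS:
        if score >= threshold:
            level = name
    return level
```

The key difference from a flat tally is that researchers pre-assign how much each factor matters; the open question the article turns to next is whether those assignments, and the resulting levels, actually predict revictimization.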

Today, VioGén is probably the most advanced predictive tool for domestic violence. According to a report by the Eticas Foundation, a nonprofit tech advocacy group that studied the tool, there were more than 670,000 cases in the system as of early 2022.

Efficiency and ethics

Are the tools effective in preventing domestic violence?

“It’s kind of the gigantic elephant in the room, not just in Spain but with all the risk assessment tools,” said Juan Jose Medina Ariza, a criminal science researcher at the University of Seville. “We don’t really know” whether putting these tools in the hands of police improves their response to domestic violence, he said.

Researchers have found some relatively simple tools like DASH to be disappointing. A 2019 study by Medina Ariza and colleagues found that the system was “underperforming” and was “poorly predictive of revictimization at best.”

The research published on VioGén has been relatively positive, Medina Ariza said, but it has been criticized because the evaluations were conducted by researchers working on the tool with the Spanish government.

Eticas CEO Gemma Galdon said more transparency was needed from Spain’s interior ministry, which developed the system. Police have the latitude to override the algorithm and manually increase a case’s risk level, but 95% of the time officers followed the algorithm, according to the Eticas report, which relied on the limited data available on the system.

Without independent third-party audits, Galdon said, the public can’t really be assured that tools like VioGén are effective and that resources are reaching the people they are meant to help.

“When a woman with a low risk score is killed, the department cannot confidently say, ‘It’s an anecdote, and the system works,'” Galdon said. “It’s very, very, very concerning.”

The Spanish Interior Ministry did not respond to a request for comment.

More options, more controversy

Some officials and researchers have suggested using more data-intensive techniques. A controversial idea: machine learning.

VioGén’s decisions are based on factors predetermined by researchers to be related to violence. Whether the abuser has had suicidal thoughts, for example, is factored into VioGén’s risk score.

But a machine learning tool can draw its own conclusions about risk. Such a system could read police data on crimes and autonomously decide which cases pose the highest risk, based on factors such as previous arrests and convictions. The system might even decide that cases from certain ZIP codes are higher risk because it sees more reports of abuse in those neighborhoods.

Several researchers have found that they were able to improve on the predictions of simple risk assessments using such techniques. But Medina Ariza, who also published a paper finding that a machine learning technique could improve the predictive power of the UK’s DASH tool if implemented, said the use of machine learning in domestic violence cases remains ethically controversial.

The technique relies on past data to make predictions about the future, raising concerns that it may reinforce past biases, such as a focus on one racial group. If a machine learning algorithm is trained on arrest data, for example, it may overestimate abuse in groups that the police arrest disproportionately.
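The feedback loop described above can be demonstrated with a toy simulation. In the hypothetical setup below, two neighborhoods have identical true rates of abuse, but one is patrolled twice as heavily, so more incidents there result in arrests. Any model trained on arrest counts as its label would score the heavily policed neighborhood as roughly twice as risky. All numbers here are invented for illustration.

```python
# Toy illustration of enforcement bias leaking into training data.
# Both areas have the same true abuse rate; only patrol intensity differs.

import random

random.seed(0)

TRUE_ABUSE_RATE = 0.10                   # identical in both areas
PATROL_INTENSITY = {"A": 0.3, "B": 0.6}  # area B is policed twice as heavily

def simulated_arrest_rate(area: str, n: int = 100_000) -> float:
    """Fraction of people arrested: abuse must occur AND be observed."""
    arrests = 0
    for _ in range(n):
        abuse_occurs = random.random() < TRUE_ABUSE_RATE
        police_observe = random.random() < PATROL_INTENSITY[area]
        arrests += abuse_occurs and police_observe
    return arrests / n

rate_a = simulated_arrest_rate("A")
rate_b = simulated_arrest_rate("B")
# A model that treats arrests as ground truth will learn that area B
# is about twice as "risky", even though true abuse rates are equal.
```

This is why auditors like Eticas argue that the choice of training label matters as much as the algorithm itself: the model faithfully learns the pattern in the data, and the pattern reflects policing as much as abuse.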

“Our fear is that we are replacing really flawed and discriminatory human systems with even worse and more opaque technical systems,” Galdon said.

Still, the idea of using machine learning to triage cases is being considered. Last year, for example, police in Queensland, Australia, announced that they would pilot a machine learning program trained on police data to predict the highest-risk perpetrators of domestic violence.

According to The Guardian, police officials said officers would use the tool to predict which cases would escalate and “proactively knock on doors without any calls for service.” Matt Adams, a spokesman for the Queensland Police Service, told The Markup the trial was delayed by COVID, but police are moving forward with the plan.

Medina Ariza said that, at the very least, researchers have shown big data techniques to be better at predicting domestic violence than simpler risk assessments.

“The question then becomes one of: is it acceptable to use a machine learning model, even with all the ongoing debates about algorithmic fairness?” he said. “I think that’s still an open question.”

This article was originally published on The Markup and has been republished under the Creative Commons Attribution-NonCommercial-NoDerivatives license.

Sharon D. Cole