The slippery slope of child abuse algorithms

By JEFFREY REYNOLDS //

In a case that has drawn national attention, former NYPD cop Michael Valva and his former fiancée, Angela Pollina, will face a judge and jury later this month.

The two are accused of murdering Thomas Valva, the officer’s 8-year-old autistic son, in January 2020, by forcing him to sleep on the freezing garage floor of their Center Moriches home. They are also accused of abusing Thomas’ 10-year-old brother Anthony.

Both Valva and Pollina have denied the charges. But they are not the only ones to be judged here.

The preventable death of Thomas Valva has shone a spotlight on the Suffolk County Department of Social Services. The child’s grieving mother and his teachers said they repeatedly phoned in reports of abuse and pleaded – to no avail – for Suffolk Child Protective Services social workers to help.

Although child abuse records are highly confidential, the case has lifted the curtain on the inner workings of CPS. In response, Suffolk’s bipartisan CPS transformation law mandated several reforms, including strict caseload standards.

A recent Newsday report, however, found that in 2021, nearly 60% of county CPS workers carried average caseloads exceeding the 12-case monthly limit set by the 2020 law. Some workers handled as many as 26 cases at a time, according to the report.


Suffolk officials blamed COVID-related staff shortages and said they were working to reduce the number of cases. But hiring in this labor market is difficult and new staff must be properly trained.

Meanwhile, Suffolk gets nearly 9,000 reports of child abuse and neglect each year. Nassau averages closer to 7,000.

The stakes in these cases are high – missing signs of abuse or neglect can lead to a child’s death – and even the most experienced social workers make mistakes. Reporting a family, on the other hand, subjects it to scrutiny that can lead to the traumatic removal of children, termination of parental rights, and horrific foster care placements.

This is why some child protective agencies nationwide have turned to algorithm-assisted decision-making. Allegheny County, Pennsylvania, led the way in 2016, when officials deployed a computer program that stratifies family risk and helps social workers decide which abuse and neglect allegations warrant the quickest and most thorough investigation.

It sounds good. But new research from Carnegie Mellon University found that Allegheny’s Family Screening Tool flagged 68% of black children it scored for investigation of potential neglect, compared with just 50% of white children.

The study also found that trained social workers disagreed with the computerized risk scores a whopping one-third of the time. And when they overrode the system – using augmented rather than automated decision-making – the racial disparity dropped to 7%.
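To put those numbers side by side, here is a back-of-envelope sketch in Python; only the percentages come from the findings as reported above, and the variable names are mine:

    # Screen-in rates for the two groups, as reported by the CMU study.
    algorithm_alone = {"black": 0.68, "white": 0.50}

    # Gap when the algorithm's scores are followed as-is.
    gap = algorithm_alone["black"] - algorithm_alone["white"]
    print(f"Algorithm-alone disparity: {gap:.0%}")   # 18%

    # Reported gap once social workers overrode the system
    # (augmented rather than automated decision-making).
    print(f"Disparity with overrides: {0.07:.0%}")   # 7%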

It’s still not great, but to put it in context, more than half of all black children in California are subjected to a CPS investigation based solely on human judgment – a pattern repeated nationwide, often at double the rate of white children.

Data Driven: Allegheny County, Pennsylvania, now uses statistical data to prioritize its child abuse workflow. (Graphic: Pennsylvania Department of Social Services)

The data variables and formulas used by Allegheny are kept secret, although they certainly include information from sister government agencies. Families who receive food, cash, housing, health care, drug and alcohol counseling, and/or mental health services from public agencies will have a much larger electronic footprint than those who take out second mortgages to finance a luxury rehab stay in Malibu.
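To illustrate the concern, here is a minimal, purely hypothetical sketch of how such a risk-stratification tool might work. Allegheny’s actual variables and weights are secret, so every feature name and weight below is an assumption for illustration, not the county’s method; the only grounded detail is that the tool reportedly scores risk on a scale of 1 to 20.

    # Hypothetical family-screening score. All feature names and weights
    # are assumptions for illustration; they are not Allegheny's formula.

    def screening_score(record: dict) -> int:
        """Map administrative-data flags to a 1-20 risk score."""
        weights = {                          # assumed weights
            "prior_cps_referrals": 3.0,
            "public_benefits_history": 1.5,
            "behavioral_health_records": 2.0,
        }
        raw = sum(weights[k] * record.get(k, 0) for k in weights)
        return max(1, min(20, round(raw)))

    # A family known to public systems generates more flags to count
    # than one with no government records at all - the "electronic
    # footprint" problem described above.
    print(screening_score({"prior_cps_referrals": 2,
                           "public_benefits_history": 1}))  # scores 8
    print(screening_score({}))                              # scores 1

The sketch’s point is the asymmetry: the more of a family’s life that runs through public systems, the more data the model has to count against it.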

Public Citizen, a Washington-based nonprofit consumer advocacy group, says “black box” algorithms with undisclosed data sources and decision-making rules can exacerbate economic, social and racial injustices.

But aren’t data, by their very nature, sterile and unbiased?

Critics say financial metrics, criminal justice information, health records and more cannot be racially neutral because they rely on data shaped by generations of discrimination that is invariably baked into the equation. Others point out that algorithms can carry the biases of their builders, developers, and users.

Nassau County Department of Social Services Commissioner Nancy Nunziata said her CPS staff does not use an algorithm because of the potential for disproportionate impact, and that she does not believe technology can replace human intervention.

Suffolk County Department of Social Services Commissioner Frances Pierre did not respond to inquiries about whether her department uses artificial intelligence to predict risk.

The upcoming trial will likely address that – and raise even more questions about how Pierre’s staff and everyone who interacts with children at risk can do a better job of preventing tragedy.

There are no easy answers. But we owe it to Thomas Valva to keep asking those questions.

Jeffrey Reynolds is the president and CEO of the Garden City-based Family & Children’s Association.
