Algorithms are quietly running the city of Washington, DC and maybe your hometown

Washington, DC, is the home base of the most powerful government in the world. It is also home to 690,000 people and 29 obscure algorithms that shape their lives. City agencies use automation to screen housing applicants, predict criminal recidivism, identify food aid fraud, determine whether a high school student is likely to drop out, inform youth sentencing decisions, and much more.

This snapshot of semi-automated city life comes from a new report by the Electronic Privacy Information Center (EPIC). The nonprofit spent 14 months investigating the city’s use of algorithms and found they were deployed across 20 agencies, with more than a third used in law enforcement or criminal justice. For many systems, city agencies would not provide full details about how their technology works or is used. The project team concluded that the city is probably relying on even more algorithms than it was able to uncover.

The findings are notable beyond DC because they add to the evidence that many cities have quietly put bureaucratic algorithms to work across their departments, where they can contribute to decisions that affect citizens’ lives.

Government agencies often turn to automation in hopes of adding efficiency or objectivity to bureaucratic processes, but it is often difficult for citizens to know the systems are at work, and some have proven discriminatory, leading to decisions that upend people’s lives. In Michigan, an unemployment fraud detection algorithm with a 93 percent error rate falsely accused 40,000 people of fraud. A 2020 analysis by Stanford University and New York University found that nearly half of federal agencies use some form of automated decision-making system.

EPIC dug deep into one city’s use of algorithms to provide insight into the many ways they can affect citizens’ lives and to encourage people elsewhere to undertake similar exercises. Ben Winters, who leads the nonprofit’s work on AI and human rights, says Washington was chosen in part because about half of the city’s residents identify as Black.

“More often than not, automated decision-making systems have disproportionate impacts on Black communities,” says Winters. The project found evidence that automated traffic enforcement cameras are disproportionately placed in neighborhoods with more Black residents.

Cities with large Black populations have recently played a central role in campaigns against municipal algorithms, particularly in policing. Detroit became the epicenter of facial recognition debates following the false arrests of Robert Williams and Michael Oliver in 2019 after algorithms misidentified them. In Baltimore in 2015, the use of facial recognition after the death of Freddie Gray in police custody led to some of the first congressional inquiries into law enforcement’s use of the technology.

EPIC identified algorithms by scouring public disclosures from city agencies and by filing public records requests for contracts, data-sharing agreements, privacy impact assessments, and other information. Six out of 12 city agencies responded, sharing documents such as a $295,000 contract with Pondera Systems, owned by Thomson Reuters, which makes fraud detection software called FraudCaster used to screen applicants for food aid. Earlier this year, California officials discovered that more than half of the 1.1 million claims from state residents that Pondera’s software flagged as suspicious were in fact legitimate.
