California Attorney General Addresses Racial and Ethnic Bias in Algorithms Used in Healthcare, Launches Investigation


SACRAMENTO, Calif. — California Attorney General Rob Bonta launched an investigation last week into the prevalence of racial and ethnic bias in health algorithms.

According to a statement from Bonta’s office, “Healthcare algorithms are becoming increasingly common tools and have been used to help healthcare professionals determine patients’ medical needs as well as in administrative work…and are used to assist healthcare professionals in health-related decision making. The complexity of healthcare decision-making technology can range from simple graphics for decision-making to complex AI programming (and) enable patient care and outcomes to become more effective and efficient,” the statement said.

But, the AG’s office added, as health algorithms become more widely used, “there is growing concern that they are affirming long-standing racial and ethnic biases in the health industry…leading to inequitable outcomes for patients.”

An example of this can be found in a study in which researchers found that a widely implemented algorithmic tool referred Black patients for improved services less often than white patients with the same medical conditions.

This issue arose because the algorithm relied on patients’ medical histories but did not account for gaps in care caused by racial inequality, the AG said.

According to Bonta, “We know that historical bias contributes to the racial disparities in health that we continue to see today. It is essential that we work together to address these disparities and bring equity to our health care system.”

He said the inquiry is aimed at bringing hospitals and other health care systems into compliance with state non-discrimination laws, adding that he hopes “to ensure that all Californians can access the care that they need to lead a long and healthy life.”

Such findings have highlighted the demand for transparency in the construction and use of algorithms, he explained, noting, according to the release, “the need to eliminate any biases” in technological decision-making tools “that might affirm inequities in health care” for historically disadvantaged populations.

In an effort to eliminate these biases, AG Bonta is asking hospitals across the state to provide information on how they are working to address racial and ethnic bias in their decision-making technology.

The Attorney General is asking which algorithms, software systems, and decision-making programs are used in clinical decision-making, population health management, care and utilization management, appointment scheduling and operations, and billing practices and approvals.

Bonta said he hopes to identify where biases persist so they can be eliminated.

Sharon D. Cole