The US Department of Health and Human Services (HHS) intends to add language to a federal regulation to clarify that physicians could be held accountable for decisions based on clinical algorithms that discriminate on the basis of a patient’s race.
The intent, HHS said, is to prompt medical practices and hospitals to take a closer look at clinical decision support algorithms. Overreliance on algorithms could result in violations of Section 1557 of the Affordable Care Act, which prohibits discrimination on the basis of race, color, national origin, sex, age, and disability.
In the proposed rule, HHS said its intent is not to prohibit or impede the use of clinical algorithms, but rather to clarify that discrimination that occurs through their use is prohibited.
HHS said it expects its Office for Civil Rights to handle future cases of discrimination involving algorithms with an emphasis on voluntary compliance, as it has handled past cases of discrimination based on sex.
The agency acknowledges that checking algorithms for discrimination is “a complex and evolving area that can be challenging” for clinicians. The American Medical Association (AMA) is among the groups that have called on HHS to pause the plan and gather more information from the medical community on how to address discrimination in algorithms.
The proposal leaves clinicians with unanswered practical questions, according to Sharona Hoffman, JD, co-director of the Law-Medicine Center at Case Western Reserve University School of Law in Cleveland.
“For example, what are healthcare providers supposed to do to detect discrimination? Will they be expected to conduct studies to determine whether women or minorities are disadvantaged by particular algorithms?” Hoffman told Medscape Medical News. “Will healthcare providers be so concerned about liability that they stop buying and using algorithms?”
The algorithm proposal is part of a larger draft rule addressing discrimination that HHS unveiled in July.
Racism ‘hidden in plain sight’
In the medical community, awareness has grown that algorithms and other clinical decision support tools can harm patients.
“For Blacks, Hispanics and Asians whose heart, lungs, bones, brain, bladder and kidneys have been judged differently for years due to race-based algorithms that are ‘hidden in plain sight,’ this new remedy is much needed,” Aletha Maybank, MD, MPH, the AMA’s chief health equity officer and group vice president, and co-authors wrote recently in Health Affairs.
For many years, the estimated glomerular filtration rate (eGFR) equation included a race-based adjustment that treated a Black patient’s serum creatinine differently from a non-Black patient’s, based on the unsubstantiated idea that Black people have higher baseline creatinine levels, as Medscape has reported.
And in 2021, the American Academy of Pediatrics (AAP) withdrew guidelines used to diagnose urinary tract infections in children aged 2 to 24 months after researchers argued that the recommendations wrongly raised the threshold for screening Black children for the condition, putting them at risk of untreated illness. In May, the AAP published a policy statement that launched its review of clinical guidelines and policies that use race as a biological indicator.
Clinicians often mistakenly view algorithms and clinical decision support tools as purely objective scientific aids that can better guide treatment decisions. But these tools are created by researchers, who can build their own biases and misconceptions into them, according to Lundy Braun, PhD, a researcher at Brown University in Providence, Rhode Island.
“Algorithms didn’t just pop into the world by themselves,” Braun said. “They were brought into the world by human beings.”
Stages of a long journey
In 2021, the National Kidney Foundation (NKF) and the American Society of Nephrology recommended the adoption of a new eGFR equation that removes the faulty adjustment for Black patients.
“There’s a groundswell to push for health equity in this country, and I think we have a lot of work to do,” said Joseph A. Vassalotti, MD, chief medical officer at NKF. “These steps are certainly positive steps, but we have to be honest, they are just steps on a long journey.”
The American Academy of Family Physicians (AAFP), in its comments on the proposed rule, recommended that HHS warn physicians about civil rights violations arising from faulty algorithms and work with them to establish new policies, rather than focus on punishment.
In a September 28 comment on the proposed rule, the AAFP said clinicians should not be required to evaluate algorithms on their own, and urged that responsibility be shared with the vendors who make the tools.
If used correctly, algorithms and other forms of artificial intelligence have the potential to improve patient care and help ease the burden on primary care physicians, according to Steven Waldren, MD, director of medical informatics at the AAFP.
“The market will continue to move in this direction” of increased use of artificial intelligence in medicine, Waldren said. “We can’t just stick our heads in the sand and say, ‘They’re not quite ready, so let’s not do anything.’”
The AMA and America’s Health Insurance Plans (AHIP) separately asked HHS to drop its current algorithm proposal, and both urged the agency to work with the medical community on the issue.
In an October 3 comment on the proposed rule, James L. Madara, MD, the AMA’s chief executive, said, “Issues of attribution of liability seem at best extremely premature and at worst very damaging to the pursuit of innovation in this space.”
AHIP shared the same view, arguing that the field first needs to determine how best to detect design flaws in algorithms that can harm patients.
“There is unanimity in the health care system that we need to root out and mitigate these biases, but we are still in the early stages of our ability to do so,” wrote Jeanette Thornton, executive vice president of policy and strategy at AHIP, in a comment to HHS.
Kerry Dooley Young is a freelance journalist based in Miami Beach. Follow her on Twitter @kdooleyyoung.