FEATURE-US renters fall foul of algorithms in search for a home

(Corrects description of civil rights group in paragraph 7)

* Hundreds of automated tenant screening programs in use

* Applicants report vague reasons for refusal of housing

* Discrimination fears prompt Washington DC to put the brakes on algorithms

By Carey L. Biron

WASHINGTON, Nov. 16 (Thomson Reuters Foundation) - Candice’s most recent email rejection for a rental home in Washington DC was much like all the others: a generic response with no real explanation of the decision. “They’ll just say, ‘Unfortunately, at this time, we haven’t accepted your application,’” said Candice, 36, who has lived in the US capital for most of her life and is now looking for a bigger home for herself and her three children.

“I felt like it was computer generated. And of course the computers – they are faulty,” she told the Thomson Reuters Foundation, asking to be identified by her first name only. Candice, who says she has received about a dozen such refusals in recent months, is currently unemployed but receives public assistance that would cover her rent in full.

She and other prospective tenants believe the fate of their rental applications rests not with landlords but with automated screening programs that analyze credit scores, eviction or criminal histories and even social media activity to determine whether an applicant poses a rental risk. The widely used programs are under intense scrutiny from lawmakers in Washington DC and beyond, amid broader concern about the potential of algorithms to lock in bias and perpetuate inequality.

Susie McClannahan, who runs the fair housing rights program at the Equal Rights Center, a civil rights group, and has worked with Candice, calls the process a “black box” of algorithmic discrimination. Rental applicants are being “turned away from properties for reasons that they don’t know about and the provider may not even know about,” McClannahan said, adding that some third-party screening systems draw on data they are prohibited from using, such as old criminal convictions.

“For tenants with housing vouchers and low-income tenants … it’s harder for them to find housing in a city that’s already in the midst of a housing crisis,” she said. City officials have taken note. In September, they debated legislation banning “algorithmic discrimination”, including in housing, one of many such efforts nationwide.

And last month, the White House released a “Blueprint for an AI Bill of Rights,” warning that “discrimination by algorithms” is unacceptable. Regulatory action on the issue is likely in the coming year, said Ben Winters, an attorney for the Electronic Privacy Information Center (EPIC) watchdog group.

“We are at a transition point,” he said.

GROWING SECTOR

The roughly $1 billion tenant screening industry is attracting interest from tech startups and venture capitalists, according to the Tech Equity Collaborative, a watchdog group. Hundreds of tenant screening tools are available in the United States, replacing a process traditionally handled by landlords themselves, said Cynthia Khoo, senior associate at Georgetown University’s Center on Privacy & Technology.

While the traditional process was also open to discrimination, she said, today’s automated tools operate far more efficiently, at greater scale and speed, and with access to much more data. “These are new technological tools used to perform the same age-old discrimination that we know,” she said, adding that they are also even less transparent.

As regulators in California, Colorado and at the Federal Trade Commission work on the issue, many are eyeing the capital’s Stop Discrimination by Algorithms Act (SDAA) as a potential blueprint. “This is the most robust legislation in the United States,” Khoo said of the bill.

The current draft states that algorithms cannot discriminate against any group already protected by local law, Winters said, while applicants would have to be notified when such systems are used and given explanations if they are rejected. Most companies using these tools would have to audit their algorithms to make sure they know what the programs are doing, he said, and applicants could sue over potential violations.

In response to a request for comment, the Consumer Data Industry Association, a trade group, referred to testimony it gave in opposition to the SDAA, as well as a letter sent to the DC Council in October by nine financial services groups. The letter noted that companies are already prohibited from discriminating in credit and other financial services, and argued that the DC bill would increase the potential for fraud and affect access to credit.

“Algorithms make credit decisions more accurate, fair, faster and more affordable by judging applicants on their creditworthiness,” the groups said. “Algorithms also remove some of the risk of bias that can be found in human interactions and can help identify products and services designed to benefit communities, including historically underserved populations, helping to close the racial wealth gap.”

‘TAINTED’ DATA

Still, questions remain over whether algorithms that rely on public data can be objective when the data itself is tainted, said Catherine D’Ignazio, associate professor of urban science and planning at the Massachusetts Institute of Technology (MIT).

Often, data such as credit scores that appear objective are in fact the product of decades of racism and marginalization, which builds bias into the calculations, she said. The idea of algorithmic fairness suggests that “everyone starts out the same and is treated the same. But history has not treated people the same.”

Yet acknowledging that disconnect offers an opportunity for change for the better, D’Ignazio said. “Dirty” historical data can also skew home valuations, said John Liss, founder of True Footage, a company launched last year that aims to close valuation gaps between white and minority homeowners using a combination of automation and human oversight.

For years, home valuations often seemed disconnected from the data, Liss said, to the particular detriment of Black and Hispanic homeowners. While automating the valuation process helps address the problem in part, he said, “automated valuation models are extremely dangerous because they are tainted” by historical data.

For True Footage, he said, the key is to have human appraisers, increasingly drawn from historically marginalized communities, involved in interpreting the data. “There is a place for technology,” Liss said. “(But) having a human behind the wheel to interpret the data is much more accurate.”

Originally posted at: https://www.context.news/ai/us-renters-fall-foul-of-algorithms-in-search-for-a-home

