Algorithms Amplify Historical Racism of US Financial Systems | Across America

Federal regulators must be vigilant against sophisticated digital discrimination by financial institutions, online banks and social media platforms like Facebook, said Rohit Chopra, director of a government consumer watchdog agency.

“We’re pretty focused on the future of how technology will reshape lending and financial services,” said Chopra, director of the Consumer Financial Protection Bureau. “A lot of this is related to what we think is digital redlining.”

In the past, redlining was seen as financial institutions not lending to a particular geographic area, he said.

“But now, with these highly targeted algorithms, it’s getting harder and harder to understand where there might be exclusion or discrimination,” Chopra said.

He was in Philadelphia for a talk at the Federal Reserve Bank of Philadelphia and gave the Philadelphia Tribune an exclusive interview about digital discrimination.

In March, Wells Fargo customer Aaron Braxton, who is Black, filed a lawsuit in U.S. District Court in Northern California alleging racial discrimination after he was denied a mortgage to refinance his home. According to the complaint, the bank’s algorithms amplified the historic racism of the U.S. financial system.

In June, the U.S. Department of Justice reached an agreement with Facebook owner Meta to settle a discrimination lawsuit in federal court accusing the social media platform of violating the Fair Housing Act through its use of an algorithm.

In the complaint, the government alleged that Facebook allowed landlords to market their real estate listings in a discriminatory manner. For example, Facebook allowed advertisers to target housing-related ads based on race, gender, religion, and other characteristics, in a way that enabled discrimination.

Kristen Clarke, assistant attorney general for the U.S. Department of Justice’s Civil Rights Division, said at the time that it marked the first instance of Meta agreeing, in response to a discrimination lawsuit, to terminate one of its algorithmic tools and modify the tools it uses to deliver housing advertisements.

As part of the settlement, Facebook agreed to remove one of its algorithmic tools and create a new automated advertising system intended to ensure its housing ads are shown to a more equitable audience. The company also agreed to pay a fine of $115,054, the highest amount allowed by law.

A Facebook spokesperson said the company would refrain from using ZIP codes alone to target customers, as part of the agreement.

An algorithm is a complex formula or advanced calculation method that can be used to make decisions.

“Think of it like, it’s using variables to come to some sort of decision. That decision might be whether or not you get a loan or whether or not you get an advertisement for it,” Chopra said. “Many of these algorithms rely on a huge amount of data about you.”

For example, the data a company relies on to make a decision may have nothing to do with your creditworthiness or past debts, but may instead be based on things such as your browsing history, your geolocation, or whether or not you go to church, Chopra said. “It could also include something entirely independent of the commercial transaction.”
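To make the concern concrete, here is a purely hypothetical sketch (not any lender’s actual model) of how a scoring rule that mixes a legitimate credit signal with proxy variables, such as neighborhood income or browsing history, can reach different decisions for applicants with identical credit records. The function name, weights, and inputs are all invented for illustration.

```python
def toy_loan_decision(credit_score, zip_avg_income, visits_payday_sites):
    """Hypothetical scoring rule: True means the toy model approves the loan."""
    score = 0.0
    score += credit_score / 10            # legitimate creditworthiness signal
    score += zip_avg_income / 5000        # geographic proxy: can encode redlining
    if visits_payday_sites:               # browsing-history proxy
        score -= 20
    return score >= 75

# Two applicants with the same credit score get different outcomes
# purely because of where they live and what they browse:
print(toy_loan_decision(700, 80_000, False))  # True
print(toy_loan_decision(700, 30_000, True))   # False
```

The point of the sketch is that neither proxy variable measures repayment ability, yet each shifts the outcome, which is exactly the kind of exclusion Chopra says is getting harder to detect.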

Google and Facebook claim the largest shares of digital ad revenue in the U.S., with Google at 28.6% and Facebook at 23.8%. Together, they account for over 50% of all digital ad revenue in the United States.

Meanwhile, Facebook’s June settlement with the Justice Department is one of several actions brought against the social media platform over its algorithms related to housing, employment and credit.

In July 2018, Washington State Attorney General Bob Ferguson announced that Facebook had signed a legally binding agreement to end advertisers’ ability, through its targeting practices and algorithms, to block people from seeing certain ads based on race, ethnicity, religion, sexual orientation and other protected characteristics.

According to Ferguson, third parties may have discriminated by not allowing certain people to see their credit, employment, housing, insurance and loan ads.

As part of the agreement, Facebook was required to end the practice of allowing advertisers to use algorithms to exclude people from seeing advertisements for public accommodations, such as restaurants and hotels, as well as advertisements for credit, employment and insurance. Additionally, the social media platform had to pay $90,000 in costs and fees.

During a 20-month investigation, the Washington Attorney General’s office, posing as banks, employers, insurance companies and landlords, placed 20 fake ads on Facebook and was able to exclude certain minority groups from seeing them.

“When searching for listings, empirical research has shown that among people with equal qualifications, a minority candidate might not even see the listing because they (advertisers) use targeted analytics,” Chopra said.

To counter these practices, Chopra said the government can use one of its longstanding anti-discrimination laws on lending: the Equal Credit Opportunity Act.

Under the Equal Credit Opportunity Act, if you are turned down or receive an unfavorable decision, the lender is required to make clear why that adverse action occurred, he said.

“We’ve released a legal interpretation that says just because you’re using a fancy algorithm doesn’t mean you don’t have to,” Chopra said. “If you can’t tell us what this machine is saying, you can’t use it.”

Sharon D. Cole