US Supreme Court to review recommendation algorithms in landmark internet shield case

The United States Supreme Court has agreed to hear a case centered on Section 230, a legal shield that protects Internet platforms from civil and criminal liability for user content.

The United States Supreme Court is preparing to take up a case with significant implications for the functioning of the Internet.

In Gonzalez v. Google LLC, the court will decide whether Google's YouTube is responsible for the content that the platform algorithmically recommends to users. It is the first time the US high court has agreed to hear a challenge to Section 230 of the Communications Decency Act, a landmark law that shields internet platforms from civil and criminal liability for user-created content.

In the case, the Gonzalez family argues that Google should be held responsible because its algorithms promoted an Islamic State recruitment video. The video is believed to be linked to a 2015 terror attack in Paris that killed 130 people, including 23-year-old Nohemi Gonzalez.


Recommendation algorithms drive much of the traffic on today's internet platforms, for example by suggesting items to buy on e-commerce sites, videos to watch on streaming services, or pages to visit in search engine results. In Google's case, a recommendation algorithm powers Google and YouTube search, surfacing an optimal URL or video for users based on what is typed into the search bar.

If recommendation algorithms are deemed unprotected under Section 230, it will force platforms to rethink how content is pushed to consumers, policy experts have said.

“These are really complex issues, and they’re going to require complex regulation and thoughtful government interactions,” said Patrick Hall, senior scientist at a law firm and professor of data ethics at George Washington University.

Scope Considerations

Two years ago, conservative Justice Clarence Thomas said the court should consider whether the text of Section 230 “aligns with the current state of immunity enjoyed by internet platforms” after the court dismissed a separate case concerning the scope of Section 230.

In an earlier review of Gonzalez v. Google, the United States Court of Appeals for the 9th Circuit ruled that Section 230 protects recommendation engines. However, the majority said Section 230 shelters more activity than Congress could have previously imagined, and they encouraged federal lawmakers to clarify the scope of the law.

Hall likened the spread of harmful content on social platforms to a news anchor talking to children on the air about suicidal thoughts. “The FCC would be all over it,” Hall said. “What’s that news broadcast’s reach compared to some of these social media recommenders? In some cases, I bet social media reaches more people.”

Platform transparency has become a talking point in technology policy as lawmakers debate legislation that would require companies to reveal the behavior of their algorithms to researchers and others. Recommendation algorithms have come under intense scrutiny over the past year following reports of Instagram LLC, TikTok Inc. and other platforms recommending content that harms some users’ mental health, spreads disinformation or erodes democracy.

But opening up an algorithm to the public or researchers is a double-edged sword, said Michael Schrage, a Massachusetts Institute of Technology researcher and author of the book “Recommendation Engines.”

Recommendation algorithms are best understood when they are transparent, interpretable and explainable, and the level of transparency has business and competitive implications, Schrage said.

“Google has been tight-lipped about its algorithms because there’s an incentive to mess with the algorithm. If I understand Google’s algorithm, I can get my content up in recommendations,” the researcher said.

Technical problems

Recommendation algorithms are designed primarily to optimize engagement. For example, a recommendation engine could be directed to maximize ad revenue on certain content or to minimize ‘likes’ and ‘shares’ of low-quality content. If engagement with specific content is deemed valuable to a platform, its algorithm calculates the likelihood that a user will like, comment on or share it. If that calculated value crosses a certain threshold, the content shows up in a user’s feed.
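The threshold logic described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the action weights, probability values and threshold are invented for the example and do not reflect any real platform’s model.

```python
# Hypothetical sketch of engagement-threshold recommendation.
# Weights and scores below are illustrative assumptions, not any
# platform's actual parameters.

def engagement_score(probs: dict) -> float:
    """Combine predicted interaction probabilities into one score."""
    weights = {"like": 1.0, "comment": 2.0, "share": 3.0}  # assumed weights
    return sum(weights[action] * p for action, p in probs.items())

def build_feed(candidates: dict, threshold: float) -> list:
    """Keep items whose combined score crosses the threshold, best first."""
    scored = {item: engagement_score(p) for item, p in candidates.items()}
    picked = [item for item, s in scored.items() if s >= threshold]
    return sorted(picked, key=lambda item: scored[item], reverse=True)

# Predicted per-action probabilities for three candidate videos.
candidates = {
    "video_a": {"like": 0.30, "comment": 0.05, "share": 0.02},
    "video_b": {"like": 0.10, "comment": 0.01, "share": 0.01},
    "video_c": {"like": 0.50, "comment": 0.20, "share": 0.10},
}

print(build_feed(candidates, threshold=0.4))  # video_b falls below the bar
```

In this toy version, only the two videos whose weighted score clears the threshold reach the feed, ranked by score. Real systems layer many more signals on top, but the gatekeeping step works on the same principle.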

In Google’s case, the only way to know if YouTube deliberately optimized for terrorist content recommendation would be if the tech giant opened up its platforms to inform the court’s decision, Schrage said. This would likely require legislative or legal action to force Google’s cooperation. One option would be for the Supreme Court to appoint a special master to review how Google tagged the metadata of the video in question.

Ari Cohn, free speech attorney at TechFreedom, views the ISIS content recommendation as a likely mistake on Google’s part. How the court responds to it could have significant implications, Cohn noted.

“Sometimes [platforms] are going to fail, but imposing liability because sometimes they failed doesn’t make sense if you really want the internet to be safer,” Cohn said in an interview.

Federal legislative efforts to demand greater platform transparency have stalled in recent months, with congressional leaders focusing more on issues such as inflation and abortion. As Gonzalez v. Google proceeds, it could motivate lawmakers to move on Section 230 reforms, said Jesse Lehrich, who leads Accountable Tech, a group that advocates for regulation of technology platforms.

Barring preemptive legislative reforms, policy experts have said U.S. judges are likely to push Congress to act.

“It’s not that the company explicitly wants to create tools to let [bad] things happen,” Lehrich said. “But it’s the by-product of their business model…with very few controls or limits.”

Representatives for the Supreme Court and Google did not return requests for comment.

Gonzalez v. Google is due to be argued during the Supreme Court term beginning in October 2022. An exact argument date has yet to be set.

Sharon D. Cole