New California law adds complexity to content moderation

States and Congress have adopted or debated a range of approaches to online “content moderation” by social media and other Internet platforms. California’s bill, “Content Moderation Requirements for Internet Terms of Service” (“AB 587”), takes effect on January 1, 2024. In short, AB 587 requires social media companies to disclose their processes for removing or managing content and users on their platforms. AB 587 takes a somewhat different approach to regulating social media content than previously enacted laws in Texas and Florida. The Texas and Florida laws also address the content management practices of social media companies, but they go beyond disclosure requirements and prohibit specific conduct in order to limit alleged viewpoint discrimination. The Eleventh Circuit partially enjoined the Florida law, holding that its content moderation requirements violate the First Amendment rights of social media companies to exercise editorial judgment on their platforms. The Fifth Circuit, on the other hand, upheld a similar Texas law, holding that viewpoint-based content moderation constitutes censorship and that a platform’s content moderation activity is not protected by the First Amendment.

Background

Section 230 of the Communications Decency Act has been read to grant social media companies broad immunity for their handling of third-party posts or content on their platforms. Importantly, Section 230 protects social media companies from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

Florida and Texas have sought to regulate social media companies’ handling of third-party posts or content by passing “content moderation” or “transparency” laws. Both laws address (1) content moderation; (2) requirements to disclose internal guidelines on how content moderation and censorship decisions are made; and (3) blocking, banning, or deleting users and their content from the platform.

As noted above, both laws were challenged on constitutional grounds. In May 2022, the Eleventh Circuit upheld a district court’s injunction barring enforcement of certain provisions of the Florida content moderation law. The Eleventh Circuit’s basic position was that a platform’s content moderation is protected speech, similar in nature to the editorial judgment exercised by traditional media companies. In contrast, in September 2022, the Fifth Circuit ruled that a district court erred in enjoining the Texas law, finding that the law did not violate the First Amendment rights of tech platforms. The Fifth Circuit’s lengthy opinion is not easy to summarize, but at its core, the court “reject[ed] the idea that corporations have a freewheeling First Amendment right to censor what people say.” The court also said the Texas law “does not chill speech; if anything, it chills censorship” and “does not regulate the Platforms’ speech at all; it protects other people’s speech and regulates the Platforms’ conduct.” The Fifth Circuit also likened social media companies to “common carriers,” using that analogy to argue that the government may broadly limit their content moderation activities. The Fifth Circuit therefore rejected the challengers’ claim that the Texas law conflicts with the First Amendment. Notwithstanding this decision, the Texas law has not yet taken effect: in mid-October, the Fifth Circuit stayed its own decision, agreeing to suspend enforcement of the law pending possible review by the Supreme Court.

While the new California law is designed to require transparency regarding content moderation practices rather than to restrict editorial decisions, it may also face legal challenges under the First Amendment and possibly under Section 230 as well.

Who is covered?

AB 587 is intended to cover social media companies, a term broadly defined to mean “a person or entity that owns or operates one or more social media platforms.” A social media platform, in turn, is defined as a public or semipublic Internet-based service or application that has users in California and that is designed both to connect users within the service or application and to allow users to create a profile, interact with other users, and consume user-generated content.

This definition would seem to encompass a wide range of businesses, including message board operators and platforms such as Instagram, TikTok, and Meta’s services. However, the legislative history suggests that AB 587 should be interpreted more narrowly. More specifically, the legislative history indicates that the Legislature did not intend the law to apply to an Internet service or application for which user interactions are limited to direct messages, commercial transactions, or consumer reviews of products, sellers, services, events, or places, or any combination thereof. This could mean that even large companies such as Amazon or Yelp would not be considered social media companies that own social media platforms.

Requirements of AB 587

Under AB 587, a social media company must publish its terms of service, which must include (1) contact information that allows users to ask questions about the terms of service; (2) a description of the company’s content moderation policies; and (3) a list of potential actions the company may take against prohibited content. Platform users have extensive rights under AB 587, including the right to flag content they believe violates the terms of service and to obtain an explanation of the social media company’s decision. Notably, AB 587 does not create a private right of action; provisions that would have created one were removed during the legislative process. No implementing regulations or guidance appear to have been issued at this time, although nothing indicates that such guidance or regulations could not be issued at a later date.

Reporting requirements

Under AB 587, a social media company must submit a report regarding its terms of service to the California Attorney General every six months. The report must include the current terms of service, any changes to the terms of service since the last report, and a statement of whether the terms of service define certain categories of speech. Additionally, AB 587 requires disclosure of the content moderation practices the social media company uses in its day-to-day operations.

Civil fines

AB 587 provides for a civil penalty of up to $15,000 per violation per day. A social media company will be considered in violation of the law for each day it does any of the following:

(A) Fails to post its terms of service.

(B) Fails to timely submit a required report regarding its terms of service to the Attorney General.

(C) Materially omits or misrepresents information required in a report regarding its terms of service.

Takeaways

State legislatures and federal courts are taking different approaches to regulating social media. Following the litigation in the Fifth and Eleventh Circuits, there is a circuit split over “content moderation” laws that could lead the Supreme Court to take up these issues soon. In the meantime, other courts and state legislatures, and possibly the U.S. Congress, may weigh in with additional laws and legal decisions. Each of these developments will shape the legal landscape and inform how laws governing the moderation of social media content are applied and analyzed under the First Amendment and Section 230 of the Communications Decency Act. Social media companies should continue to monitor developments in this area and prepare to respond to a variety of content moderation laws, taking into account in their planning the possibility that these laws may impose differing, and possibly even conflicting, legal obligations.

Sharon D. Cole