CAHILL GORDON & REINDEL LLP
JOEL KURTZBERG (pro hac vice pending, SBN NY 1758184)
FLOYD ABRAMS (pro hac vice pending, SBN NY 2835007)
JASON ROZBRUCH (pro hac vice pending, SBN NY 5753637)
LISA J. COLE (pro hac vice pending, SBN NY 5927744)
32 Old Slip
New York, New York
Phone: 212-701-
Facsimile: 212-269-
jkurtzberg@cahill.com
DOWNEY BRAND LLP
WILLIAM R. WARNE (Bar No. 141280)
bwarne@downeybrand.com
MEGHAN M. BAKER (Bar No. 243765)
mbaker@downeybrand.com
621 Capitol Mall, 18th Floor
Sacramento, California
Telephone: 916.444.
Facsimile: 916.444.

Attorneys for Plaintiff X Corp.
UNITED STATES DISTRICT COURT
EASTERN DISTRICT OF CALIFORNIA
SACRAMENTO DIVISION
X CORP.,

                    Plaintiff,

        v.

ROBERT A. BONTA, Attorney General of
California, in his official capacity,

                    Defendant.

Case No.

COMPLAINT FOR DECLARATORY AND
INJUNCTIVE RELIEF

DEMAND FOR JURY TRIAL
Plaintiff X Corp., by and through its attorneys, Cahill Gordon & Reindel LLP and Downey
Brand LLP, alleges for its complaint against Defendant ROBERT A. BONTA, in his official
capacity as Attorney General of California, as follows:
NATURE OF THE ACTION
1.
Plaintiff X Corp. brings this action challenging the constitutionality and legal
validity of California Assembly Bill No. 587 (“AB 587”), which is codified at Cal. Bus. &
Prof. Code §§ 22675–22681.
2.
AB 587 requires large social media companies like X Corp. to (1) post terms of
service dictated by the government and include terms about how content is moderated on their
platforms (the “Terms of Service Requirement”) and (2) submit, on a semi-annual basis, to the
California Attorney General a “terms of service report” that includes, among other things, (a) “a
detailed description of content moderation practices used by the social media company for that
platform”; (b) information about whether, and if so how, the social media company defines and
moderates (i) hate speech or racism, (ii) extremism or radicalization, (iii) disinformation or
misinformation, (iv) harassment, and (v) foreign political interference; as well as (c) information
and statistics about actions taken by the social media company to moderate these categories of
content (the “Terms of Service Report”).
3.
AB 587 violates the First Amendment of the United States Constitution and Article
I, Section 2, of the California Constitution because it compels companies like X Corp. to engage in
speech against their will, impermissibly interferes with the constitutionally-protected editorial
judgments of companies such as X Corp., has both the purpose and likely effect of pressuring
companies such as X Corp. to remove, demonetize, or deprioritize constitutionally-protected speech
that the State deems undesirable or harmful, and places an unjustified and undue burden on social
media companies such as X Corp.
4.
The State of California touts AB 587 as a mere “transparency measure” under which
certain social media companies must make their content moderation policies and statistics publicly
available. See Press Release, Governor Newsom Signs Nation-Leading Social Media Transparency
Measure (Sept. 13, 2022), https://www.gov.ca.gov/2022/09/13/governor-newsom-signs-nation-
leading-social-media-transparency-measure/. Yet, a review of the law’s purpose and likely effect
— as evidenced by the legislative history and statements from AB 587’s author, sponsors, and
supporters — demonstrates otherwise. As made clear by both the legislative history and public
court submissions from the Attorney General in defending the law, the true intent of AB 587 is to
pressure social media platforms to “eliminate” certain constitutionally-protected content viewed by
the State as problematic. Ex. 1 (Cal. Assemb. Comm. on Judiciary Report, 2021–22 Sess. (AB
587), Apr. 27, 2021) at 4 (“if social media companies are forced to disclose what they do in this
regard [i.e., how they moderate online content], it may pressure them to become better corporate
citizens by doing more to eliminate hate speech and disinformation.”); Mot. to Dismiss at 15–16,
Minds, Inc., et al. v. Bonta, No. 23-cv-2705 (ECF 23-1) (C.D. Cal. May 25, 2023) (“[T]he
Legislature also considered that, by requiring greater transparency about platforms’ content-
moderation rules and decisions, AB 587 may result in public pressure on social media companies
to ‘become better corporate citizens by doing more to eliminate hate speech and disinformation’ on
their platforms. . . . This, too, is a substantial state interest.”). AB 587 is, according to the law’s
lead author, Assembly Member Jesse Gabriel, an “important first step in protecting our democracy
from the dangerously divisive content that has become all too common on social media.” Ex. 1 at
4. The legislative record is crystal clear that one of the main purposes of AB 587 — if not the main
purpose — is to pressure social media companies to eliminate or minimize content that the
government has deemed objectionable.
5.
The topics that AB 587 forces social media platforms to speak about against their
will are highly controversial and politically charged. As the legislative history acknowledges, the
categories of speech on which AB 587 focuses are those that are difficult to define because their
boundaries are “often fraught with political bias.” Ex. 2 (Cal. Assemb. Comm. on Privacy and
Consumer Protection Report, 2021–22 Sess. (AB 587), Apr. 22, 2021) at 4. And social media
companies are frequently criticized, no matter what they do, by individuals on both sides of the
political aisle, for their editorial decisions about speech that arguably falls into these ill-defined
categories. The Assembly Reports from the Committee on Privacy and Consumer Protection
describe the “complex dilemma” that social media companies increasingly find themselves in when
trying to define and moderate the politically-charged and controversial categories of content that
are the focus of AB 587:
As online social media become increasingly central to the public discourse, the
companies responsible for managing social media platforms are faced with a
complex dilemma regarding content moderation, i.e., how the platforms determine
what content warrants disciplinary action such as removal of the item or banning of
the user. In broad terms, there is a general public consensus that certain types of
content, such as child pornography, depictions of graphic violence, emotional abuse,
and threats of physical harm are undesirable, and should be mitigated on these
platforms to the extent possible. Many other categories of information, however,
such as hate speech, racism, extremism, misinformation, political interference,
and harassment [i.e., the categories that are the focus of AB 587], are far more
difficult to reliably define, and assignment of their boundaries is often fraught
with political bias. In such cases, both action and inaction by these companies
seems to be equally maligned: too much moderation and accusations of
censorship and suppressed speech arise; too little, and the platform risks fostering
a toxic, sometimes dangerous community.
Id. (emphasis added); see also Ex. 3 (Cal. Assemb. Floor Analysis, 2021–22 Sess. (AB 587), Apr.
28, 2021) at 1–2 (same) (emphasis added); Ex. 4 (Cal. Assemb. Floor Analysis, 2021–22 Sess. (AB
587), Aug. 24, 2022) at 2 (same) (emphasis added).
6.
In other words, AB 587 seeks to force social media companies to provide the
Attorney General and the public detailed information about how, if at all, they define and moderate
the boundaries of the most controversial categories of content — i.e., those categories with no
“general public consensus” about how to define or moderate them because their boundaries are
“fraught with political bias” and are “difficult to reliably define” — and provide detailed
information about what actions they have or have not taken in regulating those controversial
categories, even though action or inaction “seems to be equally maligned” by members of the
public, depending on their political viewpoint. Put another way, through AB 587, the State is
compelling social media companies to take public positions on controversial and politically-
charged issues. And, because X Corp. must take such positions on these topics as they are
formulated by the State, X Corp. is being forced to adopt the State’s politically-charged terms,
which is a form of compelled speech in and of itself.
7.
AB 587 thus mandates X Corp. to speak about sensitive, controversial topics about
which it does not wish to speak in the hopes of pressuring X Corp. to limit constitutionally-protected
content on its platform that the State apparently finds objectionable or undesirable. This violates
the free speech rights granted to X Corp. under the First Amendment to the United States
Constitution and Article I, Section 2, of the California Constitution.
8.
The First Amendment affords X Corp. “both the right to speak freely and the right
to refrain from speaking at all,” Wooley v. Maynard, 430 U.S. 705, 714 (1977), because “[t]he right
to speak and the right to refrain from speaking are complementary components of the broader
concept of ‘individual freedom of mind.’” Id.; Riley v. Nat’l Fed’n of the Blind, 487 U.S. 781,
(1988). See also Agency for Int’l Dev. v. Alliance for Open Soc’y Int’l, Inc., 570 U.S. 205, 213
(2013) (the First Amendment “prohibits the government from telling people what they must say”).
These protections against compelled speech apply equally to “compelled statements of ‘fact’” and
“compelled statements of opinion” because “either form of compulsion burdens protected speech.”
Riley, 487 U.S. at 797–98. AB 587 violates X Corp.’s First Amendment right to not speak about
controversial topics and to decide for itself what it will say or not say about these topics.
9.
Moreover, “[l]ike a newspaper or a news network,” X Corp.’s “decisions about what
content to include, exclude, moderate, filter, label, restrict, or promote” on its platform are
“protected by the First Amendment,” O’Handley v. Padilla, 579 F. Supp. 3d 1163, 1186–87 (N.D.
Cal. 2022), aff’d sub nom. on other grounds, O’Handley v. Weber, 62 F.4th 1145 (9th Cir. 2023).
AB 587 also violates the First Amendment because it has the purpose and intended chilling effect
of pressuring social media companies into limiting or censoring constitutionally-protected content
that the State finds objectionable. AB 587 would also impermissibly inject the government into X
Corp.’s editorial judgments — e.g., by dictating the contents of X Corp.’s Terms of Service and
compelling controversial disclosures about how X Corp. moderates content on its platform as part
of an effort to pressure social media companies to regulate content in a manner desired by the State,
as opposed to allowing the social media companies to regulate content on their platforms as they
deem fit.
10.
The First Amendment unequivocally prohibits this kind of interference with a
traditional publisher’s editorial judgment. For example, if a law were to require newspapers to
promulgate Terms of Service that disclose (i) a time by which they will respond to requests to
publish letters to the editor or opinion pieces; (ii) detailed disclosures about their criteria for
publication and statistics about the bases for decisions regarding whether to publish letters to the
editor or opinion pieces; and (iii) statistics about how many submissions were accepted and rejected
on the ground that they contained “hate speech” or “misinformation,” it would undoubtedly violate
the First Amendment because it would impermissibly interfere with the constitutionally-protected
editorial judgment of newspapers. See, e.g., Herbert v. Lando, 441 U.S. 153, 174 (1979) (a law
that “subjects the editorial process to private or official examination merely to satisfy curiosity or
to serve some general end such as the public interest . . . would not survive constitutional scrutiny
as the First Amendment is presently construed”); id. at 172 (concluding that “if inquiry into editorial
conclusions threatens the suppression . . . of truthful information,” it raises First Amendment
problems); Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974) (“The choice of material
to go into a newspaper, and the decisions made as to limitations on the size and content of the paper,
and treatment of public issues and public officials — whether fair or unfair — constitute the
exercise of editorial control and judgment. It has yet to be demonstrated how governmental
regulation of this crucial process can be exercised consistent with First Amendment guarantees of
a free press as they have evolved to this time.”).
11.
That X is a social media platform does not change the analysis. 1 These core First
Amendment principles prohibit the government from interfering with the right of private parties,
like X Corp., to exercise “editorial control over speech and speakers on their properties or
platforms.” Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1932 (2019). And the
U.S. Supreme Court has made clear that the right to choose whether to speak and how to tailor
one’s speech is not “restricted to the press.” Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of
Boston, 515 U.S. 557, 574 (1995). This fundamental right applies equally to “business corporations
generally” and “ordinary people engaged in unsophisticated expression as well as professional
publishers.” Id. A private social media company’s editorial judgment about how to regulate
content on its platform and what to say (or not to say) about its regulation of content is therefore
fully protected under the First Amendment. Id.; see also U.S. Telecom Ass’n v. FCC, 855 F.3d
381, 435 (D.C. Cir. 2017) (Kavanaugh, J., dissenting from denial of rehearing en banc) (the
government may not “tell Twitter or YouTube what videos to post” or “tell Facebook or Google
what content to favor” any more than it may “tell The Washington Post or the Drudge Report what
columns to carry”); Tornillo, 418 U.S. at 258 (editorial control and judgment protected from
government regulation by First Amendment).
12.
AB 587 also imposes tremendously burdensome requirements on social media
companies, requiring them to keep records about potentially hundreds of millions of content
moderation decisions made on a daily basis. Ex. 2 at 4. Worse yet, it threatens draconian financial
penalties of up to $15,000 per violation per day if compliance is not made in “reasonable, good
faith,” a term that the statute does not define and that gives the Attorney General nearly unfettered
discretion to threaten to impose draconian fines if social media companies’ content moderation
policies are not to the State’s liking. Given the broad enforcement powers granted to the Attorney
General under Cal. Gov’t Code §§ 11180–81, AB 587 empowers the Attorney General to issue
civil investigative demands to social media companies about their content moderation policies and
practices to determine if they have complied with the statute in “reasonable, good faith.” This
broad, unfettered discretion to scrutinize the editorial judgments of social media companies
empowers the Attorney General to use his enforcement powers to pressure social media companies
to regulate content in ways that the government wants — which is one of the stated purposes of the
law.

1 X Corp. provides the X service (“X,” formerly the platform referred to as Twitter).
13.
Concerns about the potential use of AB 587 in this manner are not speculative:
Attorney General Bonta has already written to X Corp. (and other social media companies)
threatening that “[t]he California Department of Justice will not hesitate to enforce [AB 587],”
while in the same proverbial breath reminding the companies of their “responsibility” to combat
what the Attorney General views as the “dissemination of disinformation that interferes with our
electoral system.” Letter from Attorney General Robert Bonta to Twitter, Inc., et al., 4 (Nov. 3,
2022), https://oag.ca.gov/system/files/attachments/press-
14.
AB 587 also violates the Dormant Commerce Clause. By failing to restrict its
extensive reporting requirements to information about Californians, AB 587 places an undue,
excessive burden on interstate commerce. That failure is by design — indeed, as bill author Jesse
Gabriel’s press release upon the passage of AB 587 made clear, the goal is for AB 587 to have
“national implications.”2 Thus, the law will not just impact content moderation of California users
but is intended to and will impact content moderation nationwide.

2 Press Release, Office of Assembly member Jesse Gabriel, After Two-Year Fight, Governor Newsom Signs Landmark Social Media Transparency Bill (Sept. 13, 2022), https://a46.asmdc.org/press-releases/20220913after-two-year-fight-governor-newsom-signs-landmark-social-media.
15.
Moreover, AB 587 directly contravenes the immunity provided for providers and
users of interactive computer services set forth by 47 U.S.C. § 230(c)(2), which prohibits liability
“on account” of “any action voluntarily taken in good faith to restrict access to or availability of
material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively
violent, harassing, or otherwise objectionable, whether or not such material is constitutionally
protected.” This is because, first, AB 587 imposes civil liability on social media companies that
are covered by the statute if they attempt to “restrict access to or availability of material that the
[social media company] or user considers to be obscene, lewd, lascivious, filthy, excessively
violent, harassing, or otherwise objectionable” without making the disclosures required by AB 587.
The immunity afforded by Section 230(c)(2) is broad — it covers “any action” voluntarily taken in
good faith to regulate objectionable content. Because AB 587 imposes liability on such actions if
they are taken without the required disclosures, AB 587 is preempted by the broad immunity
afforded by Section 230(c)(2).
16.
AB 587 also contravenes the immunity provided to X Corp. by Section 230(c)(2)
for the additional reason that Attorney General Bonta may penalize X Corp. — under his unfettered
discretion as to what constitutes “misrepresent[ation]” and “reasonable, good faith” — if X Corp.
were to “restrict access” to content in a way that, in Attorney General Bonta’s view, is contrary to
X Corp.’s promulgated content moderation policies.
17.
In pursuing this action, X Corp. seeks declaratory relief and preliminary and
permanent injunctive relief on the grounds that AB 587 (i) violates X Corp.’s free speech rights
under the First Amendment to the United States Constitution and Article I, Section 2, of the
California Constitution; (ii) violates the Dormant Commerce Clause of the United States
Constitution; and (iii) directly conflicts with, and is thus preempted by, the immunity afforded by
47 U.S.C. § 230(c)(2).
18.
In pursuing this action, X Corp. seeks to vindicate the deprivation of constitutional
rights under color of state statute, ordinance, regulation, custom, and/or usage. X Corp. is also
entitled to attorneys’ fees and costs if it prevails on any of its § 1983 claims. See 42 U.S.C. § 1988.
PARTIES
19.
Plaintiff X Corp. is a corporation organized and existing under the laws of the State
of Nevada, with its principal place of business in San Francisco, California. X Corp. is successor
in interest to Twitter, Inc. X Corp. provides the X service. X is a real-time, open, public
conversation platform, where people can see every side of a topic, discover news, share their
perspectives, and engage in discussion and debate. X allows people to create, distribute, and
discover content and has democratized content creation and distribution. X allows users to create
and share ideas and information instantly through various product features, including public posts.
20.
AB 587 applies to X Corp. because it is a “social media company,” as defined in the
statute — i.e., an “entity that owns or operates one or more social media platforms” — and has
generated more than $100 million in gross revenue in the preceding calendar year.
21.
X is a “social media platform,” as defined by AB 587, because it is a public internet-
based service or application with users in California and (1) “[a] substantial function of the service
or application is to connect users in order to interact socially with each other within the service or
application” and (2) it allows its users to (a) “construct a public or semipublic profile for purposes
of signing into and using the service or application”; (b) “[p]opulate a list of other users with whom
an individual shares a social connection within the system”; and (c) “[c]reate or post content
viewable by other users, including but not limited to, on message boards, in chat rooms, or through
a landing page or main feed that presents the user with content generated by other users.”
22.
Defendant Robert Bonta is the Attorney General of the State of California and is
charged with enforcing AB 587. X Corp. sues Attorney General Bonta in his official capacity as
the person charged with enforcing AB 587.
JURISDICTION
23.
This Court has jurisdiction over X Corp.’s federal claims pursuant to 28 U.S.C.
§§ 1331 and 1343(a) and 42 U.S.C. § 1983 because X Corp. alleges violations of its rights under
the Constitution and laws of the United States. The Court has jurisdiction over X Corp.’s state
claim pursuant to 28 U.S.C. § 1367.
24.
This Court has authority to grant declaratory and injunctive relief under the
Declaratory Judgment Act, 28 U.S.C. §§ 2201, 2202, and under the Court’s inherent equitable
jurisdiction.
VENUE
25.
Venue is proper in this Court under 28 U.S.C. § 1391(b)(1), (2) because the
Defendant is located, resides, and has offices in this judicial district and in the State of California,
and the violations of X Corp.’s rights are occurring and will occur within this judicial district. AB
587 was also enacted in this judicial district.
FACTUAL ALLEGATIONS
I. AB 587’s Statutory Scheme
26.
AB 587, which applies to “social media companies” (defined as persons or entities
owning or operating one or more “social media platforms”3), has three main components — (i) a
requirement that social media companies publicly post their terms of service, including processes
for flagging content and potential actions that may be taken with respect to flagged content (“Terms
of Service Requirement”), see § 22676; (ii) a requirement that social media companies submit to
Attorney General Bonta, who will in turn disseminate publicly a report including, among other
things, whether and, if so, how they define hate speech, racism, extremism, radicalization,
disinformation, misinformation, harassment, and foreign political interference (“Terms of Service
Report”), see § 22677; and (iii) a penalty provision, whereby companies may be liable to pay
$15,000 per violation per day and may be sued in court for failing to make a “reasonable, good
faith attempt” to comply with AB 587’s requirements, see § 22678.

3 A “social media platform” is a “public or semipublic internet-based service or application that has users in California” (i) for which “[a] substantial function of the service or application is to connect users in order to allow users to interact socially with each other within the service or application,” or (ii) allows users to (a) “[c]onstruct a public or semipublic profile for purposes of signing into and using the service or application,” (b) “[p]opulate a list of other users with whom an individual shares a social connection within the system,” or (c) “[c]reate or post content viewable by other users.” § 22675(e).
a. Terms of Service Requirement
27.
AB 587’s Terms of Service Requirement mandates that social media companies
publicly post, “in a manner reasonably designed to inform all users” of its “existence and contents,”
their platforms’ terms of service. § 22676(a). Those terms of service must include (i) “contact
information,” so that users may “ask the social media company questions about” the terms, (ii) a
“description of the process that users must follow to flag content, groups, or other users that they
believe violate the terms of service,” as well as “the social media company’s commitments on
response and resolution time,” and (iii) a “list of potential actions the social media company may
take against an item of content or a user, including, but not limited to, removal, demonetization,
deprioritization, or banning.” § 22676(b).
28.
The Terms of Service Requirement is an impermissible attempt by the State to inject
itself into the content moderation editorial process. By requiring Terms of Service that contain
certain provisions about how content is moderated, the State is requiring social media companies
that are covered by the statute to moderate content in ways that the State deems appropriate, rather
than leaving those constitutionally-protected editorial choices to the social media companies. The
requirements may not mandate that a particular process or methodology be used, but they do force
social media companies to make certain disclosures about how the content moderation process
works, how users can flag objectionable content, how quickly the companies will respond to
flagged content, and what actions the companies may take with respect to potentially objectionable
content.
29.
These forced disclosures violate the First Amendment of the U.S. Constitution and
Article I, Section 2, of the California Constitution.
b. Terms of Service Report
30.
AB 587 requires social media companies to submit bi-annual Terms of Service
Reports, on April 1 and October 1 of each year, to Attorney General Bonta, who will then make
them publicly available. The Report must include:
a. A statement of whether, and if so how, the platform’s terms of use define hate speech, racism,
extremism, radicalization, disinformation, misinformation, harassment, and foreign political
interference, § 22677(a)(3);
b. The current version of the terms of service and a “complete and detailed description of any
changes” to the terms since any previous report, § 22677(a)(2);
c. A “detailed description” of the platform’s “content moderation practices,” including, but not
limited to:
i. Any “existing policies intended to address the categories of content described in
[§ 22677(a)(3)],” § 22677(a)(4)(A);
ii. How “automated content moderation systems enforce” the platform’s terms of service
and “when these systems involve human review,” § 22677(a)(4)(B);
iii. How the “company responds to user reports of violations of the terms of service,”
§ 22677(a)(4)(C); and
iv. How the “company would remove individual pieces of content, users, or groups that
violate the terms of service, or take broader action against” individual or groups of
users “that violate the terms of service,” § 22677(a)(4)(D);
d. Information regarding “content that was flagged by the social media company as content
belonging to any of the categories described in [§ 22677(a)(3)]” including:
i. The “total number of flagged items of content,” § 22677(a)(5)(A)(i);
ii. The “total number of actioned items of content,” § 22677(a)(5)(A)(ii);
iii. The “total number of actioned items of content that resulted in action taken by the
social media company against the user or group of users responsible for the content,”
§ 22677(a)(5)(A)(iii);
iv. The “total number of actioned items of content that were removed, demonetized, or
deprioritized” by the company, § 22677(a)(5)(A)(iv);
v. The “number of times actioned items of content were viewed by users,”
§ 22677(a)(5)(A)(v);
vi. The “number of times actioned items of content were shared, and the number of users
that viewed the content before it was actioned,” § 22677(a)(5)(A)(vi); and
vii. The “number of times users appealed” company actions “taken on that platform and
the number of reversals of social media company actions on appeal disaggregated by
each type of action,” § 22677(a)(5)(A)(vii);
e. All of the information required by § 22677(a)(5)(A) “disaggregated” by:
i. The “category of content, including any relevant categories described in [§
22677(a)(3)],” § 22677(a)(5)(B)(i);
ii. The “type of content” (e.g., “posts, comments, messages”), § 22677(a)(5)(B)(ii);
iii. The “type of media of the content” (e.g., “text, images, and videos,”),
§ 22677(a)(5)(B)(iii);
iv. How “the content was flagged” (e.g., “flagged by company employees or contractors,
flagged by artificial intelligence software, flagged by community moderators, flagged
by civil society partners, and flagged by users”), § 22677(a)(5)(B)(iv); and
v. How “the content was actioned” (e.g., “actioned by company employees or contractors,
actioned by artificial intelligence software, actioned by community moderators,
actioned by civil society partners, and actioned by users”), § 22677(a)(5)(B)(v).
31.
The Terms of Service Report is also an impermissible attempt by the State to inject
itself into the content moderation editorial process. By requiring detailed disclosures about how
controversial content is moderated, the State is impermissibly compelling social media companies
to make disclosures about how their editorial processes work. This will inevitably lead to
controversy generated by the way the State has required the debate to be framed. By compelling
social media companies to make politically-charged disclosures about
content moderation, the State is impermissibly trying to generate public controversy about content
moderation in a way that will pressure social media companies, such as X Corp., to restrict, limit,
disfavor, or censor certain constitutionally-protected content on X that the State dislikes.
32.
These forced disclosures violate the First Amendment of the U.S. Constitution and
Article I, Section 2, of the California Constitution.
c. Penalties
33.
AB 587 also sets forth a penalty scheme under which social media companies may
be fined $15,000 per violation per day, and may be enjoined in any court of competent jurisdiction
by Attorney General Bonta or by a “city attorney,” if the company (i) “[f]ails to post terms of
service in accordance with Section 22676,” (ii) “[f]ails to timely submit” a Terms of Service Report,
or (iii) “[m]aterially omits or misrepresents required information in a” Terms of Service Report.
§ 22678. AB 587’s penalty provision further instructs that the court shall, “[i]n assessing the
amount of a civil penalty pursuant to paragraph . . . consider whether the social media company has
made a reasonable, good faith attempt to comply with the provisions of this chapter.” Id.
II. The Topics About Which AB 587 Forces X Corp. to Speak are Extremely Controversial and Inherently Subject to Disagreement
34.
As the legislative history makes clear, AB 587 compels disclosure from social media
companies about the most controversial types of content moderation. While there may be a general
public consensus that some types of constitutionally-unprotected content (e.g., child pornography
or threats of physical harm) should be limited on social media platforms to the extent possible, AB
587 focuses primarily on categories of content (e.g., hate speech, racism, extremism,
misinformation, political interference, and harassment) for which there is no such “general public
consensus.” Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
35.
How to define and regulate these categories of content is a politically-charged and
controversial undertaking. That is because, as the California Assembly Committee on Privacy and
Consumer Protection Report about AB 587 candidly acknowledged, these categories are “far more
difficult to reliably define, and assignment of their boundaries is often fraught with political bias.”
Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
36.
As a result, “the companies responsible for managing social media platforms are
faced with a complex dilemma regarding content moderation.” Id. No matter what they do when
regulating these controversial categories of content, they are likely to be criticized. “In such cases,
both action and inaction by these companies seems to be equally maligned: too much moderation
and accusations of censorship and suppressed speech arise; too little, and the platform risks
fostering a toxic, sometimes dangerous community.” Id.
37.
Because these controversial categories are “difficult to reliably define” and
“assignment of their boundaries is often fraught with political bias,” id., decisions about how to
moderate such content are inherently political. As a result, views about whether there is too much
or too little moderation of these types of content are controversial political questions.
38.
To take just a few examples of how controversial and politically-charged these questions can be:
Some people view speech intentionally misgendering a transgender individual as
“hate speech” and harassment. See National Institutes of Health, Gender Pronouns
Resource, U.S. Department of Health and Human Services (Mar. 8, 2023),
39.
As these examples make clear, AB 587 focuses on the most controversial and
politically-charged categories of content moderation. It forces social media companies to speak
publicly and take a position on these controversial topics, notwithstanding that doing so will almost
always result in public criticism from one political group or another. As the California Assembly’s
Committee on Privacy and Consumer Protection Report makes clear, “both action and inaction by
these [social media] companies [in regulating these controversial categories of content] seems to
be equally maligned.” Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
40.
The problem with this from a First Amendment standpoint is obvious. AB 587
forces X Corp., under threat of substantial civil penalty, to publish whether and, if so, how it defines,
and whether and, if so, how it moderates, (i) hate speech, (ii) racism, (iii) extremism, (iv)
radicalization, (v) disinformation, (vi) misinformation, (vii) harassment, and (viii) foreign political
interference — categories of speech that are almost entirely constitutionally-protected. Defining
these controversial categories and making decisions about how or whether to moderate them is an
exercise “fraught with political bias,” that is likely to result in controversy, no matter what the
social media companies do. Id. The First Amendment does not permit the government to compel
social media companies to publicly take positions on these controversial political topics against
their will. Wooley, 430 U.S. at 714 (the First Amendment affords X Corp. “both the right to speak
freely and the right to refrain from speaking at all”); Riley, 487 U.S. at 797 (same); Agency for Int’l
Dev., 570 U.S. at 213 (the First Amendment “prohibits the government from telling people what
they must say”). This is particularly true given the highly controversial and politically-charged
nature of questions about content moderation.
41.
And by requiring X Corp. to disclose whether and, if so, how it regulates these
controversial and difficult-to-define categories of content, AB 587 impermissibly attempts to
pressure X Corp. to adopt — and regulate — these categories of content, even if X Corp. would
prefer to categorize content differently. For example, X Corp. does not currently regulate “hate
speech,” “racism,” or “extremism,” which are three categories of conduct that AB 587 forces social
media companies to publicly address. But X Corp. does regulate “hateful conduct,” a category of
content that may include content that some people might argue constitutes “hate speech,” “racism,”
or “extremism.” See Safety and Cybercrime: Hateful Conduct, X Corp. (last accessed Sept. 8,
2023), https://help.twitter.com/en/rules-and-policies/hateful-conduct-policy. While AB 587 does
not force X Corp. to adopt and regulate these three categories, it intentionally attempts to pressure
X Corp. to do so in an insidious and impermissible way. Because the Terms of Service Report
forces disclosures about content moderation to cover the controversial topics that the State has
defined, social media companies like X Corp. will feel pressure to indicate that they take steps to
regulate such content, rather than be subject to a pronouncement that, for example, they do not
regulate “hate speech” or “racism” at all. AB 587 was passed with the intent of pressuring social
media companies to do just that. It is, in short, an attempt by the State to impermissibly frame the
debate about content moderation in a way that pressures social media companies to regulate
constitutionally-protected content that the State finds objectionable.
42.
This type of compelled speech cannot pass constitutional muster. An analogy helps
illustrate the point. Imagine, for instance, a state law that, in the interest of transparency, forced
individuals running for President of the United States to publish their positions on several “hot
button” political issues (e.g., whether abortion should be permitted in cases of incest and rape or
whether former President Trump committed any crimes) on which, in the government’s view, most
political candidates were avoiding stating their positions. Even if the law did not mandate
politicians to take any particular position, but simply required them, under the threat of civil
penalties, to publicly spell out their position (or lack thereof) on those controversial issues as part
of an effort to “pressure” them to adopt a certain political view, there can be little doubt that such
a law would trigger heightened scrutiny under the First Amendment. AB 587 is no different.
43.
The Attorney General has suggested that AB 587 involves merely compelled
commercial speech of “purely factual and uncontroversial information,” and is thus subject to the
more relaxed standard of review under Zauderer v. Office of Disciplinary Counsel, 471 U.S. 626,
651 (1985). That is simply untrue. As the legislative history of AB 587 makes plain, the categories
of speech that are focused on by AB 587 are controversial to define and controversial to moderate.
And the speech that is compelled about the definitions of these categories and application of content
moderation rules is not commercial; rather, it is core political speech about controversial questions
on which reasonable minds can disagree. What is more, AB 587’s compelled disclosure is not
purely factual because it may mislead consumers — that is, if a company submits a Terms of
Service Report explaining that it does not moderate the controversial categories of content required
by the State (because, instead, the company moderates categories of content pursuant to its own
policies and terminology), there is a high likelihood that consumers could be misled to believe that
the company is not moderating content sufficiently, even if that is not the case. As such, Zauderer
does not apply.
44.
Even if Zauderer did apply, AB 587 fails to satisfy even the more relaxed test set
forth in that case because the legislation is unduly burdensome and unjustified. Id. at 651. As
noted in the legislative history of AB 587, “the largest social media platforms are faced with
thousands, if not millions of similarly difficult decisions related to content moderation on a daily
basis.” Ex. 2 at 4. This enormous burden, of course, is the reason why the statute limits itself to
social media companies with over $100 million in gross revenue from the preceding year. Id. at 7.
Indeed, the legislative history for AB 587 describes the amount of content received by many of
these platforms as “enormous.” Id.
III. AB 587 Will Suppress Speech Based on Content & Viewpoint
45.
According to California Governor Gavin Newsom, AB 587 does nothing more than
“pull back the curtain” and provide “transparency” as to the already-existent content moderation
policies of social media companies. See, e.g., Press Release, Governor Newsom Signs Nation-Leading Social Media Transparency Measure (Sept. 13, 2022), https://www.gov.ca.gov/2022/09/13/governor-newsom-signs-nation-leading-social-media-transparency-measure/.
46.
Governor Newsom’s claim is belied by the record. Even if AB 587 uses
“transparency” as its effectuating mechanism, it does so for the purpose of censoring particular
viewpoints with respect to eight content categories in § 22677(a)(3). And AB 587 will do so
successfully, given its amorphous penalty scheme (it is entirely unclear what constitutes a
“reasonable, good faith attempt to comply”) that can and will likely be used to silence viewpoints
with which Attorney General Bonta and the State of California disagree. See Sorrell v. IMS Health
Inc., 564 U.S. 552, 565 (2011) (applying “heightened scrutiny” to law intended to suppress speech
“in conflict with the goals of the state”).
47.
The legislative record is abundantly clear: according to AB 587’s authors, sponsors,
and supporters, the law’s true purpose and desired effect are to use the compelled disclosures to
pressure social media companies into regulating and censoring constitutionally-protected content
that the government believes is undesirable. For instance:
Within the April 27, 2021 Assembly Committee on Judiciary Hearing Report for AB
587, lead bill author Jesse Gabriel stated that AB 587 is an “important first step” in
ensuring that “social media companies [] moderate or remove hateful or
incendiary content” on their platforms. He hoped that AB 587 would “pressure them”
to “eliminate hate speech and disinformation.” Ex. 1 at 4 (emphasis added);
The official bill comments accompanying AB 587’s May 24, 2021 Assembly Floor
Analysis applauded the bill’s “unique, data driven approach” to “content
moderation on social media.” Ex. 5 (Cal. Assemb. Analysis, 2021-22 Sess. (AB
587), May 24, 2021) at 2 (emphasis added);
The July 13, 2021 Senate Judiciary Committee Hearing Report for AB 587 cites to
comments from official bill sponsor ADL that emphasized that the law will allow
“policymakers [to] take meaningful action to decrease online hate and
extremism.” Ex. 6 (Cal. Sen. Judiciary Report, 2021–22 Sess. (AB 587), July 13,
2021) at 13 (emphasis added);
In July 2020, California Senator Scott Wiener, who ultimately co-authored AB 587,
tweeted, “Social media platforms have a moral obligation—& need to have a legal
obligation—not to become engines for violent hate speech.” Senator Scott Wiener (June 14, 5: PM EST),
48.
It is clear, moreover, that AB 587 aims to censor particular viewpoints espoused on
social media platforms with respect to the eight categories of content in § 22677(a)(3). For instance,
within the April 27, 2021 Assembly Committee on Judiciary Hearing Report for AB 587, bill author
Jesse Gabriel cited a “study of Twitter posts” that supposedly found that “the greater proportion of
tweets related to race- and ethnicity-based discrimination in a given city, the more hate crimes were
occurring in that city.” Ex. 1 at 4; Request for Judicial Notice, Ex. 1 at 4, Minds Inc., No. 23-cv-
2705 (ECF 23-3) (C.D. Cal. May 25, 2023) (citing same).
49.
Similarly, the June 28, 2022 Senate Judiciary Committee Hearing Report for AB
587 cites a study finding that a third of individuals who “experience online harassment . . . attribute
at least some harassment to their identity . . . affecting the ability of already marginalized
communities to be safe in digital spaces.” Ex. 7 (Cal. Sen. Judiciary Report, 2021–22 Sess. (AB
587), June 28, 2022) at 8; see also id. at 18 (Los Angeles County Democratic Party noting its
support for AB 587 because social media companies “enable[] the micro targeting of vulnerable
individuals.”).
50.
That Attorney General Bonta plans to use threats of enforcement of AB 587 to
pressure the social media companies to regulate speech that the government does not like is not in
doubt. In fact, he has already done so. Less than two months after AB 587’s enactment, the
Attorney General reminded X Corp. and other social media companies of their “responsibility” to
combat what the Attorney General views as the “dissemination of disinformation that interferes
with our electoral system,” while simultaneously reminding them that the “California Department
of Justice will not hesitate to enforce” AB 587. Letter from Attorney General Robert Bonta to
Twitter, Inc., et al., 4 (Nov. 3, 2022), https://oag.ca.gov/system/files/attachments/press-
51.
That AB 587 grants Attorney General Bonta nearly unfettered discretion to
determine if, in his view, X Corp. has complied with AB 587 in “reasonable, good faith” and grants
him the ability to investigate potential violations (pursuant to Cal. Gov’t Code §§ 11180–81) by
issuing document demands about the social media companies’ content moderation practices
compounds the problem. The end result is that AB 587 violates the First Amendment of the United
States Constitution and Article I, Section 2, of the California Constitution by impermissibly
injecting the State into X Corp.’s constitutionally-protected editorial decisions in a manner that is
designed to pressure X Corp. to regulate constitutionally-protected content in ways that the State
wants.
52.
Because AB 587 forces X Corp. to make disclosures about how it defines and
moderates controversial categories of content as part of an effort to pressure X Corp. to regulate
constitutionally-protected content that the State disfavors, strict scrutiny applies.
53.
AB 587 fails to satisfy strict — or even intermediate — scrutiny. It does not serve
a compelling (or even a substantial) governmental interest, will not directly and materially advance
any such interest, and is not narrowly tailored to further any such interest.
54.
AB 587 purports to further the government’s interest in transparency in content
moderation. But there is no evidence that AB 587’s compelled speech solves any real problem at all. Indeed, X Corp. is already transparent with its users and the public about its content moderation. See Rules and Policies, X Corp. (last accessed Sept. 8, 2023), https://help.twitter.com/en/rules-and-policies. X Corp. supports transparency in content moderation — but not in the impermissible manner prescribed by AB 587. There is no compelling, substantial, or important governmental interest in compelling transparency through the type of governmental oversight that AB 587 prescribes.
55.
The government’s other stated interest in passing AB 587 – to “pressure” social
media companies to “become better corporate citizens by doing more to eliminate hate speech and
disinformation,” Ex. 1 at 4; see also Mot. to Dismiss at 15–16, Minds, Inc., et al. v. Bonta, No. 23-
cv-2705 (ECF 23-1) (C.D. Cal. May 25, 2023), is not compelling, substantial, or important. State
pressure to moderate constitutionally-protected content that the State disfavors is an illegal,
unconstitutional, and illegitimate purpose.
56.
AB 587 also fails to satisfy heightened scrutiny because there is no evidence that it
will “directly and materially” advance a compelling, substantial, or important governmental
interest. Likewise, it regulates much more speech than is necessary to achieve any compelling,
substantial, or important goal.
IV. AB 587 Excessively and Unduly Burdens X Corp.’s Nationwide and Global Activities
57.
In addition to forcing X Corp. to speak about highly controversial topics about which
it does not wish to speak, AB 587 establishes an unduly burdensome disclosure requirement regime,
under the threat of significant financial penalties and civil suits.
58.
The undue burden that AB 587 places on X Corp. is compounded by the law’s failure
to limit the Terms of Service Report’s information requirements to information about Californians.
To be clear, this was an intentional decision, made to ensure that the AB 587 “will have national
implications.”4 But AB 587 does not stop there. Regulations like AB 587, which govern global
social media platforms’ publication of content, are inherently international in scope. AB 587
applies just as much to posts by Texans, in Texas, or Australians, in Australia, as it does to
Californians, in California. And even if a post was made by a Californian, in California, it is
available globally and viewed globally.
59.
The burden AB 587 places on X Corp. is even further compounded by the sheer
volume of posts that occur on X. As of August 2022, 6,000 posts are sent on X every second;
350,000 posts are sent every minute; 500 million posts are sent every day; and 200 billion posts are
sent every year. David Sayce, The Number of Tweets Per Day in 2022 (last visited Sept. 6, 2023),
https://www.dsayce.com/social-media/tweets-

4 Press Release, Office of Assembly member Jesse Gabriel, After Two-Year Fight, Governor Newsom Signs Landmark Social Media Transparency Bill (Sept. 13, 2022), https://a46.asmdc.org/press-releases/20220913after-two-year-fight-governor-newsom-signs-landmark-social-media.
60.
In light of this volume, and in light of the specificity and depth of information
required to comply with AB 587 — i.e., at least 161 categories of information must be disclosed
bi-annually5 — the tremendous burden AB 587 creates for X Corp. is clear. X Corp. will be forced
to expend significant monetary and employee resources to comply with AB 587, all for the
unconstitutional and unjustifiable goal of moderating particular content and viewpoints with which
Attorney General Bonta and the State of California disagree.
FIRST CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Violations of the
First Amendment to the United States Constitution (42 U.S.C. § 1983) and Article I, Section
2, of the California Constitution)
61.
X Corp. realleges and incorporates herein by reference Paragraphs 1 through 60 above.
62.
AB 587 violates the First Amendment to the United States Constitution and Article
I, Section 2, of the California Constitution by compelling X Corp. to divulge publicly, under the
threat of significant financial penalty and civil suit, its opinions and confidential editorial processes
with respect to extremely controversial subject matters. Tornillo, 418 U.S. at 258 (editorial control
and judgment protected from government regulation by First Amendment).6 Each of the content
categories about which X Corp. is compelled to speak — hate speech, racism, extremism, radicalization, disinformation, misinformation, harassment, and foreign political interference, § 22677(a)(3) — is “anything but an ‘uncontroversial’ topic.” Nat’l Inst. of Fam. & Life Advocs. v. Becerra (“NIFLA”), 138 S. Ct. 2361, 2372 (2018); see also Wooley, 430 U.S. at 714 (“First Amendment” protects “both the right to speak freely and the right to refrain from speaking at all.”). The same is true about how such categories of speech are defined and policed on X.

5 See Eric Goldman, Will California Clone-and-Revise Some Terrible Ideas from Florida/Texas’ Social Media Censorship Laws? (Analysis of CA AB587), Technology & Marketing Law Blog (June 21, 2022), https://blog.ericgoldman.org/archives/2022/06/will-california-clone-and-revise-some-terrible-ideas-fromflorida-texas-soc (“All told, there are 7 categories of disclosures, and the bill indicates that the disclosure categories have, respectively, 5 options, at least 5 options, at least 3 options, at least 5 options, and at least 5 options. So I believe the bill requires that each service’s reports should include no less than 161 different categories of disclosures (7×5+7×5+7×3+7×5+7×5).”).

6 AB 587 violates Article I, Section 2, of the California Constitution for all of the same reasons that it violates the First Amendment to the United States Constitution. See, e.g., City of Montebello v. Vasquez, 1 Cal. 5th 409, 421 n.11 (2016) (“[T]he California liberty of speech clause is broader and more protective than the free speech clause of the First Amendment.”); Delano Farms Co. v. California Table Grape Com., 4 Cal. 5th 1204, 1221 (2018) (“[O]ur case law interpreting California’s free speech clause has given respectful consideration to First Amendment case law for its persuasive value[.]”).
63.
The deep-seated controversy inherently tied to each topic in § 22677(a)(3) makes
crystal clear that AB 587’s compelled disclosures are not the sort of purely factual and
uncontroversial disclosures that fit within the narrow exception to strict scrutiny set forth under
Zauderer. Strict scrutiny thus applies because AB 587 “compel[s]” X Corp. to “speak a particular
message,” which necessarily “alters the content of” its speech. NIFLA, 138 S. Ct. at 2371 (quoting
Riley, 487 U.S. at 795 (emphasis added)). AB 587 fails to satisfy strict (or even intermediate)
scrutiny.
64.
Moreover, “[f]ormal legislative findings accompanying” AB 587 make clear its
illicit “purpose and practical effect,” Sorrell, 564 U.S. at 565, which is to regulate speech on X
“based on ‘the topic discussed or the idea or message expressed.’” City of Austin, Texas v. Reagan
Nat’l Advert. of Austin, LLC, 142 S. Ct. 1464, 1474 (2022) (citing Reed v. Town of Gilbert, Ariz.,
576 U.S. 155, 171 (2015)). Indeed, the official legislative record demonstrates that AB 587 was
intended to pressure X Corp. to censor particular viewpoints on X that the government deems
undesirable, notwithstanding that such editorial “decisions about what content to include” on X are
“protected by the First Amendment.” O’Handley, 579 F. Supp. 3d at 1186–87, aff’d sub nom. on
other grounds, 62 F.4th 1145; see also Sorrell, 564 U.S. at 565 (applying “heightened scrutiny” to
law intended to suppress speech “in conflict with the goals of the state”); Reed, 576 U.S. at
(“Because strict scrutiny applies either when a law is content based on its face or when the purpose
and justification for the law are content based, a court must evaluate each question before it
concludes that the law is content neutral and thus subject to a lower level of scrutiny.”).
65.
Here, as in Sorrell, the California legislature’s “expressed statement of purpose”
demonstrates that AB 587 “imposes burdens . . . aimed at a particular viewpoint.” Sorrell, 564 U.S.
at 565; see also id. at 578–79 (“[A] State’s failure to persuade does not allow it to hamstring the
opposition. The State may not burden the speech of others in order to tilt public debate in a
preferred direction.”). It is hornbook law that such “ideologically driven attempts to suppress a
particular point of view are presumptively unconstitutional.” Rosenberger v. Rector & Visitors of
Univ. of Virginia, 515 U.S. 819, 830 (1995) (internal quotation omitted).
66.
This, the First Amendment does not permit, absent a compelling state interest and
means that are narrowly tailored to achieve that interest, since “subtler forms of discrimination that
achieve identical results based on function or purpose” do not escape strict scrutiny. City of Austin,
142 S. Ct. at 1474 (citing Reed, 576 U.S. at 159–60, 163–64).
67.
AB 587 also violates the First Amendment of the United States Constitution and
Article I, Section 2, of the California Constitution because it would impermissibly interfere with
the constitutionally-protected editorial judgment of social media companies. See, e.g., Herbert,
441 U.S. at 174 (a law that “subjects the editorial process to private or official examination merely
to satisfy curiosity or to serve some general end such as the public interest . . . would not survive
constitutional scrutiny as the First Amendment is presently construed”); Tornillo, 418 U.S. at 258
(“The choice of material to go into a newspaper, and the decisions made as to limitations on the
size and content of the paper, and treatment of public issues and public officials — whether fair or
unfair — constitute the exercise of editorial control and judgment. It has yet to be demonstrated
how governmental regulation of this crucial process can be exercised consistent with First
Amendment guarantees of a free press as they have evolved to this time.”).
68.
There is a bona fide and actual controversy between X Corp. and Attorney General
Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587,
even though it violates the First Amendment to the United States Constitution and Article I, Section
2, of the California Constitution.
69.
X Corp. maintains that AB 587 is illegal and unconstitutional. Attorney General
Bonta claims otherwise.
70.
X Corp. requests a judicial determination regarding the validity of AB 587 to prevent
the harm caused by its enactment. Such a determination is both necessary and appropriate to avoid
the deprivation of X Corp.’s constitutional rights, which would occur if AB 587 is applied to X Corp.
71.
In light of the violation of the First Amendment to the United States Constitution
and Article I, Section 2, of the California Constitution, X Corp. seeks preliminary and permanent
injunctive relief against enforcement of AB 587. X Corp. would be irreparably harmed if it were
forced to comply with AB 587’s requirements and has no adequate remedy at law.
SECOND CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Violation of the
Dormant Commerce Clause of the United States Constitution (42 U.S.C. § 1983))
72.
X Corp. realleges and incorporates herein by reference Paragraphs 1 through 71 above.
73.
AB 587 violates the Dormant Commerce Clause of the United States Constitution
because “the burden” it “impose[s] on interstate commerce” is “clearly excessive in relation to [its]
putative local benefits.” Nat’l Pork Producers Council v. Ross, 143 S. Ct. 1142, 1157 (2023)
(quoting Pike v. Bruce Church, Inc., 397 U.S. 137, 142 (1970)).
74.
AB 587 imposes significant burdens on interstate commerce because it “regulat[es]
[] activities that are inherently national or require a uniform system of regulation.” Nat’l Pork
Producers Council v. Ross, 6 F.4th 1021, 1031 (9th Cir. 2021), aff’d, 143 S. Ct. 1142 (internal
quotation omitted). Those burdens, moreover, are “clearly excessive in relation to the putative
local benefits.” Id. at 1026 (quoting Pike, 397 U.S. at 142). That is, AB 587 places burdens on
interstate commerce — by regulating global platform content — that far exceed the alleged benefit
of increased social media transparency for Californians.
75.
Both the intent and the likely effect of AB 587 are to impact and change how content
is moderated on social media platforms nationwide, not only in California.
76.
There is a bona fide and actual controversy between X Corp. and Attorney General
Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587,
even though it violates the Dormant Commerce Clause of the United States Constitution.
77.
X Corp. maintains that AB 587 is illegal and unconstitutional. Attorney General
Bonta claims otherwise.
78.
X Corp. requests a judicial determination regarding the validity of AB 587 to prevent
the harm caused by its enactment. Such a determination is both necessary and appropriate to avoid
the deprivation of X Corp.’s constitutional rights, which would occur if AB 587 is applied to X Corp.
79.
In light of the violation of the Dormant Commerce Clause of the United States
Constitution, X Corp. seeks permanent injunctive relief against enforcement of AB 587. X Corp.
would be irreparably harmed if it were forced to comply with AB 587’s requirements and has no
adequate remedy at law.
THIRD CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Immunity Under
and Preemption by 47 U.S.C. § 230(c)(2))
80.
X Corp. realleges and incorporates herein by reference Paragraphs 1 through 79 above.
81.
47 U.S.C. § 230(c)(2) preempts AB 587 due to the direct conflict between them.
82.
AB 587 imposes liability on X Corp. with regard to how it moderates content on X.
This directly contravenes the immunity granted under § 230(c)(2), which provides that “[n]o
provider or user of an interactive computer service shall be held liable on account” of “any action
voluntarily taken in good faith to restrict access to or availability of material that the provider or
user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise
objectionable, whether or not such material is constitutionally protected.”
83.
X is an “interactive computer service,” as that term is defined under Section
230(c)(2).
84.
The immunity provided by Section 230(c)(2) is very broad. It precludes liability for
“any action” taken voluntarily in good faith by an interactive computer service to restrict access to
certain content. AB 587 imposes such liability, however, if actions are taken by interactive
computer services to restrict access to content without certain requisite disclosures.
85.
The phrase “any action” in Section 230(c)(2) means “any action” — including any
actions taken in good faith to restrict access to content without the public disclosures mandated by
AB 587.
86.
AB 587 also contravenes the immunity provided to X Corp. by Section 230(c)(2)
because Attorney General Bonta may penalize X Corp. if he determines, in his unfettered discretion
as to what constitutes “misrepresent[ation]” and “reasonable, good faith,” that X Corp. is
“restrict[ing] access” to content in a way that, in Attorney General Bonta’s view, is contrary to X
Corp.’s promulgated content moderation policies.
87.
There is a bona fide and actual controversy between X Corp. and Attorney General
Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587,
even though such enforcement is precluded and preempted by 47 U.S.C. § 230(c)(2).
88.
X Corp. maintains that AB 587 is invalid and void as a matter of law. Attorney
General Bonta claims otherwise.
89.
X Corp. seeks a declaratory judgment that AB 587 is legally invalid and
unenforceable because it is precluded and preempted by 47 U.S.C. § 230(c)(2).
90.
In light of the violation of Section 230(c)(2), X Corp. seeks preliminary and
permanent injunctive relief against enforcement of AB 587. X Corp. would be irreparably harmed
if it were forced to comply with, or litigate, AB 587’s requirements and has no adequate remedy at
law.
PRAYER FOR RELIEF
WHEREFORE, X Corp. respectfully requests that this Court enter judgment in X Corp.’s
favor and grant the following relief:
1.
A declaration that AB 587 violates the First Amendment of the United States
Constitution and Article I, Section 2, of the California Constitution;
2.
A declaration that AB 587 violates the Dormant Commerce Clause of the United
States Constitution;
3.
A declaration that the imposition of civil penalties under AB 587 is precluded and
preempted by 47 U.S.C. § 230(c)(2) and is therefore null and void and has no legal effect;
4.
A preliminary and permanent injunction enjoining Attorney General Bonta from
enforcing AB 587 against X Corp.;
5.
An award of fees, costs, expenses, and disbursements, including attorneys’ fees, to
which X Corp. is entitled pursuant to 42 U.S.C. § 1988 and other applicable law; and
6.
Such other and further relief as the Court deems just and proper.
DEMAND FOR JURY TRIAL
Pursuant to Federal Rule of Civil Procedure 38, X Corp. demands a trial by jury in this
action of all issues so triable.
Dated: September 8, 2023
By: /s/ William R. Warne
DOWNEY BRAND LLP
William R. Warne (SBN 141280)
Meghan M. Baker (SBN 243765)
621 Capitol Mall, 18th Floor
Sacramento, CA 95814
Phone: 916-444-1000
Facsimile: 916-444-2100
CAHILL GORDON & REINDEL LLP
Joel Kurtzberg (pro hac vice pending)
Floyd Abrams (pro hac vice pending)
Jason Rozbruch (pro hac vice pending)
Lisa J. Cole (pro hac vice pending)
32 Old Slip
New York, NY 10005
Phone: 212-701-3120
Facsimile: 212-269-5420
jkurtzberg@cahill.com
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 1 of 35
1
2
3
4
5
6
7
8
9
10
CAHILL GORDON & REINDEL LLP
JOEL KURTZBERG (pro hac vice pending, SBN NY 1758184)
FLOYD ABRAMS (pro hac vice pending, SBN NY 2835007)
JASON ROZBRUCH (pro hac vice pending, SBN NY 5753637)
LISA J. COLE (pro hac vice pending, SBN NY 5927744)
32 Old Slip
New York, New York 10005
Phone: 212-701-3120
Facsimile: 212-269-5420
jkurtzberg@cahill.com
DOWNEY BRAND LLP
WILLIAM R. WARNE (Bar No. 141280)
bwarne@downeybrand.com
MEGHAN M. BAKER (Bar No. 243765)
mbaker@downeybrand.com
621 Capitol Mall, 18th Floor
Sacramento, California 95814
Telephone: 916.444.1000
Facsimile: 916.444.2100
Attorneys for Plaintiff X Corp.
11
UNITED STATES DISTRICT COURT
12
EASTERN DISTRICT OF CALIFORNIA
13
SACRAMENTO DIVISION
14
15
X CORP.,
Case No.
Plaintiff,
16
v.
17
18
COMPLAINT FOR DECLARATORY AND
INJUNCTIVE RELIEF
ROBERT A. BONTA, Attorney General of
California, in his official capacity,
DEMAND FOR JURY TRIAL
Defendant.
19
20
Plaintiff X Corp., by and through its attorneys, Cahill Gordon & Reindel LLP and Downey
21
Brand LLP, alleges for its complaint against Defendant ROBERT A. BONTA, in his official
22
capacity as Attorney General of California, as follows:
23
24
COMPLAINT
1
PDF Page 3
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 2 of 35
1
2
NATURE OF THE ACTION
1.
Plaintiff X Corp. brings this action challenging the constitutionality and legal
3
validity of California Assembly Bill No. 587 (“AB 587”), which is codified in law at Cal. Bus. &
4
Prof. Code §§ 22675–22681.
5
2.
AB 587 requires large social media companies like X Corp. to (1) post terms of
6
service dictated by the government and include terms about how content is moderated on their
7
platforms (the “Terms of Service Requirement”) and (2) submit, on a semi-annual basis, to the
8
California Attorney General a “terms of service report” that includes, among other things, (a) “a
9
detailed description of content moderation practices used by the social media company for that
10
platform”; (b) information about whether, and if so how, the social media company defines and
11
moderates (i) hate speech or racism, (ii) extremism or radicalization, (iii) disinformation or
12
misinformation, (iv) harassment, and (v) foreign political interference; as well as (c) information
13
and statistics about actions taken by the social media company to moderate these categories of
14
content (the “Terms of Service Report”).
15
3.
AB 587 violates the First Amendment of the United States Constitution and Article
16
I, Section 2, of the California Constitution because it compels companies like X Corp. to engage in
17
speech against their will, impermissibly interferes with the constitutionally-protected editorial
18
judgments of companies such as X Corp., has both the purpose and likely effect of pressuring
19
companies such as X Corp. to remove, demonetize, or deprioritize constitutionally-protected speech
20
that the State deems undesirable or harmful, and places an unjustified and undue burden on social
21
media companies such as X Corp.
22
4.
The State of California touts AB 587 as a mere “transparency measure” under which
23
certain social media companies must make their content moderation policies and statistics publicly
24
available. See Press Release, Governor Newsom Signs Nation-Leading Social Media Transparency
COMPLAINT
2
PDF Page 4
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 3 of 35
1
Measure (Sept. 13, 2022), https://www.gov.ca.gov/2022/09/13/governor-newsom-signs-nation-
2
leading-social-media-transparency-measure/. Yet, a review of the law’s purpose and likely effect
3
— as evidenced by the legislative history and statements from AB 587’s author, sponsors, and
4
supporters — demonstrates otherwise. As made clear by both the legislative history and public
5
court submissions from the Attorney General in defending the law, the true intent of AB 587 is to
6
pressure social media platforms to “eliminate” certain constitutionally-protected content viewed by
7
the State as problematic. Ex. 1 (Cal. Assemb. Comm. on Judiciary Report, 2021–22 Sess. (AB
8
587), Apr. 27, 2021) at 4 (“if social media companies are forced to disclose what they do in this
9
regard [i.e., how they moderate online content], it may pressure them to become better corporate
10
citizens by doing more to eliminate hate speech and disinformation.”); Mot. to Dismiss at 15–16,
11
Minds, Inc., et al. v. Bonta, No. 23-cv-2705 (ECF 23-1) (C.D. Cal. May 25, 2023) (“[T]he
12
Legislature also considered that, by requiring greater transparency about platforms’ content-
13
moderation rules and decisions, AB 587 may result in public pressure on social media companies
14
to ‘become better corporate citizens by doing more to eliminate hate speech and disinformation’ on
15
their platforms. . . . This, too, is a substantial state interest.”). AB 587 is, according to the law’s
16
lead author, Assembly Member Jesse Gabriel, an “important first step in protecting our democracy
17
from the dangerously divisive content that has become all too common on social media.” Ex. 1 at
18
4. The legislative record is crystal clear that one of the main purposes of AB 587 — if not the main
19
purpose — is to pressure social media companies to eliminate or minimize content that the
20
government has deemed objectionable.
21
5.
The topics that AB 587 forces social media platforms to speak about against their
22
will are highly controversial and politically charged. As the legislative history acknowledges, the
23
categories of speech on which AB 587 focuses are those that are difficult to define because their
24
boundaries are “often fraught with political bias.” Ex. 2 (Cal. Assemb. Comm. on Privacy and
COMPLAINT
3
PDF Page 5
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 4 of 35
1
Consumer Protection Report, 2021–22 Sess. (AB 587), Apr. 22, 2021) at 4. And social media
2
companies are frequently criticized, no matter what they do, by individuals on both sides of the
3
political aisle, for their editorial decisions about speech that arguably falls into these ill-defined
4
categories. The Assembly Reports from the Committee on Privacy and Consumer Protection
5
describe the “complex dilemma” that social media companies increasingly find themselves in when
6
trying to define and moderate the politically-charged and controversial categories of content that
7
are the focus of AB 587:
8
14
As online social media become increasingly central to the public discourse, the
companies responsible for managing social media platforms are faced with a
complex dilemma regarding content moderation, i.e., how the platforms determine
what content warrants disciplinary action such as removal of the item or banning of
the user. In broad terms, there is a general public consensus that certain types of
content, such as child pornography, depictions of graphic violence, emotional abuse,
and threats of physical harm are undesirable, and should be mitigated on these
platforms to the extent possible. Many other categories of information, however,
such as hate speech, racism, extremism, misinformation, political interference,
and harassment [i.e., the categories that are the focus of AB 587], are far more
difficult to reliably define, and assignment of their boundaries is often fraught
with political bias. In such cases, both action and inaction by these companies
seems to be equally maligned: too much moderation and accusations of
censorship and suppressed speech arise; too little, and the platform risks fostering
a toxic, sometimes dangerous community.
15
Id. (emphasis added); see also Ex. 3 (Cal. Assemb. Floor Analysis, 2021–22 Sess. (AB 587), Apr.
16
28, 2021) at 1–2 (same) (emphasis added); Ex. 4 (Cal. Assemb. Floor Analysis, 2021–22 Sess. (AB
17
587), Aug. 24, 2022) at 2 (same) (emphasis added).
9
10
11
12
13
18
6.
In other words, AB 587 seeks to force social media companies to provide the
19
Attorney General and the public detailed information about how, if at all, they define and moderate
20
the boundaries of the most controversial categories of content — i.e., those categories with no
21
“general public consensus” about how to define or moderate them because their boundaries are
22
“fraught with political bias” and are “difficult to reliably define” — and provide detailed
23
information about what actions they have or have not taken in regulating those controversial
24
categories, even though action or inaction “seems to be equally maligned” by members of the
COMPLAINT
4
PDF Page 6
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 5 of 35
1
public, depending on their political viewpoint. Put another way, through AB 587, the State is
2
compelling social media companies to take public positions on controversial and politically-
3
charged issues. And, because X Corp. must take such positions on these topics as they are
4
formulated by the State, X Corp. is being forced to adopt the State’s politically-charged terms,
5
which is a form of compelled speech in and of itself.
6
7.
AB 587 thus mandates X Corp. to speak about sensitive, controversial topics about
7
which it does not wish to speak in the hopes of pressuring X Corp. to limit constitutionally-protected
8
content on its platform that the State apparently finds objectionable or undesirable. This violates
9
the free speech rights granted to X Corp. under the First Amendment to the United States
10
11
Constitution and Article I, Section 2, of the California Constitution.
8.
The First Amendment affords X Corp. “both the right to speak freely and the right
12
to refrain from speaking at all,” Wooley v. Maynard, 430 U.S. 705, 714 (1977), because “[t]he right
13
to speak and the right to refrain from speaking are complementary components of the broader
14
concept of ‘individual freedom of mind.’” Id.; Riley v. Nat’l Fed’n of the Blind, 487 U.S. 781, 797
15
(1988). See also Agency for Int’l Dev. v. Alliance for Open Soc’y Int’l, Inc., 570 U.S. 205, 213
16
(2013) (the First Amendment “prohibits the government from telling people what they must say”).
17
These protections against compelled speech apply equally to “compelled statements of ‘fact’” and
18
“compelled statements of opinion” because “either form of compulsion burdens protected speech.”
19
Riley, 487 U.S. at 797–98. AB 587 violates X Corp.’s First Amendment right to not speak about
20
controversial topics and to decide for itself what it will say or not say about these topics.
21
9.
Moreover, “[l]ike a newspaper or a news network,” X Corp.’s “decisions about what
22
content to include, exclude, moderate, filter, label, restrict, or promote” on its platform are
23
“protected by the First Amendment,” O’Handley v. Padilla, 579 F. Supp. 3d 1163, 1186–87 (N.D.
24
Cal. 2022), aff’d sub nom. on other grounds, O’Handley v. Weber, 62 F.4th 1145 (9th Cir. 2023).
COMPLAINT
5
PDF Page 7
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 6 of 35
1
AB 587 also violates the First Amendment because it has the purpose and intended chilling effect
2
of pressuring social media companies into limiting or censoring constitutionally-protected content
3
that the State finds objectionable. AB 587 would also impermissibly inject the government into X
4
Corp.’s editorial judgments — e.g., by dictating the contents of X Corp.’s Terms of Service and
5
compelling controversial disclosures about how X Corp. moderates content on its platform as part
6
of an effort to pressure social media companies to regulate content in a manner desired by the State,
7
as opposed to allowing the social media companies to regulate content on their platforms as they
8
deem fit.
9
10.
The First Amendment unequivocally prohibits this kind of interference with a
10
traditional publisher’s editorial judgment. For example, if a law were to require newspapers to
11
promulgate Terms of Service that disclose (i) a time by which they will respond to requests to
12
publish letters to the editor or opinion pieces; (ii) detailed disclosures about their criteria for
13
publication and statistics about the bases for decisions regarding whether to publish letters to the
14
editor or opinion pieces; and (iii) statistics about how many submissions were accepted and rejected
15
on the ground that they contained “hate speech” or “misinformation,” it would undoubtedly violate
16
the First Amendment because it would impermissibly interfere with the constitutionally-protected
17
editorial judgment of newspapers. See, e.g., Herbert v. Lando, 441 U.S. 153, 174 (1979) (a law
18
that “subjects the editorial process to private or official examination merely to satisfy curiosity or
19
to serve some general end such as the public interest . . . would not survive constitutional scrutiny
20
as the First Amendment is presently construed”); id. at 172 (concluding that “if inquiry into editorial
21
conclusions threatens the suppression . . . of truthful information,” it raises First Amendment
22
problems); Miami Herald Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974) (“The choice of material
23
to go into a newspaper, and the decisions made as to limitations on the size and content of the paper,
24
and treatment of public issues and public officials — whether fair or unfair — constitute the
COMPLAINT
6
PDF Page 8
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 7 of 35
1
exercise of editorial control and judgment. It has yet to be demonstrated how governmental
2
regulation of this crucial process can be exercised consistent with First Amendment guarantees of
3
a free press as they have evolved to this time.”).
4
11.
That X is a social media platform does not change the analysis. 1 These core First
5
Amendment principles prohibit the government from interfering with the right of private parties,
6
like X Corp., to exercise “editorial control over speech and speakers on their properties or
7
platforms.” Manhattan Cmty. Access Corp. v. Halleck, 139 S. Ct. 1921, 1932 (2019). And the
8
U.S. Supreme Court has made clear that the right to choose whether to speak and how to tailor
9
one’s speech is not “restricted to the press.” Hurley v. Irish-Am. Gay, Lesbian & Bisexual Grp. of
10
Boston, 515 U.S. 557, 574 (1995). This fundamental right applies equally to “business corporations
11
generally” and “ordinary people engaged in unsophisticated expression as well as professional
12
publishers.” Id. A private social media company’s editorial judgment about how to regulate
13
content on its platform and what to say (or not to say) about its regulation of content is therefore
14
fully protected under the First Amendment. Id.; see also U.S. Telecom Ass’n v. FCC, 855 F.3d
15
381, 435 (D.C. Cir. 2017) (Kavanaugh, J., dissenting from denial of rehearing en banc) (the
16
government may not “tell Twitter or YouTube what videos to post” or “tell Facebook or Google
17
what content to favor” any more than it may “tell The Washington Post or the Drudge Report what
18
columns to carry”); Tornillo, 418 U.S. at 258 (editorial control and judgment protected from
19
government regulation by First Amendment).
20
12.
AB 587 also imposes tremendously burdensome requirements on social media
21
companies, requiring them to keep records about potentially hundreds of millions of content
22
moderation decisions made on a daily basis. Ex. 2 at 4. Worse yet, it threatens draconian financial
23
penalties of up to $15,000 per violation per day if compliance is not made in “reasonable, good
24
1
X Corp. provides the X service (“X,” formerly the platform referred to as Twitter).
COMPLAINT
7
PDF Page 9
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 8 of 35
1
faith,” a term that the statute does not define and that gives the Attorney General nearly unfettered
2
discretion to threaten to impose draconian fines if social media companies’ content moderation
3
policies are not to the State’s liking. Given the broad enforcement powers granted to the Attorney
4
General under Cal. Gov’t Code §§ 11180–81, AB 587 empowers the Attorney General to issue
5
civil investigative demands to social media companies about their content moderation policies and
6
practices to determine if they have complied with the statute in “reasonable, good faith.” This
7
broad, unfettered discretion to scrutinize the editorial judgments of social media companies
8
empowers the Attorney General to use his enforcement powers to pressure social media companies
9
to regulate content in ways that the government wants — which is one of the stated purposes of the
10
law.
11
13.
Concerns about the potential use of AB 587 in this manner are not speculative:
12
Attorney General Bonta has already written to X Corp. (and other social media companies)
13
threatening that “[t]he California Department of Justice will not hesitate to enforce [AB 587],”
14
while in the same proverbial breath reminding the companies of their “responsibility” to combat
15
what the Attorney General views as the “dissemination of disinformation that interferes with our
16
electoral system.” Letter from Attorney General Robert Bonta to Twitter, Inc., et al., 4 (Nov. 3,
17
2022),
18
docs/Election%20Disinformation%20and%20Political%20Violence.pdf.
19
https://oag.ca.gov/system/files/attachments/press-
14.
AB 587 also violates the Dormant Commerce Clause. By failing to restrict its
20
extensive reporting requirements to information about Californians, AB 587 places an undue,
21
excessive burden on interstate commerce. That failure is by design — indeed, as bill author Jesse
22
Gabriel’s press release upon the passage of AB 587 made clear, the goal is for AB 587 to have
23
24
COMPLAINT
8
PDF Page 10
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 9 of 35
1
“national implications.”2 Thus, the law will not just impact content moderation of California users
2
but is intended to and will impact content moderation nationwide.
3
15.
Moreover, AB 587 directly contravenes the immunity provided for providers and
4
users of interactive computer services set forth by 47 U.S.C. § 230(c)(2), which prohibits liability
5
“on account” of “any action voluntarily taken in good faith to restrict access to or availability of
6
material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively
7
violent, harassing, or otherwise objectionable, whether or not such material is constitutionally
8
protected.” This is because, first, AB 587 imposes civil liability on social media companies that
9
are covered by the statute if they attempt to “restrict access to or availability of material that the
10
[social media company] or user considers to be obscene, lewd, lascivious, filthy, excessively
11
violent, harassing, or otherwise objectionable” without making the disclosures required by AB 587.
12
The immunity afforded by Section 230(c)(2) is broad — it covers “any action” voluntarily taken in
13
good faith to regulate objectionable content. Because AB 587 imposes liability on such actions if
14
they are taken without the required disclosures, AB 587 is preempted by the broad immunity
15
afforded by Section 230(c)(2).
16
16.
AB 587 also contravenes the immunity provided to X Corp. by Section 230(c)(2)
17
for the additional reason that Attorney General Bonta may penalize X Corp. — under his unfettered
18
discretion as to what constitutes “misrepresent[ation]” and “reasonable, good faith” — if X Corp.
19
were to “restrict access” to content in a way that, in Attorney General Bonta’s view, is contrary to
20
X Corp.’s promulgated content moderation policies.
21
22
17.
In pursuing this action, X Corp. seeks declaratory relief and preliminary and
permanent injunctive relief on the grounds that AB 587 (i) violates X Corp.’s free speech rights
23
2
24
Press Release, Office of Assembly member Jesse Gabriel, After Two-Year Fight, Governor Newsom Signs
Landmark Social Media Transparency Bill (Sept. 13, 2021) https://a46.asmdc.org/press-releases/20220913after-two-year-fight-governor-newsom-signs-landmark-social-media.
COMPLAINT
9
PDF Page 11
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 10 of 35
1
under the First Amendment to the United States Constitution and Article I, Section 2, of the
2
California Constitution; (ii) violates the Dormant Commerce Clause of the United States
3
Constitution; and (iii) directly conflicts with, and is thus preempted by, the immunity afforded by
4
47 U.S.C. § 230(c)(2).
5
18.
In pursuing this action, X Corp. seeks to vindicate the deprivation of constitutional
6
rights under color of state statute, ordinance, regulation, custom, and/or usage. X Corp. is also
7
entitled to attorneys’ fees and costs if it prevails on any of its § 1983 claims. See 42 U.S.C. § 1988.
8
PARTIES
9
19.
Plaintiff X Corp. is a corporation organized and existing under the laws of the State
10
of Nevada, with its principal place of business in San Francisco, California. X Corp. is successor
11
in interest to Twitter, Inc. X Corp. provides the X service. X is a real-time, open, public
12
conversation platform, where people can see every side of a topic, discover news, share their
13
perspectives, and engage in discussion and debate. X allows people to create, distribute, and
14
discover content and has democratized content creation and distribution. X allows users to create
15
and share ideas and information instantly through various product features, including public posts.
16
20.
AB 587 applies to X Corp. because it is a “social media company,” as defined in the
17
statute — i.e., an “entity that owns or operates one or more social media platforms” — and has
18
generated more than $100 million in gross revenue in the preceding calendar year.
19
21.
X is a “social media platform,” as defined by AB 587, because it is a public internet-
20
based service or application with users in California and (1) “[a] substantial function of the service
21
or application is to connect users in order to interact socially with each other within the service or
22
application” and (2) it allows its users to (a) “construct a public or semipublic profile for purposes
23
of signing into and using the service or application”; (b) “[p]opulate a list of other users with whom
24
an individual shares a social connection within the system”; and (c) “[c]reate or post content
COMPLAINT
10
PDF Page 12
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 11 of 35
1
viewable by other users, including but not limited to, on message boards, in chat rooms, or through
2
a landing page or main feed that presents the user with content generated by other users.”
3
22.
Defendant Robert Bonta is the Attorney General of the State of California and is
4
charged with enforcing AB 587. X Corp. sues Attorney General Bonta in his official capacity as
5
the person charged with enforcing AB 587.
6
JURISDICTION
23.
7
This Court has jurisdiction over X Corp.’s federal claims pursuant to 28 U.S.C.
8
§§ 1331 and 1343(a) and 42 U.S.C. § 1983 because X Corp. alleges violations of its rights under
9
the Constitution and laws of the United States. The Court has jurisdiction over X Corp.’s state
10
claim pursuant to 28 U.S.C. § 1367.
24.
11
This Court has authority to grant declaratory and injunctive relief under the
12
Declaratory Judgment Act, 28 U.S.C. §§ 2201, 2202, and under the Court’s inherent equitable
13
jurisdiction.
VENUE
14
25.
15
Venue is proper in this Court under 28 U.S.C. § 1391(b)(1), (2) because the
16
Defendant is located, resides, and has offices in this judicial district and in the State of California,
17
and the violations of X Corp.’s rights are occurring and will occur within this judicial district. AB
18
587 was also enacted in this judicial district.
FACTUAL ALLEGATIONS
19
I.
20
26.
21
22
23
24
AB 587’s Statutory Scheme
AB 587, which applies to “social media companies” (defined as persons or entities
owning or operating one or more “social media platforms”3), has three main components — (i) a
3
A “social media platform” is a “public or semipublic internet-based service or application that has users in
California” (i) for which “[a] substantial function of the service or application is to connect users in order to
allow users to interact socially with each other within the service or application,” or (ii) allows users to (a)
“[c]onstruct a public or semipublic profile for purposes of signing into and using the service or application,”
COMPLAINT
11
PDF Page 13
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 12 of 35
1
requirement that social media companies publicly post their terms of service, including processes
2
for flagging content and potential actions that may be taken with respect to flagged content (“Terms
3
of Service Requirement”), see § 22676; (ii) a requirement that social media companies submit to
4
Attorney General Bonta, who will in turn disseminate publicly a report including, among other
5
things, whether and, if so, how they define hate speech, racism, extremism, radicalization,
6
disinformation, misinformation, harassment, and foreign political interference (“Terms of Service
7
Report”), see § 22677; and (iii) a penalty provision, whereby companies may be liable to pay
8
$15,000 per violation per day and may be sued in court for failing to make a “reasonable, good
9
faith attempt” to comply with AB 587’s requirements, see § 22678.
10
11
a. Terms of Service Requirement
27.
AB 587’s Terms of Service Requirement mandates that social media companies
12
publicly post, “in a manner reasonably designed to inform all users” of its “existence and contents,”
13
their platforms’ terms of service. § 22676(a). Those terms of service must include (i) “contact
14
information,” so that users may “ask the social media company questions about” the terms, (ii) a
15
“description of the process that users must follow to flag content, groups, or other users that they
16
believe violate the terms of service,” as well as “the social media company’s commitments on
17
response and resolution time,” and (iii) a “list of potential actions the social media company may
18
take against an item of content or a user, including, but not limited to, removal, demonetization,
19
deprioritization, or banning.” § 22676(b).
20
28.
The Terms of Service Requirement is an impermissible attempt by the State to inject
21
itself into the content moderation editorial process. By requiring Terms of Service that contain
22
certain provisions about how content is moderated, the State is requiring social media companies
23
24
(b) “[p]opulate a list of other users with whom an individual shares a social connection within the system,”
or (c) “[c]reate or post content viewable by other users.” § 22675(e).
COMPLAINT
12
PDF Page 14
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 13 of 35
1
that are covered by the statute to moderate content in ways that the State deems appropriate, rather
2
than leaving those constitutionally-protected editorial choices to the social media companies. The
3
requirements may not mandate that a particular process or methodology be used, but they do force
4
social media companies to make certain disclosures about how the content moderation process
5
works, how users can flag objectionable content, how quickly the companies will respond to
6
flagged content, and what actions the companies may take with respect to potentially objectionable
7
content.
8
9
29.
Article I, Section 2, of the California Constitution.
10
11
These forced disclosures violate the First Amendment of the U.S. Constitution and
b. Terms of Service Report
30.
AB 587 requires social media companies to submit bi-annual Terms of Service
12
Reports, on April 1 and October 1 of each year, to Attorney General Bonta, who will then make
13
them publicly available. The Report must include:
14
a. A statement of whether, and if so how, the platform’s terms of use define hate speech, racism,
15
extremism, radicalization, disinformation, misinformation, harassment, and foreign political
16
interference, § 22677(a)(3);
17
18
19
20
21
22
23
24
b. The current version of the terms of service and a “complete and detailed description of any
changes” to the terms since any previous report, § 22677(a)(2);
c. A “detailed description” of the platform’s “content moderation practices,” including, but not
limited to:
i. Any “existing policies intended to address the categories of content described in
[§ 22677(a)(3)],” § 22677(a)(4)(A);
ii. How “automated content moderation systems enforce” the platform’s terms of service
and “when these systems involve human review,” § 22677(a)(4)(B);
COMPLAINT
13
PDF Page 15
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 14 of 35
1
2
iii. How the “company responds to user reports of violations of the terms of service,”
§ 22677(a)(4)(C); and
3
iv. How the “company would remove individual pieces of content, users, or groups that
4
violate the terms of service, or take broader action against” individual or groups of
5
users “that violate the terms of service,” § 22677(a)(4)(D);
6
d. Information regarding “content that was flagged by the social media company as content
7
belonging to any of the categories described in [§ 22677(a)(3)]” including:
8
i. The “total number of flagged items of content,” § 22677(a)(5)(A)(i);
9
ii. The “total number of actioned items of content,” § 22677(a)(5)(A)(ii);
10
iii. The “total number of actioned items of content that resulted in action taken by the
11
social media company against the user or group of users responsible for the content,”
12
§ 22677(a)(5)(A)(iii);
13
14
15
16
17
18
iv. The “total number of actioned items of content that were removed, demonetized, or
deprioritized” by the company, § 22677(a)(5)(A)(iv);
v. The “number of times actioned items of content were viewed by users,”
§ 22677(a)(5)(A)(v);
vi. The “number of times actioned items of content were shared, and the number of users
that viewed the content before it was actioned,” § 22677(a)(5)(A)(vi); and
19
vii. The “number of times users appealed” company actions “taken on that platform and
20
the number of reversals of social media company actions on appeal disaggregated by
21
each type of action,” § 22677(a)(5)(A)(vii);
22
23
24
e. All of the information required by § 22677(a)(5)(A) “disaggregated” by:
i. The “category of content, including any relevant categories described in [§
22677(a)(3)],” § 22677(a)(5)(B)(i);
COMPLAINT
14
PDF Page 16
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 15 of 35
1
ii. The “type of content” (e.g., “posts, comments, messages”), § 22677(a)(5)(B)(ii);
2
iii. The “type of media of the content” (e.g., “text, images, and videos,”),
3
§ 22677(a)(5)(B)(iii);
4
iv. How “the content was flagged” (e.g., “flagged by company employees or contractors,
5
flagged by artificial intelligence software, flagged by community moderators, flagged
6
by civil society partners, and flagged by users”), § 22677(a)(5)(B)(iv); and
7
v. How “the content was actioned” (e.g., “actioned by company employees or contractors,
8
actioned by artificial intelligence software, actioned by community moderators,
9
actioned by civil society partners, and actioned by users”), § 22677(a)(5)(B)(v).
10
31.
The Terms of Service Report is also an impermissible attempt by the State to inject
11
itself into the content moderation editorial process. By requiring detailed disclosures about how
12
controversial content is moderated, the State is impermissibly compelling social media companies
13
to make disclosures about how their editorial processes work. This will inevitably lead to
14
controversy that will have been generated by the way the State has required the debate to have been
15
framed. By compelling social media companies to make politically-charged disclosures about
16
content moderation, the State is impermissibly trying to generate public controversy about content
17
moderation in a way that will pressure social media companies, such as X Corp., to restrict, limit,
18
disfavor, or censor certain constitutionally-protected content on X that the State dislikes.
19
20
21
22
32.
These forced disclosures violate the First Amendment of the U.S. Constitution and
Article I, Section 2, of the California Constitution.
c.
Penalties
33.
AB 587 also sets forth a penalty scheme under which social media companies may
23
be fined $15,000 per violation per day, and may be enjoined in any court of competent jurisdiction
24
by Attorney General Bonta or by a “city attorney,” if the company (i) “[f]ails to post terms of
COMPLAINT
15
PDF Page 17
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 16 of 35
1
service in accordance with Section 22676,” (ii) “[f]ails to timely submit” a Terms of Service Report,
2
or (iii) “[m]aterially omits or misrepresents required information in a” Terms of Service Report.
3
§ 22678. AB 587’s penalty provision further instructs that the court shall, “[i]n assessing the
4
amount of a civil penalty pursuant to paragraph . . . consider whether the social media company has
5
made a reasonable, good faith attempt to comply with the provisions of this chapter.” Id.
6
II.
The Topics About Which AB 587 Forces X Corp. to Speak are Extremely
Controversial and Inherently Subject to Disagreement
7
34.
As the legislative history makes clear, AB 587 compels disclosure from social media
8
companies about the most controversial types of content moderation. While there may be a general
9
public consensus that some types of constitutionally-unprotected content (e.g., child pornography
10
or threats of physical harm) should be limited on social media platforms to the extent possible, AB
11
587 focuses primarily on categories of content (e.g., hate speech, racism, extremism,
12
misinformation, political interference, and harassment) for which there is no such “general public
13
consensus.” Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
14
35.
How to define and regulate these categories of content is a politically-charged and
15
controversial undertaking. That is because, as the California Assembly Committee on Privacy and
16
Consumer Protection Report about AB 587 candidly acknowledged, these categories are “far more
17
difficult to reliably define, and assignment of their boundaries is often fraught with political bias.”
18
Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
19
36.
As a result, “the companies responsible for managing social media platforms are
20
faced with a complex dilemma regarding content moderation.” Id. No matter what they do when
21
regulating these controversial categories of content, they are likely to be criticized. “In such cases,
22
both action and inaction by these companies seems to be equally maligned: too much moderation
23
and accusations of censorship and suppressed speech arise; too little, and the platform risks
24
fostering a toxic, sometimes dangerous community.” Id.
COMPLAINT
16
PDF Page 18
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 17 of 35
1
37.
Because these controversial categories are “difficult to reliably define” and
2
“assignment of their boundaries is often fraught with political bias,” id., decisions about how to
3
moderate such content are inherently political. As a result, views about whether there is too much
4
or too little moderation of these types of content are controversial political questions.
5
38.
6
questions can be:
7
To take just a few examples of how controversial and politically-charged these
Some people view speech intentionally misgendering a transgender individual as
8
“hate speech” and harassment. See National Institute of Health, Gender Pronouns
9
Resource, U.S. Department of Health and Human Services (Mar. 8, 2023),
10
https://dpcpsi.nih.gov/sgmro/gender-pronouns-
11
resource#:~:text=Being%20misgendered%20(i.e.%2C%20being%20referred,violati
12
on%20of%20one's%20civil%20rights (“Being misgendered (i.e., being referred to
13
with incorrect pronouns) can be an extremely hurtful and invalidating experience.
14
Intentional refusal to use someone’s correct pronouns is equivalent to harassment and
15
a violation of one’s civil rights.”); Gender Expression Non-Discrimination Act,
16
S.1047/A.747 (N.Y. 2019). Others insist that forcing someone to call a transgender
17
individual by their preferred pronouns violates their deeply-held beliefs about what is
18
true. See Khorri Atkinson, Fight Over Transgender Pronouns at Work Faces Muddy
19
Legal
20
https://news.bloomberglaw.com/daily-labor-report/fight-over-transgender-pronouns-
21
at-work-faces-muddy-legal-waters (“[T]here’s been a steady increase in internal
22
complaints from workers who say their religious beliefs prevent them from calling a
23
transgender person by their desired name or pronoun[.]”).
Waters,
Bloomberg
24
COMPLAINT
17
Law
(Apr.
13,
2023),
PDF Page 19
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 18 of 35
1
Political debates rage about whether and when criticism of Israel can be considered
2
anti-Semitic hate speech. See A Guide to Recognizing When Anti-Israel Actions
3
Become Antisemitic, American Jewish Committee (last accessed Sept. 8, 2023)
4
https://www.ajc.org/sites/default/files/pdf/2021-
5
10/A%20Guide%20to%20Recognizing%20When%20Anti-
6
Israel%20Actions%20Become%20Antisemitic.pdf (“Sometimes antisemitism is not
7
easy to recognize—especially when it involves Israel”); Is Criticism of Israel
8
Antisemitic?,
9
https://www.annefrank.org/en/topics/antisemitism/all-criticism-israel-antisemitic/
Anne
Frank
House
(last
accessed
Sept.
8,
2023)
10
(“Criticism of Israel or of the policies of the Israeli government is not automatically
11
antisemitic.”).
12
Some commentators have defined the term “racism” to apply only when
13
discrimination based on race is directed at traditionally disadvantaged groups. See
14
The Myth of Reverse Racism, Alberta Civil Liberties Research Centre (last accessed
15
Sept. 8, 2023) https://www.aclrc.com/myth-of-reverse-racism (“While assumptions
16
and stereotypes about white people do exist, this is considered racial prejudice, not
17
racism. . . . [It] is not considered racism because of the systemic relationship to
18
power.”). Others have argued that the term “racism” should apply when there is
19
discrimination based on race directed at any group. See Can White People Experience
20
Racism?, The Economist (Sept. 18, 2018) https://www.economist.com/open-
21
future/2018/09/18/can-white-people-experience-racism; Michelle Gao, Who Can be
22
‘Racist’?,
23
https://www.thecrimson.com/column/between-the-lines/article/2018/8/10/gao-who-
24
can-be-racist/.
COMPLAINT
The
Harvard
Crimson
18
(Aug.
10,
2018)
PDF Page 20
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 19 of 35
1
In March 2020, many commentators insisted that it was “disinformation” to suggest
2
that the COVID-19 virus originated in a lab in Wuhan, China. See Marlette Vazquez,
3
Calling COVID-19 the “Wuhan Virus” or “China Virus” is Inaccurate and
4
Xenophobic,
5
https://medicine.yale.edu/news-article/calling-covid-19-the-wuhan-virus-or-china-
6
virus-is-inaccurate-and-xenophobic/. Others insisted that suppressing such views was
7
tantamount to censorship. Michael Shellenberger Testimony to the House Select
8
Committee on the Weaponization of the Federal Government, The Censorship
9
Industrial Complex: U.S. Government Support For Domestic Censorship And
Yale
School
of
Medicine
(Mar.
12,
10
Disinformation
11
https://judiciary.house.gov/sites/evo-subsites/republicans-
12
judiciary.house.gov/files/evo-media-document/shellenberger-testimony.pdf.
13
39.
Campaigns,
2016
-
2022
(Mar.
9,
2020),
2023)
As these examples make clear, AB 587 focuses on the most controversial and
14
politically-charged categories of content moderation. It forces social media companies to speak
15
publicly and take a position on these controversial topics, notwithstanding that doing so will almost
16
always result in public criticism from one political group or another. As the California Assembly’s
17
Committee on Privacy and Consumer Protection Report makes clear, “both action and inaction by
18
these [social media] companies [in regulating these controversial categories of content] seems to
19
be equally maligned.” Ex. 2 at 4; see also Ex. 3 at 1–2 (same); Ex. 4 at 2 (same).
20
40.
The problem with this from a First Amendment standpoint is obvious. AB 587
21
forces X Corp., under threat of substantial civil penalty, to publish whether and, if so, how it defines,
22
and whether and, if so, how it moderates, (i) hate speech, (ii) racism, (iii) extremism, (iv)
23
radicalization, (v) disinformation, (vi) misinformation, (vii) harassment, and (viii) foreign political
24
interference — categories of speech that are almost entirely constitutionally-protected. Defining
COMPLAINT
19
PDF Page 21
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 20 of 35
1
these controversial categories and making decisions about how or whether to moderate them is an
2
exercise “fraught with political bias,” that is likely to result in controversy, no matter what the
3
social media companies do. Id. The First Amendment does not permit the government to compel
4
social media companies to publicly take positions on these controversial political topics against
5
their will. Wooley, 430 U.S. at 714 (the First Amendment affords X Corp. “both the right to speak
6
freely and the right to refrain from speaking at all”); Riley, 487 U.S. at 797 (same); Agency for Int’l
7
Dev., 570 U.S. at 213 (the First Amendment “prohibits the government from telling people what
8
they must say”). This is particularly true given the highly controversial and politically-charged
9
nature of questions about content moderation.
10
41.
And by requiring X Corp. to disclose whether and, if so, how it regulates these
11
controversial and difficult-to-define categories of content, AB 587 impermissibly attempts to
12
pressure X Corp. to adopt — and regulate — these categories of content, even if X Corp. would
13
prefer to categorize content differently. For example, X Corp. does not currently regulate “hate
14
speech,” “racism,” or “extremism,” which are three categories of conduct that AB 587 forces social
15
media companies to publicly address. But X Corp. does regulate “hateful conduct,” a category of
16
content that may include content that some people might argue constitutes “hate speech,” “racism,”
17
or “extremism.” See Safety and Cybercrime: Hateful Conduct, X. Corp. (last accessed Sept. 8,
18
2023) https://help.twitter.com/en/rules-and-policies/hateful-conduct-policy. While AB 587 does
19
not force X Corp. to adopt and regulate these three categories, it intentionally attempts to pressure
20
X Corp. to do so in an insidious and impermissible way. Because the Terms of Service Report
21
forces disclosures about content moderation to cover the controversial topics that the State has
22
defined, social media companies like X Corp. will feel pressure to indicate that they take steps to
23
regulate such content, rather than be subject to a pronouncement that, for example, they do not
24
regulate “hate speech” or “racism” at all. AB 587 was passed with the intent of pressuring social
COMPLAINT
20
PDF Page 22
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 21 of 35
1
media companies to do just that. It is, in short, an attempt by the State to impermissibly frame the
2
debate about content moderation in a way that pressures social media companies to regulate
3
constitutionally-protected content that the State finds objectionable.
4
42.
This type of compelled speech cannot pass constitutional muster. An analogy helps
5
illustrate the point. Imagine, for instance, a state law that, in the interest of transparency, forced
6
individuals running for President of the United States to publish their positions on several “hot
7
button” political issues (e.g., whether abortion should be permitted in cases of incest and rape or
8
whether former President Trump committed any crimes) on which, in the government’s view, most
9
political candidates were avoiding stating their positions. Even if the law did not mandate
10
politicians to take any particular position, but simply required them, under the threat of civil
11
penalties, to publicly spell out their position (or lack thereof) on those controversial issues as part
12
of an effort to “pressure” them to adopt a certain political view, there can be little doubt that such
13
a law would trigger heightened scrutiny under the First Amendment. AB 587 is no different.
14
43.
The Attorney General has suggested that AB 587 involves merely compelled
15
commercial speech of “purely factual and uncontroversial information,” and is thus subject to the
16
more relaxed standard of review under Zauderer v. Office of Disciplinary Counsel, 471 U.S. 626,
17
651 (1985). That is simply untrue. As the legislative history of AB 587 makes plain, the categories
18
of speech that are focused on by AB 587 are controversial to define and controversial to moderate.
19
And the speech that is compelled about the definitions of these categories and application of content
20
moderation rules is not commercial; rather, it is core political speech about controversial questions
21
on which reasonable minds can disagree. What is more, AB 587’s compelled disclosure is not
22
purely factual because it may mislead consumers — that is, if a company submits a Terms of
23
Service Report explaining that it does not moderate the controversial categories of content required
24
by the State (because, instead, the company moderates categories of content pursuant to its own
COMPLAINT
21
PDF Page 23
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 22 of 35
1
policies and terminology), there is a high likelihood that consumers could be misled to believe that
2
the company is not moderating content sufficiently, even if that is not the case. As such, Zauderer
3
does not apply.
4
44.
Even if Zauderer did apply, AB 587 fails to satisfy even the more relaxed test set
5
forth in that case because the legislation is unduly burdensome and unjustified. Id. at 651. As
6
noted in the legislative history of AB 587, “the largest social media platforms are faced with
7
thousands, if not millions of similarly difficult decisions related to content moderation on a daily
8
basis.” Ex. 2 at 4. This enormous burden, of course, is the reason why the statute limits itself to
9
social media companies with over $100 million in gross revenue from the preceding year. Id. at 7.
10
Indeed, the legislative history for AB 587 describes the amount of content received by many of
11
these platforms as “enormous.” Id.
12
III.
13
AB 587’s Will Suppress Speech Based on Content & Viewpoint
45.
According to California Governor Gavin Newsom, AB 587 does nothing more than
14
“pull back the curtain” and provide “transparency” as to the already-existent content moderation
15
policies of social media companies. See, e.g., Press Release, Governor Newsom Signs Nation-
16
Leading
17
https://www.gov.ca.gov/2022/09/13/governor-newsom-signs-nation-leading-social-media-
18
transparency-measure/.
19
46.
Social
Media
Transparency
Measure
Governor Newsom’s claim is belied by the record.
(Sept.
13,
2022),
Even if AB 587 uses
20
“transparency” as its effectuating mechanism, it does so for the purpose of censoring particular
21
viewpoints with respect to eight content categories in § 22677(a)(3). And AB 587 will do so
22
successfully, given its amorphous penalty scheme (it is entirely unclear what constitutes a
23
“reasonable, good faith attempt to comply”) that can and will likely be used to silence viewpoints
24
with which Attorney General Bonta and the State of California disagree. See Sorrell v. IMS Health
COMPLAINT
22
PDF Page 24
Case 2:23-at-00903 Document 1 Filed 09/08/23 Page 23 of 35
1
Inc., 564 U.S. 552, 565 (2011) (applying “heightened scrutiny” to law intended to suppress speech
2
“in conflict with the goals of the state”).
3
47.
The legislative record is abundantly clear: according to AB 587’s authors, sponsors,
4
and supporters, the law’s true purpose and desired effect is to use the compelled disclosures to
5
pressure social media companies into regulating and censoring constitutionally-protected content
6
that the government believes is undesirable. For instance:
7
Within the April 27, 2021 Assembly Committee on Judiciary Hearing Report for AB
8
587, lead bill author Jesse Gabriel stated that AB 587 is an “important first step” in
9
ensuring that “social media companies [] moderate or remove hateful or
10
incendiary content” on their platforms. He hoped that AB 587 will “pressure them”
11
to “eliminate hate speech and disinformation.” Ex. 1 at 4 (emphasis added);
12
The official bill comments accompanying AB 587’s May 24, 2021 Assembly Floor
13
Analysis applauded the bill’s “unique, data driven approach” to “content
14
moderation on social media.” Ex. 5 (Cal. Assemb. Analysis, 2021-22 Sess. (AB
15
587), May 24, 2021) at 2 (emphasis added);
16
The July 13, 2021 Senate Judiciary Committee Hearing Report for AB 587 cites to
17
comments from official bill sponsor ADL that emphasized that the law will allow
18
“policymakers [to] take meaningful action to decrease online hate and
19
extremism.” Ex. 6 (Cal. Sen. Judiciary Report, 2021–22 Sess. (AB 587), July 13,
20
2021) at 13 (emphasis added);
In July 2020, California Senator Scott Wiener, who ultimately co-authored AB 587, tweeted, "Social media platforms have a moral obligation—& need to have a legal obligation—not to become engines for violent hate speech." Senator Scott Wiener (@Scott_Wiener), Twitter (July 2, 2020, 1:32 PM EST), https://twitter.com/Scott_Wiener/status/1278743357032812544 (emphasis added);
On March 29, 2021, in a press release about AB 587, the ADL, an official sponsor of AB 587, clarified that the intent of the law is to "improv[e]" the "enforcement of [social media companies' content-moderation] policies" or "provide enough evidence for legal action against them." Press Release, California Legislators Introduce Bipartisan Effort to Hold Social Media Companies Accountable for Online Hate and Disinformation (Mar. 29, 2021), https://a46.asmdc.org/press-releases/20210329-california-legislators-introduce-bipartisan-effort-hold-social-media (emphasis added);
In that same March 29, 2021 press release, the National Hispanic Media Coalition, another official supporter of AB 587, emphasized AB 587's value in disrupting social media companies' facilitation of "white supremacy, hate, conspiracies, and extremism online" by carrying that content on their platforms. Id. (emphasis added);
On June 14, 2021, lead bill author Jesse Gabriel tweeted that AB 587 was going to "address . . . concerns that platforms aren't doing enough to stop the spread of misinformation and hate speech."
Assm. Jesse Gabriel (@AsmJesseGabriel), Twitter (June 14, 2021, 5:37 PM EST), https://twitter.com/AsmJesseGabriel/status/1404553699502870529 (emphasis added, internal quotation omitted).

48. It is clear, moreover, that AB 587 aims to censor particular viewpoints espoused on social media platforms with respect to the eight categories of content in § 22677(a)(3). For instance, within the April 27, 2021 Assembly Committee on Judiciary Hearing Report for AB 587, bill author Jesse Gabriel cited a "study of Twitter posts" that supposedly found that "the greater proportion of tweets related to race- and ethnicity-based discrimination in a given city, the more hate crimes were occurring in that city." Ex. 1 at 4; Request for Judicial Notice, Ex. 1 at 4, Minds Inc., No. 23-cv-2705 (ECF 23-3) (C.D. Cal. May 25, 2023) (citing same).
49. Similarly, the June 28, 2022 Senate Judiciary Committee Hearing Report for AB 587 cites a study finding that a third of individuals who "experience online harassment . . . attribute at least some harassment to their identity . . . affecting the ability of already marginalized communities to be safe in digital spaces." Ex. 7 (Cal. Sen. Judiciary Report, 2021–22 Sess. (AB 587), June 28, 2022) at 8; see also id. at 18 (Los Angeles County Democratic Party noting its support for AB 587 because social media companies "enable[] the micro targeting of vulnerable individuals.").
50. That Attorney General Bonta plans to use threats of enforcement of AB 587 to pressure the social media companies to regulate speech that the government does not like is not in doubt. In fact, he has already done so. Less than two months after AB 587's enactment, the Attorney General reminded X Corp. and other social media companies of their "responsibility" to combat what the Attorney General views as the "dissemination of disinformation that interferes with our electoral system," while simultaneously reminding them that the "California Department of Justice will not hesitate to enforce" AB 587. Letter from Attorney General Robert Bonta to Twitter, Inc., et al., at 4 (Nov. 3, 2022), https://oag.ca.gov/system/files/attachments/press-docs/Election%20Disinformation%20and%20Political%20Violence.pdf (emphasis added).
51. That AB 587 grants Attorney General Bonta nearly unfettered discretion to determine if, in his view, X Corp. has complied with AB 587 in "reasonable, good faith" and grants him the ability to investigate potential violations (pursuant to Cal. Gov't Code §§ 11180–81) by issuing document demands about the social media companies' content moderation practices compounds the problem. The end result is that AB 587 violates the First Amendment of the United States Constitution and Article I, Section 2, of the California Constitution by impermissibly injecting the State into X Corp.'s constitutionally-protected editorial decisions in a manner that is designed to pressure X Corp. to regulate constitutionally-protected content in ways that the State wants.
52. Because AB 587 forces X Corp. to make disclosures about how it defines and moderates controversial categories of content as part of an effort to pressure X Corp. to regulate constitutionally-protected content that the State disfavors, strict scrutiny applies.
53. AB 587 fails to satisfy strict — or even intermediate — scrutiny. It does not serve a compelling (or even a substantial) governmental interest, will not directly and materially advance any such interest, and is not narrowly tailored to further any such interest.
54. AB 587 purports to further the government's interest in transparency in content moderation. But there is no evidence that AB 587's compelled speech solves any real problem at all. Indeed, X Corp. is already transparent with its users and the public about its content moderation. See Rules and Policies, X Corp. (last accessed Sept. 8, 2023), https://help.twitter.com/en/rules-and-policies. X Corp. supports transparency in content moderation — but not in the impermissible manner prescribed by AB 587, and there is no compelling, substantial, or important governmental interest in compelling transparency using the type of governmental oversight prescribed by AB 587.
55. The government's other stated interest in passing AB 587 – to "pressure" social media companies to "become better corporate citizens by doing more to eliminate hate speech and disinformation," Ex. 1 at 4; see also Mot. to Dismiss at 15–16, Minds, Inc., et al. v. Bonta, No. 23-cv-2705 (ECF 23-1) (C.D. Cal. May 25, 2023) – is not compelling, substantial, or important. State pressure to moderate constitutionally-protected content that the State disfavors is an illegal, unconstitutional, and illegitimate purpose.
56. AB 587 also fails to satisfy heightened scrutiny because there is no evidence that it will "directly and materially" advance a compelling, substantial, or important governmental interest. Likewise, it regulates much more speech than is necessary to achieve any compelling, substantial, or important goal.
IV. AB 587 Excessively and Unduly Burdens X Corp.'s Nationwide and Global Activities

57. In addition to forcing X Corp. to speak about highly controversial topics about which it does not wish to speak, AB 587 establishes an unduly burdensome disclosure requirement regime, under the threat of significant financial penalties and civil suits.
58. The undue burden that AB 587 places on X Corp. is compounded by the law's failure to limit the Terms of Service Report's information requirements to information about Californians. To be clear, this was an intentional decision, made to ensure that AB 587 "will have national implications."4 But AB 587 does not stop there. Regulations like AB 587, which govern global social media platforms' publication of content, are inherently international in scope. AB 587 applies just as much to posts by Texans, in Texas, or Australians, in Australia, as it does to Californians, in California. And even if a post was made by a Californian, in California, it is available globally and viewed globally.
59. The burden AB 587 places on X Corp. is even further compounded by the sheer volume of posts that occur on X. As of August 2022, 6,000 posts are sent on X every second; 350,000 posts are sent every minute; 500 million posts are sent every day; and 200 billion posts are sent every year. David Sayce, The Number of Tweets Per Day in 2022 (last visited Sept. 6, 2023), https://www.dsayce.com/social-media/tweets-day/#:~:text=Every%20second%2C%20on%20average%2C%20around%206%2C000%20tweets%20are%20tweeted%20on,200%20billion%20tweets%20per%20year.

4 Press Release, Office of Assemblymember Jesse Gabriel, After Two-Year Fight, Governor Newsom Signs Landmark Social Media Transparency Bill (Sept. 13, 2022), https://a46.asmdc.org/press-releases/20220913-after-two-year-fight-governor-newsom-signs-landmark-social-media.
60. In light of this volume, and in light of the specificity and depth of information required to comply with AB 587 — i.e., at least 161 categories of information must be disclosed semi-annually5 — the tremendous burden AB 587 creates for X Corp. is clear. X Corp. will be forced to expend significant monetary and employee resources to comply with AB 587, all for the unconstitutional and unjustifiable goal of moderating particular content and viewpoints with which Attorney General Bonta and the State of California disagree.
FIRST CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Violations of the First Amendment to the United States Constitution (42 U.S.C. § 1983) and Article I, Section 2, of the California Constitution)
61. X Corp. realleges and incorporates herein by reference Paragraphs 1 through 60 above.

62. AB 587 violates the First Amendment to the United States Constitution and Article I, Section 2, of the California Constitution by compelling X Corp. to divulge publicly, under the threat of significant financial penalty and civil suit, its opinions and confidential editorial processes with respect to extremely controversial subject matters. Tornillo, 418 U.S. at 258 (editorial control and judgment protected from government regulation by First Amendment).6 Each of the content categories about which X Corp. is compelled to speak — hate speech, racism, extremism, radicalization, disinformation, misinformation, harassment, and foreign political interference, § 22677(a)(3) — is "anything but an 'uncontroversial' topic." Nat'l Inst. of Fam. & Life Advocs. v. Becerra ("NIFLA"), 138 S. Ct. 2361, 2372 (2018); see also Wooley, 430 U.S. at 714 ("First Amendment" protects "both the right to speak freely and the right to refrain from speaking at all."). The same is true about how such categories of speech are defined and policed on X.

5 See Eric Goldman, Will California Clone-and-Revise Some Terrible Ideas from Florida/Texas' Social Media Censorship Laws? (Analysis of CA AB587), Technology & Marketing Law Blog (June 21, 2022), https://blog.ericgoldman.org/archives/2022/06/will-california-clone-and-revise-some-terrible-ideas-from-florida-texas-social-media-censorship-laws-analysis-of-ca-ab587.htm ("All told, there are 7 categories of disclosures, and the bill indicates that the disclosure categories have, respectively, 5 options, at least 5 options, at least 3 options, at least 5 options, and at least 5 options. So I believe the bill requires that each service's reports should include no less than 161 different categories of disclosures (7×5+7×5+7×3+7×5+7×5).").

6 AB 587 violates Article I, Section 2, of the California Constitution for all of the same reasons that it violates the First Amendment to the United States Constitution. See, e.g., City of Montebello v. Vasquez, 1 Cal. 5th 409, 421 n.11 (2016) ("[T]he California liberty of speech clause is broader and more protective than the free speech clause of the First Amendment."); Delano Farms Co. v. California Table Grape Com., 4 Cal. 5th 1204, 1221 (2018) ("[O]ur case law interpreting California's free speech clause has given respectful consideration to First Amendment case law for its persuasive value[.]").
63. The deep-seated controversy inherently tied to each topic in § 22677(a)(3) makes crystal clear that AB 587's compelled disclosures are not the sort of purely factual and uncontroversial disclosures that fit within the narrow exception to strict scrutiny set forth under Zauderer. Strict scrutiny thus applies because AB 587 "compel[s]" X Corp. to "speak a particular message," which necessarily "alters the content of" its speech. NIFLA, 138 S. Ct. at 2371 (quoting Riley, 487 U.S. at 795 (emphasis added)). AB 587 fails to satisfy strict (or even intermediate) scrutiny.
64. Moreover, "[f]ormal legislative findings accompanying" AB 587 make clear its illicit "purpose and practical effect," Sorrell, 564 U.S. at 565, which is to regulate speech on X "based on 'the topic discussed or the idea or message expressed.'" City of Austin, Texas v. Reagan Nat'l Advert. of Austin, LLC, 142 S. Ct. 1464, 1474 (2022) (citing Reed v. Town of Gilbert, Ariz., 576 U.S. 155, 171 (2015)). Indeed, the official legislative record demonstrates that AB 587 was intended to pressure X Corp. to censor particular viewpoints on X that the government deems undesirable, notwithstanding that such editorial "decisions about what content to include" on X are "protected by the First Amendment." O'Handley, 579 F. Supp. 3d at 1186–87, aff'd sub nom. on other grounds, 62 F.4th 1145; see also Sorrell, 564 U.S. at 565 (applying "heightened scrutiny" to law intended to suppress speech "in conflict with the goals of the state"); Reed, 576 U.S. at 166 ("Because strict scrutiny applies either when a law is content based on its face or when the purpose and justification for the law are content based, a court must evaluate each question before it concludes that the law is content neutral and thus subject to a lower level of scrutiny.").
65. Here, as in Sorrell, the California legislature's "expressed statement of purpose" demonstrates that AB 587 "imposes burdens . . . aimed at a particular viewpoint." Sorrell, 564 U.S. at 565; see also id. at 578–79 ("[A] State's failure to persuade does not allow it to hamstring the opposition. The State may not burden the speech of others in order to tilt public debate in a preferred direction."). It is hornbook law that such "ideologically driven attempts to suppress a particular point of view are presumptively unconstitutional." Rosenberger v. Rector & Visitors of Univ. of Virginia, 515 U.S. 819, 830 (1995) (internal quotation omitted).
66. This, the First Amendment does not permit, absent a compelling state interest and means that are narrowly tailored to achieve that interest, since "subtler forms of discrimination that achieve identical results based on function or purpose" do not escape strict scrutiny. City of Austin, 142 S. Ct. at 1474 (citing Reed, 576 U.S. at 159–60, 163–64).
67. AB 587 also violates the First Amendment of the United States Constitution and Article I, Section 2, of the California Constitution because it would impermissibly interfere with the constitutionally-protected editorial judgment of social media companies. See, e.g., Herbert, 441 U.S. at 174 (a law that "subjects the editorial process to private or official examination merely to satisfy curiosity or to serve some general end such as the public interest . . . would not survive constitutional scrutiny as the First Amendment is presently construed"); Tornillo, 418 U.S. at 258 ("The choice of material to go into a newspaper, and the decisions made as to limitations on the size and content of the paper, and treatment of public issues and public officials — whether fair or unfair — constitute the exercise of editorial control and judgment. It has yet to be demonstrated how governmental regulation of this crucial process can be exercised consistent with First Amendment guarantees of a free press as they have evolved to this time.").
68. There is a bona fide and actual controversy between X Corp. and Attorney General Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587, even though it violates the First Amendment to the United States Constitution and Article I, Section 2, of the California Constitution.
69. X Corp. maintains that AB 587 is illegal and unconstitutional. Attorney General Bonta claims otherwise.
70. X Corp. requests a judicial determination regarding the validity of AB 587 to prevent the harm caused by its enactment. Such a determination is both necessary and appropriate to avoid the deprivation of X Corp.'s constitutional rights, which would occur if AB 587 is applied to X Corp.
71. In light of the violation of the First Amendment to the United States Constitution and Article I, Section 2, of the California Constitution, X Corp. seeks preliminary and permanent injunctive relief against enforcement of AB 587. X Corp. would be irreparably harmed if it were forced to comply with AB 587's requirements and has no adequate remedy at law.
SECOND CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Violation of the Dormant Commerce Clause of the United States Constitution (42 U.S.C. § 1983))
72. X Corp. realleges and incorporates herein by reference Paragraphs 1 through 71 above.

73. AB 587 violates the Dormant Commerce Clause of the United States Constitution because "the burden" it "impose[s] on interstate commerce" is "clearly excessive in relation to [its] putative local benefits." Nat'l Pork Producers Council v. Ross, 143 S. Ct. 1142, 1157 (2023) (quoting Pike v. Bruce Church, Inc., 397 U.S. 137, 142 (1970)).
74. AB 587 imposes significant burdens on interstate commerce because it "regulat[es] [] activities that are inherently national or require a uniform system of regulation." Nat'l Pork Producers Council v. Ross, 6 F.4th 1021, 1031 (9th Cir. 2021), aff'd, 143 S. Ct. 1142 (internal quotation omitted). Those burdens, moreover, are "clearly excessive in relation to the putative local benefits." Id. at 1026 (quoting Pike, 397 U.S. at 142). That is, AB 587 places burdens on interstate commerce — by regulating global platform content — that far exceed the alleged benefit of increased social media transparency for Californians.
75. The intent and likely effect of AB 587 are to impact and change how content is moderated on social media platforms nationwide, not only in California.
76. There is a bona fide and actual controversy between X Corp. and Attorney General Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587, even though it violates the Dormant Commerce Clause of the United States Constitution.
77. X Corp. maintains that AB 587 is illegal and unconstitutional. Attorney General Bonta claims otherwise.
78. X Corp. requests a judicial determination regarding the validity of AB 587 to prevent the harm caused by its enactment. Such a determination is both necessary and appropriate to avoid the deprivation of X Corp.'s constitutional rights, which would occur if AB 587 is applied to X Corp.
79. In light of the violation of the Dormant Commerce Clause of the United States Constitution, X Corp. seeks preliminary and permanent injunctive relief against enforcement of AB 587. X Corp. would be irreparably harmed if it were forced to comply with AB 587's requirements and has no adequate remedy at law.
THIRD CAUSE OF ACTION
(Declaratory Relief and Preliminary and Permanent Injunctive Relief for Immunity Under and Preemption by 47 U.S.C. § 230(c)(2))
80. X Corp. realleges and incorporates herein by reference Paragraphs 1 through 79 above.

81. 47 U.S.C. § 230(c)(2) preempts AB 587 due to the direct conflict between them.

82. AB 587 imposes liability on X Corp. with regard to how it moderates content on X. This directly contravenes the immunity granted under § 230(c)(2), which provides that "[n]o provider or user of an interactive computer service shall be held liable on account" of "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected."
83. X is an "interactive computer service," as that term is defined under Section 230(f)(2).
84. The immunity provided by Section 230(c)(2) is very broad. It precludes liability for "any action" taken voluntarily in good faith by an interactive computer service to restrict access to certain content. AB 587 imposes such liability, however, if actions are taken by interactive computer services to restrict access to content without certain requisite disclosures.
85. The phrase "any action" in Section 230(c)(2) means "any action" — including any actions taken in good faith to restrict access to content without the public disclosures mandated by AB 587.
86. AB 587 also contravenes the immunity provided to X Corp. by Section 230(c)(2) because Attorney General Bonta may penalize X Corp. if he determines, in his unfettered discretion as to what constitutes "misrepresent[ation]" and "reasonable, good faith," that X Corp. is "restrict[ing] access" to content in a way that, in Attorney General Bonta's view, is contrary to X Corp.'s promulgated content moderation policies.
87. There is a bona fide and actual controversy between X Corp. and Attorney General Bonta because Attorney General Bonta is charged with enforcing, and intends to enforce, AB 587, even though such enforcement is precluded and preempted by 47 U.S.C. § 230(c)(2).
88. X Corp. maintains that AB 587 is invalid and void as a matter of law. Attorney General Bonta claims otherwise.
89. X Corp. seeks a declaratory judgment that AB 587 is legally invalid and unenforceable because it is precluded and preempted by 47 U.S.C. § 230(c)(2).
90. In light of the violation of Section 230(c)(2), X Corp. seeks preliminary and permanent injunctive relief against enforcement of AB 587. X Corp. would be irreparably harmed if it were forced to comply with, or litigate, AB 587's requirements and has no adequate remedy at law.
PRAYER FOR RELIEF
WHEREFORE, X Corp. respectfully requests that this Court enter judgment in X Corp.'s favor and grant the following relief:

1. A declaration that AB 587 violates the First Amendment of the United States Constitution and Article I, Section 2, of the California Constitution;

2. A declaration that AB 587 violates the Dormant Commerce Clause of the United States Constitution;

3. A declaration that the imposition of civil penalties under AB 587 is precluded and preempted by 47 U.S.C. § 230(c)(2) and is therefore null and void and has no legal effect;

4. A preliminary and permanent injunction enjoining Attorney General Bonta from enforcing AB 587 against X Corp.;
5. An award of fees, costs, expenses, and disbursements, including attorneys' fees, to which X Corp. is entitled pursuant to 42 U.S.C. § 1988 and other applicable law; and

6. Such other and further relief as the Court deems just and proper.
DEMAND FOR JURY TRIAL
Pursuant to Federal Rule of Civil Procedure 38, X Corp. demands a trial by jury in this action of all issues so triable.
Dated: September 8, 2023
By: /s/ William R. Warne
DOWNEY BRAND LLP
William R. Warne (SBN 141280)
Meghan M. Baker (SBN 243765)
621 Capitol Mall, 18th Floor
Sacramento, CA 95814
Phone: 916-444-1000
Facsimile: 916-444-2100
CAHILL GORDON & REINDEL LLP
Joel Kurtzberg (pro hac vice pending)
Floyd Abrams (pro hac vice pending)
Jason Rozbruch (pro hac vice pending)
Lisa J. Cole (pro hac vice pending)
32 Old Slip
New York, NY 10005
Phone: 212-701-3120
Facsimile: 212-269-5420
jkurtzberg@cahill.com