IN THE DISTRICT COURT OF OSAGE COUNTY
STATE OF OKLAHOMA

FILED: Osage County, Okla., Court Clerk

STATE OF OKLAHOMA, ex rel.
GENTNER DRUMMOND,
ATTORNEY GENERAL OF OKLAHOMA,

Plaintiff,

v.

META PLATFORMS, INC. f/k/a
FACEBOOK, INC., and
INSTAGRAM, LLC,

Defendants.

Case No. CJ-2023-
Judge Stuart Tate
JURY TRIAL DEMANDED
PETITION

TABLE OF CONTENTS

SUMMARY OF THE CASE
BACKGROUND
THE PARTIES
JURISDICTION AND VENUE
FACTUAL ALLEGATIONS
  A. Defendants Engage in Consumer Transactions with Oklahoma Consumers
    1. Meta Offers Instagram in Exchange for Consumers’ Valuable Consideration That Enables Meta to Sell Advertising
    2. Advertising is the Core of Meta’s Business
    3. Meta Prioritizes Acquiring Adolescents and Maximizing Their Time Spent on Instagram
  B. Meta Operates Instagram in a Manner That is Unfair to Adolescents
    1. By Meta’s Design, Instagram Induces Compulsive Use Among Adolescents
    2. Instagram’s “Teen Fundamentals” Study Shows Instagram’s Power to Induce Compulsive Use Among Adolescents
    3. Instagram Features Induce Compulsive Use
    4. Instagram Induces Widespread Compulsive Use Among Adolescents
    5. Instagram Harms Adolescents by Inducing Compulsive Use
  C. Meta Engages in Deceptive Conduct by Omitting and Misrepresenting Material Facts
    1. Meta Did Not Disclose its Knowledge That Instagram Harms Users, Particularly Adolescents
    2. Meta Promoted Misleading Metrics About the Incidence of Harm on Instagram
    3. Meta Deceived Consumers by Promoting “Time Spent” Tools Despite Known Inaccuracies
    4. Through Public Misrepresentations, Meta Leads the Public to Believe That Instagram is Safe for Adolescents
VIOLATIONS OF LAW
RELIEF REQUESTED

The State of Oklahoma, by and through Attorney General Gentner Drummond (“Plaintiff”
or “Attorney General”) brings this action pursuant to the Oklahoma Consumer Protection Act, 15 O.S. §§ 751-763 (“OCPA”), against Defendants Meta Platforms, Inc. f/k/a Facebook, Inc. and
Instagram, LLC (collectively “Defendants” or “Meta”) to stop Meta’s deceptive and unfair
business practices that are fueling a mental health crisis among adolescents in the State of
Oklahoma.
SUMMARY OF THE CASE
1. Meta—through Instagram and Facebook—has created a social media empire to
generate enormous profits at the expense of millions of young Americans. Meta develops and
continually refines powerful and unprecedented technologies that attract, engage, and ultimately
hijack the time and attention of Oklahoma’s youth. Meta’s social media platforms have had
profound and far-reaching effects on the psychological and social well-being of young
Oklahomans. For most Oklahoma youth, Meta’s social media platforms, and specifically
Instagram, are an integral part of growing up, a necessity as they navigate adolescence.
2. Meta’s motivation is simple: greed. To maximize its profits, Meta has repeatedly
deceived and misled the public about the known and substantial dangers associated with the use
of its social media products. Meta has concealed the ways its social media products manipulate
and exploit the most vulnerable Oklahoma consumers: kids and teenagers. This was not an
accident. Meta has at all times been aware of the widespread risks its social media products pose
to the mental and physical health of Oklahoma youth, yet it knowingly and repeatedly opted to
prioritize profits above users’ well-being. In doing so, Meta engaged in, and continues to engage
in, unlawful conduct that violates Oklahoma law.

3. As alleged in Section A, Meta’s core business revolves around maximizing the
amount of time users are actively engaged on its platform. The longer those users stay engaged,
the more data they provide to Meta, and the more advertising revenue Meta rakes in.
4. As alleged in Section B, Meta operates its social media products in an unfair
manner. Meta understands that developing adolescent brains are especially vulnerable to
manipulation. With that knowledge, Meta engineers and programmers created social media
products to exploit those vulnerabilities. Meta’s exploitation took several forms including, among
others, engagement-maximizing features such as: (a) intermittent dopamine-release
recommendation algorithms; (b) “Likes” and features designed to allow users to socially compare
themselves to other young users; (c) audiovisual and haptic alerts that incessantly recall young
users to Instagram at all times of the day and night; and (d) content-presentation formats, such as
“infinite scroll,” designed to make it difficult for young users to disengage with Meta’s products
even when they want to. Meta knows that Instagram induces compulsive use and facilitates
addiction, and Meta knows that Instagram harms young users.
5. As alleged in Section C, Meta deceives consumers. Meta knew that adolescent use
of its platforms—particularly Instagram—is associated with serious mental health problems like
depression, anxiety, insomnia, and interference with education and daily life. Meta knew these
risks to young users because it had all the user engagement data. It had all the research. However,
rather than disclose what it knew, Meta published misleading metrics through its Community
Standard Enforcement Reports that dramatically understate the actual rates of harm being suffered
by young Instagram users.
6. Not only did Meta knowingly publish inaccurate metrics, it actively concealed its
own internal research findings that repeatedly showed the actual harm experienced by users was
far higher than Meta’s published metrics. Meta further misled consumers and the public at large
by claiming (1) that its “Time Spent” tool was an effective way to curb use when it knew the tool
provided inaccurate information, (2) that it restricts young users from accessing harmful content,
(3) that it does not prioritize the maximization of time spent by users on its platform, (4) that it
does not place monetary values on teen users, (5) that it does not restrict access to its internal
research that would allow researchers, consumers, and the public in general, to fully understand
the risks posed by Meta’s platform, and (6) that its platforms, particularly Instagram, are not
addictive.
7. With these allegations in mind, the state of teen mental health in Oklahoma cannot
be ignored. In the decade ending in 2021, the percentage of teens who reported having felt so
consistently sad or hopeless that they discontinued their usual activities increased by more than
50%.1 Rates of students who report having contemplated or even attempted suicide are similarly
alarming.
8. Taken individually and together, Meta’s actions and omissions constitute both
unfair and deceptive trade practices that are prohibited by the OCPA.
BACKGROUND
9. “We have a saying. ‘Move fast and break things.’ The idea is if you never break
anything, you’re probably not moving fast enough,” wrote then-Facebook CEO Mark Zuckerberg
(“Zuckerberg”) to potential investors just prior to Facebook’s 2012 initial public offering.2 Meta
may have removed this saying from its public mission statement, but the motivation behind it is
1 Oklahoma Youth Risk Behavior Survey 10-Year Trend Monitoring Report (available at
https://oklahoma.gov/content/dam/ok/en/health/health2/aem-documents/family-health/maternal-and-child-
health/child-adolescent-health/yrbs/2021Oklahoma%20YRBS%2010-
Year%20Trend%20Monitoring%20Report%202011-2021_FINAL.pdf).
2 Mark Zuckerberg, Founder’s Letter, 2012 (available at
https://m.facebook.com/nt/screen/?params=%7B%22note_id%22%3A261129471966151%7D&path=%2Fnotes%2F
note%2F&refsrc=deprecated&_rdr).

emblematic of its pursuit of innovation and profit maximization without regard to the collateral
damage it may cause.
10. The Attorney General’s investigation of Meta has revealed that it knowingly and
repeatedly engaged in unfair and deceptive conduct at the expense of Oklahoma consumers.
Meta’s conduct was, and continues to be, unlawful.
11. Meta deliberately designed its social media platforms, particularly Instagram, to be
an addiction machine targeted at consumers under eighteen years old (“Adolescents”).
12. This was not an accident. Meta marshalled vast resources to study and understand
Adolescents’ psychology and behavior so it could better exploit their developmental vulnerabilities
through irresistible design features.
13. Meta did this to capture an ever-increasing amount of Adolescents’ time and user
data—all to serve Meta’s advertising business.
14. Unlike other products that have appealed to Adolescents for generations—like
Hershey bars or cans of Coke—Instagram has no single unit of consumption. There is no natural
stopping point. Instead, Instagram serves up a bottomless pit of content where users can spend
their time limited only by the total hours in a day. And for every second a consumer spends on
Meta’s platforms, Meta profits.
15. Meta designed its social media products to exploit attention, embedding an array of
design features that maximize the engagement of Adolescents on its platforms, peppering them
with inducements to “log on,” and making it psychologically difficult to “log off.” These features,
including push notifications, automatically playing short-form videos (i.e., videos shorter than
one minute), infinite scrolling, and ephemeral content, are designed as obstacles to prevent
Adolescents from disengaging from the platform.

16. The U.S. Surgeon General recently issued an Advisory acknowledging as much:
“You have some of the best designers and product developers in the world who have designed
these products to make sure people are maximizing the amount of time they spend on these
platforms. And if we tell a child, use the force of your willpower to control how much time you’re
spending, you’re pitting a child against the world’s greatest product designers.”
17. Instagram’s design features have fueled a dramatic increase in the amount of time
Adolescents spend on the platform. Indeed, for many Adolescents, Instagram is viewed as an
indispensable part of their identity, a podium from which they can share a carefully cultivated
“highlight reel” of who they are and a place where they must constantly be “present.”
18. Adolescents feel addicted to Instagram. They report difficulty controlling their time
spent on the application. And they frequently express that they would prefer to spend less time on
Instagram but feel powerless to do so. And Meta’s internal studies have repeatedly confirmed these
feelings.
19. Researchers warn that compulsive use of social media platforms like Instagram
imposes a wide range of harms, including increased levels of depression, anxiety, and attention
deficit disorders; altered psychological and neurological development; and reduced sleep, to name
a few. Additionally, there is an immense opportunity cost when adolescent years are spent glued
to Instagram rather than engaged in the physical world and in-person experiences that are critical
to development of the adolescent brain.
20. Meta’s business strategy of consciously addicting Adolescents to its social media
platform has caused, and is continuing to cause, widespread and significant injury to Adolescents
in Oklahoma. This is an unfair practice that violates the OCPA.
4 Ex. 1, Bejar Trans., 236:16-290:14.
5 Id., at 319:21-320:3.
6 Id., at 200:16-201:13.
7 Id., at 319:11-17.

Meta neither fixed the problem nor discontinued the tool. Meta prioritized misleading its
Adolescent users (and their parents) over suffering a public relations hit.
31. Fourth, Meta made material misrepresentations to develop trust among consumers
and parents that Instagram is a safe place for Adolescents. In various public channels, Meta
deceptively represented (1) that it does not prioritize increasing users’ time on Instagram; (2) that
it protects Adolescents from harmful or inappropriate content on Instagram; (3) that it does not
place a monetary value on Adolescents’ use of Meta platforms; (4) that it has not changed its
internal data and research access policies in response to The Wall Street Journal’s 2021 coverage
of its internal research findings; (5) that it uses internal research to improve product safety on a
regular basis; and (6) that its platforms are not addictive.
32. In sum, through its acts, omissions, and misrepresentations, Meta carefully created
the impression that its social media platforms, and specifically Instagram, are safe places for
Adolescents. That impression was false and misleading.
33. Based on this misconduct, the Attorney General brings this action pursuant to the
OCPA and seeks injunctive relief and civil penalties; recovery of attorney fees; and payment of
reasonable expenses, including expert and investigation costs.
THE PARTIES
34. Plaintiff is the State of Oklahoma. This enforcement action is brought by and
through Attorney General Gentner Drummond pursuant to the authority conferred by the OCPA.
35. Defendant Meta Platforms, Inc. f/k/a Facebook, Inc. is a Delaware corporation with
its principal place of business in Menlo Park, California.8

8 On October 28, 2021, Facebook, Inc. changed its name to Meta Platforms, Inc.

36. Defendant Instagram, LLC, is a Delaware limited liability company with its
principal place of business in Menlo Park, California. Instagram, LLC is a subsidiary of Meta
Platforms, Inc.
37. Defendants Meta Platforms, Inc. and Instagram, LLC acted in concert with one
another and as agents and/or principals of one another in relation to the conduct described in this
Petition.
38. All the allegations described in this Petition were part of, and in furtherance of, the
unlawful conduct alleged herein, and were authorized, ordered, and/or done by Defendants’
officers, agents, employees, or other representatives while actively engaged in the management of
Defendants’ affairs within the course and scope of their duties and employment, and/or with
Defendants’ actual, apparent, and/or ostensible authority.
JURISDICTION AND VENUE
39. By this Petition, Plaintiff is asserting causes of action, and seeking remedies, based
exclusively on Oklahoma statutory law.
40. The Petition does not confer diversity jurisdiction upon federal courts pursuant to
28 U.S.C. § 1332, as Oklahoma is not a citizen of any state and this action is not subject to the
jurisdictional provision of the Class Action Fairness Act of 2005, 28 U.S.C. § 1332(d). Federal
question subject matter jurisdiction under 28 U.S.C. § 1331 is not invoked by the Petition.
Nowhere does Plaintiff plead, expressly or implicitly, any cause of action or request any remedy
that necessarily arises under federal law.
41. As a court of general jurisdiction, the District Court is authorized to hear this matter.
42. Although previously conducting business in Oklahoma for years without
registering, on January 17, 2019, Meta filed its Certificate of Qualification to register as a foreign
corporation doing business in Oklahoma. In its filing, Meta stated that it intended to conduct
business as “Social media” and that it had assets of $80,077,747,700. That same day, the Oklahoma
Secretary of State granted Meta a Certificate of Authority to transact business in the State of
Oklahoma.’ On May 15, 2023, in order to continue doing business in Oklahoma, Meta filed an
Annual Certificate including “a report of the amount of capital invested in Oklahoma by a foreign
corporation.” In that filed certificate, Meta stated that the amount of “funds, credits, securities
and property” that it “used or employed in the business carried on in the State of Oklahoma” was
$610,956.
43. Meta has entered into millions of individual contracts with individual Oklahomans
and Oklahoma businesses to use its platforms.11
44. By 2020, Meta estimated that Instagram, only one of its “Family of Apps” or
“FoA,” had reached a staggering saturation rate of 80% of Oklahomans under 35 years old. Meta’s
own documents also show that total daily active users (“DAU”) in the State of Oklahoma for
Instagram were: (1) more than 800,000 Oklahomans in 2018; (2) more than 900,000 in 2019; (3)
more than 1,000,000 in 2020; and (4) more than 1,200,000 in 2021. Instagram is massively popular
among young Oklahomans. According to Meta’s internal metrics, from July 2020 to June 2021,
over 300,000 Oklahoma teens used Instagram monthly. During that time, over 219,000 Oklahoma
teens used Instagram daily. Between October 2022 and April 2023, over 296,000 “young adults”
(according to Meta’s internal definition) in Oklahoma used Instagram daily.
45. Meta closely monitored its performance, both in terms of numbers of users and time
spent on its platforms, in Oklahoma. Meta monitored the following metrics for Instagram usage in
9 Ex. 2, Facebook, Inc.’s Certificate of Qualification (Foreign Corporation) and Facebook, Inc.’s Certificate of
Authority to Transact Business in Oklahoma.
10 Ex. 3, Facebook, Inc.’s May 15, 2023, Annual Certificate.
11 Ex. 4, Instagram Terms of Use.

Oklahoma: (1) the amount of time daily active teens spent on Instagram per day; (2) teen
“penetration” in the state; (3) the ratio of teen daily active users versus monthly active users; (4)
teen monthly active user “story participation” rates; (5) the amount of “feed media” daily active
teens consumed per day on Instagram; (6) the amount of “stories” that daily active teens consumed
per day on Instagram; (7) Instagram market saturation with respect to users under 35; (8) the
percentage of Facebook Android monthly active users on Instagram; and (9) the reduction in
monthly active users over a two-month time period.
46. Perhaps most strikingly, Meta internally estimated that about 80% of Oklahoma
teens were monthly active users of Instagram.
47. And of course, Meta enriched itself by selling advertisements targeted at users in
Oklahoma. According to Meta’s public advertising library, Meta regularly targets advertisements
to Oklahoma. All manner of Oklahoma entities—from the OKC Thunder, the Tulsa World,
QuikTrip, and Hobby Lobby, to smaller entities within Osage County like the Osage County
Tourism Department and the Fairfax Community Foundation—advertise on Meta’s platforms. To
be sure, countless others also advertise on Instagram to reach Oklahoma audiences and expand
their businesses in Oklahoma.
48. In sum, Meta aims its platforms at Oklahomans, collects massive amounts of
information on Oklahoma users, studies the impact its platform has on Oklahomans, sells
advertising to Oklahomans based on that information, and tracks its performance in Oklahoma—
all to further its goal of generating profits for its shareholders.
49. This Court has personal jurisdiction over each Defendant pursuant to 12 O.S. §
2004 because of their contacts in Oklahoma. As is described more fully below, Defendants (1)
entered contracts with millions of Oklahomans and intentionally availed themselves of the
Oklahoma market by directing marketing efforts towards Oklahomans; (2) sold the opportunity to
advertise to Oklahomans; and (3) monitored their substantial contacts in Oklahoma, so as to render
personal jurisdiction over Meta consistent with traditional notions of fair play and substantial
justice. The allegations in this Petition establish that Defendants had minimum contacts with
Oklahoma and are incorporated by reference herein.
50. Pursuant to 12 O.S. § 137, venue is proper in Osage County because Osage County
is a county where the alleged misconduct occurred and where Defendants have conducted or
transacted business.
FACTUAL ALLEGATIONS
A. Defendants Engage in Consumer Transactions with Oklahoma Consumers
51. The OCPA broadly defines a “consumer transaction” as “the advertising, offering
for sale or purchase, sale, purchase, or distribution of any services or any property, tangible or
intangible, real, personal, or mixed, or any other article, commodity, or thing of value wherever
located, for purposes that are personal, household, or business oriented.” 15 O.S. § 752(2). The
definition includes “anything that could be sold or marketed to a consumer.” Horton v. Bank of
America, N.A., 189 F.Supp.3d 1286, 1293 (N.D. Okla. 2016). As described herein, Defendants have
engaged, and continue to engage, in conduct that constitutes a “consumer transaction.”
1. Meta Offers Instagram in Exchange for Consumers’ Valuable Consideration That
Enables Meta to Sell Advertising
52. Through its mobile application and website, Instagram offers Oklahoma consumers
the opportunity to connect with friends, follow accounts, and explore various interests.
53. On Instagram, consumers interact with different “surfaces.” Those include: (1) the
main “Feed” and “Stories” surfaces, which display content posted by accounts the consumer
follows; (2) the “Explore” surface that suggests new content to consumers; (3) the “Reels” surface,
focused on short-form videos; and (4) the “Direct Messaging” surface, which allows consumers to
send messages to one another.
54. No two consumers’ experiences on Instagram are the same. Rather, Instagram
presents a customized display to each consumer based on the interests and preferences they express
on Instagram, along with other user data in Meta’s possession.
55. To fully access Instagram, consumers must create an account.
56. To create an account, consumers enter into a contract with Meta.12
57. By entering that contract, users agree to comply with Instagram’s Terms of Use
(the “Instagram Terms”).13
58. The Instagram Terms state that “The Instagram Platform is one of the Meta
Products, provided to you by Meta Platforms, Inc. The Instagram Terms therefore constitute an
agreement between you and Meta Platforms, Inc.”14
59. Under the Instagram Terms, users do not pay to use Instagram. Rather, in exchange
for the right to use Instagram, consumers agree to a host of terms that power Meta’s advertising
business.
60. For example, in a section titled “How Our Service Is Funded,” the Instagram Terms
explain that “[i]nstead of paying to use Instagram, by using the Service covered by these Terms
[i.e. Instagram], you acknowledge that we can show you ads that businesses and organizations pay
us to promote on and off the Meta Company Products. We use your personal data, such as
information about your activity and interests, to show you ads that are more relevant to you.”15
12 See Ex. 4, Instagram Terms of Use.
13 Id.
14 Id.
15 Id.

61. The Instagram Terms also state that Meta “allow[s] advertisers to tell us things like
their business goal and the kind of audience they want to see their ads. We then show their ad to
people who might be interested. We also provide advertisers with reports about the performance
of their ads to help them understand how people are interacting with their content on and off
Instagram. For example, we provide general demographic and interest information to advertisers
to help them better understand their audience.”16
62. Under the Instagram Terms, Oklahoma consumers “pay” for Instagram by allowing
Meta to build its advertising business using consumers’ time and attention.
63. Under the Instagram Terms, consumers “must agree to [Meta’s] Privacy Policy to
use Instagram.” There is no other way for a consumer to use Instagram.
64. Every consumer must agree that Meta may collect a host of data including, among
other things: (1) information about the consumer’s activity on Instagram; (2) the messages the
consumer sends and receives; (3) the content the consumer posts through Instagram’s camera
feature and the consumer’s camera roll; (4) the consumer’s responses to various types of
advertisements; and even (5) the hardware and software the consumer is using, GPS data,
Bluetooth signals, nearby Wi-Fi access points, and beacons and cell towers.
65. Meta uses this data to further its advertising business. In particular, it allows
advertisers to target consumers that reside in specific locations in Oklahoma. For example, Meta
allows advertisers to target Oklahoma consumers located in Oklahoma City and Tulsa simply by
choosing from a Meta-created list of “Designated Market Areas,” or “DMA.” Meta assigns DMA
Code 650 to Oklahoma City and DMA Code 671 to Tulsa.17
16 Id.
17 See Designated market areas for ad marketing, Meta Business Help Center (available at
https://www.facebook.com/business/help/1501907550136620).

66. Meta’s connections to Oklahoma are not limited to Oklahoma City and Tulsa. Meta
enables advertisers to focus on any part of Oklahoma, including Osage County, either by targeting
specific areas by zip code or by creating custom DMAs simply by dropping a pin on a map and
choosing a radius around the pin.18
2. Advertising is the Core of Meta’s Business
67. Meta has become one of the largest and most profitable companies in the world by
offering highly targeted, data-driven advertising using massive databases of information collected
from the consumers who use its platforms.
68. As Zuckerberg explained, “based on what pages people like, what they click on,
and other signals, we create categories . . . and then charge advertisers to show ads to that category.
Although advertising to specific groups existed well before the internet, online advertising allows
much more precise targeting and therefore more-relevant ads.”19
69. On information and belief, Meta charges a premium to businesses for access to the
“much more precise” advertising opportunities on its social media platforms.
70. Consumers are served targeted advertisements during all, or nearly all, sessions on
Instagram. And consumers see advertisements almost constantly on Instagram, often several times
per minute. The advertisements Meta displays on Instagram are interwoven into most if not all of
Instagram’s “surfaces.” The same is true of Facebook.
71. Given this business model, Meta is motivated to maximize the time users spend on
Instagram and Facebook.
18 Meta’s services include a function it calls “Drop a Pin” that allows targeted advertising by identifying a location on
a map using a pin and adjusting a slider scale to establish a particular radius around the pinned location to represent
the designated market area. See Use location targeting, Meta Business Help Center (available at
https://www.facebook.com/business/help/365561350785642?id=176276233019487).
19 See Understanding Facebook’s Business Model, Mark Zuckerberg, January 24, 2019 (available at
https://about.fb.com/news/2019/01/understanding-facebooks-business-model/).

72. One incentive is that the more time users spend on Meta’s platforms, the more
“inventory” Meta can sell. For instance, if a user increases her time spent viewing her Instagram
“feed” from one to five hours per day, Meta can deliver roughly five times the number of
advertisements to that user. The increase in time spent therefore significantly increases the profits
Meta can make off this user.
73. Second, the more time that same user is engaged on Instagram, the more Meta
learns about her. This data is gathered and refined so that she can be more accurately dropped into
a particular category and ads shown to her can be more precisely targeted. On information and
belief, advertisers will pay more for these ads.
74. As described more fully below, Meta has succeeded in capturing a breathtaking
amount of consumer time, attention, and data—especially on Instagram, and especially from
Adolescents.
3. Meta Prioritizes Acquiring Adolescents and Maximizing Their Time Spent on
Instagram
75. In Meta’s business model, not all consumers are created equal. Adolescents are
Meta’s prized demographic.
76. Meta has pursued increasing Adolescents’ time spent on its platforms as one of the
company’s most important goals. As one Meta analyst wrote in 2016:
Lifetime Value (LTV)
This number is core to making decisions about your business. Lifetime value is the
cumulative total “value” (usually expressed as “profit”) you expect from a
customer/user. With this number, we can make better decisions regarding how
much to spend on each user. Generally, you do not want to spend more than the
LTV of the user.
Short summary is the “young ones are the good ones.” You want to bring people to
your service young and early.

77. For example, as of November 2016, Meta’s “overall goal remain[ed] total teen time
spent... with some specific efforts (Instagram) taking on tighter focused goals like U.S. teen total
time spent.”
78. This strategy was directed by Zuckerberg, who “decided that the top priority for the
company in 2017 is teens.”
79. On information and belief, Meta has worked to maximize Adolescents’ “time
spent” throughout its corporate history. To that end, a product manager within Instagram wrote
that he wanted to establish a small team “focused on getting a very clear understanding of our
current US DAP and MAP growth situation, opportunities, and challenges because 1) US Teens
are our #1 cohort for both long-term growth of [Instagram] and [Facebook].”
80. This is especially true of Instagram, which is central to Meta’s strategy to grow its
number of Adolescent users.
81. As Meta knows, Instagram is especially appealing to Adolescents and is Meta’s
most popular application with that demographic. Meta therefore devotes vast resources to
increasing Adolescents’ engagement on Instagram.
82. Meta’s internal studies show that Adolescents have an outsized influence on their
entire households’ attitudes towards Instagram. As Meta’s internal research shows, “[t]eens are
household influencers bringing [family] members (parents and younger siblings) to IG
[Instagram], as well as shaping what is ‘normal’ behavior on IG [Instagram].”
83. Even more fundamentally, Meta pursues Adolescents because Meta’s advertising
customers value that audience.
84. Among other reasons, Meta’s advertising partners want to reach Adolescents
because they (1) are more likely to be influenced by advertisements; (2) may become lifelong
customers; and (3) set trends that the rest of society emulates.
85. Notably, Meta allows advertisers to target Adolescents on Instagram based on their
age and location.
86. On information and belief, many advertisers pay Meta a premium to serve
advertisements to Adolescents, including advertisements to Adolescents in specific geographic
markets, such as those in Oklahoma and in Osage County.
87. Meta is motivated to increase Adolescents’ time spent on Instagram not only
because it is a meaningful stream of advertising business, but also because the data that Meta
collects from that use is itself highly valuable.
88. Meta has profited immensely from its business model. Meta reported earning
$116.6 billion in revenue in 2022, with $23.2 billion in net income, and Zuckerberg, its CEO, has
become one of the wealthiest people in the world.
89. In addition to financial success, Zuckerberg’s role as Meta’s CEO and Founder has
made him a public figure able to exert significant influence not only over the company, but also
over society writ large. In a private email exchange with at least four other billionaires, one of
Meta’s major investors told Zuckerberg that he believed “Mark Zuckerberg has been cast as *the
spokesman* for the Millennial Generation — as the single person who gives voice to the hopes and
fears and the unique experiences of this generation, at least in the USA.” In a response, Mr.
Zuckerberg agreed with that sentiment, stating that he is “the most well-known person of [his]
generation.”
B. Meta Operates Instagram in a Manner That is Unfair to Adolescents
90. Meta has engaged in unfair practices by designing and operating its platforms in a
manner that addicts Adolescents on a massive scale.
1. By Meta’s Design, Instagram Induces Compulsive Use Among Adolescents
91. For generations, companies have marketed products to Adolescents—from bikes to
Barbies to baseball cards. Unquestionably, products like those appealed to a young audience, and
their creators marketed them accordingly.
92. Meta could have followed a similar course. It might have offered a version of
Instagram that was simply appealing, but not addictive.
93. Instead, Meta designed Instagram to exploit known vulnerabilities in Adolescents’
neurological development, making Instagram biologically difficult—and in some cases nearly
impossible—for teens to resist.
94. As Meta’s founding president, Sean Parker, explained in 2018:
The thought process that went into building these applications, Facebook being the
first of them ... was all about: ‘How do we consume as much of your time and
conscious attention as possible?’ That means that we need to sort of give you a
little dopamine hit every once in a while, because someone liked or commented
on a photo or a post or whatever. And that’s going to get you to contribute more
content and that’s going to get you ... more likes and comments. It’s a social-
validation feedback loop ... exactly the kind of thing that a hacker like myself
would come up with, because you’re exploiting a vulnerability in human
psychology. The inventors, creators—me, [Meta founder] Mark [Zuckerberg],
[Instagram founder] Kevin Systrom on Instagram, all of these people—
understood this consciously. And we did it anyway.
95. On an ongoing basis, Meta pours massive resources into understanding
Adolescents’ cognitive development and vulnerabilities.
20 See Alex Hern, ‘Never get high on your own supply’ — why social media bosses don't use social media, The
Guardian (Jan. 23, 2018) (available at https://www.theguardian.com/media/2018/jan/23/never-get-high-on-your-own-supply-why-social-media-bosses-dont-use-social-media) (emphasis added).
96. For example, in the late 2010s, Meta’s Consumer Market Research team created a
“very deep body of work over the course of years/months” studying teens. That group did
“enormous work and investment” in “teen foundational research.”
97. But that “very deep body of work” was not enough. In 2020, Meta started the “Teen
Ecosystem Understand” project, which was an ongoing effort to study Adolescent users. Led by
Instagram’s “growth” team, this project sought to deliver insights that would allow Meta to make
Instagram increasingly irresistible to Adolescent users.
98. On information and belief, Meta’s Teen Ecosystem Understand and Consumer
Market Research projects were two of many initiatives Meta directed to study Adolescents so that
it could capture more of their time and attention.
2. Instagram’s “Teen Fundamentals” Study Shows Instagram’s Power to Induce
Compulsive Use Among Adolescents
99. A May 2020 report by the Teen Ecosystem Understand project illustrates the
lengths to which Meta studied, understood, and sought to exploit teens’ neurological
vulnerabilities.
100. Titled “Teen Fundamentals,” the 97-page internal presentation purports to be a
“synthesis of adolescent development concepts, neuroscience as well as nearly 80 studies of our
own product research.” One of the presentation’s stated goals was to “look . . . to biological factors
that are relatively consistent across adolescent development and gain valuable unchanging insights
to inform product strategy today.”
101. The first section of the internal presentation, titled “Biology,” contains several
images of brains in various stages of development.
21 Meta employees regularly convey information to one another through slideshows using Microsoft PowerPoint.
102. As part of the “Biology” section, the internal presentation explained that “Unlike
the body which functions wholly from day one, the brain essential [sic] spot trains certain areas
and functions at a partial capacity before it is wholly developed . . . The teenage brain is about
80% mature. The remaining 20% rests in the frontal cortex . . . at this time teens are highly dependent
on their temporal lobe where emotions, memory and learning, and the reward system reign
supreme.”
103. According to the report, “teens’ decisions are mainly driven by emotion, the
intrigue of novelty and reward . . . [making] teens very vulnerable at the elevated levels they
operate on. Especially in the absence of a mature frontal cortex to help impose limits on the
indulgence in these.”
104. The next section of the Teen Fundamentals slide presentation, titled “Behavior,”
acknowledged what Meta knew well: “the teenage brain happens to be pretty easy to stimulate.”
105. By way of an example, the presentation noted that “everytime [sic] one of our teen
users finds something unexpected their brains deliver them a dopamine hit.”
106. The next slide explained that “teens are insatiable when it comes to ‘feel good’
dopamine effects.”
107. And the following slide highlighted that “teens brains’ [sic] are especially ‘plastic’
or keen to learn presenting a unique opportunity that coupled with curiosity can send teens down
some interesting rabbit holes... .”
108. Suggesting another way that teen brains are “easy to stimulate,” the internal
presentation notes that “a huge driver for teen behavior is the prospect of reward. This is what
makes them predisposed to impulse, peer pressure, and potentially harmful risky behavior like
drugs, stunts, and pranks...”
109. Building on that theme, the presentation also observed that “approval and
acceptance are huge rewards for teens and interactions are the currency on IG [Instagram]. DMs
[direct messages], notifications, comments, follows, likes, etc. encourage teens to continue
engaging and keep coming back to the app.”
110. The presentation confirmed that Instagram was successfully exploiting these
vulnerabilities, even when its young consumers voiced their concerns directly to Meta.
111. For example, the internal presentation conceded that:
teen brains are much more sensitive to dopamine, one of the reasons that drug
addiction is higher for adolescents and keeps them scrolling and scrolling. And due
to the immature brain they have a much harder time stopping even though they want
to — our own product foundation research has shown teens are unhappy with the
amount of time they spend on our app.
112. But that was not enough for the members of the Instagram “growth” team that
authored the presentation. Instead, the presentation repeatedly asked how Instagram could become
even more irresistible to teens:
• “So, now that we know this — what is the effect of teen’s biology on their
behavior? And how does this manifest itself in product usage?”
• “How well does IG [Instagram] cater to [teens’ desired] activity? How does it
stack up against [its competitors]?”
• “Teen’s [sic] insatiable appetite for novelty puts them on a persistent quest to
discover new means of stimulation . . . how can your team give teens
somewhere new to go or something new to find from the product you work on?”
113. In the end, the internal presentation succinctly described Meta’s future direction: “I
want to remind you all once more of the core things that make teens tick. New things, feeling good
and reward. We are not quite checking all of these boxes . . . some teens are turning to competitors
to supplement for [sic] those needs.” It concluded: “we [would] do well to think hard about how
we can make IG [Instagram] an app tailored to the teenage mindset.”
114. The Teen Fundamentals report was shared with various teams inside Meta,
culminating in its presentation to Instagram’s leadership team (including Adam Mosseri) in June
2020.
115. In response to the presentation, Instagram’s leadership requested additional
research, which led to a subsequent report titled “Deepening Rewards to Drive More Meaningful
Daily Usage” designed to “unpack” the concept of “rewards.” As part of that report, Instagram
employees conducted user interviews and “synthesized this data with academic literature to
understand how it applies at a psychological level.” Through other related projects, Instagram
continued to use its research and understanding of Adolescent users’ brains to gain a competitive
advantage.
3. Instagram Features Induce Compulsive Use
116. Leveraging its understanding of “the things that make teens tick,” Meta exploited
Adolescents’ limited capacity for self-control through an array of features, such as push
notifications, ephemerality, auto-play, and infinite scroll.
117. Collectively, these and other Instagram features created and exploited obstacles to
Adolescents’ decision-making, causing them to spend more time on Instagram than they otherwise
would.
i. Incessant Push Notifications.
118. Meta causes Adolescents to increase their time spent on Instagram by inundating
them with notifications. The Instagram mobile application, by default, peppers users (including
Adolescents) with frequent alerts or notifications intended to cause users to open the application.
119. Echoing Meta’s “Teen Fundamentals” research, academics have observed that
these notifications impact the brain in similar ways as narcotic stimulants:
Although not as intense as [sic] hit of cocaine, positive social stimuli will similarly
result in a release of dopamine, reinforcing whatever behavior preceded it... Every
notification, whether it’s a text message, a “like” on Instagram, or a Facebook
notification, has the potential to be a positive social stimulus and dopamine influx.
120. On information and belief, by default Meta notifies Adolescents when another user
follows them, likes their content, comments on their content, “tags” them, mentions them, sends
them a message, or “goes live” (if the young person follows the user).
121. As Meta’s own research shows, Adolescents have a difficult time resisting these
notifications.
122. In an internal analysis of “Levers for Teen Growth,” a member of the “Instagram
Growth Data Science Team” noted that Defendants could “[l]everage teens’ higher tolerance for
notifications to push retention and engagement.”
123. In a November 2019 internal presentation entitled “IG Notification Systems
Roadshow,” Meta’s employees acknowledged that some of its users are “overloaded because they
are inherently more susceptible to notification dependency.”
124. Similarly, in an internal presentation titled “State of US Teens 2020”—authored by
the “IG [Instagram] Growth Analytics” team—Meta observed that teens “have longer time spent
than adults because they tend to have more sessions per day than adults. This may be because US
teens are more sensitive to notifications and have more notification-driven sessions than adults.”
125. By dispensing notifications and other “rewards” on a variable or intermittent
schedule, Meta increases the addictive nature of Instagram.
ii. Ephemeral Nature of Instagram Content
126. Meta’s internal research also showed that Adolescents are developmentally wired
22 See Trevor Haynes, Dopamine, Smartphones & You: A battle for your time, Blog - Science In The News (Harvard
University May 1, 2018) (available at https://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/).
to fear “missing out.” Meta induces constant engagement by making certain Instagram experiences
ephemeral.
127. Unlike content delivery systems that permit a user to view existing posts on a
schedule convenient for the user, ephemeral content is available only on a temporary basis,
incentivizing users to engage with it immediately.
128. For example, Instagram’s popular “Stories” surface displays user-created images,
videos, and narratives for twenty-four hours, at most, before the content disappears.
129. Similarly, Instagram’s “Live” feature gives users the ability to livestream videos to
followers or the public during a specific session, after which the stream video is typically no longer
available.
130. In the case of “Live,” for instance, an Adolescent’s failure to quickly join the live
stream when it begins means that the user will miss out on the chance to view the content entirely.
Often, Instagram sends users notifications that an account they follow is going live so that users
do not “miss out.”
131. Likewise, because “Stories” delete within 24 hours, Adolescents must constantly
monitor that surface if they desire to keep up with the accounts they follow.
132. Meta deploys ephemeral content features because it knows Adolescents’ fear of
missing out on content will keep them glued to Instagram.
133. Meta’s internal documents acknowledge that Instagram’s ephemeral features drive
compulsive Instagram use.
23 See Introducing Instagram Stories (Aug. 2, 2016) (available at
https://about.instagram.com/blog/announcements/introducing-instagram-stories).
24 See Josh Constine, Instagram Launches “Stories,” A Snapchatty Feature for Imperfect Sharing, TechCrunch (Aug.
2, 2016) (available at https://techcrunch.com/2016/08/02/instagram-stories/).
25 See Live, Instagram Help Center (available at https://help.instagram.com/272122157758915/?helpref=hc_fnav).
134. For instance, Meta’s internal documents acknowledge that Instagram’s ephemeral
features drive compulsive Instagram use on at least a monthly basis.
135. Meta’s documents noted that “[y]oung people are acutely aware that Instagram can
be bad for their mental health, yet are compelled to spend time on the app for fear of missing out
on cultural and social trends.”
136. Even though it knew ephemerality was fueling out-of-control Instagram usage
among Adolescents, Meta pressed forward. Illustrating the mindset within Meta, in 2021 a user
experience researcher observed that direct messages on Instagram “were not urgent (especially
compared to other apps like Snapchat)” and “consisted mainly of videos and memes from friends
which could be watched at [a user’s] leisure.” The researcher noted that “we need to develop new
products that increase the possibilities for time-sensitive interactions on [Instagram]... .”
iii. Infinite Scroll, Autoplay, and Reels Induce Perpetual Instagram Use
137. Meta has also implemented tools that induce perpetual, passive Instagram use.
138. For example, Instagram presents an infinite scroll on several key surfaces. In other
words, Instagram partially displays additional content at the bottom of the user’s screen, such that
the user is typically unable to look at a single post in isolation (without seeing the top portion of
the next post in their feed).
139. Instagram teases yet-to-be-fully-viewed content indefinitely—as the user scrolls
down the feed, new content is automatically loaded and previewed. This design choice makes it
difficult for Adolescents to disengage because there is no natural end point to the display of new
information.
140. Meta also deploys the “auto-play” feature to keep Adolescents on Instagram.
141. Much like infinite scroll, Instagram’s “Stories” surface automatically and
continuously plays content, encouraging Adolescents to remain on the platform.
142. Meta understands that these are powerful tools. Tellingly, when news broke that a
Meta competitor was turning off auto-play for users under 18, Meta’s internal research team
registered surprise. One observed that “[t]urning off autoplay for teens seems like a huge move!
Imagine if we turned off infinite scroll for teens.” The second responded “Yeah, I was thinking the
same thing. Autoplay is HUGE.”
143. Meta’s popular “Reels” surface has these same characteristics. An internal strategy
presentation shows that Reels is “a TikTok competitor for short and entertaining videos” and one
of “three big bets” that “Instagram focused on . . . to bring value to teens” in 2020.
144. Videos on Reels automatically and perpetually play as the user swipes the screen
up to the next video. The short-form nature of Reels (between 15 to 90 seconds, as of April 2023)
makes it difficult for Adolescents to close the app. Other aspects of Reels—including the
placement of the like, comment, save, and share buttons on top of the video—reduce or prevent
interruption and keep the user constantly viewing the video.
145. Internally, Meta employees recognized that the design of Reels was harmful to
Adolescents. As one employee observed in September 2020, “Reels seems to be everything they
denounce in the stupid documentary [i.e. Netflix’s The Social Dilemma], and everything we know
from our research: passive consumption of an endless feed, without any connection to the content
creator. Yay.” A Meta mental health researcher responded, “Exactly. Ugh.”
146. On information and belief, the above-described Instagram features are but a small
sample of the tools Meta has deployed to induce Adolescents to spend more time on Instagram
than they otherwise would.
4. Instagram Induces Widespread Compulsive Use Among Adolescents
147. Because of Meta’s design choices, Instagram has already hooked a generation of
Adolescents.
148. Meta knew. Meta’s studies repeatedly confirmed that Adolescents used Instagram
at alarming rates. They also showed that Adolescents wanted to reduce their time on Instagram
and that Instagram’s engagement-inducing features simply overpowered them. Meta’s studies also
showed that compulsive Instagram use had detrimental effects on Adolescents’ mental health,
sleep, and relationships. But because the compulsive use of Instagram by Adolescents benefitted
Meta’s bottom line, Meta chose to ignore its own internal recommendations.
149. For example, in a February 2019 internal presentation titled “Instagram Teen Well-
Being Study: US Topline Findings,” Meta observed that “App Addiction is Common on IG
[Instagram].” The presentation noted that 23% of teenage monthly active users find that they often
feel like they “waste too much time on” Instagram.
150. In September 2019, Meta commissioned a third-party study on Teen Mental Health.
That study’s first “Topline Headline” was that “Instagram is an inevitable and unavoidable
component of teens lives. Teens can’t switch off from Instagram even if they want to.”
151. Another “Topline Headline” was that “Teens talk of Instagram in terms of an
‘addicts’ narrative’ spending too much time indulging in a compulsive behavior that they know is
negative but feel powerless to resist.”
152. A later slide observed that “Teens are hooked despite how it makes them feel . . .
Instagram is addictive, and time-spend on platform is having a negative impact on mental health.”
153. The Teen Mental Health report also found that teens “know they stay up later than
they should and miss out on sleep to stay plugged in” to Instagram.
154. Elsewhere, the report noted that “Adolescents are acutely aware that Instagram is
bad for their mental health yet are compelled to spend time on the app for fear of missing out on
cultural and social trends.”
155. Relatedly, in an October 2019 discussion regarding mental health research, an
employee observed:
teens told us that they don’t like the amount of time they spend on the app...they
often feel ‘addicted’ and know that what they’re seeing is bad for their mental
health but feel unable to stop themselves. This makes them not feel like they get a
break or can’t switch off social media. In the survey, about 30% (and even larger
proportions of those who are unsatisfied with their lives) said that the amount of
time they spend on social media makes them feel worse.
156. In March 2020, one Instagram employee asked if there were “any recent studies
where we explicitly talk about time spent tools and why teens want them.” In response, a different
employee confirmed that “[t]he feedback, essentially, is that (1) teens feel addicted to IG
[Instagram] and feel a pressure to be present, (2) like addicts, they feel that they are unable to stop
themselves from being on IG, and (3) the tools we currently have aren't effective at limiting their
time on the ap [sic].”
157. But despite that survey feedback, Meta made sure not to speak about the concept
of “addiction” publicly. In that same March 2020 exchange, the two employees discussed a draft
public statement regarding “efforts to combat social media addiction.”
158. The first asked: “Do we want to call it addiction? Maybe not.” The second clarified:
“(this is internal only).” The first employee responded: “Internal only makes it better. I’m just a
little cautious about calling it addiction.” The second responded: “Totally agree, we would never
want to say that!”
159. Employees continued to grapple with this issue in September 2020, when Netflix
released The Social Dilemma, which accused Meta of addicting Adolescents to Instagram.
160. The program hit home with some Meta employees. In one exchange among several
Instagram employees, Instagram’s Director of Data Science stated “[by the way] there is a new
Netflix [documentary] basically saying we’re creating a world of addicts...” In response, a second
employee stated that the documentary “makes me feel like tech plays to humans’ inability to have
self-control lol.”
161. In response, Instagram’s Director of Data Science stated, “Yeh that’s exactly what
the [documentary] says. I think its true [to be honest] . . . I do worry what it does to Adolescents
who are still developing their brains and social skills, as well as being more susceptible to mean
comments or lack of friends/feedback.”
162. A third employee asked if Meta was “creating addicts or facilitating them.... giving
existing addicts a really accessible outlet?” The second employee clarified, “a really accessible
outlet that optimizes for time spent...[and] keeps people coming back even when it stops being
good for them.”
163. Instagram’s Director of Data Science responded, “without the right stimulus,
someone might never become an addict. So it’s a tricky one. It’s like, you’ll never become a
gambling addict if you don’t visit vegas : P”
164. That same day in September 2020, Instagram’s Director of Data Science analyzed
the scope of the problem, creating charts titled “Number of US Humans who spend a lot of time
on IG in a day,” and “US Humans that spend a ton of time on IG in a Week.”
165. The daily chart showed that in the United States more than 475,000 teens spend 3-
4 hours per day on Instagram; more than 235,000 spend 4-5 hours; and more than 300,000 spend
5 or more hours. The weekly chart showed that in the United States more than 1 million teens
spend 14-21 hours; more than 420,000 spend 21-28 hours; and 400,000 spend 28 or more hours
per week on Instagram.
166. Meta knew that this level of usage was caused by its design. As a Meta Vice
President of Product told Instagram’s leadership in February 2021, “problematic use . . . [will]
require more fundamental changes to our goals, what type of work they incentive [sic], and
therefore how core mechanics work (feed design, ranking, sharing, notif[ications]).” That message
did not receive a response.
167. And Meta knew that Instagram’s core mechanics were interfering with a critical
part of Adolescents’ development: Sleep.
168. For example, in an April 2021 analysis, Meta observed that “peak” hours for
messaging were “in the late evenings,” with the highest rate of “sessions with message sends”
occurring between 9-11pm.
169. That same analysis showed that on weekdays, US teens spent the most time on
Instagram between 9-11 pm.
170. After reviewing that information, a Meta data scientist commented, “Honestly the
only insight I see in these charts is that teens are really into using IG [Instagram] at 11pm when
they should probably be sleeping : ( ”
171. Internally, Meta understood the specific ways that compulsive use manifested on
Instagram.
172. For example, a November 2021 analysis titled “Well-being: Problematic Use”
showed that “more reliable proxies for identifying problematic use” included: “‘passive’
consumption, frequent low-engagement sessions, disproportionate night-time usage, repetitive app
checking, and receiving and responding to more push notifications.”
173. That same analysis also acknowledged that “problematic use” was “more common
among teens and people in their 20s.” It explained: “this is consistent with Adolescents having
problems with self-regulation.”
5. Instagram Harms Adolescents by Inducing Compulsive Use
174. Defendants have substantially injured Adolescents by designing Instagram to
induce compulsive and excessive use, which interferes with important developmental processes
and behaviors.
175. These injuries include Adolescents’ lack of sleep and related health outcomes,
diminished in-person socialization skills, reduced attention, increased hyperactivity, self-control
challenges, and interruption of various brain development processes.
176. Defendants have also caused Adolescents to experience mental health harms, such
as increased levels of depression and anxiety.
177. In addition, Defendants have caused Adolescents to have diminished social
capacity and other developmental skills by virtue of the “opportunity cost” associated with
devoting significant time to social media, rather than participating in other developmentally
important, in-person, life experiences.
178. The United States Surgeon General’s May 2023 Advisory, titled “Social Media and
Youth Mental Health” (the “Advisory”), describes some of the harms caused by Defendants. As
the Advisory explains, “[a] Surgeon General’s advisory is a public statement that calls the
American people’s attention to an urgent public health issue . . . Advisories are reserved for
significant public health challenges that require the nation’s immediate awareness and action.”
26 See U.S. Dep't of Health & Hum. Servs., Social Media and Youth Mental Health: The U.S. Surgeon General’s
Advisory 4 (2023) (available at https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf).
According to the Surgeon General, Adolescents’ social media use is one such significant public
health challenge.
179. As the Advisory explains, “[e]xcessive and problematic social media use, such as
compulsive or uncontrollable use, has been linked to sleep problems, attention problems, and
feelings of exclusion among adolescents.”
180. The Advisory also identifies “changes in brain structure,” “altered neurological
development,” “depressive symptoms, suicidal thoughts and behaviors,” “attention
deficit/hyperactivity disorder (ADHD)” and “depression, anxiety and neuroticism,” as additional
harms to Adolescents associated with compulsive social media use.
C. Meta Engages in Deceptive Conduct by Omitting and Misrepresenting Material
Facts About Instagram
181. Under the OCPA, a business engages in deceptive conduct when its acts,
representations, or omissions deceive or could reasonably be expected to deceive or mislead a
person to their detriment.
182. As an initial matter, Meta failed to disclose Instagram’s addictive nature. For years,
Meta has led young consumers (and their parents) to believe that Instagram is a safer and less
harmful platform than it is.
183. Meta deceived young consumers and their parents by failing to disclose that
Instagram is, on balance, harmful to consumers (and especially damaging to girls), by concealing
information about some of its most harmful platform features, by promoting misleading metrics
about platform safety, and by touting inaccurate and ineffective “well-being” initiatives, among
other methods.
28 See Community Standards Enforcement Report, Q2 2023 Report, Meta Transparency Center (Aug. 2023)
(available at https://transparency.fb.com/data/community-standards-enforcement/).
282. In March 2021, Meta conducted an internal “Company Narrative Audit” that
suggested ways the company could improve its standing with the public—and with consumers,
more specifically. The audit identified several “narratives” the company should attempt to combat,
such as “[Meta] allows hateful and harmful content to proliferate on its platform.”
283. To counteract the narrative, the audit suggests that Meta should publicize that:
“Every three months we publish our Community Enforcement Standards Report to track our
progress and demonstrate our continued commitment to making Facebook and Instagram safe.”
284. Consistent with this effort, internal communications show that Meta encouraged
employees to use the Reports as an external “measure for platform safety” that “illustrate the
efforts we are making to keep our platforms safe.”
285. But the impression that the Reports create—that Meta platforms are safe and only
rarely display harmful content—is false and misleading.
286. For example, Meta’s 2021 third quarter Report states that on Instagram, “less than
0.05% of views were of content that violated our standards against suicide & self-injury.” That
representation created the impression that it was very rare for users to experience content relating
to suicide and self-injury on Instagram.
287. But Meta’s contemporaneous internal BEEF survey data showed that during 2021,
6.7% of surveyed Instagram users had seen self-harm content within the last seven days. For users
between 13-15 years of age, 8.4% had seen content relating to self-harm on Instagram within the
last seven days.
288. Thus, the frequency with which users encounter self-harm-related content on
Instagram vastly exceeded the impression Meta created through its Reports.
289. A similar discrepancy may be seen in Meta’s measurement of bullying and
harassing content.
290. For example, the third quarterly Report of 2021 stated that “we estimate between
0.05% to 0.06% of views were of content that violated our standards against bullying & harassment
[on Instagram].” This representation created the impression that it was very rare for users to
observe or experience bullying or harassment on Instagram.
291. Again, Meta’s contemporaneous internal user survey data told a different story:
Among surveyed Instagram users, 28.3% witnessed bullying on the platform within the last seven
days and 8.1% were the target of bullying on the platform within the last seven days.
292. Among 13-15-year-olds, 27.2% reported witnessing bullying within seven days.
Among users aged 16-17, that figure was 29.4%.
293. When asked whether they had been the target of bullying on Instagram within the
last seven days, 10.8% of 13-15-year-olds said yes.
294. Similarly, and contrary to the 2021 third quarter Report’s representation that
harassment on Instagram was rare, Meta’s contemporaneous internal survey showed that 11.9% of
all survey respondents said they had received unwanted advances on Instagram within the last
seven days.
295. Among 13-15-year-olds, 13.0% reported that they had received unwanted advances
within the last seven days. Among 16-17-year-olds, that figure was 14.1%.
296. In other words, contrary to the impression the Reports created, Instagram users in
general—and Adolescents in particular—regularly encounter content related to self-harm,
bullying, and harassment on Instagram. Through its Reports, Meta affirmatively misrepresented
the actual prevalence of such harms.
ii. Meta's Executive Leadership Knew the Reports Misled Consumers
297. Meta's leadership team understood the discrepancy between Meta's public Reports
and Meta’s internal survey results.
298. On October 5, 2021, Bejar—then a contractor for Meta and formerly Meta’s
Director of Site Engineering—emailed Zuckerberg, Sandberg, Cox, and Mosseri voicing concerns
that "there was a critical gap in how [Meta] as a company approach[es] harm."29
299. Bejar proposed that the company shift the focus of its public communications away
from the “prevalence” of community standards violations, towards a measure (like the BEEF
surveys)30 that reflected the true scope of harmful content encountered on Instagram.31
300. Meta's senior leadership did not respond to Bejar. In fact, Zuckerberg, with whom
Bejar worked directly for several years, declined to respond to Bejar’s email. Bejar has stated that
he could “not think of an email that I sent to Mark [Zuckerberg] during my time [at Meta] that he
didn’t read or respond to.”
301. Undeterred, Meta continued to issue and publicize the Reports—even though
Meta's leadership team knew the Reports vastly under-represented the volume of harmful content on
Instagram, and despite Bejar’s pleas.
302. During the multistate investigation, Bejar testified that Meta adopted and
maintained this strategy to mislead the public.32
29 Ex. 1, Bejar Trans., at 236:16-290:14.
30 Id.
31 Id.
32 Id. at 291:7-17.
303. When asked if he believed "that Mr. Zuckerberg and other company leaders focused
on the Prevalence metric because it created a distorted picture about the safety of Meta’s
platforms," Bejar testified "I do."33
304. When asked if he thought “Mr. Zuckerberg’s public statements about prevalence
created a misleading picture of the harmfulness of Meta's platforms," Bejar testified "I do."34
305. And when asked if he was “aware of any instances where the company, in [his]
view, minimized the harms users were experiencing on Meta’s platforms,” Bejar testified: “Every
time that a company spokesperson in the context of harms quotes Prevalence statistics I believe
that is what they are doing, that they’re minimizing the harms that people are experiencing in the
product."35
306. Meta issued the Reports and made other public representations in order to downplay
the harmful experiences that are widespread on Instagram—particularly for Adolescents.
3. Meta Deceived Consumers by Promoting "Time Spent" Tools Despite Known
Inaccuracies
307. For years, Meta has affirmatively deceived consumers by promoting and
maintaining inaccurate time-tracking tools on Meta platforms.
308. On August 1, 2018, Meta announced “new tools to help people manage their time
on Facebook and Instagram.” The announcement touted platform-specific activity dashboards,
daily use reminders, and a push notification-limiting tool engineered “based on collaboration and
inspiration from leading mental health experts and organizations, academics, [Meta's] own
extensive research and feedback from [Meta's] community."36
33 Id. at 200:16-201:13.
34 Id.
35 Id. at 291:7-17.
36 See Ameet Ranadive and David Ginsberg, New Tools to Manage Your Time on Facebook and Instagram, Meta
Newsroom (Aug. 1, 2018) (available at https://about.fb.com/news/2018/08/manage-your-time/).
309. In that announcement, Meta acknowledged that it has "a responsibility to help
people understand how much time they spend on [Meta] platforms so they can better manage their
experience.” Meta stated that it hopes “that these tools give people more control over the time they
spend on our platforms and also foster conversations between parents and teens about the online
habits that are right for them."37
310. Through these public statements and others, Meta led Oklahoma consumers and
parents to believe they could rely on Meta’s so-called “Time Spent” tools to track and manage the
time spent on Instagram in a meaningful, accurate way.
311. That representation was false. By March 2020, Meta knew that its Time Spent data
was materially flawed.
312. As one Meta staffer observed at the time, "[o]ur [Time Spent] data as currently
shown is incorrect. It's not just that Apple / Google have better data. Ours is wrong. Far worse.
We're sharing bad metrics externally . . . The reason this is relevant is we vouch for these
numbers. Any day they're out there is a legal liability."
313. By the middle of 2020, Instagram's team charged with decommissioning (or, as
Meta refers to it, "unshipping") platform features recommended that Meta's Time Spent tools
should be removed from Meta’s platforms.
314. But Meta did not follow that recommendation because the “Time Spent” tool was
a key part of Meta’s message to users that Instagram was a trustworthy platform where the risks
of addiction were low.
315. For instance, when she learned about the effort to remove the Time Spent tools,
Instagram’s Head of Policy feared that the removal of inaccurate Time Spent tools would strip
37 Id.
Meta of its "biggest proof point" on "tech addiction/problematic use." Consequently, she
advocated that the Time Spent tools should remain in place, despite their inaccuracy:
[T]he time spent dash[board and] end of feed notification is the biggest proof
point we have on tech addiction/problematic use and the tool with the most
positive sentiment from our mental health stakeholders—there’s no product work
we’ve done in the last four years that comes close and we wouldn’t have the
credibility we now have in the social comparison/mental health parent space had
we not launched this... In order to land this unship successfully we would need to
land the why, and without doing so we would lose significant credibility with our
policy and mental health stakeholders . . . I don’t think that’s going to land well
without having something that addresses the underlying issue around problematic
use.
316. Internal resistance to the removal of Time Spent tools continued in the latter half of
2020, as users spent more time on Meta’s platforms during the COVID-19 pandemic. For example,
in July 2020, Meta’s Product Marketing and Communications teams told colleagues that Meta
should not remove the inaccurate Time Spent tools because:
• "Time spent is a bigger concern due to COVID/spending more time online."
• "[Meta] just deprioritized the mental health team, so no new or upcoming
[mental health-promoting] features to point to here."
• "[Facebook] launched their v2 time spent tool on iOS in Q2 (Android coming
in Q3) and got decent press around the re-launch."
• "Upcoming moments make the market environment sensitive in this area
(suicide prevention day (sept), world mental health day (oct)) and there is
concern that back-to-school will spark new issues in market perception due to
the majority being online/remote learning so time spent online will likely be
top-of-mind for many."
317. In other words, Meta’s Product Marketing and Communications teams preferred to
maintain the facade of Meta’s Time Spent tools because the truth—that Meta was not actually
providing any meaningful, accurate tools to help users or parents combat or reduce compulsive
use—would undermine Meta’s business interests and public “sentiment.”
318. By October of 2020, internal momentum to discontinue Meta's inaccurate Time
Spent tools was successfully stymied.
319. In the words of one Meta employee who originally advocated for the removal of
inaccurate Time Spent tools: “I don’t think we can touch [the Time Spent tool] for months, maybe
even more. The regulatory and brand risk from removing our only addiction-related features
outweighs . . . the wins around user trust in the data from the few users who use it."
320. More than a year after recognizing the Time Spent tools’ inaccuracy, Meta
continued publicly touting the features.
321. On information and belief, Meta regularly promoted its “Time Spent” tool as an
accurate and useful way for users to control their use of Instagram, even when it knew that the
“Time Spent” tool delivered inaccurate metrics.
322. Meta made these representations to build trust with consumers and parents that
Meta’s Time Spent tool would help users (particularly young users) manage their time on
Instagram, even though Meta knew that tool was broken. In this way, Meta won public trust and
sentiment by deceiving the public about the utility of its core addiction-mitigation feature.
4. Through Public Misrepresentations, Meta Leads the Public to Believe That
Instagram is Safe for Adolescents
323. The Time Spent episode is not the only time Meta has prioritized winning trust over
telling the truth. To the contrary, Meta has repeatedly misrepresented facts about its business to
convince consumers and their parents that Meta can be trusted to keep Adolescents safe on
Instagram.
i. Meta Created the False Impression That It Restricts Adolescents from
Accessing Harmful Content on Instagram
324. Through express representations, Meta cultivated the impression that it protects
young users from harmful or inappropriate content on Instagram.
325. For example, in the opening statement to his Congressional testimony in December
2021, Adam Mosseri stated "We've put in place multiple protections to create safe and
age-appropriate experiences for people between the ages of 13 and 18" on Instagram.
326. Antigone Davis—Meta’s Global Head of Safety—made the same representation in
prepared remarks to Congress in September 2021.38
327. During subsequent questioning from senators, Davis explained that "[w]hen it
comes to those between 13 and 17, we consult with experts to ensure that our policies properly
account for their presence, for example, by age-gating content."39 Davis added that Meta does not
"allow young people to see certain types of content. And we have age gating around certain types
of content."40
328. Davis also specifically testified that Meta does not “direct people towards content
that promotes eating disorders."41
329. But internal documents reveal Meta's knowledge that Adolescents regularly
encounter content on Meta’s platforms that is not age appropriate—and that Meta’s platforms do,
in fact, push certain Adolescents toward content that promotes eating disorders.
330. In fact, a report that Davis herself authored less than a year before her testimony
contradicts the representations she made to Congress (and the public). Titled "Child Safety: State
of Play,” Davis’s October 2020 report contains many alarming findings regarding the lack of
protections for young users on Instagram.
38 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
39 Id.
40 Id.
41 Id.
331. For example, according to that report, Instagram had "minimal child safety
protections” that were needed to prevent “Child Sexual Exploitation.”
332. On the topic of “Age Assurance/Appropriateness” on Instagram—a key feature of
Davis's testimony—Davis's report showed that Instagram's "vulnerabilities" included "[e]nforcement." More specifically, Davis's report noted that Instagram's "age gating relies on either
stated age or weak age models; lack[s] checkpoint.” The same slide identified “content” as an
additional “[v]ulnerability” for Instagram, due to the presence of “inappropriate/harmful content
and experiences for minors.” This slide concluded that for these topics, “there is work happening
in this area but not resourced to move quickly.”
333. A later slide observed Instagram’s significant “vulnerabilities” regarding users’
well-being. It found that “core product features connect to challenging societal issues” such as the
“objectification of women (e.g. [augmented reality] face and body altering filters), competitive
social comparisons (e.g. likes and comments) and anxiety/[fear of missing out] (e.g.
notifications)." That slide also noted that Instagram was "vulnerable" because it had difficulty
“calibrating for content impact on well-being (e.g.[,] eating disorder content and gender-based hate
speech).”
334. Other internal documents demonstrate that Meta did not meaningfully improve
Instagram’s safety or age-appropriateness by the time Davis testified in September 2021.
335. For instance, according to Meta’s internal findings from October 2021 (just after
Davis’s testimony), only 2% of content that young users encounter on Meta’s platforms is “age
appropriate” or “the sort of content we would like to promote to teens.”
336. And, in contrast to Davis’s testimony, Meta’s internal studies show that Instagram
disproportionately directs teen girls to negative appearance comparison-promoting content—content that Meta knows promotes eating disorders. For example, a June 2021 internal study shows
that on Instagram "approximately 70% of teen girls see 'too much' sensitive content," i.e., content
that makes them “often feel worse about themselves.” And another June 2021 internal study
showed that "roughly 1 in 5 pieces of content" teen girls see is "associated with more
negative appearance comparison.”
337. As these examples show, through Mosseri and Davis’s testimony, Meta
affirmatively misled the public about the efficacy of Meta’s efforts to protect young users from
harmful content and/or to deliver age-appropriate experiences on Instagram. These are material
misrepresentations, as reasonable consumers would be less likely to use a platform (or to allow
young users in their care to use a platform) that exposes users to age-inappropriate or harmful
content.
ii. Meta Created the False Impression That It Does Not Prioritize Time Spent
338. To downplay concerns that Instagram is addictive, Meta has repeatedly created the
public impression that it does not prioritize increasing users’ time on Instagram. To create that
impression, Meta’s executives claimed that it does not internally measure success in terms of the
time users spend on Meta’s platforms or otherwise encourage employees to pursue that goal.
339. For example, in October 2019, Mark Zuckerberg publicly stated that Meta does not
allow Meta “teams [to] set goals around increasing time spent on [Meta’s] services.”
340. Similarly, in October 2021, Sheryl Sandberg represented that the company does not
“optimize [its] systems to increase amount of time spent” and that Meta “explicitly do[es]n’t give
[its] team goals around time spent."
341. Meta makes representations like these to garner trust: it wants the public (including
consumers and parents) to believe that it does not measure success in terms of time spent to dispel
the notion that it intentionally fuels compulsive use of Meta’s products.
342. But Meta’s representation that it does not set goals based on time spent is false.
343. For instance, on December 28, 2015, Zuckerberg instructed that Meta should aim
to increase the time that Instagram users spend on the platform by 10% within the next five years.
344. Similarly, an internal email to Instagram’s co-founders lists “emphasis on driving
time spent" among the company's "[k]ey [t]hemes" for the first half of 2016.
345. As another example, an internal Meta presentation titled “2017 Teens Strategic
Focus” explicitly describes Meta’s 2017 “Top-Line Goals” for the first half of 2017, which was
“shared with Zuck.” The first “Top Line Goal” is to “grow teen time spent.”
346. On information and belief, Meta continues to work to increase users’ time spent on
Instagram even to this day.
347. Thus, by claiming that it did not set goals based on time spent, Meta affirmatively
misled the public—including Oklahoma consumers and parents—about Meta's motivations and
internal business practices. This is a material misrepresentation, as reasonable consumers (and the
parents of Adolescents) would be less likely to trust a platform that works to capture
ever-increasing shares of users' time.
iii. Meta Created False Impressions That It Does Not Place a Monetary Value on
Adolescents
348. In a similar vein, Meta deceptively led the public to believe that it does not place a
monetary value on Adolescents' use of Meta platforms. Meta created the impression that it does
not discuss its youngest users in terms of their financial value to the company.
349. For example, during Antigone Davis's September 2021 Congressional testimony,
Senator Amy Klobuchar asked Davis for the monetary value that Meta places upon a young user’s
lifetime use of Meta products.
350. Davis responded, “That’s not how we think about building products for young
people... It’s just not the way we think about it.”
351. Through Davis’s testimony, Meta led the public to believe that it does not place a
monetary value on Adolescents’ use of Meta’s platforms.
352. But Meta’s internal correspondence demonstrates that Davis’s response to Senator
Klobuchar was inaccurate and misleading.
353. For instance, an internal email from September 2018 illustrates that Meta plainly
discusses the financial value that Adolescents represent to the company. According to Meta, “The
lifetime value of a 13 [year old] teen is roughly $270 per teen.”
354. Consequently, through Davis’s testimony, Meta affirmatively misled the public
—including Oklahoma consumers—about whether the company places a monetary value upon young
users’ lifetime use of Meta’s products. This is a material misrepresentation, as reasonable
consumers (and especially the parents of Adolescents) would be less likely to trust a platform that
calculates the monetary value that the platform may extract from an Adolescent’s lifetime
engagement.
iv. Meta Created the Misleading Impression That It Was Not Restricting Access
to Internal Research Findings and That It Used Its Internal Research to Improve
Product Safety
355. Through Congressional testimony, Meta deceptively led the public to believe that
it had not changed its internal data and research access policies in response to The Wall Street
Journal’s 2021 coverage of Meta’s internal research findings. Meta wanted to create that
impression so consumers and parents would believe that the company's well-being research was
widely available internally and that the company had no reason to lock down internal information
about Instagram’s mental health impacts.
356. During Davis’s September 2021 Congressional testimony, Senator Marsha
Blackburn asked Davis "how are you restricting access to data internally? Have your policies
changed since The Wall Street Journal articles [by Jeff Horwitz describing the internal Meta
research shared by Frances Haugen]?"42
357. Davis succinctly responded, "Senator, not that I am—not that I'm aware of
certainly."43
358. Through Davis’s testimony, Meta led the public to believe Meta did not change its
internal access policies—such as restricting internal access to data and research—following The
Wall Street Journal’s coverage of the internal Meta research shared by Frances Haugen.
359. But in fact, as described in detail in Section C.1.ii. above, in reaction to Haugen's
public disclosures (which led to Davis’s Congressional testimony), Meta methodically locked
down internal access to well-being related data and research.
360. To briefly restate evidence described above, in August 2021—shortly after Meta
learned of The Wall Street Journal’s forthcoming journalism—one Instagram research manager
noted that the company was “locking down access to some of the extra sensitive pieces of work.”
42 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
43 Id.
361. The same manager subsequently instructed a research colleague to "make sure that
any of our shareable deliverables or insights docs that you own on the mental well-being space are
locked down."44
362. Consequently, through Davis’s testimony, Meta affirmatively misled the public
—including Oklahoma consumers—about whether the company internally restricted access to data
and research following The Wall Street Journal’s coverage of Meta’s internal findings. This is a
material misrepresentation, as reasonable consumers (and parents of Adolescents) would be less
likely to trust a platform that undertakes affirmative steps to shield from disclosure internal
information about the platform’s “mental well-being” impacts.
363. Similarly, through Congressional testimony, Meta deceptively led the public to
believe that the company regularly uses internal research findings to inform safety-oriented
product improvements. Meta created this impression so consumers, parents, and guardians would
believe that the company used its troubling internal research findings to improve the safety of its
platforms.
364. During Davis’s September 2021 Congressional testimony, Senator Klobuchar
asked Davis: “What specific steps did you... take in response to your own research [into
Instagram users' body image issues] and when?"45
365. Davis responded: “Senator Klobuchar, I don’t know that I'll be able to give you
exact dates, but what I can tell you is that this research has fueled numerous product changes."46
366. Similarly, during Mosseri’s December 2021 Congressional testimony, Senator Ted
44 Id.
45 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
46 Id.
Cruz asked Mosseri: "How did you change your policies as a result of [Meta's internal research
into Instagram users' suicidal thoughts] to protect young girls?"47
367. Mosseri responded: "Senator, I appreciate the question. We use research to not only
change our policies, but to change our product on a regular basis."48
368. Through Davis and Mosseri's Congressional testimony, Meta led the public to
believe Meta regularly uses internal research findings to improve product safety.
369. But in fact, as described in detail in Section C.1.i. above, members of Meta’s
leadership—including Mosseri—acknowledged the company’s failure to translate research
findings into meaningful product changes (1) shortly after Davis’s testimony; and (2) in the months
after Davis’s testimony and preceding Mosseri’s testimony.
370. To briefly restate the evidence detailed above, in October 2021—just two months
before Mosseri's testimony—a senior Meta employee explicitly told Mosseri that Meta had "not
made a lot of progress on getting the research into product.”
371. Around the same time, Mosseri complained about Meta’s failure to translate
research findings into product safety improvements: “I’m really worried about this... we’ve been
talking about this for a long time but have made little progress.”
372. And in November 2021—just one month before Mosseri’s testimony—another
senior Meta employee sent an email to Zuckerberg, Mosseri, and others, underscoring Meta’s
outstanding need “to ensure we have the product roadmaps necessary to stand behind our external
narrative of well-being on our apps.”
47 See U.S. Senate Committee on Commerce, Science & Transportation, Subcommittee on Consumer Protection,
Product Safety, and Data Security, Protecting Kids Online: Instagram and Reforms for Young Users (Dec. 8, 2021)
(available at https://www.commerce.senate.gov/2021/12/protecting-kids-online-instagram-and-reforms-for-young-
users).
48 Id.
373. Consequently, through Davis and Mosseri's Congressional testimony, Meta
affirmatively misled the public—including Oklahoma consumers—about measures the company
had taken (or failed to undertake) to translate troubling research findings into meaningful product
safety improvements. This is a material misrepresentation, as reasonable consumers, parents, and
guardians would be less likely to trust a platform that fails to deploy safety improvements to
products that pose a known, mitigatable risk to users.
v. Meta Created the Impression That Its Products Are Not Addictive Despite
Meta’s Internal Research Showing the Contrary
374. Through Congressional testimony, Meta deceptively led the public to believe that
its platforms are not addictive, despite Meta’s own internal research to the contrary.
375. In her September 2021 Congressional testimony, Davis said that Meta does not
build its products to be addictive and disputed the addictive nature of Meta's products.49
376. Similarly, in Congressional testimony from December 2021, Adam Mosseri said,
"I don't believe that research suggests that our products are addictive."50
377. Through Davis and Mosseri’s testimony, Meta led the public to believe Meta’s
platforms are not addictive.
378. In fact, as described in detail in Section B above, Meta (1) knew Instagram was
addictive; and (2) made decisions that facilitated addiction to Instagram long before Davis and
Mosseri’s misleading testimony.
49 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
50 See Taylor Hatmaker, Instagram's Adam Mosseri Defends the App's Teen Safety Track Record to Congress,
TechCrunch (Dec. 8, 2021) (available at https://techcrunch.com/2021/12/08/instagrams-adam-mosseri-senate-
hearing-teen-safety/).
379. To briefly restate evidence described above, by September 2019, Meta knew from
internal research that "[t]eens are hooked despite how [Instagram] makes them feel . . . Instagram
is addictive, and time-spend on platform is having a negative impact on mental health.”
380. And after observing in May 2020 that "approval and acceptance are huge rewards
for teens and interactions are the currency on [Instagram]," Meta deployed engagement-inducing
platform features, such as "[direct messages], notifications, comments, follows, likes, etc. [that]
encourage teens to continue engaging and keep coming back to the app."
381. Consequently, through Davis and Mosseri’s Congressional testimony, Meta
affirmatively misled the public—including Oklahoma consumers—about the addictive nature of
the Instagram platform. This is a material misrepresentation, as reasonable consumers (and parents
of Adolescents) would be less likely to trust an addictive platform.
VIOLATIONS OF LAW
COUNT I
OKLAHOMA CONSUMER PROTECTION ACT
15 O.S. § 751 (UNFAIRNESS)
382. Oklahoma re-alleges and incorporates by reference all prior paragraphs of this
Petition.
383. The OCPA prohibits businesses from knowingly engaging in “unfair” trade
practices, which are defined as any practice “which offends established public policy” or is
“immoral, unethical, oppressive, unscrupulous or substantially injurious to consumers.” 15 O.S. §
752(14).
384. Defendants have engaged and continue to engage in “consumer transactions” as
that term is defined in the OCPA with hundreds of thousands of Oklahomans.
385. By designing and deploying Instagram in a manner that induces compulsive use,
the Defendants have engaged in unfair trade practices prohibited by the OCPA.
386. Defendants designed and deployed Instagram in a manner that overwhelmed
consumers’ free and informed choice regarding how much time to spend on the Instagram
platform.
387. Defendants’ scheme was particularly unfair as it relates to Adolescent users, who
are a highly susceptible class of consumers. Indeed, Defendants designed and deployed Instagram
in a manner that intentionally exploited the developmental nature of Adolescents’ brains, creating
an obstacle to Adolescents’ free choice and causing them to spend more time on Instagram than
they otherwise would.
388. By designing and deploying Instagram in a manner that induces compulsive use,
Defendants caused or are likely to cause substantial injury to Oklahoma consumers. Specifically,
Defendants’ unfair conduct has caused or is likely to cause significant harms to the mental health
and well-being of Adolescents, who Defendants have caused to spend vastly more time on
Instagram than they otherwise would.
389. Through their conduct, Defendants have likely injured a large number of
Oklahomans, including a significant number of Adolescents that have likely suffered profound and
severe harms as a result of Defendants’ conduct.
390. Each instance of Defendants’ unfair practices constitutes a separate violation of the
OCPA.
391. Insofar as there are positive benefits associated with Defendants’ conduct, those
benefits do not outweigh the harm arising out of Defendants’ conduct.
COUNT II
OKLAHOMA CONSUMER PROTECTION ACT
15 O.S. § 751 (DECEPTION)
392. Oklahoma re-alleges and incorporates by reference all prior paragraphs of this
Petition.
393. Under the OCPA, a business engages in deceptive conduct by, either orally or
through a writing, making a “misrepresentation, omission or other practice that has deceived or
could reasonably be expected to deceive or mislead a person to the detriment of that person." 15 O.S. § 752(13).
394. Defendants have engaged and continue to engage in “consumer transactions” as
that term is defined in the OCPA with hundreds of thousands of Oklahomans.
395. As described in this Petition, Defendants have repeatedly deceived consumers
through their words, conduct, silence, and action—in violation of the OCPA.
396. By making express and implied material misrepresentations about Instagram’s
safety, the incidence of harmful experiences on Instagram, and the efficacy of Instagram’s “well-
being” related platform features (such as the “Time Spent” tool), the Defendants have engaged in
deceptive trade practices that are prohibited by the OCPA.
397. Defendants also engaged in deceptive conduct in violation of the OCPA by failing
to disclose the harms associated with Instagram in general and with certain Instagram platform
features, which Defendants knew had a harmful effect on consumers’ mental health and well-
being. Defendants knew the express and implied representations they were making were not true
but made these representations anyway to increase consumers’ engagement with Instagram.
398. Through their acts, omissions, and misrepresentations, the Defendants downplayed
the risks of Instagram use and caused reasonable consumers to believe something that was false,
i.e., that Instagram is a safer platform than it is in reality.
399. Each instance of Defendants' deceptive practices constitutes a separate violation of
the OCPA.
RELIEF REQUESTED
Plaintiff respectfully requests that this Court:
A. Enter judgment against each Defendant in favor of the State of Oklahoma for each
violation alleged in this Petition;
B. Issue a permanent injunction: (i) prohibiting Defendants from using platform
features that Defendants know or have reason to believe cause compulsive use
among Adolescents; and (ii) requiring Defendants to meaningfully and publicly
disclose, on a regular basis, the risks posed by Instagram to Adolescents;
C. Issue a permanent injunction prohibiting Defendants from engaging in deceptive
acts and practices in violation of the OCPA;
D. Order each Defendant to separately pay civil penalties to the State of Oklahoma of
not more than $10,000 per violation of the OCPA as provided by 15 O.S. § 761.1;
1. Enter judgment finding that each instance in which an Adolescent accessed
the Instagram platform in the State of Oklahoma represents a distinct
violation of the OCPA;
E. Enter judgment against Defendants and in favor of the State of Oklahoma for the
reasonable costs and expenses of the investigation and prosecution of Defendants'
unlawful conduct, including attorney's fees, expert and other witness fees, and
costs, as provided by 15 O.S. § 761.1;
F. Award such other relief as the Court deems necessary and proper under the
circumstances.
GENTNER DRUMMOND
ATTORNEY GENERAL OF OKLAHOMA
Robert J. Carlson, OBA #
Senior Assistant Attorney General
Caleb J. Smith, OBA #
Assistant Attorney General
Oklahoma Office of the Attorney General
15 West 6th Street, Suite
Tulsa, OK
Telephone: 918-581-
Email: Robert.Carlson@oag.ok.gov
Email: Caleb.Smith@oag.ok.gov
-and-
Ethan A. Shaner, OBA #
Deputy Attorney General
Oklahoma Office of the Attorney General
313 N.E. 21st Street
Oklahoma City, OK
Telephone: (405) 521-
Email: Ethan.Shaner@oag.ok.gov
CERTIFICATE OF SERVICE
I do hereby certify that on the 16th day of November 2023, a true, correct and exact copy
of the above and foregoing document was served to those parties as listed below via:
[ ] U.S. Postal Service [ ] In Person Delivery [ ] Courier Service [X] E-Mail [ ] Fax
Robert G. McCampbell, OBA No.
Nicholas (“Nick”) V. Merkley, OBA No.
GABLEGOTWALS
BOK Park Plaza
499 West Sheridan Avenue, Suite
Oklahoma City, Oklahoma
Tel (405) 235-5500 | Fax (405) 235-
Email RMcCampbell@Gablelaw.com
NMerkley@Gablelaw.com
COUNSEL FOR DEFENDANTS,
META PLATFORMS, INC. F/K/A FACEBOOK, INC.
AND INSTAGRAM, LLC
Robert J. Carlson
EXHIBIT 1
CONFIDENTIAL
--000--
CONFIDENTIAL PROCEEDINGS
EXAMINATION UNDER OATH OF ARTURO BEJAR
Regarding Meta Platforms
Tuesday, May 16,
Palo Alto, California
Stenographically Reported By:
Hanna Kim, CLR, CSR No.
Job No.
Page
Veritext Legal Solutions
Calendar-CA@veritext.com 866-299-5127
Q. In that way, do you think Mr. Zuckerberg's public statements about prevalence created a misleading picture of the harmfulness of Meta's platforms?
THE WITNESS: Frances Haugen's whistleblowing.
BY MR. PHELPS:
Q. Might this have been the same day as Ms. Haugen's testimony before Congress?
A. I don't know.
Q. Okay. Do you know what note -- it says, "I saw the note you shared." Do you know what note you were talking about?
A. Yes. He posted an internal note where he was talking about all of these issues.
60 (Pages 234 - 237)
there were really not -- I mean, we'll get into it, but I think I can summarize it by saying, there's a sentence here that says, "We care deeply about issues like safety, well-being, and mental health."
So what is the well-being metric at this point? There is no well-being metric in the company at this point.
What are the safety metrics? If they care about bullying and harassment for teenagers, where are the metrics that tell you the -- and the work based around the percentage of teenagers that are telling you they're experiencing firsthand bullying and harassment? What's the content that's driving that? What are the features that make that better? Right?
"We care." What does it mean, "we care"? What does it mean we care when they talk about mental health, when I'm working on an Instagram team, and they de-prioritized the mental health work in the middle of the pandemic? Right? So what does it mean "they care about mental health"? Does it mean that he cares as a human being, or does the company actually act on these things?
And so, I read this e-mail, and I felt that it was very misleading to employees because he -- he talks about claims that don't make any sense. But if we go through them, no, actually, all those claims make a lot of sense. And I believe it's his responsibility as the leader of the company to investigate, address these claims and innovate and make these things better because that company has managed to get most of humanity on its services.
And so, great, you did that, but, like, this is -- this is the work. This is what you need to be really great at in order to provide a safe environment for teenagers, in order to create a safe environment for everybody and a respectful environment for everybody.
And so, when I wrote this note, I felt that it embodied all of the gaps that I had been researching in that previous year that I tried to summarize in the e-mail I sent to him and Sheryl and Adam and -- and -- and Chris.
Q. Okay. What was your reaction to this note in -- in broad strokes before we get into it in detail?
A. I think it's misleading. And -- and it represented many of the things that I felt were driving the behavior of the company in a way that --
Q. Are you aware of any instances where the company, in your view, minimized the harms users were experiencing on Meta's platforms after you sent those e-mails?
A. Every time that a company spokesperson in the context of harms quotes prevalence statistics, I believe that is what they are doing, that they're minimizing the harms that people are experiencing in the product. Be it -- albeit somebody in the communications team, I think that prevalence is not harm.
These numbers in that e-mail, that's harm. You need prevalence, but it's not reflective of the harm people are experiencing in the product.
81 (Pages 318 - 321)
EXHIBIT 2

FILED - Oklahoma Secretary of State #2312726928 01/17/2019
FILED 01/17/2019 16:29 AM
OKLAHOMA SECRETARY OF STATE

CERTIFICATE OF QUALIFICATION

Filing Fee: Minimum $700.

CLASS     NUMBER OF SHARES     SERIES (If any)     PAR VALUE PER SHARE (If without par value, so state)
Common    5,000,000,000        A                   0.
Common    4,141,000,000        B                   0.

11. Maximum amount of capital said corporation intends and expects to invest in this state at any time during the
current fiscal year: $
*Invested capital is defined as the value of the maximum amount of funds, credits, securities and property of
whatever kind existing at any time during the fiscal year in the State of Oklahoma and used or employed by such
corporation in its business carried on in this state.
12. E-MAIL address of the primary contact for the registered business:
(SOS FORM 0013)

Delaware
The First State
I, JEFFREY W. BULLOCK, SECRETARY OF STATE OF THE STATE OF
DELAWARE, DO HEREBY CERTIFY "FACEBOOK, INC." IS DULY INCORPORATED
UNDER THE LAWS OF THE STATE OF DELAWARE AND IS IN GOOD STANDING AND
HAS A LEGAL CORPORATE EXISTENCE SO FAR AS THE RECORDS OF THIS
OFFICE SHOW, AS OF THE TENTH DAY OF JANUARY, A.D. 2019.
AND I DO HEREBY FURTHER CERTIFY THAT THE ANNUAL REPORTS HAVE
BEEN FILED TO DATE.
AND I DO HEREBY FURTHER CERTIFY THAT THE SAID “FACEBOOK, INC."
WAS INCORPORATED ON THE TWENTY-NINTH DAY OF JULY, A.D. 2004.
AND I DO HEREBY FURTHER CERTIFY THAT THE FRANCHISE TAXES HAVE
BEEN PAID TO DATE.
Jeffrey W. Bullock, Secretary of State
3835615 8300    Authentication:
SRN 20190197586    Date: 01-10-
You may verify this certificate online at corp.delaware.gov/authver.shtml
OFFICE OF THE SECRETARY OF STATE
CERTIFICATE OF AUTHORITY
WHEREAS, FACEBOOK, INC.
incorporated under the laws of the State of -DELAWARE has filed in the office of
the Secretary of State duly authenticated evidence of its incorporation and an
application for Certificate of Authority to transact business in this State, as
provided by the laws of the State of Oklahoma.
NOW THEREFORE, I, the undersigned, Secretary of State of the State of
Oklahoma, by virtue of the powers vested in me by law, do hereby issue this
Certificate of Authority authorizing said Corporation to transact business in this
State.
IN TESTIMONY WHEREOF, I hereunto set my hand and cause to be
affixed the Great Seal of the State of Oklahoma.
Filed in the city of Oklahoma City this
17th day of January, 2019.
Secretary of State

EXHIBIT 3
FILED 03/13/2023 18:41 AM
OKLAHOMA SECRETARY OF STATE

2020 ANNUAL CERTIFICATE

FACEBOOK, INC.    CORPORATION KEY:
STATE OF DOMICILE: DELAWARE
STATUS: OUSTED
ANNIVERSARY DATE: January 17

PLEASE READ CAREFULLY. TO EXPEDITE THE FILING OF THIS FORM, DO NOT CHANGE ANY OF
THE PRE-PRINTED INFORMATION OR FIGURES. IF ANY INFORMATION CONTAINED IN THIS
REPORT HAS BEEN CHANGED, OR THE CORPORATION HAS CEASED DOING BUSINESS IN
OKLAHOMA, IT IS IMPORTANT THAT YOU NOTIFY THIS OFFICE IN WRITING TO REQUEST THE
APPROPRIATE FORM TO AMEND THE RECORDS OR TO WITHDRAW FROM THE STATE.

THIS FORM is to be used for filing the annual certificate, a report of the amount of capital invested in
Oklahoma by a foreign corporation pursuant to 18 O.S., § 1142.A.13. YOU MUST COMPLETE LINES
6 & 8 ON THE REVERSE SIDE OF THIS ORIGINAL FORM TO CORRECTLY DETERMINE THE
FILING FEE. ALL BLANKS MUST BE FILLED IN WITH EITHER AN AMOUNT OR THE WORD
"NONE".

ATTACH PAYMENT TO THIS CERTIFICATE. You may wish to retain a copy for your files. MAKE ALL
CHECKS PAYABLE AND DIRECT ALL CORRESPONDENCE TO:
OKLAHOMA SECRETARY OF STATE
421 NW 13TH STREET, SUITE
OKLAHOMA CITY, OK (405) 621-

WHO MUST FILE: Any corporation that has NOT paid a fee on its authorized capital. This form MUST
be filed each year with the Secretary of State on the anniversary of its qualification (as shown above). It
must be signed by the president, vice president, or managing officer of the corporation.

The PENALTY for failure to file an Annual Certificate required by law is OUSTER of the corporation
from doing business in the State of Oklahoma.

PLEASE NOTE that the Annual Certificate is SEPARATE FROM, AND IN ADDITION TO, the
Franchise Tax return which must be filed each year with the Oklahoma Tax Commission.

RECEIVED
MAY 15
OKLAHOMA SECRETARY OF STATE
FILED - Oklahoma Secretary of State #2312726928 06/15/2023

ALL BLANKS MUST BE FILLED IN WITH EITHER AN AMOUNT OR THE WORD "NONE"
2. 9,141,000,000
3. 457,050,000,000
4. 457,050,000,000
5. 300,000
6. 610,956
7. 457,049,700,000
8. 310,956
Fee: $320.96
+$500 ouster penalty
T. Turner 4/25/

1. Total authorized par value capital stock
2. Number of authorized no par value shares
3. No par value shares x $50.00 (Par value assigned by law 18 O.S. § 1142.A.10 for computing the filing fees only)
4. Sum of lines 1 and 3.
5. Total amount of capital on which the corporation has credit for paying. This is an aggregate amount including the amount paid initially upon qualification and subsequently upon annual certificates. If two or more corporations have merged, the survivor is given credit for the amount of capital upon which the merging corporations have paid fees in Oklahoma, upon filing proof of the merger with the SOS.
6. Maximum amount of capital invested by said corporation in the State of Oklahoma. This means the maximum amount of funds, credits, securities and property of whatever kind used or employed in the business carried on in the State of Oklahoma.
7. Difference between the total authorized capital and paid on credit. (Line 4 - Line 5)
8. Total invested in excess of the amount heretofore paid on. (Line 6 - Line 5).

FEE CALCULATION INSTRUCTIONS - complete only one.
If the amount entered on Line 8 is zero or negative, enter and attach the filing fee of $10.00 to this report.
If the amount entered on Line 8 is greater than zero, compute the fee at 1 per cent ($1.00 per $1,000.00) on this amount plus the $10.00 fee.

Facebook, Inc.
(EXACT NAME OF CORPORATION)
BY: _______________, Secretary
(Please print name)
IN THE DISTRICT COURT OF OSAGE COUNTY
STATE OF OKLAHOMA

FILED NOV 16 2023
Osage County, Okla.
BURD, Court Clerk

STATE OF OKLAHOMA, ex rel.,
GENTNER DRUMMOND,
ATTORNEY GENERAL OF
OKLAHOMA,
Plaintiff, JURY TRIAL DEMANDED
v. Case No. CJ-2023-180
Judge Stuart Tate
META PLATFORMS, INC. f/k/a
FACEBOOK, INC., and
INSTAGRAM, LLC,
Defendants.
PETITION
TABLE OF CONTENTS
SUMMARY OF THE CASE .......................................................... 1
BACKGROUND .................................................................. 3
THE PARTIES ................................................................. 9
JURISDICTION AND VENUE ..................................................... 10
FACTUAL ALLEGATIONS ........................................................ 13
A. Defendants Engage in Consumer Transactions with Oklahoma Consumers ........ 13
1. Meta Offers Instagram in Exchange for Consumers’ Valuable Consideration That
Enables Meta to Sell Advertising ........................................ 13
2. Advertising is the Core of Meta’s Business .............................. 16
3. Meta Prioritizes Acquiring Adolescents and Maximizing Their Time Spent on
Instagram ............................................................... 17
B. Meta Operates Instagram in a Manner That is Unfair to Adolescents .......... 20
1. By Meta’s Design, Instagram Induces Compulsive Use Among Adolescents .... 20
2. Instagram’s “Teen Fundamentals” Study Shows Instagram’s Power to Induce
Compulsive Use Among Adolescents ........................................ 21
3. Instagram Features Induce Compulsive Use ................................ 24
4. Instagram Induces Widespread Compulsive Use Among Adolescents ........... 29
5. Instagram Harms Adolescents by Inducing Compulsive Use .................. 33
C. Meta Engages in Deceptive Conduct by Omitting and Misrepresenting Material Facts
About Instagram ......................................................... 34
1. Meta Did Not Disclose its Knowledge That Instagram Harms Users, Particularly
Teen Girls .............................................................. 35
2. Meta Promoted Misleading Metrics About the Incidence of Harm on Instagram ... 49
3. Meta Deceived Consumers by Promoting “Time Spent” Tools Despite Known
Inaccuracies ............................................................ 54
4. Through Public Misrepresentations, Meta Leads the Public to Believe That
Instagram is Safe for Adolescents ....................................... 57
VIOLATIONS OF LAW .......................................................... 67
RELIEF REQUESTED ........................................................... 70
The State of Oklahoma, by and through Attorney General Gentner Drummond (“Plaintiff”
or “Attorney General”) brings this action pursuant to the Oklahoma Consumer Protection Act, 15
O.S. §§ 751-763 (“OCPA”) against Defendants Meta Platforms, Inc. f/k/a Facebook, Inc. and
Instagram, LLC (collectively “Defendants” or “Meta”) to stop Meta’s deceptive and unfair
business practices that are fueling a mental health crisis among adolescents in the State of
Oklahoma.
SUMMARY OF THE CASE
1. Meta—through Instagram and Facebook—has created a social media empire to
generate enormous profits at the expense of millions of young Americans. Meta develops and
continually refines powerful and unprecedented technologies that attract, engage, and ultimately
hijack the time and attention of Oklahoma’s youth. Meta’s social media platforms have had
profound and far-reaching effects on the psychological and social well-being of young
Oklahomans. For most Oklahoma youth, Meta’s social media platforms, and specifically
Instagram, are an integral part of growing up, a necessity as they navigate adolescence.
2. Meta’s motivation is simple: greed. To maximize its profits, Meta has repeatedly
deceived and misled the public about the known and substantial dangers associated with the use
of its social media products. Meta has concealed the ways its social media products manipulate
and exploit the most vulnerable Oklahoma consumers: kids and teenagers. This was not an
accident. Meta has at all times been aware of the widespread risks its social media products pose
to the mental and physical health of Oklahoma youth, yet it knowingly and repeatedly opted to
prioritize profits above users’ well-being. In doing so, Meta engaged in, and continues to engage
in, unlawful conduct that violates Oklahoma law.
3. As alleged in Section A, Meta’s core business revolves around maximizing the
amount of time users are actively engaged on its platform. The longer those users stay engaged,
the more data they provide to Meta, and the more advertising revenue Meta rakes in.
4. As alleged in Section B, Meta operates its social media products in an unfair
manner. Meta understands that developing adolescent brains are especially vulnerable to
manipulation. With that knowledge, Meta engineers and programmers created social media
products to exploit those vulnerabilities. Meta’s exploitation took several forms including, among
others, engagement-maximizing features such as: (a) intermittent dopamine-release
recommendation algorithms; (b) “Likes” and features designed to allow users to socially compare
themselves to other young users; (c) audiovisual and haptic alerts that incessantly recall young
users to Instagram at all times of the day and night; and (d) content-presentation formats, such as
“infinite scroll,” designed to make it difficult for young users to disengage with Meta’s products
even when they want to. Meta knows that Instagram induces compulsive use and facilitates
addiction, and Meta knows that Instagram harms young users.
5. As alleged in Section C, Meta deceives consumers. Meta knew that adolescent use
of its platforms—particularly Instagram—is associated with serious mental health problems like
depression, anxiety, insomnia, and interference with education and daily life. Meta knew these
risks to young users because it had all the user engagement data. It had all the research. However,
rather than disclose what it knew, Meta published misleading metrics through its Community
Standards Enforcement Reports that dramatically understate the actual rates of harm being suffered
by young Instagram users.
6. Not only did Meta knowingly publish inaccurate metrics, it actively concealed its
own internal research findings that repeatedly showed the actual harm experienced by users was
far higher than Meta’s published metrics. Meta further misled consumers and the public at large
by claiming (1) that its “Time Spent” tool was an effective way to curb use when it knew the tool
provided inaccurate information, (2) that it restricts young users from accessing harmful content,
(3) that it does not prioritize the maximization of time spent by users on its platform, (4) that it
does not place monetary values on teen users, (5) that it does not restrict access to its internal
research that would allow researchers, consumers, and the public in general, to fully understand
the risks posed by Meta’s platform, and (6) that its platforms, particularly Instagram, are not
addictive.
7. With these allegations in mind, the state of teen mental health in Oklahoma cannot
be ignored. In the decade ending in 2021, the percentage of teens who reported having felt so
consistently sad or hopeless that they discontinued their usual activities increased by more than
50%.¹ Rates of students who report having contemplated or even attempted suicide are similarly
alarming.
8. Taken individually and together, Meta’s actions and omissions constitute both
unfair and deceptive trade practices that are prohibited by the OCPA.
BACKGROUND
9. “We have a saying. ‘Move fast and break things.’ The idea is if you never break
anything, you’re probably not moving fast enough,” wrote then-Facebook CEO Mark Zuckerberg
(“Zuckerberg”) to potential investors just prior to Facebook’s 2012 initial public offering.² Meta
may have removed this saying from its public mission statement, but the motivation behind it is
1 Oklahoma Youth Risk Behavior Survey 10-Year Trend Monitoring Report (available at
https://oklahoma.gov/content/dam/ok/en/health/health2/aem-documents/family-health/maternal-and-child-
health/child-adolescent-health/yrbs/2021Oklahoma%20YRBS%2010-
Year%20Trend%20Monitoring%20Report%202011-2021FINAL.pdf).
2 Mark Zuckerberg, Founder’s Letter, 2012 (available at
https://m.facebook.com/nt/screen/?params=%7B%22note_id%22%3A261129471966151%7D&path=%2Fnotes%2F
note%2F&refsrc=deprecated&_rdr).
emblematic of its pursuit of innovation and profit maximization without regard to the collateral
damage it may cause.
10. The Attorney General’s investigation of Meta has revealed that it knowingly and
repeatedly engaged in unfair and deceptive conduct at the expense of Oklahoma consumers.
Meta’s conduct was, and continues to be, unlawful.
11. Meta deliberately designed its social media platforms, particularly Instagram, to be
an addiction machine targeted at consumers under eighteen years old (“Adolescents”).
12. This was not an accident. Meta marshalled vast resources to study and understand
Adolescents’ psychology and behavior so it could better exploit their developmental vulnerabilities
through irresistible design features.
13. Meta did this to capture an ever-increasing amount of Adolescents’ time and user
data—all to serve Meta’s advertising business.
14. Unlike other products that have appealed to Adolescents for generations—like
Hershey bars or cans of Coke—Instagram has no single unit of consumption. There is no natural
stopping point. Instead, Instagram serves up a bottomless pit of content where users can spend
their time limited only by the total hours in a day. And for every second a consumer spends on
Meta’s platforms, Meta profits.
15. Meta designed its social media products to exploit attention, embedding an array of
design features that maximize engagement of Adolescents on its platforms, and peppering them
with inducements to “log on” and making it psychologically difficult to “log off.” These features,
including push notifications, automatically playing short-form videos (i.e., videos shorter than
1 minute), infinite scrolling, and ephemeral content, are designed as obstacles to prevent Adolescents
from disengaging from the platform.
16. The U.S. Surgeon General recently issued an Advisory acknowledging as much:
“You have some of the best designers and product developers in the world who have designed
these products to make sure people are maximizing the amount of time they spend on these
platforms. And if we tell a child, use the force of your willpower to control how much time you’re
spending, you’re pitting a child against the world’s greatest product designers.”
17. Instagram’s design features have fueled a dramatic increase in the amount of time
Adolescents spend on the platform. Indeed, for many Adolescents, Instagram is viewed as an
indispensable part of their identity, a podium from which they can share a carefully cultivated
“highlight reel” of who they are and a place where they must constantly be “present.”
18. Adolescents feel addicted to Instagram. They report difficulty controlling their time
spent on the application. And they frequently express that they would prefer to spend less time on
Instagram but feel powerless to do so. And Meta’s internal studies have repeatedly confirmed these
feelings.
19. Researchers warn that compulsive use of social media platforms like Instagram
imposes a wide range of harms, including increased levels of depression, anxiety, and attention
deficit disorders; altered psychological and neurological development; and reduced sleep, to name
a few. Additionally, there is an immense opportunity cost when adolescent years are spent glued
to Instagram rather than engaged in the physical world and in-person experiences that are critical
to development of the adolescent brain.
20. Meta’s business strategy that consciously addicts Adolescents to its social media
platform has caused, and is continuing to cause, widespread and significant injury to Adolescents
in Oklahoma. This is an unfair practice that violates the OCPA.
3 Allison Gordon & Pamela Brown, Surgeon General says 13 is ‘too early’ to join social media, CNN (Jan. 29, 2023)
(available at https://www.cnn.com/2023/01/29/health/surgeon-general-social-media/index.html).
21. But Meta’s misconduct does not end there. Meta has also violated the OCPA by
deceiving Oklahoma’s Adolescents—and, critically, their parents. Meta misled Adolescents and
their parents both by concealing the significant risks Instagram presents to Adolescent users, and
by making deceptive claims that masked its knowledge of those risks. Meta did so in at least four
ways.
22. First, Meta has long known that Instagram was harmful for users, and especially
ruinous for teen girls. Meta did not share that information with consumers and, in fact, took
affirmative steps to bury it. Meta’s leadership—including Zuckerberg—repeatedly declined
employees’ requests to fund measures that would reduce known harms. And not only did Meta fail
to disclose its knowledge of these harmful effects, it limited internal access to incriminating
findings to minimize the risk that this information would become public.
23. For example, Zuckerberg personally intervened to lift an internal ban on “selfie”
filters that mimicked plastic surgery effects, even though Meta’s experts overwhelmingly found
the filters had devastating effects on teen girls. Zuckerberg did this in spite of the recommendation
of Meta’s own employees and Meta’s own internal research. Meta never disclosed this to the
public.
24. Second, Meta publicizes its “Community Standards Enforcement Reports” to
create the facade that Instagram is a safe platform. These reports touted the low “prevalence” of
Community Standards violations, a self-created metric that Meta uses as “evidence” that its
platform is safe. But the truth is Meta’s “prevalence” metric is simply a straw man created to
support the false narrative that Instagram is safe while concealing the true extent of the dangers on
the platform.
Bejar (“Bejar”) rang the alarm bell explicitly and directly to Zuckerberg, former Chief Operating
Officer Sheryl Sandberg (“Sandberg”), and Instagram Head Adam Mosseri (“Mosseri”). Bejar
wrote:
I saw the note you shared today after the testimony, and I wanted to bring to your
attention what I believe is a critical gap in how we as a company approach harm,
and how the people we serve experience it. I’ve raised this to Chris [Cox], Sheryl
[Sandberg], and Adam [Mosseri] in the last couple of weeks.⁴
28. But Meta’s leadership ignored Bejar’s email and continues to issue misleading
reports to this day.
29. Bejar provided sworn testimony as part of the investigation by the Attorney
General. When asked if he believed “that Mr. Zuckerberg and other company leaders focused on
the ‘prevalence’ metric because it created a distorted picture about the safety of Meta’s platforms,”
Bejar testified “I do.”⁵ When asked if he thought “Mr. Zuckerberg’s public statements about
prevalence created a misleading picture of the harmfulness of Meta’s platforms,” he testified “I
do.”⁶ And when asked if he was aware of any instances where Meta, in his view, downplayed the
harms users were experiencing on Meta’s platforms, Bejar testified:
Every time that a Company spokesperson in the context of harms quotes prevalence
statistics, I believe that is what they are doing, that they’re minimizing the harms
that people are experiencing in the product.’
30. Third, Meta misled the public through false statements about its commitment to
well-being related features. For instance, it touted its “Time Spent” tool as a way for Adolescent
users (and their parents) to manage engagement on Instagram and as a demonstration of Meta’s
commitment to well-being. But when Meta learned the tool delivered inaccurate data to consumers,
4 Ex. 1, Bejar Trans., 236:16-290:14.
5 Id., at 319:21-320:3.
6 Id., at 200:16-201:13.
7 Id., at 319:11-17.
Meta neither fixed the problem nor discontinued the tool. Meta prioritized misleading its
Adolescent users (and their parents) over suffering a public relations hit.
31. Fourth, Meta made material misrepresentations to develop trust among consumers
and parents that Instagram is a safe place for Adolescents. In various public channels, Meta
deceptively represented (1) that it does not prioritize increasing users’ time on Instagram; (2) that
it protects Adolescents from harmful or inappropriate content on Instagram; (3) that it does not
place a monetary value on Adolescents’ use of Meta platforms; (4) that it has not changed its
internal data and research access policies in response to The Wall Street Journal’s 2021 coverage
of its internal research findings; (5) that it uses internal research to improve product safety on a
regular basis, and (6) that its platforms are not addictive.
32. In sum, through its acts, omissions, and misrepresentations, Meta carefully created
the impression that its social media platforms, and specifically Instagram, are safe places for
Adolescents. That impression was false and misleading.
33. Based on this misconduct, the Attorney General brings this action pursuant to the
OCPA and seeks injunctive relief and civil penalties; recovery of attorney fees; and payment of
reasonable expenses, including expert and investigation costs.
THE PARTIES
34. Plaintiff is the State of Oklahoma. This enforcement action is brought by and
through Attorney General Gentner Drummond pursuant to the authority conferred by the OCPA.
35. Defendant Meta Platforms, Inc. f/k/a Facebook, Inc. is a Delaware corporation with
its principal place of business in Menlo Park, California.⁸
8 On October 28, 2021, Facebook, Inc. changed its name to Meta Platforms, Inc.
36. Defendant Instagram, LLC, is a Delaware limited liability company with its
principal place of business in Menlo Park, California. Instagram, LLC is a subsidiary of Meta
Platforms, Inc.
37. Defendants Meta Platforms, Inc. and Instagram, LLC acted in concert with one
another and as agents and/or principals of one another in relation to the conduct described in this
Petition.
38. All the allegations described in this Petition were part of, and in furtherance of, the
unlawful conduct alleged herein, and were authorized, ordered, and/or done by Defendants’
officers, agents, employees, or other representatives while actively engaged in the management of
Defendants’ affairs within the course and scope of their duties and employment, and/or with
Defendants’ actual, apparent, and/or ostensible authority.
JURISDICTION AND VENUE
39. By this Petition, Plaintiff is asserting causes of action, and seeking remedies, based
exclusively on Oklahoma statutory law.
40. The Petition does not confer diversity jurisdiction upon federal courts pursuant to
28 U.S.C. § 1332, as Oklahoma is not a citizen of any state and this action is not subject to the
jurisdictional provision of the Class Action Fairness Act of 2005, 28 U.S.C. § 1332(d). Federal
question subject matter jurisdiction under 28 U.S.C. § 1331 is not invoked by the Petition.
Nowhere does Plaintiff plead, expressly or implicitly, any cause of action or request any remedy
that necessarily arises under federal law.
41. As a court of general jurisdiction, the District Court is authorized to hear this matter.
42. Although previously conducting business in Oklahoma for years without
registering, on January 17, 2019, Meta filed its Certificate of Qualification to register as a foreign
corporation doing business in Oklahoma. In its filing, Meta stated that it intended to conduct
business as “Social media” and that it had assets of $80,077,747,700. That same day, the Oklahoma
Secretary of State granted Meta a Certificate of Authority to transact business in the State of
Oklahoma.9 On May 15, 2023, in order to continue doing business in Oklahoma, Meta filed an
Annual Certificate including “a report of the amount of capital invested in Oklahoma by a foreign
corporation.”10 In that filed certificate, Meta stated that the amount of “funds, credits, securities
and property” that it “used or employed in the business carried on in the State of Oklahoma” was
$610,956.
43. Meta has entered into millions of individual contracts with individual Oklahomans
and Oklahoma businesses to use its platforms.11
44. By 2020, Meta estimated that Instagram, only one of its “Family of Apps” or
“FoA,” had reached a staggering saturation rate of 80% of Oklahomans under 35 years old. Meta’s
own documents also show that total daily active users (“DAU”) in the State of Oklahoma for
Instagram were: (1) more than 800,000 Oklahomans in 2018; (2) more than 900,000 in 2019; (3)
more than 1,000,000 in 2020; and (4) more than 1,200,000 in 2021. Instagram is massively popular
among young Oklahomans. According to Meta’s internal metrics, from July 2020 to June 2021
over 300,000 Oklahoma teens used Instagram monthly. During that time, over 219,000 Oklahoma
teens used Instagram daily. Between October 2022 and April 2023, over 296,000 “young adults”
(according to Meta’s internal definition) in Oklahoma used Instagram daily.
45. Meta closely monitored its performance, both in terms of numbers of users and time
spent on its platforms, in Oklahoma. Meta monitored the following metrics for Instagram usage in
9 Ex. 2, Facebook, Inc.’s Certificate of Qualification (Foreign Corporation) and Facebook, Inc.’s Certificate of
Authority to Transact Business in Oklahoma.
10 Ex. 3, Facebook, Inc.’s May 15, 2023, Annual Certificate.
11 Ex. 4, Instagram Terms of Use.
Oklahoma: (1) the amount of time daily active teens spent on Instagram per day; (2) teen
“penetration” in the state; (3) the ratio of teen daily active users versus monthly active users; (4)
teen monthly active user “story participation” rates; (5) the amount of “feed media” daily active
teens consumed per day on Instagram; (6) the amount of “stories” that daily active teens consumed
per day on Instagram; (7) Instagram market saturation with respect to users under 35; (8) the
percentage of Facebook Android monthly active users on Instagram; and (9) the reduction in
monthly active users over a two month time period.
46. Perhaps most strikingly, Meta internally estimated that about 80% of Oklahoma
teens were monthly active users of Instagram.
47. And of course, Meta enriched itself by selling advertisements targeted at users in
Oklahoma. According to Meta’s public advertising library, Meta regularly targets advertisements
to Oklahoma. All manner of Oklahoma entities—from the OKC Thunder, the Tulsa World,
QuikTrip, and Hobby Lobby, to smaller entities within Osage County like the Osage County
Tourism Department and the Fairfax Community Foundation—advertise on Meta’s platforms. To
be sure, countless others also advertise on Instagram to reach Oklahoma audiences and expand
their businesses in Oklahoma.
48. In sum, Meta aims its platforms at Oklahomans, collects massive amounts of
information on Oklahoma users, studies the impact its platform has on Oklahomans, sells
advertising to Oklahomans based on that information, and tracks its performance in Oklahoma—
all to further its goal of generating profits for its shareholders.
49. This Court has personal jurisdiction over each Defendant pursuant to 12 O.S. §
2004 because of their contacts in Oklahoma. As is described more fully below, Defendants (1)
entered contracts with millions of Oklahomans and intentionally availed themselves of the
Oklahoma market by directing marketing efforts towards Oklahomans; (2) sold the opportunity to
advertise to Oklahomans; and (3) monitored their substantial contacts in Oklahoma, so as to render
personal jurisdiction over Meta consistent with traditional notions of fair play and substantial
justice. The allegations in this Petition establish that Defendants had minimum contacts with
Oklahoma and are incorporated by reference herein.
50. Pursuant to 12 O.S. § 137, venue is proper in Osage County because Osage County
is a county where the alleged misconduct occurred and where Defendants have conducted or
transacted business.
FACTUAL ALLEGATIONS
A. Defendants Engage in Consumer Transactions with Oklahoma Consumers
51. The OCPA broadly defines a “consumer transaction” as “the advertising, offering
for sale or purchase, sale, purchase, or distribution of any services or any property, tangible or
intangible, real, personal, or mixed, or any other article, commodity, or thing of value wherever
located, for purposes that are personal, household, or business oriented.” 15 O.S. § 752(2). The
definition includes “anything that could be sold or marketed to a consumer.” Horton v. Bank of
America, N.A., 189 F. Supp. 3d 1286, 1293 (N.D. Okla. 2016). As described herein, Defendants have
engaged, and continue to engage, in conduct that constitutes a “consumer transaction.”
1. Meta Offers Instagram in Exchange for Consumers’ Valuable Consideration That
Enables Meta to Sell Advertising
52. Through its mobile application and website, Instagram offers Oklahoma consumers
the opportunity to connect with friends, follow accounts, and explore various interests.
53. On Instagram, consumers interact with different “surfaces.” Those include: (1) the
main “Feed” and “Stories” surfaces, which display content posted by accounts the consumer
follows; (2) the “Explore” surface that suggests new content to consumers; (3) the “Reels” surface,
focused on short-form videos; and (4) the “Direct Messaging” surface, which allows consumers to
send messages to one another.
54. No two consumers’ experiences on Instagram are the same. Rather, Instagram
presents a customized display to each consumer based on the interests and preferences they express
on Instagram, along with other user data in Meta’s possession.
55. To fully access Instagram, consumers must create an account.
56. To create an account, consumers enter into a contract with Meta.12
57. By entering that contract, users agree to comply with Instagram’s Terms of Use
(the “Instagram Terms”).13
58. The Instagram Terms state that “The Instagram Platform is one of the Meta
Products, provided to you by Meta Platforms, Inc. The Instagram Terms therefore constitute an
agreement between you and Meta Platforms, Inc.”14
59. Under the Instagram Terms, users do not pay to use Instagram. Rather, in exchange
for the right to use Instagram, consumers agree to a host of terms that power Meta’s advertising
business.
60. For example, in a section titled “How Our Service Is Funded,” the Instagram Terms
explain that “[i]nstead of paying to use Instagram, by using the Service covered by these Terms
[i.e. Instagram], you acknowledge that we can show you ads that businesses and organizations pay
us to promote on and off the Meta Company Products. We use your personal data, such as
information about your activity and interests, to show you ads that are more relevant to you.”15
12 See Ex. 4, Instagram Terms of Use.
13 Id.
14 Id.
15 Id.
61. The Instagram Terms also state that Meta “allow[s] advertisers to tell us things like
their business goal and the kind of audience they want to see their ads. We then show their ad to
people who might be interested. We also provide advertisers with reports about the performance
of their ads to help them understand how people are interacting with their content on and off
Instagram. For example, we provide general demographic and interest information to advertisers
to help them better understand their audience.”16
62. Under the Instagram Terms, Oklahoma consumers “pay” for Instagram by allowing
Meta to build its advertising business using consumers’ time and attention.
63. Under the Instagram Terms, consumers “must agree to [Meta’s] Privacy Policy to
use Instagram.” There is no other way for a consumer to use Instagram.
64. Every consumer must agree that Meta may collect a host of data including, among
other things: (1) information about the consumer’s activity on Instagram; (2) the messages the
consumer sends and receives; (3) the content the consumer posts through Instagram’s camera
feature and the consumer’s camera roll; (4) the consumer’s responses to various types of
advertisements; and even (5) the hardware and software the consumer is using, GPS data,
Bluetooth signals, nearby Wi-Fi access points, and beacons and cell towers.
65. Meta uses this data to further its advertising business. In particular, it allows
advertisers to target consumers that reside in specific locations in Oklahoma. For example, Meta
allows advertisers to target Oklahoma consumers located in Oklahoma City and Tulsa simply by
choosing from a Meta-created list of “Designated Market Areas,” or “DMA.” Meta assigns DMA
Code 650 to Oklahoma City and DMA Code 671 to Tulsa.17
16 Id.
17 See Designated market areas for ad marketing, Meta Business Help Center (available at
https://www.facebook.com/business/help/1501907550136620).
66. Meta’s connections to Oklahoma are not limited to Oklahoma City and Tulsa. Meta
enables advertisers to focus on any part of Oklahoma, including Osage County, either by targeting
specific areas by zip code or by creating custom DMAs simply by dropping a pin on a map and
choosing a radius around the pin.18
2. Advertising is the Core of Meta’s Business
67. Meta has become one of the largest and most profitable companies in the world by
offering highly targeted, data-driven advertising using massive databases of information collected
from the consumers who use its platforms.
68. As Zuckerberg explained, “based on what pages people like, what they click on,
and other signals, we create categories . . . and then charge advertisers to show ads to that category.
Although advertising to specific groups existed well before the internet, online advertising allows
much more precise targeting and therefore more-relevant ads.”19
69. On information and belief, Meta charges a premium to businesses for access to the
“much more precise” advertising opportunities on its social media platforms.
70. Consumers are served targeted advertisements during all, or nearly all, sessions on
Instagram. And consumers see advertisements almost constantly on Instagram, often several times
per minute. The advertisements Meta displays on Instagram are interwoven into most if not all of
Instagram’s “surfaces.” The same is true of Facebook.
71. Given this business model, Meta is motivated to maximize the time users spend on
Instagram and Facebook.
18 Meta’s services include a function it calls “Drop a Pin” that allows targeted advertising by identifying a location on
a map using a pin and adjusting a slider scale to establish a particular radius around the pinned location to represent
the designated market area. See Use location targeting, Meta Business Help Center (available at
https://www.facebook.com/business/help/365561350785642?id=176276233019487).
19 See Understanding Facebook’s Business Model, Mark Zuckerberg, January 24, 2019 (available at
https://about.fb.com/news/2019/01/understanding-facebooks-business-model/).
72. One incentive is that the more time users spend on Meta’s platforms, the more
“inventory” Meta can sell. For instance, if a user increases her time spent viewing her Instagram
“feed” from one to five hours per day, Meta can deliver roughly five times the number of
advertisements to that user. The increase in time spent therefore significantly increases the profits
Meta can make off this user.
73. Second, the more time that same user is engaged on Instagram, the more Meta
learns about her. This data is gathered and refined so that she can be more accurately dropped into
a particular category and ads shown to her can be more precisely targeted. On information and
belief, advertisers will pay more for these ads.
74. As described more fully below, Meta has succeeded in capturing a breathtaking
amount of consumer time, attention, and data—especially on Instagram, and especially from
Adolescents.
3. Meta Prioritizes Acquiring Adolescents and Maximizing Their Time Spent on
Instagram
75. In Meta’s business model, not all consumers are created equal. Adolescents are
Meta’s prized demographic.
76. Meta has pursued increasing Adolescents’ time spent on its platforms as one of the
company’s most important goals. As one Meta analyst wrote in 2016:
Lifetime Value (LTV)
This number is core to making decisions about your business. Lifetime value is the
cumulative total “value” (usually expressed as “profit”) you expect from a
customer/ user. With this number, we can make better decisions regarding how
much to spend on each user. Generally, you do not want to spend more than the
LTV of the user.
Short summary is the “young ones are the good ones.” You want to bring people to
your service young and early.
77. For example, as of November 2016, Meta’s “overall goal remain[ed] total teen time
spent . . . with some specific efforts (Instagram) taking on tighter focused goals like U.S. teen total
time spent.”
78. This strategy was directed by Zuckerberg, who “decided that the top priority for the
company in 2017 is teens.”
79. On information and belief, Meta has worked to maximize Adolescents’ “time
spent” throughout its corporate history. To that end, a product manager within Instagram wrote
that he wanted to establish a small team “focused on getting a very clear understanding of our
current US DAP and MAP growth situation, opportunities, and challenges because 1) US Teens
are our #1 cohort for both long-term growth of [Instagram] and [Facebook].”
80. This is especially true of Instagram, which is central to Meta’s strategy to grow its
number of Adolescent users.
81. As Meta knows, Instagram is especially appealing to Adolescents and is Meta’s
most popular application with that demographic. Meta therefore devotes vast resources to
increasing Adolescents’ engagement on Instagram.
82. Meta’s internal studies show that Adolescents have an outsized influence on their
entire households’ attitudes towards Instagram. As Meta’s internal research shows, “[t]eens are
household influencers bringing [family] members (parents and younger siblings) to IG
[Instagram], as well as shaping what is ‘normal’ behavior on IG [Instagram].”
83. Even more fundamentally, Meta pursues Adolescents because Meta’s advertising
customers value that audience.
84. Among other reasons, Meta’s advertising partners want to reach Adolescents
because they (1) are more likely to be influenced by advertisements; (2) may become lifelong
customers; and (3) set trends that the rest of society emulates.
85. Notably, Meta allows advertisers to target Adolescents on Instagram based on their
age and location.
86. On information and belief, many advertisers pay Meta a premium to serve
advertisements to Adolescents, including advertisements to Adolescents in specific geographic
markets, such as those in Oklahoma and in Osage County.
87. Meta is motivated to increase Adolescents’ time spent on Instagram not only
because it is a meaningful stream of advertising business, but also, because the data that Meta
collects from that use is itself highly valuable.
88. Meta has profited immensely from its business model. Meta reported earning
$116.6 billion in revenue in 2022, with $23.2 billion in net income, and Zuckerberg, its CEO, has
become one of the wealthiest people in the world.
89. In addition to financial success, Zuckerberg’s role as Meta’s CEO and Founder has
made him a public figure able to exert significant influence not only over the company, but also
over society writ large. In a private email exchange with at least four other billionaires, one of
Meta’s major investors told Zuckerberg that he believed “Mark Zuckerberg has been cast as *the
spokesman* for the Millennial Generation — as the single person who gives voice to the hopes and
fears and the unique experiences of this generation, at least in the USA.” In a response, Mr.
Zuckerberg agreed with that sentiment, stating that he is “the most well-known person of [his]
generation.”
B. Meta Operates Instagram in a Manner That is Unfair to Adolescents
90. Meta has engaged in unfair practices by designing and operating its platforms in a
manner that addicts Adolescents on a massive scale.
1. By Meta’s Design, Instagram Induces Compulsive Use Among Adolescents
91. For generations, companies have marketed products to Adolescents—from bikes to
Barbies to baseball cards. Unquestionably, products like those appealed to a young audience, and
their creators marketed them accordingly.
92. Meta could have followed a similar course. It might have offered a version of
Instagram that was simply appealing, but not addictive.
93. Instead, Meta designed Instagram to exploit known vulnerabilities in Adolescents’
neurological development, making Instagram biologically difficult—and in some cases nearly
impossible—for teens to resist.
94. As Meta’s founding president, Sean Parker, explained in 2018:
The thought process that went into building these applications, Facebook being the
first of them ... was all about: ‘How do we consume as much of your time and
conscious attention as possible?’ That means that we need to sort of give you a
little dopamine hit every once in a while, because someone liked or commented
on a photo or a post or whatever. And that’s going to get you to contribute more
content and that’s going to get you ... more likes and comments. It’s a social-
validation feedback loop ... exactly the kind of thing that a hacker like myself
would come up with, because you’re exploiting a vulnerability in human
psychology. The inventors, creators—me, [Meta founder] Mark [Zuckerberg],
[Instagram founder] Kevin Systrom on Instagram, all of these people—
understood this consciously. And we did it anyway.20
95. On an ongoing basis, Meta pours massive resources into understanding
Adolescents’ cognitive development and vulnerabilities.
20 See Alex Hern, ‘Never get high on your own supply’ — why social media bosses don’t use social media, The
Guardian (Jan. 23, 2018) (available at https://www.theguardian.com/media/2018/jan/23/never-get-high-on-your-
own-supply-why-social-media-bosses-dont-use-social-media) (emphasis added).
96. For example, in the late 2010s, Meta’s Consumer Market Research team created a
“very deep body of work over the course of years/months” studying teens. That group did
“enormous work and investment” in “teen foundational research.”
97. But that “very deep body of work” was not enough. In 2020 Meta started the “Teen
Ecosystem Understand” project, which was an ongoing effort to study Adolescent users. Led by
Instagram’s “growth” team, this project sought to deliver insights that would allow Meta to make
Instagram increasingly irresistible to Adolescent users.
98. On information and belief, Meta’s Teen Ecosystem Understand and Consumer
Market Research projects were two of many initiatives Meta directed to study Adolescents so that
it could capture more of their time and attention.
2. Instagram’s “Teen Fundamentals” Study Shows Instagram’s Power to Induce
Compulsive Use Among Adolescents
99. A May 2020 report by the Teen Ecosystem Understand project illustrates the
lengths to which Meta studied, understood, and sought to exploit teens’ neurological
vulnerabilities.
100. Titled “Teen Fundamentals,” the 97-page internal presentation21 purports to be a
“synthesis of adolescent development concepts, neuroscience as well as nearly 80 studies of our
own product research.” One of the presentation’s stated goals was to “look . . . to biological factors
that are relatively consistent across adolescent development and gain valuable unchanging insights
to inform product strategy today.”
101. The first section of the internal presentation, titled “Biology,” contains several
images of brains in various stages of development.
21 Meta employees regularly convey information to one another through slideshows using Microsoft PowerPoint.
102. As part of the “Biology” section, the internal presentation explained that “Unlike
the body which functions wholly from day one, the brain essential [sic] spot trains certain areas
and functions at a partial capacity before it is wholly developed . . . The teenage brain is about
80% mature. The remaining 20% rests in the frontal cortex . . . at this time teens are highly dependent
on their temporal lobe where emotions, memory and learning, and the reward system reign
supreme.”
103. According to the report, “teens’ decisions are mainly driven by emotion, the
intrigue of novelty and reward . . . [making] teens very vulnerable at the elevated levels they
operate on. Especially in the absence of a mature frontal cortex to help impose limits on the
indulgence in these.”
104. The next section of the Teen Fundamentals slide presentation, titled “Behavior,”
acknowledged what Meta knew well: “the teenage brain happens to be pretty easy to stimulate.”
105. By way of an example, the presentation noted that “everytime [sic] one of our teen
users finds something unexpected their brains deliver them a dopamine hit.”
106. The next slide explained that “teens are insatiable when it comes to ‘feel good’
dopamine effects.”
107. And the following slide highlighted that “teens brains’ [sic] are especially ‘plastic’
or keen to learn presenting a unique opportunity that coupled with curiosity can send teens down
some interesting rabbit holes . . . .”
108. Suggesting another way that teen brains are “easy to stimulate,” the internal
presentation notes that “a huge driver for teen behavior is the prospect of reward. This is what
makes them predisposed to impulse, peer pressure, and potentially harmful risky behavior like
drugs, stunts, and pranks...”
109. Building on that theme, the presentation also observed that “approval and
acceptance are huge rewards for teens and interactions are the currency on IG [Instagram]. DMs
[direct messages], notifications, comments, follows, likes, etc. encourage teens to continue
engaging and keep coming back to the app.”
110. The presentation confirmed that Instagram was successfully exploiting these
vulnerabilities, even when its young consumers voiced their concerns directly to Meta.
111. For example, the internal presentation conceded that:
teen brains are much more sensitive to dopamine, one of the reasons that drug
addiction is higher for adolescents and keeps them scrolling and scrolling. And due
to the immature brain they have a much harder time stopping even though they want
to — our own product foundation research has shown teens are unhappy with the
amount of time they spend on our app.
112. But that was not enough for the members of the Instagram “growth” team that
authored the presentation. Instead, the presentation repeatedly asked how Instagram could become
even more irresistible to teens:
• “So, now that we know this — what is the effect of teen’s biology on their
behavior? And how does this manifest itself in product usage?”
• “How well does IG [Instagram] cater to [teens’ desired] activity? How does it
stack up against [its competitors]?”
• “Teen’s [sic] insatiable appetite for novelty puts them on a persistent quest to
discover new means of stimulation . . . how can your team give teens
somewhere new to go or something new to find from the product you work on?”
113. In the end, the internal presentation succinctly described Meta’s future direction: “I
want to remind you all once more of the core things that make teens tick. New things, feeling good
and reward. We are not quite checking all of these boxes . . . some teens are turning to competitors
to supplement for [sic] those needs.” It concluded: “we [would] do well to think hard about how
we can make IG [Instagram] an app tailored to the teenage mindset.”
114. The Teen Fundamentals report was shared with various teams inside Meta,
culminating in its presentation to Instagram’s leadership team (including Adam Mosseri) in June
2020.
115. In response to the presentation, Instagram’s leadership requested additional
research, which led to a subsequent report titled “Deepening Rewards to Drive More Meaningful
Daily Usage” designed to “unpack” the concept of “rewards.” As part of that report, Instagram
employees conducted user interviews and “synthesized this data with academic literature to
understand how it applies at a psychological level.” Through other related projects, Instagram
continued to use its research and understanding of Adolescent users’ brains to gain a competitive
advantage.
3. Instagram Features Induce Compulsive Use
116. Leveraging its understanding of “the things that make teens tick,” Meta exploited
Adolescents’ limited capacity for self-control through an array of features, such as push
notifications, ephemerality, auto-play, and infinite scroll.
117. Collectively, these and other Instagram features created and exploited obstacles to
Adolescents’ decision-making, causing them to spend more time on Instagram than they otherwise
would.
i. Incessant Push Notifications.
118. Meta causes Adolescents to increase their time spent on Instagram by inundating
them with notifications. The Instagram mobile application, by default, peppers users (including
Adolescents) with frequent alerts or notifications intended to cause users to open the application.
119. Echoing Meta’s “Teen Fundamentals” research, academics have observed that
these notifications impact the brain in similar ways as narcotic stimulants:
Although not as intense as [sic] hit of cocaine, positive social stimuli will similarly
result in a release of dopamine, reinforcing whatever behavior preceded it... Every
notification, whether it’s a text message, a “like” on Instagram, or a Facebook
notification, has the potential to be a positive social stimulus and dopamine influx.22
120. On information and belief, by default Meta notifies Adolescents when another user
follows them, likes their content, comments on their content, “tags” them, mentions them, sends
them a message, or “goes live” (if the young person follows the user).
121. As Meta’s own research shows, Adolescents have a difficult time resisting these
notifications.
122. In an internal analysis of “Levers for Teen Growth,” a member of the “Instagram
Growth Data Science Team” noted that Defendants could “[l]everage teens’ higher tolerance for
notifications to push retention and engagement.”
123. In a November 2019 internal presentation entitled “IG Notification Systems
Roadshow,” Meta’s employees acknowledged that some of its users are “overloaded because they
are inherently more susceptible to notification dependency.”
124. Similarly, in an internal presentation titled “State of US Teens 2020”—authored by
the “IG [Instagram] Growth Analytics” team—Meta observed that teens “have longer time spent
than adults because they tend to have more sessions per day than adults. This may be because US
teens are more sensitive to notifications and have more notification-driven sessions than adults.”
125. By dispensing notifications and other “rewards” on a variable or intermittent
schedule, Meta increases the addictive nature of Instagram.
ii. Ephemeral Nature of Instagram Content
126. Meta’s internal research also showed that Adolescents are developmentally wired
22 See Trevor Haynes, Dopamine, Smartphones & You: A battle for your time, Blog – Science In The News (Harvard
University, May 1, 2018) (available at https://sitn.hms.harvard.edu/flash/2018/dopamine-smartphones-battle-time/).
to fear “missing out.” Meta induces constant engagement by making certain Instagram experiences
ephemeral.
127. Unlike content delivery systems that permit a user to view existing posts on a
schedule convenient for the user, ephemeral content is available only on a temporary basis,
incentivizing users to engage with it immediately.
128. For example, Instagram’s popular “Stories” surface displays user-created images,
videos, and narratives for twenty-four hours, at most, before the content disappears.23 24
129. Similarly, Instagram’s “Live” feature gives users the ability to livestream videos to
followers or the public during a specific session, after which the streamed video is typically no longer
available.25
130. In the case of “Live,” for instance, an Adolescent’s failure to quickly join the live
stream when it begins means that the user will miss out on the chance to view the content entirely.
Often, Instagram sends users notifications that an account they follow is going live so that users
do not “miss out.”
131. Likewise, because “Stories” delete within 24 hours, Adolescents must constantly
monitor that surface if they desire to keep up with the accounts they follow.
132. Meta deploys ephemeral content features because it knows Adolescents’ fear of
missing out on content will keep them glued to Instagram.
133. Meta’s internal documents acknowledge that Instagram’s ephemeral features drive
compulsive Instagram use.
23 See Introducing Instagram Stories (Aug. 2, 2016) (available at
https://about.instagram.com/blog/announcements/introducing-instagram-stories).
24 See Josh Constine, Instagram Launches “Stories,” A Snapchatty Feature for Imperfect Sharing, TechCrunch (Aug.
2, 2016) (available at https://techcrunch.com/2016/08/02/instagram-stories/).
25 See Live, Instagram Help Center (available at https://help.instagram.com/272122157758915/?helpref=hc_fnav).
134. For instance, Meta’s internal documents acknowledge that Instagram’s ephemeral
features drive compulsive Instagram use on at least a monthly basis.
135. Meta’s documents noted that “[y]oung people are acutely aware that Instagram can
be bad for their mental health, yet are compelled to spend time on the app for fear of missing out
on cultural and social trends.”
136. Even though it knew ephemerality was fueling out-of-control Instagram usage
among Adolescents, Meta pressed forward. Illustrating the mindset within Meta, in 2021 a user
experience researcher observed that direct messages on Instagram “were not urgent (especially
compared to other apps like Snapchat)” and “consisted mainly of videos and memes from friends
which could be watched at [a user’s] leisure.” The researcher noted that “we need to develop new
products that increase the possibilities for time-sensitive interactions on [Instagram] . . . .”
iii. Infinite Scroll, Autoplay, and Reels Induce Perpetual Instagram Use
137. Meta has also implemented tools that induce perpetual, passive Instagram use.
138. For example, Instagram presents an infinite scroll on several key surfaces. In other
words, Instagram partially displays additional content at the bottom of the user’s screen, such that
the user is typically unable to look at a single post in isolation (without seeing the top portion of
the next post in their feed).
139. Instagram teases yet-to-be-fully-viewed content indefinitely—as the user scrolls
down the feed, new content is automatically loaded and previewed. This design choice makes it
difficult for Adolescents to disengage because there is no natural end point to the display of new
information.
140. Meta also deploys the “auto-play” feature to keep Adolescents on Instagram.
141. Much like infinite scroll, Instagram’s “Stories” surface automatically and
continuously plays content, encouraging Adolescents to remain on the platform.
142. Meta understands that these are powerful tools. Tellingly, when news broke that a
Meta competitor was turning off auto-play for users under 18, Meta’s internal research team
registered surprise. One observed that “[t]urning off autoplay for teens seems like a huge move!
Imagine if we turned off infinite scroll for teens.” The second responded “Yeah, I was thinking the
same thing. Autoplay is HUGE.”
143. Meta’s popular “Reels” surface has these same characteristics. An internal strategy
presentation shows that Reels is “a TikTok competitor for short and entertaining videos” and one
of “three big bets” that “Instagram focused on . . . to bring value to teens” in 2020.
144. Videos on Reels automatically and perpetually play as the user swipes the screen
up to the next video. The short-form nature of Reels (between 15 and 90 seconds, as of April 2023)
makes it difficult for Adolescents to close the app. Other aspects of Reels—including the
placement of the like, comment, save, and share buttons on top of the video—reduce or prevent
interruption and keep the user constantly viewing the video.
145. Internally, Meta employees recognized that the design of Reels was harmful to
Adolescents. As one employee observed in September 2020, “Reels seems to be everything they
denounce in the stupid documentary [i.e. Netflix’s The Social Dilemma], and everything we know
from our research: passive consumption of an endless feed, without any connection to the content
creator. Yay.” A Meta mental health researcher responded, “Exactly. Ugh.”
146. On information and belief, the above-described Instagram features are but a small
sample of the tools Meta has deployed to induce Adolescents to spend more time on Instagram
than they otherwise would.
4. Instagram Induces Widespread Compulsive Use Among Adolescents
147. Because of Meta’s design choices, Instagram has already hooked a generation of
Adolescents.
148. Meta knew. Meta’s studies repeatedly confirmed that Adolescents used Instagram
at alarming rates. They also showed that Adolescents wanted to reduce their time on Instagram
and that Instagram’s engagement-inducing features simply overpowered them. Meta’s studies also
showed that compulsive Instagram use had detrimental effects on Adolescents’ mental health,
sleep, and relationships. But because the compulsive use of Instagram by Adolescents benefitted
Meta’s bottom line, Meta chose to ignore its own internal recommendations.
149. For example, in a February 2019 internal presentation titled “Instagram Teen Well-
Being Study: US Topline Findings,” Meta observed that “App Addiction is Common on IG
[Instagram].” The presentation noted that 23% of teenage monthly active users find that they often
feel like they “waste too much time on” Instagram.
150. In September 2019, Meta commissioned a third-party study on Teen Mental Health.
That study’s first “Topline Headline” was that “Instagram is an inevitable and unavoidable
component of teens’ lives. Teens can’t switch off from Instagram even if they want to.”
151. Another “Topline Headline” was that “Teens talk of Instagram in terms of an
‘addicts’ narrative’ spending too much time indulging in a compulsive behavior that they know is
negative but feel powerless to resist.”
152. A later slide observed that “Teens are hooked despite how it makes them feel . . .
Instagram is addictive, and time-spend on platform is having a negative impact on mental health.”
153. The Teen Mental Health report also found that teens “know they stay up later than
they should and miss out on sleep to stay plugged in” to Instagram.
154. Elsewhere, the report noted that “Adolescents are acutely aware that Instagram is
bad for their mental health yet are compelled to spend time on the app for fear of missing out on
cultural and social trends.”
155. Relatedly, in an October 2019 discussion regarding mental health research, an
employee observed:
teens told us that they don’t like the amount of time they spend on the app...they
often feel ‘addicted’ and know that what they’re seeing is bad for their mental
health but feel unable to stop themselves. This makes them not feel like they get a
break or can’t switch off social media. In the survey, about 30% (and even larger
proportions of those who are unsatisfied with their lives) said that the amount of
time they spend on social media makes them feel worse.
156. In March 2020, one Instagram employee asked if there were “any recent studies
where we explicitly talk about time spent tools and why teens want them.” In response, a different
employee confirmed that “[t]he feedback, essentially, is that (1) teens feel addicted to IG
[Instagram] and feel a pressure to be present, (2) like addicts, they feel that they are unable to stop
themselves from being on IG, and (3) the tools we currently have aren't effective at limiting their
time on the ap [sic].”
157. But despite that survey feedback, Meta made sure not to speak about the concept
of “addiction” publicly. In that same March 2020 exchange, the two employees discussed a draft
public statement regarding “efforts to combat social media addiction.”
158. The first asked: “Do we want to call it addiction? Maybe not.” The second clarified:
“(this is internal only).” The first employee responded: “Internal only makes it better. I’m just a
little cautious about calling it addiction.” The second responded: “Totally agree, we would never
want to say that!”
159. Employees continued to grapple with this issue in September 2020, when Netflix
released The Social Dilemma, which accused Meta of addicting Adolescents to Instagram.
160. The program hit home with some Meta employees. In one exchange among several
Instagram employees, Instagram’s Director of Data Science stated “[by the way] there is a new
Netflix [documentary] basically saying we’re creating a world of addicts...” In response, a second
employee stated that the documentary “makes me feel like tech plays to humans’ inability to have
self-control lol.”
161. In response, Instagram’s Director of Data Science stated, “Yeh that’s exactly what
the [documentary] says. I think its true [to be honest] . . . I do worry what it does to Adolescents
who are still developing their brains and social skills, as well as being more susceptible to mean
comments or lack of friends/feedback.”
162. A third employee asked if Meta was “creating addicts or facilitating them.... giving
existing addicts a really accessible outlet?” The second employee clarified, “a really accessible
outlet that optimizes for time spent...[and] keeps people coming back even when it stops being
good for them.”
163. Instagram’s Director of Data Science responded, “without the right stimulus,
someone might never become an addict. So it’s a tricky one. It’s like, you’ll never become a
gambling addict if you don’t visit vegas : P”
164. That same day in September 2020, Instagram’s Director of Data Science analyzed
the scope of the problem, creating charts titled “Number of US Humans who spend a lot of time
on IG in a day,” and “US Humans that spend a ton of time on IG in a Week.”
165. The daily chart showed that in the United States more than 475,000 teens spend 3-
4 hours per day on Instagram; more than 235,000 spend 4-5 hours; and more than 300,000 spend
5 or more hours. The weekly chart showed that in the United States more than 1 million teens
spend 14-21 hours; more than 420,000 spend 21-28 hours; and 400,000 spend 28 or more hours
per week on Instagram.
166. Meta knew that this level of usage was caused by its design. As a Meta Vice
President of Product told Instagram’s leadership in February 2021, “problematic use . . . [will]
require more fundamental changes to our goals, what type of work they incentive [sic], and
therefore how core mechanics work (feed design, ranking, sharing, notif[ications]).” That message
did not receive a response.
167. And Meta knew that Instagram’s core mechanics were interfering with a critical
part of Adolescents’ development: Sleep.
168. For example, in an April 2021 analysis, Meta observed that “peak” hours for
messaging were “in the late evenings,” with the highest rate of “sessions with message sends”
occurring between 9-11 pm.
169. That same analysis showed that on weekdays, US teens spent the most time on
Instagram between 9-11 pm.
170. After reviewing that information, a Meta data scientist commented, “Honestly the
only insight I see in these charts is that teens are really into using IG [Instagram] at 11 pm when
they should probably be sleeping : ( ”
171. Internally, Meta understood the specific ways that compulsive use manifested on
Instagram.
172. For example, a November 2021 analysis titled “Well-being: Problematic Use”
showed that “more reliable proxies for identifying problematic use” included: “‘passive’
consumption, frequent low-engagement sessions, disproportionate night-time usage, repetitive app
checking, and receiving and responding to more push notifications.”
173. That same analysis also acknowledged that “problematic use” was “more common
among teens and people in their 20s.” It explained: “this is consistent with Adolescents having
problems with self-regulation.”
5. Instagram Harms Adolescents by Inducing Compulsive Use
174. Defendants have substantially injured Adolescents by designing Instagram to
induce compulsive and excessive use, which interferes with important developmental processes
and behaviors.
175. These injuries include Adolescents’ lack of sleep and related health outcomes,
diminished in-person socialization skills, reduced attention, increased hyperactivity, self-control
challenges and interruption of various brain development processes.
176. Defendants have also caused Adolescents to experience mental health harms, such
as increased levels of depression and anxiety.
177. In addition, Defendants have caused Adolescents to have diminished social
capacity and other developmental skills by virtue of the “opportunity cost” associated with
devoting significant time to social media, rather than participating in other developmentally
important, in-person, life experiences.
178. The United States Surgeon General’s May 2023 Advisory, titled “Social Media and
Youth Mental Health” (the “Advisory”), describes some of the harms caused by Defendants.26 As
the Advisory explains, “[a] Surgeon General’s advisory is a public statement that calls the
American people’s attention to an urgent public health issue . . . Advisories are reserved for
significant public health challenges that require the nation’s immediate awareness and action.”
26 See U.S. Dep’t of Health & Hum. Servs., Social Media and Youth Mental Health: The U.S. Surgeon General’s
Advisory 4 (2023) (available at https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-
advisory.pdf).
According to the Surgeon General, Adolescents’ social media use is one such significant public
health challenge.
179. As the Advisory explains, “[e]xcessive and problematic social media use, such as
compulsive or uncontrollable use, has been linked to sleep problems, attention problems, and
feelings of exclusion among adolescents.”
180. The Advisory also identifies “changes in brain structure,” “altered neurological
development,” “depressive symptoms, suicidal thoughts and behaviors,” “attention
deficit/hyperactivity disorder (ADHD)” and “depression, anxiety and neuroticism,” as additional
harms to Adolescents associated with compulsive social media use.27
C. Meta Engages in Deceptive Conduct by Omitting and Misrepresenting Material
Facts About Instagram
181. Under the OCPA, a business engages in deceptive conduct when its acts,
representations, or omissions deceive or could reasonably be expected to deceive or mislead a
person to their detriment.
182. As an initial matter, Meta failed to disclose Instagram’s addictive nature. For years,
Meta has led young consumers (and their parents) to believe that Instagram is a safer and less
harmful platform than it is.
183. Meta deceived young consumers and their parents by failing to disclose that
Instagram is, on balance, harmful to consumers (and especially damaging to girls), by concealing
information about some of its most harmful platform features, by promoting misleading metrics
about platform safety, and by touting inaccurate and ineffective “well-being” initiatives, among
other methods.
27 To be clear, this Petition is focused on harms arising out of compulsive or “problematic” Instagram use, not harms
caused by exposure to any particular content on Instagram.
1. Meta Did Not Disclose its Knowledge That Instagram Harms Users,
Particularly Girls
184. Meta has long known that the Instagram platform is likely harming a significant
portion of its user-base.
185. For instance, in September and October of 2018, Meta surveyed and interviewed
active Instagram users to gauge the association between Instagram and “negative social
comparison,” (i.e., comparisons that make young users feel worse about themselves). In Meta
researchers’ own words, the experiment found that “at least some of this association is causal.”
186. Knowing that “[n]egative social comparison is associated with worsened well-
being measures across the board,” Meta found that for Instagram users, “there is a relationship
between tenure and the length of negative [social comparison].” For instance, among Instagram
users who have been on the platform for an average of 4.4 years, Meta found that “33% of people
hav[e] been feeling worse about themselves on [Instagram] for ‘several months to a year.’”
187. That study also found that Instagram drives negative social comparison, the process
whereby a person assesses their own self-worth by how they perceive they stack up against others,
for teen girls especially. It observed that, in comparison to men who are at least 25 years old,
women are five times more likely and teen girls are eight times more likely to engage in negative
social comparison due to Instagram use.
188. Meta would continue sharpening its internal understanding of the harms
experienced by Instagram users over subsequent years. For instance, Meta knew that Instagram
caused or contributed to:
• Addiction. Meta research from August and September 2019 found that
“Instagram is addictive, and time-spend on [the] platform is having a negative
impact on mental health.” The research observed that teens “have an addicts’
narrative about Instagram use... they wish they could spend less time caring
about it, but they can’t help themselves.” The research concluded, “[o]n an
• 68% of teen girls experience negative social comparison, and this issue is “not
an influencer problem, it’s an Instagram problem.”
190. Meta also knew that the “Explore” surface, a feature whereby Instagram
recommends content to users from accounts that they do not follow, exacerbated some of the harms
identified by the above research.
191. On July 22, 2021, Meta internally shared research indicating that Instagram’s
Explore surface tends to increase users’ “exposure to [negative appearance comparison] content
beyond the preferences that people have indicated by the choice of accounts they follow.”
Consequently, “17% of people see substantially more (at least 20 percentage points) [negative
appearance comparison] content in Explore than in Feed. It’s worse for women and teen girls.”
192. In other words, Meta knew that Instagram’s algorithms were feeding users more
harmful content than the users would have otherwise received if content visibility was only driven
by preferences expressed by the users.
193. In mid-2021, Meta undertook an extensive survey of users to “develop a holistic,
consistent picture of user bad experiences on Instagram that allows [Meta] to track [its] progress
each half [year].”
194. Referred to internally as BEEF (“Bad Experiences and Encounters Framework”),
the survey measured users’ exposure to certain categories of harmful content on Instagram over a
seven-day period and leveraged a subcategory of Instagram users as a control group to “determine
causality.”
195. As discussed in more detail below, the BEEF survey showed a significant number
of Instagram users regularly experienced: (1) negative social comparison-promoting content; (2)
self-harm-promoting content; (3) bullying content; (4) unwanted advances; and (5) a collection of
other harmful encounters on the platform.
196. Relatedly, on August 27, 2021, an Instagram spokesperson shared an exchange with
Head of Instagram Adam Mosseri about a forthcoming article by The Wall Street Journal’s Jeff
Horwitz “that essentially argues that [Instagram’s] design is inherently bad for teenage girls (leads
to [suicide and self-harm], poor mental health, dysphoria).” The spokesperson observed that
Horwitz’s “arguments [are] based on [Meta’s] own research so [they] are difficult to rebut” and
noted that the article could expose “that [Meta’s] own research confirmed what everyone has
long suspected.”
197. Upon information and belief, Meta has not disclosed the vast majority of its internal
research showing the harm Instagram causes its users.
198. Nor has Meta warned consumers of the risks that Instagram poses to users that are
known to Meta internally.
i. Meta’s Leadership Refused to Remediate Instagram’s Known Harms
199. Although Meta understood that Instagram causes significant harm to users, Meta
executives repeatedly declined to fund internal proposals to reduce those harms.
200. As early as March 2019, Meta employees were aware that a critical mass of
“internal and external” research showed that Instagram harms users. Based on that knowledge,
Meta employees raised this issue to Meta’s senior decision-makers.
201. On or around March 8, 2019, a team of Meta researchers sent Sheryl Sandberg a
report warning that Meta was harmful for users, on balance. The report stated, “there is increasing
scientific evidence (particularly in the US) that the average net effect of [Meta] on people’s well-
being is slightly negative.”
202. The report identified “[t]hree negative drivers that occur frequently on Meta’s
platforms and impact people’s well being.” Those drivers were: (1) problematic use (Meta’s
euphemism for compulsive use); (2) social comparison; and (3) loneliness.
203. The report observed: (1) 58.1% of users experienced varying degrees of
problematic use; (2) 45% of users experienced varying degrees of social comparison; and (3) 43%
of users experienced varying degrees of loneliness from using Meta’s platforms.
204. The report warned Sandberg that Meta needed new product investment to remedy
these harms. It stated: “With no additional investment, we are on a trajectory to deliver exploratory
findings (and NO product changes).” “We recommend investing in both the product effort and the
[research] effort.”
205. On April 8, 2019, Meta’s VP of Product, Choice and Competition escalated this
warning by emailing Zuckerberg, Sandberg, and Mosseri. That email reiterated the warning they
had previously shared with Sandberg: “there is increasing scientific evidence (particularly in the
US) that the average net effect of [Meta] on people’s well-being is slightly negative.” Like the
report that Sandberg received individually, the email to Zuckerberg, Mosseri, and Sandberg
implored that “there is a strong need to increase our investment in these areas to make a meaningful
shift over the next year and beyond.”
206. Several days later, a member of Meta’s finance team—speaking on behalf of
Zuckerberg and Sandberg—told the research team that Meta would not fund the recommended
investments at the organizational level.
207. Later that same day, Mosseri clarified that the recommended research would not be
funded at the Instagram level either. He explained, “[uJnfortunately I don’t see us funding this
from Instagram any time soon.”
208. Rather, Meta’s executive decision-makers understood that Instagram was, on net,
negatively impacting its users. But instead of disclosing that fact or investing in solutions to the
problem, Meta continued to prioritize profits at the users’ expense.
209. Later in 2019, Fidji Simo—then Head of Facebook—told Mosseri that, to improve
well-being on Meta’s platforms, “we need to increase investment.”
210. Mosseri replied, “100% agree. My current take is the biggest problem is: Well-
being is the existential question we face, and we lack a... roadmap of work that demonstrates we
care about well-being.”
211. Despite Mosseri’s purported concerns, Meta’s leadership refused to fund well-
being product investments for years.
212. For example, in August 2021, Nick Clegg—Meta’s President of Global Affairs—
emailed Zuckerberg recommending “additional investment to strengthen our position on wellbeing
across the company.”
213. Clegg endorsed this investment because politicians worldwide were raising
concerns “about the impact of [Meta’s] products on Adolescents’ mental health.” Clegg concluded
that while Meta had a “strong program of research,” it “need[ed] to do more and we are being held
back by a lack of investment on the product side which means that we’re not able to make changes
and innovations at the pace required.”
214. Zuckerberg declined to respond to Clegg’s request for months—even after The
Wall Street Journal reported on Meta’s internal research showing the harm Instagram caused to
Adolescents’ mental health.
215. As it turns out, Zuckerberg’s attention was elsewhere. While Clegg and others
worried about public backlash, Zuckerberg was preoccupied with public perception of his
hydrofoil, which is an aquatic recreation device.
216. On September 21, 2021, while Meta’s previously undisclosed internal research was
a leading headline, Meta’s Public Affairs team worked to dissuade Zuckerberg from publicly
mocking a different news story that mistakenly referred to Zuckerberg’s hydrofoil as an “electric
surfboard.”
217. According to a member of the team, Zuckerberg was “eager” to publicly state:
“Look, it’s one thing for journalists to make false claims about my work, but it’s crossing a line to
say I’m riding an electric surfboard when it’s clearly a hydrofoil and I’m pumping that thing with
my legs.”
218. Later in the same conversation, an unamused Clegg observed the absurdity of
Zuckerberg’s inclination:
Am I missing something here? On the day a [Meta] rep[resentative] is pulled apart
by US Senators on whether we care enough about children on our services,
[Zuckerberg] is going to post about . . . surfboards? Maybe I’ve lost my humor
about this whole thing, but I really think this would seem to any casual observer to
be pretty tone deaf given the gravity of the things we’re accused of... If I was him,
I wouldn’t want to be asked “while your company was being accused of aiding and
abetting teenage suicide why was your only public pronouncement a post about
surfing?” ... [The Wall Street Journal’s reporting about Instagram’s mental health
impacts] has dramatically consolidated a wider narrative (that we’re bad for kids)
which had been brewing for some time. It now makes regulation . . . certain, and in
my view makes launching [Instagram] Kids nigh impossible. I’ve told [Zuckerberg]
and [Sandberg] this already.
219. Zuckerberg ultimately released the statement. As Meta’s Head of Communications
said, “I’m really eager to just do whatever he wants at this point. My spine has been surgically
removed.”
220. Meanwhile, Clegg was concerned that Zuckerberg’s mindset was hampering
Meta’s response to the mental health crisis covered in The Wall Street Journal’s reporting. In a
contemporaneous discussion with a member of Meta’s finance team, Clegg implored, “the WSJ
story about [Instagram] and teenage depression and suicide will have a huge impact on
regulatory/political pressure on us going forward . . . I’m worried that none of this is — yet — being
reflected in [Zuckerberg’s] decision making [sic] on [staffing].”
221. Clegg was not alone—other members of Meta’s senior leadership team were also
becoming increasingly alarmed.
222. For instance, following significant media coverage of Meta platforms’ harms to
young users, Meta’s VP of Research emailed Clegg to share, “I feel even more convinced that we
need to make more progress on well-being on the product side.”
223. Similarly, in an October 2021 exchange about Clegg’s well-being recommendation
(to which Zuckerberg still had not responded), Mosseri complained, “I’m really worried about this
... we’ve been talking about this for a long time but have made little progress.”
224. Meta’s VP of Product Management agreed with Mosseri, observing that Meta’s
“biggest gap is getting [Meta’s] research into product roadmaps. We got 0 new well-being funding
for 2022.”
225. Meta’s VP of Product Management reiterated the same with other Meta employees:
“We’ve made a lot of progress on research . . . We’ve not made a lot of progress on getting the
research into product.”
226. In November 2021, Clegg sent an email following up on his August 2021
correspondence to which Zuckerberg had failed to respond. In the follow-up email, Clegg
underscored that product investment is “important to ensure we have the product roadmaps
necessary to stand behind our external narrative of well-being on our apps.”
227. In other words, as Clegg told Zuckerberg, Meta’s external well-being
“narrative” was inconsistent with Meta’s actual financial commitment to that issue.
228. Zuckerberg, “the only non middle manager at Meta” as another Meta executive
described him, was ultimately responsible for this discrepancy as he was the ultimate decision
maker for Meta.
229. According to another executive, while “most companies are ‘feudal’ in their
structure—nesting minor fiefdoms,” Meta is not. Instead, Meta is “an absolute dictatorship” run
by Zuckerberg.
ii. Meta Limited Internal Access to Documents Showing Instagram’s Harms
230. As described above, Meta never publicly disclosed: (1) the internal research
findings showing the harm Instagram caused its users; or (2) Meta’s decision-makers’ refusal to
invest in recommended solutions to known problems.
231. To the contrary, Meta took affirmative steps to hide its internal research from the
public, including from Oklahoma consumers.
232. As Meta’s products, including Instagram, faced growing scrutiny over time, Meta
locked down access to internal research findings.
233. For instance, on August 27, 2021—shortly after Meta learned of The Wall Street
Journal’s forthcoming article—one Instagram research manager noted that Meta was “locking
down access to some of the extra sensitive pieces of work.”
234. The same manager subsequently instructed a research colleague to “make sure that
any of our shareable deliverables or insights docs that you own on the mental well-being space are
locked down.”
235. Similarly, on October 20, 2021, a senior Meta well-being researcher complained
about a new policy requiring Meta’s Communications team to review research findings even
before they could be shared internally.
236. A colleague sympathized with the senior researcher’s grievance and observed that
if internal research “needs to be sanitized to share with [internal] people that need to know (i.e.,
the people in focused, closed groups) then we’ve got a big problem.”
237. The colleague continued, bemoaning the way a Communications team employee
had sanitized their research findings before internal circulation. The colleague explained that the
communications team “took issue with language describing a finding as applying to a general
population instead of just survey responders ... The discussion that followed left [a second
colleague] feeling that [Meta] wouldn’t want us to do that so that [Meta] could more easily
dismiss inconvenient findings. This is a huge moral hazard, in my opinion.”
238. True to form, Meta also restricted access to the BEEF survey results in the latter
half of 2021.
239. As one Meta employee observed on September 30, 2021, “[t]he results of BEEF . .
. are only being shared in private and select groups, to avoid leaks. Sad new world.” According to
the same employee, Meta narrowed BEEF survey result access to a “66-person secret group.”
240. By 2021, one PhD-level researcher compared Meta’s messaging strategy to that of
Big Tobacco. After Meta tried to downplay Instagram’s role in causing mental health harms in the
wake of the Haugen disclosures, the researcher emailed colleagues explaining that:
Pre-[Meta] I spent a lot of time working on public health and environmental issues,
and this sounds eerily similar to what tobacco companies and climate change
deniers say. Uncertainty/doubt is a key component of the scientific method, but it
can also be weaponized to push back on critics (e.g., ‘ . . . but this one scientist
thinks cigarettes don’t cause cancer,’ ‘we need more research to know for sure
whether climate change is man made,’ ‘evolution is just a theory,’ etc etc)...
[W]hen we use language like this it puts us in very bad company.
241. On information and belief, Meta’s internal culture of secrecy was designed to keep
consumers and policymakers in the dark about the harm Meta was causing its users, including
Instagram users.
iii. Meta Did Not Disclose Its Knowledge That Instagram’s Cosmetic Surgery Effect
Selfie Filters Were Especially Harmful to Adolescents
242. The decision-making behind Instagram’s cosmetic selfie filters illustrates Meta’s
culture of deception.
243. Upon information and belief, by 2017, Meta believed that it was losing users and
content creators to rival social media platforms, such as Snapchat.
244. Meta staff concluded that “face filters are viewed as the key differentiator to keep
[content creators] using Snapchat—in particular very large talent is eager for a simple beauty filter
to help them be more comfortable to put their face on camera.”
245. Shortly thereafter, Meta worked to integrate augmented reality filter effects into
Instagram.
246. For example, an internal Meta presentation dated February 27, 2018, describes the
“strategic goal” of integrating augmented reality filter effects into the Instagram platform “to see
if [augmented reality] effects can get strong product market fit... by tapping into [Instagram’s]
teens community and cultural moments.”
247. That “strategic goal” was intended to benefit “Instagram, Teens, and Partners” in
specific ways. For Instagram’s part, integrating augmented reality filter effects would “[i]ncrease
[c]amera [e]ngagement in order to drive more sharing” and “[b]uild a daily behavior by giving
[t]eens reasons to check the camera everyday [sic] through scalable new content.”
248. In other words, these camera filters would increase user engagement—and
consequently, profit—on Meta platforms.
249. But by 2018, some Meta staff were already wary that augmented reality filters may
harm users. They acknowledged a “growing body of research that social media may be driving
significant increases in rates of anxiety and depression, esp[ecially] among teenage women.”
250. As one Meta staffer explained, “[t]his is a hard issue to navigate because I know
there is a lot of competitive pressure and a lot of market demand for filters that go much more
directly into the beautification space. And if we test any of these things, they will undoubtedly
perform well. But just because people like and want something in the short term doesn’t mean it’s
healthy for them.”
251. Consequently, in October 2018, Meta staff commissioned “a researcher and
licensed psychologist at Duke who specializes in eating disorders and body image issues among
adolescents and adults” to undertake a literature review titled, “Consequences and Implications of
Selfie Manipulation on Well-Being.”
252. Meta’s commissioned study of the existing scientific literature found:
An analysis of the costs and benefits of editing selfies and viewing manipulated
photos indicate the risks far outweigh the benefits. Research to date suggests these
behaviors exacerbate risk and maintenance of several mental health concerns
including body dissatisfaction, eating disorders and body dysmorphic disorder . . .
Data also indicates that editing selfies may have a paradoxical effect with regards
to social connection. Rather than increasing acceptance, editing photos may
actually increase social rejection . . . Rather than bringing people together, selfie
manipulation tools risk propagating unrealistic standards of beauty that are cross-
culturally harmful and divide more than they unite.
253. Meta’s decision-makers disregarded the literature review’s warning and
implemented cosmetic selfie filters on the Instagram platform. But in mid-October 2019, Meta
received harsh public rebuke from “press and mental health experts” who observed that certain
selfie filters available on the Instagram platform promoted plastic surgery, raising mental health
concerns.
254. Internally, Meta employees referred to this as a “PR fire” that included “negative
press coverage, questions from regulators, and growing concern from experts.”
255. Based on that public pressure—and approximately eleven months after receiving
unequivocal warnings from the literature review that it commissioned—Meta installed a set of
interim policies banning augmented reality filters that promote cosmetic plastic surgery.
256. After installing these interim policies, Meta devoted significant thought to its long-
term position regarding these augmented filters.
257. For example, a subsequent Meta internal presentation notes, “[i]ndependent experts
we consulted from around the world .. . generally agree that Cosmetic Surgery Effects raise
significant concerns related to mental health and wellbeing, especially for teenage girls.” The
presentation recommended “continuing the ban and erring on the side of protecting users from
potential mental health impacts.”
258. In November 2019, Meta staff formally submitted a long-term policy proposal to
the decision-makers: “Reject cosmetic effects that change the user’s facial structure in a way that’s
only achievable by cosmetic surgery for the purposes of beautification (in a way that cannot be
achieved by makeup).” The proposal clarified, “[t]his does not apply to effects that change a user’s
facial structure for the purpose of turning the user into a character or animal.”
259. Meta’s decision-makers ignored this recommendation.
260. On November 14, 2019, Andrew Bosworth—then Meta’s VP of Augmented
Reality and Virtual Reality—opposed the proposal, arguing that maintaining the ban would only
“move [users] to other apps which aren’t likely to be as restrained.”
261. A day later, Instagram’s Head of Public Policy questioned Bosworth’s perspective.
She noted that the “strong recommendation” to “disallow[] effects that mimic plastic surgery” was
made after consulting with Meta’s communications, marketing, and policy teams—as well as
engagement “with nearly 20 outside experts and academics.”
262. Instagram’s Head of Public Policy added, “we’re talking about actively
encouraging young girls into body dysmorphia . . . the outside academics and experts consulted
were nearly unanimous on the harm here.”
263. Two days later, a second Meta staffer likewise challenged Bosworth’s viewpoint:
“[T]he argument that this decision [to prohibit cosmetic surgery selfie filters] might move people
into other apps doesn’t carry weight with me [i]f it means we’re not setting a good example/being
a good steward for young people.”
264. On March 30, 2020, Sandberg also expressed support for maintaining Meta’s ban
on cosmetic surgery effect filters: “I really hope we can keep the ban since we already have it ...
Let’s not break something that isn’t already broken.”
265. Shortly thereafter, the question of “whether [Meta] should continue, modify, or lift
the temporary ban on Cosmetic Surgery [augmented reality] Effects” was elevated to Mark
Zuckerberg. Zuckerberg, citing his belief that the ban “felt paternalistic,” decided to lift the ban.
266. Later that week, a senior Meta staffer memorialized her disagreement with
Zuckerberg’s decision, stating “I respect your call on this and I’ll support it, but want to just say
for the record that I don’t think it’s the right call given the risks . . . I just hope that years from now
we will look back and feel good about the decision we made here.”
267. In April 2021, several Meta staffers were still reeling from Zuckerberg’s decision.
Reacting to an article that referred to social media’s widespread use of augmented reality filters as
“a mass experiment on girls and young women,” a Meta staffer remarked, “[t]his makes me so sad
to read. Especially knowing how hard we fought to prevent these on [Instagram].”
268. A second Meta staffer replied, “I know, it’s pretty dispiriting to think about. And
the fact that we have no idea what the long-term effects will be for this generation that has grown
up comparing themselves to something that’s . . . totally fake.”
269. The first staffer responded: “Given all the continued coverage on the impact of
beauty filters on youth, I’d want us to consider age-gating these effects to [under-18] accounts.”
270. Upon information and belief, Meta never age-gated cosmetic selfie filters on its
platforms—and the platform feature remains accessible to Adolescents to this day.
271. Upon information and belief, Meta never publicly disclosed its findings that these
effects were harmful to users, a material omission that misled consumers and influenced
consumers’ decision-making about their use of Instagram.
2. Meta Promoted Misleading Metrics About the Incidence of Harm on
Instagram
i. Meta’s Community Standards Enforcement Reports Vastly Understate the
Incidence of Harmful Content and Experiences on Instagram
272. Through public representations, Meta creates the impression that Instagram is a safe
platform on which harmful content is rarely encountered.
273. For example, Meta broadcasts that message through its Community Standards
Enforcement Reports (“the Reports”), which Meta publishes quarterly on its online “Transparency
Center” and amplifies through press releases.
274. The Reports describe the percentage of content posted on Instagram that Meta
removes for violating Instagram’s community standards. Meta often refers to that percentage as
its “prevalence” metric.
275. Through Report-related talking points, Meta directed its employees to tout the
“prevalence” metric as “the most important measure of a healthy online community.”
276. The Reports create the impression that because Meta aggressively enforces
platform community standards—thereby reducing the “prevalence” of community-standards-
violating content—Instagram is a safe product that only rarely exposes users (including
Adolescents) to harmful content.
277. But this is a false equivalency intended to at least sow confusion or, at most,
outright mislead. As Meta well understands, the “prevalence” of standards-violating content,
which is often quite low, is not the same as the actual “prevalence” of harmful content, which is
rampant on Instagram (to say nothing of the harmful effects of the compulsive use of Instagram
that the Defendants have so successfully encouraged).
278. Notably, Meta drafted the “community standards,” and has incentives to design
those standards narrowly so that they are rarely violated.
279. For example, the 2019 third quarter Report touts Meta’s “[p]rogress to help keep
people safe.” Likewise, the 2023 second quarter Report states that “we publish the Community
Standards Enforcement Report . . . to more effectively track our progress and demonstrate our
continued commitment to making . . . Instagram safe.”
280. These representations—publicly accessible on Meta’s online Transparency
Center—create the impression that Meta is disclosing all it knows about the safety of Instagram
and the risks of potential harm that exist on its platform. In other words, Meta posts these reports
to its online Transparency Center so that anyone who visits the site will believe that Adolescents
are unlikely to be harmed by using Meta’s platform.
281. Indeed, documents show that Meta intended the Reports to create that exact
(mis)understanding.
28 See Community Standards Enforcement Report, Q2 2023 Report, Meta Transparency Center (Aug. 2023)
(available at https://transparency.fb.com/data/community-standards-enforcement/).
282. In March 2021, Meta conducted an internal “Company Narrative Audit” that
suggested ways the company could improve its standing with the public—and with consumers,
more specifically. The audit identified several “narratives” the company should attempt to
combat, such as “[Meta] allows hateful and harmful content to proliferate on its platform.”
283. To counteract that narrative, the audit suggested that Meta should publicize that:
“Every three months we publish our Community Enforcement Standards Report to track our
progress and demonstrate our continued commitment to making Facebook and Instagram safe.”
284. Consistent with this effort, internal communications show that Meta encouraged
employees to use the Reports as an external “measure for platform safety” that “illustrate the
efforts we are making to keep our platforms safe.”
285. But the impression that the Reports create—that Meta platforms are safe and only
rarely display harmful content—is false and misleading.
286. For example, Meta’s 2021 third quarter Report states that on Instagram, “less than
0.05% of views were of content that violated our standards against suicide & self-injury.” That
representation created the impression that it was very rare for users to experience content relating
to suicide and self-injury on Instagram.
287. But Meta’s contemporaneous internal BEEF survey data showed that during 2021,
6.7% of surveyed Instagram users had seen self-harm content within the last seven days. For users
between 13-15 years of age, 8.4% had seen content relating to self-harm on Instagram within the
last seven days.
288. Thus, the frequency with which users encounter self-harm-related content on
Instagram vastly exceeded the impression Meta created through its Reports.
289. A similar discrepancy may be seen in Meta’s measurement of bullying and
harassing content.
290. For example, the third quarterly Report of 2021 stated that “we estimate between
0.05% to 0.06% of views were of content that violated our standards against bullying & harassment
[on Instagram].” This representation created the impression that it was very rare for users to
observe or experience bullying or harassment on Instagram.
291. Again, Meta’s contemporaneous internal user survey data told a different story:
Among surveyed Instagram users, 28.3% witnessed bullying on the platform within the last seven
days and 8.1% were the target of bullying on the platform within the last seven days.
292. Among 13-15-year-olds, 27.2% reported witnessing bullying within seven days.
Among users aged 16-17, that figure was 29.4%.
293. When asked whether they had been the target of bullying on Instagram within the
last seven days, 10.8% of 13-15-year-olds said yes.
294. Similarly, and contrary to the 2021 third quarter Report’s representation that
harassment on Instagram was rare, Meta’s contemporaneous internal survey showed that 11.9% of
all survey respondents said they had received unwanted advances on Instagram within the last
seven days.
295. Among 13-15-year-olds, 13.0% reported that they had received unwanted advances
within the last seven days. Among 16-17-year-olds, that figure was 14.1%.
296. In other words, contrary to the impression the Reports created, Instagram users in
general—and Adolescents in particular—regularly encounter content related to self-harm,
bullying, and harassment on Instagram. Through its Reports, Meta affirmatively misrepresented
the actual prevalence of such harms.
ii. Meta’s Executive Leadership Knew the Reports Misled Consumers
297. Meta’s leadership team understood the discrepancy between Meta’s public Reports
and Meta’s internal survey results.
298. On October 5, 2021, Bejar—then a contractor for Meta and formerly Meta’s
Director of Site Engineering—emailed Zuckerberg, Sandberg, Cox, and Mosseri voicing concerns
that “there was a critical gap in how [Meta] as a company approach[es] harm.”29
299. Bejar proposed that the company shift the focus of its public communications away
from the “prevalence” of community standards violations, towards a measure (like the BEEF
surveys)30 that reflected the true scope of harmful content encountered on Instagram.31
300. Meta’s senior leadership did not respond to Bejar. In fact, Zuckerberg, with whom
Bejar worked directly for several years, declined to respond to Bejar’s email. Bejar has stated that
he could “not think of an email that I sent to Mark [Zuckerberg] during my time [at Meta] that he
didn’t read or respond to.”32
301. Undeterred, Meta continued to issue and publicize the Reports—even though
Meta’s leadership team knew the Reports vastly under-represent the volume of harmful content on
Instagram, and despite Bejar’s pleas.
302. During the multistate investigation, Bejar testified that Meta adopted and
maintained this strategy to mislead the public.
29 Ex. 1, Bejar Trans., at 236:16-290:14.
30 Id.
31 Id.
32 Id. at 291:7-17.
303. When asked if he believed “that Mr. Zuckerberg and other company leaders focused
on the Prevalence metric because it created a distorted picture about the safety of Meta’s
platforms,” Bejar testified “I do.”33
304. When asked if he thought “Mr. Zuckerberg’s public statements about prevalence
created a misleading picture of the harmfulness of Meta’s platforms,” Bejar testified “I do.”34
305. And when asked if he was “aware of any instances where the company, in [his]
view, minimized the harms users were experiencing on Meta’s platforms,” Bejar testified: “Every
time that a company spokesperson in the context of harms quotes Prevalence statistics I believe
that is what they are doing, that they’re minimizing the harms that people are experiencing in the
product.”35
306. Meta issued the Reports and made other public representations in order to downplay
the harmful experiences that are widespread on Instagram—particularly for Adolescents.
3. Meta Deceived Consumers by Promoting “Time Spent” Tools Despite Known
Inaccuracies
307. For years, Meta has affirmatively deceived consumers by promoting and
maintaining inaccurate time-tracking tools on Meta platforms.
308. On August 1, 2018, Meta announced “new tools to help people manage their time
on Facebook and Instagram.” The announcement touted platform-specific activity dashboards,
daily use reminders, and a push notification-limiting tool engineered “based on collaboration and
inspiration from leading mental health experts and organizations, academics, [Meta’s] own
extensive research and feedback from [Meta’s] community.”36
33 Id. at 200:16-201:13.
34 Id.
35 Id. at 291:7-17.
36 See Ameet Ranadive and David Ginsberg, New Tools to Manage Your Time on Facebook and Instagram, Meta
Newsroom (Aug. 1, 2018) (available at https://about.fb.com/news/2018/08/manage-your-time/).
309. In that announcement, Meta acknowledged that it has “a responsibility to help
people understand how much time they spend on [Meta] platforms so they can better manage their
experience.” Meta stated that it hopes “that these tools give people more control over the time they
spend on our platforms and also foster conversations between parents and teens about the online
habits that are right for them.”37
310. Through these public statements and others, Meta led Oklahoma consumers and
parents to believe they could rely on Meta’s so-called “Time Spent” tools to track and manage the
time spent on Instagram in a meaningful, accurate way.
311. That representation was false. By March 2020, Meta knew that its Time Spent data
was materially flawed.
312. As one Meta staffer observed at the time, “[o]ur [Time Spent] data as currently
shown is incorrect. It’s not just that Apple / Google have better data. Ours is wrong. Far worse.
We're sharing bad metrics externally .. . The reason this is relevant is we vouch for these
numbers. Any day they’re out there is a legal liability.”
313. By the middle of 2020, Instagram’s team charged with decommissioning (or, as
Meta refers to it, “unshipping”) platform features recommended that Meta’s Time Spent tools
should be removed from Meta’s platforms.
314. But Meta did not follow that recommendation because the “Time Spent” tool was
a key part of Meta’s message to users that Instagram was a trustworthy platform where the risks
of addiction were low.
315. For instance, when she learned about the effort to remove the Time Spent tools,
Instagram’s Head of Policy feared that the removal of inaccurate Time Spent tools would strip
37 Id.
Meta of its “biggest proof point” on “tech addiction/problematic use.” Consequently, she
advocated that the Time Spent tools should remain in place, despite their inaccuracy:
[T]he time spent dash[board and] end of feed notification is the biggest proof
point we have on tech addiction/problematic use and the tool with the most
positive sentiment from our mental health stakeholders—there’s no product work
we’ve done in the last four years that comes close and we wouldn’t have the
credibility we now have in the social comparison/mental health parent space had
we not launched this... In order to land this unship successfully we would need to
land the why, and without doing so we would lose significant credibility with our
policy and mental health stakeholders . . . I don’t think that’s going to land well
without having something that addresses the underlying issue around problematic
use.
316. Internal resistance to the removal of Time Spent tools continued in the latter half of
2020, as users spent more time on Meta’s platforms during the COVID-19 pandemic. For example,
in July 2020, Meta’s Product Marketing and Communications teams told colleagues that Meta
should not remove the inaccurate Time Spent tools because:
• “Time spent is a bigger concern due to COVID/spending more time online.”
• “[Meta] just deprioritized the mental health team, so no new or upcoming
[mental health-promoting] features to point to here.”
• “[Facebook] launched their v2 time spent tool on iOS in Q2 (Android coming
in Q3) and got decent press around the re-launch.”
• “Upcoming moments make the market environment sensitive in this area
(suicide prevention day (sept), world mental health day (oct)) and there is
concern that back-to-school will spark new issues in market perception due to
the majority being online/remote learning so time spent online will likely be
top-of-mind for many.”
317. In other words, Meta’s Product Marketing and Communications teams preferred to
maintain the facade of Meta’s Time Spent tools because the truth—that Meta was not actually
providing any meaningful, accurate tools to help users or parents combat or reduce compulsive
use—would undermine Meta’s business interests and public “sentiment.”
318. By October of 2020, internal momentum to discontinue Meta’s inaccurate Time
Spent tools was successfully stymied.
319. In the words of one Meta employee who originally advocated for the removal of
inaccurate Time Spent tools: “I don’t think we can touch [the Time Spent tool] for months, maybe
even more. The regulatory and brand risk from removing our only addiction-related features
outweighs . .. the wins around user trust in the data from the few users who use it.”
320. More than a year after recognizing the Time Spent tools’ inaccuracy, Meta
continued publicly touting the features.
321. On information and belief, Meta regularly promoted its “Time Spent” tool as an
accurate and useful way for users to control their use of Instagram, even when it knew that the
“Time Spent” tool delivered inaccurate metrics.
322. Meta made these representations to build trust with consumers and parents that
Meta’s Time Spent tool would help users (particularly young users) manage their time on
Instagram, even though Meta knew that tool was broken. In this way, Meta won public trust and
sentiment by deceiving the public about the utility of its core addiction-mitigation feature.
4. Through Public Misrepresentations, Meta Leads the Public to Believe That
Instagram is Safe for Adolescents
323. The Time Spent episode is not the only time Meta has prioritized winning trust over
telling the truth. To the contrary, Meta has repeatedly misrepresented facts about its business to
convince consumers and their parents that Meta can be trusted to keep Adolescents safe on
Instagram.
i. Meta Created the False Impression That It Restricts Adolescents from
Accessing Harmful Content on Instagram
324. Through express representations, Meta cultivated the impression that it protects
young users from harmful or inappropriate content on Instagram.
325. For example, in the opening statement to his Congressional testimony in December
2021, Adam Mosseri stated “We’ve put in place multiple protections to create safe and age-
appropriate experiences for people between the ages of 13 and 18” on Instagram.
326. Antigone Davis—Meta’s Global Head of Safety—made the same representation in
prepared remarks to Congress in September 2021.38
327. During subsequent questioning from senators, Davis explained that “[w]hen it
comes to those between 13 and 17, we consult with experts to ensure that our policies properly
account for their presence, for example, by age-gating content.”39 Davis added, Meta does not
“allow young people to see certain types of content. And we have age gating around certain types
of content.”40
328. Davis also specifically testified that Meta does not “direct people towards content
that promotes eating disorders.”41
329. But internal documents reveal Meta’s knowledge that Adolescents regularly
encounter content on Meta’s platforms that is not age appropriate—and that Meta’s platforms do,
in fact, push certain Adolescents toward content that promotes eating disorders.
330. In fact, a report that Davis herself authored less than a year before her testimony
contradicts the representations she made to Congress (and the public). Titled “Child Safety: State
of Play,” Davis’s October 2020 report contains many alarming findings regarding the lack of
protections for young users on Instagram.
38 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
39 Id.
40 Id.
41 Id.
331. For example, according to that report, Instagram had “minimal child safety
protections” that were needed to prevent “Child Sexual Exploitation.”
332. On the topic of “Age Assurance/Appropriateness” on Instagram—a key feature of
Davis’s testimony—Davis’s report showed that Instagram’s “vulnerabilities” included “U18
enforcement.” More specifically, Davis’s report noted that Instagram’s “age gating relies on either
stated age or weak age models; lack[s] checkpoint.” The same slide identified “content” as an
additional “[v]ulnerability” for Instagram, due to the presence of “inappropriate/harmful content
and experiences for minors.” This slide concluded that for these topics, “there is work happening
in this area but not resourced to move quickly.”
333. A later slide observed Instagram’s significant “vulnerabilities” regarding users’
well-being. It found that “core product features connect to challenging societal issues” such as the
“objectification of women (e.g. [augmented reality] face and body altering filters), competitive
social comparisons (e.g. likes and comments) and anxiety/[fear of missing out] (e.g.
notifications).” That slide also noted that Instagram was “vulnerable” because it had difficulty
“calibrating for content impact on well-being (e.g.[,] eating disorder content and gender-based hate
speech).”
334. Other internal documents demonstrate that Meta did not meaningfully improve
Instagram’s safety or age-appropriateness by the time Davis testified in September 2021.
335. For instance, according to Meta’s internal findings from October 2021 (just after
Davis’s testimony), only 2% of content that young users encounter on Meta’s platforms is “age
appropriate” or “the sort of content we would like to promote to teens.”
336. And, in contrast to Davis’s testimony, Meta’s internal studies show that Instagram
disproportionately directs teen girls to negative appearance comparison-promoting content—
content that Meta knows promotes eating disorders. For example, a June 2021 internal study shows
that on Instagram “approximately 70% of teen girls see ‘too much’ sensitive content,” i.e., content
that makes them “often feel worse about themselves.” And another June 2021 internal study
showed that “roughly 1 in 5 pieces of content” teen girls see is “associated with more
negative appearance comparison.”
337. As these examples show, through Mosseri and Davis’s testimony, Meta
affirmatively misled the public about the efficacy of Meta’s efforts to protect young users from
harmful content and/or to deliver age-appropriate experiences on Instagram. These are material
misrepresentations, as reasonable consumers would be less likely to use a platform (or to allow
young users in their care to use a platform) that exposes users to age-inappropriate or harmful
content.
ii. Meta Created the False Impression That It Does Not Prioritize Time Spent
338. To downplay concerns that Instagram is addictive, Meta has repeatedly created the
public impression that it does not prioritize increasing users’ time on Instagram. To create that
impression, Meta’s executives claimed that it does not internally measure success in terms of the
time users spend on Meta’s platforms or otherwise encourage employees to pursue that goal.
339. For example, in October 2019, Mark Zuckerberg publicly stated that Meta does not
allow Meta “teams [to] set goals around increasing time spent on [Meta’s] services.”
340. Similarly, in October 2021, Sheryl Sandberg represented that the company does not
“optimize [its] systems to increase amount of time spent” and that Meta “explicitly do[es]n’t give
[its] team goals around time spent.”
341. Meta makes representations like these to garner trust: it wants the public (including
consumers and parents) to believe that it does not measure success in terms of time spent to dispel
the notion that it intentionally fuels compulsive use of Meta’s products.
342. But Meta’s representation that it does not set goals based on time spent is false.
343. For instance, on December 28, 2015, Zuckerberg instructed that Meta should aim
to increase the time that Instagram users spend on the platform by 10% within the next five years.
344. Similarly, an internal email to Instagram’s co-founders lists “emphasis on driving
time spent” among the company’s “[k]ey [t]hemes” for the first half of 2016.
345. As another example, an internal Meta presentation titled “2017 Teens Strategic
Focus,” which was “shared with Zuck,” explicitly describes Meta’s “Top-Line Goals” for the
first half of 2017. The first “Top-Line Goal” is to “grow teen time spent.”
346. On information and belief, Meta continues to work to increase users’ time spent on
Instagram even to this day.
347. Thus, by claiming that it did not set goals based on time spent, Meta affirmatively
misled the public—including Oklahoma consumers and parents—about Meta’s motivations and
internal business practices. This is a material misrepresentation, as reasonable consumers (and the
parents of Adolescents) would be less likely to trust a platform that works to capture ever-
increasing shares of users’ time.
iii. Meta Created False Impressions That It Does Not Place a Monetary Value on
Adolescents
348. In a similar vein, Meta deceptively led the public to believe that it does not place a
monetary value on Adolescents’ use of Meta platforms. Meta created that impression by
suggesting that it does not discuss its youngest users in terms of their financial value to the
company.
349. For example, during Antigone Davis’s September 2021 Congressional testimony,
Senator Amy Klobuchar asked Davis for the monetary value that Meta places upon a young user’s
lifetime use of Meta products.
350. Davis responded, “That’s not how we think about building products for young
people... It’s just not the way we think about it.”
351. Through Davis’s testimony, Meta led the public to believe that it does not place a
monetary value on Adolescents’ use of Meta’s platforms.
352. But Meta’s internal correspondence demonstrates that Davis’s response to Senator
Klobuchar was inaccurate and misleading.
353. For instance, an internal email from September 2018 illustrates that Meta plainly
discusses the financial value that Adolescents represent to the company. According to Meta, “The
lifetime value of a 13 [year old] teen is roughly $270 per teen.”
354. Consequently, through Davis’s testimony, Meta affirmatively misled the public—
including Oklahoma consumers—about whether the company places a monetary value upon young
users’ lifetime use of Meta’s products. This is a material misrepresentation, as reasonable
consumers (and especially the parents of Adolescents) would be less likely to trust a platform that
calculates the monetary value that the platform may extract from an Adolescent’s lifetime
engagement.
iv. Meta Created the Misleading Impression That It Was Not Restricting Access
to Internal Research Findings and That It Used Its Internal Research to Improve
Product Safety
355. Through Congressional testimony, Meta deceptively led the public to believe that
it had not changed its internal data and research access policies in response to The Wall Street
Journal’s 2021 coverage of Meta’s internal research findings. Meta wanted to create that
impression so consumers and parents would believe that the company’s well-being research was
widely available internally and that the company had no reason to lock down internal information
about Instagram’s mental health impacts.
356. During Davis’s September 2021 Congressional testimony, Senator Marsha
Blackburn asked Davis “how are you restricting access to data internally? Have your policies
changed since The Wall Street Journal articles [by Jeff Horwitz describing the internal Meta
research shared by Frances Haugen]?”42
357. Davis succinctly responded, “Senator, not that I am—not that I’m aware of
certainly.”43
358. Through Davis’s testimony, Meta led the public to believe Meta did not change its
internal access policies—such as restricting internal access to data and research—following The
Wall Street Journal’s coverage of the internal Meta research shared by Frances Haugen.
359. But in fact, as described in detail in Section C.1.ii above, in reaction to Haugen’s
public disclosures (which led to Davis’s Congressional testimony), Meta methodically locked
down internal access to well-being related data and research.
360. To briefly restate evidence described above, in August 2021—shortly after Meta
learned of The Wall Street Journal’s forthcoming journalism—one Instagram research manager
noted that the company was “locking down access to some of the extra sensitive pieces of work.”
42 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
43 Id.
361. The same manager subsequently instructed a research colleague to "make sure that
any of our shareable deliverables or insights docs that you own on the mental well-being space are
locked down."44
362. Consequently, through Davis's testimony, Meta affirmatively misled the public—
including Oklahoma consumers—about whether the company internally restricted access to data
and research following The Wall Street Journal’s coverage of Meta’s internal findings. This is a
material misrepresentation, as reasonable consumers (and parents of Adolescents) would be less
likely to trust a platform that undertakes affirmative steps to shield from disclosure internal
information about the platform’s “mental well-being” impacts.
363. Similarly, through Congressional testimony, Meta deceptively led the public to
believe that the company regularly uses internal research findings to inform safety-oriented
product improvements. Meta created this impression so consumers, parents, and guardians would
believe that the company used its troubling internal research findings to improve the safety of its
platforms.
364. During Davis’s September 2021 Congressional testimony, Senator Klobuchar
asked Davis: “What specific steps did you... take in response to your own research [into
Instagram users’ body image issues] and when?”
365. Davis responded: “Senator Klobuchar, I don’t know that I'll be able to give you
exact dates, but what I can tell you is that this research has fueled numerous product changes.”
366. Similarly, during Mosseri’s December 2021 Congressional testimony, Senator Ted
44 Id.
45 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
46 Id.
Cruz asked Mosseri: "How did you change your policies as a result of [Meta's internal research
into Instagram users’ suicidal thoughts] to protect young girls?”
367. Mosseri responded: "Senator, I appreciate the question. We use research to not only
change our policies, but to change our product on a regular basis."
368. Through Davis and Mosseri's Congressional testimony, Meta led the public to
believe Meta regularly uses internal research findings to improve product safety.
369. But in fact, as described in detail in Section C.1.i. above, members of Meta’s
leadership—including Mosseri—acknowledged the company’s failure to translate research
findings into meaningful product changes (1) shortly after Davis’s testimony; and (2) in the months
after Davis’s testimony and preceding Mosseri’s testimony.
370. To briefly restate the evidence detailed above, in October 2021—just two months
before Mosseri’s testimony-—a senior Meta employee explicitly told Mosseri that Meta had “not
made a lot of progress on getting the research into product.”
371. Around the same time, Mosseri complained about Meta’s failure to translate
research findings into product safety improvements: “I’m really worried about this... we’ve been
talking about this for a long time but have made little progress.”
372. And in November 2021—just one month before Mosseri’s testimony—another
senior Meta employee sent an email to Zuckerberg, Mosseri, and others, underscoring Meta’s
outstanding need “to ensure we have the product roadmaps necessary to stand behind our external
narrative of well-being on our apps.”
47 See U.S. Senate Committee on Commerce, Science & Transportation, Subcommittee on Consumer Protection,
Product Safety, and Data Security, Protecting Kids Online: Instagram and Reforms for Young Users (Dec. 8, 2021)
(available at https://www.commerce.senate.gov/2021/12/protecting-kids-online-instagram-and-reforms-for-
users).
48 Id.
373. Consequently, through Davis and Mosseri’s Congressional testimony, Meta
affirmatively misled the public—including Oklahoma consumers—about measures the company
had taken (or failed to undertake) to translate troubling research findings into meaningful product
safety improvements. This is a material misrepresentation, as reasonable consumers, parents, and
guardians would be less likely to trust a platform that fails to deploy safety improvements to
products that pose a known, mitigatable risk to users.
v. Meta Created the Impression That Its Products Are Not Addictive Despite
Meta's Internal Research Showing the Contrary
374. Through Congressional testimony, Meta deceptively led the public to believe that
its platforms are not addictive, despite Meta’s own internal research to the contrary.
375. In her September 2021 Congressional testimony, Davis said that Meta does not
build its products to be addictive and disputed the addictive nature of Meta's products.49
376. Similarly, in Congressional testimony from December 2021, Adam Mosseri said,
"I don't believe that research suggests that our products are addictive."50
377. Through Davis and Mosseri’s testimony, Meta led the public to believe Meta’s
platforms are not addictive.
378. In fact, as described in detail in Section B above, Meta (1) knew Instagram was
addictive; and (2) made decisions that facilitated addiction to Instagram long before Davis and
Mosseri’s misleading testimony.
49 See Facebook Head of Safety Testimony on Mental Health Effects: Full Senate Hearing Transcript, Rev (Sept. 30,
2021) (available at https://www.rev.com/blog/transcripts/facebook-head-of-safety-testimony-on-mental-health-
effects-full-senate-hearing-transcript).
50 See Taylor Hatmaker, Instagram's Adam Mosseri Defends the App's Teen Safety Track Record to Congress,
TechCrunch (Dec. 8, 2021) (available at https://techcrunch.com/2021/12/08/instagrams-adam-mosseri-senate-
hearing-teen-safety/).
379. To briefly restate evidence described above, by September 2019, Meta knew from
internal research that "[t]eens are hooked despite how [Instagram] makes them feel . . . Instagram
is addictive, and time-spend on platform is having a negative impact on mental health."
380. And after observing in May 2020 that "approval and acceptance are huge rewards
for teens and interactions are the currency on [Instagram]," Meta deployed engagement-inducing
platform features, such as "[direct messages], notifications, comments, follows, likes, etc. [that]
encourage teens to continue engaging and keep coming back to the app."
381. Consequently, through Davis and Mosseri’s Congressional testimony, Meta
affirmatively misled the public—including Oklahoma consumers—about the addictive nature of
the Instagram platform. This is a material misrepresentation, as reasonable consumers (and parents
of Adolescents) would be less likely to trust an addictive platform.
VIOLATIONS OF LAW
COUNT I
OKLAHOMA CONSUMER PROTECTION ACT
15 O.S. § 751-763
(UNFAIRNESS)
382. Oklahoma re-alleges and incorporates by reference all prior paragraphs of this
Petition.
383. The OCPA prohibits businesses from knowingly engaging in “unfair” trade
practices, which are defined as any practice “which offends established public policy” or is
“immoral, unethical, oppressive, unscrupulous or substantially injurious to consumers.” 15 O.S. §
752(14).
384. Defendants have engaged and continue to engage in “consumer transactions” as
that term is defined in the OCPA with hundreds of thousands of Oklahomans.
385. By designing and deploying Instagram in a manner that induces compulsive use,
the Defendants have engaged in unfair trade practices prohibited by the OCPA.
386. Defendants designed and deployed Instagram in a manner that overwhelmed
consumers’ free and informed choice regarding how much time to spend on the Instagram
platform.
387. Defendants’ scheme was particularly unfair as it relates to Adolescent users, who
are a highly susceptible class of consumers. Indeed, Defendants designed and deployed Instagram
in a manner that intentionally exploited the developmental nature of Adolescents’ brains, creating
an obstacle to Adolescents’ free choice and causing them to spend more time on Instagram than
they otherwise would.
388. By designing and deploying Instagram in a manner that induces compulsive use,
Defendants caused or are likely to cause substantial injury to Oklahoma consumers. Specifically,
Defendants’ unfair conduct has caused or is likely to cause significant harms to the mental health
and well-being of Adolescents, who Defendants have caused to spend vastly more time on
Instagram than they otherwise would.
389. Through their conduct, Defendants have likely injured a large number of
Oklahomans, including a significant number of Adolescents who have likely suffered profound and
severe harms as a result of Defendants’ conduct.
390. Each instance of Defendants’ unfair practices constitutes a separate violation of the
OCPA.
391. Insofar as there are positive benefits associated with Defendants’ conduct, those
benefits do not outweigh the harm arising out of Defendants’ conduct.
COUNT II
OKLAHOMA CONSUMER PROTECTION ACT
15 O.S. § 751-763
(DECEPTION)
392. Oklahoma re-alleges and incorporates by reference all prior paragraphs of this
Petition.
393. Under the OCPA, a business engages in deceptive conduct by, either orally or
through a writing, making a “misrepresentation, omission or other practice that has deceived or
could reasonably be expected to deceive or mislead a person to the detriment of that person." 15
O.S. § 752(13).
394. Defendants have engaged and continue to engage in “consumer transactions” as
that term is defined in the OCPA with hundreds of thousands of Oklahomans.
395. As described in this Petition, Defendants have repeatedly deceived consumers
through their words, conduct, silence, and action—in violation of the OCPA.
396. By making express and implied material misrepresentations about Instagram’s
safety, the incidence of harmful experiences on Instagram, and the efficacy of Instagram’s “well-
being” related platform features (such as the “Time Spent” tool), the Defendants have engaged in
deceptive trade practices that are prohibited by the OCPA.
397. Defendants also engaged in deceptive conduct in violation of the OCPA by failing
to disclose the harms associated with Instagram in general and with certain Instagram platform
features, which Defendants knew had a harmful effect on consumers’ mental health and well-
being. Defendants knew the express and implied representations they were making were not true
but made these representations anyway to increase consumers’ engagement with Instagram.
398. Through their acts, omissions, and misrepresentations, the Defendants downplayed
the risks of Instagram use and caused reasonable consumers to believe something that was false,
i.e., that Instagram is a safer platform than it is in reality.
399. Each instance of Defendants' deceptive practices constitutes a separate violation of
the OCPA.
RELIEF REQUESTED
Plaintiff respectfully requests that this Court:
A. Enter judgment against each Defendant in favor of the State of Oklahoma for each
violation alleged in this Petition;
B. Issue a permanent injunction: (i) prohibiting Defendants from using platform
features that Defendants know or have reason to believe cause compulsive use
among Adolescents; and (ii) requiring Defendants to meaningfully and publicly
disclose, on a regular basis, the risks posed by Instagram to Adolescents;
C. Issue a permanent injunction prohibiting Defendants from engaging in deceptive
acts and practices in violation of the OCPA;
D. Order each Defendant to separately pay civil penalties to the State of Oklahoma not
more than $10,000 per violation of the OCPA as provided by 15 O.S. § 761.1;
1. Enter judgment finding that each instance in which an Adolescent accessed
the Instagram platform in the State of Oklahoma represents a distinct
violation of the OCPA;
E. Enter judgment against Defendants and in favor of the State of Oklahoma for the
reasonable costs and expenses of the investigation and prosecution of Defendants'
unlawful conduct, including attorney's fees, expert and other witness fees, and
costs, as provided by 15 O.S. § 761.1;
F. Award such other relief as the Court deems necessary and proper under the
circumstances.
GENTNER DRUMMOND
ATTORNEY GENERAL OF OKLAHOMA
Robert J. Carlson, OBA #19312
Senior Assistant Attorney General
Caleb J. Smith, OBA #33613
Assistant Attorney General
Oklahoma Office of the Attorney General
15 West 6th Street, Suite 1000
Tulsa, OK 74119
Telephone: 918-581-2885
Email: Robert.Carlson@oag.ok.gov
Email: Caleb.Smith@oag.ok.gov
-and-
Ethan A. Shaner, OBA #30916
Deputy Attorney General
Oklahoma Office of the Attorney General
313 N.E. 21st Street
Oklahoma City, OK 73105
Telephone: (405) 521-3921
Email: Ethan.Shaner@oag.ok.gov
CERTIFICATE OF SERVICE
I do hereby certify that on the 16th day of November 2023, a true, correct and exact copy
of the above and foregoing document was served to those parties as listed below via:
[ ] U.S. Postal Service [ ] In Person Delivery [ ] Courier Service [X] E-Mail [ ] Fax
Robert G. McCampbell, OBA No. 10390
Nicholas (“Nick”) V. Merkley, OBA No. 20284
GABLEGOTWALS
BOK Park Plaza
499 West Sheridan Avenue, Suite 2200
Oklahoma City, Oklahoma 73102
Tel (405) 235-5500 | Fax (405) 235-2875
Email RMcCampbell@Gablelaw.com
NMerkley@Gablelaw.com
COUNSEL FOR DEFENDANTS,
META PLATFORMS, INC. F/K/A FACEBOOK, INC.
AND INSTAGRAM, LLC
Robert J. Carlson
EXHIBIT 1
CONFIDENTIAL
--oOo--
CONFIDENTIAL PROCEEDINGS
EXAMINATION UNDER OATH OF ARTURO BEJAR
Regarding Meta Platforms
Tuesday, May 16, 2023
Palo Alto, California
Stenographically Reported By:
Hanna Kim, CLR, CSR No. 13083
Job No. 5907326
Page 1
Veritext Legal Solutions
Calendar-CA@veritext.com 866-299-5127
9 Q. In that way, do you think Mr. Zuckerberg's
10 public statements about prevalence created a
11 misleading picture of the harmfulness of Meta's
51 (Pages 198 - 201)
13 THE WITNESS: Frances Haugen's
14 whistleblowing.
15 BY MR. PHELPS:
16 Q. Might this have been the same day as
17 Ms. Haugen's testimony before Congress?
18 A. I don't know.
19 Q. Okay. Do you know what note -- it says,
20 "I saw the note you shared." Do you know what note 03:59:50
21 you were talking about?
22 A. Yes. He posted an internal note where he
23 was talking about all of these issues.
60 (Pages 234 - 237)
1 there were really not -- I mean, we'll get into it,
2 but I think I can summarize it by saying, there's a
3 sentence here that says, "We care deeply about
4 issues like safety, well-being, and mental health."
5 So what is the well-being metric at this 04:03:06
6 point? There is no well-being metric in the company
7 at this point.
8 What are the safety metrics? If they care
9 about bullying and harassment for teenagers, where
10 are the metrics that tell you the -- and the work 04:03:24
11 based around the percentage of teenagers that are
12 telling you they're experiencing firsthand bullying
13 and harassment? What's the content that's driving
14 that? What are the features that make that better?
15 Right? 04:03:38
16 "We care." What does it mean, "we care"?
17 What does it mean we care when they talk about
18 mental health, when I'm working on an Instagram
19 team, and they de-prioritized the mental health work
20 in the middle of the pandemic? Right? So what does 04:03:53
21 it mean "they care about mental health"? Does it
22 mean that he cares as a human being, or does the
23 company actually act on these things?
24 And so, I read this e-mail, and I felt
25 that it was very misleading to employees because 04:04:04
1 he -- he talks about claims that don't make any
2 sense. But if we go through them, no, actually, all
3 those claims make a lot of sense. And I believe
4 it's his responsibility as the leader of the company
5 to investigate, address these claims and innovate 04:04:18
6 and make these things better because that company
7 has managed to get most of humanity on its services.
8 And so, great, you did that, but, like,
9 this is -- this is the work. This is what you need
10 to be really great at in order to provide a safe 04:04:34
11 environment for teenagers, in order to create a safe
12 environment for everybody and respectful environment
13 for everybody.
14 And so, when I wrote this note, I felt
15 that it embodied all of the gaps that I had been 04:04:49
16 researching in that previous year that I tried to
17 summarize in the e-mail I sent to him and Sheryl and
18 Adam and -- and -- and Chris.
20 Q. Okay. What was your reaction to this note 04:02:01
21 in -- in broad strokes before we get into it in
22 detail?
23 A. I think it's misleading. And -- and it
24 represented many of the things that I felt were
25 driving the behavior of the company in a way that 04:02:36
61 (Pages 238 - 241)
7 Q. Are you aware of any instances where the
8 company, in your view, minimized the harms users
9 were experiencing on Meta's platforms after you sent
10 those e-mails? 06:12:04
11 A. Every time that a company spokesperson in
12 the context of harms quotes prevalence statistics,
13 I believe that is what they are doing, that they're
14 minimizing the harms that people are experiencing in
15 the product. Be it -- albeit somebody in the 06:12:22
16 communications team, I think that prevalence is not
17 harm.
18 These numbers in that e-mail, that's harm.
19 You need prevalence, but it's not reflective of the
20 harm people are experiencing in the product.
81 (Pages 318 - 321)
Delaware -
The First State
I, JEFFREY W. BULLOCK, SECRETARY OF STATE OF THE STATE OF
DELAWARE, DO HEREBY CERTIFY "FACEBOOK, INC." IS DULY INCORPORATED
UNDER THE LAWS OF THE STATE OF DELAWARE AND IS IN GOOD STANDING AND
HAS A LEGAL CORPORATE EXISTENCE SO FAR AS THE RECORDS OF THIS
OFFICE SHOW, AS OF THE TENTH DAY OF JANUARY, A.D. 2019.
AND I DO HEREBY FURTHER CERTIFY THAT THE ANNUAL REPORTS HAVE
BEEN FILED TO DATE.
AND I DO HEREBY FURTHER CERTIFY THAT THE SAID “FACEBOOK, INC."
WAS INCORPORATED ON THE TWENTY-NINTH DAY OF JULY, A.D. 2004.
AND I DO HEREBY FURTHER CERTIFY THAT THE FRANCHISE TAXES HAVE
BEEN PAID TO DATE.
3835615 8300 Authentication: 202059700
SR# 20190197586 Date: 01-10-19
You may verify this certificate online at corp.delaware.gov/authver.shtml
OFFICE OF THE SECRETARY OF STATE
CERTIFICATE OF AUTHORITY
WHEREAS, FACEBOOK, INC.
incorporated under the laws of the State of DELAWARE has filed in the office of
the Secretary of State duly authenticated evidence of its incorporation and an
application for Certificate of Authority to transact business in this State, as
provided by the laws of the State of Oklahoma.
NOW THEREFORE, I, the undersigned, Secretary of State of the State of
Oklahoma, by virtue of the powers vested in me by law, do hereby issue this
Certificate of Authority authorizing said Corporation to transact business in this
State.
IN TESTIMONY WHEREOF, I hereunto set my hand and cause to be
affixed the Great Seal of the State of Oklahoma.
Filed in the city of Oklahoma City this
17th day of January, 2019.
Secretary of State
EXHIBIT 3
OKLAHOMA SECRETARY OF STATE 2020 ANNUAL CERTIFICATE
FACEBOOK, INC. CORPORATION KEY:
2312726928
STATE OF DOMICILE:
DELAWARE
STATUS:
OUSTED
ANNIVERSARY DATE:
January 17
PLEASE READ CAREFULLY. TO EXPEDITE THE FILING OF THIS FORM, DO NOT CHANGE ANY OF
THE PRE-PRINTED INFORMATION OR FIGURES. IF ANY INFORMATION CONTAINED IN THIS
REPORT HAS BEEN CHANGED, OR THE CORPORATION HAS CEASED DOING BUSINESS IN
OKLAHOMA, IT IS IMPORTANT THAT YOU NOTIFY THIS OFFICE IN WRITING TO REQUEST THE
APPROPRIATE FORM TO AMEND THE RECORDS OR TO WITHDRAW FROM THE STATE.
THIS FORM is to be used for filing the annual certificate, a report of the amount of capital invested in
Oklahoma by a foreign corporation pursuant to 18 O.S., § 1142.A.13. YOU MUST COMPLETE LINES
6 & 8 ON THE REVERSE SIDE OF THIS ORIGINAL FORM TO CORRECTLY DETERMINE THE
FILING FEE. ALL BLANKS MUST BE FILLED IN WITH EITHER AN AMOUNT OR THE WORD
"NONE".
ATTACH PAYMENT TO THIS CERTIFICATE. You may wish to retain a copy for your files. MAKE ALL
CHECKS PAYABLE AND DIRECT ALL CORRESPONDENCE TO:
OKLAHOMA SECRETARY OF STATE
421 NW 13TH STREET, SUITE 210
OKLAHOMA CITY, OK 73103
(405) 521-3912
WHO MUST FILE: Any corporation that has NOT paid a fee on its authorized capital. This form MUST
be filed each year with the Secretary of State on the anniversary of its qualification (as shown above). It
must be signed by the president, vice president, or managing officer of the corporation.
The PENALTY for failure to file an Annual Certificate required by law is OUSTER of the corporation
from doing business in the State of Oklahoma.
PLEASE NOTE that the Annual Certificate is SEPARATE FROM, AND IN ADDITION TO, the
Franchise Tax return which must be filed each year with the Oklahoma Tax Commission.
RECEIVED
MAY 15 2023
OKLAHOMA SECRETARY
OF STATE
FILED - Oklahoma Secretary of State #2312726928 06/15/2023
ALL BLANKS MUST BE FILLED IN WITH EITHER AN AMOUNT OR THE WORD "NONE"
1. Total authorized par value capital stock: 0
2. Number of authorized no par value shares: 8,141,000,000
3. No par value shares x $50.00 (par value assigned by law, 18 O.S. § 1142.A.10, for computing the filing fees only): 457,050,000,000
4. Sum of lines 1 and 3: 457,050,000,000
5. Total amount of capital on which the corporation has credit for paying: 300,000. This is an aggregate amount including the amount paid initially upon qualification and subsequently upon annual certificates. If two or more corporations have merged, the survivor is given credit for the amount of capital upon which the merging corporations have paid fees in Oklahoma, upon filing proof of the merger with the SOS.
6. Maximum amount of capital invested by said corporation in the State of Oklahoma: 610,956. This means the maximum amount of funds, credits, securities and property of whatever kind used or employed in the business carried on in the State of Oklahoma.
7. Difference between the total authorized capital and paid on credit (Line 4 - Line 5): 457,049,700,000
8. Total invested in excess of the amount heretofore paid on (Line 6 - Line 5): 310,956
FEE CALCULATION INSTRUCTIONS - complete only one.
9. If the amount entered on Line 8 is zero or negative, enter and attach the filing fee of $10.00 to this report.
10. If the amount entered on Line 8 is greater than zero, compute the fee at the rate of 1/10 of 1 per cent ($1.00 per $1,000.00) on this amount plus the $10.00 filing fee: 320.96
+$500 ouster penalty
T. Turner 4/25/2023
Facebook, Inc.
(EXACT NAME OF CORPORATION)
BY: Katherine Kelly, Secretary
(Please print name)
EXHIBIT 4
Help Center
Terms of Use
Welcome to Instagram!
These Terms of Use (or "Terms") govern your use of
Instagram, except where we expressly state that
separate terms (and not these) apply, and provide
information about the Instagram Service (the
“Service"), outlined below. When you create an
Instagram account or use Instagram, you agree to
these terms. The Meta Terms of Service do not apply
to this Service.
The Instagram Service is one of the Meta Products,
provided to you by Meta Platforms, Inc. These Terms
of Use therefore constitute an agreement between
you and Meta Platforms, Inc.
ARBITRATION NOTICE: YOU AGREE THAT DISPUTES
BETWEEN YOU AND US WILL BE RESOLVED BY
BINDING, INDIVIDUAL ARBITRATION AND YOU
WAIVE YOUR RIGHT TO PARTICIPATE IN A CLASS
ACTION LAWSUIT OR CLASS-WIDE ARBITRATION.
WE EXPLAIN SOME EXCEPTIONS AND HOW YOU
CAN OPT OUT OF ARBITRATION BELOW.
The Instagram Service
We agree to provide you with the Instagram Service.
The Service includes all of the Instagram products,
features, applications, services, technologies, and
software that we provide to advance Instagram's
mission: To bring you closer to the people and things
you love. The Service is made up of the following
aspects:
different types of accounts and features to
help you create, share, grow your presence,
and communicate with people on and off
Instagram. We also want to strengthen your
relationships through shared experiences that
you actually care about. So we build systems
that try to understand who and what you and
others care about, and use that information to
help you create, find, join and share in
experiences that matter to you. Part of that is
highlighting content, features, offers and
accounts that you might be interested in, and
offering ways for you to experience Instagram,
based on things that you and others do on
and off Instagram.
Fostering a positive, inclusive, and safe
environment.
We develop and use tools and offer resources
to our community members that help to make
their experiences positive and inclusive,
including when we think they might need
help. We also have teams and systems that
work to combat abuse and violations of our
Terms and policies, as well as harmful and
deceptive behavior. We use all the information
we have-including your information-to try to
keep our platform secure. We also may share
information about misuse or harmful content
with other Meta Companies or law
enforcement. Learn more in the Privacy Policy.
Developing and using technologies that help
us consistently serve our growing
community.
Organizing and analyzing information for our
growing community is central to our Service. A
big part of our Service is creating and using
cutting-edge technologies that help us
personalize, protect, and improve our Service
on an incredibly large scale for a broad global
community. Technologies like artificial
intelligence and machine learning give us the
power to apply complex processes across our
Service. Automated technologies also help us
ensure the functionality and integrity of our
Service.
Instagram is part of the Meta Companies,
which share technology, systems, insights, and
information-including the information we
have about you (learn more in the Privacy
Policy) in order to provide services that are
better, safer, and more secure. We also
provide ways to interact across the Meta
Company Products that you use, and designed
systems to achieve a seamless and consistent
experience across the Meta Company
Products.
* Ensuring access to our Service.
To operate our global Service, we must store
and transfer data across our systems around
the world, including outside of your country of
residence. The use of this global infrastructure
is necessary and essential to provide our
Service. This infrastructure may be owned or
operated by Meta Platforms, Inc., Meta
Platforms Ireland Limited, or their affiliates.
* Connecting you with brands, products, and
services in ways you care about.
We use data from Instagram and other Meta
Company Products, as well as from third-party
partners, to show you ads, offers, and other
sponsored content that we believe will be
meaningful to you. And we try to make that
content as relevant as all your other
experiences on Instagram.
* Research and innovation.
We use the information we have to study our
Service and collaborate with others on
research to make our Service better and
contribute to the well-being of our
community.
How Our Service Is Funded
Instead of paying to use Instagram, by using the
Service covered by these Terms, you acknowledge that
we can show you ads that businesses and
show you ads that are more relevant to you.
We show you relevant and useful ads without telling
advertisers who you are. We don’t sell your personal
data. We allow advertisers to tell us things like their
business goal and the kind of audience they want to
see their ads. We then show their ad to people who
might be interested.
We also provide advertisers with reports about the
performance of their ads to help them understand
how people are interacting with their content on and
off Instagram. For example, we provide general
demographic and interest information to advertisers
to help them better understand their audience. We
don't share information that directly identifies you
(information such as your name or email address that
by itself can be used to contact you or identifies who
you are) unless you give us specific permission. Learn
more about how Instagram ads work here.
You may see branded content on Instagram posted by
account holders who promote products or services
based on a commercial relationship with the business
partner mentioned in their content. You can learn
more about this here.
The Privacy Policy
Providing our Service requires collecting and using
your information. The Privacy Policy explains how we
collect, use, and share information across the Meta
Products. It also explains the many ways you can
control your information, including in the Instagram
Privacy and Security Settings. You must agree to the
Privacy Policy to use Instagram.
Your Commitments
In return for our commitment to provide the Service,
we require you to make the below commitments to us.
we need you to commit to a few restrictions in order
to be part of the Instagram community.
* You must be at least 13 years old.
* You must not be prohibited from receiving any
aspect of our Service under applicable laws or
engaging in payments related Services if you
are on an applicable denied party listing.
e We must not have previously disabled your
account for violation of law or any of our
policies.
* You must not be a convicted sex offender.
How You Can't Use Instagram. Providing a safe and
open Service for a broad community requires that we
all do our part.
* You can't impersonate others or provide
inaccurate information.
You don't have to disclose your identity on
Instagram, but you must provide us with
accurate and up to date information (including
registration information), which may include
providing personal data. Also, you may not
impersonate someone or something you
aren't, and you can't create an account for
someone else unless you have their express
permission.
* You can't do anything unlawful, misleading,
or fraudulent or for an illegal or
unauthorized purpose.
* You can't violate (or help or encourage
others to violate) these Terms or our policies,
including in particular the Instagram
Community Guidelines, Meta Platform Terms
and Developer Policies, and Music Guidelines.
If you post branded content, you must comply
with our Branded Content Policies, which
require you to use our branded content tool.
Learn how to report conduct or content in our
Help Center.
* You can't do anything to interfere with or
impair the intended operation of the Service.
This includes misusing any reporting, dispute,
or appeals channel, such as by making
fraudulent or groundless reports or appeals.
* You can't attempt to create accounts or
access or collect information in unauthorized
ways.
This includes creating accounts or collecting
information in an automated way without our
express permission.
* You can't sell, license, or purchase any
account or data obtained from us or our
Service.
This includes attempts to buy, sell, or transfer
any aspect of your account (including your
username); solicit, collect, or use login
credentials or badges of other users; or
request or collect Instagram usernames,
passwords, or misappropriate access tokens.
* You can't post someone else's private or
confidential information without permission
or do anything that violates someone else's
rights, including intellectual property rights
(e.g., copyright infringement, trademark
infringement, counterfeit, or pirated goods).
You may use someone else's works under
exceptions or limitations to copyright and
related rights under applicable law. You
represent you own or have obtained all
necessary rights to the content you post or
share. Learn more, including how to report
content that you think infringes your
intellectual property rights, here.
* You can't modify, translate, create derivative
works of, or reverse engineer our products or
their components.
* You can't use a domain name or URL in your
username without our prior written consent.
Permissions You Give to Us. As part of our agreement,
you also give us permissions that we need to provide
the Service.
* We do not claim ownership of your content,
but you grant us a license to use it.
Nothing is changing about your rights in your
content. We do not claim ownership of your
content that you post on or through the
Service and you are free to share your content
with anyone else, wherever you want.
However, we need certain legal permissions
from you (known as a "license") to provide the
Service. When you share, post, or upload
content that is covered by intellectual
property rights (like photos or videos) on or in
connection with our Service, you hereby grant
to us a non-exclusive, royalty-free,
transferable, sub-licensable, worldwide license
to host, use, distribute, modify, run, copy,
publicly perform or display, translate, and
create derivative works of your content
(consistent with your privacy and application
settings). This license will end when your
content is deleted from our systems. You can
delete content individually or all at once by
deleting your account. To learn more about
how we use information, and how to control
or delete your content, review the Privacy
Policy and visit the Instagram Help Center.
* Permission to use your username, profile
picture, and information about your
relationships and actions with accounts, ads,
and sponsored content.
You give us permission to show your
username, profile picture, and information
about your actions (such as likes) or
relationships (such as follows) next to or in
connection with accounts, ads, offers, and
other sponsored content that you follow or
engage with that are displayed on Meta
Products, without any compensation to you.
For example, we may show that you liked a
sponsored post created by a brand that has
paid us to display its ads on Instagram. As
with actions on other content and follows of
other accounts, actions on sponsored content
and follows of sponsored accounts can be
seen only by people who have permission to
see that content or follow. We will also respect
your ad settings. You can learn more here
about your ad settings.
* You agree that we can download and install
updates to the Service on your device.
Additional Rights We Retain
* If you select a username or similar identifier
for your account, we may change it if we
believe it is appropriate or necessary (for
example, if it infringes someone’s intellectual
property or impersonates another user).
* If you use content covered by intellectual
property rights that we have and make
available in our Service (for example, images,
designs, videos, or sounds we provide that
you add to content you create or share), we
retain all rights to our content (but not yours).
* You can only use our intellectual property and
trademarks or similar marks as expressly
permitted by our Brand Guidelines or with our
prior written permission.
* You must obtain written permission from us or
under an open source license to modify,
create derivative works of, decompile, or
otherwise attempt to extract source code from
us.
Content Removal and Disabling or
Terminating Your Account
* We can remove any content or information
you share on the Service if we believe that it
violates these Terms of Use, our policies
(including our Instagram Community
Guidelines), or we are permitted or required
to do so by law. We can refuse to provide or
stop providing all or part of the Service to you
(including terminating or disabling your access
to the Meta Products and Meta Company
Products) immediately to protect our
community or services, or if you create risk or
legal exposure for us, violate these Terms of
Use or our policies (including our Instagram
Community Guidelines), if you repeatedly
infringe other people's intellectual property
rights, or where we are permitted or required
to do so by law.
* We can also terminate or
change the Service, remove or block content
or information shared on our Service, or stop
providing all or part of the Service if we
determine that doing so is reasonably
necessary to avoid or mitigate adverse legal or
regulatory impacts on us. If you believe your
account has been terminated in error, or you
want to disable or permanently delete your
account, consult our Help Center. When you
request to delete content or your account, the
deletion process will automatically begin no
more than 30 days after your request. It may
take up to 90 days to delete content after the
deletion process begins. While the deletion
process for such content is being undertaken,
the content is no longer visible to other users,
but remains subject to these Terms of Use and
our Privacy Policy. After the content is
deleted, it may take us up to another 90 days
to remove it from backups and disaster
recovery systems.
Content will not be deleted within 90 days of
the account deletion or content deletion
process beginning in the following situations:
* where your content has been used by
others in accordance with this license
and they have not deleted it (in which
case this license will continue to apply
until that content is deleted); or
* where deletion within 90 days is not
possible due to technical limitations of
our systems, in which case, we will
complete the deletion as soon as
technically feasible; or
* where deletion would restrict our
ability to:
* investigate or identify illegal
activity or violations of our
terms and policies (for
example, to identify or
investigate misuse of our
products or systems);
* If we do not enforce any aspect of this
agreement, it will not be a waiver.
* We reserve all rights not expressly granted to
you.
Who Has Rights Under this Agreement.
* Our past, present, and future affiliates and
agents, including Instagram LLC, can invoke
our rights under this agreement in the event
they become involved in a dispute. Otherwise,
this agreement does not give rights to any
third parties.
* You cannot transfer your rights or obligations
under this agreement without our consent.
* Our rights and obligations can be assigned to
others. For example, this could occur if our
ownership changes (as in a merger,
acquisition, or sale of assets) or by law.
Who Is Responsible if Something Happens.
* Our Service is provided “as is," and we can't
guarantee it will be safe and secure or will
work perfectly all the time. TO THE EXTENT
PERMITTED BY LAW, WE ALSO DISCLAIM ALL
WARRANTIES, WHETHER EXPRESS OR
IMPLIED, INCLUDING THE IMPLIED
WARRANTIES OF MERCHANTABILITY, FITNESS
FOR A PARTICULAR PURPOSE, TITLE, AND
NON-INFRINGEMENT.
* We also don't control what people and others
do or say, and we aren't responsible for their
(or your) actions or conduct (whether online
or offline) or content (including unlawful or
objectionable content). We also aren't
responsible for services and features offered
by other people or companies, even if you
access them through our Service.
* Our responsibility for anything that happens
on the Service (also called "liability") is limited
as much as the law will allow. If there is an
issue with our Service, we can't know what all
the possible impacts might be. You agree that
we won't be responsible ("liable") for any lost
profits, revenues, information, or data, or
consequential, special, indirect, exemplary,
punitive, or incidental damages arising out of
or related to these Terms, even if we know
they are possible. This includes when we
delete your content, information, or account.
Our aggregate liability arising out of or
relating to these Terms will not exceed the
greater of $100 or the amount you have paid
us in the past twelve months.
* You agree to defend (at our request),
indemnify and hold us harmless from and
against any claims, liabilities, damages, losses,
and expenses, including without limitation,
reasonable attorney's fees and costs, arising
out of or in any way connected with these
Terms or your use of the Service. You will
cooperate as required by us in the defense of
any claim. We reserve the right to assume the
exclusive defense and control of any matter
subject to indemnification by you, and you will
not in any event settle any claim without our
prior written consent.
How We Will Handle Disputes.
* Except as provided below, you and we agree
that any cause of action, legal claim, or
dispute between you and us arising out of or
related to these Terms or Instagram
(“claim(s)") must be resolved by arbitration
on an individual basis. Class actions and class
arbitrations are not permitted: you and we
may bring a claim only on your own behalf
and cannot seek relief that would affect other
Instagram users. If there is a final judicial
determination that any particular claim (or a
request for particular relief) cannot be
arbitrated in accordance with this provision's
limitations, then only that claim (or only that
request for relief) may be brought in court. All
other claims (or requests for relief) remain
subject to this provision.
* Instead of using arbitration, you or we can
bring claims in your local “small claims" court,
if the rules of that court will allow it. If you
don't bring your claims in small claims court
(or if you or we appeal a small claims court
judgment to a court of general jurisdiction),
then the claims must be resolved by binding,
individual arbitration. The American
Arbitration Association will administer all
arbitrations under its Consumer Arbitration
Rules.
Rules. You and we expressly waive a trial by
jury.
* The following claims don't have to be
arbitrated and may be brought in court:
disputes related to intellectual property (like
copyrights and trademarks), violations of our
Platform Policy, or efforts to interfere with the
Service or engage with the Service in
unauthorized ways (for example, automated
ways). In addition, issues relating to the scope
and enforceability of the arbitration provision
are for a court to decide.
* This arbitration provision is governed by the
Federal Arbitration Act.
* You can opt out of this provision within 30
days of the date that you agreed to these
Terms. To opt out, you must send your name,
residence address, username, email address or
phone number you use for your Instagram
account, and a clear statement that you want
to opt out of this arbitration agreement, and
you must send them here: Meta Platforms, Inc.
ATTN: Instagram Arbitration Opt-out, 1601
Willow Rd., Menlo Park, CA 94025.
* Before you commence arbitration of a claim,
you must provide us with a written Notice of
Dispute that includes your name, residence
address, username, email address or phone
number you use for your Instagram account, a
detailed description of the dispute, and the
relief you seek. Any Notice of Dispute you
send to us should be mailed to Meta
Platforms, Inc., ATTN: Instagram Arbitration
Filing, 1601 Willow Rd. Menlo Park, CA 94025.
Before we commence arbitration, we will send
you a Notice of Dispute to the email address
you use with your Instagram account, or other
appropriate means. If we are unable to resolve
a dispute within thirty (30) days after the
Notice of Dispute is received, you or we may
commence arbitration.
* We will pay all arbitration filing fees,
administration and hearing costs, and
arbitrator fees for any arbitration we bring or
if your claims seek less than $75,000 and you
timely provided us with a Notice of Dispute.
For all other claims, the costs and fees of
arbitration shall be allocated in accordance
with the arbitration provider's rules, including
rules regarding frivolous or improper claims.
* For any claim that is not arbitrated or resolved
in small claims court, you agree that it will be
resolved exclusively in the U.S. District Court
for the Northern District of California or a
state court located in San Mateo County. You
also agree to submit to the personal
jurisdiction of either of these courts for the
purpose of litigating any such claim.
* The laws of the State of California, to the
extent not preempted by or inconsistent with
federal law, will govern these Terms and any
claim, without regard to conflict of law
provisions.
Unsolicited Material.
We always appreciate feedback or other suggestions,
but may use them without any restrictions or
obligation to compensate you for them, and are under
no obligation to keep them confidential.
Updating These Terms
We may change our Service and policies, and we may
need to make changes to these Terms so that they
accurately reflect our Service and policies. Unless
otherwise required by law, we will notify you (for
example, through our Service) before we make
changes to these Terms and give you an opportunity
to review them before they go into effect. Then, if you
continue to use the Service, you will be bound by the
updated Terms. If you do not want to agree to these
or any updated Terms, you can delete your account,
here.
Revised: 26 July 2022