Rafey S. Balabanian (SBN 315962)
rbalabanian@edelson.com
EDELSON PC
150 California Street, 18th Floor
San Francisco, California
Tel: 415.212.
Fax: 415.373.
Richard Fields (pro hac vice admission to be sought)
fields@fieldslawpllc.com
FIELDS PLLC
1701 Pennsylvania Avenue, NW, Suite
Washington, DC
Tel: 833.382.
Counsel for Plaintiff and the Proposed Class
SUPERIOR COURT OF THE STATE OF CALIFORNIA
FOR THE COUNTY OF SAN MATEO
Case No. 21-CIV-_________________

JANE DOE, individually and on behalf of all
others similarly situated,

                    Plaintiff,

        v.

META PLATFORMS, INC. (f/k/a Facebook,
Inc.), a Delaware corporation,

                    Defendant.

CLASS ACTION COMPLAINT FOR:

(1) STRICT PRODUCT LIABILITY
(2) NEGLIGENCE

JURY DEMAND
Plaintiff Jane Doe, on behalf of herself and on behalf of a Class defined below, brings
this Class Action Complaint and Demand for Jury Trial against Defendant Meta Platforms, Inc.
(f/k/a Facebook, Inc. and d/b/a “Facebook”)1 for compensatory damages, in excess of $150
billion, in addition to punitive damages in an amount to be determined at trial. Plaintiff, for her
Complaint, alleges as follows:
Defendant Meta Platforms, Inc. is referred to throughout this Complaint as “Meta” or
“Facebook.”
DETERMINING FOREIGN LAW - NOTICE
Plaintiff hereby gives notice that, to the extent Defendant Meta Platforms raises the
Communications Decency Act, 47 U.S.C. § 230, as a defense to the claims asserted below, and
to the extent that the Court were to find that the Communications Decency Act conflicts with
Burmese law, Burmese law applies. Burmese law does not immunize social media companies for
their role in inciting violence and contributing to genocide.
INTRODUCTION
1.
The Rohingya people, a Muslim minority historically living in present-day Burma
(internally renamed Myanmar following a military coup),2 number over 1 million and are the
largest stateless population in the world. While the Rohingya have long been the victims of
discrimination and persecution, the scope and violent nature of that persecution changed
dramatically in the last decade, turning from human rights abuses and sporadic violence into
terrorism and mass genocide.
2.
A key inflection point for that change was the introduction of Facebook into
Burma in 2011, which materially contributed to the development and widespread dissemination
of anti-Rohingya hate speech, misinformation, and incitement of violence—which together
amounted to a substantial cause, and perpetuation of, the eventual Rohingya genocide. A
stunning declaration by a former Facebook employee turned whistleblower states that
“Facebook executives were fully aware that posts ordering hits by the Myanmar government on
the minority Muslim Rohingya were spreading wildly on Facebook…”, and that “…the issue of
the Rohingya being targeted on Facebook was well known inside the company for years.” This
Throughout this Complaint, “Myanmar” will be used in reference to the ruling military
government, while “Burma” will be used to refer to the country itself. See U.S. Relations With
Burma, US STATE DEPARTMENT, https://www.state.gov/u-s-relations-with-burma/ (“The military
government changed the country’s name to ‘Myanmar’ in 1989. The United States government
continues to use the name ‘Burma.’”)
information, and the whistleblower’s knowledge of Facebook’s lack of response, led this person
to conclude: “I, working for Facebook, had been a party to genocide.”
3.
For years, the Myanmar military, with the support of civilian terrorists in
the majority Buddhist population, has treated the Rohingya as less than human, limiting their
rights, restricting their movements, and committing widespread human rights violations. While
various incidents of violence occurred periodically for years, nothing could prepare the
Rohingya, or the international community, for what was to come after Facebook entered the
picture in 2012.
4.
Following confrontations on the Rakhine State border, the Myanmar military, and
its civilian conspirators, now armed with Facebook to organize and spread terror, escalated their
brutal crackdown, carrying out violent acts of ethnic cleansing that defy comprehension.
5.
In the ensuing months and years, tens of thousands of Rohingya were brutally
murdered, gang raped, and tortured. Men, women, and children were burned alive inside their
homes and schools. Family members were tortured, raped, and killed in front of each other. More
than ten thousand lost their lives, while hundreds of thousands were brutalized, maimed, and
bore witness to indescribable violence and misery that they will carry with them for the rest of
their lives. Families were destroyed, childhoods were lost, lives were ruined, and entire
communities were erased from the face of the earth.
6.
As this wave of violence persisted with little end in sight, hundreds of thousands
of Rohingya fled their home country and sought refuge around the world. The vast majority of
those refugees ended up, and still live, in Bangladesh in what is now the largest refugee camp in
the world. Over ten thousand individuals, including Plaintiff, eventually arrived in the United
States and many are living here under refugee status.
Craig Timberg, New whistleblower claims Facebook allowed hate, illegal activity to go
unchecked, THE WASHINGTON POST (Oct. 22, 2021),
https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblower-complaint/.
7.
The Rohingya people who are left in Burma live under constant threat of arrest,
violence, abuse, and discrimination. Those who made it out, too, live in fear for themselves and
their loved ones. Many Rohingya refugees around the world live in abject poverty and in highly
unstable situations that could change at any time depending on the political climate of the
country in which they now reside. Even those in the Bangladesh refugee camp are not safe: In
September 2021, a well-known and outspoken Rohingya community leader was murdered in the
camp, with many believing that the murder was carried out by supporters of the Myanmar
military.
8.
Woven throughout the years of this horrific tragedy are two constants: (1) the
enduring resilience of the Rohingya people and (2) the willingness of Defendant Meta to
knowingly facilitate the spread of anti-Rohingya hate speech, misinformation, and the
widespread incitement of violence against the Rohingya people.
9.
So deep was Facebook’s penetration into daily life in Burma and its role in the
out-of-control spread of anti-Rohingya content, that Marzuki Darusman, chairman of the U.N.
Independent International Fact-Finding Mission on Myanmar, described Facebook as having
played a “determining role” in the genocide. And, worst of all, it allowed the dissemination of
hateful and dangerous misinformation to continue for years, long after it was repeatedly put on
notice of the horrific and deadly consequences of its inaction.
10.
Amazingly (at least to those not privy to Facebook’s inner workings), Facebook
has long been aware that hateful, outraged, and politically extreme content (especially content
attacking a perceived “out-group”) is oxygen to the company’s blood. The more horrendous the
content, the more it generates “engagement” (a measure of users’ interaction with content on the
system (“likes,” “shares,” comments, etc.)). As Facebook has determined through years of study
and analysis: hate and toxicity fuel its growth far more effectively than updates about a user’s
favorite type of latte.
11.
Rather than using what it has learned to change its practices, Facebook made a
corporate decision to lean into the hate. Its algorithms were carefully designed to actively exploit
this opportunity, prioritizing divisive and polarizing content, including hate speech and
misinformation about targeted groups, when delivering content to users and recommending that
users make new connections or join new groups.
12.
Facebook participates in and contributes to the development and creation of
divisive content, including hate speech and misinformation. By ensuring that more users see and
respond—in the form of “likes,” “shares,” and comments—to such toxic content, Facebook’s
algorithms train users to post more hate speech and misinformation in order to garner more
attention online.
13.
This “growth at all costs” view of Facebook’s business is not speculative, or, for
that matter, inconsistent with Facebook’s view of itself. Facebook’s Borg-like march toward
further growth was best captured by one of its highest-ranking executives, Andrew Bosworth, in
an internal memo circulated after a shooting death in Chicago was stunningly live streamed
on Facebook. It stated, in part:
We connect people.
That can be good if they make it positive. Maybe someone finds
love. Maybe it even saves the life of someone on the brink of
suicide.
So we connect more people.
That can be bad if they make it negative. Maybe it costs a life by
exposing someone to bullies. Maybe someone dies in a terrorist
attack coordinated on our tools.
And still we connect people.
The ugly truth is that we believe in connecting people so deeply
that anything that allows us to connect more people more often
is *de facto* good…
That’s why all the work we do in growth is justified. All the
questionable contact importing practices. All the subtle language
that helps people stay searchable by friends. All of the work we do
to bring more communication in. The work we will likely have to
do in China some day. All of it.
The natural state of the world is not connected. It is not unified. It
is fragmented by borders, languages, and increasingly by different
products. The best products don’t win. The ones everyone use
win.
In almost all of our work, we have to answer hard questions about
what we believe. We have to justify the metrics and make sure
they aren’t losing out on a bigger picture. But connecting people.
That’s our imperative. Because that’s what we do. We connect
people.
14.
In short, Facebook sees itself, at best, as an amoral actor on the world stage, with
the sole objective of growth, regardless of how it impacts its users or the world more generally.
To be clear, the last five years, and in fact just the last five months, have made it abundantly
clear that Facebook’s path to promote the very worst of humanity was not the result of a bug, but
rather a carefully designed feature.
15.
The manifestation of this can be seen in nearly everything Facebook does. For
example:
•
Before and after the 2020 election, it failed to stop mass
publication and reposting of misinformation about the legitimacy
of the election and the subsequent calls for violence that
culminated in the January 6th attack on our nation’s Capitol;
•
Facebook has known about human traffickers using its system for
years, but only after “[i]t got so bad that in 2019” and Apple
threatened to pull Facebook and Instagram’s access to the App
Store, did “Facebook employees rush[] to take down problematic
content and make emergency policy changes [to] avoid what they
described as a ‘potentially severe’ consequence for the business.”6;
•
In another ongoing example, throughout the COVID-19 global
pandemic, Facebook has been a constant vehicle for the mass
distribution of misinformation on COVID, masks, and vaccines;
and
Ryan Mac, Growth At Any Cost: Top Facebook Executive Defended Data Collection In
2016 Memo — And Warned That Facebook Could Get People Killed, BUZZFEED
https://www.buzzfeednews.com/article/ryanmac/growth-at-any-cost-top-facebook-executive-defended-data (emphasis added).
In most companies, the total disregard shown for the human toll of corporate action
would have been met with termination; here, however, Andrew Bosworth not only stayed
employed, but was in fact placed in charge of (and became a chief spokesman for) arguably the
company’s largest and most aggressive expansion ever: the “Metaverse.” See Kurt Wagner,
Who’s Building Facebook’s Metaverse? Meet CTO Andrew Bosworth, BLOOMBERG (Oct. 27,
2021), https://www.bloomberg.com/news/articles/2021-10-27/facebook-fb-new-cto-andrew-bosworth-is-the-man-building-the-metaverse.
Clare Duffy, Facebook has known it has a human trafficking problem for years. It still
hasn’t fully fixed it, CNN (Oct. 25, 2021), https://www.cnn.com/2021/10/25/tech/facebook-instagram-app-store-ban-human-trafficking/index.html.
•
Disturbingly, whistleblower Frances Haugen shed light on
Facebook’s knowledge that its websites, including both Facebook
and Instagram, led to mental health and body-image issues, and in
some cases, eating disorders and suicidal thoughts, in teens. Yet,
Facebook’s own internal research also showed that the more that
teenagers had these thoughts and emotions, the more they used the
app. So, it did nothing to protect the millions of children viewing
its content daily and maintained the status quo.
16.
The clear underlying message of the Bosworth memo above, as well as these
examples, is one of sacrifice: that the victims of a terrorist attack can be sacrificed for
Facebook’s growth; that an innocent child who takes her own life because she is bullied can be
sacrificed for Facebook’s growth; that democracy can be sacrificed for Facebook’s growth; that
the mental and physical health of children can be sacrificed for Facebook’s growth; that the
prevention of a global pandemic can be sacrificed for Facebook’s growth; and, as will be fully
described here, that an entire ethnic population can be sacrificed for Facebook’s relentless
growth.
17.
Because Facebook’s algorithms recommend that susceptible users join extremist
groups, where users are conditioned to post even more inflammatory and divisive content, it is
naturally open to exploitation by autocratic politicians and regimes. By using large numbers of
fake accounts (that Facebook not only fails to police but actually likes because they inflate the
user data Facebook presents to the financial markets), these regimes can repeatedly post, like,
share, and comment on content attacking ethnic minorities or political opponents. Because that
content appears to generate high engagement, Facebook’s algorithms prioritize it in the News
Feeds of real users.
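For illustration only, the sketch below (in Python) shows how coordinated fake engagement can distort a ranking signal that simply counts reactions without checking who produced them. The account data, field names, and the filtering step are hypothetical assumptions chosen for readability; they are not drawn from Facebook’s actual systems.

```python
def engagement_count(reactions: list[dict], filter_inauthentic: bool) -> int:
    """Count reactions on a post, optionally ignoring those from inauthentic accounts.
    (Hypothetical helper; "authentic" is an assumed field, not a real Facebook attribute.)"""
    if filter_inauthentic:
        reactions = [r for r in reactions if r["authentic"]]
    return len(reactions)

# A hypothetical post attacking a minority group, boosted by a network of fake accounts,
# compared with an ordinary post that earns only organic reactions.
attack_post = [{"authentic": False}] * 900 + [{"authentic": True}] * 100
ordinary_post = [{"authentic": True}] * 300

# Without authenticity filtering, the boosted post looks far more "engaging" (1000 vs. 300),
# so an engagement-driven ranking would place it higher in real users' feeds.
print(engagement_count(attack_post, filter_inauthentic=False))   # 1000
print(engagement_count(ordinary_post, filter_inauthentic=False)) # 300

# With filtering, the organic post would come out ahead (100 vs. 300).
print(engagement_count(attack_post, filter_inauthentic=True))    # 100
print(engagement_count(ordinary_post, filter_inauthentic=True))  # 300
```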
18.
As such, Facebook’s arrival in Burma provided exactly what the military and its
civilian terrorists were praying for. Beginning around 2011, Facebook arranged for tens of
millions of Burmese to gain access to the Internet for the first time, exclusively through
Facebook. This resulted in a “crisis of digital literacy,” leaving these new users blind to the
prevalence of false information online. Facebook did nothing, however, to warn its Burmese
users about the dangers of misinformation and fake accounts on its system or take any steps to
restrict its vicious spread.
19.
The brutal and repressive Myanmar military regime employed hundreds of
people, some posing as celebrities, to operate fake Facebook accounts and to generate hateful
and dehumanizing content about the Rohingya.
20.
Anti-Rohingya content thereafter proliferated throughout the Facebook product
for years. Human rights and civil society groups have collected thousands of examples of
Facebook posts likening the Rohingya to animals, calling for Rohingya to be killed, describing
the Rohingya as foreign invaders, and falsely accusing Rohingya of heinous crimes.
21.
It was clearly foreseeable, and indeed known to Facebook, that, by prioritizing
and rewarding users for posting dangerous and harmful content online—as well as by
recommending extremist groups and allowing fake accounts created by autocrats to flourish on
its system—Facebook would radicalize users in Burma, causing them to then support or engage
in dangerous or harmful conduct in the offline world.
22.
Despite having been repeatedly alerted between 2013 and 2017 to the vast
quantities of anti-Rohingya hate speech and misinformation on its system, and the violent
manifestation of that content against the Rohingya people, Facebook barely reacted and devoted
scant resources to addressing the issue.
23.
The resulting Facebook-fueled anti-Rohingya sentiment motivated and enabled
the military government of Myanmar to engage in a campaign of ethnic cleansing against the
Rohingya. To justify and strengthen its hold on power, the government cast, by and through
Facebook, the Rohingya as foreign invaders from which the military was protecting the Burmese
people. Widespread anger toward, and fear of, the Rohingya made it possible for the government
to enhance its own popularity by persecuting the Rohingya. Meanwhile, few Burmese civilians
objected to the attendant human rights abuses and eventual acts of genocide; indeed, as described
herein, many civilians actively participated in atrocities committed against the Rohingya.
24.
With the way cleared by Facebook, the military’s campaign of ethnic cleansing
culminated with “clearance operations” that began in August 2017. Security forces, accompanied
by civilian death squads armed with long swords, attacked dozens of Rohingya villages. More
than ten thousand Rohingya men, women, and children died by shooting, stabbing, burning, or
drowning. Thousands of others were tortured, maimed, and raped. Whole villages were burned to
the ground. More than 700,000 Rohingya eventually fled to squalid, overcrowded refugee camps
in Bangladesh.
25.
Not until 2018—after the damage had been done—did Facebook executives,
including CEO Mark Zuckerberg and COO Sheryl Sandberg, meekly admit that Facebook
should and could have done more to prevent what the United Nations has called “genocide” and
a “human rights catastrophe.” Facebook’s underwhelming response failed to capture even a
scintilla of the gravity of what it had done and the role it played, stating “we weren’t doing
enough to help prevent our platform from being used to foment division and incite offline
violence. We agree that we can and should do more.”
26.
The second part of its efforts to “do more” was to launch the virtual-reality-centric
“Metaverse” to further force itself into the lives of billions. As noted by prominent
political commentator Dan Pfeiffer,
Facebook is one of the least liked, least trusted companies on the
planet. They are in the middle of a massive scandal about their
involvement in genocide, human trafficking, and disinformation.
And their next move is to say: “What if you could live inside
Facebook?”
27.
Still, years after its initial tepid admission of negligence, former Facebook
employee and now prolific whistleblower, Frances Haugen, stated “[t]he company’s leadership
knows how to make Facebook and Instagram safer but won’t make the necessary changes
because they have put their astronomical profits before people.” 9 Notably, in litigation pending
Alex Warofka, An Independent Assessment of the Human Rights Impact of Facebook in
Myanmar, FACEBOOK NEWSROOM (Nov. 5, 2018), https://about.fb.com/news/2018/11/myanmar-hria/.
See @DanPfeiffer, TWITTER (Oct. 28, 2021, 3:24 PM)
https://twitter.com/danpfeiffer/status/1453819894487674899.
Abram Brown, Facebook ‘Puts Astronomical Profits Over People,’ Whistle-Blower Tells
Congress, FORBES (Oct. 5, 2021),
https://www.forbes.com/sites/abrambrown/2021/10/05/facebook-will-likely-resume-work-on-instagram-for-kids-whistleblower-t
before the International Court of Justice stemming from the Rohingya genocide, Facebook is at
this very moment taking aggressive measures to conceal evidence of its involvement.
28.
Perhaps the most damning example of Facebook’s continued failure in Burma is
the ongoing—to this day—misinformation campaign being carried out on Facebook within the
country. As reported by Reuters on November 2, 2021, the Myanmar military has
tasked thousands of soldiers with conducting what is widely referred to in the
military as “information combat” … The mission of the social media drive, part of
the military’s broader propaganda operations, is to spread the junta's view among
the population, as well as to monitor dissenters and attack them online as traitors,
… “Soldiers are asked to create several fake accounts and are given content
segments and talking points that they have to post” … In over 100 cases, the
messages or videos were duplicated across dozens of copycat accounts within
minutes, as well as on online groups, purported fan channels for Myanmar
celebrities and sports teams and purported news outlets … Posts often referred to
people who opposed the junta as “enemies of the state” and “terrorists”, and
variously said they wanted to destroy the army, the country and the Buddhist
religion.
29.
At the core of this Complaint is the realization that Facebook was willing to trade
the lives of the Rohingya people for better market penetration in a small country in Southeast
Asia. Successfully reaching the majority of Burmese people, and continuing to operate there
now, has a negligible impact on Facebook’s overall valuation and bottom line. Without the
In a glaring example of Facebook’s failure to learn from its deadly mistakes in Burma,
Haugen has provided documents demonstrating that history is currently repeating itself in
Ethiopia, where acts of ethnic violence are being carried out against the Tigrayan minority
amidst a raging civil war, again with the help of a Facebook-fueled misinformation and hate-speech campaign. See Facebook is under new scrutiny for its role in Ethiopia’s conflict, NPR
(Oct. 11, 2021), https://www.npr.org/2021/10/11/1045084676/facebook-is-under-new-scrutiny-for-its-role-in-ethiopias-conflict. See also Mark Scott, Facebook did little to moderate posts in
the world’s most violent countries, POLITICO (Oct. 25, 2021),
https://www.politico.com/news/2021/10/25/facebook-moderate-posts-violent-countries- (“In many of the world’s most dangerous conflict zones, Facebook has repeatedly failed to
protect its users, combat hate speech targeting minority groups and hire enough local staff to
quell religious sectarianism”).
Robert Burnson, Facebook’s Stance on Myanmar Genocide Records Assailed by Gambia,
BLOOMBERG (Oct. 28, 2021), https://www.bloomberg.com/news/articles/2021-10-28/facebook-s-stance-on-myanmar-genocide-records-assailed-by-gambia.
Fanny Potkin, Wa Lone, 'Information combat': Inside the fight for Myanmar’s soul,
REUTERS (Nov. 2, 2021), https://www.reuters.com/world/asia-pacific/information-combat-inside-fight-myanmars-soul-2021-11-01.
Burma market, Facebook would still be worth $1 trillion, Mark Zuckerberg would still be one of
the top ten richest people in the world, and its stock price would still be at astronomical levels.
30.
In the end, there was so little for Facebook to gain from its continued presence in
Burma, and the consequences for the Rohingya people could not have been more dire. Yet, in the
face of this knowledge, and possessing the tools to stop it, it simply kept marching forward.
That is because, once Facebook struck the Faustian Bargain that launched the company, it has
had blinders on to any real calculation of the benefits to itself compared to the negative impacts
it has on anyone else. Facebook is like a robot programmed with a singular mission: to grow. And
the undeniable reality is that Facebook’s growth, fueled by hate, division, and misinformation,
has left hundreds of thousands of devastated Rohingya lives in its wake.
PARTIES
31.
Plaintiff Jane Doe is a natural person and a Rohingya Muslim refugee. Plaintiff
resides in Illinois.
32.
Meta Platforms, Inc. is a corporation organized and existing under the laws of the
State of Delaware, with its principal place of business at 1 Hacker Way, Menlo Park, California
94025. Until October 2021, Defendant Meta was known as Facebook, Inc. Meta Platforms does
business in this County, the State of California, and across the United States.
JURISDICTION AND VENUE
33.
This Court has jurisdiction over this action pursuant to Article VI, Section 10, of
the California Constitution and Cal. Code Civ. Proc. § 410.10.
Angshuman Choudhury, How Facebook Is Complicit in Myanmar’s Attacks on
Minorities, THE DIPLOMAT (Aug. 25, 2020), https://www.thediplomat.com/2020/08/how-facebook-is-complicit-in-myanmars-attacks-on-minorities/ (“why would Facebook favor the
regime in Myanmar? For the same reason it would do so in India: to protect business interests in
a domestic market that it currently dominates by a wide margin. Imposing bans on government- or military-linked accounts could dilute this monopoly by drawing the ire of state regulators.”). In
2020, Facebook similarly bowed to the demands of the Communist Vietnamese government to
“censor posts with anti-state language rather than risk losing an estimated $1 billion in annual
revenue from the country.” Peter Wade, Facebook Bowed to Vietnam Government’s Censorship
Demands: Report, ROLLING STONE (Oct. 25, 2021),
https://www.rollingstone.com/politics/politics-news/facebook-vietnam-censorship-1247323/.
34.
This Court has personal jurisdiction over Defendant because its principal place of
business is located within this County. Plaintiff submits to the jurisdiction of the Court.
35.
Venue is proper in this Court under Cal. Code Civ. P. § 395(a) because Defendant
resides in this County.
FACTUAL BACKGROUND
I.
The Defective Design of Facebook’s Algorithms and Services
A.
Facebook Designed Its Social Network to Maximize Engagement
36.
Facebook’s goal is to maximize “engagement,” a metric reflecting the amount of
time a user spends and the amount of interaction (“likes,” “shares,” comments, etc.) that the user
has with any given content. For Facebook, engagement determines advertising revenue, which
determines profits. “The prime directive of engagement … is driven by monetization. It befits a
corporation aiming to accelerate growth, stimulate ad revenue, and generate profits for its
shareholders.”
37.
In its SEC Form 10-K for the year ended December 31, 2012, Facebook warned:
if our users decrease their level of engagement with Facebook, our
revenue, financial results, and business may be significantly
harmed. The size of our user base and our users’ level of
engagement are critical to our success…. [O]ur business
performance will become increasingly dependent on our ability to
increase levels of user engagement and monetization…. Any
decrease in user retention, growth, or engagement could render
Facebook less attractive to developers and marketers, which may
have a material and adverse impact on our revenue, business,
financial condition, and results of operations. … Our advertising
revenue could be adversely affected by a number of … factors,
including: decreases in user engagement, including time spent on
Facebook[.]
38.
Accordingly, Facebook intentionally incorporated engagement-based ranking of
content into its system and the algorithms that drive it. Facebook’s News Feed—the first thing
Luke Munn, Angry by design: toxic communication and technical architectures,
HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-020-00550-7.
U.S. Securities and Exchange Commission Form 10-K, Facebook, Inc. (fiscal year ended
Dec. 31, 2012) (“Facebook 2012 10-K”) at 13, 14,
https://www.sec.gov/Archives/edgar/data/1326801/000132680113000003/fb12312012x10k.htm#s5D6A63A4BB6B6A7AD01CD7A5A25638E4.
that users see when opening up the app or entering the site and “the center of the Facebook
experience”—is driven by engagement. Posts with higher engagement scores are included and
prioritized in the News Feed, while posts with lower scores are buried or excluded altogether.
“[T]he Feed’s … logics can be understood through a design decision to elevate and amplify
‘engaging’ content…. [T]he core logic of engagement remains baked into the design of the Feed
at a deep level.”
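To make the engagement-based ranking described above concrete, the following is a minimal, hypothetical sketch in Python. The weights, field names, and cutoff are illustrative assumptions chosen for readability; they do not describe Facebook’s actual code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    """Hypothetical engagement score. Comments and reshares are weighted more heavily
    than likes because they generate further interaction and distribution."""
    return 1.0 * post.likes + 5.0 * post.comments + 10.0 * post.shares

def rank_news_feed(candidates: list[Post], limit: int = 20) -> list[Post]:
    """Order candidate posts purely by engagement score, highest first; posts below
    the cutoff are effectively buried. Nothing in this function examines what a post
    says, only how strongly users have reacted to it."""
    return sorted(candidates, key=engagement_score, reverse=True)[:limit]
```

Because a function of this kind is indifferent to content, a post that provokes outrage and reshares will outrank a benign post automatically, which is the dynamic alleged throughout this Complaint.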
39.
Facebook engineers and data scientists meet regularly to assess the billions of
likes, comments and clicks Facebook users make every day to “divine ways to make us like,
comment and click more,” so that users will keep coming back and seeing more ads from the
company’s 2 million advertisers. Engineers are continually running experiments with a small
share of Facebook users to boost engagement.16 Thus, Facebook’s design was the “result of
particular decisions made over time…. Every area has undergone meticulous scrutiny … by
teams of developers and designers…. [Facebook] has evolved through conscious decisions in
response to a particular set of priorities.”
40.
Facebook has consistently promoted and rewarded employees who contribute to
the company’s growth through a relentless focus on increased engagement of Facebook users;
employees who raise ethical and safety concerns tend to be ignored and marginalized and,
eventually, leave the company.
Luke Munn, Angry by design: toxic communication and technical architectures,
HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-020-00550-7.
Victor Luckerson, Here’s How Facebook’s News Feed Actually Works, TIME (July 9,
2015), https://time.com/collection-post/3950525/facebook-news-feed-algorithm/.
Luke Munn, Angry by design: toxic communication and technical architectures,
HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-020-00550-7.
Katie Canales, ‘Increasingly gaslit’: See the messages concerned Facebook employees
wrote as they left the company, BUSINESS INSIDER (Oct. 28, 2021),
https://www.businessinsider.com/facebook-papers-employee-departure-badge-post-gaslit-burned-out-2021-10 (“[t]he employee said Facebook's infamous growth-first approach leads to
rolling out ‘risky features.’ If employees propose reversing that risk, they’re seen as being
‘growth-negative, and veto’d by decision makers on those grounds,’ they said. They also said it’s
difficult to establish ‘win/wins,’ or to roll out features that promote both safety and growth”).
B.
Facebook Prioritizes Hate Speech and Misinformation to Increase
User Engagement
41.
Facebook knows that the most negative emotions—fear, anger, hate—are the
most engaging. Facebook employs psychologists and social scientists as “user researchers” to
analyze its users’ behavior in response to online content. An internal Facebook presentation by
one such researcher, leaked in May 2020, warned: “Our algorithms exploit the human brain’s
attraction to divisiveness…. If left unchecked, … [Facebook would feed users] more and more
divisive content in an effort to gain user attention & increase time on the platform.”
42.
To maximize engagement, Facebook does not merely fill users’ News Feeds with
disproportionate amounts of hate speech and misinformation; it employs a system of social
rewards that manipulates and trains users to create such content. When users post content, other
users who are shown that content are prompted to “like,” “comment” on, or “share” it. Under
each piece of content, users can see how many times others have liked or shared that content and
can read the comments. See Figure 1.
(Figure 1.)
43.
A study published in February 2021 confirmed that, “[i]n online social media
platforms, feedback on one’s behavior often comes in the form of a ‘like’—a signal of approval
Jeff Horwitz, Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the
Site Less Divisive, WALL STREET JOURNAL (May 26, 2020),
https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499.
from another user regarding one’s post” and tested the assumption that likes “function as a social
reward.”
44.
Roger McNamee, an early investor in Facebook and advisor to Mark Zuckerberg,
wrote in his New York Times bestseller, “Zucked: Waking Up to the Facebook Catastrophe”:
Getting a user outraged, anxious, or afraid is a powerful way to
increase engagement. Anxious and fearful users check the site
more frequently. Outraged users share more content to let other
people know what they should also be outraged about. Best of all
from Facebook’s perspective, outraged or fearful users in an
emotionally hijacked state become more reactive to further
emotionally charged content. It is easy to imagine how
inflammatory content would accelerate the heart rate and trigger
dopamine hits.
45.
A Nature article published in 2020 further explained:
[I]ncendiary, polarizing posts consistently achieve high
engagement…. This content is meant to draw engagement, to
provide a reaction….
This divisive material often has a strong moral charge. It takes a
controversial topic and establishes two sharply opposed camps,
championing one group while condemning the other. These are the
headlines and imagery that leap out at a user as they scroll past,
forcing them to come to a halt. This offensive material hits a nerve,
inducing a feeling of disgust or outrage. “Emotional reactions like
outrage are strong indicators of engagement…. [T]his kind of
divisive content will be shown first, because it captures more
attention than other types of content.” …
The design of Facebook means that … forwarding and
redistribution is only a few clicks away…. Moreover, the
networked nature of social media amplifies this single response,
distributing it to hundreds of friends and acquaintances. They too
receive this incendiary content and they too share, inducing …
“outrage cascades — viral explosions of moral judgment and
disgust.” Outrage does not just remain constrained to a single user,
Björn Lindström, Martin Bellander, David T. Schultner, Allen Chang, Philippe N. Tobler,
David M. Amodio, A computational reward learning account of social media engagement,
NATURE COMMUNICATIONS 12, Art. No. 1311 (Feb. 26, 2021),
https://www.nature.com/articles/s41467-020-19607-x.
Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 88 (Penguin ed.).
but proliferates, spilling out to provoke other users and appear in
other online environments.
46.
Facebook knew that it could increase engagement and the length of time users
spend on its websites (and subsequently increase its revenue) by adjusting its algorithms to
manipulate users’ News Feeds and showing them more negative content thus causing “massive-
scale emotional contagion.” In 2014, Adam Kramer, a member of Facebook’s “Core Data
Science Team,” co-authored an article describing one of the experiments that Facebook
conducted on its own users, stating,
we test whether emotional contagion occurs outside of in-person
interaction between individuals by reducing the amount of
emotional content in the News Feed … Which content is shown or
omitted in the News Feed is determined via a ranking algorithm
that Facebook continually develops and tests in the interest of
showing viewers the content they will find most relevant and
engaging. One such test is reported in this study: A test of whether
posts with emotional content are more engaging.
***
The results show emotional contagion…. [F]or people who had
positive content reduced in their News Feed, a larger percentage of
words in people’s status updates were negative and a smaller
percentage were positive ...
These results indicate that emotions expressed by others on
Facebook influence our own emotions, constituting experimental
evidence for massive-scale contagion via social networks.
47.
Independent research unequivocally confirms that fake content thrives on
Facebook over more reliable and trustworthy sources. In September 2021, the Washington Post
reported on a “forthcoming peer-reviewed study by researchers at New York University and the
Université Grenoble Alpes in France [which] found that from August 2020 to January 2021,
news publishers known for putting out misinformation got six times the amount of likes, shares,
Luke Munn, Angry by design: toxic communication and technical architectures,
HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-020-00550-7.
Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, Experimental evidence of
massive-scale emotional contagion through social networks, 111 PROCEEDINGS OF THE
NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES, no. 29 (June 17, 2014),
https://www.pnas.org/cgi/doi/10.1073/pnas.1320040111.
and interactions on the [Facebook] platform as did trustworthy news sources, such as CNN or the
World Health Organization.”
48.
In testimony before Congress in September 2020, Tim Kendall, Facebook’s first
Director of Monetization—likening Facebook’s business model to that of Big Tobacco—
explained how such content makes Facebook addictive:
At Facebook, I believe we sought to mine as much human attention
as possible and turn it into historically unprecedented profits. To
do this, we didn’t simply create something useful and fun; we took
a page from Big Tobacco’s playbook, working to make our
offering addictive at the outset….
The next page in Big Tobacco’s playbook was to add
bronchodilators to cigarettes. This allowed the smoke to get in
contact with more surface area of the lungs. Allowing for
misinformation, conspiracy theories, and fake news to flourish
were Facebook’s bronchodilators.
But that incendiary content wasn’t enough. Tobacco companies
then added ammonia to cigarettes to increase the speed with which
nicotine traveled to the brain. Facebook’s ability to deliver this
incendiary content to the right person, at the right time, in the exact
right way—through their algorithms—that is their ammonia. And
we now know it fosters tribalism and division.
Social media preys on the most primal parts of your brain; it
provokes, it shocks, and it enrages….
Facebook and their cohorts worship at the altar of engagement and
cast other concerns aside, raising the voices of division, anger,
hate, and misinformation to drown out the voices of truth, justice,
morality, and peace.
49.
Content attacking opposing groups is particularly engaging. Zeynep Tufekci, a
sociologist at University of North Carolina, has written that:
the new, algorithmic gatekeepers aren’t merely (as they like to
believe) neutral conduits for both truth and falsehood. They make
their money by keeping people on their sites and apps; that aligns
their incentives closely with those who stoke outrage, spread
Elizabeth Dwoskin, Misinformation on Facebook got six times more clicks than factual
news during the 2020 election, study says, WASHINGTON POST (Sept. 4, 2021),
https://www.washingtonpost.com/technology/2021/09/03/facebook-misinformation-nyu-study/.
Mainstreaming Extremism: Social Media’s Role in Radicalizing America: Hearing
Before the House Subcommittee on Consumer Protection and Commerce, 116th Congress (Sept.
24, 2020) (statement of Timothy Kendall).
misinformation, and appeal to people’s existing biases and
preferences.
[T]he problem is that when we encounter opposing views in the
age and context of social media, it’s not like reading them in a
newspaper while sitting alone. It’s like hearing them from the
opposing team while sitting with our fellow fans in a football
stadium. Online, we’re connected with our communities, and we
seek approval from our like-minded peers. We bond with our team
by yelling at the fans of the other one. In sociology terms, we
strengthen our feeling of “in-group” belonging by increasing our
distance from and tension with the “out-group”—us versus
them…. This is why the various projects for fact-checking claims
in the news, while valuable, don’t convince people. Belonging is
stronger than facts.
50.
A study published in June 2021 showed that posts attacking “others” (the “out-
group”) are particularly effective at generating social rewards, such as likes, shares, and
comments, and that those reactions consist largely of expressions of anger:
We investigated whether out-group animosity was particularly
successful at generating engagement on two of the largest social
media platforms: Facebook and Twitter. Analyzing posts from
news media accounts and US congressional members (n =
2,730,215), we found that posts about the political out-group were
shared or retweeted about twice as often as posts about the in-group.… Out-group language consistently emerged as the strongest
predictor of shares and retweets…. Language about the out-group
was a very strong predictor of “angry” reactions (the most popular
reactions across all datasets)…. In sum, out-group language is the
strongest predictor of social media engagement across all relevant
predictors measured, suggesting that social media may be creating
perverse incentives for content expressing out-group animosity.
51.
Another study, published in August 2021, analyzed how “quantifiable social
feedback (in the form of ‘likes’ and ‘shares’)” affected the amount of “moral outrage” expressed
in subsequent posts. The authors “found that daily outrage expression was significantly and
positively associated with the amount of social feedback received for the previous day’s outrage
expression.” The amount of social feedback is, in turn, determined by the algorithms underlying
the social media product:
Zeynep Tufekci, How social media took us from Tahrir Square to Donald Trump, MIT
TECHNOLOGY REVIEW (Aug. 14, 2018), https://technologyreview.com/2018/08/14/240325/how-social-media-took-us-from-tahrir-square-to-donald-trump.
Steve Rathje, Jay J. Van Bavel, Sander van der Linden, Out-group animosity drives
engagement on social media, 118 PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES (26),
(June 29, 2021), https://doi.org/10.1073/pnas.2024292118.
Social media newsfeed algorithms can directly affect how much
social feedback a given post receives by determining how many
other users are exposed to that post. Because we show here that
social feedback affects users’ outrage expressions over time, this
suggests that newsfeed algorithms can influence users’ moral
behaviors by exploiting their natural tendencies for reinforcement
learning…. [D]esign choices aimed at … profit maximization via
user engagement can indirectly affect moral behavior because
outrage-provoking content draws high engagement….
52.
In other words, if a user makes two posts—one containing hateful, outraged, and
divisive content and one lacking such content—Facebook’s algorithms will show the hateful,
outraged, and divisive post to more users. Consequently, the hateful, outraged, and divisive post
is rewarded with more likes, shares, and comments. The user quickly learns that to obtain a
reaction to his or her posts, he or she should incorporate as much hateful, outraged, and divisive
content as possible.
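The feedback loop described in the preceding paragraph can be expressed as a toy simulation. Everything in the sketch below (the reach multipliers, reaction rate, and learning increments) is a hypothetical assumption used only to illustrate the alleged dynamic; it is not a model of any actual Facebook system.

```python
import random

def simulate_social_reward_loop(rounds: int = 100, audience: int = 1000) -> float:
    """Toy model of the alleged loop: divisive posts are shown to more users, collect
    more reactions, and those social rewards nudge the author toward posting divisively
    again. Returns the author's final propensity (0.0 to 1.0) to post divisive content."""
    propensity = 0.5  # the author starts with no particular leaning
    for _ in range(rounds):
        posts_divisive = random.random() < propensity
        # Assumed amplification: divisive posts reach a larger share of the audience.
        reach = audience * (0.60 if posts_divisive else 0.15)
        reactions = reach * 0.05  # assumed per-viewer reaction rate
        if posts_divisive:
            # Social reward: each batch of likes/shares reinforces the divisive style.
            propensity = min(1.0, propensity + reactions / 1000)
        else:
            # A quieter reception gradually discourages the neutral style.
            propensity = max(0.0, propensity - 0.01)
    return propensity

# Under these assumptions, the author's propensity to post divisive content drifts upward
# over repeated posting cycles.
print(round(simulate_social_reward_loop(), 2))
```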
53.
On October 5, 2021, Frances Haugen, a former Facebook product manager,
testified before Congress:
The dangers of engagement based ranking are that Facebook
knows that content that elicits an extreme reaction from you is
more likely to get a click, a comment or reshare. And it’s
interesting because those clicks and comments and reshares aren’t
even necessarily for your benefit, it’s because they know that other
people will produce more content if they get the likes and
comments and reshares. They prioritize content in your feed so that
you will give little hits of dopamine to your friends, so they will
create more content. And they have run experiments on people,
producer side experiments, where they have confirmed this.
54.
Recently leaked documents confirm Facebook’s ability to determine the type of
content users post through its algorithms. After Facebook modified its algorithms in 2018 to
boost engagement, “[t]he most divisive content that publishers produced was going viral on the
William J. Brady, Killian McLoughlin, Tuan N. Doan, Molly J. Crockett, How social
learning amplifies moral outrage expression in online social networks, 7 SCIENCE ADVANCES,
no. 33 (Aug. 13, 2021), https://www.science.org/doi/10.1126/sciadv.abe5641. Posts were
classified as containing moral outrage or not using machine learning.
Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use:
Full Senate Hearing Transcript, REV (Oct. 5, 2021),
https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-on-children-social-media-use-full-se
platform … creating an incentive to produce more of it…. Company researchers discovered that
publishers and political parties were reorienting their posts toward outrage and sensationalism.
That tactic produced high levels of comments and reactions that translated into success on
Facebook.” Facebook researchers further discovered that “the new algorithm’s heavy weighting
of reshared material in its News Feed made the angry voices louder. ‘Misinformation, toxicity,
and violent content are inordinately prevalent among reshares,’ researchers noted in internal
memos.” Facebook data scientists suggested “a number of potential changes to curb the tendency
of the overhauled algorithm to reward outrage and lies” but “Mr. Zuckerberg resisted some of the
proposed fixes, the documents show, because he was worried they might hurt the company’s
other objective—making users engage more with Facebook.”
55.
In October 2021, NBC News described, based on internal documents leaked by
Frances Haugen, an experiment in which an account created by Facebook researchers
experienced “a barrage of extreme, conspiratorial, and graphic content”—even though the
fictitious user had never expressed interest in such content. For years, Facebook “researchers had
been running [similar] experiments … to gauge the platform’s hand in radicalizing users,
according to the documents seen by NBC News,” and among Haugen’s disclosures are
“research, reports and internal posts that suggest Facebook has long known its algorithms and
recommendation systems push some users to extremes.”
56.
It is not surprising that the true nature of Facebook’s algorithms has become fully
apparent only through leaked documents and whistleblower testimony, since Facebook goes to
great lengths to hinder outside academic research regarding the design of those algorithms. In a
congressional hearing entitled “The Disinformation Black Box: Researching Social Media Data”
Keach Hagey, Jeff Horwitz, Facebook Tried to Make Its Platform a Healthier Place. It
Got Angrier Instead, WALL STREET JOURNAL (Sept. 15, 2021),
https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215.
Brandy Zadrozny, “Carol’s Journey”: What Facebook knew about how it radicalizes
users, NBC NEWS (Oct. 22, 2021), https://www.nbcnews.com/tech/tech-news/facebook-knew-radicalized-users-rcna3581.
on September 28, 2021, three social media researchers testified about Facebook’s attempts to
block their access to the data they needed:
•
Laura Edelson of New York University testified: “this summer,
Facebook cut off my team’s access to their data. We used that very
data to support the finding in our recent study that posts from
misinformation sources on Facebook got six times more
engagement than factual news during the 2020 elections, to
identify multiple security and privacy vulnerabilities that we have
reported to Facebook, and to audit Facebook’s own, public-facing
Ad Library for political ads.”
•
Alan Mislove, a Professor of Computer Sciences at Northeastern
University, testified: “Facebook recently criticized a study on
misinformation by saying it focused on who engages with content
and not who sees it—but that’s only true because Facebook does
not make such impression data available to researchers.”
•
Kevin T. Leicht, a Professor of Sociology at University of Illinois
Urbana-Champaign testified: “there are limited amounts of social
media data available due to company restrictions placed on that
data. Many researchers fear litigation that may result from
analyzing and publishing results from these data.”
57.
On October 5, 2021, Haugen testified before Congress:
[N]o one truly understands the destructive choices made by
Facebook except Facebook….
A company with such frightening influence over so many people,
over their deepest thoughts, feelings, and behavior, needs real
oversight. But Facebook’s closed design means it has no real
oversight. Only Facebook knows how it personalizes your Feed for
you.
Hearing on The Disinformation Black Box: Researching Social Media Data before the
Subcomm. on Oversight, 117th Cong. (2021) (testimony of Laura Edelson, NYU Cybersecurity
for Democracy), https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117-SY21-Wstate-EdelsonL-20210928.pdf.
Hearing on The Disinformation Black Box: Researching Social Media Data before the
Subcomm. on Oversight, 117th Cong. (2021) (testimony of Alan Mislove, Professor of Computer
Sciences at Northeastern University),
https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117-SY21-Wstate-MisloveA-20210928.pdf.
Hearing on The Disinformation Black Box: Researching Social Media Data before the
Subcomm. on Oversight, 117th Cong. (2021) (testimony of Kevin T. Leicht, a Professor of
Sociology at University of Illinois Urbana-Champaign),
https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117-SY21-Wstate-LeichtK-20210928.pdf.
At other large tech companies like Google, any independent
researcher can download from the Internet the company’s search
results and write papers about what they find. And they do. But
Facebook hides behind walls that keeps researchers and regulators
from understanding the true dynamics of their system….
58.
Nevertheless, it is now clear that, by modifying the design of its algorithms and
system, Facebook can influence and manipulate the quantity, substance, and emotional tone of
the content its users produce. Through its dopamine-based incentive structure of social rewards
and cues, as well as its algorithmic promotion of hate speech and misinformation, Facebook
contributes to and participates in the development and creation of outraged, extreme, and
divisive content.
59.
It’s obviously not in Facebook’s favor—especially its bottom line—to curb the
spread of negative content and adjust its algorithm to promote positive content. One designer and
technologist proposed four different interventions to address the “problems of polarization,
dehumanization, and outrage, three of the most dangerous byproducts” of tools such as
Facebook. The four interventions described in the article include “Give Humanizing Prompts,”
“Picking out unhealthy content with better metrics,” “Filter unhealthy content by default,” and
“Give users feed control.” Facebook had not implemented any such interventions, undoubtedly
because, as the author admitted, the interventions “will all likely result in short-term reductions
in engagement and ad revenue.”
60.
Facebook has options for moderating its algorithms’ tendency to promote hate
speech and misinformation, but it rejects those options because the production of more engaging
content takes precedence. In a September 2021 article, based on recently leaked internal
documents, the Wall Street Journal described how Facebook had modified its News Feed
Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use:
Full Senate Hearing Transcript, REV (Oct. 5, 2021),
https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-on-children-social-media-use-full-se
Tobias Rose-Stockwell, Facebook’s problems can be solved with design, QUARTZ (Apr.
30, 2018) (emphases in original), https://qz.com/1264547/facebooks-problems-can-be-solved-with-design/.
algorithm “to reverse [a] decline in comments, and other forms of engagement, and to encourage
more original posting” by users.
61.
Simply put, it is clear—based largely on admissions from former Facebook
executives—that Facebook’s algorithms are not “neutral.” The algorithms do not merely
recommend content based on users’ previously expressed interests; rather, to maximize
engagement, they are heavily biased toward promoting content that will enrage, polarize, and
radicalize users. Facebook does not simply “connect” people with similar interests; it exploits the
universal human instinct for tribalism by actively herding people into groups that define
themselves through their violent opposition to “other” people—often identified by race, religion,
or political ideology.
C.
Facebook Curates and Promotes Extremist Group Content
62.
Facebook’s algorithms curate and promote content that attracts new members to
extremist groups. A presentation by a researcher employed at Facebook, which was leaked in
2020, showed that Facebook’s algorithms were responsible for the growth of German extremist
groups on the website: “The 2016 presentation states that ‘64% of all extremist group joins are
due to our recommendation tools’ and that most of the activity came from the platform’s ‘Groups
You Should Join’ and ‘Discover’ algorithms. ‘Our recommendation systems grow the problem.’”
Ultimately, however, because “combating polarization might come at the cost of lower
engagement … Mr. Zuckerberg and other senior executives largely shelved the basic research …
and weakened or blocked efforts to apply its conclusions to Facebook products.”
63.
Roger McNamee gave this example:
[I]f I am active in a Facebook Group associated with a conspiracy
theory and then stop using the platform for a time, Facebook will
do something surprising when I return. It may suggest other
Keach Hagey, Jeff Horwitz, Facebook Tried to Make Its Platform a Healthier Place. It
Got Angrier Instead, THE WALL STREET JOURNAL (Sept. 15, 2021),
https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215.
Jeff Horwitz, Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the
Site Less Divisive, WALL STREET JOURNAL (May 26, 2020),
https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixed-solutions-11590507499.
conspiracy theory Groups to join…. And because conspiracy
theory Groups are highly engaging, they are very likely to
encourage reengagement with the platform. If you join the Group,
the choice appears to be yours, but the reality is that Facebook
planted the seed. It does so not because conspiracy theories are
good for you but because conspiracy theories are good for them.
McNamee described how, in 2016, he had raised his concerns with Mark Zuckerberg and Sheryl
Sandberg, to no avail.
64.
In the August 2021 study discussed above, the authors stated: “[U]sers conform to
the expressive norms of their social network, expressing more outrage when they are embedded
in ideologically extreme networks where outrage expressions are more widespread…. Such norm
learning processes, combined with social reinforcement learning, might encourage more
moderate users to become less moderate over time, as they are repeatedly reinforced by their
peers for expressing outrage.”
65.
Indeed, the positive feedback loop created by Facebook in the form of “likes,”
“comments,” and “shares” drives user engagement with extremist content and rewards user
participation in creating such content. Together with algorithms promoting hate speech,
misinformation, and conspiracy theories, Facebook has steered users to extremist groups and
trained those users to express more outrage.
D.
Exploitation by Autocrats
66.
Facebook’s system and algorithms are also susceptible to exploitation by
unscrupulous and autocratic politicians and regimes. In his book, McNamee wrote:
Facebook’s culture, design goals, and business priorities made the
platform an easy target for bad actors, which Facebook aggravated
with algorithms and moderation policies that amplified extreme
voices. The architecture and business model that make Facebook
successful also make it dangerous. Economics drive the company
Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 94-95 (Penguin
2020 ed.).
Id. at 4-7.
William J. Brady, Killian McLoughlin, Tuan N. Doan, Molly J. Crockett, How social
learning amplifies moral outrage expression in online social networks, 7 SCIENCE ADVANCES,
no. 33 (Aug. 13, 2021), https://www.science.org/doi/10.1126/sciadv.abe5641.
to align—often unconsciously—with extremists and authoritarians
to the detriment of democracy around the world.
67.
Facebook had the ability to detect and deactivate counterfeit accounts used by
authoritarian politicians and regimes to generate “fake engagement” but devoted minimal
resources to that task. In April 2021, Sophie Zhang, a data scientist whom Facebook had fired a
year earlier, spoke out about having “found multiple blatant attempts by foreign national
governments to abuse our platform on vast scales to mislead their own citizenry….” For
example, “[o]ver one six-week period from June to July 2018, [the president of Honduras]’s
Facebook posts received likes from 59,100 users, more than 78% of which were not real people.”
Such “fake engagement can influence how that content performs in the all-important news feed
algorithm; it is a kind of counterfeit currency in Facebook’s attention marketplace.”
68.
It took Facebook almost a year to remove fake accounts associated with
“domestic-focused coordinated inauthentic activity in Honduras” and, when Zhang “found that
the Honduras network was reconstituting … there was little appetite from [Facebook] to take it
down again.” Before she was fired, Zhang alerted Facebook to networks of fake Pages
supporting political leaders in Albania, Azerbaijan, Mexico, Argentina, Italy, the Philippines,
Afghanistan, South Korea, Bolivia, Ecuador, Iraq, Tunisia, Turkey, Taiwan, Paraguay, El
Salvador, India, the Dominican Republic, Indonesia, Ukraine, Poland, and Mongolia. Some of
these networks were investigated while others “languish[ed] for months without action.”
69.
Zhang gave one example that was especially reminiscent of the situation in
Burma:
Of all the cases of inauthentic behavior that Zhang uncovered, the
one that most concerned her—and that took the longest to take
down—was in Azerbaijan. It was one of the largest she had seen,
Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 232-33 (Penguin
2020 ed.).
Julia Carrie Wong, How Facebook let fake engagement distort global politics: a
whistleblower’s account, THE GUARDIAN (Apr. 12, 2021),
https://theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblowersophie-zhang.
Id.
and it was clearly being used to prop up an authoritarian regime
with an egregious record on human rights.
The Azerbaijani network used the same tactic that was seen in
Honduras—thousands of Facebook Pages set up to look like user
accounts—but instead of creating fake likes, the Pages were used
to harass. Over one 90-day period in 2019, it produced
approximately 2.1m negative, harassing comments on the
Facebook Pages of opposition leaders and independent media
outlets, accusing them of being traitors and praising the country’s
autocratic leader, President Ilham Aliyev, and his ruling party, the
YAP.
Facebook did not employ a dedicated policy staffer or market
specialist for Azerbaijan, and neither its eastern European nor
Middle Eastern policy teams took responsibility for it. Eventually
Zhang discovered that the Turkey policy team was supposed to
cover the former Soviet republic, but none of them spoke Azeri or
had expertise in the country. As of August 2020, Facebook did not
have any full-time or contract operations employees who were
known to speak Azeri, leaving staff to use Google Translate to try
to understand the nature of the abuse.
Facebook did not take down those fake accounts or Pages until more than a year after Zhang
reported them.
E.
Facebook’s Algorithm Has Successfully Radicalized Its Users
70.
By prioritizing hate speech and misinformation in users’ News Feeds, training
users to produce ever more extreme and outraged content, recommending extremist groups, and
allowing its product to be exploited by autocrats, Facebook radicalizes users and incites them to
violence.
71.
As Chamath Palihapitiya, Facebook’s former vice president for user growth, told
an audience at Stanford Business School: “I think we have created tools that are ripping apart the
social fabric of how society works … [t]he short-term, dopamine-driven feedback loops we’ve
created are destroying how society works … No civil discourse, no cooperation[,]
misinformation, mistruth. And it’s not an American problem…”
Id.
James Vincent, Former Facebook exec says social media is ripping apart society, THE
VERGE (Dec. 11, 2017), https://www.theverge.com/2017/12/11/16761016/former-facebookexec-ripping-apart-society.
72.
McNamee likewise explained how the design of Facebook’s algorithms and
system leads to real-world violence: “The design of Facebook trained users to unlock their
emotions, to react without critical thought…. at Facebook’s scale it enables emotional contagion,
where emotions overwhelm reason…. Left unchecked, hate speech leads to violence,
disinformation undermines democracy.”
73.
As Dipayan Ghosh, a former Facebook privacy expert, noted, “[w]e have set
ethical red lines in society, but when you have a machine that prioritizes engagement, it will
always be incentivized to cross those lines.”
74.
Facebook’s tendency to cause real-world violence by radicalizing users online has
been demonstrated time and time again. A few recent examples include:
•
In March 2019, a gunman killed 51 people at two mosques in
Christchurch, New Zealand, while live-streaming the event on
Facebook.49 For two years prior to the shooting, the gunman had
been active on the Facebook group of the Lads Society, an
Australian extremist white nationalist group.
•
In August 2020, “[h]ours before a 17-year-old white man allegedly
killed two people and injured a third at protests over a police
shooting in Kenosha, Wisconsin, a local militia group posted a call
on Facebook: ‘Any patriots willing to take up arms and defend our
city tonight from evil thugs?’”51 Later, Mark Zuckerberg said that
“the social media giant made a mistake by not removing a page
Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 98, 233 (Penguin
2020 ed.).
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 185 (HarperCollins 2021).
Charlotte Grahan-McLay, Austin Ramzy, and Daniel Victor, Christchurch Mosque
Shootings Were Partly Streamed on Facebook, NEW YORK TIMES (Mar. 14, 2019),
https://www.nytimes.com/2019/03/14/world/asia/christchurch-shooting-new-zealand.html.
Royal Commission of Inquiry into the Terrorist Attack on Christchurch Mosques on March 2019 § 4.6, https://christchurchattack.royalcommission.nz/the-report/firearmslicensing/general-life-in-new-zealand/; Michael McGowan, Australian white nationalists reveal
plans to recruit ‘disgruntled, white male population’, THE GUARDIAN (Nov. 11, 2019),
https://www.theguardian.com/australia-news/2019/nov/12/australian-white-nationalists-revealplans-to-recruit-disgruntled-
Adam Mahoney, Lois Beckett, Julia Carrie Wong, Victoria Bekiempis, Armed white men
patrolling Kenosha protests organized on Facebook, THE GUARDIAN (Aug. 26, 2020),
https://www.theguardian.com/us-news/2020/aug/26/kenosha-militia-protest-shooting-facebook.
and event that urged people in Kenosha … to carry weapons amid
protests.”
•
“In the days leading up to [the January 6, 2021] march on the
Capitol, supporters of President Trump promoted it extensively on
Facebook and Facebook-owned Instagram and used the services to
organize bus trips to Washington. More than 100,000 users posted
hashtags affiliated with the movement prompted by baseless claims
of election fraud, including #StopTheSteal and
#FightForTrump.”
75.
Prior to Facebook’s entry into Burma, as described below, Facebook was on
notice of the manner in which its service could influence political conflict and be used to fuel
real-world violence. For example, during a 2010 conflict in Kyrgyzstan, highly divisive and at
times violent content spread widely on Facebook, including substantial misinformation related
to the source and cause of the ongoing violence. 54 Likewise, even in examples where Facebook has
been credited with supporting protests for positive political change before 2012, the consistent
result is that the same governments and militant groups that were opposed by the protests
eventually utilized Facebook to help put down those uprisings through widespread
misinformation campaigns.
76.
Facebook was on notice very early on in its existence “that liberty isn't the
only end toward which these tools can be turned.”56 And it is with that knowledge in hand that it
launched in the extremely volatile environment present in Burma.
Madeleine Carlisle, Mark Zuckerberg Says Facebook’s Decision to Not Take Down
Kenosha Militia Page Was a Mistake, TIME (Aug. 29, 2020), https://time.com/5884804/markzuckerberg-facebook-kenosha-shooting-jacob-blake/.
Elizabeth Dwoskin, Facebook’s Sandberg deflected blame for Capitol riot, but new
evidence shows how platform played role, WASHINGTON POST (Jan. 13, 2021),
https://www.washingtonpost.com/technology/2021/01/13/facebook-role-in-capitol-protest/.
Neil Melvin and Tolkun Umaraliev, New Social Media and Conflict in Kyrgyzstan, SIPRI
(Aug. 2011), https://www.sipri.org/sites/default/files/files/insight/SIPRIInsight1101.pdf.
Nariman El-Mofty, Social Media Made the Arab Spring, But Couldn't Save It, WIRED
(Jan. 26, 2016), https://www.wired.com/2016/01/social-media-made-the-arab-spring-but-couldntsave-it/ (“These governments have also become adept at using those same channels to spread
misinformation. ‘You can now create a narrative saying a democracy activist was a traitor and a
pedophile,’ … ‘The possibility of creating an alternative narrative is one people didn’t consider,
and it turns out people in authoritarian regimes are quite good at it.’”)
Id.
II.
The Introduction of Facebook Led to a Crisis of Digital Literacy in Burma
77.
In addition to high engagement, continued user growth was critical to Facebook’s
success. “If we fail to retain existing users or add new users, … our revenue, financial results,
and business may be significantly harmed.” 57 By 2012, Facebook reported 1.06 billion monthly
active users (“MAUs”) with 84% of those accessing Facebook from outside the United States,
meaning that there were already about 170 million MAUs in the United States—equal to more
than half the U.S. population. 58 To ensure continued growth, Facebook would have to gain users
in developing countries, many of whom had no previous access to the Internet.
78.
Prior to 2011, in an atmosphere of extreme censorship, only about 1% of the
Burmese population had cell phones. That percentage grew dramatically with the liberalization
that began in 2011.59 In 2013, when two foreign telecom companies were permitted to enter the
market, the cost of a SIM card fell from more than $200 to as little as $2, and by 2016, nearly
half the population had mobile phone subscriptions, most with Internet access.
79.
Facebook took active steps to ensure that it would have a dominant position in the
emerging Burmese market. “Entering the country in 2010, Facebook initially allowed its app to
be used without incurring data charges, so it gained rapid popularity. It would come pre-loaded
on phones bought at mobile shops….”
80.
Facebook would eventually pursue a similar strategy for penetrating other
developing markets, as reflected in its “Free Basics” product. Free Basics was “a Facebook-
Facebook 2012 10-K, at 13,
https://www.sec.gov/Archives/edgar/data/1326801/000132680113000003/fb12312012x10k.htm#s5D6A63A4BB6B6A7AD01CD7A5A25638E4.
Id. at 8.
Report of the detailed findings of the Independent International Fact-Finding Mission on
Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 17, 2018),
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (“UNHRC
Report”) ¶ 1343.
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
Saira Asher, Myanmar coup: How Facebook became the ‘digital tea shop,’ BBC NEWS
(Feb. 4, 2021), https://www.bbc.com/news/world-asia-55929654.
developed mobile app that gives users access to a small selection of data-light websites and
services … [t]o deliver the service, … Facebook partners with local mobile operators … [who]
agree to ‘zero-rate’ the data consumed by the app, making it free, while Facebook does the
technical heavy lifting to ensure that they can do this as cheaply as possible. Each version is
localized, offering a slightly different set of up to 150 sites and services…. There are no other
social networking sites apart from Facebook and no email provider.”
81.
One reason why Facebook gained immense traction in Burma is that “[t]he
website … handles Myanmar fonts well compared to other social media like Twitter.” 63 After
citizens bought an inexpensive phone and a cheap SIM card, “there was one app that everybody
in [Burma] wanted: Facebook. The reason? Google and some of the other big online portals
didn’t support Burmese text, but Facebook did.”
82.
For the majority of Burma’s 20 million Internet-connected citizens, “Facebook is
the internet…. [M]ost mobile phones sold in the country come preloaded with Facebook….
There are equal numbers of internet users and Facebook users in [Burma]. As a result, many
people use Facebook as their main source of information….”
83.
A report commissioned by Facebook in 2018 described how the rapid transition of
Burma from a society without modern communications infrastructure to an Internet-connected
society caused “a crisis of digital literacy: A large population of internet users lacks basic
understanding of how to … make judgments on online content.… Digital literacy is generally
low across the country, and many people find it difficult to verify or differentiate content (for
Olivia Solon, ‘It’s digital colonialism’: how Facebook’s free internet service has failed
its users, THE GUARDIAN (July 27, 2017),
https://www.theguardian.com/technology/2017/jul/27/facebook-free-basics-developing-markets.
Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA
(Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmaramplifying-hate-speech.
Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING
(Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.
Human Rights Impact Assessment: Facebook in Myanmar, BSR (Oct. 2018) (“BSR
Report”) at 12-13, https://about.fb.com/wp-content/uploads/2018/11/bsr-facebook-myanmarhria_final.pdf.
example, real news from misinformation).” 66 As noted by Sarah Su, a Facebook employee who
works on content safety issues on the News Feed, “[w]hat you’ve seen in the past five years is
almost an entire country getting online at the same time, we realized that digital literacy is quite
low. They don’t have the antibodies to [fight] viral misinformation.”
84.
The U.N. Independent International Fact-Finding Mission on Myanmar (the
“U.N. Mission”) investigating the genocide in Burma reported: “[t]he Myanmar context is
distinctive … because of the relatively new exposure of the Myanmar population to the Internet
and social media…. In a context of low digital and social media literacy, the Government’s use
of Facebook for official announcements and sharing of information further contributes to users’
perception of Facebook as a reliable source of information.”
85.
Thet Swei Win, the director of an organization that works to promote social
harmony between ethnic groups in Burma, told the BBC “[w]e have no internet literacy…[w]e
have no proper education on how to use the internet, how to filter the news, how to use the
internet effectively.”
86.
As described by the U.N., “[t]he relative unfamiliarity of the population with the
Internet and with digital platforms and the easier and cheaper access to Facebook have led to a
situation in [Burma] where Facebook is the Internet…. For many people, Facebook is the main,
if not only, platform for online news and for using the Internet more broadly.” 70 “Facebook is
arguably the only source of information online for the majority in [Burma]”71 and “Facebook is a
Id.
Steven Levy, Facebook: the Inside Story (Blue Rider Press 2020).
UNHRC Report, ¶¶ 1342, 1345,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING
(Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.
UNHRC Report, ¶ 1345,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar
during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018),
https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-inmyanmar-during-rohingya-crisis.
particularly influential medium in Myanmar. More than 14 million people out [of] a total
population of 53 million utilize Facebook in Myanmar, and according to a 2016 survey of
internet users in Myanmar, ‘reading news on the internet’ often meant ‘news they had seen on
their Facebook newsfeed, and [they] did not seem aware of other news sources online.’”
87.
The New York Times has reported that “[t]he military exploited Facebook’s wide
reach in Myanmar, where it is so broadly used that many of the country’s 18 million internet
users confuse the Silicon Valley social media platform with the internet,”73 and that “[a]s
Facebook’s presence in Myanmar grew …, the company did not address what the BSR report
calls ‘a crisis of digital literacy’ in a country that was just emerging from a military dictatorship
and where the internet was still new.”
88.
In the end, the U.N. put it best: “Facebook has been a useful instrument for those
seeking to spread hate, in a context where, for most users, Facebook is the Internet.”
III.
Facebook Amplified the Myanmar Military’s Use of Fear and Hatred of the
Rohingya to Justify its Hold on Power
89.
By 2011, the country’s history of political repression and ethnic violence was
widely known. “Myanmar’s political history has been heavily dominated by an all-powerful
military, known as the Myanmar ‘Tatmadaw,’ which has ruled the country for most of its
Fortify Rights, They Gave Them Long Swords: Preparations for Genocide and Crimes
Against Humanity Against Rohingya Muslims in Rakhine State, Myanmar at 95, n.403 (July
2018), https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf
(“Fortify Rights Report”) (citing GSMA, Mobile Phones, Internet, and Gender in Myanmar at (Feb. 2016), https://www.gsma.com/mobilefordevelopment/wpcontent/uploads/2016/02/Mobile-phones-internet-and-gender-in-Myanmar.pdf.)
Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military,
NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmarfacebook-genocide.html.
Alexandra Stevenson, Facebook Admits It Was Used to Incite Violence in Myanmar,
NEW YORK TIMES (Nov. 6, 2018), https://www.nytimes.com/2018/11/06/technology/myanmarfacebook.html.
Report of the detailed findings of the Independent International Fact-Finding Mission on
Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 17, 2018), https://documents-ddsny.un.org/doc/UNDOC/GEN/G18/274/54/PDF/G1827454.pdf?OpenElement (“UNHRC
Report”) ¶ 74.
existence.”76 In 1962, the military took power in a coup led by General Ne Win. 77 In 1989, after
widespread protests against the regime had broken out the year before, the military placed Aung
San Suu Kyi, leader of the National League for Democracy opposition party (NLD) and winner
of the Nobel peace prize, under house arrest; after the NLD won a general election in 1990, the
military government refused to recognize the result or to allow the legislature to assemble.
“During the military dictatorship [from 1962 to 2011], Myanmar was considered one of the most
repressive countries in Asia.”
90.
Despite a brief period of liberalization that began in 2011, the military continued
to dominate Burma’s government. The 2008 Constitution was designed by “the military to retain
its dominant role in politics and government … 25 percent of the seats in each house of
parliament and in the state and regional assemblies belong to unelected members of the military,
who are appointed by the Tatmadaw.”80 In addition to being guaranteed at least one vice
presidential position, “the Tatmadaw selects candidates for (and effectively controls) three key
ministerial posts: Defence, Border Affairs and Home Affairs. This is sufficient to control the
National Defence and Security Council and the entire security apparatus.”
91.
The military has consistently used an imagined threat from the Rohingya to justify
its hold on power. “[T]he ‘Rohingya crisis’ in Rakhine State … has been used by the military to
reaffirm itself as the protector of a nation under threat….” 82 In support of the 1962 coup,
“General Ne Win argued that a military take-over was necessary to protect the territorial integrity
of the country” due to “insurgencies from ‘ethnic armed organizations.’” 83 The Tatmadaw has
UNHRC Report, ¶ 71,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf. Myanmar gained
its independence from Great Britain in 1948.
Id.
Id. ¶ 74.
Id. ¶ 94.
used the alleged “ethnic threat to national sovereignty and territorial integrity as the excuse for
its control of the country….”84 The main concern to those in power was to “maintain power and
to attain and preserve ‘national unity in the face of ethnic diversity.’ Human rights were
‘subordinate to these imperatives.’ … Reports of serious human rights violations were
pervasive….”
92.
The government found that it could increase its own popularity by, first, instilling
fear and hatred of the Rohingya among the Buddhist majority in Burma and then publicly
oppressing, marginalizing, and persecuting the Rohingya. The U.N. found that “the Rohingya
have gradually been denied birth registration, citizenship and membership of the political
community. This lack of legal status and identity is the cornerstone of the oppressive system
targeting the Rohingya…. It is State-sanctioned and in violation of Myanmar’s obligations under
international law because it discriminates on the basis of race, ethnicity and religion.”86 The four
Special Rapporteurs on the human rights situation in Burma appointed by the United Nations
from 1992 to 2011 concluded, inter alia:
[S]ince late 1989, the Rohingya citizens of Myanmar … have been
subjected to persecution based on their religious beliefs involving
extrajudicial executions, torture, arbitrary detention, forced
disappearances, intimidation, gang-rape, forced labour, robbery,
setting of fire to homes, eviction, land confiscation and population
resettlement as well as the systematic destruction of towns and
mosques.
[S]ome of these human rights violations may entail categories of
crimes against humanity or war crimes.
Yet, “the Tatmadaw enjoys considerable popularity among the Bamar-Buddhist majority.”
93.
Facebook, by its very design, turned out to be the perfect tool for the Burmese
military and Buddhist extremists to use in promoting their message of religious intolerance and,
ultimately, ethnic cleansing. The amplification and propagation of hateful, extremist, and
polarizing messages and the radicalization of users are inevitable results of the algorithms that
Facebook intentionally and meticulously built into its system.
A.
Facebook Participated in Inciting Violence Against the Rohingya
(2012-2017)
94.
On June 8, 2012, there were violent confrontations between Rohingya and ethnic
Rakhine groups; security forces killed a number of the Rohingya, and Muslim homes and shops
were set on fire and looted.91 In the ensuing weeks and months, the Rohingya suffered more
killings at the hands of Tatmadaw soldiers, burnings and lootings, sexual and gender-based
violence, arbitrary arrests, and torture in prison. 92 The U.N. Mission drew a direct connection
between the Burmese government’s use of Facebook and the violence against the Rohingya that
began in June 2012:
On 1 June 2012 … the spokesperson of the President of
Myanmar … posted a statement on his personal Facebook account.
He warned about the arrival from abroad of “Rohingya
terrorists” … and stated that the Myanmar troops would
“completely destroy them….” Although this post was later deleted,
the impact of a high official equating the Rohingya population with
terrorism may have been significant ahead of the 2012 violence,
which erupted a week later.
[P]osts early in 2012 about the alleged rape and murder by
Rohingya men of a Buddhist woman were reportedly shared
widely and are considered to have contributed to the tension and
violence in Rakhine State in that year.
95.
Incitement of violence on Facebook continued beyond 2012: “[A]n online news
report from 30 June 2014 … alleged that two Muslim teashop owners had raped a Buddhist
woman…. [A prominent Buddhist monk] reposted the article on his Facebook page…. Violence
erupted the following day [resulting in two deaths]. The rape allegations were false, with the
‘victim’ reportedly admitting that she had fabricated the rape allegations.”
96.
The U.N. Report continued, “[t]here is no doubt that hate speech against Muslims
in general, and Rohingya in particular, is extremely widespread in Myanmar…. Given
Facebook’s dominance in Myanmar, the Mission paid specific attention to a number of Facebook
accounts that appear to be particularly influential….” 96 For example:
•
[T]he late U Ko Ni, a well-known Muslim and legal advisor of the
NLD, was frequently targeted on Facebook…. In one post from
March 2016, a photo of U Ko Ni next to president Htin Kyaw was
captioned ‘this [dog] getting his foot in the door in Myanmar
politics is not something we should sit by and watch….’ The
Mission has seen multiple other posts with a similar message and
threats towards U Ko Ni dating from between March and October
2016. On 29 January 2017, U Ko Ni was assassinated….
•
[I]n January 2017, a self-described pro-Myanmar patriot with more
than 17,000 followers on Facebook posted a graphic video of
police violence against civilians in another country. He captioned
the post as follows: “Watch this video. The kicks and beatings are
very brutal…. [The] disgusting race of [Muslim] terrorists who
sneaked into our country … need to be beaten like that….” One
comment under the post reads: “It is very satisfying to watch
this…. It’s sad that Myanmar security forces are not as skillful in
their beating.” In July 2018, the post had over 23,000 views, reactions and 517 shares.
•
[O]ne account holder, supposedly a monk, posted a poem with
graphic photos allegedly showing Buddhist Mros killed by the
“Bengali” on 3 August 2017, along with photos of damage to a
pagoda allegedly done by “Bengali”.99 (The Myanmar authorities
refer to the Rohingya as “Bengalis” to suggest that, rather than
being native to Myanmar, they are illegal immigrants from
Bangladesh.100)
•
[O]n 11 February 2018, … Shwewiki.com, a self-proclaimed
“Media/News Company in Yangon” with over 1.3 million
followers on Facebook, posted a link to an article titled “The lies
of the [Rohingya liars] are exposed[.]”
97.
In a Pulitzer Prize-winning report, Reuters found numerous “posts, comments,
images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook as
of [August 2018].” For example:
•
In December 2013, one user posted: “We must fight them the way
Hitler did the Jews, damn kalars [a pejorative for the Rohingya].”
•
In September 2017, another wrote: “These non-human kalar dogs,
the Bengalis, are killing and destroying our land, our water and our
ethnic people…. We need to destroy their race.”
•
In April 2018, another user posted, with a picture of a boatload of
Rohingya refugees, “Pour fuel and set fire so that they can meet
Allah faster.”
98.
Facebook was used to instigate communal unrest “in early September 2017, …
through the parallel distribution of similar but conflicting chain messages on Facebook
Messenger to Muslim and Buddhist communities. Each chain message stated that the other group
was preparing for major violence on 11 September and encouraged the recipient to get ready to
resist…. [T]he messages … caused widespread fear and at least three violent incidents.”103 One
of the most dangerous campaigns came in 2017, when “the military’s intelligence arm spread
rumors on Facebook to both Muslim and Buddhist groups that an attack from the other side was
imminent….”
99.
Steve Stecklow, author of the Reuters report, observed that several of the posts
that he and his team catalogued “described Rohingyas as dogs or pigs. ‘This is a way of
Id. ¶ 1312.
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
UNHRC Report, ¶ 1348,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military,
NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmarfacebook-genocide.html.
dehumanising a group,’ Stecklow says. ‘Then when things like genocide happen, potentially
there may not be a public uproar or outcry as people don’t even view these people as people.’”
100.
According to Voices that Poison, a U.S.-based human rights group, “speech that
describes victims as vermin, pests, insects or animals is a rhetorical hallmark of incitement to
violence, even genocide, because it dehumanises the victim.”106 Fortify Rights, a human rights
group, similarly noted: “Burmese individuals and groups have disseminated vitriolic Facebook
posts dehumanizing and calling for widespread attacks against the Rohingya. For example, the
widely-followed monk Ashin Wirathu, head of the ultranationalist group formerly known as Ma
Ba Tha, posted a reference to the Rohingya in 2014, saying ‘You can be full of kindness and
love, but you cannot sleep next to a mad dog. If we are weak, our land will become Muslim.’”
101.
The New York Times reported that the Myanmar military had posted anti-
Rohingya propaganda on Facebook using fake accounts:
They posed as fans of pop stars and national heroes as they flooded
Facebook with their hatred. One said Islam was a global threat to
Buddhism. Another shared a false story about the rape of Buddhist
woman by a Muslim man.
The Facebook posts were not from everyday internet users.
Instead, they were from Myanmar military personnel who turned
the social network into a tool for ethnic cleansing, according to
former military officials, researchers and civilian officials in the
country.
***
The Myanmar military’s Facebook operation began several years
ago, said people familiar with how it worked. The military threw
major resources at the task, the people said, with as many as people on it.
They began by setting up what appears to be news pages and pages
on Facebook that were devoted to Burmese pop stars, models and
Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING
(Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.
Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA
(Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmaramplifying-hate-speech.
Fortify Rights Report, at 95,
https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.
other celebrities, like a beauty queen with a penchant for parroting
military propaganda….
Those then became distribution channels for lurid photos, false
news and inflammatory posts, often aimed at Myanmar’s Muslims,
the people said. Troll accounts run by the military helped spread
the content, shout down critics and fuel arguments between
commenters to rile people up. Often, they posted sham photos of
corpses that they said were evidence of Rohingya-perpetrated
massacres, said one of the people.
Digital fingerprints showed that one major source of the Facebook
content came from areas outside Naypyidaw, where the military
keeps compounds, some of the people said.
102.
By October 2015, the Allard K. Lowenstein International Human Rights Clinic at
Yale Law School had already concluded that there was “strong evidence that genocide is being
committed against Rohingya.”109 The worst, however, was yet to come.
B.
The August 2017 “Clearance Operations” and Their Aftermath: A
“Human Rights Catastrophe”
103.
The Myanmar military’s campaign of ethnic cleansing culminated in August 2017
with the “Clearance Operations.” The U.N. reported that “[d]uring the course of the operation
more than 40 percent of all villages in northern Rakhine State were partially or totally
destroyed…. As a result, over 725,000 Rohingya had fled to Bangladesh by September 2018.”
The August 2017 clearance operations “caused the disintegration of a community and resulted in
a human rights catastrophe, the effects of which will span generations.” 111 Additional “clearance
Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military,
NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmarfacebook-genocide.html.
Persecution of the Rohingya Muslims: Is Genocide Occurring in Myanmar’s Rakhine
State, ALLARD K. LOWENSTEIN INTERNATIONAL HUMAN RIGHTS CLINIC, YALE LAW SCHOOL
(Oct. 2015) at 1, https://law.yale.edu/sites/default/files/documents/pdf/Clinics/fortifyrights.pdf.
UNHRC Report, ¶ 751,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf. The U.N. Mission
documented the clearance operations exhaustively: “The Mission obtained a wealth of
information on these events, including over 600 interviews with victims and eyewitnesses,
satellite imagery, documents, photographs and videos. It examined many incidents in detail. It
found consistent patterns of the most serious human rights violations and abuses.” Id. ¶ 754.
Id. ¶ 749.
operations” followed in numerous Rohingya villages across northern Rakhine State, with at least
54 verified locations.
104.
The U.N. Mission described the clearance operations in six Rohingya villages in
detail.113 The following description of the operation in one of those villages is typical:
•
“[H]undreds of Tatmadaw soldiers … surrounded [the village].
They were accompanied by a smaller number of ethnic Rakhine
from neighbouring villages. The security forces then opened fire,
shooting at villagers, including those that were fleeing. Soldiers
also dragged people from houses and shot some of them at point
blank range. Others were killed by having their throats slit with
large knives.”
•
“During the course of the operation, structures in [the village] were
burned and destroyed…. Satellite imagery analysis … shows the
extent of the destruction…. The entire Rohingya village … was
destroyed, while the nearby non-Rohingya village … remains
intact.”
•
“Women and girls were also subjected to rape, gang rape, sexual
mutilation and sexual humiliation during the ‘clearance
operations.’”
•
These “‘clearance operations’ were led by the Tatmadaw….
Individuals from the neighbouring ethnic Rakhine village were
recognised as participants and some ethnic Rakhine men assisted
the military….”
105.
The UNHRC Report and the Fortify Rights Report contain numerous first-person
accounts of atrocities committed against the Rohingya by both Myanmar security forces and by
civilians during the August 2017 “Clearance Operations.” For example:
•
“The soldiers killed the male members of my family. They shot at
them first and then slit their throats. The courtyard was full of
blood. They killed my husband, my father-in-law and my two
nephews of 15 and eight years old. They even killed the child in
the same way.”
•
“I found my six-month old son’s body lying next to my wife’s
body. She had been shot. My baby son was stabbed in his stomach
and his intestine and liver were coming out. When I took his small
body into my lap, I was showered with his blood.”
•
“My husband was shot and then he had his throat cut. I was raped.
It is so difficult to say what happened. They tore off my clothes,
then six soldiers raped me, and after that two ethnic Rakhine men,
whom I recognised, raped me. They pressed my breasts and face
continuously. My face almost turned blue. I knew the ethnic
Rakhine who lived nearby.”
•
“I hid in the toilet outhouse, some distance from our house. I saw
that our house was surrounded by 10 soldiers and some police. I
was able to see what happened. First they tied up my parents. Then
they shot my father and raped my mother; later they killed her too.
After this, they burned our house.”
•
“One mother described how she had to choose which of her
children to save. The security forces had entered her house and
grabbed her young daughter. Her son tried to save his sister and was
attacked by the security forces. The mother watched from the other
end of the house and made the split second decision that these two
children would not live, but that she could perhaps still save her two
younger children. Her husband returned the next morning to the
village and dug through the pits of bodies until he found the corpse
of their son. They never found the body of their daughter. The
mother told the Mission with haunted eyes: ‘How can I continue
with my life having made this choice?’”
•
“I saw my own children killed. Those who are left of my family
came with me here. My three children and my mother were killed.
They made them lie down on the ground and they cut the backs of
their necks.”
•
“Some small children were thrown into the river…. They hacked
small children who were half alive. They were breast-feeding age
children, two years, three years, five years….”
Fortify Rights Report, at 60,
https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.
Id. at 61.
•
“The military took and arrested around 50 people. They brought
them to the military camp … and set fire to where they kept them.
One was my own brother. There was a small hut, and they put all
the people in there and set it on fire.”
•
“As soon as we got on the boat, they shot at us…. Foyezur
Rahman was my father. My daughter was Sofia. She was 18. They
were both shot in the back. As soon as the military shot them, they
stopped moving. We brought their dead bodies here [to
Bangladesh] and buried them.”
•
“I saw her taken from the house and raped by military soldiers. It
happened outside, beside a house. We watched from inside the
house. After they raped her, they killed her…. [O]ne person [raped
her], then she was taken to the road, and he cut her neck and cut
her breasts off.”
106.
In December 2017, Médecins Sans Frontières (Doctors Without Borders) (“MSF”)
published estimates of Rohingya deaths between August 25 and September 24, 2017—the month
after the “clearance operations” began—based on surveys of refugees in Bangladesh. MSF
estimated that “8,170 deaths were due to violence …, including 1,247 children under five years
of age.… Cause of death by shooting accounted for 69.4% of these deaths; being ‘burned to
death at home’ accounted for 8.8%; being beaten to death accounted for 5.0%; sexual violence
leading to death for 2.6%; and death by landmine for 1.0%.”
107.
MSF noted that “the rates of mortality captured here are likely to be
underestimates, as the data does not account for those people who have not yet been able to flee
Myanmar, or for families who were killed in their entirety.” 128 The U.N. Mission similarly
“concluded that the estimated number of more than 10,000 deaths during the August-September
2017 ‘clearance operations’ alone is likely to be conservative.”
Id. at 62.
Id. at 65.
Id. at 69.
Rohingya crisis – a summary of findings from six pooled surveys, MÉDECINS SANS
FRONTIÈRES (Dec. 9, 2017), https://www.msf.org/myanmarbangladesh-rohingya-crisis-summaryfindings-six-pooled-surveys.
UNHRC Report, ¶ 1482,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
108.
Most of the Rohingya who escaped the clearance operations now live in “a
miserable slum of a million people” in Bangladesh. Time reported in 2019 that “[c]onditions in
the [refugee] camps remain abysmal. Most refugees live in small shacks made of bamboo and
tarpaulin sheets, so tightly packed together that they can hear their neighbors talking, having sex,
and disciplining their children or, sometimes, wives. In the springtime, the huts turn into saunas.
In the monsoon season, daily rainfall turns hilly footpaths into waterslides and lifts trash and
human waste from open drains to float in stagnant pools.” “[M]urders and other forms of
violence occur almost nightly inside the camps and are rarely if ever investigated.” 130 “The
Rohingya are …, with no access to meaningful work, entirely dependent on humanitarian aid….
These factors increase vulnerability, in particular for women and girls, to trafficking and other
exploitation.”131 According to Steven Corliss of the U.N. refugee agency, UNHCR, “The
situation is untenable: environmentally, socially and economically.”
109.
Plaintiff and the Class have been deprived of their property, including their homes
and the land they cultivated for generations. In an update to its 2018 Report, the U.N. Mission
wrote:
The Mission concludes on reasonable grounds that the Government
undertook a concerted effort to clear and destroy and then
confiscate and build on the lands from which it forcibly displaced
hundreds of thousands of Rohingya. The consequences are twofold. This government-led effort subjugates Rohingya to inhumane
living conditions as [internally displaced persons] and refugees by
denying them access to their land, keeping them uprooted from
their homes, depriving them of their ability to progress in
healthy and safe communities and preventing them from engaging
in livelihood activities that sustain them as a people. The second
consequence of the Government’s four-pronged approach of
clearing, destroying, confiscating and building on land is that it is
fundamentally altering the demographic landscape of the area by
Feliz Solomon, ‘We’re Not Allowed to Dream.’ Rohingya Muslims Exiled to Bangladesh
Are Stuck in Limbo Without an End in Sight, TIME (May 23, 2019),
https://time.com/longform/rohingya-muslims-exile-bangladesh/.
UNHRC Report, ¶ 1174,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
Feliz Solomon, ‘We’re Not Allowed to Dream.’ Rohingya Muslims Exiled to Bangladesh
Are Stuck in Limbo Without an End in Sight, TIME (May 23, 2019),
https://time.com/longform/rohingya-muslims-exile-bangladesh/.
cementing the demographic re-engineering of Rakhine State that
resulted from mass displacement. Much of this is being done under
the guise of ‘development,’ with a clear discourse emerging to this
effect in the immediate aftermath of the August 2017 ‘clearance
operations.’
110.
In addition to loss of life, physical injuries, emotional trauma, and destruction or
taking of property, Plaintiff and the Class have been deprived of their culture and community.
The Rohingya people have their own language, not spoken anywhere else in the world. They
have lost their traditional places of worship. Family and community ties dating back generations
have been torn apart.
111.
Having become refugees in foreign countries where they largely do not speak the
language, have no financial resources, and lack knowledge of the culture or legal system,
Plaintiff and the Class have been denied meaningful justice. Most Class members have been
attempting to recover from severe physical and/or emotional trauma and struggling to survive in
dangerous, overcrowded refugee camps in Bangladesh—thousands of miles from any court
having jurisdiction over Facebook—since they were forced from Myanmar.
112.
The U.N. concluded that “[t]he attack on the Rohingya population of Myanmar
was horrendous in scope. The images of an entire community fleeing from their homes across
rivers and muddy banks, carrying their babies and infants and elderly, their injured and dying,
will and must remain burned in the minds of the international community. So will the ‘before
and after’ satellite imagery, revealing whole villages literally wiped off the map. In much of
northern Rakhine State, every trace of the Rohingya, their life and community as it has existed
for decades, was removed…. The ‘clearance operations’ were indeed successful.”
113.
Among the U.N. Mission’s findings were:
•
The elements of the crime of genocide were satisfied. “The
Mission is satisfied that the Rohingya … constitute a protected
Report of the detailed findings of the Independent International Fact-Finding Mission on
Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 16, 2019),
https://www.ohchr.org/Documents/HRBodies/HRCouncil/FFMMyanmar/20190916/A_HRC_42_CRP.5.pdf, (“UNHRC 2019 Report”) ¶ 139.
UNHRC Report, ¶ 1439,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
group.”135 “The gross human rights violations … suffered by the
Rohingya at the hands of the Tatmadaw and other security forces
(often in concert with civilians) include conduct that falls within four
of [the] five categories of prohibited acts,” including killings,
serious bodily and mental harm, conditions of life calculated to
physically destroy the Rohingya, and measures intended to prevent
births.136 “The Mission … concludes, on reasonable grounds, that
the factors allowing the inference of genocidal intent are
present.”
•
“The Mission finds that crimes against humanity have been
committed in … Rakhine [State], principally by the Tatmadaw….
[T]hese include crimes against humanity of murder;
imprisonment[;] enforced disappearance; torture; rape, sexual
slavery and other forms of sexual violence; persecution; and
enslavement.”
114.
In its 2019 report, the U.N. Mission reaffirmed its earlier conclusions: “the
Mission concludes on reasonable grounds that, since the publication of the Mission’s
report, the Government has committed the crimes against humanity of ‘other inhumane acts’ and
‘persecution’ in the context of a continued widespread and systematic attack against the
Rohingya civilian population in furtherance of a State policy to commit such an attack.”
Furthermore, the Mission concluded that “the evidence supports an inference of genocidal intent
and, on that basis, that the State of Myanmar breached its obligation not to commit genocide
under the Genocide Convention under the rules of State responsibility.”
C.
Facebook’s Role in the 2017 “Clearance Operations”
115.
The U.N. Mission specifically found that Facebook had contributed to the
Clearance Operations:
The Mission has examined documents, … Facebook posts and
audio-visual materials that have contributed to shaping public
opinion on the Rohingya…. The analysis demonstrates that a
carefully crafted hate campaign has developed a negative
perception of Muslims among the broad population in Myanmar….
This hate campaign, which continues to the present day, portrays
the Rohingya … as an existential threat to Myanmar and to
Buddhism…. It is accompanied by dehumanising language and the
branding of the entire [Rohingya] community as ‘illegal Bengali
immigrants.’ This discourse created a conducive environment for
the 2012 and 2013 anti-Muslim violence in Rakhine State and
beyond, without strong opposition from the general population. It
also enabled the hardening of repressive measures against the
Rohingya and Kaman in Rakhine State and subsequent waves of
State-led violence in 2016 and 2017.
116.
The Guardian described the work of two analysts who noted a strong correlation
between the amount of hate speech on Facebook and the violence inflicted on the Rohingya in
late 2017:
Digital researcher and analyst Raymond Serrato examined about
15,000 Facebook posts from supporters of the hardline nationalist
Ma Ba Tha group. The earliest posts dated from June 2016 and
spiked on 24 and 25 August 2017, when ARSA Rohingya militants
attacked government forces, prompting the security forces to
launch the ‘clearance operation’ that sent hundreds of thousands of
Rohingya pouring over the border.
UNHRC Report, ¶ 696,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (emphasis added).
Serrato’s analysis showed that activity within the anti-Rohingya
group, which has 55,000 members, exploded with posts registering
a 200% increase in interactions.
‘Facebook definitely helped certain elements of society to
determine the narrative of the conflict in Myanmar,’ Serrato told
the Guardian. ‘Although Facebook had been used in the past to
spread hate speech and misinformation, it took on greater potency
after the attacks.’
***
Alan Davis, an analyst from the Institute for War and Peace
Reporting who led a two-year study of hate speech in Myanmar,
said that in the months before August he noticed posts on
Facebook becoming ‘more organised and odious, and more
militarised.’
His research team encountered fabricated stories stating that
‘mosques in Yangon are stockpiling weapons in an attempt to blow
up various Buddhist pagodas and Shwedagon pagoda,’ the most
sacred Buddhist site in Yangon, in a smear campaign against
Muslims. These pages also featured posts calling Rohingya the
derogatory term ‘kalars’ and ‘Bengali terrorists.’ Signs denoting
‘Muslim-free’ areas were shared more than 11,000 times.
***
Davis said … ‘I think things are so far gone in Myanmar right
now ... I really don’t know how Zuckerberg and co sleep at night.
If they had any kind of conscience they would be pouring a good
percentage of their fortunes into reversing the chaos they have
created.’
117.
The Myanmar military used Facebook to justify the “clearance operations” in
2017:
•
“In a post from the official Facebook page of the Office of the
Tatmadaw Commander-in-Chief, … [a Myanmar parable about a
camel which gradually takes more and more space in his
merchant’s tent, until eventually the merchant is forced out] was
explained in detail in connection with the issue of the Rohingya in
Rakhine State…. Prior to its deletion by Facebook in August 2018,
the post had almost 10,000 reactions, over 6,000 shares and comments.”
Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar
during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018),
https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-inmyanmar-during-rohingya-crisis.
UNHRC Report, ¶ 1312,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
•
“[I]n a 21 September 2017 post on Facebook … [the Tatmadaw]
Commander-in-Chief … states that, ‘the Bengali population
exploded and the aliens tried to seize the land of the local
ethnics….’”
•
“[O]n 11 October 2017 the Commander-in-Chief … posted: ‘there
is exaggeration to say that the number of Bengali fleeing to
Bangladesh is very large.’ At the time more than 600,000 Rohingya had fled … Myanmar in a period of six weeks.”
•
“On 27 October 2017, in another Facebook post entitled ‘every
citizen has the duty to safeguard race, religion, cultural identities
and national interest,’ [the] Commander-in-Chief stated that ‘all
must … preserve the excellent characteristics of the country
…’”
118.
Rolling Stone reported that “[m]ore shocking was how [the military’s] bigoted
doctrine was parroted by Aung San Suu Kyi, the Nobel Peace Prize-winning human-rights icon
and de facto leader of Myanmar…. When she finally broke her silence, on Facebook, nearly two
weeks after the 2017 attacks began, it was in cold defense of the same military that kept her
under house arrest for 15 years when she was the country’s leading dissident. Suu Kyi blamed
‘terrorists’ for promoting a ‘huge iceberg of misinformation’ about the violence engulfing
Rakhine. She made no mention of the Rohingya exodus.”
119.
The U.N. additionally found that there was “no doubt that the prevalence of hate
speech in Myanmar significantly contributed to increased tension and a climate in which
individuals and groups may become more receptive to incitement and calls for violence. This
also applies to hate speech on Facebook.”148 In early 2018, U.N. investigator Yanghee Lee
warned that “Facebook has become a beast,” and that “we know that the ultra-nationalist
Id. ¶ 1338.
Id. ¶ 1339.
Id. ¶ 1341.
Jason Motlagh, The Survivors of the Rohingya Genocide, ROLLING STONE (Aug. 9, 2018),
https://www.rollingstone.com/politics/politics-features/rohingya-genocide-myanmar-701354/
(emphasis added).
UNHRC Report, ¶ 1354,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (emphasis added).
Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred
against the Rohingya or other ethnic minorities.”
D.
Civilian Participation in the 2017 “Clearance Operations”
120.
The radicalization of the Burmese population, to which Facebook materially
contributed, did not merely ensure tolerance of and support for the military’s campaign of
genocide against the Rohingya; it also allowed the military to recruit, equip, and train “civilian
death squads” that would actively participate in the atrocities.
121.
The U.N. Mission drew a connection between anti-Rohingya reporting and hate
speech, ethnic tension, and the ability of the military to recruit non-Rohingya civilians to
perpetrate violence against the Rohingya, finding:
•
“The inflammatory nature of much of this reporting [on activities
of Rohingya militants], often characterizing Rohingya as ‘Bengali
terrorists,’ coupled with rising vitriolic discourse and hate speech
against the Rohingya, fuelled an already volatile situation.”
•
“[The reports] deepened inter-communal suspicion and fear. They
were likely a factor in a notable breakdown in the relationship
between the communities, particularly in the weeks leading up to
25 August 2017.”
•
“During this period [beginning in late 2016], the Myanmar
authorities made increasing efforts to recruit ethnic Rakhine as
members of the security apparatus…. Moreover, the recruitment of
non-Rohingya to Government supported militias … continued
throughout this period in Rakhine State.”
Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar
during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018),
https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-inmyanmar-during-rohingya-crisis; see Tom Miles, U.N. investigators cite Facebook role in
Myanmar crisis, REUTERS (Mar. 12, 2018), https://www.reuters.com/article/us-myanmarrohingya-facebook-idUKKCN1GO2PN.
Jason Motlagh, The Survivors of the Rohingya Genocide, ROLLING STONE (Aug. 9, 2018),
https://www.rollingstone.com/politics/politics-features/rohingya-genocide-myanmar-701354/.
122.
In a 162-page report based on 254 interviews, the human rights group Fortify
Rights documented how, in August 2017, “Myanmar authorities … activated non-Rohingya
civilian squads, some of whom the authorities previously armed and/or trained. These civilian
perpetrators … acted under the Myanmar military and police in razing hundreds of Rohingya
villages throughout northern Rakhine State, brutally killing masses of unarmed Rohingya men,
women, and children.”154 The title of the report, “They Gave Them Long Swords,” referred to an
eyewitness account of Myanmar soldiers arming non-Rohingya civilians.
123.
In a chapter of its report entitled “Criminal Acts Against Rohingya by Civilian
Perpetrators Since August 25, 2017,” Fortify Rights stated:
After arming and training local non-Rohingya citizens who had a
demonstrated history of hostility toward Rohingya Muslims in
northern Rakhine State, the Myanmar authorities activated them on
August 25…. Groups of local non-Rohingya citizens, in some
cases trained, armed, and operating alongside Myanmar security
forces, murdered Rohingya men, women, and children, destroyed
and looted Rohingya property, and assisted the Myanmar Army
and Police in razing villages.
124.
In a Facebook post on September 22, 2017, the Burmese Commander-in-Chief
“encouraged further cooperation between local non-Rohingya citizens and the Myanmar
military, saying ‘[l]ocal ethnics can strengthen the defense prowess by living in unity and by
joining hands with the administrative bodies and security forces in oneness.’”
E.
Facebook Ignored Complaints of Hate Speech on its Website
125.
Because Myanmar’s history of repressive military rule and ethnic violence was
well-documented by the time Facebook became widely available in Myanmar around 2012,
Facebook should have known that its product could be used to spread hate speech and
Fortify Rights Report, at 12-13, 14,
https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.
Id. at 16.
Id. at 55.
Id. at 46 & n.87 (citing Facebook post).
misinformation. In addition, beginning in 2013, Facebook was repeatedly alerted to hate speech
on its system:
•
In 2013, a new civil society organization called Panzagar, meaning
“flower speech,” was formed in Myanmar. 158 The group spoke out
locally about anti-Muslim hate speech directed at the Rohingya
minority that was proliferating on Facebook. One of the group’s
awareness-raising methods was to put flowers in their mouths to
symbolize speaking messages of peace versus hate. Panzagar
reported instances of hate speech to Facebook.
•
In November 2013, Aela Callan, an Australian documentary
filmmaker, “met at Facebook’s California headquarters with Elliott
Schrage, vice president of communications and public policy” to
discuss a project she had begun regarding “hate speech and false
reports that had spread online during conflicts between Buddhists
and Rohingya Muslims the prior year…. I was trying to alert him to
the problems, she said….” But “[h]e didn’t connect me with anyone
inside Facebook who could deal with the actual problem….”
•
“On March 3, 2014, Matt Schissler [an American aid worker
working in Myanmar], was invited to join a call with Facebook on
the subject of dangerous speech online…. Toward the end of the
meeting, Schissler gave a stark recounting of how Facebook was
hosting dangerous Islamophobia. He detailed the dehumanizing and
disturbing language people were using in posts and the doctored
photos and misinformation being spread widely.”
•
By June 14, 2014, Al Jazeera had published an article entitled
“Facebook in Myanmar: Amplifying Hate Speech?” In that article,
a civil society activist was quoted as saying: “Since the violence in
Rakhine state began, we can see that online hate speech is spreading
and becoming more and more critical and dangerous…. I think
Facebook is the most effective way of spreading hate speech. It’s
already very widespread, infecting the hearts of people.” The article
cited Facebook posts reading: “We should kill every Muslim. No
Muslims should be in Myanmar”; “Why can’t we kick out the
Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA
(Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech.
Mary Michener Oye, Using ‘flower speech’ and new Facebook tools, Myanmar fights
online hate speech, THE WASHINGTON POST,
https://www.washingtonpost.com/national/religion/using-flower-speech-and-new-facebook-tools-myanmar-fights-online-hate-speech/
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 177 (HarperCollins 2021).
Muslim dogs?”; and “all terrorists are Muslim … they kill innocent
men and women so peace and Islam are not related.”
•
On August 18, 2014, PRI’s “The World” program published a story
entitled “In newly liberated Myanmar, hatred spreads on
Facebook.” After describing several false rumors that led to
violence, the article reported: “The pattern repeats in towns and
villages across Myanmar. Rumors rip through communities, fueled
by seething racism and embellishments. Graphic images of violence
are shared virally through social media platforms like Facebook,
which has become one of the most popular websites in the
country….”
•
After the March 2014 call with Schissler, “a handful of Facebook
employees started an informal working group to connect Facebook
employees in Menlo Park with activists in Myanmar.”164 Schissler
said that “between March and December 2014, he held [a series] of
discussions with Facebook officials…. He told them how the
platform was being used to spread hate speech and false rumors in
Myanmar, he said, including via fake accounts.”
•
“In March 2015, Schissler gave a talk at Facebook’s California
headquarters about new media, particularly Facebook, and anti-Muslim violence in Myanmar.”166 “In a small conference room
where roughly a dozen Facebook employees had gathered, with
others joining by video-conference, he shared a PowerPoint
presentation that documented the seriousness of what was
happening in Myanmar: hate speech on Facebook was leading to
real-world violence in the country, and it was getting people
killed.”167 One Facebook employee asked whether Schissler thought
genocide could happen in Myanmar: “‘Absolutely’ he answered. If
Myanmar continued on its current path, and the anti-Muslim hate
speech grew unabated, a genocide was possible. No one followed
up on the question.”
Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA
(Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech (emphasis added).
Bridget DiCerto, In newly liberated Myanmar, hatred spreads on Facebook, THE WORLD
(Aug. 8, 2014), https://www.pri.org/stories/2014-08-08/newly-liberated-myanmar-hatred-spreads-facebook.
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 178 (HarperCollins 2021).
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
Id.
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 181 (HarperCollins 2021).
Id. at 181-82.
•
“‘They were warned so many times,’ said David Madden, a tech
entrepreneur who worked in Myanmar. He said he told Facebook
officials in 2015 that its platform was being exploited to foment
hatred in a talk he gave at its headquarters in Menlo Park,
California. About a dozen Facebook people attended the meeting in
person…. Others joined via video. ‘It couldn’t have been presented
to them more clearly, and they didn’t take the necessary steps,’
Madden said.”
•
Brooke Binkowski, who worked for an organization that did factchecking for Facebook beginning in early 2017, “said she tried to
raise concerns about misuse of the platform abroad, such as the
explosion of hate speech and misinformation during the Rohingya
crisis in Myanmar…. ‘I was bringing up Myanmar over and over
and over,’ she said. ‘They were absolutely resistant.’ Binkowski,
who previously reported on immigration and refugees, said
Facebook largely ignored her: ‘I strongly believe that they are
spreading fake news on behalf of hostile foreign powers and
authoritarian governments as part of their business model.’”
126.
Facebook’s response to such warnings about hate speech on its websites in Burma
was, however, utterly ineffective. The extreme import of what Matt Schissler was describing
“didn’t seem to register with the Facebook representatives. They seemed to equate the harmful
content with cyberbullying: Facebook wanted to discourage people from bullying across the
system, he said, and they believed that the same set of tools they used to stop a high school
senior from intimidating an incoming freshman could be used to stop Buddhist monks in
Myanmar from spreading malicious conspiracy theories about Rohingya Muslims.”
127.
Facebook had almost no capability to monitor the activity of millions of users in
Burma: “In 2014, the social media behemoth had just one content reviewer who spoke Burmese:
a local contractor in Dublin, according to messages sent by Facebook employees in the private
Facebook chat group. A second Burmese speaker began working in early 2015, the messages
show.” Accenture, to whom Facebook outsourced the task of monitoring for violations of its
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
Sam Levin, ‘They don’t care’: Facebook factchecking in disarray as journalists push to
cut ties, THE GUARDIAN (Dec. 13, 2018),
https://www.theguardian.com/technology/2018/dec/13/they-dont-care-facebook-fact-checking-in-disarray-as-journalists-push-to-cut-ties.
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 178 (HarperCollins 2021) (emphasis added).
community standards in Burma and other Asian countries, did not hire its first two Burmese
speakers, who were based in Manila, until 2015. Former monitors “said they didn’t actually
search for hate speech themselves; instead, they reviewed a giant queue of posts mostly reported
by Facebook users.” Chris Tun, a Deloitte consultant who had arranged meetings between the
Burmese government and Facebook, told Reuters: “Honestly, Facebook had no clue about
Burmese content. They were totally unprepared.”
128.
Instead, Facebook tried initially to rely entirely on users to report inappropriate
posts. However, “[a]lthough Myanmar users at the time could post on Facebook in Burmese, the
platform’s interface – including its system for reporting problematic posts – was in English.”
129.
In one case in 2018, Mark Zuckerberg was forced to apologize for exaggerating
Facebook’s monitoring capabilities. In an interview with Vox, Zuckerberg cited “one incident
where Facebook detected that people were trying to spread ‘sensational messages’ through
Facebook Messenger to incite violence on both sides of the conflict” but claimed that “the
messages were detected and stopped from going through.”174 In response, a group of activists
issued an open letter criticizing Zuckerberg and pointing out that Facebook had not detected the
messages; rather, the activists had “flagged the messages repeatedly to Facebook, barraging its
employees with strongly worded appeals until the company finally stepped in to help.”
Zuckerberg apologized in an email: “I apologize for not being sufficiently clear about the
important role that your organizations play in helping us understand and respond to Myanmar-
related issues, including the September incident you referred to.”
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
Id.
Jen Kirby, Mark Zuckerberg on Facebook’s role in ethnic cleansing in Myanmar: ‘It’s a
real issue’, VOX (Apr. 2, 2018), https://www.vox.com/2018/4/2/17183836/mark-zuckerberg-facebook-myanmar-rohingya-ethnic-cleansing-genocide.
Kevin Roose, Paul Mozur, Zuckerberg Was Called Out Over Myanmar Violence. Here’s
His Apology, NEW YORK TIMES (Apr. 9, 2018),
https://www.nytimes.com/2018/04/09/business/facebook-myanmar-zuckerberg.html.
130.
Reuters’ Steve Stecklow sent the examples of hate speech that he and his team
had found on the system, some of which was “extremely violent and graphic,” to Facebook: “It
was sickening to read…. When I sent it to Facebook, I put a warning on the email saying I just
want you to know these are very disturbing things…. What was so remarkable was that [some
of] this had been on Facebook for five years and it wasn’t until we notified them in August [of
2018] that it was removed.”
131.
“The [U.N.] Mission itself experienced a slow and ineffective response from
Facebook when it used the standard reporting mechanism to alert the company to a post targeting
a human rights defender for his alleged cooperation with the Mission.” “The post described the
individual as a ‘national traitor,’ consistently adding the adjective ‘Muslim.’ It was shared and
reposted over 1,000 times. Numerous comments to the post explicitly called for the person to be
killed, in unequivocal terms: … ‘If this animal is still around, find him and kill him….’ ‘He is a
Muslim. Muslims are dogs and need to be shot….’ ‘Remove his whole race.’ … In the weeks
and months after the post went online, the human rights defender received multiple death threats
from Facebook users….” “The Mission reported this post to Facebook on four occasions; in each
instance the response received was that the post was examined but ‘doesn’t go against one of
[Facebook’s] specific Community Standards.’ … The post was finally removed several weeks
later but only through the support of a contact at Facebook, not through the official channel.
Several months later, however, the Mission found at least 16 re-posts of the original post still
circulating on Facebook.”
132.
On February 25, 2015, Susan Benesch, a human rights lawyer and researcher who
directs the Dangerous Speech Project at the Berkman Klein Center for Internet & Society at
Harvard University, gave a presentation entitled “The Dangerous Side of Language” at
Facebook. The presentation showed how anti-Rohingya speech being disseminated by Facebook
Anisa Subedar, The country where Facebook posts whipped up hate, BBC TRENDING
(Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.
UNHRC Report, ¶ 1351,
https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
in Myanmar was not merely hate speech but “Dangerous speech” that “Moves an audience to
condone or take part in violence,” offering as an example a statement attributed to Wirathu:
“They are breeding so fast, and they are stealing our women, raping them... We must keep
Myanmar Buddhist.”
133.
Even after the atrocities in late 2017, Facebook refused to help obtain justice for
the Rohingya:
In late September 2018, Matthew Smith, the CEO of Fortify
Rights, a human rights organization based in Southeast Asia, began
to work with human rights groups to build a case strong enough for
the International Criminal Court, at the Hague, proving that
Burmese soldiers had violated international laws and perpetuated a
genocide against the Rohingya…. The platform held detailed
information on all its user accounts; even when posts were deleted,
Facebook kept a record of everything a person had ever written,
and every image uploaded…. Most Burmese soldiers had
Facebook on their phones, so the company would have records of
the locations of army units’ soldiers to match with attacks on
Rohingya villages.
***
If Smith and other human rights workers could get their hands on
the deleted posts, they could build a stronger case documenting
how Myanmar’s military had both carried out a genocide against
the Rohingya and manipulated the public into supporting their
military onslaught.
Susan Benesch, The Dangerous Side of Language, Dangerous Speech Project, available at
https://www.dropbox.com/s/tazw9elxptquugf/The%20Dangerous%20Side%20of%20Language.pdf?dl=0.
Apparently not eager to help prove that Facebook had been
complicit in genocide, Facebook’s lawyers denied Smith’s requests
for access to the data: “‘Facebook had the chance to do the right
thing again and again, but they didn’t. Not in Myanmar,’ said
Smith. ‘It was a decision, and they chose not to help.’”
134.
Worst yet, Facebook’s activity promoted such content to its users, thus actively
participating in disinformation efforts that led to the genocide.
135.
Facebook ultimately ratified its conduct and its involvement in the genocide and
violence in Burma by admitting shortcomings of its system.
F.
Facebook Admits That It Had a Responsibility to Prevent Its Product
From Being Used to Incite Violence and Genocide
136.
In 2018, after the “clearance operations,” several senior Facebook executives,
including Mark Zuckerberg, belatedly admitted that the company had a responsibility to prevent
its product from being used to incite violence in Burma and should have done more in that
regard. On April 10, 2018, Zuckerberg testified before the U.S. Senate:
SEN. PATRICK LEAHY: ... [S]ix months ago, I asked your
general counsel about Facebook’s role as a breeding ground for
hate speech against Rohingya refugees. Recently, U.N.
investigators blamed Facebook for playing a role in inciting
possible genocide in Myanmar. And there has been genocide
there….
This is the type of content I’m referring to. It calls for the death of
a Muslim journalist. Now, that threat went straight through your
detection system, it spread very quickly, and then it took attempt
after attempt after attempt, and the involvement of civil society
groups, to get you to remove it.
Why couldn’t it be removed within 24 hours?
ZUCKERBERG: Senator, what’s happening in Myanmar is a
terrible tragedy, and we need to do more....
Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook’s Battle for
Domination, at 185-86 (HarperCollins 2021).
Id. at 186-87.
LEAHY: We all agree with that.
137.
In a statement to Reuters, Mia Garlick, Facebook’s director of Asia Pacific
Policy, stated: “We were too slow to respond to concerns raised by civil society, academics and
other groups in Myanmar. We don’t want Facebook to be used to spread hatred and incite
violence. This … is especially true in Myanmar where our services can be used to amplify hate
or exacerbate harm against the Rohingya.”
138.
In August 2018, Sara Su, a Product Manager, posted on Facebook’s blog:
We have a responsibility to fight abuse on Facebook. This is
especially true in countries like Myanmar where many people are
using the internet for the first time and social media can be used to
spread hate and fuel tension on the ground.
The ethnic violence in Myanmar is horrific and we have been too
slow to prevent misinformation and hate on Facebook.
139.
On September 5, 2018, Facebook COO Sheryl Sandberg testified before the U.S.
Senate:
SEN. MARK WARNER: … Ms. Sandberg, you made mention in
your opening testimony the fact that sometimes political actors are
using the platforms really to incent violence. I mean, I think
you’ve made at least some reference, mention of Myanmar. We’ve
obviously seen a great tragedy take place there where hundreds of
thousands of Rohingya Muslims are fleeing and in many ways.
The U.N. High Commissioner has said that fake accounts on
Facebook have incented that violence. Do you believe that
Facebook has both a moral obligation and potentially even a
legal obligation to take down accounts that are actually
incentivizing violence?
Facebook, Social Media Privacy, and the Use and Abuse of Data, Senate Hearing 115-683 before the Comm. on Commerce, Science, and Transportation, et al., 115th Cong. (Apr. 10,
2018), https://www.govinfo.gov/content/pkg/CHRG-115shrg37801/html/CHRG-115shrg37801.htm.
Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS
(Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/
(emphasis added).
Sara Su, Update on Myanmar, FACEBOOK NEWSROOM (Aug. 15, 2018),
https://about.fb.com/news/2018/08/update-on-myanmar/.
SHERYL SANDBERG: I strongly believe that. In the case of
what’s happened in Myanmar, it’s, it’s devastating and we’re
taking aggressive steps and we know we need to do more….
140.
In October 2018, BSR (Business for Social Responsibility) published a human
rights impact assessment—commissioned by Facebook itself—of Facebook’s presence in
Burma; BSR found that:
•
“Facebook is … used to spread rumors about people and events.
Character assassinations were described to BSR during this
assessment, and in extreme cases these have extended to online
death threats…. There are indications that organized groups make
use of multiple fake accounts and news pages to spread hate
speech, fake news, and misinformation for political gain. Rumors
spread on social media have been associated with communal
violence and mob justice.”
•
“The Facebook platform in Myanmar is being used by bad actors
to spread hate speech, incite violence, and coordinate harm….
Facebook has become a means for those seeking to spread hate and
cause harm, and posts have been linked to offline violence…. [F]or
example, the Report of the Independent International Fact-Finding
Mission on Myanmar describes how Facebook has been used by
bad actors to spread anti-Muslim, anti-Rohingya, and anti-activist
sentiment.”
•
“The consequences for the victim are severe, with lives and bodily
integrity placed at risk from incitement to violence.”
141.
On November 5, 2018, Alex Warofka, Facebook’s Product Policy Manager,
issued a statement on the BSR report: “The report concludes that, prior to this year, we weren’t
doing enough to help prevent our platform from being used to foment division and incite offline
violence. We agree that we can and should do more.”
Open Hearing on Foreign Influence Operations’ Use of Social Media Platforms, Senate
Hearing 115-460 before the Select Comm. of Intel., 115th Cong. (Sept. 5, 2018),
https://www.govinfo.gov/content/pkg/CHRG-115shrg31350/html/CHRG-115shrg31350.htm.
BSR Report, at 13, https://about.fb.com/wp-content/uploads/2018/11/bsr-facebook-myanmar-hria_final.pdf.
Id. at 24.
Id. at 35.
Alex Warofka, An Independent Assessment of the Human Rights Impact of Facebook in
Myanmar, FACEBOOK NEWSROOM (Nov. 5, 2018), https://about.fb.com/news/2018/11/myanmar-hria/.
142.
In October 2021, a former member of Facebook’s Integrity Team submitted a
sworn whistleblower declaration to the SEC. It stated, inter alia:
At Facebook, … there’s no will to actually fix problems, in
particular if doing so might reduce user engagement, and therefore
profits….
Any projects Facebook undertakes under the banner of charity or
community building are actually intended to drive engagement….
Internet.org, Facebook’s scheme to provide Internet to the
developing world, wasn’t about charity…. Inside the company,
the dialogue was that this is about gaining an impenetrable
foothold in order to harvest data from untapped markets.
Through Internet.org, which provided Facebook at free or greatly
reduced rates in key markets, Facebook effectively became the
Internet for people in many developing countries…. [Facebook
executives would] say ‘When you are the sole source for the
Internet you are the sole source for news.’
Facebook executives often use data to confuse, rather than clarify
what is occurring. There is a conscious effort to answer questions
from regulators in ways that intentionally downplay the severity of
virtually any given issue….
An[] example of their playbook played out in the wake of the
genocide of the Rohingya refugees in Myanmar, a country where
Facebook was effectively the Internet for most people, and where
the long-isolated population was vulnerable to information
manipulation. Facebook executives were fully aware that posts
ordering hits by the Myanmar government on the minority
Muslim Rohingya were spreading wildly on Facebook, because
it was being reported in the media and multiple aid-organizations,
as well as major, top-tier reporters who used to call the company
when they discovered early on that the genocide was being
accommodated on Facebook. It was clear before the killing even
started that members of the military junta in Myanmar were
directing this activity. But, when the violence of the early stages of
the Myanmar government-directed genocide metastasized and the
murders were unmistakably being directed on Facebook, I was
instructed to tell the media, “We know now, and we finally
managed to remove their access, but we did not have enough
Burmese-speaking moderators.” This part was true; there was only
one Burmese translator on the team of moderators for years, in the
same period when the communications apparatus grew by leaps
and bounds. But the issue of the Rohingya being targeted on
Facebook was well known inside the company for years. I
refused to deploy the approved talking point.
Later, after widespread public blowback forced the company to
hire a human rights group to conduct an independent review,
Facebook’s policy manager Alex Warofka released a statement
with the typical Facebook ‘mea culpa’ response: ‘We agree that we
can and should do more.’ I quickly realized that the company was
giving a PR response to a genocide that they accommodated—that,
I, working for Facebook, had been a party to genocide. This is
what prompted me to look for another job.
143.
Facebook’s subsequent actions prove that, for an investment amounting to a
minuscule portion of the company’s vast resources,190 the company could have blocked much of
the hate speech against the Rohingya. In August 2018, Facebook posted on its website:
The ethnic violence in Myanmar has been truly horrific…. While
we were too slow to act, we’re now making progress—with better
technology to identify hate speech, improved reporting tools, and
more people to review content.
Today, we are taking more action in Myanmar, removing a total of
18 Facebook accounts, one Instagram account and 52 Facebook
Pages, followed by almost 12 million people. We are preserving
data, including content, on the accounts and Pages we have
removed.
144.
In December 2018, Facebook updated its blog to report that it removed additional
accounts in Myanmar for engaging in coordinated inauthentic behavior on Facebook…. “[W]e
discovered that these seemingly independent news, entertainment, beauty and lifestyle Pages
were linked to the Myanmar military.”
145.
Rosa Birch, head of Facebook’s Strategic Response Team, told NBC in 2019 that
“the team worked on a new tool that allows approved non-governmental organizations to flag
problematic material they see on Facebook in a way that is seen more quickly by the company
than if a regular user reported the material. ‘It sounds relatively simple, and something that we
Emphasis added.
Between 2011 and 2017, Facebook reported revenues of $115,357,000,000 and net
income of $34,893,000,000. Facebook: annual revenue and net income 2007-2020, STATISTA
RESEARCH DEPARTMENT (Feb. 5, 2021), https://www.statista.com/statistics/277229/facebooks-annual-revenue-and-net-income/.
Removing Myanmar Military Officials From Facebook, FACEBOOK NEWSROOM (Aug. 28,
2018), https://about.fb.com/news/2018/08/removing-myanmar-officials/ (emphasis added).
Id.
should have done a couple of years ago,’ she said.”193 “When hate speech against the Rohingya
minority in Myanmar spread virulently via Facebook in Burmese (a language spoken by some
million people) Facebook was slow to act because it had no hate-speech detection algorithm in
Burmese, and few Burmese-speaking moderators. But since the Rohingya genocide, Facebook
has built a hate-speech classifier in Burmese by pouring resources toward the project. It paid to
hire 100 Burmese-speaking content moderators, who manually built up a dataset of Burmese hate
speech that was used to train an algorithm.”
146.
Recent revelations show, however, that Facebook continues to ignore the harm its
algorithms and product inflict in developing countries. A September 2021 Wall Street Journal
article based on leaked internal Facebook documents reported:
Facebook treats harm in developing countries as ‘simply the cost
of doing business’ in those places, said Brian Boland, a former
Facebook vice president who oversaw partnerships with internet
providers in Africa and Asia before resigning at the end of last
year. Facebook has focused its safety efforts on wealthier markets
with powerful governments and media institutions, he said, even as
it has turned to poorer countries for user growth.
‘There is very rarely a significant, concerted effort to invest in
fixing those areas,’ he said.
***
An internal Facebook report from March said actors including
some states were frequently on the platform promoting violence,
exacerbating ethnic divides and delegitimizing social institutions.
‘This is particularly prevalent—and problematic—in At Risk
Countries,’ the report says.
It continues with a header in bold: ‘Current mitigation strategies
are not enough.’
David Ingram, Facebook’s new rapid response team has a crucial task: Avoid fueling
another genocide, NBC (June 20, 2019), https://www.nbcnews.com/tech/tech-news/facebook-s-new-rapid-response-team-has-crucial-task-avoid-n1019821 (emphasis added).
Billy Perrigo, Facebook Says It’s Removing More Hate Speech Than Ever Before, But
There’s a Catch, TIME (Nov. 27, 2019), https://time.com/5739688/facebook-hate-speech-languages/.
Justin Scheck, Newley Purnell, Jeff Horwitz, Facebook Employees Flag Drug Cartels
and Human Traffickers. The Company’s Response Is Weak, Documents Show, WALL STREET
JOURNAL (Sept. 16, 2021), https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953.
147.
The Wall Street Journal article relates one example indicating that Facebook has
learned nothing from its experience in Burma:
In Ethiopia, armed groups have used Facebook to incite violence.
The company’s internal communications show it doesn’t have
enough employees who speak some of the relevant languages to
help monitor the situation. For some languages, Facebook also
failed to build automated systems, called classifiers, that could
weed out the worst abuses….
***
In a December planning document, a Facebook team wrote that the
risk of bad consequences in Ethiopia was dire…. It said in some
high-risk places like Ethiopia, ‘Our classifiers don’t work, and
we’re largely blind to problems on our site.’
Groups associated with the Ethiopian government and state media
posted inciting comments on Facebook against the Tigrayan
minority, calling them ‘hyenas’ and ‘a cancer.’ Posts accusing
Tigrayans of crimes such as money laundering were going viral,
and some people on the site said the Tigrayans should be wiped
out.
Violence escalated toward the end of last year, when the
government launched an attack on the Tigray capital, Mekelle.
Secretary of State Antony Blinken said in March that Tigrayans are
victims of ethnic cleansing.
148.
Whistleblower Frances Haugen echoed this sentiment, noting that Facebook’s
efforts to train its systems in non-English languages are severely lacking, stating “[o]ne of the
core things that I’m trying to draw attention to is the underinvestment in languages that aren’t
English.… Unfortunately the most fragile places in the world are the most diverse when it comes
to languages.” She goes on to say “I saw a pattern of behavior where I believed there was no
chance that Facebook would be able to solve these problems in isolation … I saw what I feared
was going to happen continue to unfurl … I knew I could never live with myself if I watched
million, 20 million people over the next 20 years die because of violence that was facilitated by
Id.
social media.”
149.
Facebook’s admissions that it should have done more to prevent the genocide in
Burma—and its subsequent efforts, if any—came too late for the tens of thousands of Rohingya
who have been murdered, raped, and tortured, and for the hundreds of thousands who are now
living in squalid refugee camps and displaced from their homes across the world.
FACTS SPECIFIC TO JANE DOE
150.
Plaintiff Jane Doe is a Rohingya Muslim woman who previously lived in the Rakhine State, Burma.
151.
In 2012, when Plaintiff was about 16 years old, her father was detained, beaten, and
tortured for two weeks by the Myanmar military.
152.
Around the same time, many young Rohingya girls in Plaintiff’s village and
nearby villages were being taken from their families. Members of the Myanmar military came to
Plaintiff’s village, and anyone who left their homes was killed. Plaintiff saw at least seven men
killed, as well as an elderly woman. Plaintiff knew that many others in her village were also
killed, including women and children, but she could only see those directly in the vicinity of her
home.
153.
Fearful that she would be abducted and sexually assaulted or killed herself,
Plaintiff’s family eventually urged her to flee Burma alone.
154.
Plaintiff joined a group of Rohingya fleeing by boat to Bangladesh. She traveled
to Thailand and then Malaysia, where the UNHCR eventually arranged for her resettlement in
the United States.
155.
Plaintiff is gravely concerned about her parents and her sisters, who remain in
Burma. Their homes and the small store that was their livelihood were destroyed during ethnic
violence. Plaintiff’s family land, home, and personal property were eventually seized and those
that remained behind in Burma were forced from their homes. They lack any reliable source of
Giulia Saudelli, Facebook whistleblower warns company is neglecting languages other
than English, DW, https://www.dw.com/en/facebook-whistleblower-warns-company-is-neglecting-languages-other-than-english/a-59739260.
income and live in constant fear of further attacks by the Myanmar military or by Buddhist
monks.
156.
Plaintiff also has an aunt and uncle who fled to a refugee camp in Bangladesh,
where they have remained for several years.
157.
Plaintiff remains traumatized by the ethnic violence and threats of violence
inflicted on her and her family.
158.
Plaintiff did not learn that Facebook’s conduct was a cause of her injuries until
2021. A reasonable investigation by Plaintiff into the causes of her injuries would not have
revealed this information prior to 2021 because Facebook’s role in the Rohingya genocide was
not widely known or well understood within the Rohingya community. Further, even if such
information was known to various journalists or investigators at earlier points in time, Plaintiff’s
ability to discover such information was significantly hindered by her inability to read or write.
CLASS ACTION ALLEGATIONS
159.
Class Definition. Plaintiff seeks to represent the following proposed Class
pursuant to California Code of Civil Procedure § 382:
All Rohingya who left Burma (Myanmar) on or after June 1, 2012,
and arrived in the United States under refugee status, or who
sought asylum protection, and now reside in the United States.
The following are excluded from the Class: (1) any Judge or Magistrate presiding over this
action and members of their families; (2) Defendant, Defendant’s subsidiaries, parents,
successors, predecessors, and any entity in which Defendant or its parents have a controlling
interest, and its current or former employees, officers, or directors; (3) Plaintiff’s counsel and
Defendant’s counsel; and (4) the legal representatives, successors, and assigns of any such
excluded person.
160.
Ascertainability and Numerosity. The Class is so numerous that joinder of all
members is impracticable. At least 10,000 members of the Class reside in the United States.
Class members are ascertainable and can be identified through public records.
161.
Commonality and Predominance. There are many questions of law and fact
common to the claims of Plaintiff and the Class and those questions predominate over any
questions that may affect individual members of the Class. These common questions of law and
fact include:
•
Whether Facebook (the product) contains design defects that
harmed Rohingya Muslims, and, if so, whether Facebook (the
company) is strictly liable for them;
•
Whether Facebook owed a duty of care to Rohingya Muslims
when entering the Burmese market;
•
Whether Facebook breached any duty of care to Rohingya
Muslims in the way it operated in Burma; and
•
Whether Facebook’s Burmese operations caused harm to Rohingya
Muslims.
162.
Typicality. Plaintiff’s claims are typical of the claims of all members of the
Class. Plaintiff and the other Class members sustained damages as a result of Defendant’s
uniform wrongful conduct.
163.
Adequacy. Plaintiff will fairly and adequately protect the interests of the Class.
Plaintiff has retained counsel with substantial experience in prosecuting complex class actions
and particular expertise in litigation involving social media. Plaintiff and her counsel are
committed to vigorously prosecuting the action on behalf of the Class and have the resources to
do so. Neither the Plaintiff nor her counsel have any interests adverse to those of the other
members of the Class. Defendant has no defenses unique to Plaintiff.
164.
Superiority. A class action is superior to all other available methods for the fair
and efficient adjudication of this controversy and joinder of all members of the Class is
impracticable. The members of the proposed Class are, by definition, recent immigrants and lack
the tangible resources, language skills, and cultural sophistication to access and participate
effectively in the prosecution of individual lawsuits in any forum having jurisdiction over
Defendant. A class action in which the interests of the Class are advanced by representative
parties therefore provides the greatest chance for individual Class members to obtain relief.
Moreover, duplicative individual litigation of the complex legal and factual controversies
presented in this Complaint would increase the delay and expense to all parties and impose a
tremendous burden on the courts. By contrast, a class action would reduce the burden of case
management and advance the interests of judicial economy, speedy justice, and uniformity of
decisions.
FIRST CAUSE OF ACTION
STRICT PRODUCT LIABILITY
165.
Plaintiff incorporates the foregoing allegations as if fully set forth herein.
166.
Facebook makes its social media product widely available to users around the world.
167.
Facebook designed its system and the underlying algorithms in a manner that
rewarded users for posting, and thereby encouraged and trained them to post, increasingly
extreme and outrageous hate speech, misinformation, and conspiracy theories attacking
particular groups.
168.
The design of Facebook’s algorithms and product resulted in the proliferation and
intensification of hate speech, misinformation, and conspiracy theories attacking the Rohingya in
Burma, radicalizing users, causing injury to Plaintiff and the Class, as described above.
Accordingly, through the design of its algorithms and product, Facebook (1) contributed to the
development and creation of such hate speech and misinformation and (2) radicalized users,
causing them to tolerate, support, and even participate in the persecution of and ethnic violence
against Plaintiff and the Class.
169.
Because (1) the persecution of the Rohingya by the military government was
widely known before Facebook launched its product in Burma and (2) Facebook was repeatedly
warned after the launch that hate speech and misinformation on the system was likely to result in
ethnic violence, Facebook knew and had reason to expect that the Myanmar military and non-
Rohingya civilians would engage in violence and commit atrocities against Plaintiff and the
Class.
170.
Moreover, the kind of harm resulting from the ethnic violence committed by the
Myanmar military and their non-Rohingya supporters is precisely the kind of harm that could
have been reasonably expected from Facebook’s propagation and prioritization of anti-Rohingya
hate speech and misinformation on its system—e.g., wrongful death, personal injury, pain and
suffering, emotional distress, and property loss.
171.
The dangers inherent in the design of Facebook’s algorithms and product
outweigh the benefits, if any, afforded by that design.
172.
Plaintiff and the Class are entitled to actual damages proximately caused by the
defective design of Facebook’s algorithms and system.
173.
Plaintiff and the Class are further entitled to punitive damages caused by
Facebook’s failure to correct or withdraw its algorithms and product after Facebook knew about
their defects.
SECOND CAUSE OF ACTION
NEGLIGENCE
174.
Plaintiff incorporates the foregoing allegations as if fully set forth herein.
175.
When operating in Burma—as everywhere—Facebook had a duty to use
reasonable care to avoid injuring others.
176.
Facebook breached this duty by—among other things—negligently designing its
algorithms to fill Burmese users’ News Feeds (especially users particularly susceptible to such
content) with disproportionate amounts of hate speech, misinformation, and other content
dangerous to Plaintiff and the Class; negligently contributing to the creation of hate speech,
misinformation, and other content dangerous to Plaintiff and the Class by rewarding (and thus
encouraging) users to post ever more extreme content; negligently failing to remove such
dangerous content from its system after having been repeatedly warned of the potential for such
content to incite violence; negligently making connections between and among violent
extremists and susceptible potential violent actors; and negligently allowing users to use
Facebook in a manner that Facebook knew or should have known would create an unreasonable
risk to Plaintiff and the Class.
177.
Because (1) the persecution of the Rohingya by the military government was
widely known before Facebook launched its product in Burma and (2) Facebook was repeatedly
warned after the launch that hate speech and misinformation on the system was likely to result in
ethnic violence, Facebook knew and had reason to expect that the proliferation of such content
on its system could incite and facilitate violence and atrocities by the Myanmar military and non-
Rohingya civilians against Plaintiff and the Class.
178.
Moreover, the kind of harm resulting from the ethnic violence committed by the
Myanmar military and their non-Rohingya supporters is precisely the kind of harm that could
have been reasonably expected from Facebook’s negligent propagation and prioritization of anti-
Rohingya hate speech and misinformation on its system—e.g., wrongful death, personal injury,
pain and suffering, emotional distress, and property loss.
179.
Facebook’s acts and omissions in breach of its duty of care were a proximate
cause of the persecution of and ethnic violence against—and resulting injuries to—Plaintiff and
the Class.
180.
Plaintiff and the Class are entitled to actual damages proximately caused by
Facebook’s negligent design and operation of its algorithms and product.
181.
Plaintiff and the Class are further entitled to punitive damages caused by
Facebook’s failure to correct or withdraw its algorithms and system after Facebook knew about
their defects.
PRAYER FOR RELIEF
WHEREFORE, Plaintiff Jane Doe, on behalf of herself and the Class, respectfully requests
that this Court enter an Order:
A.
Certifying the case as a class action on behalf of the Class, as defined above,
appointing Plaintiff Jane Doe as representative of the Class, and appointing her counsel as Class
Counsel;
B.
Declaring that Defendant is strictly liable for defects, as described above, in its
algorithms and system; and that Defendant, as described above, acted negligently;
C.
Awarding the Class compensatory damages for wrongful death, personal injury,
pain and suffering, emotional distress, and loss of property, in the amount of at least $150 billion;
D.
Awarding Plaintiff and the Class punitive damages in an amount to be determined at trial;
E.
Awarding Plaintiff and the Class their reasonable litigation expenses and attorneys’ fees;
F.
Awarding the Plaintiff and the Class pre- and post-judgment interest, to the extent
allowable; and
G.
Awarding such other and further relief as equity and justice may require.
JURY TRIAL
Plaintiff demands a trial by jury for all issues so triable.
Respectfully Submitted,
JANE DOE, individually and on behalf of all
others similarly situated,
Dated: December 6, 2021
By:
One of Plaintiff’s Attorneys
Rafey S. Balabanian (SBN 315962)
rbalabanian@edelson.com
EDELSON PC
150 California Street, 18th Floor
San Francisco, California 94111
Tel: 415.212.9300
Fax: 415.373.9435
Jay Edelson*
jedelson@edelson.com
J. Eli Wade-Scott*
ewadescott@edelson.com
Michael Ovca*
movca@edelson.com
EDELSON PC
350 North LaSalle, 14th Floor
Chicago, Illinois Tel: 312.589.Fax: 312.589.
Richard Fields*
fields@fieldslawpllc.com
Edward Han*
edhan@fieldslawpllc.com
Martin Cunniff*
martincunniff@fieldslawpllc.com
Awista Ayazi*
ayazi@fieldslawpllc.com
FIELDS PLLC
1701 Pennsylvania Avenue, NW, Suite 200
Washington, DC 20006
Tel: 833.382.9816
Counsel for Plaintiff and the Proposed Class
*Admission pro hac vice to be sought.
CLASS ACTION COMPLAINT
Case No. __________________
PDF Page 1
PlainSite Cover Page
PDF Page 2
1
2
3
4
5
Rafey S. Balabanian (SBN 315962)
rbalabanian@edelson.com
EDELSON PC
150 California Street, 18th Floor
San Francisco, California 94111
Tel: 415.212.9300
Fax: 415.373.9435
12/6/2021
8
Richard Fields (pro hac vice admission to be sought)
fields@fieldslawpllc.com
FIELDS PLLC
1701 Pennsylvania Avenue, NW, Suite 200
Washington, DC 20006
Tel: 833.382.9816
9
Counsel for Plaintiff and the Proposed Class
6
7
10
SUPERIOR COURT OF THE STATE OF CALIFORNIA
FOR THE COUNTY OF SAN MATEO
11
21-CIV-06465
Case No. _________________
JANE DOE, individually and on behalf of all
others similarly situated,
12
13
CLASS ACTION COMPLAINT FOR:
Plaintiff,
14
(1) STRICT PRODUCT LIABILITY
(2) NEGLIGENCE
v.
15
16
META PLATFORMS, INC. (f/k/a Facebook,
Inc.), a Delaware corporation,
17
JURY DEMAND
Defendant.
18
19
20
Plaintiff Jane Doe, on behalf of herself and on behalf of a Class defined below, brings
21
this Class Action Complaint and Demand for Jury Trial against Defendant Meta Platforms, Inc.
22
(f/k/a Facebook, Inc. and d/b/a “Facebook”) 1 for compensatory damages, in excess of $150
23
billion, in addition to punitive damages in an amount to be determined at trial. Plaintiff, for her
24
Complaint, alleges as follows:
25
26
27
28
Defendant Meta Platforms, Inc. is referred to throughout this Complaint as “Meta” or
“Facebook.”
1
CLASS ACTION COMPLAINT
1
Case No. __________________
PDF Page 3
1
DETERMINING FOREIGN LAW - NOTICE
2
Plaintiff hereby gives notice that, to the extent Defendant Meta Platforms raises the
3
Communications Decency Act, 47 U.S.C. § 230, as a defense to the claims asserted below, and
4
to the extent that the Court were to find that the Communications Decency Act conflicts with
5
Burmese law, Burmese law applies. Burmese law does not immunize social media companies for
6
their role in inciting violence and contributing to genocide.
7
INTRODUCTION
8
9
1.
The Rohingya people, a Muslim minority historically living in present-day Burma
(internally renamed Myanmar following a military coup),2 number over 1 million and are the
10
largest stateless population in the world. While the Rohingya have long been the victims of
11
discrimination and persecution, the scope and violent nature of that persecution changed
12
dramatically in the last decade, turning from human rights abuses and sporadic violence into
13
terrorism and mass genocide.
14
2.
A key inflection point for that change was the introduction of Facebook into
15
Burma in 2011, which materially contributed to the development and widespread dissemination
16
of anti-Rohingya hate speech, misinformation, and incitement of violence—which together
17
amounted to a substantial cause, and perpetuation of, the eventual Rohingya genocide. A
18
stunning declaration of a former Facebook employee now turned whistleblower, states
19
“Facebook executives were fully aware that posts ordering hits by the Myanmar government on
20
the minority Muslim Rohingya were spreading wildly on Facebook…”, and that “…the issue of
21
the Rohingya being targeted on Facebook was well known inside the company for years.” This
22
23
24
25
Throughout this Complaint, “Myanmar” will be used in reference the ruling military
government, while “Burma” will be used to refer to the country itself. See U.S. Relations With
Burma, US STATE DEPARTMENT, https://www.state.gov/u-s-relations-with-burma/ (“The military
government changed the country’s name to ‘Myanmar’ in 1989. The United States government
continues to use the name ‘Burma.’”)
2
26
27
28
CLASS ACTION COMPLAINT
2
Case No. __________________
PDF Page 4
1
information, and the whistleblower’s knowledge of Facebook’s lack of response, led this person
2
to conclude: “I, working for Facebook, had been a party to genocide.”3
3
3.
For years, the Myanmar military, along with the support of civilian terrorists in
4
the majority Buddhist population, have treated the Rohingya as less than human, limiting their
5
rights, restricting their movements, and committing widespread human rights violations. While
6
various incidents of violence occurred periodically for years, nothing could prepare the
7
Rohingya, or the international community, for what was to come after Facebook entered the
8
picture in 2012.
9
4.
Following confrontations on the Rakhine State border, the Myanmar military, and
10
its civilian conspirators, now armed with Facebook to organize and spread terror, escalated their
11
brutal crackdown, carrying out violent acts of ethnic cleansing that defy comprehension.
12
5.
In the ensuing months and years, tens of thousands of Rohingya were brutally
13
murdered, gang raped, and tortured. Men, women, and children were burned alive inside their
14
homes and schools. Family members were tortured, raped, and killed in front of each other. More
15
than ten thousand lost their lives, while hundreds of thousands were brutalized, maimed, and
16
bore witness to indescribable violence and misery that they will carry with them for the rest of
17
their lives. Families were destroyed, childhoods were lost, lives were ruined, and entire
18
communities were erased from the face of the earth.
19
6.
As this wave of violence persisted with little end in sight, hundreds of thousands
20
of Rohingya fled their home country and sought refuge around the world. The vast majority of
21
those refugees ended up, and still live, in Bangladesh in what is now the largest refugee camp in
22
the world. Over ten thousand individuals, including Plaintiff, eventually arrived in the United
23
States and many are living here under refugee status.
24
25
26
27
28
3
Craig Timberg, New whistleblower claims Facebook allowed hate, illegal activity to go
unchecked, THE WASHINGTON POST (Oct. 22, 2021),
https://www.washingtonpost.com/technology/2021/10/22/facebook-new-whistleblowercomplaint/.
CLASS ACTION COMPLAINT
3
Case No. __________________
PDF Page 5
1
7.
The Rohingya people who are left in Burma live under constant threat of arrest,
2
violence, abuse, and discrimination. Those who made it out, too, live in fear for themselves and
3
their loved ones. Many Rohingya refugees around the world live in abject poverty and in highly
4
unstable situations that could change at any time depending on the political climate of the
5
country in which they now reside. Even those in the Bangladesh refugee camp are not safe: In
6
September 2021, a well-known and outspoken Rohingya community leader was murdered in the
7
camp, with many believing that the murder was carried out by supporters of the Myanmar
8
military.
9
8.
Woven throughout the years of this horrific tragedy are two constants: (1) the
10
enduring resilience of the Rohingya people and (2) the willingness of Defendant Meta to
11
knowingly facilitate the spread of anti-Rohingya hate speech, misinformation, and the
12
widespread incitement of violence against the Rohingya people.
13
9.
So deep was Facebook’s penetration into daily life in Burma and its role in the
14
out-of-control spread of anti-Rohingya content, that Marzuki Darusman, chairman of the U.N.
15
Independent International Fact-Finding Mission on Myanmar, described Facebook as having
16
played a “determining role” in the genocide. And, worst of all, it allowed the dissemination of
17
hateful and dangerous misinformation to continue for years, long after it was repeatedly put on
18
notice of the horrific and deadly consequences of its inaction.
19
10.
Amazingly (at least to those not privy to Facebook’s inner workings), Facebook
20
has long been aware that hateful, outraged, and politically extreme content (especially content
21
attacking a perceived “out-group”) is oxygen to the company’s blood. The more horrendous the
22
content, the more it generates “engagement” (a measure of users’ interaction with content on the
23
system (“likes,” “shares,” comments, etc.)). As Facebook has determined through years of study
24
and analysis: hate and toxicity fuel its growth far more effectively than updates about a user’s
25
favorite type of latte.
26
27
11.
Rather than taking what it’s learned to change its practices, Facebook made a
corporate decision to lean into the hate. Its algorithms were carefully designed to actively exploit
28
CLASS ACTION COMPLAINT
4
Case No. __________________
PDF Page 6
1
this opportunity, prioritizing divisive and polarizing content, including hate speech and
2
misinformation about targeted groups, when delivering content to users and recommending that
3
users make new connections or join new groups.
4
12.
Facebook participates in and contributes to the development and creation of
5
divisive content, including hate speech and misinformation. By ensuring that more users see and
6
respond—in the form of “likes,” “shares,” and comments—to such toxic content, Facebook’s
7
algorithms train users to post more hate speech and misinformation in order to garner more
8
attention online.
9
13. This "growth at all costs" view of Facebook's business is not speculative, or, for that matter, inconsistent with Facebook's view of itself. Facebook's Borg-like march toward further growth was best captured by one of its highest-ranking executives, Andrew Bosworth, in an internal memo circulated after a shooting death in Chicago was stunningly live-streamed on Facebook. It stated, in part:

We connect people.

That can be good if they make it positive. Maybe someone finds love. Maybe it even saves the life of someone on the brink of suicide.

So we connect more people.

That can be bad if they make it negative. Maybe it costs a life by exposing someone to bullies. Maybe someone dies in a terrorist attack coordinated on our tools.

And still we connect people.

The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good…

That's why all the work we do in growth is justified. All the questionable contact importing practices. All the subtle language that helps people stay searchable by friends. All of the work we do to bring more communication in. The work we will likely have to do in China some day. All of it.

The natural state of the world is not connected. It is not unified. It is fragmented by borders, languages, and increasingly by different products. The best products don't win. The ones everyone use win.

In almost all of our work, we have to answer hard questions about what we believe. We have to justify the metrics and make sure they aren't losing out on a bigger picture. But connecting people. That's our imperative. Because that's what we do. We connect people.4
14. In short, Facebook sees itself, at best, as an amoral actor on the world stage, with the sole objective of growth, regardless of how it impacts its users or the world more generally. To be clear, the last five years, and in fact just the last five months, have made it abundantly clear that Facebook's path to promote the very worst of humanity was not the result of a bug, but rather a carefully designed feature.5
15. The manifestation of this can be seen in nearly everything Facebook does. For example:

• Before and after the 2020 election, it failed to stop mass publication and reposting of misinformation about the legitimacy of the election and the subsequent calls for violence that culminated in the January 6th attack on our nation's Capitol;

• Facebook has known about human traffickers using its system for years, but only after "[i]t got so bad that in 2019" that Apple threatened to pull Facebook and Instagram's access to the App Store did "Facebook employees rush[] to take down problematic content and make emergency policy changes [to] avoid what they described as a 'potentially severe' consequence for the business."6;

• In another ongoing example, throughout the COVID-19 global pandemic, Facebook has been a constant vehicle for the mass distribution of misinformation on COVID, masks, and vaccines; and
4 Ryan Mac, Growth At Any Cost: Top Facebook Executive Defended Data Collection In 2016 Memo — And Warned That Facebook Could Get People Killed, BUZZFEED, https://www.buzzfeednews.com/article/ryanmac/growth-at-any-cost-top-facebook-executivedefended-data (emphasis added).

5 In most companies, the total disregard shown for the human toll of corporate action would have been met with termination; here, however, Andrew Bosworth not only stayed employed, but was in fact placed in charge of (and became a chief spokesman for) arguably the company's largest and most aggressive expansion ever: the "Metaverse." See Kurt Wagner, Who's Building Facebook's Metaverse? Meet CTO Andrew Bosworth, BLOOMBERG (Oct. 27, 2021), https://www.bloomberg.com/news/articles/2021-10-27/facebook-fb-new-cto-andrewbosworth-is-the-man-building-the-metaverse.

6 Clare Duffy, Facebook has known it has a human trafficking problem for years. It still hasn't fully fixed it, CNN (Oct. 25, 2021), https://www.cnn.com/2021/10/25/tech/facebookinstagram-app-store-ban-human-trafficking/index.html.
• Disturbingly, whistleblower Frances Haugen shed light on Facebook's knowledge that its websites, including both Facebook and Instagram, led to mental health and body-image issues, and in some cases, eating disorders and suicidal thoughts, in teens. Yet, Facebook's own internal research also showed that the more that teenagers had these thoughts and emotions, the more they used the app. So, it did nothing to protect the millions of children viewing its content daily and maintained the status quo.
16. The clear underlying message of the Bosworth memo above, as well as these examples, is one of sacrifice: that the victims of a terrorist attack can be sacrificed for Facebook's growth; that an innocent child who takes her own life because she is bullied can be sacrificed for Facebook's growth; that democracy can be sacrificed for Facebook's growth; that the mental and physical health of children can be sacrificed for Facebook's growth; that the prevention of a global pandemic can be sacrificed for Facebook's growth; and, as will be fully described herein, that an entire ethnic population can be sacrificed for Facebook's relentless growth.
17. Because Facebook's algorithms recommend that susceptible users join extremist groups, where users are conditioned to post even more inflammatory and divisive content, Facebook is naturally open to exploitation by autocratic politicians and regimes. By using large numbers of fake accounts (which Facebook not only fails to police but actually likes, because they inflate the user data Facebook presents to the financial markets), these regimes can repeatedly post, like, share, and comment on content attacking ethnic minorities or political opponents. Because that content appears to generate high engagement, Facebook's algorithms prioritize it in the News Feeds of real users.
18. As such, Facebook's arrival in Burma provided exactly what the military and its civilian terrorists were praying for. Beginning around 2011, Facebook arranged for tens of millions of Burmese to gain access to the Internet for the first time, exclusively through Facebook. This resulted in a "crisis of digital literacy," leaving these new users blind to the prevalence of false information online. Facebook did nothing, however, to warn its Burmese users about the dangers of misinformation and fake accounts on its system, or to take any steps to restrict its vicious spread.
19. The brutal and repressive Myanmar military regime employed hundreds of people, some posing as celebrities, to operate fake Facebook accounts and to generate hateful and dehumanizing content about the Rohingya.
20. Anti-Rohingya content thereafter proliferated throughout the Facebook product for years. Human rights and civil society groups have collected thousands of examples of Facebook posts likening the Rohingya to animals, calling for Rohingya to be killed, describing the Rohingya as foreign invaders, and falsely accusing Rohingya of heinous crimes.
21. It was clearly foreseeable, and indeed known to Facebook, that, by prioritizing and rewarding users for posting dangerous and harmful content online—as well as by recommending extremist groups and allowing fake accounts created by autocrats to flourish on its system—Facebook would radicalize users in Burma, causing them to then support or engage in dangerous or harmful conduct in the offline world.
22. Despite having been repeatedly alerted between 2013 and 2017 to the vast quantities of anti-Rohingya hate speech and misinformation on its system, and the violent manifestation of that content against the Rohingya people, Facebook barely reacted and devoted scant resources to addressing the issue.
23. The resulting Facebook-fueled anti-Rohingya sentiment motivated and enabled the military government of Myanmar to engage in a campaign of ethnic cleansing against the Rohingya. To justify and strengthen its hold on power, the government cast, by and through Facebook, the Rohingya as foreign invaders from whom the military was protecting the Burmese people. Widespread anger toward, and fear of, the Rohingya made it possible for the government to enhance its own popularity by persecuting the Rohingya. Meanwhile, few Burmese civilians objected to the attendant human rights abuses and eventual acts of genocide; indeed, as described herein, many civilians actively participated in atrocities committed against the Rohingya.
24. With the way cleared by Facebook, the military's campaign of ethnic cleansing culminated with "clearance operations" that began in August 2017. Security forces, accompanied by civilian death squads armed with long swords, attacked dozens of Rohingya villages. More than ten thousand Rohingya men, women, and children died by shooting, stabbing, burning, or drowning. Thousands of others were tortured, maimed, and raped. Whole villages were burned to the ground. More than 700,000 Rohingya eventually fled to squalid, overcrowded refugee camps in Bangladesh.
25. Not until 2018—after the damage had been done—did Facebook executives, including CEO Mark Zuckerberg and COO Sheryl Sandberg, meekly admit that Facebook should and could have done more to prevent what the United Nations has called "genocide" and a "human rights catastrophe." Facebook's underwhelming response failed to capture even a scintilla of the gravity of what it had done and the role it played, stating "we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more."7
26. The second part of its efforts to "do more" was to launch the virtual-reality-centric "Metaverse" to further force itself into the lives of billions. As noted by prominent political commentator Dan Pfeiffer:

Facebook is one of the least liked, least trusted companies on the planet. They are in the middle of a massive scandal about their involvement in genocide, human trafficking, and disinformation. And their next move is to say: "What if you could live inside Facebook?"8
27. Still, years after its initial tepid admission of negligence, former Facebook employee and now prolific whistleblower Frances Haugen stated that "[t]he company's leadership knows how to make Facebook and Instagram safer but won't make the necessary changes because they have put their astronomical profits before people."9 Notably, in litigation pending before the International Court of Justice stemming from the Rohingya genocide, Facebook is at this very moment taking aggressive measures to conceal evidence of its involvement.10

7 Alex Warofka, An Independent Assessment of the Human Rights Impact of Facebook in Myanmar, FACEBOOK NEWSROOM (Nov. 5, 2018), https://about.fb.com/news/2018/11/myanmarhria/.

8 See @DanPfeiffer, TWITTER (Oct. 28, 2021, 3:24 PM), https://twitter.com/danpfeiffer/status/1453819894487674899.

9 Abram Brown, Facebook 'Puts Astronomical Profits Over People,' Whistle-Blower Tells Congress, FORBES (Oct. 5, 2021), https://www.forbes.com/sites/abrambrown/2021/10/05/facebook-will-likely-resume-work-oninstagram-for-kids-whistleblower-tells-congress/?sh=7385d0f74cda.
28. Perhaps the most damning example of Facebook's continued failure in Burma is the ongoing—to this day—misinformation campaign being carried out on Facebook within the country. As reported by Reuters on November 2, 2021, the Myanmar military has

tasked thousands of soldiers with conducting what is widely referred to in the military as "information combat" … The mission of the social media drive, part of the military's broader propaganda operations, is to spread the junta's view among the population, as well as to monitor dissenters and attack them online as traitors, … "Soldiers are asked to create several fake accounts and are given content segments and talking points that they have to post" … In over 100 cases, the messages or videos were duplicated across dozens of copycat accounts within minutes, as well as on online groups, purported fan channels for Myanmar celebrities and sports teams and purported news outlets … Posts often referred to people who opposed the junta as "enemies of the state" and "terrorists", and variously said they wanted to destroy the army, the country and the Buddhist religion.11
29. At the core of this Complaint is the realization that Facebook was willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia. Successfully reaching the majority of Burmese people, and continuing to operate there now, has a negligible impact on Facebook's overall valuation and bottom line. Without the Burma market, Facebook would still be worth $1 trillion, Mark Zuckerberg would still be one of the top ten richest people in the world, and its stock price would still be at astronomical levels.

30. In the end, there was so little for Facebook to gain from its continued presence in Burma, and the consequences for the Rohingya people could not have been more dire. Yet, in the face of this knowledge, and possessing the tools to stop it, Facebook simply kept marching forward.12 That is because, once Facebook struck the Faustian bargain that launched the company, it has had blinders on to any real calculation of the benefits to itself compared to the negative impacts it has on anyone else. Facebook is like a robot programmed with a singular mission: to grow. And the undeniable reality is that Facebook's growth, fueled by hate, division, and misinformation, has left hundreds of thousands of devastated Rohingya lives in its wake.

In a glaring example of Facebook's failure to learn from its deadly mistakes in Burma, Haugen has provided documents demonstrating that history is currently repeating itself in Ethiopia, where acts of ethnic violence are being carried out against the Tigrayan minority amidst a raging civil war, again with the help of a Facebook-fueled misinformation and hate-speech campaign. See Facebook is under new scrutiny for its role in Ethiopia's conflict, NPR (Oct. 11, 2021), https://www.npr.org/2021/10/11/1045084676/facebook-is-under-new-scrutinyfor-its-role-in-ethiopias-conflict. See also Mark Scott, Facebook did little to moderate posts in the world's most violent countries, POLITICO (Oct. 25, 2021), https://www.politico.com/news/2021/10/25/facebook-moderate-posts-violent-countries-517050 ("In many of the world's most dangerous conflict zones, Facebook has repeatedly failed to protect its users, combat hate speech targeting minority groups and hire enough local staff to quell religious sectarianism").

10 Robert Burnson, Facebook's Stance on Myanmar Genocide Records Assailed by Gambia, BLOOMBERG (Oct. 28, 2021), https://www.bloomberg.com/news/articles/2021-10-28/facebook-sstance-on-myanmar-genocide-records-assailed-by-gambia.

11 Fanny Potkin, Wa Lone, 'Information combat': Inside the fight for Myanmar's soul, REUTERS (Nov. 2, 2021), https://www.reuters.com/world/asia-pacific/information-combatinside-fight-myanmars-soul-2021-11-01.
12 Angshuman Choudhury, How Facebook Is Complicit in Myanmar's Attacks on Minorities, THE DIPLOMAT (Aug. 25, 2020), https://www.thediplomat.com/2020/08/howfacebook-is-complicit-in-myanmars-attacks-on-minorities/ ("why would Facebook favor the regime in Myanmar? For the same reason it would do so in India: to protect business interests in a domestic market that it currently dominates by a wide margin. Imposing bans on government- or military-linked accounts could dilute this monopoly by drawing the ire of state regulators.") In 2020, Facebook similarly bowed to the demands of the Communist Vietnamese government to "censor posts with anti-state language rather than risk losing an estimated $1 billion in annual revenue from the country." Peter Wade, Facebook Bowed to Vietnam Government's Censorship Demands: Report, ROLLING STONE (Oct. 25, 2021), https://www.rollingstone.com/politics/politics-news/facebook-vietnam-censorship-1247323/.

PARTIES

31. Plaintiff Jane Doe is a natural person and a Rohingya Muslim refugee. Plaintiff resides in Illinois.

32. Meta Platforms, Inc. is a corporation organized and existing under the laws of the State of Delaware, with its principal place of business at 1 Hacker Way, Menlo Park, California 94025. Until October 2021, Defendant Meta was known as Facebook, Inc. Meta Platforms does business in this County, the State of California, and across the United States.

JURISDICTION AND VENUE

33. This Court has jurisdiction over this action pursuant to Article VI, Section 10, of the California Constitution and Cal. Code Civ. Proc. § 410.10.

34. This Court has personal jurisdiction over Defendant because its principal place of business is located within this County. Plaintiff submits to the jurisdiction of the Court.

35. Venue is proper in this Court under Cal. Code Civ. P. § 395(a) because Defendant resides in this County.
FACTUAL BACKGROUND

I. The Defective Design of Facebook's Algorithms and Services

A. Facebook Designed Its Social Network to Maximize Engagement
36. Facebook's goal is to maximize "engagement," a metric reflecting the amount of time a user spends and the amount of interaction ("likes," "shares," comments, etc.) that the user has with any given content. For Facebook, engagement determines advertising revenue, which determines profits. "The prime directive of engagement … is driven by monetization. It befits a corporation aiming to accelerate growth, stimulate ad revenue, and generate profits for its shareholders."13
37. In its SEC Form 10-K for the year ended December 31, 2012, Facebook warned:

if our users decrease their level of engagement with Facebook, our revenue, financial results, and business may be significantly harmed. The size of our user base and our users' level of engagement are critical to our success…. [O]ur business performance will become increasingly dependent on our ability to increase levels of user engagement and monetization…. Any decrease in user retention, growth, or engagement could render Facebook less attractive to developers and marketers, which may have a material and adverse impact on our revenue, business, financial condition, and results of operations. … Our advertising revenue could be adversely affected by a number of … factors, including: decreases in user engagement, including time spent on Facebook[.]14
38. Accordingly, Facebook intentionally incorporated engagement-based ranking of content into its system and the algorithms that drive it. Facebook's News Feed—the first thing that users see when opening up the app or entering the site and "the center of the Facebook experience"—is driven by engagement. Posts with higher engagement scores are included and prioritized in the News Feed, while posts with lower scores are buried or excluded altogether. "[T]he Feed's … logics can be understood through a design decision to elevate and amplify 'engaging' content…. [T]he core logic of engagement remains baked into the design of the Feed at a deep level."15

13 Luke Munn, Angry by design: toxic communication and technical architectures, HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-02000550-7.

14 U.S. Securities and Exchange Commission Form 10-K, Facebook, Inc. (fiscal year ended Dec. 31, 2012) ("Facebook 2012 10-K") at 13, 14, https://www.sec.gov/Archives/edgar/data/1326801/000132680113000003/fb12312012x10k.htm#s5D6A63A4BB6B6A7AD01CD7A5A25638E4.
39. Facebook engineers and data scientists meet regularly to assess the billions of likes, comments and clicks Facebook users make every day to "divine ways to make us like, comment and click more," so that users will keep coming back and seeing more ads from the company's 2 million advertisers. Engineers are continually running experiments with a small share of Facebook users to boost engagement.16 Thus, Facebook's design was the "result of particular decisions made over time…. Every area has undergone meticulous scrutiny … by teams of developers and designers…. [Facebook] has evolved through conscious decisions in response to a particular set of priorities."17
40. Facebook has consistently promoted and rewarded employees who contribute to the company's growth through a relentless focus on increased engagement of Facebook users; employees who raise ethical and safety concerns tend to be ignored and marginalized and, eventually, to leave the company.18
15 Luke Munn, Angry by design: toxic communication and technical architectures, HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-02000550-7.

16 Victor Luckerson, Here's How Facebook's News Feed Actually Works, TIME (July 9, 2015), https://time.com/collection-post/3950525/facebook-news-feed-algorithm/.

17 Luke Munn, Angry by design: toxic communication and technical architectures, HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-02000550-7.

18 Katie Canales, 'Increasingly gaslit': See the messages concerned Facebook employees wrote as they left the company, BUSINESS INSIDER (Oct. 28, 2021), https://www.businessinsider.com/facebook-papers-employee-departure-badge-post-gaslitburned-out-2021-10 ("[t]he employee said Facebook's infamous growth-first approach leads to rolling out 'risky features.' If employees propose reversing that risk, they're seen as being 'growth-negative, and veto'd by decision makers on those grounds,' they said. They also said it's difficult to establish 'win/wins,' or to roll out features that promote both safety and growth").
B. Facebook Prioritizes Hate Speech and Misinformation to Increase User Engagement

41. Facebook knows that the most negative emotions—fear, anger, hate—are the most engaging. Facebook employs psychologists and social scientists as "user researchers" to analyze its users' behavior in response to online content. An internal Facebook presentation by one such researcher, leaked in May 2020, warned: "Our algorithms exploit the human brain's attraction to divisiveness…. If left unchecked, … [Facebook would feed users] more and more divisive content in an effort to gain user attention & increase time on the platform."19
42. To maximize engagement, Facebook does not merely fill users' News Feeds with disproportionate amounts of hate speech and misinformation; it employs a system of social rewards that manipulates and trains users to create such content. When users post content, other users who are shown that content are prompted to "like," "comment" on, or "share" it. Under each piece of content, users can see how many times others have liked or shared that content and can read the comments. See Figure 1.

(Figure 1.)
43. A study published in February 2021 confirmed that, "[i]n online social media platforms, feedback on one's behavior often comes in the form of a 'like'—a signal of approval from another user regarding one's post" and tested the assumption that likes "function as a social reward."20

19 Jeff Horwitz, Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the Site Less Divisive, WALL STREET JOURNAL (May 26, 2020), https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixedsolutions-11590507499.
44. Roger McNamee, an early investor in Facebook and advisor to Mark Zuckerberg, wrote in his New York Times bestseller, "Zucked: Waking Up to the Facebook Catastrophe":

Getting a user outraged, anxious, or afraid is a powerful way to increase engagement. Anxious and fearful users check the site more frequently. Outraged users share more content to let other people know what they should also be outraged about. Best of all from Facebook's perspective, outraged or fearful users in an emotionally hijacked state become more reactive to further emotionally charged content. It is easy to imagine how inflammatory content would accelerate the heart rate and trigger dopamine hits.21
45. A Nature article published in 2020 further explained:

[I]ncendiary, polarizing posts consistently achieve high engagement…. This content is meant to draw engagement, to provide a reaction….

This divisive material often has a strong moral charge. It takes a controversial topic and establishes two sharply opposed camps, championing one group while condemning the other. These are the headlines and imagery that leap out at a user as they scroll past, forcing them to come to a halt. This offensive material hits a nerve, inducing a feeling of disgust or outrage. "Emotional reactions like outrage are strong indicators of engagement…. [T]his kind of divisive content will be shown first, because it captures more attention than other types of content." …

The design of Facebook means that … forwarding and redistribution is only a few clicks away…. Moreover, the networked nature of social media amplifies this single response, distributing it to hundreds of friends and acquaintances. They too receive this incendiary content and they too share, inducing … "outrage cascades — viral explosions of moral judgment and disgust." Outrage does not just remain constrained to a single user, but proliferates, spilling out to provoke other users and appear in other online environments.22

20 Björn Lindström, Martin Bellander, David T. Schultner, Allen Chang, Philippe N. Tobler, David M. Amodio, A computational reward learning account of social media engagement, NATURE COMMUNICATIONS 12, Art. No. 1311 (Feb. 26, 2021), https://www.nature.com/articles/s41467-020-19607x#:~:text=%20A%20computational%20reward%20learning%20account%20of%20social,our%20hypothesis%20that%20online%20social%20behavior%2C…%20More%20.

21 Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 88 (Penguin 2020 ed.).
46. Facebook knew that it could increase engagement and the length of time users spend on its websites (and subsequently increase its revenue) by adjusting its algorithms to manipulate users' News Feeds and show them more negative content, thus causing "massive-scale emotional contagion." In 2014, Adam Kramer, a member of Facebook's "Core Data Science Team," co-authored an article describing one of the experiments that Facebook conducted on its own users, stating,

we test whether emotional contagion occurs outside of in-person interaction between individuals by reducing the amount of emotional content in the News Feed … Which content is shown or omitted in the News Feed is determined via a ranking algorithm that Facebook continually develops and tests in the interest of showing viewers the content they will find most relevant and engaging. One such test is reported in this study: A test of whether posts with emotional content are more engaging.

***

The results show emotional contagion…. [F]or people who had positive content reduced in their News Feed, a larger percentage of words in people's status updates were negative and a smaller percentage were positive ...

These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks.23
47. Independent research unequivocally confirms that fake content thrives on Facebook over more reliable and trustworthy sources. In September 2021, the Washington Post reported on a "forthcoming peer-reviewed study by researchers at New York University and the Université Grenoble Alpes in France [which] found that from August 2020 to January 2021, news publishers known for putting out misinformation got six times the amount of likes, shares, and interactions on the [Facebook] platform as did trustworthy news sources, such as CNN or the World Health Organization."24

22 Luke Munn, Angry by design: toxic communication and technical architectures, HUMANIT SOC SCI COMMUN 7 (July 30, 2020), https://www.nature.com/articles/s41599-02000550-7.

23 Adam D.I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, Experimental evidence of massive-scale emotional contagion through social networks, 111 PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES, no. 29 (June 17, 2014), https://www.pnas.org/cgi/doi/10.1073/pnas.1320040111.
48. In testimony before Congress in September 2020, Tim Kendall, Facebook's first Director of Monetization—likening Facebook's business model to that of Big Tobacco—explained how such content makes Facebook addictive:

At Facebook, I believe we sought to mine as much human attention as possible and turn it into historically unprecedented profits. To do this, we didn't simply create something useful and fun; we took a page from Big Tobacco's playbook, working to make our offering addictive at the outset….

The next page in Big Tobacco's playbook was to add bronchodilators to cigarettes. This allowed the smoke to get in contact with more surface area of the lungs. Allowing for misinformation, conspiracy theories, and fake news to flourish were Facebook's bronchodilators.

But that incendiary content wasn't enough. Tobacco companies then added ammonia to cigarettes to increase the speed with which nicotine traveled to the brain. Facebook's ability to deliver this incendiary content to the right person, at the right time, in the exact right way—through their algorithms—that is their ammonia. And we now know it fosters tribalism and division.

Social media preys on the most primal parts of your brain; it provokes, it shocks, and it enrages….

Facebook and their cohorts worship at the altar of engagement and cast other concerns aside, raising the voices of division, anger, hate, and misinformation to drown out the voices of truth, justice, morality, and peace.25
49. Content attacking opposing groups is particularly engaging. Zeynep Tufekci, a sociologist at the University of North Carolina, has written that:

the new, algorithmic gatekeepers aren't merely (as they like to believe) neutral conduits for both truth and falsehood. They make their money by keeping people on their sites and apps; that aligns their incentives closely with those who stoke outrage, spread misinformation, and appeal to people's existing biases and preferences.

[T]he problem is that when we encounter opposing views in the age and context of social media, it's not like reading them in a newspaper while sitting alone. It's like hearing them from the opposing team while sitting with our fellow fans in a football stadium. Online, we're connected with our communities, and we seek approval from our like-minded peers. We bond with our team by yelling at the fans of the other one. In sociology terms, we strengthen our feeling of "in-group" belonging by increasing our distance from and tension with the "out-group"—us versus them…. This is why the various projects for fact-checking claims in the news, while valuable, don't convince people. Belonging is stronger than facts.26

24 Elizabeth Dwoskin, Misinformation on Facebook got six times more clicks than factual news during the 2020 election, study says, WASHINGTON POST (Sept. 4, 2021), https://www.washingtonpost.com/technology/2021/09/03/facebook-misinformation-nyu-study/.

25 Mainstreaming Extremism: Social Media's Role in Radicalizing America: Hearing Before the House Subcommittee on Consumer Protection and Commerce, 116th Congress (Sept. 24, 2020) (statement of Timothy Kendall).
50. A study published in June 2021 showed that posts attacking "others" (the "out-group") are particularly effective at generating social rewards, such as likes, shares, and comments, and that those reactions consist largely of expressions of anger:

We investigated whether out-group animosity was particularly successful at generating engagement on two of the largest social media platforms: Facebook and Twitter. Analyzing posts from news media accounts and US congressional members (n = 2,730,215), we found that posts about the political out-group were shared or retweeted about twice as often as posts about the in-group.… Out-group language consistently emerged as the strongest predictor of shares and retweets…. Language about the out-group was a very strong predictor of "angry" reactions (the most popular reactions across all datasets)…. In sum, out-group language is the strongest predictor of social media engagement across all relevant predictors measured, suggesting that social media may be creating perverse incentives for content expressing out-group animosity.27
51. Another study, published in August 2021, analyzed how "quantifiable social feedback (in the form of 'likes' and 'shares')" affected the amount of "moral outrage" expressed in subsequent posts. The authors "found that daily outrage expression was significantly and positively associated with the amount of social feedback received for the previous day's outrage expression." The amount of social feedback is, in turn, determined by the algorithms underlying the social media product:

Social media newsfeed algorithms can directly affect how much social feedback a given post receives by determining how many other users are exposed to that post. Because we show here that social feedback affects users' outrage expressions over time, this suggests that newsfeed algorithms can influence users' moral behaviors by exploiting their natural tendencies for reinforcement learning…. [D]esign choices aimed at … profit maximization via user engagement can indirectly affect moral behavior because outrage-provoking content draws high engagement….28

26 Zeynep Tufekci, How social media took us from Tahrir Square to Donald Trump, MIT TECHNOLOGY REVIEW (Aug. 14, 2018), https://technologyreview.com/2018/08/14/240325/howsocial-media-took-us-from-tahrir-square-to-donald-trump.

27 Steve Rathje, Jay J. Van Bavel, Sander van der Linden, Out-group animosity drives engagement on social media, 118 PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES (26), (June 29, 2021), https://doi.org/10.1073/pnas.2024292118.
52. In other words, if a user makes two posts—one containing hateful, outraged, and divisive content and one lacking such content—Facebook's algorithms will show the hateful, outraged, and divisive post to more users. Consequently, the hateful, outraged, and divisive post is rewarded with more likes, shares, and comments. The user quickly learns that to obtain a reaction to his or her posts, he or she should incorporate as much hateful, outraged, and divisive content as possible.
53. On October 5, 2021, Frances Haugen, a former Facebook product manager, testified before Congress:

The dangers of engagement based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment or reshare. And it's interesting because those clicks and comments and reshares aren't even necessarily for your benefit, it's because they know that other people will produce more content if they get the likes and comments and reshares. They prioritize content in your feed so that you will give little hits of dopamine to your friends, so they will create more content. And they have run experiments on people, producer side experiments, where they have confirmed this.29
54. Recently leaked documents confirm Facebook's ability, through its algorithms, to determine the type of content users post. After Facebook modified its algorithms in 2018 to boost engagement, "[t]he most divisive content that publishers produced was going viral on the platform … creating an incentive to produce more of it…. Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook." Facebook researchers further discovered that "the new algorithm's heavy weighting of reshared material in its News Feed made the angry voices louder. 'Misinformation, toxicity, and violent content are inordinately prevalent among reshares,' researchers noted in internal memos." Facebook data scientists suggested "a number of potential changes to curb the tendency of the overhauled algorithm to reward outrage and lies" but "Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective—making users engage more with Facebook."30

28 William J. Brady, Killian McLoughlin, Tuan N. Doan, Molly J. Crockett, How social learning amplifies moral outrage expression in online social networks, 7 SCIENCE ADVANCES, no. 33 (Aug. 13, 2021), https://www.science.org/doi/10.1126/sciadv.abe5641. Posts were classified as containing moral outrage or not using machine learning.

29 Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use: Full Senate Hearing Transcript, REV (Oct. 5, 2021), https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-onchildren-social-media-use-full-senate-hearing-transcript.
55. In October 2021, NBC News described, based on internal documents leaked by Frances Haugen, an experiment in which an account created by Facebook researchers experienced "a barrage of extreme, conspiratorial, and graphic content"—even though the fictitious user had never expressed interest in such content. For years, Facebook "researchers had been running [similar] experiments … to gauge the platform's hand in radicalizing users, according to the documents seen by NBC News," and among Haugen's disclosures are "research, reports and internal posts that suggest Facebook has long known its algorithms and recommendation systems push some users to extremes."31
56. It is not surprising that the true nature of Facebook's algorithms has become fully apparent only through leaked documents and whistleblower testimony, since Facebook goes to great lengths to hinder outside academic research regarding the design of those algorithms. In a congressional hearing entitled "The Disinformation Black Box: Researching Social Media Data" on September 28, 2021, three social media researchers testified about Facebook's attempts to block their access to the data they needed:

• Laura Edelson of New York University testified: "this summer, Facebook cut off my team's access to their data. We used that very data to support the finding in our recent study that posts from misinformation sources on Facebook got six times more engagement than factual news during the 2020 elections, to identify multiple security and privacy vulnerabilities that we have reported to Facebook, and to audit Facebook's own, public-facing Ad Library for political ads."32

• Alan Mislove, a Professor of Computer Sciences at Northeastern University, testified: "Facebook recently criticized a study on misinformation by saying it focused on who engages with content and not who sees it—but that's only true because Facebook does not make such impression data available to researchers."33

• Kevin T. Leicht, a Professor of Sociology at the University of Illinois Urbana-Champaign, testified: "there are limited amounts of social media data available due to company restrictions placed on that data. Many researchers fear litigation that may result from analyzing and publishing results from these data."34

30 Keach Hagey, Jeff Horwitz, Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead, WALL STREET JOURNAL (Sept. 15, 2021), https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215.

31 Brandy Zadrozny, "Carol's Journey": What Facebook knew about how it radicalizes users, NBC NEWS (Oct. 22, 2021), https://www.nbcnews.com/tech/tech-news/facebook-knewradicalized-users-rcna3581.
57. On October 5, 2021, Haugen testified before Congress:

[N]o one truly understands the destructive choices made by Facebook except Facebook….

A company with such frightening influence over so many people, over their deepest thoughts, feelings, and behavior, needs real oversight. But Facebook's closed design means it has no real oversight. Only Facebook knows how it personalizes your Feed for you.

At other large tech companies like Google, any independent researcher can download from the Internet the company's search results and write papers about what they find. And they do. But Facebook hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system….35

32 Hearing on The Disinformation Black Box: Researching Social Media Data before the Subcomm. on Oversight, 117th Cong. (2021) (testimony of Laura Edelson, NYU Cybersecurity for Democracy), https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117SY21-Wstate-EdelsonL-20210928.pdf.

33 Hearing on The Disinformation Black Box: Researching Social Media Data before the Subcomm. on Oversight, 117th Cong. (2021) (testimony of Alan Mislove, Professor of Computer Sciences at Northeastern University), https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117-SY21-WstateMisloveA-20210928.pdf.

34 Hearing on The Disinformation Black Box: Researching Social Media Data before the Subcomm. on Oversight, 117th Cong. (2021) (testimony of Kevin T. Leicht, a Professor of Sociology at University of Illinois Urbana-Champaign), https://www.congress.gov/117/meeting/house/114064/witnesses/HHRG-117-SY21-WstateLeichtK-20210928.pdf.
58. Nevertheless, it is now clear that, by modifying the design of its algorithms and system, Facebook can influence and manipulate the quantity, substance, and emotional tone of the content its users produce. Through its dopamine-based incentive structure of social rewards and cues, as well as its algorithmic promotion of hate speech and misinformation, Facebook contributes to and participates in the development and creation of outraged, extreme, and divisive content.
59. It's obviously not in Facebook's favor—especially its bottom line—to curb the spread of negative content and adjust its algorithm to promote positive content. One designer and technologist proposed four different interventions to address the "problems of polarization, dehumanization, and outrage, three of the most dangerous byproducts" of tools such as Facebook. The four interventions described in the article include "Give Humanizing Prompts," "Picking out unhealthy content with better metrics," "Filter unhealthy content by default," and "Give users feed control." Facebook had not implemented any such interventions, undoubtedly because, as the author admitted, the interventions "will all likely result in short-term reductions in engagement and ad revenue."36
60. Facebook has options for moderating its algorithms' tendency to promote hate speech and misinformation, but it rejects those options because the production of more engaging content takes precedence. In a September 2021 article, based on recently leaked internal documents, the Wall Street Journal described how Facebook had modified its News Feed algorithm "to reverse [a] decline in comments, and other forms of engagement, and to encourage more original posting" by users.37

35 Facebook Whistleblower Frances Haugen Testifies on Children & Social Media Use: Full Senate Hearing Transcript, REV (Oct. 5, 2021), https://www.rev.com/blog/transcripts/facebook-whistleblower-frances-haugen-testifies-onchildren-social-media-use-full-senate-hearing-transcript.

36 Tobias Rose-Stockwell, Facebook's problems can be solved with design, QUARTZ (Apr. 30, 2018) (emphases in original), https://qz.com/1264547/facebooks-problems-can-be-solvedwith-design/.
61. Simply put, it is clear—based largely on admissions from former Facebook executives—that Facebook's algorithms are not "neutral." The algorithms do not merely recommend content based on users' previously expressed interests; rather, to maximize engagement, they are heavily biased toward promoting content that will enrage, polarize, and radicalize users. Facebook does not simply "connect" people with similar interests; it exploits the universal human instinct for tribalism by actively herding people into groups that define themselves through their violent opposition to "other" people—often identified by race, religion, or political ideology.
C. Facebook Curates and Promotes Extremist Group Content

62. Facebook's algorithms curate and promote content that attracts new members to extremist groups. A presentation by a researcher employed at Facebook, which was leaked in 2020, showed that Facebook's algorithms were responsible for the growth of German extremist groups on the website: "The 2016 presentation states that '64% of all extremist group joins are due to our recommendation tools' and that most of the activity came from the platform's 'Groups You Should Join' and 'Discover' algorithms. 'Our recommendation systems grow the problem.'" Ultimately, however, because "combating polarization might come at the cost of lower engagement … Mr. Zuckerberg and other senior executives largely shelved the basic research … and weakened or blocked efforts to apply its conclusions to Facebook products."38
63. Roger McNamee gave this example:

[I]f I am active in a Facebook Group associated with a conspiracy theory and then stop using the platform for a time, Facebook will do something surprising when I return. It may suggest other conspiracy theory Groups to join…. And because conspiracy theory Groups are highly engaging, they are very likely to encourage reengagement with the platform. If you join the Group, the choice appears to be yours, but the reality is that Facebook planted the seed. It does so not because conspiracy theories are good for you but because conspiracy theories are good for them.39

McNamee described how, in 2016, he had raised his concerns with Mark Zuckerberg and Sheryl Sandberg, to no avail.40

37 Keach Hagey, Jeff Horwitz, Facebook Tried to Make Its Platform a Healthier Place. It Got Angrier Instead, THE WALL STREET JOURNAL (Sept. 15, 2021), https://www.wsj.com/articles/facebook-algorithm-change-zuckerberg-11631654215.

38 Jeff Horwitz, Deepa Seetharaman, Facebook Executives Shut Down Efforts to Make the Site Less Divisive, WALL STREET JOURNAL (May 26, 2020), https://www.wsj.com/articles/facebook-knows-it-encourages-division-top-executives-nixedsolutions-11590507499.
64. In the August 2021 study discussed above, the authors stated: "[U]sers conform to the expressive norms of their social network, expressing more outrage when they are embedded in ideologically extreme networks where outrage expressions are more widespread…. Such norm learning processes, combined with social reinforcement learning, might encourage more moderate users to become less moderate over time, as they are repeatedly reinforced by their peers for expressing outrage."41
65. Indeed, the positive feedback loop created by Facebook in the form of "likes," "comments," and "shares" drives user engagement with extremist content and rewards user participation in creating such content. Together with algorithms promoting hate speech, misinformation, and conspiracy theories, Facebook has steered users to extremist groups and trained those users to express more outrage.
D. Exploitation by Autocrats

66. Facebook's system and algorithms are also susceptible to exploitation by unscrupulous and autocratic politicians and regimes. In his book, McNamee wrote:

Facebook's culture, design goals, and business priorities made the platform an easy target for bad actors, which Facebook aggravated with algorithms and moderation policies that amplified extreme voices. The architecture and business model that make Facebook successful also make it dangerous. Economics drive the company to align—often unconsciously—with extremists and authoritarians to the detriment of democracy around the world.42

39 Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 94-95 (Penguin 2020 ed.).

40 Id. at 4-7.

41 William J. Brady, Killian McLoughlin, Tuan N. Doan, Molly J. Crockett, How social learning amplifies moral outrage expression in online social networks, 7 SCIENCE ADVANCES, no. 33 (Aug. 13, 2021), https://www.science.org/doi/10.1126/sciadv.abe5641.
67. Facebook had the ability to detect and deactivate counterfeit accounts used by authoritarian politicians and regimes to generate "fake engagement" but devoted minimal resources to that task. In April 2021, Sophie Zhang, a data scientist whom Facebook had fired a year earlier, spoke out about having "found multiple blatant attempts by foreign national governments to abuse our platform on vast scales to mislead their own citizenry…." For example, "[o]ver one six-week period from June to July 2018, [the president of Honduras]'s Facebook posts received likes from 59,100 users, more than 78% of which were not real people." Such "fake engagement can influence how that content performs in the all-important news feed algorithm; it is a kind of counterfeit currency in Facebook's attention marketplace."43
68. It took Facebook almost a year to remove fake accounts associated with "domestic-focused coordinated inauthentic activity in Honduras" and, when Zhang "found that the Honduras network was reconstituting … there was little appetite from [Facebook] to take it down again." Before she was fired, Zhang alerted Facebook to networks of fake Pages supporting political leaders in Albania, Azerbaijan, Mexico, Argentina, Italy, the Philippines, Afghanistan, South Korea, Bolivia, Ecuador, Iraq, Tunisia, Turkey, Taiwan, Paraguay, El Salvador, India, the Dominican Republic, Indonesia, Ukraine, Poland, and Mongolia. Some of these networks were investigated while others "languish[ed] for months without action."44
69. Zhang gave one example that was especially reminiscent of the situation in Burma:

Of all the cases of inauthentic behavior that Zhang uncovered, the one that most concerned her—and that took the longest to take down—was in Azerbaijan. It was one of the largest she had seen, and it was clearly being used to prop up an authoritarian regime with an egregious record on human rights.

The Azerbaijani network used the same tactic that was seen in Honduras—thousands of Facebook Pages set up to look like user accounts—but instead of creating fake likes, the Pages were used to harass. Over one 90-day period in 2019, it produced approximately 2.1m negative, harassing comments on the Facebook Pages of opposition leaders and independent media outlets, accusing them of being traitors and praising the country's autocratic leader, President Ilham Aliyev, and his ruling party, the YAP.

Facebook did not employ a dedicated policy staffer or market specialist for Azerbaijan, and neither its eastern European nor Middle Eastern policy teams took responsibility for it. Eventually Zhang discovered that the Turkey policy team was supposed to cover the former Soviet republic, but none of them spoke Azeri or had expertise in the country. As of August 2020, Facebook did not have any full-time or contract operations employees who were known to speak Azeri, leaving staff to use Google Translate to try to understand the nature of the abuse.

Facebook did not take down those fake accounts or Pages until more than a year after Zhang reported them.45

42 Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 232-33 (Penguin 2020 ed.).

43 Julia Carrie Wong, How Facebook let fake engagement distort global politics: a whistleblower's account, THE GUARDIAN (Apr. 12, 2021), https://theguardian.com/technology/2021/apr/12/facebook-fake-engagement-whistleblowersophie-zhang.

44 Id.
E. Facebook's Algorithm Has Successfully Radicalized Its Users

70. By prioritizing hate speech and misinformation in users' News Feeds, training users to produce ever more extreme and outraged content, recommending extremist groups, and allowing its product to be exploited by autocrats, Facebook radicalizes users and incites them to violence.
71. As Chamath Palihapitiya, Facebook's former vice president for user growth, told an audience at Stanford Business School: "I think we have created tools that are ripping apart the social fabric of how society works … [t]he short-term, dopamine-driven feedback loops we've created are destroying how society works … No civil discourse, no cooperation[,] misinformation, mistruth. And it's not an American problem…"46
45 Id.

46 James Vincent, Former Facebook exec says social media is ripping apart society, THE VERGE (Dec. 11, 2017), https://www.theverge.com/2017/12/11/16761016/former-facebookexec-ripping-apart-society.
72. McNamee likewise explained how the design of Facebook's algorithms and system leads to real-world violence: "The design of Facebook trained users to unlock their emotions, to react without critical thought…. at Facebook's scale it enables emotional contagion, where emotions overwhelm reason…. Left unchecked, hate speech leads to violence, disinformation undermines democracy."47
73. As Dipayan Ghosh, a former Facebook privacy expert, noted, "[w]e have set ethical red lines in society, but when you have a machine that prioritizes engagement, it will always be incentivized to cross those lines."48
74. Facebook's tendency to cause real-world violence by radicalizing users online has been demonstrated time and time again. A few recent examples include:

• In March 2019, a gunman killed 51 people at two mosques in Christchurch, New Zealand, while live-streaming the event on Facebook.49 For two years prior to the shooting, the gunman had been active on the Facebook group of the Lads Society, an Australian extremist white nationalist group.50

• In August 2020, "[h]ours before a 17-year-old white man allegedly killed two people and injured a third at protests over a police shooting in Kenosha, Wisconsin, a local militia group posted a call on Facebook: 'Any patriots willing to take up arms and defend our city tonight from evil thugs?'"51 Later, Mark Zuckerberg said that "the social media giant made a mistake by not removing a page and event that urged people in Kenosha … to carry weapons amid protests."52

• "In the days leading up to [the January 6, 2021] march on the Capitol, supporters of President Trump promoted it extensively on Facebook and Facebook-owned Instagram and used the services to organize bus trips to Washington. More than 100,000 users posted hashtags affiliated with the movement prompted by baseless claims of election fraud, including #StopTheSteal and #FightForTrump."53

47 Roger McNamee, Zucked: Waking Up to the Facebook Catastrophe, at 98, 233 (Penguin 2020 ed.).

48 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 185 (HarperCollins 2021).

49 Charlotte Graham-McLay, Austin Ramzy, and Daniel Victor, Christchurch Mosque Shootings Were Partly Streamed on Facebook, NEW YORK TIMES (Mar. 14, 2019), https://www.nytimes.com/2019/03/14/world/asia/christchurch-shooting-new-zealand.html.

50 Royal Commission of Inquiry into the Terrorist Attack on Christchurch Mosques on 15 March 2019 § 4.6, https://christchurchattack.royalcommission.nz/the-report/firearmslicensing/general-life-in-new-zealand/; Michael McGowan, Australian white nationalists reveal plans to recruit 'disgruntled, white male population', THE GUARDIAN (Nov. 11, 2019), https://www.theguardian.com/australia-news/2019/nov/12/australian-white-nationalists-revealplans-to-recruit-disgruntled-white-male-population.

51 Adam Mahoney, Lois Beckett, Julia Carrie Wong, Victoria Bekiempis, Armed white men patrolling Kenosha protests organized on Facebook, THE GUARDIAN (Aug. 26, 2020), https://www.theguardian.com/us-news/2020/aug/26/kenosha-militia-protest-shooting-facebook.
Prior to Facebook’s entry into Burma, as described below, Facebook was on
7
notice of the manner in which its service could influence political conflict and be used to fuel
8
real-world violence. For example, during a 2010 conflict in Kyrgyzstan, highly divisive and at
9
times violent content spread widely on Facebook, inclusive of substantial misinformation related
10
to the source and cause for ongoing violence. 54 Likewise, even in examples where Facebook has
11
been credited with supporting protests for positive political change before 2012, the consistent
12
result is that the same governments and militant groups that were opposed by the protests,
13
eventually utilized Facebook to help put down those uprisings through widespread
14
misinformation campaigns.55
15
76.
Facebook was on notice very early on in its existence that “that liberty isn't the
16
only end toward which these tools can be turned.”56 And it is with that knowledge in hand that it
17
launched in the extremely volatile environment present in Burma.
18
19
Madeleine Carlisle, Mark Zuckerberg Says Facebook’s Decision to Not Take Down
Kenosha Militia Page Was a Mistake, TIME (Aug. 29, 2020), https://time.com/5884804/markzuckerberg-facebook-kenosha-shooting-jacob-blake/.
53
Elizabeth Dwoskin, Facebook’s Sandberg deflected blame for Capitol riot, but new
evidence shows how platform played role, WASHINGTON POST (Jan. 13, 2021),
https://www.washingtonpost.com/technology/2021/01/13/facebook-role-in-capitol-protest/.
54
Neil Melvin and Tolkun Umaraliev, New Social Media and Conflict in Kyrgyzstan, SIPRI
(Aug. 2011), https://www.sipri.org/sites/default/files/files/insight/SIPRIInsight1101.pdf.
55
Nariman El-Mofty, Social Media Made the Arab Spring, But Couldn't Save It, WIRED
(Jan. 26, 2016), https://www.wired.com/2016/01/social-media-made-the-arab-spring-but-couldntsave-it/ (“These governments have also become adept at using those same channels to spread
misinformation. ‘You can now create a narrative saying a democracy activist was a traitor and a
pedophile,’ … ‘The possibility of creating an alternative narrative is one people didn’t consider,
and it turns out people in authoritarian regimes are quite good at it.’”)
56
Id.
52
20
21
22
23
24
25
26
27
28
CLASS ACTION COMPLAINT
28
Case No. __________________
PDF Page 30
II. The Introduction of Facebook Led to a Crisis of Digital Literacy in Burma

77. In addition to high engagement, continued user growth was critical to Facebook’s success. “If we fail to retain existing users or add new users, … our revenue, financial results, and business may be significantly harmed.”57 By 2012, Facebook reported 1.06 billion monthly active users (“MAUs”) with 84% of those accessing Facebook from outside the United States, meaning that there were already about 170 million MAUs in the United States—equal to more than half the U.S. population.58 To ensure continued growth, Facebook would have to gain users in developing countries, many of whom had no previous access to the Internet.

78. Prior to 2011, in an atmosphere of extreme censorship, only about 1% of the Burmese population had cell phones. That percentage grew dramatically with the liberalization that began in 2011.59 In 2013, when two foreign telecom companies were permitted to enter the market, the cost of a SIM card fell from more than $200 to as little as $2, and by 2016, nearly half the population had mobile phone subscriptions, most with Internet access.60

79. Facebook took active steps to ensure that it would have a dominant position in the emerging Burmese market. “Entering the country in 2010, Facebook initially allowed its app to be used without incurring data charges, so it gained rapid popularity. It would come pre-loaded on phones bought at mobile shops….”61

80. Facebook would eventually pursue a similar strategy for penetrating other developing markets, as reflected in its “Free Basics” product. Free Basics was “a Facebook-developed mobile app that gives users access to a small selection of data-light websites and services … [t]o deliver the service, … Facebook partners with local mobile operators … [who] agree to ‘zero-rate’ the data consumed by the app, making it free, while Facebook does the technical heavy lifting to ensure that they can do this as cheaply as possible. Each version is localized, offering a slightly different set of up to 150 sites and services…. There are no other social networking sites apart from Facebook and no email provider.”62

57 Facebook 2012 10-K, at 13, https://www.sec.gov/Archives/edgar/data/1326801/000132680113000003/fb-12312012x10k.htm#s5D6A63A4BB6B6A7AD01CD7A5A25638E4.

58 Id. at 8.

59 Report of the detailed findings of the Independent International Fact-Finding Mission on Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 17, 2018), https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (“UNHRC Report”) ¶ 1343.

60 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.

61 Saira Asher, Myanmar coup: How Facebook became the ‘digital tea shop,’ BBC NEWS (Feb. 4, 2021), https://www.bbc.com/news/world-asia-55929654.
81. One reason why Facebook gained immense traction in Burma is that “[t]he website … handles Myanmar fonts well compared to other social media like Twitter.”63 After citizens bought an inexpensive phone and a cheap SIM card, “there was one app that everybody in [Burma] wanted: Facebook. The reason? Google and some of the other big online portals didn’t support Burmese text, but Facebook did.”64

82. For the majority of Burma’s 20 million Internet-connected citizens, “Facebook is the internet…. [M]ost mobile phones sold in the country come preloaded with Facebook…. There are equal numbers of internet users and Facebook users in [Burma]. As a result, many people use Facebook as their main source of information….”65

83. A report commissioned by Facebook in 2018 described how the rapid transition of Burma from a society without modern communications infrastructure to an Internet-connected society caused “a crisis of digital literacy: A large population of internet users lacks basic understanding of how to … make judgments on online content.… Digital literacy is generally low across the country, and many people find it difficult to verify or differentiate content (for example, real news from misinformation).”66 As noted by Sarah Su, a Facebook employee who works on content safety issues on the News Feed, “[w]hat you’ve seen in the past five years is almost an entire country getting online at the same time, we realized that digital literacy is quite low. They don’t have the antibodies to [fight] viral misinformation.”67

62 Olivia Solon, ‘It’s digital colonialism’: how Facebook’s free internet service has failed its users, THE GUARDIAN (July 27, 2017), https://www.theguardian.com/technology/2017/jul/27/facebook-free-basics-developing-markets.

63 Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA (Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech.

64 Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING (Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.

65 Human Rights Impact Assessment: Facebook in Myanmar, BSR (Oct. 2018) (“BSR Report”) at 12-13, https://about.fb.com/wp-content/uploads/2018/11/bsr-facebook-myanmar-hria_final.pdf.
84. The U.N. Independent International Fact-Finding Mission on Myanmar (the “U.N. Mission”) investigating the genocide in Burma reported: “[t]he Myanmar context is distinctive … because of the relatively new exposure of the Myanmar population to the Internet and social media…. In a context of low digital and social media literacy, the Government’s use of Facebook for official announcements and sharing of information further contributes to users’ perception of Facebook as a reliable source of information.”68

85. Thet Swei Win, the director of an organization that works to promote social harmony between ethnic groups in Burma, told the BBC “[w]e have no internet literacy…[w]e have no proper education on how to use the internet, how to filter the news, how to use the internet effectively.”69

86. As described by the U.N., “[t]he relative unfamiliarity of the population with the Internet and with digital platforms and the easier and cheaper access to Facebook have led to a situation in [Burma] where Facebook is the Internet…. For many people, Facebook is the main, if not only, platform for online news and for using the Internet more broadly.”70 “Facebook is arguably the only source of information online for the majority in [Burma]”71 and “Facebook is a particularly influential medium in Myanmar. More than 14 million people out [of] a total population of 53 million utilize Facebook in Myanmar, and according to a 2016 survey of internet users in Myanmar, ‘reading news on the internet’ often meant ‘news they had seen on their Facebook newsfeed, and [they] did not seem aware of other news sources online.’”72

66 Id.

67 Steven Levy, Facebook: the Inside Story (Blue Rider Press 2020).

68 UNHRC Report, ¶¶ 1342, 1345, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

69 Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING (Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.

70 UNHRC Report, ¶ 1345, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

71 Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018), https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis.
87. The New York Times has reported that “[t]he military exploited Facebook’s wide reach in Myanmar, where it is so broadly used that many of the country’s 18 million internet users confuse the Silicon Valley social media platform with the internet,”73 and that “[a]s Facebook’s presence in Myanmar grew …, the company did not address what the BSR report calls ‘a crisis of digital literacy’ in a country that was just emerging from a military dictatorship and where the internet was still new.”74

88. In the end, the U.N. put it best: “Facebook has been a useful instrument for those seeking to spread hate, in a context where, for most users, Facebook is the Internet.”75

III. Facebook Amplified the Myanmar Military’s Use of Fear and Hatred of the Rohingya to Justify its Hold on Power

89. By 2011, the country’s history of political repression and ethnic violence was widely known. “Myanmar’s political history has been heavily dominated by an all-powerful military, known as the Myanmar ‘Tatmadaw,’ which has ruled the country for most of its existence.”76 In 1962, the military took power in a coup led by General Ne Win.77 In 1989, after widespread protests against the regime had broken out the year before, the military placed Aung San Suu Kyi, leader of the National League for Democracy opposition party (NLD) and winner of the Nobel peace prize, under house arrest; after the NLD won a general election in 1990, the military government refused to recognize the result or to allow the legislature to assemble.78 “During the military dictatorship [from 1962 to 2011], Myanmar was considered one of the most repressive countries in Asia.”79

72 Fortify Rights, They Gave Them Long Swords: Preparations for Genocide and Crimes Against Humanity Against Rohingya Muslims in Rakhine State, Myanmar at 95, n.403 (July 2018), https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf (“Fortify Rights Report”) (citing GSMA, Mobile Phones, Internet, and Gender in Myanmar at 55 (Feb. 2016), https://www.gsma.com/mobilefordevelopment/wp-content/uploads/2016/02/Mobile-phones-internet-and-gender-in-Myanmar.pdf.)

73 Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military, NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html.

74 Alexandra Stevenson, Facebook Admits It Was Used to Incite Violence in Myanmar, NEW YORK TIMES (Nov. 6, 2018), https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html.

75 Report of the detailed findings of the Independent International Fact-Finding Mission on Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 17, 2018), https://documents-dds-ny.un.org/doc/UNDOC/GEN/G18/274/54/PDF/G1827454.pdf?OpenElement (“UNHRC Report”) ¶ 74.
90. Despite a brief period of liberalization that began in 2011, the military continued to dominate Burma’s government. The 2008 Constitution was designed by “the military to retain its dominant role in politics and government … 25 percent of the seats in each house of parliament and in the state and regional assemblies belong to unelected members of the military, who are appointed by the Tatmadaw.”80 In addition to being guaranteed at least one vice presidential position, “the Tatmadaw selects candidates for (and effectively controls) three key ministerial posts: Defence, Border Affairs and Home Affairs. This is sufficient to control the National Defence and Security Council and the entire security apparatus.”81

91. The military has consistently used an imagined threat from the Rohingya to justify its hold on power. “[T]he ‘Rohingya crisis’ in Rakhine State … has been used by the military to reaffirm itself as the protector of a nation under threat….”82 In support of the 1962 coup, “General Ne Win argued that a military take-over was necessary to protect the territorial integrity of the country” due to “insurgencies from ‘ethnic armed organizations.’”83 The Tatmadaw has used the alleged “ethnic threat to national sovereignty and territorial integrity as the excuse for its control of the country….”84 The main concern to those in power was to “maintain power and to attain and preserve ‘national unity in the face of ethnic diversity.’ Human rights were ‘subordinate to these imperatives.’ … Reports of serious human rights violations were pervasive….”85

76 UNHRC Report, ¶ 71, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf. Myanmar gained its independence from Great Britain in 1948.

77 Id.

78 Id. ¶ 74.

79 Id. ¶ 94.

80 Id. ¶ 81.

81 Id.

82 Id. ¶ 93.

83 Id. ¶ 71.
92. The government found that it could increase its own popularity by, first, instilling fear and hatred of the Rohingya among the Buddhist majority in Burma and then publicly oppressing, marginalizing, and persecuting the Rohingya. The U.N. found that “the Rohingya have gradually been denied birth registration, citizenship and membership of the political community. This lack of legal status and identity is the cornerstone of the oppressive system targeting the Rohingya…. It is State-sanctioned and in violation of Myanmar’s obligations under international law because it discriminates on the basis of race, ethnicity and religion.”86 The four Special Rapporteurs on the human rights situation in Burma appointed by the United Nations from 1992 to 201187 concluded, inter alia:

[S]ince late 1989, the Rohingya citizens of Myanmar … have been subjected to persecution based on their religious beliefs involving extrajudicial executions, torture, arbitrary detention, forced disappearances, intimidation, gang-rape, forced labour, robbery, setting of fire to homes, eviction, land confiscation and population resettlement as well as the systematic destruction of towns and mosques.88

[S]ome of these human rights violations may entail categories of crimes against humanity or war crimes.89

Yet, “the Tatmadaw enjoys considerable popularity among the Bamar-Buddhist majority.”90

84 Id.

85 Id. ¶ 94.

86 Id. ¶ 491.

87 Id. ¶ 96-97.

88 Id. ¶ 100.

89 Id. ¶ 97.

90 Id. ¶ 93.
93. Facebook, by its very design, turned out to be the perfect tool for the Burmese military and Buddhist extremists to use in promoting their message of religious intolerance and, ultimately, ethnic cleansing. The amplification and propagation of hateful, extremist, and polarizing messages and the radicalization of users are inevitable results of the algorithms that Facebook intentionally and meticulously built into its system.

A. Facebook Participated in Inciting Violence Against the Rohingya (2012-2017)

94. On June 8, 2012, there were violent confrontations between Rohingya and ethnic Rakhine groups; security forces killed a number of the Rohingya, and Muslim homes and shops were set on fire and looted.91 In the ensuing weeks and months, the Rohingya suffered more killings at the hands of Tatmadaw soldiers, burnings and lootings, sexual and gender-based violence, arbitrary arrests, and torture in prison.92 The U.N. Mission drew a direct connection between the Burmese government’s use of Facebook and the violence against the Rohingya that began in June 2012:

On 1 June 2012 … the spokesperson of the President of Myanmar … posted a statement on his personal Facebook account. He warned about the arrival from abroad of “Rohingya terrorists” … and stated that the Myanmar troops would “completely destroy them….” Although this post was later deleted, the impact of a high official equating the Rohingya population with terrorism may have been significant ahead of the 2012 violence, which erupted a week later.93

[P]osts early in 2012 about the alleged rape and murder by Rohingya men of a Buddhist woman were reportedly shared widely and are considered to have contributed to the tension and violence in Rakhine State in that year.94

95. Incitement of violence on Facebook continued beyond 2012: “[A]n online news report from 30 June 2014 … alleged that two Muslim teashop owners had raped a Buddhist woman…. [A prominent Buddhist monk] reposted the article on his Facebook page…. Violence erupted the following day [resulting in two deaths]. The rape allegations were false, with the ‘victim’ reportedly admitting that she had fabricated the rape allegations.”95

91 Id. ¶¶ 630-33.

92 Id. ¶¶ 635-39, 662-63, 669-78.

93 Id. ¶¶ 705-06.

94 Id. ¶ 1347.
96. The U.N. Report continued, “[t]here is no doubt that hate speech against Muslims in general, and Rohingya in particular, is extremely widespread in Myanmar…. Given Facebook’s dominance in Myanmar, the Mission paid specific attention to a number of Facebook accounts that appear to be particularly influential….”96 For example:

• [T]he late U Ko Ni, a well-known Muslim and legal advisor of the NLD, was frequently targeted on Facebook…. In one post from March 2016, a photo of U Ko Ni next to president Htin Kyaw was captioned ‘this [dog] getting his foot in the door in Myanmar politics is not something we should sit by and watch….’ The Mission has seen multiple other posts with a similar message and threats towards U Ko Ni dating from between March and October 2016. On 29 January 2017, U Ko Ni was assassinated….97

• [I]n January 2017, a self-described pro-Myanmar patriot with more than 17,000 followers on Facebook posted a graphic video of police violence against civilians in another country. He captioned the post as follows: “Watch this video. The kicks and beatings are very brutal…. [The] disgusting race of [Muslim] terrorists who sneaked into our country … need to be beaten like that….” One comment under the post reads: “It is very satisfying to watch this…. It’s sad that Myanmar security forces are not as skillful in their beating.” In July 2018, the post had over 23,000 views, 830 reactions and 517 shares.98

• [O]ne account holder, supposedly a monk, posted a poem with graphic photos allegedly showing Buddhist Mros killed by the “Bengali” on 3 August 2017, along with photos of damage to a pagoda allegedly done by “Bengali”.99 (The Myanmar authorities refer to the Rohingya as “Bengalis” to suggest that, rather than being native to Myanmar, they are illegal immigrants from Bangladesh.100)

• [O]n 11 February 2018, … Shwewiki.com, a self-proclaimed “Media/News Company in Yangon” with over 1.3 million followers on Facebook, posted a link to an article titled “The lies of the [Rohingya liars] are exposed[.]”101

95 Id. ¶ 1325.

96 Id. ¶ 1310.

97 Id. ¶ 1312.

98 Id.

99 Id.

100 Id. ¶ 460.
97. In a Pulitzer Prize-winning report, Reuters found numerous “posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook as of [August 2018].” For example:

• In December 2013, one user posted: “We must fight them the way Hitler did the Jews, damn kalars [a pejorative for the Rohingya].”

• In September 2017, another wrote: “These non-human kalar dogs, the Bengalis, are killing and destroying our land, our water and our ethnic people…. We need to destroy their race.”

• In April 2018, another user posted, with a picture of a boatload of Rohingya refugees, “Pour fuel and set fire so that they can meet Allah faster.”102

98. Facebook was used to instigate communal unrest “in early September 2017, … through the parallel distribution of similar but conflicting chain messages on Facebook Messenger to Muslim and Buddhist communities. Each chain message stated that the other group was preparing for major violence on 11 September and encouraged the recipient to get ready to resist…. [T]he messages … caused widespread fear and at least three violent incidents.”103 One of the most dangerous campaigns came in 2017, when “the military’s intelligence arm spread rumors on Facebook to both Muslim and Buddhist groups that an attack from the other side was imminent….”104

99. Steve Stecklow, author of the Reuters report, observed that several of the posts that he and his team catalogued “described Rohingyas as dogs or pigs. ‘This is a way of dehumanising a group,’ Stecklow says. ‘Then when things like genocide happen, potentially there may not be a public uproar or outcry as people don’t even view these people as people.’”105

101 Id. ¶ 1312.

102 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.

103 UNHRC Report, ¶ 1348, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

104 Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military, NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html.
100. According to Voices that Poison, a U.S.-based human rights group, “speech that describes victims as vermin, pests, insects or animals is a rhetorical hallmark of incitement to violence, even genocide, because it dehumanises the victim.”106 Fortify Rights, a human rights group, similarly noted: “Burmese individuals and groups have disseminated vitriolic Facebook posts dehumanizing and calling for widespread attacks against the Rohingya. For example, the widely-followed monk Ashin Wirathu, head of the ultranationalist group formerly known as Ma Ba Tha, posted a reference to the Rohingya in 2014, saying ‘You can be full of kindness and love, but you cannot sleep next to a mad dog. If we are weak, our land will become Muslim.’”107

101. The New York Times reported that the Myanmar military had posted anti-Rohingya propaganda on Facebook using fake accounts:

They posed as fans of pop stars and national heroes as they flooded Facebook with their hatred. One said Islam was a global threat to Buddhism. Another shared a false story about the rape of a Buddhist woman by a Muslim man.

The Facebook posts were not from everyday internet users. Instead, they were from Myanmar military personnel who turned the social network into a tool for ethnic cleansing, according to former military officials, researchers and civilian officials in the country.

***

The Myanmar military’s Facebook operation began several years ago, said people familiar with how it worked. The military threw major resources at the task, the people said, with as many as 700 people on it.

They began by setting up what appears to be news pages and pages on Facebook that were devoted to Burmese pop stars, models and other celebrities, like a beauty queen with a penchant for parroting military propaganda….

Those then became distribution channels for lurid photos, false news and inflammatory posts, often aimed at Myanmar’s Muslims, the people said. Troll accounts run by the military helped spread the content, shout down critics and fuel arguments between commenters to rile people up. Often, they posted sham photos of corpses that they said were evidence of Rohingya-perpetrated massacres, said one of the people.

Digital fingerprints showed that one major source of the Facebook content came from areas outside Naypyidaw, where the military keeps compounds, some of the people said.108

105 Anisa Sudebar, The country where Facebook posts whipped up hate, BBC TRENDING (Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.

106 Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA (Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech.

107 Fortify Rights Report, at 95, https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.
102. By October 2015, the Allard K. Lowenstein International Human Rights Clinic at Yale Law School had already concluded that there was “strong evidence that genocide is being committed against Rohingya.”109 The worst, however, was yet to come.

B. The August 2017 “Clearance Operations” and Their Aftermath: A “Human Rights Catastrophe”

103. The Myanmar military’s campaign of ethnic cleansing culminated in August 2017 with the “Clearance Operations.” The U.N. reported that “[d]uring the course of the operation more than 40 percent of all villages in northern Rakhine State were partially or totally destroyed…. As a result, over 725,000 Rohingya had fled to Bangladesh by September 2018.”110 The August 2017 clearance operations “caused the disintegration of a community and resulted in a human rights catastrophe, the effects of which will span generations.”111 Additional “clearance operations” followed in numerous Rohingya villages across northern Rakhine State, with at least 54 verified locations.112

108 Paul Mozur, A Genocide Incited on Facebook, With Posts From Myanmar’s Military, NEW YORK TIMES (Oct. 15, 2018), https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html.

109 Persecution of the Rohingya Muslims: Is Genocide Occurring in Myanmar’s Rakhine State, ALLARD K. LOWENSTEIN INTERNATIONAL HUMAN RIGHTS CLINIC, YALE LAW SCHOOL (Oct. 2015) at 1, https://law.yale.edu/sites/default/files/documents/pdf/Clinics/fortifyrights.pdf.

110 UNHRC Report, ¶ 751, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf. The U.N. Mission documented the clearance operations exhaustively: “The Mission obtained a wealth of information on these events, including over 600 interviews with victims and eyewitnesses, satellite imagery, documents, photographs and videos. It examined many incidents in detail. It found consistent patterns of the most serious human rights violations and abuses.” Id. ¶ 754.

111 Id. ¶ 749.
104. The U.N. Mission described the clearance operations in six Rohingya villages in detail.113 The following description of the operation in one of those villages is typical:

• “[H]undreds of Tatmadaw soldiers … surrounded [the village]. They were accompanied by a smaller number of ethnic Rakhine from neighbouring villages. The security forces then opened fire, shooting at villagers, including those that were fleeing. Soldiers also dragged people from houses and shot some of them at point blank range. Others were killed by having their throats slit with large knives.”114

• “During the course of the operation, structures in [the village] were burned and destroyed…. Satellite imagery analysis … shows the extent of the destruction…. The entire Rohingya village … was destroyed, while the nearby non-Rohingya village … remains intact.”115

• “Women and girls were also subjected to rape, gang rape, sexual mutilation and sexual humiliation during the ‘clearance operations.’”116

• These “‘clearance operations’ were led by the Tatmadaw…. Individuals from the neighbouring ethnic Rakhine village were recognised as participants and some ethnic Rakhine men assisted the military….”117

112 Id. ¶ 880.

113 Id. ¶¶ 755-879.

114 Id. ¶¶ 782-83.

115 Id. ¶¶ 784, 788.

116 Id. ¶ 790.

117 Id. ¶ 797.

105. The UNHRC Report and the Fortify Rights Report contain numerous first-person accounts of atrocities committed against the Rohingya by both Myanmar security forces and by civilians during the August 2017 “Clearance Operations.” For example:

• “The soldiers killed the male members of my family. They shot at them first and then slit their throats. The courtyard was full of blood. They killed my husband, my father-in-law and my two nephews of 15 and eight years old. They even killed the child in the same way.”118
• “I found my six-month old son’s body lying next to my wife’s body. She had been shot. My baby son was stabbed in his stomach and his intestine and liver were coming out. When I took his small body into my lap, I was showered with his blood.”119

• “My husband was shot and then he had his throat cut. I was raped. It is so difficult to say what happened. They tore off my clothes, then six soldiers raped me, and after that two ethnic Rakhine men, whom I recognised, raped me. They pressed my breasts and face continuously. My face almost turned blue. I knew the ethnic Rakhine who lived nearby.”120

• “I hid in the toilet outhouse, some distance from our house. I saw that our house was surrounded by 10 soldiers and some police. I was able to see what happened. First they tied up my parents. Then they shot my father and raped my mother; later they killed her too. After this, they burned our house.”121

• “One mother described how she had to choose which of her children to save. The security forces had entered her house and grabbed her young daughter. Her son tried to save his sister and was attacked by the security forces. The mother watched from the other end of the house and made the split second decision that these two children would not live, but that she could perhaps still save her two younger children. Her husband returned the next morning to the village and dug through the pits of bodies until he found the corpse of their son. They never found the body of their daughter. The mother told the Mission with haunted eyes: ‘How can I continue with my life having made this choice?’”122

• “I saw my own children killed. Those who are left of my family came with me here. My three children and my mother were killed. They made them lie down on the ground and they cut the backs of their necks.”123

• “Some small children were thrown into the river…. They hacked small children who were half alive. They were breast-feeding age children, two years, three years, five years….”124

118 Id. ¶ 809.

119 Id. ¶ 837.

120 Id. ¶ 854.

121 Id. ¶ 862.

122 Id. ¶ 825.

123 Fortify Rights Report, at 60, https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.

124 Id. at 61.
• “The military took and arrested around 50 people. They brought them to the military camp … and set fire to where they kept them. One was my own brother. There was a small hut, and they put all the people in there and set it on fire.”125

• “As soon as we got on the boat, they shot at us…. Foyezur Rahman was my father. My daughter was Sofia. She was 18. They were both shot in the back. As soon as the military shot them, they stopped moving. We brought their dead bodies here [to Bangladesh] and buried them.”126

• “I saw her taken from the house and raped by military soldiers. It happened outside, beside a house. We watched from inside the house. After they raped her, they killed her…. [O]ne person [raped her], then she was taken to the road, and he cut her neck and cut her breasts off.”127

106. In December 2017, Médecins Sans Frontières (Doctors Without Borders) (“MSF”) published estimates of Rohingya deaths between August 25 and September 24, 2017—the month after the “clearance operations” began—based on surveys of refugees in Bangladesh. MSF estimated that “8,170 deaths were due to violence …, including 1,247 children under five years of age.… Cause of death by shooting accounted for 69.4% of these deaths; being ‘burned to death at home’ accounted for 8.8%; being beaten to death accounted for 5.0%; sexual violence leading to death for 2.6%; and death by landmine for 1.0%.”

107. MSF noted that “the rates of mortality captured here are likely to be underestimates, as the data does not account for those people who have not yet been able to flee Myanmar, or for families who were killed in their entirety.”128 The U.N. Mission similarly “concluded that the estimated number of more than 10,000 deaths during the August-September 2017 ‘clearance operations’ alone is likely to be conservative.”129

125 Id. at 62.

126 Id. at 65.

127 Id. at 69.

128 Rohingya crisis – a summary of findings from six pooled surveys, MÉDECINS SANS FRONTIÈRES (Dec. 9, 2017), https://www.msf.org/myanmarbangladesh-rohingya-crisis-summary-findings-six-pooled-surveys.

129 UNHRC Report, ¶ 1482, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
108. Most of the Rohingya who escaped the clearance operations now live in “a miserable slum of a million people” in Bangladesh. Time reported in 2019 that “[c]onditions in the [refugee] camps remain abysmal. Most refugees live in small shacks made of bamboo and tarpaulin sheets, so tightly packed together that they can hear their neighbors talking, having sex, and disciplining their children or, sometimes, wives. In the springtime, the huts turn into saunas. In the monsoon season, daily rainfall turns hilly footpaths into waterslides and lifts trash and human waste from open drains to float in stagnant pools.” “[M]urders and other forms of violence occur almost nightly inside the camps and are rarely if ever investigated.”130 “The Rohingya are …, with no access to meaningful work, entirely dependent on humanitarian aid…. These factors increase vulnerability, in particular for women and girls, to trafficking and other exploitation.”131 According to Steven Corliss of the U.N. refugee agency, UNHCR, “The situation is untenable: environmentally, socially and economically.”132

109. Plaintiff and the Class have been deprived of their property, including their homes and the land they cultivated for generations. In an update to its 2018 Report, the U.N. Mission wrote:

The Mission concludes on reasonable grounds that the Government undertook a concerted effort to clear and destroy and then confiscate and build on the lands from which it forcibly displaced hundreds of thousands of Rohingya. The consequences are twofold. This government-led effort subjugates Rohingya to inhumane living conditions as [internally displaced persons] and refugees by denying them access to their land, keeping them uprooted from their homes, depriving them of their ability to progress in healthy and safe communities and preventing them from engaging in livelihood activities that sustain them as a people. The second consequence of the Government’s four-pronged approach of clearing, destroying, confiscating and building on land is that it is fundamentally altering the demographic landscape of the area by cementing the demographic re-engineering of Rakhine State that resulted from mass displacement. Much of this is being done under the guise of ‘development,’ with a clear discourse emerging to this effect in the immediate aftermath of the August 2017 ‘clearance operations.’133

130 Feliz Solomon, ‘We’re Not Allowed to Dream.’ Rohingya Muslims Exiled to Bangladesh Are Stuck in Limbo Without an End in Sight, TIME (May 23, 2019), https://time.com/longform/rohingya-muslims-exile-bangladesh/.

131 UNHRC Report, ¶ 1174, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

132 Feliz Solomon, ‘We’re Not Allowed to Dream.’ Rohingya Muslims Exiled to Bangladesh Are Stuck in Limbo Without an End in Sight, TIME (May 23, 2019), https://time.com/longform/rohingya-muslims-exile-bangladesh/.
110. In addition to loss of life, physical injuries, emotional trauma, and destruction or taking of property, Plaintiff and the Class have been deprived of their culture and community. The Rohingya people have their own language, not spoken anywhere else in the world. They have lost their traditional places of worship. Family and community ties dating back generations have been torn apart.

111. Having become refugees in foreign countries where they largely do not speak the language, have no financial resources, and lack knowledge of the culture or legal system, Plaintiff and the Class have been denied meaningful justice. Most Class members have been attempting to recover from severe physical and/or emotional trauma and struggling to survive in dangerous, overcrowded refugee camps in Bangladesh—thousands of miles from any court having jurisdiction over Facebook—since they were forced from Myanmar.

112. The U.N. concluded that “[t]he attack on the Rohingya population of Myanmar was horrendous in scope. The images of an entire community fleeing from their homes across rivers and muddy banks, carrying their babies and infants and elderly, their injured and dying, will and must remain burned in the minds of the international community. So will the ‘before and after’ satellite imagery, revealing whole villages literally wiped off the map. In much of northern Rakhine State, every trace of the Rohingya, their life and community as it has existed for decades, was removed…. The ‘clearance operations’ were indeed successful.”134

133 Report of the detailed findings of the Independent International Fact-Finding Mission on Myanmar, UNITED NATIONS HUMAN RIGHTS COUNCIL (Sept. 16, 2019), https://www.ohchr.org/Documents/HRBodies/HRCouncil/FFMMyanmar/20190916/A_HRC_42_CRP.5.pdf (“UNHRC 2019 Report”) ¶ 139.

134 UNHRC Report, ¶ 1439, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

113. Among the U.N. Mission’s findings were:

• The elements of the crime of genocide were satisfied. “The Mission is satisfied that the Rohingya … constitute a protected group.”135 “The gross human rights violations … suffered by the Rohingya at the hands of the Tatmadaw and other security forces (often in concert with civilians) include conduct that falls within four of [the] five categories of prohibited acts,” including killings, serious bodily and mental harm, conditions of life calculated to physically destroy the Rohingya, and measures intended to prevent births.136 “The Mission … concludes, on reasonable grounds, that the factors allowing the inference of genocidal intent are present.”137
• “The Mission finds that crimes against humanity have been committed in … Rakhine [State], principally by the Tatmadaw…. [T]hese include crimes against humanity of murder; imprisonment[;] enforced disappearance; torture; rape, sexual slavery and other forms of sexual violence; persecution; and enslavement.”138

114. In its 2019 report, the U.N. Mission reaffirmed its earlier conclusions: “the Mission concludes on reasonable grounds that, since the publication of the Mission’s 2018 report, the Government has committed the crimes against humanity of ‘other inhumane acts’ and ‘persecution’ in the context of a continued widespread and systematic attack against the Rohingya civilian population in furtherance of a State policy to commit such an attack.”139 Furthermore, the Mission concluded that “the evidence supports an inference of genocidal intent and, on that basis, that the State of Myanmar breached its obligation not to commit genocide under the Genocide Convention under the rules of State responsibility.”140

C. Facebook’s Role in the 2017 “Clearance Operations”

115. The U.N. Mission specifically found that Facebook had contributed to the 2017 Clearance Operations:

The Mission has examined documents, … Facebook posts and audio-visual materials that have contributed to shaping public opinion on the Rohingya…. The analysis demonstrates that a carefully crafted hate campaign has developed a negative perception of Muslims among the broad population in Myanmar…. This hate campaign, which continues to the present day, portrays the Rohingya … as an existential threat to Myanmar and to Buddhism…. It is accompanied by dehumanising language and the branding of the entire [Rohingya] community as ‘illegal Bengali immigrants.’ This discourse created a conducive environment for the 2012 and 2013 anti-Muslim violence in Rakhine State and beyond, without strong opposition from the general population. It also enabled the hardening of repressive measures against the Rohingya and Kaman in Rakhine State and subsequent waves of State-led violence in 2016 and 2017.141

135 Id. ¶ 1391.

136 Id. ¶ 1392.

137 Id. ¶ 1441.

138 Id. ¶ 1511.

139 UNHRC 2019 Report, ¶ 214, https://www.ohchr.org/Documents/HRBodies/HRCouncil/FFMMyanmar/20190916/A_HRC_42_CRP.5.pdf.

140 Id. ¶ 220.
116. The Guardian described the work of two analysts who noted a strong correlation between the amount of hate speech on Facebook and the violence inflicted on the Rohingya in late 2017:

Digital researcher and analyst Raymond Serrato examined about 15,000 Facebook posts from supporters of the hardline nationalist Ma Ba Tha group. The earliest posts dated from June 2016 and spiked on 24 and 25 August 2017, when ARSA Rohingya militants attacked government forces, prompting the security forces to launch the ‘clearance operation’ that sent hundreds of thousands of Rohingya pouring over the border.

Serrato’s analysis showed that activity within the anti-Rohingya group, which has 55,000 members, exploded with posts registering a 200% increase in interactions.

‘Facebook definitely helped certain elements of society to determine the narrative of the conflict in Myanmar,’ Serrato told the Guardian. ‘Although Facebook had been used in the past to spread hate speech and misinformation, it took on greater potency after the attacks.’

***

Alan Davis, an analyst from the Institute for War and Peace Reporting who led a two-year study of hate speech in Myanmar, said that in the months before August he noticed posts on Facebook becoming ‘more organised and odious, and more militarised.’

His research team encountered fabricated stories stating that ‘mosques in Yangon are stockpiling weapons in an attempt to blow up various Buddhist pagodas and Shwedagon pagoda,’ the most sacred Buddhist site in Yangon in a smear campaign against Muslims. These pages also featured posts calling Rohingya the derogatory term ‘kalars’ and ‘Bengali terrorists.’ Signs denoting ‘Muslim-free’ areas were shared more than 11,000 times.

***

Davis said … ‘I think things are so far gone in Myanmar right now ... I really don’t know how Zuckerberg and co sleep at night. If they had any kind of conscience they would be pouring a good percentage of their fortunes into reversing the chaos they have created.’142

141 UNHRC Report, ¶ 696, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (emphasis added).
117. The Myanmar military used Facebook to justify the “clearance operations” in 2017:

• “In a post from the official Facebook page of the Office of the Tatmadaw Commander-in-Chief, … [a Myanmar parable about a camel which gradually takes more and more space in his merchant’s tent, until eventually the merchant is forced out] was explained in detail in connection with the issue of the Rohingya in Rakhine State…. Prior to its deletion by Facebook in August 2018, the post had almost 10,000 reactions, over 6,000 shares and 146 comments.”143

142 Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018), https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis.

143 UNHRC Report, ¶ 1312, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
• “[I]n a 21 September 2017 post on Facebook … [the Tatmadaw] Commander-in-Chief … states that, ‘the Bengali population exploded and the aliens tried to seize the land of the local ethnics….’”144

• “[O]n 11 October 2017 the Commander-in-Chief … posted: ‘there is exaggeration to say that the number of Bengali fleeing to Bangladesh is very large.’ At the time more than 600,000 Rohingya had fled … Myanmar in a period of six weeks.”145

• “On 27 October 2017, in another Facebook post entitled ‘every citizen has the duty to safeguard race, religion, cultural identities and national interest,’ [the] Commander-in-Chief stated that ‘all must … preserve the excellent characteristics of the country …’”146

118. Rolling Stone reported that “[m]ore shocking was how [the military’s] bigoted doctrine was parroted by Aung San Suu Kyi, the Nobel Peace Prize-winning human-rights icon and de facto leader of Myanmar…. When she finally broke her silence, on Facebook, nearly two weeks after the 2017 attacks began, it was in cold defense of the same military that kept her under house arrest for 15 years when she was the country’s leading dissident. Suu Kyi blamed ‘terrorists’ for promoting a ‘huge iceberg of misinformation’ about the violence engulfing Rakhine. She made no mention of the Rohingya exodus.”147

119. The U.N. additionally found that there was “no doubt that the prevalence of hate speech in Myanmar significantly contributed to increased tension and a climate in which individuals and groups may become more receptive to incitement and calls for violence. This also applies to hate speech on Facebook.”148 In early 2018, U.N. investigator Yanghee Lee warned that “Facebook has become a beast,” and that “we know that the ultra-nationalist Buddhists have their own Facebooks and are really inciting a lot of violence and a lot of hatred against the Rohingya or other ethnic minorities.”149

144 Id. ¶ 1338.

145 Id. ¶ 1339.

146 Id. ¶ 1341.

147 Jason Motlagh, The Survivors of the Rohingya Genocide, ROLLING STONE (Aug. 9, 2018), https://www.rollingstone.com/politics/politics-features/rohingya-genocide-myanmar-701354/ (emphasis added).

148 UNHRC Report, ¶ 1354, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf (emphasis added).
D. Civilian Participation in the 2017 “Clearance Operations”

120. The radicalization of the Burmese population, to which Facebook materially contributed, did not merely ensure tolerance of and support for the military’s campaign of genocide against the Rohingya; it also allowed the military to recruit, equip, and train “civilian death squads” that would actively participate in the atrocities.150

121. The U.N. Mission drew a connection between anti-Rohingya reporting and hate speech, ethnic tension, and the ability of the military to recruit non-Rohingya civilians to perpetrate violence against the Rohingya, finding:

• “The inflammatory nature of much of this reporting [on activities of Rohingya militants], often characterizing Rohingya as ‘Bengali terrorists,’ coupled with rising vitriolic discourse and hate speech against the Rohingya, fuelled an already volatile situation.”151

• “[The reports] deepened inter-communal suspicion and fear. They were likely a factor in a notable breakdown in the relationship between the communities, particularly in the weeks leading up to 25 August 2017.”152

• “During this period [beginning in late 2016], the Myanmar authorities made increasing efforts to recruit ethnic Rakhine as members of the security apparatus…. Moreover, the recruitment of non-Rohingya to Government supported militias … continued throughout this period in Rakhine State.”153

149 Libby Hogan, Michael Safi, Revealed: Facebook hate speech exploded in Myanmar during Rohingya crisis, THE GUARDIAN (Apr. 2, 2018), https://www.theguardian.com/world/2018/apr/03/revealed-facebook-hate-speech-exploded-in-myanmar-during-rohingya-crisis; see Tom Miles, U.N. investigators cite Facebook role in Myanmar crisis, REUTERS (Mar. 12, 2018), https://www.reuters.com/article/us-myanmar-rohingya-facebook-idUKKCN1GO2PN.

150 Jason Motlagh, The Survivors of the Rohingya Genocide, ROLLING STONE (Aug. 9, 2018), https://www.rollingstone.com/politics/politics-features/rohingya-genocide-myanmar-701354/.

151 UNHRC Report, ¶ 1134, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.

152 Id. ¶ 1135.

153 Id. ¶ 1143-44.
122. In a 162-page report based on 254 interviews, the human rights group Fortify Rights documented how, in August 2017, “Myanmar authorities … activated non-Rohingya civilian squads, some of whom the authorities previously armed and/or trained. These civilian perpetrators … acted under the Myanmar military and police in razing hundreds of Rohingya villages throughout northern Rakhine State, brutally killing masses of unarmed Rohingya men, women, and children.”154 The title of the report, “They Gave Them Long Swords,” referred to an eyewitness account of Myanmar soldiers arming non-Rohingya civilians.155

123. In a chapter of its report entitled “Criminal Acts Against Rohingya by Civilian Perpetrators Since August 25, 2017,” Fortify Rights stated:

After arming and training local non-Rohingya citizens who had a demonstrated history of hostility toward Rohingya Muslims in northern Rakhine State, the Myanmar authorities activated them on August 25…. Groups of local non-Rohingya citizens, in some cases trained, armed, and operating alongside Myanmar security forces, murdered Rohingya men, women, and children, destroyed and looted Rohingya property, and assisted the Myanmar Army and Police in razing villages.156

124. In a Facebook post on September 22, 2017, the Burmese Commander-in-Chief “encouraged further cooperation between local non-Rohingya citizens and the Myanmar military, saying ‘[l]ocal ethnics can strengthen the defense prowess by living in unity and by joining hands with the administrative bodies and security forces in oneness.’”157

154 Fortify Rights Report, at 12-13, 14, https://www.fortifyrights.org/downloads/Fortify_Rights_Long_Swords_July_2018.pdf.

155 Id. at 16.

156 Id. at 55.

157 Id. at 46 & n.87 (citing Facebook post).

E. Facebook Ignored Complaints of Hate Speech on its Website

125. Because Myanmar’s history of repressive military rule and ethnic violence was well-documented by the time Facebook became widely available in Myanmar around 2012, Facebook should have known that its product could be used to spread hate speech and misinformation. In addition, beginning in 2013, Facebook was repeatedly alerted to hate speech on its system:
• In 2013, a new civil society organization called Panzagar, meaning "flower speech," was formed in Myanmar.158 The group spoke out locally about anti-Muslim hate speech directed at the Rohingya minority that was proliferating on Facebook. One of the group's awareness-raising methods was to put flowers in their mouths to symbolize speaking messages of peace versus hate. Panzagar reported instances of hate speech to Facebook.159

• In November 2013, Aela Callan, an Australian documentary filmmaker, "met at Facebook's California headquarters with Elliott Schrage, vice president of communications and public policy" to discuss a project she had begun regarding "hate speech and false reports that had spread online during conflicts between Buddhists and Rohingya Muslims the prior year…. I was trying to alert him to the problems she said…." But "[h]e didn't connect me with anyone inside Facebook who could deal with the actual problem…."160

• "On March 3, 2014, Matt Schissler [an American aid worker working in Myanmar], was invited to join a call with Facebook on the subject of dangerous speech online…. Toward the end of the meeting, Schissler gave a stark recounting of how Facebook was hosting dangerous Islamophobia. He detailed the dehumanizing and disturbing language people were using in posts and the doctored photos and misinformation being spread widely."161

• By June 14, 2014, Al Jazeera had published an article entitled "Facebook in Myanmar: Amplifying Hate Speech?" In that article, a civil society activist was quoted as saying: "Since the violence in Rakhine state began, we can see that online hate speech is spreading and becoming more and more critical and dangerous…. I think Facebook is the most effective way of spreading hate speech. It's already very widespread, infecting the hearts of people." The article cited Facebook posts reading: "We should kill every Muslim. No Muslims should be in Myanmar"; "Why can't we kick out the Muslim dogs?"; and "all terrorists are Muslim … they kill innocent men and women so peace and Islam are not related."162

158 Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA (Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech.
159 Mary Michener Oye, Using 'flower speech' and new Facebook tools, Myanmar fights online hate speech, THE WASHINGTON POST, https://www.washingtonpost.com/national/religion/using-flower-speech-and-new-facebook-tools-myanmar-fights-online-hate-speech/2014/12/24/3bff458c-8ba9-11e4-ace9-47de1af4c3eb_story.html.
160 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
161 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 177 (HarperCollins 2021).
• On August 18, 2014, PRI's "The World" program published a story entitled "In newly liberated Myanmar, hatred spreads on Facebook." After describing several false rumors that led to violence, the article reported: "The pattern repeats in towns and villages across Myanmar. Rumors rip through communities, fueled by seething racism and embellishments. Graphic images of violence are shared virally through social media platforms like Facebook, which has become one of the most popular websites in the country…."163

• After the March 2014 call with Schissler, "a handful of Facebook employees started an informal working group to connect Facebook employees in Menlo Park with activists in Myanmar."164 Schissler said that "between March and December 2014, he held [a series] of discussions with Facebook officials…. He told them how the platform was being used to spread hate speech and false rumors in Myanmar, he said, including via fake accounts."165

• "In March 2015, Schissler gave a talk at Facebook's California headquarters about new media, particularly Facebook, and anti-Muslim violence in Myanmar."166 "In a small conference room where roughly a dozen Facebook employees had gathered, with others joining by video-conference, he shared a PowerPoint presentation that documented the seriousness of what was happening in Myanmar: hate speech on Facebook was leading to real-world violence in the country, and it was getting people killed."167 One Facebook employee asked whether Schissler thought genocide could happen in Myanmar: "'Absolutely' he answered. If Myanmar continued on its current path, and the anti-Muslim hate speech grew unabated, a genocide was possible. No one followed up on the question."168

162 Hereward Holland, Facebook in Myanmar: Amplifying Hate Speech?, AL JAZEERA (Jun. 14, 2014), https://www.aljazeera.com/features/2014/6/14/facebook-in-myanmar-amplifying-hate-speech (emphasis added).
163 Bridget DiCerto, In newly liberated Myanmar, hatred spreads on Facebook, THE WORLD (Aug. 8, 2014), https://www.pri.org/stories/2014-08-08/newly-liberated-myanmar-hatred-spreads-facebook.
164 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 178 (HarperCollins 2021).
165 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
166 Id.
167 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 181 (HarperCollins 2021).
168 Id. at 181-82.
• "'They were warned so many times,' said David Madden, a tech entrepreneur who worked in Myanmar. He said he told Facebook officials in 2015 that its platform was being exploited to foment hatred in a talk he gave at its headquarters in Menlo Park, California. About a dozen Facebook people attended the meeting in person…. Others joined via video. 'It couldn't have been presented to them more clearly, and they didn't take the necessary steps,' Madden said."169

• Brooke Binkowski, who worked for an organization that did fact-checking for Facebook beginning in early 2017, "said she tried to raise concerns about misuse of the platform abroad, such as the explosion of hate speech and misinformation during the Rohingya crisis in Myanmar…. 'I was bringing up Myanmar over and over and over,' she said. 'They were absolutely resistant.' Binkowski, who previously reported on immigration and refugees, said Facebook largely ignored her: 'I strongly believe that they are spreading fake news on behalf of hostile foreign powers and authoritarian governments as part of their business model.'"170

126. Facebook's response to such warnings about hate speech on its websites in Burma was, however, utterly ineffective. The extreme import of what Matt Schissler was describing "didn't seem to register with the Facebook representatives. They seemed to equate the harmful content with cyberbullying: Facebook wanted to discourage people from bullying across the system, he said, and they believed that the same set of tools they used to stop a high school senior from intimidating an incoming freshman could be used to stop Buddhist monks in Myanmar from spreading malicious conspiracy theories about Rohingya Muslims."171
127. Facebook had almost no capability to monitor the activity of millions of users in Burma: "In 2014, the social media behemoth had just one content reviewer who spoke Burmese: a local contractor in Dublin, according to messages sent by Facebook employees in the private Facebook chat group. A second Burmese speaker began working in early 2015, the messages show." Accenture, to whom Facebook outsourced the task of monitoring for violations of its community standards in Burma and other Asian countries, did not hire its first two Burmese speakers, who were based in Manila, until 2015. Former monitors "said they didn't actually search for hate speech themselves; instead, they reviewed a giant queue of posts mostly reported by Facebook users." Chris Tun, a Deloitte consultant who had arranged meetings between the Burmese government and Facebook, told Reuters: "Honestly, Facebook had no clue about Burmese content. They were totally unprepared."172

169 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
170 Sam Levin, 'They don't care': Facebook factchecking in disarray as journalists push to cut ties, THE GUARDIAN (Dec. 13, 2018), https://www.theguardian.com/technology/2018/dec/13/they-dont-care-facebook-fact-checking-in-disarray-as-journalists-push-to-cut-ties.
171 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 178 (HarperCollins 2021) (emphasis added).
128. Instead, Facebook tried initially to rely entirely on users to report inappropriate posts. However, "[a]lthough Myanmar users at the time could post on Facebook in Burmese, the platform's interface – including its system for reporting problematic posts – was in English."173

129. In one case in 2018, Mark Zuckerberg was forced to apologize for exaggerating Facebook's monitoring capabilities. In an interview with Vox, Zuckerberg cited "one incident where Facebook detected that people were trying to spread 'sensational messages' through Facebook Messenger to incite violence on both sides of the conflict" but claimed that "the messages were detected and stopped from going through."174 In response, a group of activists issued an open letter criticizing Zuckerberg and pointing out that Facebook had not detected the messages; rather, the activists had "flagged the messages repeatedly to Facebook, barraging its employees with strongly worded appeals until the company finally stepped in to help." Zuckerberg apologized in an email: "I apologize for not being sufficiently clear about the important role that your organizations play in helping us understand and respond to Myanmar-related issues, including the September incident you referred to."175

172 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
173 Id.
174 Jen Kirby, Mark Zuckerberg on Facebook's role in ethnic cleansing in Myanmar: 'It's a real issue', VOX (Apr. 2, 2018), https://www.vox.com/2018/4/2/17183836/mark-zuckerberg-facebook-myanmar-rohingya-ethnic-cleansing-genocide.
175 Kevin Roose, Paul Mozur, Zuckerberg Was Called Out Over Myanmar Violence. Here's His Apology, NEW YORK TIMES (Apr. 9, 2018), https://www.nytimes.com/2018/04/09/business/facebook-myanmar-zuckerberg.html.
130. Reuters' Steve Stecklow sent the examples of hate speech that he and his team had found on the system, some of which was "extremely violent and graphic," to Facebook: "It was sickening to read…. When I sent it to Facebook, I put a warning on the email saying I just want you to know these are very disturbing things…. What was so remarkable was that [some of] this had been on Facebook for five years and it wasn't until we notified them in August [of 2018] that it was removed."176

131. "The [U.N.] Mission itself experienced a slow and ineffective response from Facebook when it used the standard reporting mechanism to alert the company to a post targeting a human rights defender for his alleged cooperation with the Mission." "The post described the individual as a 'national traitor,' consistently adding the adjective 'Muslim.' It was shared and reposted over 1,000 times. Numerous comments to the post explicitly called for the person to be killed, in unequivocal terms: … 'If this animal is still around, find him and kill him….' 'He is a Muslim. Muslims are dogs and need to be shot….' 'Remove his whole race.' … In the weeks and months after the post went online, the human rights defender received multiple death threats from Facebook users…." "The Mission reported this post to Facebook on four occasions; in each instance the response received was that the post was examined but 'doesn't go against one of [Facebook's] specific Community Standards.' … The post was finally removed several weeks later but only through the support of a contact at Facebook, not through the official channel. Several months later, however, the Mission found at least 16 re-posts of the original post still circulating on Facebook."177
132. On February 25, 2015, Susan Benesch, a human rights lawyer and researcher who directs the Dangerous Speech Project at the Berkman Klein Center for Internet & Society at Harvard University, gave a presentation entitled "The Dangerous Side of Language" at Facebook. The presentation showed how anti-Rohingya speech being disseminated by Facebook in Myanmar was not merely hate speech but "Dangerous speech" that "Moves an audience to condone or take part in violence":178

[Image: slide reproducing a Facebook post attributed to Wirathu: "They are breeding so fast, and they are stealing our women, raping them... We must keep Myanmar Buddhist."]

133. Even after the atrocities in late 2017, Facebook refused to help obtain justice for the Rohingya:

In late September 2018, Matthew Smith, the CEO of Fortify Rights, a human rights organization based in Southeast Asia, began to work with human rights groups to build a case strong enough for the International Criminal Court, at the Hague, proving that Burmese soldiers had violated international laws and perpetuated a genocide against the Rohingya. … The platform held detailed information on all its user accounts; even when posts were deleted, Facebook kept a record of everything a person had ever written, and every image uploaded. … Most Burmese soldiers had Facebook on their phones, so the company would have records of the locations of army units' soldiers to match with attacks on Rohingya villages.

If Smith and other human rights workers could get their hands on the deleted posts, they could build a stronger case documenting how Myanmar's military had both carried out a genocide against the Rohingya and manipulated the public into supporting their military onslaught.179

Apparently not eager to help prove that Facebook had been complicit in genocide, Facebook's lawyers denied Smith's requests for access to the data: "'Facebook had the chance to do the right thing again and again, but they didn't. Not in Myanmar,' said Smith. 'It was a decision, and they chose not to help.'"180

176 Anisa Subedar, The country where Facebook posts whipped up hate, BBC TRENDING (Sept. 12, 2018), https://www.bbc.com/news/blogs-trending-45449938.
177 UNHRC Report, ¶ 1351, https://digitallibrary.un.org/record/1643079/files/A_HRC_39_CRP-2-EN.pdf.
178 Susan Benesch, The Dangerous Side of Language, DANGEROUS SPEECH PROJECT, available at https://www.dropbox.com/s/tazw9elxptquugf/The%20Dangerous%20Side%20of%20Language.pdf?dl=0&fbclid=IwAR1thI4-BHCawG6g00pX3ManYAMN6IKd8kY8sB4x76nq66vihlAdlEv0As.
134. Worse yet, Facebook's activity promoted such content to its users, thus actively participating in disinformation efforts that led to the genocide.

135. Facebook ultimately ratified its conduct and its involvement in the genocide and violence in Burma by admitting shortcomings of its system.

F. Facebook Admits That It Had a Responsibility to Prevent Its Product From Being Used to Incite Violence and Genocide

136. In 2018, after the "clearance operations," several senior Facebook executives, including Mark Zuckerberg, belatedly admitted that the company had a responsibility to prevent its product from being used to incite violence in Burma and should have done more in that regard. On April 10, 2018, Zuckerberg testified before the U.S. Senate:

SEN. PATRICK LEAHY: ... [S]ix months ago, I asked your general counsel about Facebook's role as a breeding ground for hate speech against Rohingya refugees. Recently, U.N. investigators blamed Facebook for playing a role in inciting possible genocide in Myanmar. And there has been genocide there….

This is the type of content I'm referring to. It calls for the death of a Muslim journalist. Now, that threat went straight through your detection system, it spread very quickly, and then it took attempt after attempt after attempt, and the involvement of civil society groups, to get you to remove it.

Why couldn't it be removed within 24 hours?

ZUCKERBERG: Senator, what's happening in Myanmar is a terrible tragedy, and we need to do more....

LEAHY: We all agree with that.181

179 Sheera Frenkel and Cecilia Kang, An Ugly Truth: Inside Facebook's Battle for Domination, at 185-86 (HarperCollins 2021).
180 Id. at 186-87.
137. In a statement to Reuters, Mia Garlick, Facebook's director of Asia Pacific Policy, stated: "We were too slow to respond to concerns raised by civil society, academics and other groups in Myanmar. We don't want Facebook to be used to spread hatred and incite violence. This … is especially true in Myanmar where our services can be used to amplify hate or exacerbate harm against the Rohingya."182

138. In August 2018, Sara Su, a Product Manager, posted on Facebook's blog:

We have a responsibility to fight abuse on Facebook. This is especially true in countries like Myanmar where many people are using the internet for the first time and social media can be used to spread hate and fuel tension on the ground.

The ethnic violence in Myanmar is horrific and we have been too slow to prevent misinformation and hate on Facebook.183

139. On September 5, 2018, Facebook COO Sheryl Sandberg testified before the U.S. Senate:

SEN. MARK WARNER: … Ms. Sandberg, you made mention in your opening testimony the fact that sometimes political actors are using the platforms really to incent violence. I mean, I think you've made at least some reference, mention of Myanmar. We've obviously seen a great tragedy take place there where hundreds of thousands of Rohingya Muslims are fleeing and in many ways. The U.N. High Commissioner has said that fake accounts on Facebook have incented that violence. Do you believe that Facebook has both a moral obligation and potentially even a legal obligation to take down accounts that are actually incentivizing violence?

SHERYL SANDBERG: I strongly believe that. In the case of what's happened in Myanmar, it's, it's devastating and we're taking aggressive steps and we know we need to do more….184

181 Facebook, Social Media Privacy, and the Use and Abuse of Data, Senate Hearing 115-683 before the Comm. on Commerce, Science, and Transportation, et al., 115th Cong. (Apr. 10, 2018), https://www.govinfo.gov/content/pkg/CHRG-115shrg37801/html/CHRG-115shrg37801.htm.
182 Steve Stecklow, Why Facebook is losing the war on hate speech in Myanmar, REUTERS (Aug. 15, 2018), https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/ (emphasis added).
183 Sara Su, Update on Myanmar, FACEBOOK NEWSROOM (Aug. 15, 2018), https://about.fb.com/news/2018/08/update-on-myanmar/.
140. In October 2018, BSR (Business for Social Responsibility) published a human rights impact assessment—commissioned by Facebook itself—of Facebook's presence in Burma; BSR found that:

• "Facebook is … used to spread rumors about people and events. Character assassinations were described to BSR during this assessment, and in extreme cases these have extended to online death threats…. There are indications that organized groups make use of multiple fake accounts and news pages to spread hate speech, fake news, and misinformation for political gain. Rumors spread on social media have been associated with communal violence and mob justice."185

• "The Facebook platform in Myanmar is being used by bad actors to spread hate speech, incite violence, and coordinate harm…. Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence…. [F]or example, the Report of the Independent International Fact-Finding Mission on Myanmar describes how Facebook has been used by bad actors to spread anti-Muslim, anti-Rohingya, and anti-activist sentiment."186

• "The consequences for the victim are severe, with lives and bodily integrity placed at risk from incitement to violence."187

141. On November 5, 2018, Alex Warofka, Facebook's Product Policy Manager, issued a statement on the BSR report: "The report concludes that, prior to this year, we weren't doing enough to help prevent our platform from being used to foment division and incite offline violence. We agree that we can and should do more."188

184 Open Hearing on Foreign Influence Operations' Use of Social Media Platforms, Senate Hearing 115-460 before the Select Comm. of Intel., 115th Cong. (Sept. 5, 2018), https://www.govinfo.gov/content/pkg/CHRG-115shrg31350/html/CHRG-115shrg31350.htm.
185 BSR Report, at 13, https://about.fb.com/wp-content/uploads/2018/11/bsr-facebook-myanmar-hria_final.pdf.
186 Id. at 24.
187 Id. at 35.
188 Alex Warofka, An Independent Assessment of the Human Rights Impact of Facebook in Myanmar, FACEBOOK NEWSROOM (Nov. 5, 2018), https://about.fb.com/news/2018/11/myanmar-hria/.
142. In October 2021, a former member of Facebook's Integrity Team submitted a sworn whistleblower declaration to the SEC. It stated, inter alia:

At Facebook, … there's no will to actually fix problems, in particular if doing so might reduce user engagement, and therefore profits….

Any projects Facebook undertakes under the banner of charity or community building are actually intended to drive engagement…. Internet.org, Facebook's scheme to provide Internet to the developing world, wasn't about charity…. Inside the company, the dialogue was that this is about gaining an impenetrable foothold in order to harvest data from untapped markets. Through Internet.org, which provided Facebook at free or greatly reduced rates in key markets, Facebook effectively became the Internet for people in many developing countries…. [Facebook executives would] say 'When you are the sole source for the Internet you are the sole source for news.'

Facebook executives often use data to confuse, rather than clarify what is occurring. There is a conscious effort to answer questions from regulators in ways that intentionally downplay the severity of virtually any given issue….

An[] example of their playbook played out in the wake of the genocide of the Rohingya refugees in Myanmar, a country where Facebook was effectively the Internet for most people, and where the long-isolated population was vulnerable to information manipulation. Facebook executives were fully aware that posts ordering hits by the Myanmar government on the minority Muslim Rohingya were spreading wildly on Facebook, because it was being reported in the media and multiple aid-organizations, as well as major, top-tier reporters who used to call the company when they discovered early on that the genocide was being accommodated on Facebook. It was clear before the killing even started that members of the military junta in Myanmar were directing this activity. But, when the violence of the early stages of the Myanmar government-directed genocide metastasized and the murders were unmistakably being directed on Facebook, I was instructed to tell the media, "We know now, and we finally managed to remove their access, but we did not have enough Burmese-speaking moderators." This part was true; there was only one Burmese translator on the team of moderators for years, in the same period when the communications apparatus grew by leaps and bounds. But the issue of the Rohingya being targeted on Facebook was well known inside the company for years. I refused to deploy the approved talking point.

Later, after widespread public blowback forced the company to hire a human rights group to conduct an independent review, Facebook's policy manager Alex Warofka released a statement with the typical Facebook 'mea culpa' response: 'We agree that we can and should do more.' I quickly realized that the company was giving a PR response to a genocide that they accommodated—that, I, working for Facebook, had been a party to genocide. This is what prompted me to look for another job.189
143. Facebook's subsequent actions prove that, for an investment amounting to a minuscule portion of the company's vast resources,190 the company could have blocked much of the hate speech against the Rohingya. In August 2018, Facebook posted on its website:

The ethnic violence in Myanmar has been truly horrific…. While we were too slow to act, we're now making progress—with better technology to identify hate speech, improved reporting tools, and more people to review content.

Today, we are taking more action in Myanmar, removing a total of 18 Facebook accounts, one Instagram account and 52 Facebook Pages, followed by almost 12 million people. We are preserving data, including content, on the accounts and Pages we have removed.191

144. In December 2018, Facebook updated its blog to report that it removed an additional "425 Facebook Pages, 17 Facebook Groups, 135 Facebook accounts and 15 Instagram accounts in Myanmar for engaging in coordinated inauthentic behavior on Facebook…. [W]e discovered that these seemingly independent news, entertainment, beauty and lifestyle Pages were linked to the Myanmar military."192

145. Rosa Birch, head of Facebook's Strategic Response Team, told NBC in 2019 that "the team worked on a new tool that allows approved non-governmental organizations to flag problematic material they see on Facebook in a way that is seen more quickly by the company than if a regular user reported the material. 'It sounds relatively simple, and something that we should have done a couple of years ago,' she said."193 "When hate speech against the Rohingya minority in Myanmar spread virulently via Facebook in Burmese (a language spoken by some 42 million people) Facebook was slow to act because it had no hate-speech detection algorithm in Burmese, and few Burmese-speaking moderators. But since the Rohingya genocide, Facebook has built a hate-speech classifier in Burmese by pouring resources toward the project. It paid to hire 100 Burmese-speaking content moderators, who manually built up a dataset of Burmese hate speech that was used to train an algorithm."194

189 Emphasis added.
190 Between 2011 and 2017, Facebook reported revenues of $115,357,000,000 and net income of $34,893,000,000. Facebook: annual revenue and net income 2007-2020, STATISTA RESEARCH DEPARTMENT (Feb. 5, 2021), https://www.statista.com/statistics/277229/facebooks-annual-revenue-and-net-income/.
191 Removing Myanmar Military Officials From Facebook, FACEBOOK NEWSROOM (Aug. 28, 2018), https://about.fb.com/news/2018/08/removing-myanmar-officials/ (emphasis added).
192 Id.
146. Recent revelations show, however, that Facebook continues to ignore the harm its algorithms and product inflict in developing countries. A September 2021 Wall Street Journal article based on leaked internal Facebook documents reported:

Facebook treats harm in developing countries as 'simply the cost of doing business' in those places, said Brian Boland, a former Facebook vice president who oversaw partnerships with internet providers in Africa and Asia before resigning at the end of last year. Facebook has focused its safety efforts on wealthier markets with powerful governments and media institutions, he said, even as it has turned to poorer countries for user growth.

'There is very rarely a significant, concerted effort to invest in fixing those areas,' he said.

***

An internal Facebook report from March said actors including some states were frequently on the platform promoting violence, exacerbating ethnic divides and delegitimizing social institutions. 'This is particularly prevalent—and problematic—in At Risk Countries,' the report says.

It continues with a header in bold: 'Current mitigation strategies are not enough.'195

193 David Ingram, Facebook's new rapid response team has a crucial task: Avoid fueling another genocide, NBC (June 20, 2019), https://www.nbcnews.com/tech/tech-news/facebook-s-new-rapid-response-team-has-crucial-task-avoid-n1019821 (emphasis added).
194 Billy Perrigo, Facebook Says It's Removing More Hate Speech Than Ever Before, But There's a Catch, TIME (Nov. 27, 2019), https://time.com/5739688/facebook-hate-speech-languages/.
195 Justin Scheck, Newley Purnell, Jeff Horwitz, Facebook Employees Flag Drug Cartels and Human Traffickers. The Company's Response Is Weak, Documents Show, WALL STREET JOURNAL (Sept. 16, 2021), https://www.wsj.com/articles/facebook-drug-cartels-human-traffickers-response-is-weak-documents-11631812953.
147. The Wall Street Journal article relates one example indicating that Facebook has learned nothing from its experience in Burma:

In Ethiopia, armed groups have used Facebook to incite violence. The company's internal communications show it doesn't have enough employees who speak some of the relevant languages to help monitor the situation. For some languages, Facebook also failed to build automated systems, called classifiers, that could weed out the worst abuses….

***

In a December planning document, a Facebook team wrote that the risk of bad consequences in Ethiopia was dire…. It said in some high-risk places like Ethiopia, 'Our classifiers don't work, and we're largely blind to problems on our site.'

Groups associated with the Ethiopian government and state media posted inciting comments on Facebook against the Tigrayan minority, calling them 'hyenas' and 'a cancer.' Posts accusing Tigrayans of crimes such as money laundering were going viral, and some people on the site said the Tigrayans should be wiped out.

Violence escalated toward the end of last year, when the government launched an attack on the Tigray capital, Mekelle.

Secretary of State Antony Blinken said in March that Tigrayans are victims of ethnic cleansing.196

148. Whistleblower Frances Haugen echoed this sentiment, noting that Facebook's efforts to train its systems in non-English languages are severely lacking, stating "[o]ne of the core things that I'm trying to draw attention to is the underinvestment in languages that aren't English.… Unfortunately the most fragile places in the world are the most diverse when it comes to languages." She went on to say "I saw a pattern of behavior where I believed there was no chance that Facebook would be able to solve these problems in isolation … I saw what I feared was going to happen continue to unfurl … I knew I could never live with myself if I watched 10 million, 20 million people over the next 20 years die because of violence that was facilitated by social media."197

196 Id.
149. Facebook's admissions that it should have done more to prevent the genocide in Burma—and its subsequent efforts, if any—came too late for the tens of thousands of Rohingya who have been murdered, raped, and tortured, and for the hundreds of thousands who are now living in squalid refugee camps and displaced from their homes across the world.

197 Giulia Saudelli, Facebook whistleblower warns company is neglecting languages other than English, DW, https://www.dw.com/en/facebook-whistleblower-warns-company-is-neglecting-languages-other-than-english/a-59739260.

FACTS SPECIFIC TO JANE DOE

150. Plaintiff Jane Doe is a Rohingya Muslim woman who previously lived in Rakhine State, Burma.

151. In 2012, when Plaintiff was about 16 years old, her father was detained, beaten, and tortured for two weeks by the Myanmar military.

152. Around the same time, many young Rohingya girls in Plaintiff's village and nearby villages were being taken from their families. Members of the Myanmar military came to Plaintiff's village, and anyone who left their homes was killed. Plaintiff saw at least seven men killed, as well as an elderly woman. Plaintiff knew that many others in her village were also killed, including women and children, but she could only see those directly in the vicinity of her home.

153. Fearful that she, too, would be abducted and sexually assaulted or killed, Plaintiff's family eventually urged her to flee Burma alone.

154. Plaintiff joined a group of Rohingya fleeing by boat to Bangladesh. She traveled to Thailand and then Malaysia, where the UNHCR eventually arranged for her resettlement in the United States.

155. Plaintiff is gravely concerned about her parents and her sisters, who remain in Burma. Their homes and the small store that was their livelihood were destroyed during ethnic violence. Plaintiff's family land, home, and personal property were eventually seized and those that remained behind in Burma were forced from their homes. They lack any reliable source of income and live in constant fear of further attacks by the Myanmar military or by Buddhist monks.

156. Plaintiff also has an aunt and uncle who fled to a refugee camp in Bangladesh, where they have remained for several years.

157. Plaintiff remains traumatized by the ethnic violence and threats of violence inflicted on her and her family.

158. Plaintiff did not learn that Facebook's conduct was a cause of her injuries until 2021. A reasonable investigation by Plaintiff into the causes of her injuries would not have revealed this information prior to 2021 because Facebook's role in the Rohingya genocide was not widely known or well understood within the Rohingya community. Further, even if such information was known to various journalists or investigators at earlier points in time, Plaintiff's ability to discover such information was significantly hindered by her inability to read or write.
CLASS ACTION ALLEGATIONS

159. Class Definition. Plaintiff seeks to represent the following proposed Class pursuant to California Code of Civil Procedure § 382:

All Rohingya who left Burma (Myanmar) on or after June 1, 2012, and arrived in the United States under refugee status, or who sought asylum protection, and now reside in the United States.

The following are excluded from the Class: (1) any Judge or Magistrate presiding over this action and members of their families; (2) Defendant, Defendant's subsidiaries, parents, successors, predecessors, and any entity in which Defendant or its parents have a controlling interest, and its current or former employees, officers, or directors; (3) Plaintiff's counsel and Defendant's counsel; and (4) the legal representatives, successors, and assigns of any such excluded person.

160. Ascertainability and Numerosity. The Class is so numerous that joinder of all members is impracticable. At least 10,000 members of the Class reside in the United States. Class members are ascertainable and can be identified through public records.
161. Commonality and Predominance. There are many questions of law and fact common to the claims of Plaintiff and the Class and those questions predominate over any questions that may affect individual members of the Class. These common questions of law and fact include:

• Whether Facebook (the product) contains design defects that harmed Rohingya Muslims, and, if so, whether Facebook (the company) is strictly liable for them;

• Whether Facebook owed a duty of care to Rohingya Muslims when entering the Burmese market;

• Whether Facebook breached any duty of care to Rohingya Muslims in the way it operated in Burma; and

• Whether Facebook's Burmese operations caused harm to Rohingya Muslims.
162. Typicality. Plaintiff's claims are typical of the claims of all members of the Class. Plaintiff and the other Class members sustained damages as a result of Defendant's uniform wrongful conduct.

163. Adequacy. Plaintiff will fairly and adequately protect the interests of the Class. Plaintiff has retained counsel with substantial experience in prosecuting complex class actions and particular expertise in litigation involving social media. Plaintiff and her counsel are committed to vigorously prosecuting the action on behalf of the Class and have the resources to do so. Neither the Plaintiff nor her counsel have any interests adverse to those of the other members of the Class. Defendant has no defenses unique to Plaintiff.

164. Superiority. A class action is superior to all other available methods for the fair and efficient adjudication of this controversy and joinder of all members of the Class is impracticable. The members of the proposed Class are, by definition, recent immigrants and lack the tangible resources, language skills, and cultural sophistication to access and participate effectively in the prosecution of individual lawsuits in any forum having jurisdiction over Defendant. A class action in which the interests of the Class are advanced by representative parties therefore provides the greatest chance for individual Class members to obtain relief. Moreover, duplicative individual litigation of the complex legal and factual controversies presented in this Complaint would increase the delay and expense to all parties and impose a tremendous burden on the courts. By contrast, a class action would reduce the burden of case management and advance the interests of judicial economy, speedy justice, and uniformity of decisions.
FIRST CAUSE OF ACTION
STRICT PRODUCT LIABILITY

165. Plaintiff incorporates the foregoing allegations as if fully set forth herein.

166. Facebook makes its social media product widely available to users around the world.

167. Facebook designed its system and the underlying algorithms in a manner that rewarded users for posting, and thereby encouraged and trained them to post, increasingly extreme and outrageous hate speech, misinformation, and conspiracy theories attacking particular groups.

168. The design of Facebook's algorithms and product resulted in the proliferation and intensification of hate speech, misinformation, and conspiracy theories attacking the Rohingya in Burma, radicalizing users, causing injury to Plaintiff and the Class, as described above. Accordingly, through the design of its algorithms and product, Facebook (1) contributed to the development and creation of such hate speech and misinformation and (2) radicalized users, causing them to tolerate, support, and even participate in the persecution of and ethnic violence against Plaintiff and the Class.

169. Because (1) the persecution of the Rohingya by the military government was widely known before Facebook launched its product in Burma and (2) Facebook was repeatedly warned after the launch that hate speech and misinformation on the system was likely to result in ethnic violence, Facebook knew and had reason to expect that the Myanmar military and non-Rohingya civilians would engage in violence and commit atrocities against Plaintiff and the Class.
170. Moreover, the kind of harm resulting from the ethnic violence committed by the Myanmar military and their non-Rohingya supporters is precisely the kind of harm that could have been reasonably expected from Facebook's propagation and prioritization of anti-Rohingya hate speech and misinformation on its system—e.g., wrongful death, personal injury, pain and suffering, emotional distress, and property loss.

171. The dangers inherent in the design of Facebook's algorithms and product outweigh the benefits, if any, afforded by that design.

172. Plaintiff and the Class are entitled to actual damages proximately caused by the defective design of Facebook's algorithms and system.

173. Plaintiff and the Class are further entitled to punitive damages caused by Facebook's failure to correct or withdraw its algorithms and product after Facebook knew about their defects.
SECOND CAUSE OF ACTION
NEGLIGENCE

174. Plaintiff incorporates the foregoing allegations as if fully set forth herein.

175. When operating in Burma—as everywhere—Facebook had a duty to use reasonable care to avoid injuring others.

176. Facebook breached this duty by—among other things—negligently designing its algorithms to fill Burmese users' News Feeds (especially users particularly susceptible to such content) with disproportionate amounts of hate speech, misinformation, and other content dangerous to Plaintiff and the Class; negligently contributing to the creation of hate speech, misinformation, and other content dangerous to Plaintiff and the Class by rewarding (and thus encouraging) users to post ever more extreme content; negligently failing to remove such dangerous content from its system after having been repeatedly warned of the potential for such content to incite violence; negligently making connections between and among violent extremists and susceptible potential violent actors; and negligently allowing users to use Facebook in a manner that Facebook knew or should have known would create an unreasonable risk to Plaintiff and the Class.
177. Because (1) the persecution of the Rohingya by the military government was widely known before Facebook launched its product in Burma and (2) Facebook was repeatedly warned after the launch that hate speech and misinformation on the system was likely to result in ethnic violence, Facebook knew and had reason to expect that the proliferation of such content on its system could incite and facilitate violence and atrocities by the Myanmar military and non-Rohingya civilians against Plaintiff and the Class.

178. Moreover, the kind of harm resulting from the ethnic violence committed by the Myanmar military and their non-Rohingya supporters is precisely the kind of harm that could have been reasonably expected from Facebook's negligent propagation and prioritization of anti-Rohingya hate speech and misinformation on its system—e.g., wrongful death, personal injury, pain and suffering, emotional distress, and property loss.

179. Facebook's acts and omissions in breach of its duty of care were a proximate cause of the persecution of and ethnic violence against—and resulting injuries to—Plaintiff and the Class.

180. Plaintiff and the Class are entitled to actual damages proximately caused by Facebook's negligence with respect to its algorithms and product.

181. Plaintiff and the Class are further entitled to punitive damages caused by Facebook's failure to correct or withdraw its algorithms and system after Facebook knew about their defects.
PRAYER FOR RELIEF

WHEREFORE, Plaintiff Jane Doe, on behalf of herself and the Class, respectfully requests that this Court enter an Order:

A. Certifying the case as a class action on behalf of the Class, as defined above, appointing Plaintiff Jane Doe as representative of the Class, and appointing her counsel as Class Counsel;

B. Declaring that Defendant is strictly liable for defects, as described above, in its algorithms and system; and that Defendant, as described above, acted negligently;

C. Awarding the Class compensatory damages for wrongful death, personal injury, pain and suffering, emotional distress, and loss of property, in the amount of at least $150 billion;

D. Awarding Plaintiff and the Class punitive damages in an amount to be determined at trial;

E. Awarding Plaintiff and the Class their reasonable litigation expenses and attorneys' fees;

F. Awarding the Plaintiff and the Class pre- and post-judgment interest, to the extent allowable; and

G. Awarding such other and further relief as equity and justice may require.
JURY TRIAL

Plaintiff demands a trial by jury for all issues so triable.

Respectfully Submitted,

JANE DOE, individually and on behalf of all others similarly situated,

Dated: December 6, 2021

By:
One of Plaintiff's Attorneys

Rafey S. Balabanian (SBN 315962)
rbalabanian@edelson.com
EDELSON PC
150 California Street, 18th Floor
San Francisco, California 94111
Tel: 415.212.9300
Fax: 415.373.9435

Jay Edelson*
jedelson@edelson.com
J. Eli Wade-Scott*
ewadescott@edelson.com
Michael Ovca*
movca@edelson.com
EDELSON PC
350 North LaSalle, 14th Floor
Chicago, Illinois 60654
Tel: 312.589.6370
Fax: 312.589.6378

Richard Fields*
fields@fieldslawpllc.com
Edward Han*
edhan@fieldslawpllc.com
Martin Cunniff*
martincunniff@fieldslawpllc.com
Awista Ayazi*
ayazi@fieldslawpllc.com
FIELDS PLLC
1701 Pennsylvania Avenue, NW, Suite 200
Washington, DC 20006
Tel: 833.382.9816

Counsel for Plaintiff and the Proposed Class

*Admission pro hac vice to be sought.