A Genocide Incited by Social Media: Are We Blaming the Tool for Our Crime?
by Yutian Cai | Spring 2019
Following the scandals of Russian interference in the 2016 US presidential election and of Iranian fake political accounts, Facebook is embroiled in yet another controversy, this time accused of inciting the Rohingya genocide in Myanmar. Although Facebook has already taken part of the blame by publicly acknowledging that it was too slow to act in prevention, this paper argues that Facebook is not ethically responsible for the tragedy, given that the Myanmar military government violated Facebook’s intended use by turning it into a propaganda platform, and that Facebook arguably could not have foreseen, and thus could not have better prevented, this incident. Instead, this paper recommends that policymakers focus on regulating and eliminating inappropriate cyber activities with Facebook’s help, rather than letting social media platforms like Facebook take the blame.
Beginning on August 25, 2017, Myanmar military forces and local Buddhist extremists attacked the Rohingya people, looting and burning down villages and torturing, raping, and enslaving Rohingya civilians. These attacks have since been classified by the United Nations-commissioned Fact-Finding Mission and the U.S. State Department as “well-planned and coordinated” and “widespread, systematic, and brutal” genocide, crimes against humanity, and war crimes. Since the outbreak, a growing body of evidence of the genocide, as well as of Facebook’s involvement in inciting it, has outraged the public. A New York Times report from October 15, 2018 detailed the Myanmar military government’s initial steps on Facebook, carried out through disguised private groups with a political agenda. After Russian manipulation of the 2016 US presidential election and Iranian inauthentic political accounts, this is hardly the first time Facebook has been caught in a political scandal. Yet although Facebook’s Chief Operating Officer Sheryl Sandberg has acknowledged that the company has a legal obligation to help take down accounts and posts inciting the violence, the public should be cautious about placing the moral, or even legal, blame for inciting this genocide on Facebook.
First, the Myanmar military’s use of Facebook for political propaganda runs against Facebook’s original intention as stated in its mission, and thus Facebook should not take the moral blame for the Myanmar genocide on the grounds of intention. According to its mission statement, Facebook is a platform for people to “stay connected with friends and family, to discover what's going on in the world, and to share and express what matters to them”. Facebook states clearly that it intends to be a platform for private individuals and organizations, connecting friends and families, not for governments communicating with one another or with their citizens. Information on Facebook is intended to be curated by and targeted toward individuals or individual organizations. It is neither Facebook’s intention nor its responsibility to censor, verify, and own all postings on its platform, because it never claimed to be an official news source or endorsed any announcement made on it. Thus, Myanmar military organizations using Facebook as a propaganda agency to spread “official” and inciting information to their followers goes against Facebook’s original design and intention. Because of this violation of intention, Facebook has the right to take down any information that contradicts its purpose, but it should not be blamed for the consequences of speech and postings that were never meant to appear on Facebook in the first place.
Additionally, given current technology, Facebook arguably could not have foreseen the Myanmar military’s abuse of its platform, and thus should not be punished for something it could not predict and prevent. The New York Times article pointed out that the pages linked to military groups were “seemingly independent entertainment, beauty, and informational pages” that originally looked “devoted to pop stars, models, and other celebrities” or to the “day-to-day life of a soldier”, none of which contradicted Facebook’s intention of promoting daily life and pop culture. From such a beginning, we should not expect Facebook to have foreseen that these pages would turn into military propaganda for religious persecution. Furthermore, even once subtle military connotations and fake news began to emerge on these pages, given the immaturity of artificial intelligence at analyzing the nuanced meanings of human language, it is currently unreasonable to expect Facebook to have an algorithm that detects and takes down all, and only, propaganda. Moreover, given that Facebook is primarily an Internet-based tech company, it is unreasonable to force it to hire thousands of employees to manually comb through accounts, postings, and messages for potentially inappropriate content. Even if Facebook were to do so, such a level of censorship and monitoring of people’s private information would essentially make Facebook an involuntary speech-regulatory agency. Such an approach would infringe on people’s freedom of speech, and the consequent threat to democracy may not be outweighed by the benefit of taking these fake political accounts down earlier. Thus, it is unreasonable to blame Facebook for failing to prevent a tragedy it plausibly did not foresee and had no practical, cost-efficient way to prevent.
Another danger of placing the moral and legal blame on Facebook for this genocide, and for similar events ranging from political propaganda to cyberbullying, is that doing so shifts attention away from the roots of the problem: the underlying tensions and divides between groups, and the people intentionally perpetrating harm. When the public and the media focus on Facebook’s involvement and legal responsibility, they inevitably take time and attention away from the people and organizations generating the hateful posts and from the underlying issues that caused these hate crimes. This leads people to miss the opportunity to address the lack of communication between governments and the public, religious, racial, or ethnic divides, or the more preventable teenage jealousy and rivalry behind much of the cyberbullying that has cost far too many young lives. Even though facing and solving these problems is harder and may take longer than simply blaming Facebook for everything, it is reasonable to believe this approach will eventually be more effective because it addresses the root of the issue. Additionally, when we lift the legal and moral pressure off social media platforms such as Facebook and instead ask them to help address and remediate the consequences of these posts, we give them room to breathe and develop, which in the long term can lead to algorithmic advances that would make detecting and preventing similar threats possible. Thus, I recommend that policymakers and the public not blame Facebook and other social media platforms, but instead seek their help and cooperate with them to solve the problems of our cyber age.
Notes

1. U.S. State Department, Documentation of Atrocities in Northern Rakhine State, August 2018, accessed March 23, 2019,
2. Report of the Detailed Findings of the Independent International Fact-Finding Mission on Myanmar, [Page #], September 17, 2018, accessed March 23, 2019,
3. Paul Mozur, "A Genocide Incited on Facebook, with Posts from Myanmar's Military," The New York Times, last modified October 15, 2018, accessed March 23, 2019,
4. Facebook, Mission Statement, Investor Relations, accessed March 23, 2019,
5. Mozur, "A Genocide," The New York Times.