Facebook, Hate Speech, Misinformation, and Myanmar

On Tuesday, October 29, 2019, the Justice Collaboratory welcomed Michael Lwin, CEO of KoeKoe Tech, as he presented "Facebook, Hate Speech, Misinformation, and Myanmar". The event was co-sponsored by the Yale Computation & Society Initiative.

In Myanmar, Facebook is the Internet. The New York Times reported that Facebook was used to spread hate speech and misinformation about the Rohingya, fueling sectarian tensions that erupted in violence: over 700,000 Rohingya were forcibly displaced across the border into Bangladesh, and many thousands were killed, raped, and maimed.

The presentation discussed several problems involving Facebook, hate speech, and misinformation in the Myanmar context, with implications for the United States and for global discourse about Facebook. Namely:

  1. issues with civil society's capacity to apply the definitions of “hate speech” and “misinformation” found in international law and in Facebook’s own Community Policies when flagging Facebook content for review, as Facebook relies heavily on civil society to generate the “cases” for content review in countries with weak freedom of expression and rule of law;

  2. this lack of civil society capacity contributing to a lack of sound “training data”, which computer scientists and computational social scientists need to train algorithms that have predictive accuracy in identifying hate speech and misinformation;

  3. the unintended consequences of the recent $5 billion FTC fine against Facebook, which may chill Facebook’s ability to provide its own training data to researchers;

  4. the American-centric discussion of breaking up Facebook on antitrust grounds missing the implications for the Global South, where China’s WeChat, with its policy of censorship and of funneling user data to the Chinese government, could sweep into countries that Facebook, Messenger, and WhatsApp may have to exit, depending on the nature of a potential antitrust breakup; and

  5. Facebook’s issues in content handling giving rise to a new area of law or “quasi”-law, which we style as Private First Amendment Law, and how regulators, legal academics, computer scientists, technologists, and civil society may need to reason through this new area of law against the backdrop of Facebook’s impending Oversight Board.
