Analysis & Opinions - The Washington Post
Two things Facebook still needs to do to reduce the spread of misinformation
As a reporter years ago in Bosnia, I witnessed malicious actors spreading lies on TV that stoked fear and fomented mass violence. In Rwanda, they used radio broadcasts. Over the past decade, this pattern has repeated on a new medium, with even greater reach. In Brazil, Hungary, Myanmar, the Philippines and elsewhere, those aiming to justify human rights abuses, steal elections, or target ethnic and religious minorities have relied on Facebook.
In the United States, Facebook’s weaponization has been well documented. Despite having been overrun by foreign disinformation in 2016, and vowing to combat falsehoods this election cycle, the platform is still not doing enough to stem their spread. Since 2016, user engagement with content from outlets known to continually publish verifiably false information has more than doubled. Disinformation and conspiracy theories — whether smears of political candidates, phony images of discarded ballots or claims that “the left” deliberately infected President Trump with the coronavirus — are being used to deepen polarization, suppress voter turnout and delegitimize the election. Alarmingly, these falsehoods could also fuel civil unrest, ultimately threatening the fabric of American democracy.
Facebook founder and CEO Mark Zuckerberg recently announced steps aimed at protecting election integrity. These measures include adding correction labels to posts that ascribe victory before results are final and removing explicit misrepresentations about how or when to vote — such as announcements that “You don’t need to register to vote this year,” or misleading information about when ballots must be received. Facebook says it deleted more than 120,000 posts attempting “voter interference” between March and September, and that it affixed misinformation warnings to more than 150 million pieces of content viewed in the United States over that time.
Want to Read More?
The full text of this publication is available via the original publication source.
For Academic Citation:
Power, Samantha. “Two things Facebook still needs to do to reduce the spread of misinformation.” The Washington Post, October 23, 2020.