Facebook has been displaying news according to users' preferences, a practice that can suppress stories conflicting with their point of view. During the election, people saw fake stories that may have affected their votes. One frequent fake-news writer bragged, "I think Trump is in the White House because of me. His followers don't fact-check anything - they'll post everything, believe anything."
BuzzFeed analyzed Facebook engagement data and found that the top 20 fake election news stories generated more shares, comments, and reactions than the top 20 legitimate election news stories. Although Mark Zuckerberg at first denied any connection between fake stories on Facebook and the election outcome, he later posted a plan for the social network to address fake news.
Zuckerberg posted, "[W]e don't want any hoaxes on Facebook." Still, he said, fake news is difficult to identify and quite uncommon:
"Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other."
This is a clever defense, but the top 20 fake news stories alone drew more than 8,711,000 shares, reactions, and comments; that is not an insignificant number.
Meanwhile, a small group of students at a Princeton University hackathon wrote FiB, which uses an algorithm to distinguish true news from false.
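FiB's actual method isn't described here, but one simple signal a tool like this could use is checking a story's source domain against hand-curated credibility lists. The sketch below is purely illustrative; the domain lists and labels are assumptions for the example, not FiB's real data or logic.

```python
# Toy illustration of domain-based credibility checking.
# The TRUSTED and KNOWN_FAKE lists are hypothetical examples.
from urllib.parse import urlparse

TRUSTED = {"apnews.com", "reuters.com", "bbc.com"}
KNOWN_FAKE = {"denverguardian.com", "abcnews.com.co"}  # known impostor sites

def assess(url: str) -> str:
    """Return a rough credibility label for a news URL."""
    domain = urlparse(url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]  # normalize "www.example.com" to "example.com"
    if domain in KNOWN_FAKE:
        return "not verified"
    if domain in TRUSTED:
        return "verified"
    return "unknown"

print(assess("https://www.reuters.com/article/example"))  # verified
print(assess("http://abcnews.com.co/fake-story"))         # not verified
```

A real system would need far more than a static list — fake-news sites appear and vanish quickly — which is part of why Zuckerberg called the problem difficult.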
- How do you assess news on Facebook or other social media sites?
- Have you ever posted something that you knew wasn't true? Why, and did you consider the potential consequences?
- Do you think fake news stories influenced the election? How?