Facebook could start labeling stories that might be fake. On the heels of criticism that deliberately false content on Facebook helped put Donald Trump in the White House, company founder Mark Zuckerberg announced new steps to cut down on fake news.
It's a change of tune for Zuckerberg, who recently said it was "extremely unlikely" fake news had any influence on the election.
In a post to Facebook Friday night, Zuckerberg addressed how the company plans to handle the "relatively small" percentage of fake news stories.
"The bottom line is: we take misinformation seriously," Zuckerberg said.
According to the Facebook CEO, the company has "reached out" to "respected fact-checking organizations" for third-party verification.
Zuckerberg outlined seven steps he said the company will take to stop the spread of false and misleading information among Facebook's users.
1. Improve detection
2. Make it easier for users to report false news
3. Enable third-party verification
4. Label stories as false
5. Ensure "quality" news appears in the News Feed
6. Crack down on ads with misinformation
7. Work with journalists to develop better fact-checking systems
"Some of these ideas will work well, and some will not," Zuckerberg wrote.
"Voters make decisions based on their lived experience," Zuckerberg said.
Speaking at the Techonomy conference last week, Zuckerberg said fake news makes up only a "very small amount" of the stories on Facebook and that it was "crazy" to think it influenced the election.
"There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg said last week.
A majority of Americans get their news on social media, and according to the Pew Research Center, nearly half get their news from Facebook. An estimated 67 percent of U.S. adults use the site.