Facebook said disputed stories may also appear lower in the newsfeed. Photograph: Justin Tallis/AFP/Getty Images

Facebook to begin flagging fake news in response to mounting criticism

Disputed articles will be marked with the help of users and outside fact checkers amid widespread criticism that fake news influenced the US election

Facebook will begin flagging fake news stories with the help of users and outside fact checkers, the company announced on Thursday, responding to a torrent of criticism over fake news during the US election.

Readers will be able to alert Facebook to possible fake news stories, which the social media behemoth will then send to outside fact-checking organizations to verify.

Facebook is working with five fact-checking organizations – ABC News, AP, FactCheck.org, Politifact and Snopes – to launch the initiative. If enough of Facebook’s users report a story as fake, the social network will pass it on to these third parties to scrutinize.

If a story is deemed to fail the fact check, it will be publicly flagged as “disputed by 3rd party fact-checkers” whenever it appears on the social network. Users will be able to click on a link to understand why it’s disputed. If a Facebook user then still wants to share the story, they’ll get another warning about its reliability.

Disputed stories may also appear lower in the newsfeed, Facebook said.

“It’s important to us that the stories you see on Facebook are authentic and meaningful,” reads the Facebook press release.

The fact-checking organizations will not be paid to provide this service.

Another change being rolled out identifies stories that are being shared more by people who have only read the headline than by people who have actually clicked on them and read the text. “We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way,” the company said.
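Facebook did not say how this signal is computed. As a hypothetical illustration only, the idea lends itself to a simple ratio check: compare the share rate of people who only saw the headline with the share rate of people who clicked through. The sketch below is an assumption-laden Python example; the function name, inputs and 0.5 threshold are illustrative, not Facebook’s implementation.

```python
# Illustrative sketch only – Facebook has not published how this signal works.
# All names and the threshold below are assumptions for demonstration.

def headline_misleads_readers(shares_without_click: int,
                              shares_after_click: int,
                              clicks: int,
                              impressions: int,
                              threshold: float = 0.5) -> bool:
    """Return True when people who actually read a story share it far
    less often than people who only saw its headline."""
    headline_only_viewers = impressions - clicks
    if clicks <= 0 or headline_only_viewers <= 0:
        return False  # not enough data to compare the two groups

    # Share rate among people who never opened the article.
    share_rate_headline_only = shares_without_click / headline_only_viewers
    # Share rate among people who clicked through and read it.
    share_rate_after_reading = shares_after_click / clicks

    if share_rate_headline_only == 0:
        return False

    # Reading the story makes sharing significantly less likely – the pattern
    # the company describes as a possible sign of a misleading story.
    return share_rate_after_reading / share_rate_headline_only < threshold
```

In the article’s terms, a low ratio means that reading the story made people significantly less likely to share it.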

Facebook is also attempting to reduce the financial incentives to create fake news websites, by making it harder to spoof existing legitimate domains.

In a post on his own Facebook page announcing the changes, founder Mark Zuckerberg admitted the business has a “greater responsibility” to the public than just being a tech company.

He wrote:

While we don’t write the news stories you read and share, we also recognize we’re more than just a distributor of news. We’re a new kind of platform for public discourse – and that means we have a new kind of responsibility to enable people to have the most meaningful conversations, and to build a space where people can be informed.

The fact-checking announcement is a turnaround from 12 November, just days after Donald Trump won the election, when Zuckerberg said of Facebook: “I believe we must be extremely cautious about becoming arbiters of truth ourselves.”

Activist and journalist Daniel Sieradski, who created a browser plug-in called BS Detector that flags questionable news sources, has been a vocal critic of Facebook’s failure to acknowledge any responsibility for the spread of misleading and false information on its platform.

He welcomed Facebook’s announcement. “It seems like a pretty good set of suggestions to me,” he said. However, he’s concerned that the system relies on Facebook users flagging stories as hoaxes. “They will get so many things false-flagged as fake news by people with an axe to grind, so it’s going to make it more challenging to moderate.

“But it’s a step in the right direction.”

Facebook’s role in encouraging and spreading misinformation during the US election – including completely fictional news stories created as a moneymaking scheme by teenagers in Macedonia, as well as inaccurate propaganda – led to the company being accused of abdicating its responsibility and helping to elect Donald Trump. Many fake news stories appeared in Facebook’s “trending” feed, encouraging users to read and share them despite their inaccuracies.

The rise of fake news across Facebook and other social media has quickly become a global problem, with tech companies, including Twitter, rolling out changes in an attempt to thwart the trend.
