TikTok, Like Facebook, Will Limit Spread of Misleading Content About Election Results

TikTok will coordinate with fact-checkers to minimize the likelihood that users see content claiming to share U.S. election outcomes before results are officially confirmed. On Wednesday, the popular short-form video app, which has already implemented policies to limit the spread of false, misleading and manipulated information related to the upcoming general election and other topics, announced plans to extend those procedures to content surrounding the races' conclusions.

"With heightened focus around Election Day, we'll be partnering with these fact checkers to reduce discoverability of content that prematurely claims victory in a race before results are confirmed by the Associated Press," the company said in a statement released Wednesday morning. The APwill declare winning candidates in roughly 7,000 races in the election's aftermath, offering the final word on results of the presidential bid, as well as congressional and state-level races. The news organization is collaborating with Google to provide results on its search engine.

In comments to Newsweek on Wednesday, a TikTok spokesperson said that fact-checking posts and limiting the discoverability of content when necessary "has always been [its] approach" to managing election-related misinformation. When the company identifies disinformation or manipulated media connected to the election, it removes the content.

In detailing its fact-checking endeavor linked to election results specifically, TikTok's Wednesday statement addressed scenarios where research partners are unable to authenticate or disprove claims with certainty.

A sign displayed outside TikTok's headquarters in Culver City, California, pictured on October 13. The company announced plans on Wednesday to limit circulation of content that prematurely declares election results. AaronP/Bauer-Griffin/GC Images

"Out of an abundance of caution, if claims can't be verified or fact-checking is inconclusive, we'll limit distribution of the content," the company's statement continued. "We'll also add a banner pointing viewers to our election guide on content with unverifiable claims about voting, premature declarations of victory, or attempts to dissuade people from voting by exploiting COVID-19 as a voter suppression tactic."

TikTok's approach to election-focused misinformation follows similar policies Facebook rolled out ahead of the election. Like TikTok, Facebook works to identify and remove election disinformation, voter suppression content and fake accounts from its platform. The company also employs a team of fact-checkers to flag other posts containing misinformation about the election.

With more than 1 billion posts created per day, misinformation, especially concerning political issues, has run rampant on Facebook in the past. Since the 2016 election, the tech company has taken steps toward stamping out false claims, inauthentic content and interference by foreign governments.

Recent developments in Facebook's election content management approach have extended to advertising. Facebook implemented a restriction on political advertisements ahead of November 3. The restriction, which went into effect on Tuesday, prevents new advertisements from circulating on Facebook, although ads that previously secured spots on the platform will remain visible until Election Day.

"Generally, we do believe it's important that campaigns can get out the vote, but in the final days we wanted to make sure that there was enough time to contest new claims, which is why we stopped the creation of new ads," a Facebook spokesperson told Newsweek of the restriction period leading up to Tuesday's election. Advertisers whose ads ran before the restrictions went into effect can make certain adjustments during this period, as long as content remains the same.

Facebook will expand the restrictions after polls close, temporarily removing "all social issue, electoral or political ads in the U.S." from view and preventing them from recirculating indefinitely. On Wednesday, the company spokesperson said the procedure aims "to reduce opportunities for confusion or abuse." Facebook will reach out to advertisers when the policy lifts.

Correction: An earlier version of this story said that Facebook's policies on handling election misinformation were much less thorough. Those policies, implemented earlier, are very similar to TikTok's.
