In a bid to slow the spread of misinformation on its platform, TikTok announced that users who try to share flagged videos will soon be prompted to reconsider. TikTok said in a Wednesday blog post that while it partners with independent fact-checking organizations to vet certain content for removal, the new pop-up prompt will be displayed to users sharing videos for which fact-checks have been inconclusive or whose content is unconfirmed.
The move follows steps TikTok took in December to curb misinformation across its platform, as it and other social media companies face growing scrutiny and calls for regulation.
Under the new process, if a video’s content has been reviewed but can’t be conclusively validated, viewers will first see a banner on the video. The video’s creator will also be notified that their video has been flagged as misleading. If a viewer then attempts to share the video, they’ll be greeted by a pop-up asking whether they’re sure they want to share it.
“We’ve designed this feature to help our users be mindful about what they share,” the company said. “When we tested this approach we saw viewers decrease the rate at which they shared videos by 24%, while likes on such unsubstantiated content also decreased by 7%.”
The new prompt will roll out globally over the coming weeks, starting in Canada and the US, the company said.