This story is part of CNET’s coverage of the voting in November and its aftermath.
TikTok will be ramping up its fact-checking efforts in the coming days with the goal of limiting false or manipulated content as the Nov. 3 US election draws near. Although the video-sharing social network notes its app isn’t designed to report “real-time news,” the company said in a blog post on Wednesday that it’s already making an effort to remove content that could intimidate voters or suppress voting. TikTok will also limit distribution of misleading posts that, for example, falsely claim victory in an election before results are confirmed by The Associated Press.
TikTok also rolled out an election resource last month to help users find trustworthy information about candidates and issues. Like other social media platforms, TikTok grapples with moderating some political topics. Earlier this year, the company moved to ban content tied to a conspiracy theory, although the BBC reported videos on the topic were still available in the app after the ban.
For their part, other social media companies have also increased their vigilance against misinformation surrounding the election. In recent weeks, Facebook has stepped up enforcement against misinformation targeting the US election while it prepares tools to help manage content post-election. Twitter has said it’ll take similar steps, including labeling or removing misleading tweets about election results.
Chinese-owned TikTok has come under fire in recent months, including threats by the Trump administration to ban the app in the US. President Donald Trump signed an executive order in August barring US transactions with TikTok’s parent company, ByteDance, citing “national security” issues. However, a federal judge ruled against the order in September.