For months, U.S. President Donald Trump has tried to ban TikTok, the wildly popular short-video app owned by Chinese technology company ByteDance, on the grounds that the app's Chinese ownership poses a national security threat.
U.S. officials claim that TikTok allows the Chinese government to access user data, censors political topics, and could be used to spread misinformation.
"As far as TikTok is concerned, we're banning them from the United States," Trump said in August. The same month, he said TikTok "can't be controlled, for security reasons, by China. Too big. Too invasive."
But TikTok, which is still operating in the U.S., turned the tables this weekend when it confirmed it was removing some videos of Trump speaking because they violate the company's misinformation policy.
Trump doesn't have a personal TikTok account, but TikTok's restrictions on the president are in line with actions by Twitter, Facebook, Instagram, and Snapchat, all of which suspended Trump's accounts on their platforms following last week's Capitol Hill attack, which Trump incited.
Apart from removing videos of the president's speeches, TikTok blocked certain hashtags related to the Capitol attack, such as #stopthesteal, a campaign to overturn the results of the U.S. presidential election based on Trump's repeated false claims that he won the contest.
"Hateful behavior and violence have no place on TikTok. Content or accounts that seek to incite, glorify, or promote violence violate our Community Guidelines and will be removed," TikTok said in a statement.
TikTok is still alive in the U.S. despite the Trump administration's months-long effort to ban the platform or force its Chinese parent to sell its U.S. operations. The push started in August, when Trump signed an executive order restricting TikTok's U.S. operations over national security concerns. (TikTok has repeatedly denied U.S. government claims that it is a security risk in the U.S., where it has 100 million monthly active users.)