TikTok took down more than 113 million videos between April and June of this year, according to the company’s quarterly transparency report published today.
Though the number of videos the platform removed for policy violations is up slightly from the first three months of 2022, it's a drop in the bucket compared to the amount of content shared on TikTok: the 113 million videos represent just 1 percent of all videos uploaded during the three-month period.
The most common reason for removal was violating policies around minor safety, which accounted for just under 44 percent of videos taken down. Other common reasons included illegal activities and regulated goods, as well as adult nudity and sexual activities, according to the report.
An increasing share of the videos TikTok takes down are being removed by the platform's automated systems — around 48 million videos in the last quarter. Nearly 96 percent of videos removed in the three-month period were taken down before a user reported them to TikTok. The company said it uses a combination of automated tools and human review to sift through content that could violate its community guidelines.
“Leveraging machine learning has been especially impactful when it comes to our countering harmful misinformation,” the report reads. “We expanded our capacity to iterate rapidly on our systems given the fast changing nature of misinformation, especially during a crisis or event (e.g. the war in Ukraine or an election).”
In preparation for the US midterm elections in November, TikTok announced last week that it would ban all political fundraising and require government, politician, and political party accounts to apply for verification on the platform. The announcement is part of a larger effort by TikTok to rein in election misinformation and political advertising, which, despite being banned, has continued to slip through via influencer sponcon.