YouTube removed 11.4 million videos between April and June, with the vast majority — 10.85 million — flagged by automated systems alone, the company said Tuesday. Due to the COVID-19 pandemic, the video sharing site said, it had “greatly reduced human review capacity” to double-check whether videos breached its user policies. As a result, it decided to “over-enforce” by leaning on automated systems, removing more than double the number of videos it took down between January and March.
“The decision to over-enforce in these policy areas — out of an abundance of caution — led to a more than 3x increase in removals of content our systems suspected was tied to violent extremism or was potentially harmful to children,” YouTube explained. “This includes dares, challenges, or other innocently posted content that might endanger minors.”
Usually, YouTube relies on automation to flag videos, which are then assessed by human reviewers. Since it was relying on automation alone this time, it added staff to the appeals process so that disputed removals could be reviewed quickly. Fewer than 3% of removals resulted in an appeal, but YouTube ended up reinstating videos at double the rate of the previous quarter.
Of the removed videos, 3.8 million were taken down for child safety reasons, 3.2 million for spam or scams, 1.7 million for nudity or sexual content, 1.2 million for violence, and 900,000 for promoting violence.
Just 382,000 videos were flagged for removal by users, 167,000 by individual trusted flaggers, 2,220 by NGOs and 25 by government agencies. Three-quarters of the removed videos were taken down before they had received more than 10 views.