Takedowns on YouTube skyrocket during pandemic as AI replaces human moderators (2020)

Summary:

YouTube has always faced an uphill battle when it comes to content moderation. A decade ago, the job was already impossible to handle with human moderators alone. According to stats released by the platform in 2010, uploads to YouTube had surpassed 18 million hours of video per year. Broken down into day-to-day terms, YouTube’s combination of human and AI moderators was expected to handle 35 hours of uploads per minute, all day, every day.

A decade later, that rate had increased to 500 hours of uploads per minute. YouTube’s algorithms were continually tweaked to handle this exponential increase. Human moderators remained active, even as their share of overall moderation continued to diminish as uploads grew.
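As a rough check on those figures, the conversion between annual totals and per-minute rates is straightforward. Below is a minimal sketch in Python; the constants are simply the averages implied by the numbers above, not anything YouTube itself published in this form:

    # Illustrative arithmetic only: converting the upload figures cited
    # above between per-year totals and per-minute rates.

    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

    def to_hours_per_minute(hours_per_year: float) -> float:
        # Average per-minute upload rate implied by an annual total.
        return hours_per_year / MINUTES_PER_YEAR

    def to_hours_per_year(hours_per_minute: float) -> float:
        # Annual total implied by a steady per-minute upload rate.
        return hours_per_minute * MINUTES_PER_YEAR

    print(to_hours_per_minute(18_000_000))  # ~34.2 -> the "35 hours per minute" of 2010
    print(to_hours_per_year(500))           # 262,800,000 hours per year by 2020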

But human moderators remained a key part of the process… right up until a global pandemic forced YouTube to send them home to curb the spread of the coronavirus.

In the first quarter of 2020 (January–March), before the pandemic response took full effect, YouTube takedowns of uploaded videos neared 6 million, with human moderators contributing nearly 400,000 of those removals.

As the pandemic’s infection rate climbed, so did YouTube’s takedowns. With human moderators contributing far less, most of the workload shifted to YouTube’s automated processes, which almost immediately produced a spike in removals.

According to YouTube’s transparency report, takedowns nearly doubled during the following quarter as the platform began relying more heavily on automated content removal.

YouTube was upfront about this spike in takedowns, which it had reasonably anticipated as the outcome of increased automation. Particularly during the second quarter of 2020, amid global uncertainty over the pandemic and mandated lockdowns, YouTube relied heavily on automation. In a blog post, YouTube acknowledged that a direct result of its pandemic-related changes was the automated takedown of some content that did not actually violate its policies.

Because responsibility is our top priority, we chose the latter—using technology to help with some of the work normally done by reviewers. The result was an increase in the number of videos removed from YouTube; more than double the number of videos we removed in the previous quarter. For certain sensitive policy areas, such as violent extremism and child safety, we accepted a lower level of accuracy to make sure that we were removing as many pieces of violative content as possible. This also means that, in these areas specifically, a higher amount of content that does not violate our policies was also removed.

YouTube

It also noted that this led to nearly double the number of appeals from users. Unfortunately, due to COVID-related limitations, YouTube was unable to backstop the expanded use of automated moderation with human review. What it could do was direct more of its human resources toward handling appeals, more quickly reinstating content that had been removed despite not violating the platform’s rules.

Company considerations:

  • With the physical health of human moderators a priority, what steps can be taken to prepare for unforeseen circumstances like these?
  • What can be done to facilitate/scale up remote human moderation efforts in cases where it’s unsafe for employees to be physically present at work?
  • What can be learned from unprecedented events like these to reduce collateral damage to users and their content? How can this information be applied to head off future moderation disruptions?

Issue considerations:

  • With limited human resources, how should content moderation be prioritized? Which types of questionable content should accept a higher risk of over-moderation?
  • If disruptive events like this become more frequent, what is the potential fallout from over-moderation by AI? How can this be addressed in a way that satisfies users?

Resolution:

As of this writing, mutations of the COVID-19 virus have made it impossible for YouTube to return to normal. YouTube had originally planned to bring its workers back to the office on January 10, 2022. That plan has changed: Google employees will retain the option to work from home for the time being, meaning the company will continue to rely more heavily on its automated moderation options.


Written by The Copia Institute, November 2021