Enhancing Content Moderation on TikTok: Q4 2022 Community Guidelines Enforcement Report

At TikTok, our goal is to create a fun and inclusive environment where people can connect, express themselves, and be entertained. To maintain that environment, we enforce strict Community Guidelines: we remove content that violates them, and we limit the reach of content that may not be suitable for a broad audience.

Safety is an ongoing effort, and we publish regular updates on our progress to hold ourselves accountable. In our Q4 2022 Community Guidelines Enforcement Report, we share insight into the improvements we have made to the accuracy and efficiency of content moderation on TikTok.

Moderating content at TikTok's scale is challenging. We aim to identify and remove violative content as quickly as possible, so we prioritize the swift removal of highly egregious content, such as child sexual abuse material (CSAM) and violent extremism, and we pair automated technology with skilled human moderators to make contextual decisions on nuanced topics like misinformation, hate speech, and harassment.
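To make that tiered triage concrete, here is a minimal, hypothetical sketch in Python. The category names, the confidence threshold, and the routing rules are illustrative assumptions rather than TikTok's actual pipeline; the sketch only mirrors the split described above between fast-tracked egregious content and nuanced content that goes to human review.

```python
# Hypothetical sketch of tiered moderation triage. Categories,
# thresholds, and routing are assumptions for illustration only.

from dataclasses import dataclass

# Categories described above as highly egregious and removed swiftly.
EGREGIOUS = {"csam", "violent_extremism"}
# Nuanced categories that call for contextual human judgment.
NUANCED = {"misinformation", "hate_speech", "harassment"}

@dataclass
class Flag:
    video_id: str
    category: str      # classifier-assigned policy category
    confidence: float  # classifier confidence in [0, 1]

def triage(flag: Flag) -> str:
    """Route a flagged video: remove, send to human review, or clear."""
    if flag.category in EGREGIOUS:
        return "remove"            # fast-track: no reach allowed
    if flag.category in NUANCED:
        return "human_review"      # contextual decision required
    if flag.confidence >= 0.95:    # assumed high-precision cutoff
        return "remove"
    return "human_review"

print(triage(Flag("v1", "violent_extremism", 0.80)))  # -> remove
print(triage(Flag("v2", "harassment", 0.99)))         # -> human_review
```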

In the past, we cast a wide net for content review in order to catch as much violative content as possible, but that approach no longer served our overarching safety goals. As the TikTok community has grown and evolved, so has our approach to moderation: we now focus on accuracy, on minimizing views of violative content, and on removing egregious content quickly.

To that end, we have upgraded our systems to better weigh factors such as the severity of harm a video may cause and its expected reach based on the account's following. These factors help determine whether to remove a video, escalate it for human review, or take another appropriate action. We also use features like age restrictions, ineligibility for recommendation, and our Content Levels system to limit the creation and visibility of certain content, and our proactive detection technology has become more sophisticated at catching spam accounts and duplicative content, reducing the amount of content that requires manual review.
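The severity-and-reach logic can be illustrated with a short sketch. Everything here is assumed for exposition: the harm tiers, the follower-count reach proxy, and the thresholds are invented, and the real systems weigh many more signals. The point is only to show how severity and expected reach could jointly select among removal, escalation, and visibility restriction.

```python
# Illustrative only: tiers, reach proxy, and thresholds are assumptions,
# not the factors or weights TikTok actually uses.

SEVERITY = {"low": 1, "medium": 2, "high": 3}  # assumed harm tiers

def expected_reach(follower_count: int) -> float:
    """Crude reach proxy: more followers, more expected views."""
    return follower_count ** 0.5

def enforcement_action(severity: str, follower_count: int) -> str:
    """Combine harm severity with expected reach to pick an action."""
    risk = SEVERITY[severity] * expected_reach(follower_count)
    if severity == "high":
        return "remove"                    # egregious harm: remove outright
    if risk > 500:                         # assumed escalation threshold
        return "escalate_to_human_review"
    # Lower-harm content can be limited rather than removed, e.g.
    # age-restricted or made ineligible for the For You feed.
    return "restrict_visibility"

print(enforcement_action("high", 100))        # -> remove
print(enforcement_action("medium", 250_000))  # -> escalate_to_human_review
print(enforcement_action("low", 1_200))       # -> restrict_visibility
```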

These changes are already having an impact, as the Q4 Community Guidelines Enforcement Report reflects. Total content removals decreased, for instance, because low-harm content is now made ineligible for the For You feed rather than removed outright. At the same time, the share of content removed accurately by automation increased, reflecting the improved precision of our systems. We expect fluctuations like these; consistently, 1% or less of published content is removed for violating our guidelines. Our aim is to create a safer community and a more consistent experience for creators.

As our systems continue to evolve, we anticipate further fluctuations in these metrics. We have already made significant improvements this year, including a new account enforcement system and a comprehensive refresh of our Community Guidelines, and we are continuing to refine our moderation processes by specializing content moderation teams in particular areas of expertise.

We remain committed to building a safe, inclusive, and authentic home for our global TikTok community, and we will continue to share updates on our work to prevent harm.