February 14, 2022
TikTok, the world’s leading short-form video platform, has removed more than 6 million videos from Pakistan, placing the country fourth in the world for the volume of videos taken down for Community Guidelines violations in Q3 2021.
TikTok released its Community Guidelines Enforcement Report on Monday, detailing the volume and nature of violative content and accounts removed from the platform in Q3 2021.
Under the guidelines, 73.9% of content promoting harassment and bullying was proactively removed, while 72.4% of hateful-behaviour videos were removed before anyone reported them.
The report provides insight into content removed for violating the Community Guidelines, reinforcing the platform’s public accountability to the community, policymakers, and NGOs.
Meanwhile, 91 million videos were removed globally between July 1 and September 30, 2021, comprising around 1% of all videos uploaded. Nearly 95% of those videos were removed before a user reported them, 88% before they received any views, and 93% within 24 hours of being posted.
TikTok has announced updates to its Community Guidelines to further support the well-being of its community and the integrity of the platform. These updates clarify or expand upon the types of behaviour and content that will be removed from the platform or made ineligible for recommendation in the ‘For You’ feed.
Over the coming weeks, every TikTok member will be prompted to read the updated guidelines when they open the application.
To protect the security, availability, and reliability of the platform, the video app is expanding its policy to prohibit unauthorised access to the platform, as well as to TikTok content, accounts, systems, or data. Using the application to perpetrate criminal activity is also prohibited.
In addition to educating the community on ways to spot, avoid, and report suspicious activity, the platform is opening state-of-the-art cyber-incident monitoring and investigative-response centres in Washington DC, Dublin, and Singapore this year.
TikTok continues to expand its system that detects and removes certain categories of violations during upload, including adult nudity and sexual activities, child safety violations, and illegal activities and regulated goods.
As a result, the volume of automated removals has increased, improving the overall safety of the app and allowing the team to focus on reviewing contextual or nuanced content, such as hate speech, bullying, harassment, and misinformation, which in turn improves the efficacy, speed, and consistency of moderation on the platform.
The improvement stems from combining technology with content moderation by a dedicated investigations team deployed to identify videos that violate policies.
To better enforce these policies, moderators also receive regular training, enabling them to identify content that features reappropriation, slurs, and bullying.
The guidelines apply to everyone and to all content on the platform, with the aim of maintaining a standard of content appropriate for a general audience, which includes everyone from teens to great-great-grandparents.
To learn more about content guidelines and policies on TikTok, refer to the Community Guidelines.