Transparency Report

Featured policies

This section provides more detail about some of the policy areas where our automated flagging systems are instrumental in detecting violative content. Once our automated systems flag potentially problematic content, human reviewers verify whether it does in fact violate our policies and take appropriate action. These review decisions in turn serve as training signals that improve our automated systems' coverage over time.
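As an illustration of this feedback loop, the following is a minimal Python sketch, assuming a simple pipeline in which reviewer verdicts on automatically flagged videos become training labels; the class and function names are hypothetical and do not describe YouTube's internal systems.

```python
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    video_id: str
    model_score: float                 # automated classifier's confidence of violation
    human_verdict: bool | None = None  # set by a human reviewer

def review(item: FlaggedItem, decision: bool) -> None:
    """A human reviewer confirms or overturns the automated flag."""
    item.human_verdict = decision

def training_labels(reviewed: list[FlaggedItem]) -> list[tuple[str, bool]]:
    """Reviewer decisions become labels used to retrain the detection model."""
    return [(item.video_id, item.human_verdict)
            for item in reviewed if item.human_verdict is not None]
```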

Featured Policies: Child Safety

Safeguarding the emotional and physical well-being of minors is a priority for YouTube. We have policies that prohibit harmful and dangerous content involving minors, and we don’t allow sexual content or content with other inappropriate themes involving minors. Child Sexual Abuse Material (CSAM) represents a fraction of a percent of the content we remove. To learn more about how we report illegal CSAM, please see the Combating Child Sexual Abuse Material transparency report.

We also have strict policies around online harassment and bullying. Other sexually explicit content, such as pornography, is not allowed on YouTube and is covered by our policies against nudity and sexual content. Reviewers evaluate flagged content against all of our Community Guidelines, but the numbers in this section cover only content removed for violating our child safety policy.

Community Guidelines and enforcement details

How YouTube uses technology to detect violative content: Child Safety

YouTube has strict policies and robust operations in place to tackle content and behavior that is harmful or exploitative to children. YouTube prohibits content that puts minors at risk, including unwanted sexualization, abuse, and harmful and dangerous acts. Uploading, streaming, commenting, or otherwise engaging in activity that harms minors will result in the content being removed, and the account may be terminated.

We have invested heavily in engineering resources to detect child sexual abuse material (CSAM) in ways that are precise and effective, and have long used this technology to prevent the distribution of known child sexual abuse imagery (CSAI) videos on YouTube. Our proprietary CSAI Match technology, which we license to a number of other technology companies free of charge, allows us to detect known CSAI videos. In cases where a video contains CSAI, or a user solicits CSAI through comments or other communications, our team reports it to the National Center for Missing and Exploited Children (NCMEC), which then liaises with global law enforcement agencies. Once we have identified a video as illegal and reported it to NCMEC, the content is hashed (given a unique digital fingerprint) and used to detect matching content. This hashing and scanning technology is highly precise at detecting known CSAI and enables us to find illegal content more quickly. We maintain a database of known CSAI hashes, and any content matched against this list is removed and reported to NCMEC. Google’s Combating Child Sexual Abuse Material transparency report includes data about YouTube’s efforts to detect, remove, and report CSAM and CSAI. Learn about CSAI Match and our other tools to identify CSAM at scale here.
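To make the hash-and-match step concrete, here is a minimal sketch assuming a plain set of known fingerprints. CSAI Match itself relies on proprietary video-matching technology that can recognize re-encoded or edited copies; the cryptographic hash below is only a stand-in and would match byte-identical files alone.

```python
import hashlib

known_csai_hashes: set[str] = set()  # fingerprints of content already confirmed and reported

def fingerprint(video_bytes: bytes) -> str:
    """Derive a unique digital fingerprint for an uploaded video."""
    return hashlib.sha256(video_bytes).hexdigest()

def matches_known_csai(video_bytes: bytes) -> bool:
    """True if the upload matches previously identified illegal content."""
    return fingerprint(video_bytes) in known_csai_hashes

def register_confirmed_csai(video_bytes: bytes) -> None:
    """After reporting to NCMEC, store the fingerprint so future copies are caught."""
    known_csai_hashes.add(fingerprint(video_bytes))
```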

In addition to our long-standing efforts to combat CSAI video, we have made large investments to detect and remove content that may not meet the legal definition of CSAI but still sexualizes or exploits minors. We continue to invest more resources to ensure children and families have a safe experience on YouTube.

If you come across content that you think is depicting a child in danger or an abusive situation, you can:

  • Flag the video: Report videos that contain inappropriate content involving minors by flagging the video for ‘Child Abuse’.
  • File an abuse report: If you have found multiple videos, comments, or a user’s entire account that you wish to report, please visit our reporting tool, where you will be able to submit a more detailed complaint.

If we believe a child is in danger based on content that has been reported to us, we will assist law enforcement with their investigations into the content.

Priority Flaggers: Child Safety

Across our policy areas, we continue to invest in a network of over 300 government partners and NGOs who bring valuable expertise to our enforcement systems, including through our Priority Flagger program. Participants in the program receive training in enforcing YouTube’s Community Guidelines, and because their flags have a higher action rate than the average user’s, we prioritize them for review. Priority Flaggers also have a direct line of communication with our Trust & Safety teams for quicker issue resolution. Content they flag is subject to the same policies as content flagged by any other user and is reviewed by teams trained to decide whether content violates our Community Guidelines and should be removed.
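A rough sketch of such prioritization, assuming a simple two-tier review queue in which Priority Flagger reports are surfaced to reviewers first; the tier values and names are illustrative, not a description of YouTube's actual triage system.

```python
import heapq
import itertools

FLAGGER_TIER = {"priority_flagger": 0, "user": 1}  # lower tier = reviewed sooner

_order = itertools.count()  # tie-breaker: first-in, first-out within a tier
review_queue: list[tuple[int, int, str]] = []

def enqueue_flag(video_id: str, flagger_type: str) -> None:
    """Queue a flag; Priority Flagger reports jump ahead of ordinary user flags."""
    heapq.heappush(review_queue, (FLAGGER_TIER[flagger_type], next(_order), video_id))

def next_for_review() -> str:
    """Reviewers apply the same Community Guidelines regardless of who flagged."""
    return heapq.heappop(review_queue)[2]
```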

Age-restricted videos: Child Safety

Some videos don't violate our policies, but may not be appropriate for all audiences. In these cases, our review team may place an age restriction on the video when we're notified of the content. Age-restricted videos are not visible to users who are logged out, who are signed in and under 18 years of age, or who have Restricted Mode enabled.
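The visibility rule above amounts to a simple check; the sketch below is purely illustrative, with hypothetical parameter names.

```python
def can_view_age_restricted(signed_in: bool, age: int | None, restricted_mode: bool) -> bool:
    """Age-restricted videos are visible only to signed-in viewers who are
    18 or older and do not have Restricted Mode enabled."""
    if not signed_in or restricted_mode:
        return False
    return age is not None and age >= 18
```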

When evaluating whether content is appropriate for all ages, we consider factors like whether the video has vulgar language, violence and disturbing imagery, nudity and sexually suggestive content, or portrayal of harmful or dangerous activities. Videos that are made to appear as family content but contain adult themes will likely be age-restricted so that only people over 18 can view the content.

Flagged video process examples: Child Safety

These are examples of videos that were flagged as potentially violating our Community Guidelines. These examples provide a glimpse of the range of flagged content that we receive and are not comprehensive.

Example 1
  • Flagging reason: Child abuse
  • Flagger type: Priority Flagger
  • Video description: A video depicting a minor in non-sexual activity, with a video title sexualizing the minor.
  • Outcome: The video violated child safety policies prohibiting content that sexualizes minors, and the channel was removed.

Example 2
  • Flagging reason: Child abuse
  • Flagger type: Priority Flagger
  • Video description: A video that solicited sexual imagery from minors at school.
  • Outcome: The video violated child safety policies prohibiting content that sexualizes minors, and the channel was removed.

Example 3
  • Flagging reason: Child abuse
  • Flagger type: User
  • Video description: A news clip from a Russian television broadcast in which a military official discusses the situation in Dagestan; the clip did not feature children.
  • Outcome: Content did not violate policy. No action taken.

Example 4
  • Flagging reason: Child abuse
  • Flagger type: User
  • Video description: A video from a prominent environmental organization about the life of young lions.
  • Outcome: Content did not violate policy. No action taken.

Example 5
  • Flagging reason: Child abuse
  • Flagger type: User
  • Video description: A cover by an Eastern European group of a popular music video, with commentary about bullying.
  • Outcome: Content did not violate policy. No action taken.

YouTube Community Guidelines enforcement

Viewers and Creators around the world use YouTube to express their ideas and opinions. YouTube’s approach to responsibility involves four Rs: Remove violative content, Raise authoritative voices, Reduce recommendations of borderline content, and Reward trusted creators.

Learn more at How YouTube Works