How YouTube uses technology to detect violative content: Child Safety
YouTube has strict policies and robust operations in place to tackle content and behavior that is harmful or exploitative to children. YouTube prohibits content that puts minors at risk, including unwanted sexualization, abuse, and harmful and dangerous acts. Uploading, streaming, commenting on, or otherwise engaging in activity that harms minors will result in the removal of the content and may lead to termination of the account.
We have invested heavily in engineering resources to detect child sexual abuse material (CSAM) precisely and effectively, and have long used this technology to prevent the distribution of known child sexual abuse imagery (CSAI) videos on YouTube. Our proprietary CSAI Match technology, which we license to a number of other technology companies free of charge, allows us to detect known CSAI videos.

In cases where a video contains CSAI, or a user solicits CSAI through comments or other communications, our team reports it to the National Center for Missing and Exploited Children (NCMEC), which then liaises with global law enforcement agencies. Once we have identified a video as illegal and reported it to NCMEC, the content is hashed, i.e., given a unique digital fingerprint, and used to detect matching content. This hashing and scanning technology is highly precise at detecting known CSAI and enables us to find illegal content more quickly. We maintain a database of known CSAI hashes, and any content that matches against this list is removed and reported to NCMEC.

Google’s Combating Child Sexual Abuse Material transparency report includes data about YouTube’s efforts to detect, remove, and report CSAM and CSAI. Learn about CSAI Match and our other tools to identify CSAM at scale here.
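To illustrate the hash-matching step described above, here is a minimal sketch in Python. It assumes a hypothetical `KNOWN_CSAI_HASHES` set standing in for the managed hash database, and it uses an exact cryptographic hash (SHA-256) purely for illustration; the actual CSAI Match technology is proprietary and relies on video fingerprints that are robust to re-encoding and editing, which a byte-level hash is not.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for the managed, access-controlled database of
# hashes of previously identified illegal videos.
KNOWN_CSAI_HASHES: set[str] = set()


def fingerprint(video_path: Path) -> str:
    """Compute a SHA-256 digest of the file's bytes.

    An exact cryptographic hash only matches byte-identical copies;
    production systems like CSAI Match use perceptual video fingerprints
    that survive re-encoding, cropping, and other edits.
    """
    digest = hashlib.sha256()
    with video_path.open("rb") as f:
        # Read in 1 MiB chunks so large video files are not loaded
        # into memory all at once.
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def matches_known_csai(video_path: Path) -> bool:
    """Return True if the upload matches a known hash, meaning it should
    be removed and reported (e.g., to NCMEC)."""
    return fingerprint(video_path) in KNOWN_CSAI_HASHES
```

The design point the sketch captures is that matching against a precomputed hash list is a fast set-membership check, which is why, once a video has been identified and hashed, re-uploads of known material can be caught far more quickly than novel content.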
In addition to our long-standing efforts to combat CSAI videos, we have made large investments to detect and remove content that may not meet the legal definition of CSAI but in which minors are still being sexualized or exploited. We continue to invest more resources to ensure children and families have a safe experience on YouTube.
If you come across content that you think is depicting a child in danger or an abusive situation, you can:
- Flag the video: Report videos that contain inappropriate content involving minors by flagging the video for ‘Child Abuse’.
- File an abuse report: If you have found multiple videos, comments, or a user’s entire account that you wish to report, please visit our reporting tool, where you will be able to submit a more detailed complaint.
If we believe a child is in danger based on content that has been reported to us, we will assist law enforcement with investigations into that content.