Google is committed to fighting child sexual abuse material (CSAM) online.
CSAM is illegal and our Terms of Service prohibit using any of Google’s
platforms or services to store or share this content. Across
Google, our teams work around the clock to identify, remove and report this
content, using a combination of industry-leading automated detection tools
and specially trained reviewers. We also receive reports from third parties
and our users, which complement our ongoing work. We report CSAM to the
National Center for Missing and Exploited Children (NCMEC), the
clearing house and comprehensive reporting centre in the United States for
issues related to child exploitation. NCMEC sends those reports to law
enforcement agencies around the world.
This report contains data regarding Google’s global efforts and resources to combat CSAM on our platforms.
When we identify CSAM on our platforms, we make a 'CyberTipline' report to NCMEC. A single report may contain one or more pieces of content, depending on the circumstances. This content could include, for example, images, videos, URLs and/or text soliciting CSAM. A piece of content may be identified in more than one account or on more than one occasion, so this metric may include pieces of content reported more than once.
A report sent to NCMEC may include information identifying the user responsible for the illegal content, the minor victim, the illegal content itself and/or other helpful contextual facts. More than one report may be sent about a particular user or piece of content – for example, where content is identified from multiple sources. NCMEC triages these reports and sends them to law enforcement agencies around the world.
This metric also includes supplemental CyberTipline reports that Google escalates to NCMEC, which provide additional information on egregious cases of sexual abuse of children, including the hands-on or ongoing sexual abuse of children and the production of CSAM.
Examples of the impact of our CyberTipline reports and supplements that we send to NCMEC can be found in the FAQ.
When CSAM is identified in a user’s Google Account, we take appropriate action, including sending a CyberTipline report to NCMEC. We may also take other enforcement actions on the account, including but not limited to disabling or restricting access to services. Affected users are notified about the enforcement actions and are given an opportunity to appeal.
This metric represents the top 10 countries where Google issued CSAM-related account enforcements. This data is based on user country assignments, which are largely determined by either the country from which the user created their account or the country from which the user most often accesses Google services.
This metric represents the number of URLs that we have reported and removed from the Search index. Google Search aggregates and organises information published on the web. We don’t have control over the content on third-party web pages. When we identify CSAM on third-party web pages, we report, de-index and remove that URL from search results, but have no ability to remove the content from the third-party page itself. This metric includes both automated and manual removals.
When we identify new CSAM, we may create a hash of the content and add it to our internal repository. Hashing technology allows us to find previously identified CSAM. We also share hash values with NCMEC so that other providers can access these hashes as well. Contributing to NCMEC’s hash database is one of the key ways of fighting online CSAM across the industry. This metric represents the cumulative number of hashes that Google has contributed to this effort.
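The hash-matching idea described above can be illustrated with a minimal sketch. The example below is not Google’s actual pipeline: it assumes a plain cryptographic hash (SHA-256) and an in-memory set of known hashes, and all function names are hypothetical. Production systems typically pair exact hashes with perceptual hashing so that re-encoded or slightly altered copies of known content still match.

```python
import hashlib

# Hypothetical repository of hashes of previously identified content.
# In practice this would be a large, shared database (for example,
# hashes contributed to NCMEC), not an in-memory set.
known_hashes: set[str] = set()


def hash_content(data: bytes) -> str:
    """Compute a fingerprint of a piece of content.

    SHA-256 is used here for illustration only; real detection systems
    also rely on perceptual hashes, which tolerate small alterations.
    """
    return hashlib.sha256(data).hexdigest()


def matches_known_content(data: bytes) -> bool:
    """Return True if the content matches previously identified material."""
    return hash_content(data) in known_hashes


def add_confirmed_hash(data: bytes) -> None:
    """Record newly confirmed material so future copies are found automatically."""
    known_hashes.add(hash_content(data))
```

In this sketch, content confirmed by a reviewer is recorded via add_confirmed_hash, after which any byte-identical copy is flagged by matches_known_content without further human review.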
Google is committed to working across the industry and with experts and policymakers around the world to combat the spread of CSAM online.
Learn more about how Google detects, removes and reports CSAM.