Transparency Report

Google’s efforts to combat online child sexual abuse material

Google is committed to fighting child sexual abuse material (CSAM) online. CSAM is illegal and our Terms of Service prohibit using any of Google’s platforms or services to store or share this content. Across Google, our teams work around the clock to identify, remove and report this content, using a combination of industry-leading automated detection tools and specially trained reviewers. We also receive reports from third parties and our users, which complement our ongoing work. We report CSAM to the National Center for Missing and Exploited Children (NCMEC), the clearing house and comprehensive reporting centre in the United States for issues related to child exploitation. NCMEC sends those reports to law enforcement agencies around the world.

This report contains data on Google’s global efforts and resources to combat CSAM on our platforms.

Total pieces of content reported to NCMEC

When we identify CSAM on our platforms, we make a 'CyberTipline' report to NCMEC. A single report may contain one or more pieces of content, depending on the circumstances. This content could include, for example, images, videos, URLs and/or text soliciting CSAM. A piece of content may be identified in more than one account or on more than one occasion, so this metric may include pieces of content reported more than once.

Total content reported

CyberTipline reports to NCMEC

A report sent to NCMEC may include information identifying the user responsible for the illegal content, the minor victim, the illegal content itself and/or other helpful contextual facts. More than one report may be sent about a particular user or piece of content – for example, when content is identified from multiple sources. NCMEC triages these reports and refers them to law enforcement agencies around the world.

This metric also includes supplemental CyberTipline reports that Google escalates to NCMEC, which provide additional information on egregious cases of child sexual abuse, including hands-on or ongoing abuse and the production of CSAM.

Examples of the impact of the CyberTipline reports and supplements we send to NCMEC can be found in the FAQ.

Total reports

Accounts enforced for CSAM violations

When CSAM is identified in a user’s Google Account, we take appropriate action, including sending a CyberTipline report to NCMEC. We may also take other enforcement actions on the account, including but not limited to disabling or restricting access to services. Affected users are notified about the enforcement actions and are given an opportunity to appeal.

Total accounts enforced

Accounts enforced per country (top 10)

This metric represents the top 10 countries where Google issued CSAM-related account enforcements. This data is based on user country assignments, which are largely determined by either the country from which the user created their account or the country from which the user most often accesses Google services.

URLs reported and de-indexed for CSAM from Google Search

This metric represents the number of URLs that we have reported and removed from the Google Search index. Google Search aggregates and organises information published on the web; we don’t have control over the content on third-party web pages. When we identify CSAM on a third-party web page, we report it and de-index the URL so that it no longer appears in search results, but we have no ability to remove the content from the page itself. This metric combines both automated and manual removals.
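
To make “de-index” concrete, the sketch below models a search index as a toy inverted index and shows what removing a URL from it does. This is a minimal illustration, not Google Search internals; add_page, deindex and search are hypothetical helpers introduced here.

```python
from collections import defaultdict

# Toy inverted index: term -> set of URLs for pages containing that term.
# A real search index is vastly more elaborate; this only illustrates
# what de-indexing a URL means.
index: defaultdict[str, set[str]] = defaultdict(set)

def add_page(url: str, text: str) -> None:
    """Make a page discoverable by indexing its terms."""
    for term in text.lower().split():
        index[term].add(url)

def deindex(url: str) -> None:
    """Remove a URL from every posting list. The page itself still
    exists on the third-party site; it simply no longer appears in results."""
    for urls in index.values():
        urls.discard(url)

def search(term: str) -> set[str]:
    """Return the URLs currently indexed for a term."""
    return index.get(term.lower(), set())
```

After deindex(url), search(term) no longer returns that URL even though the underlying page is untouched – mirroring the limit described above: removal from search results, not from the web.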

URLs reported and de-indexed

CSAM hashes contributed to the NCMEC database

When we identify new CSAM, we may create a hash of the content and add it to our internal repository. Hashing technology allows us to find previously identified CSAM. We also share these hash values with NCMEC so that other providers can access and use them. Contributing to NCMEC’s hash database is one of the key ways the industry fights online CSAM. This metric represents the cumulative number of hashes that Google has contributed to this effort.
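
The matching step itself can be shown with a minimal sketch. It assumes an exact cryptographic hash (SHA-256) and a hypothetical KNOWN_HASHES set standing in for a shared hash database; production systems typically rely on perceptual hashing, which also matches resized or re-encoded copies of a file, and nothing below is Google’s actual implementation.

```python
import hashlib

# Hypothetical store of hex digests of previously identified content,
# standing in for a shared industry database such as NCMEC's.
KNOWN_HASHES: set[str] = set()  # populated from the shared hash list in practice

def content_hash(data: bytes) -> str:
    """Digest the raw bytes of a file with SHA-256."""
    return hashlib.sha256(data).hexdigest()

def matches_known_content(data: bytes) -> bool:
    """Return True if this exact file matches a previously identified hash."""
    return content_hash(data) in KNOWN_HASHES
```

The set-membership lookup is the same whichever hash function is used; what changes in practice is that a perceptual hash tolerates transformations that an exact digest does not, which is why sharing hash values lets every participating provider detect the same known content.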

Total hashes contributed

Google’s efforts to combat online child sexual abuse material

Google is committed to working across the industry and with experts and policymakers around the world to combat the spread of CSAM online.

Learn more about how Google detects, removes and reports CSAM