Transparency Report

Removals under the Network Enforcement Law

The Network Enforcement Law (NetzDG) came into effect in Germany on October 1, 2017. The law requires social networks to provide an effective and transparent procedure for handling removal complaints about illegal content as defined under NetzDG, and obliges them to publish a transparency report twice a year. We deprecated the service Google+ for all consumers globally on April 2, 2019. By default, this report covers the final reporting period up to that deprecation; data from earlier reporting periods can also be viewed. The current version of the report is also available for download at the end of the report.

This report provides data on our organisation and procedures, on the volume of complaints, and on the volume of removed content. The report also provides general information about our removal practices and policies.

Removing content from Google+ based on the Network Enforcement Law

The Network Enforcement Law (NetzDG) requires social networks with more than 2 million registered users in Germany to carry out a local takedown of “obviously illegal” content (e.g., a video or a comment) within 24 hours of notification. Where the (il)legality is not obvious, the provider normally has up to seven days to decide on the case. On an exceptional basis, it can take longer: for example, if the users who uploaded the content (the users for whom posts, pictures, or comments are stored on Google+, referred to as uploaders) are asked to weigh in, or if the decision is passed on to a joint industry body accredited as an institution of regulated self-regulation. To qualify for removal under NetzDG, content must fall under one of the 21 criminal statutes in the German Criminal Code (StGB) to which NetzDG refers. We also evaluated content under our global Google+ Content Guidelines. If the content violated our global guidelines, we removed it globally. If the content did not fall under these policies, but we identified it as illegal according to one of the 21 statutes of the StGB to which NetzDG refers (§ 1 III NetzDG) or any other local law, we restricted it locally.

The NetzDG also requires social networks to create and publish a report on the handling of such complaints (a transparency report) twice a year. We comply with this obligation by publishing this report. We updated this report for the reporting periods January to June and July to December of each year.

General remarks on how we dealt with allegedly illegal content

We did not allow illegal content on Google+. When we gained knowledge of such content, we reviewed the complaint and locally blocked content that we identified as illegal. More information about our reporting and review mechanisms can be found below.

Total items reported

This data includes only complaints that concern content alleged to be illegal within the meaning of NetzDG. A single complaint may specify several pieces of content, which we call items. We count each individual Google+ post, image, or comment as a single item. The tables below provide data on the total number of items cited in complaints during the 6-month reporting period.

Items reported by submitter

Entity    Items Reported
User      541
Agency    6

The table above shows the number of items reported to us in the reporting period by the type of submitter (reporting agencies and users). This data relies on self-identification at the time of reporting and we cannot verify whether a user who selects “reporting agency” is indeed affiliated with a reporting agency.

Items reported by complaint reason

Category                                 Items Reported
Privacy                                  41
Defamation or Insults                    162
Harmful or Dangerous Acts                30
Sexual Content                           47
Terrorist or Unconstitutional Content    50
Hate Speech or Political Extremism       203
Violence                                 14

The table above shows the number of items reported to us in the reporting period by complaint reason. It reflects only the reason provided by the submitter at the time of reporting.

Verified reporting agencies

Agency              Items Requested    Items Removed
Eco                 0                  0
FSM                 0                  0
jugendschutz.net    0                  0

This table shows the number of items reported to us in this reporting period by the reporting agencies known to us in the area of NetzDG that have a legal mandate to process child sexual abuse imagery (CSAI): Eco, Freiwillige Selbstkontrolle Multimedia-Diensteanbieter e.V. (FSM), and jugendschutz.net.

Removal volume

The tables in this section detail the number of reported items that were removed or blocked in the reporting period.

Items removed by submitter

Entity    Items Removed
User      283
Agency    2

The table above shows the number of items that were removed or blocked in the reporting period by the type of submitter (reporting agencies and users). This data relies on self-identification at the time of reporting and we cannot verify whether a user who selects “reporting agency” is indeed affiliated with a reporting agency.

Items removed by complaint reason

Category                                 Items Removed
Privacy                                  11
Defamation or Insults                    81
Harmful or Dangerous Acts                14
Sexual Content                           28
Terrorist or Unconstitutional Content    38
Hate Speech or Political Extremism       106
Violence                                 7

The table above shows the number of items removed or blocked in the reporting period by complaint reason. It reflects the reason provided by the submitter at the time of reporting, which may not be the same as the actual reason for removal or blocking.

Obtaining additional information

It is difficult for a social network to determine the veracity of a statement made by an uploader or other factual circumstances. NetzDG allows social networks to contact the uploader when a statement of fact is at issue. Getting a reasonable answer from the uploader, though, requires a detailed and legally substantiated complaint from the submitter explaining why the statement in question is allegedly false. Because the majority of legal NetzDG complaints were unsubstantiated (even after Google explicitly asked for further information), there were insufficient grounds to contact the uploaders again in these cases. This was in line with recent Federal Supreme Court case law that requires hosting platforms to contact a content uploader only when there is a substantiated legal complaint.

Interaction with uploader
2
Total number of items we forwarded to the uploader to obtain his or her view on the complaint in question.
Incomplete complaints
106
Total number of items for which we needed more information from the submitter.

Seeking external advice

There are cases in the area of criminal law that are complex and require specific local background knowledge—e.g., about banned organizations, prohibited signs and gestures, or indexed songs. These are circumstances where we may have sought the advice of an external institution, such as outside counsel or one of the self-regulation institutions that specialize in matters related to NetzDG.

Self-regulation institutions
0
Total number of items forwarded to a self-regulation institution.
Outside counsel
0
Total number of items for which we sought advice from outside counsel in preparing a decision.

Turnaround time

The tables in this section show the total number of items we removed or blocked during the reporting period according to turnaround time. Turnaround time is the amount of time that passes between receipt of a complaint and the removal or blocking of reported items.
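
For illustration only, the minimal Python sketch below (using hypothetical timestamps; this is not part of our reporting pipeline) shows how removal events could be grouped into the four turnaround bands used in this section:

# A minimal sketch, not Google's actual tooling, of bucketing removal
# events into the turnaround-time bands used in this report.
from datetime import datetime, timedelta

# Hypothetical (received_at, removed_at) timestamp pairs for removed items.
events = [
    (datetime(2019, 1, 3, 9, 0), datetime(2019, 1, 3, 17, 30)),
    (datetime(2019, 1, 5, 8, 0), datetime(2019, 1, 8, 12, 0)),
]

def turnaround_band(received_at: datetime, removed_at: datetime) -> str:
    """Map the time between complaint receipt and removal to a report band."""
    delta = removed_at - received_at
    if delta < timedelta(hours=24):
        return "Less than 24 hrs"
    if delta < timedelta(hours=48):
        return "Less than 48 hrs"
    if delta < timedelta(weeks=1):
        return "Less than one week"
    return "Longer"

counts: dict[str, int] = {}
for received_at, removed_at in events:
    band = turnaround_band(received_at, removed_at)
    counts[band] = counts.get(band, 0) + 1
print(counts)  # e.g. {'Less than 24 hrs': 1, 'Less than one week': 1}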

Turnaround time by submitter

Time                  Agency    User
Less than 24 hrs      2         255
Less than 48 hrs      0         9
Less than one week    0         14
Longer                0         5

The table above shows the turnaround time for items removed or blocked in the reporting period by the type of submitter (users and reporting agencies). Note again that this data relies on self-identification at the time of reporting and we cannot verify whether a user who selects “reporting agency” is indeed affiliated with a reporting agency. Cases may exceed seven days because of technical issues, complex cases where we sought external advice, or rare languages.

Turnaround time by complaint reason

Reason                                   Less than 24 hrs    Less than 48 hrs    Less than one week    Longer
Privacy                                  7                   1                   3                     0
Defamation or Insults                    69                  6                   6                     0
Harmful or Dangerous Acts                14                  0                   0                     0
Sexual Content                           24                  1                   1                     2
Terrorist or Unconstitutional Content    36                  1                   1                     0
Hate Speech or Political Extremism       100                 0                   3                     3
Violence                                 7                   0                   0                     0

The table above shows the turnaround time for items removed or blocked in the reporting period by complaint reason. It reflects the reason provided by the submitter at the time of reporting, which may not be the same as the actual reason for removal or blocking.

Percentage of reported items that we did not remove or block because the content neither violated our Content Guidelines nor was found to violate the criminal statutes referred to in NetzDG.
Percentage of items that we removed or blocked within 24 hours of receipt of the complaint. In most of these cases, the content was removed globally due to a violation of the Google+ Content Guidelines.

Content Guidelines enforcement

The table below shows a side-by-side comparison of the global removal of reported items due to violation of our Content Guidelines and local-only content restriction based on NetzDG. An item may both violate our Content Guidelines and constitute an infringement of the criminal statutes covered by NetzDG. In such instances, we removed the item globally under our Content Guidelines. For example, as the table shows, the vast majority of items reported for sexual content in the reporting period were removed globally under our Content Guidelines rather than restricted locally under a legal provision.

Content Guideline enforcement versus NetzDG statutes

Category                                 Removed locally (NetzDG)    Removed globally (CG)
Privacy                                  4                           7
Defamation or Insults                    24                          57
Harmful or Dangerous Acts                0                           14
Sexual Content                           0                           28
Terrorist or Unconstitutional Content    10                          28
Hate Speech or Political Extremism       18                          88
Violence                                 2                           5

Mechanisms, notification, reporting methods, and evaluation

Google seeks to preserve freedom of expression and access to information. But in order to maintain a vibrant and enjoyable community on Google+, we balanced those principles with efforts to prevent the spread of content that violates our Content Guidelines and/or the law. Striking this balance is never easy, especially for a global platform operating in societies that have different standards for speech.

Mechanisms for submitting complaints about allegedly illegal content according to NetzDG

To submit a complaint under NetzDG, Google+ provided an easily recognizable, directly accessible, and permanently available in-product reporting flow. For logged-in users, the procedure for submitting legal complaints under NetzDG was directly and intuitively integrated into the flagging flow, which was available next to each post in the upper right corner. When clicking through the flagging flow and selecting the NetzDG option (“I believe this content should be restricted under the Network Enforcement Law.”), the submitter was directed to a legal webform through which he or she could submit a NetzDG complaint.

Google+ also offered a legal NetzDG webform for both logged-in and logged-out users that was directly available through the Google+ imprint (a contact page available to all users in Germany).

As a legal layperson, the average user will likely be overwhelmed when confronted with a complete portfolio of complex offenses, or even deterred from reporting at all. The average user is unlikely to understand and cite the relevant statutes of the StGB when submitting a legal NetzDG complaint. Moreover, some notified content may have violated more than one of the listed offenses. For instance, a video that tries to recruit new members or supporters for a criminal or terrorist organisation (§§ 129, 129a StGB) will usually also contain symbols such as a flag, which may be punishable under §§ 86, 86a StGB; it may also fulfil the offence of preparation of a serious violent offence endangering the state according to § 89a StGB.

Thus, in order to ease the notification process and help submitters report content they believed might be illegal under NetzDG, we created seven content categories in our NetzDG reporting flow that corresponded to and categorized, in a generally understandable way, the 21 relevant criminal offenses. This approach also helped make offenses that are very abstract and broad in scope more tangible (especially for legal laypersons). For example, § 140 StGB refers to offenses as diverse as treason, murder, and other serious crimes such as war crimes, grievous bodily harm, certain crimes endangering the public, and certain serious sexual offenses. Our experience with content notification processes showed that users appreciate such a practical, low-threshold way to submit effective complaints about illegal content.

These categories—and the corresponding criminal offenses we expected to be essentially covered and reported by selecting among them—are:

Hate speech or political extremism

  • § 130 StGB: Incitement to hatred
  • § 166 StGB: Defamation of religions, religious and ideological associations

Terrorist or unconstitutional content

  • § 86 StGB: Dissemination of propaganda material of unconstitutional organizations
  • § 86a StGB: Using symbols of unconstitutional organizations
  • § 89a StGB: Preparation of a serious violent offence endangering the state
  • § 91 StGB: Encouraging the commission of a serious violent offence endangering the state
  • § 100a StGB: Treasonous forgery
  • § 129 StGB: Forming criminal organizations
  • § 129a StGB: Forming terrorist organizations
  • § 129b StGB: Criminal and terrorist organizations abroad; extended confiscation and deprivation
  • § 140 StGB in connection with § 138 I StGB: Rewarding and approving of certain offenses listed in § 138 I StGB
  • § 269 StGB: Forgery of data intended to provide proof

Violence

  • § 131 StGB: Dissemination of depictions of violence

Harmful or dangerous acts

  • § 111 StGB: Public incitement to crime
  • § 126 StGB: Breach of the public peace by threatening to commit offenses
  • § 140 StGB in connection with § 126 I StGB: Rewarding and approving of offenses listed in § 126 I StGB
  • § 241 StGB: Threatening the commission of a felony

Defamation or insult

  • § 185 StGB: Insult
  • § 186 StGB: Defamation
  • § 187 StGB: Intentional defamation

Privacy

  • § 201a StGB: Violation of intimate privacy by taking photographs

Sexual content

  • § 184b StGB: Distribution, acquisition and possession of child pornography in connection with § 184d StGB: Distribution of pornographic performances by broadcasting, media services or telecommunications services
  • § 140 StGB in connection with §§ 176 to 178 StGB: Rewarding and approving of certain offenses listed in §§ 176 to 178 StGB

Based on our long-term global experience with flagging of content, we were convinced that the legal aim of NetzDG as a whole was best supported and achieved by these categories.

Measures to inform the submitter and the uploader according to NetzDG

When we received a complaint under NetzDG through the dedicated NetzDG reporting channels described above, the submitter received an email with a reference number confirming that we had received the complaint and would review it. Once we had reviewed the complaint and the allegedly illegal content, we sent the submitter an email informing him or her of our decision to remove the content or to take no action. We also notified uploaders when they had violated our policies or the law and provided more information on the removal, so that Google+ uploaders were educated about our terms of service.

Methods of reporting

Technology. Our technologies ran constantly to identify Content Guideline violations on Google+, such as spam or sexual content. New rules were added to our algorithms on a weekly and monthly basis. We also used hashing technologies to prevent re-uploads of photos that had been removed from Google+ for certain policy violations, such as child sexual abuse imagery. We used fingerprinting and matching to scan, identify, and block uploaded photos that contain child sexual abuse imagery.
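
As a rough illustration of the hash-matching idea only (not a description of Google's actual systems, which rely on perceptual fingerprints that are robust to re-encoding rather than plain cryptographic hashes), re-upload blocking can be sketched as follows:

# A minimal sketch, assuming a simple exact-match store; SHA-256 is used
# here only for simplicity and stands in for a perceptual fingerprint.
import hashlib

# Hypothetical store of hashes of images already removed for policy violations.
removed_image_hashes: set[str] = set()

def image_hash(image_bytes: bytes) -> str:
    """Hash the raw image bytes; a stand-in for a real fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def record_removal(image_bytes: bytes) -> None:
    """Remember a removed image so identical re-uploads can be blocked."""
    removed_image_hashes.add(image_hash(image_bytes))

def should_block_upload(image_bytes: bytes) -> bool:
    """Block the upload if it matches a previously removed image."""
    return image_hash(image_bytes) in removed_image_hashes

# Usage: once an image is removed, a byte-identical re-upload is blocked.
record_removal(b"...image bytes...")
assert should_block_upload(b"...image bytes...")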

Human flagging: users and top contributors. We had a flagging system that allowed signed-in users to alert us to content that violated the Google+ Global Content Guidelines. This was a voluntary self-regulatory system that existed outside of any legal obligation. Anyone who was signed in to his or her Google account and found a piece of content that potentially violated our Global Content Guidelines could flag it by accessing the flagging option of Google+—represented by three dots in the top right corner—then clicking “report abuse” and selecting the category of the alleged violation. When flagging, users reported which policy they believed the content violated. Policy reporting categories and removal reasons included: sexually explicit content, violent or dangerous content, hateful, harassing or bullying content, and spam. Community flags were assessed purely on the basis of our Content Guidelines.

The Top Contributor program was developed as a way for a subset of pre-defined Google+ users to prioritize for the Google+ content review team. These Top Contributors escalated trends or edge cases based on in-depth product knowledge from the large amount of time spent answering questions on our forums. Top Contributors flagged Content Guideline violations in-product, which were then escalated to the content review team. Flags from Top Contributors were only reviewed under our Content Guidelines. You can learn more about the Top Contributors Program here.

Legal complaints. We had developed a dedicated process so that signed-in users could inform us directly and easily if they believed that content posted on Google+ violated one of the statutes covered by NetzDG. Allegedly illegal content could be reported by accessing the flagging option of Google+ via the three dots in the top right corner of posted content and selecting “I believe this content should be restricted under the Network Enforcement Law.” Signed-in and signed-out users could also file a NetzDG complaint through the NetzDG webform that was accessible through the Google+ imprint. The submitter received a reply confirming that we had received the complaint. These reporting channels enabled users to identify the objectionable item and provide a reason for the legal complaint. This information was necessary for us to conduct a proper legal review and take appropriate action. If the rationale was unclear or there was insufficient justification for a local legal removal, we may have asked the submitter to supply additional information. If the reported content infringed the Google+ Content Guidelines, we removed it globally. If the content did not violate these Guidelines, but did violate one or more of the criminal statutes NetzDG refers to, we blocked the content locally. The submitter received an email notice from Google+ with our decision and the reason for it.

For many years, we have provided other dedicated legal webforms for users to submit legal complaints and we have blocked content we have identified as illegal in the relevant jurisdiction. The submitter has always received feedback on the legal complaint.

Process for evaluation

Human flagging by the Google+ community. When we received a flag, our review teams assessed the content under our global Google+ Content Guidelines. The teams could see the surrounding context during their review of reported content—for example, the headline accompanying a photo on a post or the Google+ community description. These contextual clues are often important factors in evaluating the intent of an upload. For example, a political community on current affairs would likely have been allowed under our global policies; we may have had to discern this context via the community description and other uploaded content. However, the same content uploaded to glorify or encourage hateful views could have violated our Content Guidelines and resulted in a removal.

We had developed Content Guidelines that set the rules of the road on the kind of content we allowed, many of which overlap with NetzDG statutes. These included guidelines prohibiting: hate speech, harassment, bullying, and threats; personal and confidential information; child exploitation; sexually explicit material; violence; and terrorist content. You can read the policies for each of these areas in detail here.

Our review teams were able to take one of several actions: remove content globally if it violated our Content Guidelines; mark it as “not family safe” if it did not violate our guidelines but might not be appropriate for minors; or leave the content live if it was deemed not in violation of our guidelines. In cases of repeat abuse or more egregious violations, we may have penalized the user by disabling certain features or by terminating the account. For egregious violations such as terrorism-related content, we may have terminated the account at the first offense.

General legal reporting. When we received a legal complaint, our review teams performed a review based on the information provided in the complaint and the referenced content. In addition, reviewers saw the surrounding context of the reported content, as described above. If important information was missing from a complaint—for example, the identity of a person affected by allegedly defamatory content—the team may have contacted the submitter to ask for additional information. Once this was received, the team performed a legal assessment. If we identified the content as illegal—for example, for a claimed copyright or personality rights infringement—we blocked the content locally.

NetzDG complaints. When we received a NetzDG complaint, our specialized NetzDG review team (see section “Review teams”), which saw the surrounding context of the reported content, also assessed the content against our global Content Guidelines and removed it globally in case of a violation. If the content did not violate our guidelines, but did violate one or more of the 21 statutes of the StGB covered by NetzDG (§ 1 III NetzDG), we blocked the content locally.
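
Purely as an illustrative sketch with assumed names (not Google's actual tooling), the review order described above amounts to a two-step decision:

# A minimal sketch of the assessment order described in this report:
# content that violates the global Content Guidelines is removed globally;
# otherwise, content found illegal under one of the StGB statutes covered
# by NetzDG is blocked locally in Germany.
from enum import Enum

class Action(Enum):
    REMOVE_GLOBALLY = "remove globally (Content Guidelines)"
    BLOCK_LOCALLY = "block locally in Germany (NetzDG)"
    NO_ACTION = "leave content live"

def review_netzdg_complaint(
    violates_content_guidelines: bool,
    violates_netzdg_statute: bool,
) -> Action:
    """Apply the two-step assessment in the order described above."""
    if violates_content_guidelines:
        return Action.REMOVE_GLOBALLY
    if violates_netzdg_statute:
        return Action.BLOCK_LOCALLY
    return Action.NO_ACTION

# Example: illegal under a NetzDG statute but not against the guidelines.
print(review_netzdg_complaint(False, True))  # Action.BLOCK_LOCALLY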

The assessment of complaints is often not easy. Some of the criminal offenses are difficult to pin down, even for lawyers—e.g., forgery of data intended to provide proof (§ 269 StGB). The whole category of defamation and insults, to give another example, is an area where extensive case law has been developed over the past decades, in particular since the German Constitution came into effect. Thus, when it comes to defamation and insults, only a minority of cases are obviously illegal. Courts sometimes deliberate the legality of a piece of content for years and still come to different conclusions. For example, the Federal Constitutional Court has repeatedly reversed judgments of the Federal Supreme Court, showing that complex balancing tests must be made and that legality depends on the circumstances of the individual case. Unlike a court, a social network does not always have all the necessary information. Furthermore, there is no main proceeding with evidentiary rules. In such cases, the admissibility of content—measured against the specific elements of the offenses—is very difficult to judge and should typically be decided by the responsible courts.

These considerations are also supported by actual practice: many NetzDG complaints in the area of defamation and insults were submitted not by the affected person, but by third parties who assumed that the affected person might feel defamed. Whether that was indeed the case, or whether the affected person actually filed a criminal complaint with the responsible law enforcement authorities (the prosecution of these offenses requires a first-party complaint, “Antragsdelikt”), was not known to us because we were not in a position to verify the identity of the submitter.

NetzDG requests were reviewed by our NetzDG team in two shifts, seven days a week, 365 days a year until the deprecation of Google+, to allow for a global removal or local block of content, as applicable, within the time limits of the NetzDG. If a request was obviously unfounded, the requester was immediately notified in accordance with the legal requirements (see section “Measures to inform the submitter and the uploader according to NetzDG”). If the content did not obviously violate either the global Google+ Content Guidelines or the relevant criminal statutes, was otherwise complex, or did not obviously relate to Germany, the responsible NetzDG content reviewer escalated the request to the next level for prompt review, with the appropriate action then taken by senior content reviewers. Complex requests were passed on to the Google legal team who, if in doubt, further escalated difficult and edge cases to members of the legal department of Google Germany GmbH, who in turn had the option to escalate particularly difficult cases to an external law firm specialised in criminal law. This process usually took up to 7 days.

To ensure that the NetzDG team was operating as intended and was applying the global Google+ Content Guidelines and the criminal offenses under NetzDG correctly and consistently, we implemented a rigorous quality assessment process. In the reporting period we audited approximately 70% of the reviewed content; the quality assessment volume may have changed from week to week depending on incoming request volumes. During this process the quality review team evaluated the decisions taken by each content reviewer, provided individualized feedback, and performed an overall analysis of the results of the quality review. The selected quality sample was the basis for a weekly quality data overview. The quality reviewers were a separate team within the NetzDG team and consisted of senior team members who had previously worked in content review and had substantial experience with the criminal statutes referred to in NetzDG and with the global Google+ Content Guidelines. During weekly meetings between the Google legal team and the NetzDG team, we not only discussed the most recent quality assessment results but also calibrated on particularly interesting, difficult, and complex cases. Furthermore, any notable trends, current “hot topics”, and case law developments were raised and fully discussed in order to ensure a consistent approach across the NetzDG team. When appropriate, we refined the removal policies to adapt to, for example, updates to our Content Guidelines and case law developments. In such instances, new guidance and, where appropriate, training materials were delivered to all members of the NetzDG team.
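
As an illustration only (the sampling mechanism shown is an assumption, not a description of our tooling; only the roughly 70% audit rate comes from the figure reported above), drawing a random audit sample could look like this:

# A minimal sketch of sampling reviewed decisions for a quality audit.
import random

def sample_for_audit(decision_ids: list[str], audit_rate: float = 0.7) -> list[str]:
    """Randomly select a fraction of review decisions for quality assessment."""
    k = round(len(decision_ids) * audit_rate)
    return random.sample(decision_ids, k)

# Usage with hypothetical case identifiers.
decisions = [f"case-{i}" for i in range(100)]
audited = sample_for_audit(decisions)
print(len(audited))  # 70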

Review teams

All policy flags and legal complaints relating to content in Google+ were reviewed by either Google employees or hired vendors. We had a robust quality review framework in place to make sure our global staff was consistently making the best decisions on reported content, and received regular feedback on their performance.

The Google+ policy and enforcement teams, who assessed community flags based on our Google+ Content Guidelines, included people around the world who were fluent in multiple languages, including German. Given the nature of some reported content, Google had a robust wellbeing program, including one-to-one and group counseling and other wellness activities for individual employees.

Some members of the teams working on other legal complaints relating to content on Google+ were German-speaking legal specialists with a law degree. These legal specialists, all trained in local laws, conferred with local Google counsel. If additional legal expertise was needed, the teams sought advice from local outside counsel.

NetzDG team. For NetzDG complaints we had built up a specialized team at an external service provider in Germany (the NetzDG team). The number of content reviewers varied depending on the volume of incoming NetzDG requests. This team was affiliated with YouTube’s NetzDG team. In the reporting period until the deprecation of Google+, the team had 11 members: 8 content reviewers (including senior content reviewers), supervised by one team lead and supported by one quality reviewer and one trainer.

In order to ensure cultural diversity, our reviewers had different professional backgrounds, spoke different languages, and ranged in age from 20 to 45. All NetzDG team members were native German speakers and also spoke English; some spoke one or more of the following languages: French, Japanese, Spanish, Portuguese. This proved useful for assessing a potential link to Germany in foreign-language content. Half of the NetzDG team members had a university degree (Bachelor’s or Master’s) in fields such as political science, media science, health science, or teaching. Some team members had completed apprenticeships in fields such as office communication and foreign trade. The other team members had recently finished their school education.

Each NetzDG team member received general onboarding training on all of the global Google+ Content Guidelines, processes, and our technical systems, as well as legal training on the criminal offenses under NetzDG. In addition to the onboarding training, we delivered mandatory legal NetzDG refresher trainings every 6 months. The refresher trainings were delivered in German by a team that typically consisted of a law professor, a criminal lawyer, and members of the legal teams of Google, in particular Google Germany GmbH. We delivered these trainings in several sessions to ensure that all team members, including the team lead, quality reviewer, and trainer, were able to attend and had sufficient opportunity to discuss questions and debatable examples (collected in advance). In addition, we delivered ad hoc legal trainings as required, which were delivered by the Google legal team with the assistance of members of the legal department of Google Germany GmbH. We used a “train-the-trainer” model, meaning that we trained the designated trainers of the NetzDG team, who were then responsible for delivering the training to the rest of the team. Both the refresher trainings and the ad hoc legal trainings addressed current developments, trends, new case law, and types of requests that were difficult to assess in the previous half-year term.

The NetzDG team also benefited from frequent Content Guidelines refresher trainings. These refresher trainings dealt with new developments and trends relevant to the team, such as hate speech and child safety. In addition, there were specialized trainings for specific areas, such as weapons, harmful and dangerous pranks and challenges, digital security, and hoax content. These Content Guidelines–based trainings were delivered by policy enforcement managers in conjunction with members of Google’s legal team. The NetzDG team also received weekly and urgent updates on the Content Guidelines when relevant.

Robust wellbeing programs and psychological support were offered to the NetzDG team members, including regular and on-request trainings and individual counseling sessions in German through a dedicated team of German-speaking psychologists, therapists, and trainers. The team had access to 24/7 counseling via a support hotline. We also provided facilities that support wellness, including breakout spaces and dedicated private space for individual counseling sessions. This was consistent with the wellbeing program we provide for all review teams across Google. In addition, the team received reduced rates for fitness and gym memberships.

Industry association membership

During the reporting period Google+ was represented via Google in the following associations that are relevant to NetzDG:

The FSM and eco operate hotlines where consumers may call and file complaints about potentially illegal content, which are then forwarded to our review teams for evaluation. In each case, we send detailed feedback about our decision to the reporting hotline.

We believe that collaboration is key. Regardless of the deprecation of Google+, Google works closely with civil society groups whose mission is to eradicate hate speech and discrimination, and with governments, to build our understanding of local context and develop solutions. We regularly review our policy enforcement practices with partners and experts. We also invite NGOs to participate in local or cross-country workshops where we educate them on our policy and product updates, train them in using Google services as well as security measures, and discuss recent challenges and core issues.

Removals under the Network Enforcement Law

Google provides the information about removals, policies, and procedures contained in this report in accordance with Germany’s Network Enforcement Law.

Download the report

Google+ report archive