The Network Enforcement Law (NetzDG) came into effect in Germany on October 1, 2017. The law required social networks to maintain an effective and transparent procedure for handling removal complaints concerning illegal content as defined under NetzDG, and to publish a transparency report on a biannual basis. However, due to the primacy of the European Union's Digital Services Act (DSA), the NetzDG has not applied to YouTube since August 25, 2023.
The reports still
available here provide data on our organization and procedures, on the volume
of complaints, and on the volume of removed content. The reports also provide
general information about removal practices and policies.
Please note that the reports available here do not contain current information on these matters and may be out of date.
The Network Enforcement Law (NetzDG) requires social networks with more than two million registered users in Germany to carry out a local takedown of 'obviously illegal' content (e.g. a video or a comment) within 24 hours of receiving a complaint about illegal content under the NetzDG (hereinafter 'complaint' or 'NetzDG complaint'). Where the (il)legality is not obvious, the provider normally has up to seven days to decide on the case. In exceptional cases it can take longer, for example if the users who uploaded the content – the users for whom videos or comments are stored on YouTube (uploaders) – are asked to weigh in, or if the decision is passed on to a joint industry body accredited as an institution of regulated self-regulation. To qualify for removal under NetzDG, content needs to fall under one of the 22 criminal statutes in the German Criminal Code (StGB) to which NetzDG refers (§ 1 (3) NetzDG). We review all NetzDG complaints against the criminal offenses listed in § 1 (3) NetzDG. If we identify the content as illegal in accordance with § 1 (3) NetzDG, we restrict it locally. If the content clearly violates our own global YouTube Community Guidelines, we remove it globally.
The Act to Combat Right-Wing Extremism and Hate Crime expanded the scope of the criminal offenses listed in § 1 (3) NetzDG. For example, the offense of disturbing public peace by threatening to commit offenses (§ 126 StGB) and the offense of rewarding and approving offenses (§ 140 StGB) have been amended to now also cover the offense of causing dangerous bodily harm (§ 224 StGB). Furthermore, § 189 StGB (violating the memory of the dead) was added to the criminal offenses listed in the NetzDG on February 1, 2022. These extensions and additions introduce further indeterminate legal terms into the reviews to be carried out, which further increases the risk of incorrect decisions.
The NetzDG also requires social networks to create and publish a report about the handling of such complaints (transparency report) on a biannual basis. We comply with this obligation by publishing this report. We update this report for the reporting periods January to June and July to December of each year. The default report available here covers the previous reporting period, but users can also view data from previous reporting periods. The current version of the report is also available for download at the end of the report.
Today, over five hundred hours of video are uploaded to YouTube every minute, making it one of the largest living collections of human culture ever assembled in one place: a community where people all over the world can create and share ideas and opinions. YouTube needs to ensure that its users abide by rules that serve to protect and maintain the whole community. Our community guidelines prohibit certain categories of material, including sexually explicit content, spam, hate speech, harassment and incitement to violence. If content is in violation of our community guidelines, we remove it or restrict access to it globally. Using a 'people + machine' framework, YouTube has made substantial progress enforcing our guidelines. To find out more, please visit our YouTube Community Guidelines enforcement report.
We respect German and other local laws. When we receive complaints to remove allegedly illegal content, we review each complaint carefully. If the content is in violation of local law, we will locally block the content that we identify as illegal. This is the same approach that we take with any other legal requests for content removal. As we detail in this report, deciding whether content is illegal under local laws can be among the more difficult legal assessment decisions that YouTube reviewers have to make.
YouTube’s efforts to build an engaging online global community cannot succeed without the efforts of a cross-functional team, including policy specialists, lawyers, engineers, product managers, data analysts, content reviewers, operations analysts, emerging threat analysts and many others. In addition, inputs from the global community of users, NGOs, governments and industry partners – including other technology companies – play an important role in shaping YouTube as a platform that works for users and creators all across the globe.
This data includes only complaints that concern content alleged to be illegal within the meaning of sec. 1 para. 3 NetzDG (not, however, complaints received through reporting channels, which clearly do not lead to an assessment according to the NetzDG).
The term complaint refers to single items in this Transparency Report. If several items (e.g. several videos or comments) are subject to one NetzDG complaint, we count one complaint per item. If, for example, we receive a NetzDG complaint covering three different videos, we count three complaints, i.e. one complaint per item reported. We therefore count each individual complaint about a video or comment as a single item in this report. The charts below, labelled as items, thus provide data on the number of complaints regarding the items filed during the six-month reporting period.
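To make this per-item counting convention concrete, here is a minimal, purely illustrative sketch; the complaint structure and field names are assumptions and not YouTube's actual data model.

```python
# Illustrative only: the per-item counting convention described above.
# The complaint structure and field names are assumptions.
complaints = [
    {"id": "A", "reported_items": ["video_1", "video_2", "video_3"]},
    {"id": "B", "reported_items": ["comment_7"]},
]

# Each reported video or comment counts as one item in the statistics.
total_items = sum(len(c["reported_items"]) for c in complaints)
print(total_items)  # 4: three items from complaint A, one from complaint B
```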
The chart above shows the number of items reported to us in the reporting period by the type of submitter (users and reporting agencies). This data relies on self-identification at the time of reporting and we cannot verify whether a user who selects 'reporting agency' is indeed affiliated with a reporting agency.
The chart above shows the number of items reported to us in the reporting period by reason for complaint. This chart reflects the reason for complaint provided by the submitter at the time of reporting.
The charts below show the number of items that were removed or blocked following a NetzDG complaint in the reporting period.
The chart above shows the number of items that were removed or blocked in the reporting period by the type of submitter (users and reporting agencies). This data relies on self-identification at the time of reporting and we cannot verify whether a user who selects 'reporting agency' is indeed affiliated with a reporting agency.
NetzDG complaints | Current reporting cycle | Previous reporting cycle (6 months ago) | Previous reporting cycle (12 months ago) |
---|---|---|---|
Total items reported | 193,131 | 233,440 | 282,858 |
Total items removed / blocked | 30,870 | 32,150 | 50,717 |
Percentage of reported items removed / blocked | 15.98% | 13.77% | 17.93% |
The table above shows the comparison of total number of items reported under NetzDG, total number of items blocked or removed, as well as the percentage of reported items that were blocked or removed for this reporting period and the two previous reporting periods.
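As a quick plausibility check, the percentages in the table can be recomputed from the reported and removed item counts, assuming the percentage is simply items removed or blocked divided by items reported:

```python
# Recomputing the removal percentages from the table above,
# assuming percentage = items removed or blocked / items reported.
periods = {
    "current reporting cycle": (193_131, 30_870),
    "6 months ago": (233_440, 32_150),
    "12 months ago": (282_858, 50_717),
}
for label, (reported, removed) in periods.items():
    print(f"{label}: {removed / reported:.2%}")
# current reporting cycle: 15.98%, 6 months ago: 13.77%, 12 months ago: 17.93%
```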
Agency | Items Requested | Items Removed |
---|---|---|
Eco | 0 | 0 |
FSM | 1 | 0 |
jugendschutz.net | 18 | 12 |
This table shows the number of items reported to us under NetzDG by reporting agencies known to us that have a legal mandate to process child sexual abuse imagery (CSAI) based on agreements with the Federal Criminal Police Office (BKA) and the Federal Agency for the Protection of Children and Young People in the Media (BzKJ): eco, Freiwillige Selbstkontrolle Multimedia-Diensteanbieter e.V. (FSM) and jugendschutz.net. The data in the table shows the first decision associated with a given report; it does not show the final status of an item reported in the reporting period.
NetzDG allows social networks to contact the uploader when a statement of fact is at issue or other factual circumstances are relevant. Getting a meaningful answer from the uploader, however, requires a detailed and substantiated complaint from the submitter explaining why the statement in question is allegedly false. Because the majority of NetzDG complaints are unsubstantiated (even after YouTube explicitly asks for further information), there are no sufficient grounds to contact the uploaders in these cases.
There are cases in criminal law that are complex and require specific legal expert knowledge. These are circumstances where we may seek the advice of an external institution, such as outside counsel or one of the self-regulation institutions that specializes in matters related to NetzDG.
We review all NetzDG complaints about illegal content according to NetzDG to determine whether the reported content violates any of the criminal offenses listed in the NetzDG (§ 1 (3) NetzDG).
The number above shows the number of items removed or blocked during the reporting period that we classified as unlawful under one of the criminal offenses listed in the NetzDG.
The charts in this section show the total number of items that we removed or blocked and that were identified as unlawful under one of the criminal offenses listed in the NetzDG during the reporting period according to turnaround time. Turnaround time is the amount of time that passes between receipt of a complaint and the removal or blocking of the reported items.
Unlike in previous reports, the chart above shows only the turnaround time for items that were removed or blocked and that were identified as unlawful under one of the criminal offenses listed in the NetzDG in the reporting period, broken down by the type of submitter (reporting agencies and users). You will find data on the turnaround time for all items removed or blocked in previous reporting periods in the drop-down menu below the chart. Note: This data relies on self-identification at the time of reporting and we cannot verify whether a user who selects 'reporting agency' is indeed affiliated with a reporting agency. Cases may exceed seven days because of technical issues, because they are complex cases in which we sought external advice, or because the content is in a rare language.
Reason | Less than 24 hrs | Less than 48 hrs | Less than one week | Longer |
---|---|---|---|---|
Privacy | 88 | 3 | 19 | 2 |
Defamation or Insults | 2,240 | 149 | 52 | 30 |
Harmful or Dangerous Acts | 180 | 36 | 30 | 0 |
Sexual Content | 154 | 9 | 5 | 0 |
Terrorist or Unconstitutional Content | 590 | 43 | 31 | 0 |
Hate Speech or Political Extremism | 2,551 | 235 | 129 | 3 |
Violence | 381 | 24 | 16 | 0 |
The charts above show the turnaround time for items removed or blocked and that we identified as unlawful under one of the criminal offenses listed in the NetzDG in the reporting period by complaint reason. These charts reflect the reason provided by the submitter at the time of reporting, which may not be the same as the actual reason for removal or blocking.
Government agencies, courts and parties in civil litigation regularly ask technology and communications companies to turn over user data. In this report, we disclose information about the number and type of requests that we receive from governments.
YouTube seeks to balance and preserve 'Four Freedoms' – freedom of expression, freedom of information, freedom of opportunity and freedom to belong. Enforcing our community guidelines and/or the law is part of balancing these freedoms and preserving the YouTube community. Striking this balance is never easy, especially for a global platform operating in societies that have different standards for speech.
For users in Germany, YouTube provides an easily recognisable, directly accessible and always available in-product reporting flow for complaints under NetzDG. The procedure for logged-in users to submit legal complaints under NetzDG is integrated into the flagging flow, which is accessible under each video and next to each comment (three dots). The submitter needs to click the tick box, 'I believe this content should be restricted under the Network Enforcement Law' in the flagging flow and a short legal web form opens up for the submitter to add additional detail.
YouTube also offers a legal NetzDG web form for all logged-in and logged-out users that is directly available through the link NetzDG complaints in YouTube’s main menu and through the YouTube imprint (a contact page available to all users in Germany).
As a legal layperson, the average user will likely be overwhelmed when confronted with a complete portfolio of complex offenses, or even deterred from any reporting at all. The average user is unlikely to understand and cite relevant statutes of the StGB when submitting a legal NetzDG complaint. Moreover, some notified content may violate more than one of the listed offenses. For instance, a video that tries to recruit new members or supporters for a criminal or terrorist organization (§ 129, 129a StGB) will usually also contain symbols like a flag, which may be punishable under §§ 86, 86a StGB; it may also fulfill the offense of a preparation of a serious violent offense endangering the state according to § 89a StGB. These concerns – which have already been highlighted in previous reports – have been considerably reinforced by the changes introduced by the Act to Combat Right-Wing Extremism and Hate Crime. For example, a threat within the meaning of § 241 of the Criminal Code can also be made ‘via third parties’ if the disclosure to the addressee is encompassed by the perpetrator's intent. The criminal offenses of threat (§ 126 StGB) and menace (§ 241 StGB) therefore lack definiteness and selectivity, especially when the respective content is published on social networks.
Thus, in order to ease the complaint process and help submitters report content that they believe may be illegal under NetzDG, we have created seven content categories in our NetzDG reporting flow that correspond to, reflect and categorize in a generally understandable way the 22 relevant criminal offenses. This approach also helps to make offenses that are very abstract and have a very broad scope more tangible (especially for legal laypersons). For example, § 140 StGB penalizes the rewarding or approving of such diverse offenses as treason, murder and other serious crimes like war crimes, (at least) dangerous bodily harm, certain crimes endangering the public, as well as certain serious sexual crimes. Our experience with content notification processes demonstrates that users appreciate such a true-to-life and low-threshold approach to submitting effective complaints about illegal content. The catalog of § 126 of the Criminal Code referred to by § 140 of the Criminal Code has in turn been expanded. Moreover, for the alternative of approval in § 140 of the Criminal Code, it is now no longer even necessary that the act in question has already been committed or at least attempted in a punishable manner. The technique of referring to catalogs of offenses and the nested examination of offenses overstrains legal laypersons more than ever, precisely because of the future-oriented character of threats. The true-to-life and low-threshold approach used so far will presumably continue to allow the goals of the NetzDG to be taken into account in the best possible way.
These categories – and the corresponding criminal offenses referred to in § 1 (3) NetzDG that we expect to be essentially covered and reported by selecting among them – are:
Hate speech or political extremism
Terrorist or unconstitutional content
Violence
Harmful or dangerous acts
Defamation or insult
Privacy
Sexual content
Based on our long-term global experience with flagging of content, we are convinced that the legal aim of NetzDG as a whole is best supported and achieved by these categories.
When we receive a NetzDG complaint through the reporting procedures described above, the submitter will receive a confirmation email with a reference number stating that we have received their complaint and will review it.
After reviewing a NetzDG complaint, we send the submitter an email with information about our decision under the NetzDG and the reasons for it. This notice also contains information that a criminal complaint can be filed with the law enforcement authorities and that further information can be found at this link, a help page of the Federal Ministry of Justice and Consumer Protection at hilfe-info.de. If YouTube does not comply with a NetzDG complaint regarding a video, the notice will also contain a link through which the submitter can file a request for a reconsideration of the decision under the NetzDG.
If, in response to a NetzDG complaint, we block content for violating one of the offenses set out in NetzDG, we will send an email notification to the uploader with information about the blocking. If a video has been blocked, this notification also contains a link through which the uploader can submit a request for a reconsideration of the decision under the NetzDG. If a video was removed for violating the community guidelines, the uploader will receive a notification with a link to request a community guidelines review of the decision.
When a video is removed based on community guidelines or blocked due to local law, we display a public notice instead of the video, informing users that the video is no longer available.
Legal complaints. As outlined above, we have created additional reporting tools for individuals to report content that allegedly is in violation of NetzDG statutes (NetzDG complaints): a webform that is directly available to all users through the link NetzDG complaints in YouTube’s main menu or through the YouTube imprint or – for signed-in users – by clicking the NetzDG tick box which is incorporated in the German flagging flow, as described above. These reporting channels enable users to identify the objectionable item and provide a reason for the legal complaint. This information is necessary for us to conduct a proper review so that we can take appropriate action. If there is an unclear rationale or insufficient justification for a local legal removal, we may ask the submitter to supply additional information.
Providing an intuitive and easily accessible reporting flow adjacent to the content leads to a high number of clicks and complaints. However, user reporting is not always reliable. Many of the complaints are off topic or unsubstantiated and thus inactionable. Other users submit a complaint without providing any information on why they think the content is illegal. This is especially problematic when the content is not obviously illegal.
Apart from the mechanisms to file NetzDG complaints, we have provided legal web forms to YouTube users (e.g. for personality rights complaints, copyright complaints, trademarks complaints, etc.) for many years prior to NetzDG. So the legal removal process was not introduced through NetzDG; it merely links our existing reporting mechanisms – flagging for alleged community guidelines violations and legal complaints for allegedly illegal content – closer together.
Human flagging. We have a flagging system that enables logged-in users to alert us to content that potentially violates our community guidelines. This flagging system is accessible under each video and next to each comment. The user can choose from different content categories to select the reason that they are reporting the content. Unless the NetzDG tick box is ticked, these flags are assessed purely based on our community guidelines. This is a voluntary self-regulatory system that exists separately from any legal obligation. We have also developed a programme called 'Trusted Flagger' to provide organizations that are particularly effective at telling us about content that violates our Community Guidelines with robust processes and powerful tools for reporting multiple pieces of content. Flags from Trusted Flaggers are assessed purely based on our community guidelines. Trusted Flaggers are NGOs and government agencies who usually have a high accuracy rate and domain expertise that makes their flagging a valuable input for the overall system. Detailed information about the Trusted Flagger programme is available in the YouTube Community Guidelines enforcement report .
Automated matching by machine. YouTube’s enforcement system starts from the point at which a user uploads a video. YouTube utilizes technology to prevent re-uploads of known violative content, including through the use of hashes (or 'digital fingerprints'). Hashes are unique digital fingerprints for images and videos and they help us prevent re-uploads of exact matches to videos removed for community guidelines violations. For some content, like child sexual abuse images (CSAI) and terrorist recruitment videos, YouTube also uses a shared industry database of hashes to increase the volume of content that our machines can catch at upload.
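As a rough illustration of the hash-matching idea, the sketch below checks an upload against a set of known fingerprints. It is a deliberately simplified, assumption-laden example: real systems use perceptual fingerprints that survive re-encoding, whereas the plain SHA-256 of the file bytes used here only catches exact byte-for-byte re-uploads, and the set of known hashes is invented for illustration.

```python
import hashlib

# Simplified sketch of hash-based re-upload matching (not YouTube's actual
# system). A cryptographic hash of the raw bytes only catches exact copies;
# production systems use perceptual fingerprints instead.
known_violative_hashes = {
    # SHA-256 of the byte string b"test", used here as a stand-in entry.
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(video_bytes: bytes) -> str:
    return hashlib.sha256(video_bytes).hexdigest()

def should_block_at_upload(video_bytes: bytes) -> bool:
    return fingerprint(video_bytes) in known_violative_hashes

print(should_block_at_upload(b"test"))       # True: matches the stand-in entry
print(should_block_at_upload(b"new video"))  # False: unknown content goes to normal review
```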
Automated flagging by machine. In June 2017 we began to deploy machine-learning technology to flag violent extremist content for human review. YouTube uses the corpus of videos already reviewed and removed for violent extremism to train machine learning technology to flag new content that might also violate the community guidelines. Using machine-learning technology trained by human decisions means that the enforcement systems adapt and get smarter over time. Because we have seen these positive results, we have begun training machine-learning technology across other challenging content areas, including child safety and hate speech. However, we find that these systems are most effective when there is a clearly defined target that is violative in any context. Machine automation simply cannot replace human judgment and nuance. To learn more, please visit our YouTube Community Guidelines enforcement report .
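The following is a hypothetical, much-simplified sketch of the "flag for human review" idea described above, trained on toy text labels with scikit-learn; the features, model and threshold are assumptions for illustration and do not reflect YouTube's actual classifiers.

```python
# Hypothetical illustration: train on content already reviewed by humans
# (label 1 = removed for a violation), then flag new uploads for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reviewed_texts = [
    "join our fight recruit now",
    "cute cat compilation",
    "graphic propaganda footage",
    "cooking pasta tutorial",
]
reviewed_labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reviewed_texts, reviewed_labels)

# New uploads are only *queued* for review; a human still makes the final call.
for text in ["new recruit propaganda clip", "holiday vlog"]:
    score = model.predict_proba([text])[0, 1]
    if score > 0.5:
        print(f"queue for human review: {text!r} (score {score:.2f})")
```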
Community guideline flagging. YouTube’s globally applicable community guidelines are clear, high-level rules available here. These guidelines have evolved over time as YouTube has grown and user behavior has changed. Users consent to adhere to these guidelines before opening a YouTube channel.
We review items that have been flagged (as outlined above) against all of our community guidelines, but the guidelines prohibit among others nudity or sexual content; harmful or dangerous content; hateful content; violent or graphic content; harassment or cyberbullying; threats and child endangerment.
YouTube ensures consistent enforcement of the community guidelines via a more detailed and living set of interpretation guidelines. For example, the community guidelines prohibit content that promotes terrorism. If a terrorist group creates a new branch, the internal enforcement guidelines may be updated with information about that specific group so that reviewers have the guidance needed to remove content promoting that group. YouTube does not always disclose these kinds of updates to the public because doing so would make it easier for unscrupulous users to evade detection.
The review teams are able to see the surrounding context during their review of reported content including the video description, other content uploaded to the channel and metadata (titles, tags or captions). These contextual clues are important in evaluating the intent of the upload. In addition, our review tool captures the timestamp at which a video was flagged and our web forms ask submitters to include timestamps. This enables our reviewers to focus on the potentially problematic moments within a video.
We have carved out exceptions to the community guidelines for material that is educational, documentary, scientific and/or artistic (EDSA) . Videos and comments that fall under those exceptions are crucial to understanding the world and to chronicling history, whether it is documenting wars and revolutions, or artistic expression that may include nudity. Because of this, YouTube’s enforcement guidelines take great care in helping reviewers understand the EDSA exceptions when reviewing flagged videos. However, even with well articulated guidelines, determining what videos and comments are subject to EDSA exceptions can be among the more difficult policy enforcement decisions that YouTube reviewers have to make.
In general, our review teams will remove content globally if it’s in violation of our community guidelines. Our teams may also take one of several alternative actions:
In cases of repeated abuse or of more egregious violations, we may penalize the user by disabling certain features or by terminating their account. In most cases, the first violation of our community guidelines results in a warning. We then apply a general three-strikes rule under which three policy violations lead to account termination, but we may also terminate an account at the first offense for egregious violations such as terrorism-related content.
Legal complaints. When we receive a legal request, our review teams will perform a legal review based on the information in the request, including the objection raised by the submitter. In addition, they will also see the surrounding context of the reported content (e.g. metadata, title, etc.). If important information is missing from a request – for example, the identity of a person in a defamation complaint – the team will contact the submitter for further information that is necessary for the legal assessment. If we identify the content as illegal, we locally block the content.
NetzDG complaints. When we receive a NetzDG request, our specialized NetzDG review team (see section Review teams), which sees the surrounding context of the reported content, assesses the content against the criminal offenses listed in NetzDG. If the content clearly violates our community guidelines, the NetzDG review team removes it globally. Accordingly, two outcomes are possible for content reported under NetzDG: a video is removed globally if it violates our guidelines, whether or not it additionally violates German law; if there is no violation of the guidelines but we consider the video illegal under § 1 (3) NetzDG or other local legal norms, it is blocked locally.
The assessment of complaints is often not easy. Some of the criminal offenses are difficult to pin down, even for lawyers – e.g. forgery of data intended to provide proof (§ 269 StGB). To give another example, the whole category of defamation and insults is an area where extensive case law has been established over the last decades, in particular since the German Constitution came into effect. Thus, when it comes to defamation and insults, only a minority of cases are obviously illegal. Courts sometimes deliberate the legality of a piece of content for years and still come to different conclusions. For example, the Federal Constitutional Court has reversed judgments by the Federal Supreme Court, showing time and again that complex balancing tests need to be made and that legality is always circumstantial, depending on the factors of an individual case. Unlike in court proceedings, the social network doesn’t always have all necessary information. Furthermore, there is no main proceeding requiring evidentiary rules. In these cases, the admissibility of content – when measured by specific elements of offenses – is very difficult to judge and should typically be decided by the responsible courts.
These considerations are also supported by the actual practice: many NetzDG complaints in the area of defamation and insults are not submitted by the affected person, but rather from third parties who assume that the affected person might feel defamed. Whether that is indeed the case or whether the affected person actually filed a criminal complaint with the respective law enforcement authorities – because the prosecution of these offenses require a first-party complaint ('Antragsdelikt') – is not known to the social network because we are not in a position to verify the identity of the submitter.
NetzDG requests are reviewed by our NetzDG team in two shifts, seven days a week, 365 days a year, to allow for a global removal or local block of content, as applicable, within the time limits of the NetzDG. If a complaint is obviously unfounded, the requester is immediately notified in accordance with the legal requirements (see section 'Measures to inform the submitter and the uploader according to NetzDG'). If the content does not obviously violate either YouTube’s Community Guidelines or the relevant criminal statutes, or the content is otherwise complex or does not obviously relate to Germany, the responsible NetzDG content reviewer escalates the complaint to the next level for prompt review, with the appropriate action then taken by senior content reviewers. Complex requests are passed on to the YouTube Legal Team who, if in doubt, further escalate difficult and borderline cases to members of the legal department of Google Germany GmbH, who in turn have the option to escalate particularly difficult cases to an external law firm specialized in criminal law. This process usually takes up to seven days.
To ensure that the NetzDG team is operating as intended and is applying YouTube’s Community Guidelines and the criminal offenses under NetzDG correctly and consistently, we have implemented a rigorous quality assessment process. In the reporting period we sampled, on average, approximately 30% of the reviewed content from the previous week. The quality assessment volume may change from week to week, depending on incoming complaint volumes. During this process, the quality review team evaluates the decisions taken by each content reviewer, provides individualized feedback and performs an overall analysis of the results of the quality review. The selected quality sample is the basis for a weekly quality data overview. The quality reviewers are a separate team within the NetzDG team and consist of senior team members who previously worked in content review and have substantial experience with the criminal statutes referred to in NetzDG and with YouTube’s Community Guidelines. During weekly meetings between the YouTube Legal Team and the NetzDG team, we not only discuss the most recent quality assessment results, but also calibrate on particularly interesting, difficult and complex cases. Furthermore, any notable trends, current 'hot topics' and case law developments are raised and fully discussed in order to ensure a consistent approach across the NetzDG team. When appropriate, we refine the removal policies to adapt to, for example, updates to YouTube’s Community Guidelines and case law developments. In such instances, new guidance and, where appropriate, training materials are delivered to all members of the NetzDG team.
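A minimal sketch of the weekly sampling step described above, assuming a simple uniform random draw of roughly 30% of the previous week's decisions; the data structures and the exact sampling scheme are assumptions for illustration only.

```python
import random

# Minimal sketch: draw roughly 30% of last week's reviewed decisions
# for re-evaluation by the quality review team (assumed uniform sampling).
SAMPLE_RATE = 0.30

def weekly_quality_sample(reviewed_decisions: list, rate: float = SAMPLE_RATE) -> list:
    sample_size = round(len(reviewed_decisions) * rate)
    return random.sample(reviewed_decisions, sample_size)

last_week = [f"decision_{i}" for i in range(1000)]
print(len(weekly_quality_sample(last_week)))  # 300 decisions go to quality review
```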
Appeals of NetzDG decisions: If we identify a video that has been reported through a NetzDG complaint as unlawful under one of the criminal offenses listed in the NetzDG, the uploader receives a notification with a link to the form to appeal the decision. The same applies to the submitter if the complaint was refused because we did not find a violation of a criminal offense listed in the NetzDG.
If such an appeal based on NetzDG is submitted, a member of the NetzDG team who did not make the original decision will perform a further review under the NetzDG that either maintains or revises the original decision regarding a violation of the criminal statutes listed in the NetzDG. The result of this further review is communicated in an email.
Community guideline appeals: When YouTube takes action on a video that violates our community guidelines, YouTube informs the uploader about the action that has been taken and why. We provide the user with a description, a link where they can learn more about the removal of the content, and a link to an appeals process for a reassessment of the content based on our community guidelines. We have long allowed users to appeal a decision if they believe that their content is not in violation of our community guidelines. The process is explained here in detail. Community guideline appeals and reinstatements are part of the YouTube Community Guidelines enforcement report. Detailed data can be accessed on this web page.
Although technology has become very helpful in identifying some kinds of controversial content – e.g. finding objects and patterns quickly and at scale in images, video and audio – humans are best at assessing context. For example, algorithms cannot always tell the difference between terrorist propaganda and human rights footage or hate speech and provocative comedy. People are often needed to make the final call.
Reviews of community guideline flags and other notices are conducted by our technological systems in conjunction with Google employees and hired vendors. We have a robust quality review framework in place to make sure that our global staff are consistently making the right decisions on reported content and receive regular feedback on their performance. Our review teams consist of thousands of people, fluent in multiple languages, who carefully evaluate flags 24 hours a day, seven days a week, 365 days a year in time zones around the world. These teams include multiple German speakers.
Some members of the teams working on legal complaints are German-speaking legal specialists with a law degree. These specialists, all trained in local laws, confer with local Google counsel. Google employs a large legal team of legal counsel located in many European countries. They are involved as needed in the legal analysis of content reported to us, and teams may seek additional legal advice from local outside counsel.
NetzDG team. For NetzDG requests we have built up a specialized team at an external service provider in Germany (NetzDG team). Depending on the amount of incoming NetzDG requests, the number of content reviewers can vary. In the reporting period, 77 NetzDG team members have been working on NetzDG requests. From this team, 61 team members were content reviewers (including senior content reviewers), supervised by five team leads and supported by six quality reviewers and two trainers.
In order to ensure cultural diversity, our reviewers have different professional backgrounds, speak different languages and come from different age groups; in the reporting period they were between 21 and 55 years of age. All NetzDG team members speak fluent German; most of them are native German speakers. All NetzDG team members also speak English; between 1 and 30 NetzDG team members each speak one or more of the following languages: Turkish, Russian, Spanish, Bulgarian, Italian, Japanese, Serbian, Kurdish and Ukrainian. This proves useful in assessing a potential link to Germany in foreign-language content. In the reporting period, one quarter (25%) of the NetzDG team members held a university degree, such as a Bachelor's, a Master's or a State Exam, in fields such as political science, translation studies, media science, business administration, health science, industrial engineering, archaeology or teaching. One team member has completed a PhD. Approximately 40% of the team have completed vocational training, for example as retail merchants, cooks, bricklayers, or clerks in industry, office, travel or publishing management, or in fields such as information and media technology, logistics management or food technology.
The NetzDG team is trained on both the assessment of the relevant criminal offenses under NetzDG and YouTube’s Community Guidelines on at least a half-yearly basis.
Each NetzDG team member receives general onboarding training on all of YouTube’s Community Guidelines, processes and our technical systems, as well as legal training on the criminal offenses under NetzDG. In addition to the onboarding training, we generally deliver mandatory legal NetzDG refresher training every six months. The refresher training is delivered in German by a team that typically consists of external lawyers and members of the legal teams of YouTube and Google Germany GmbH. Because the team works in shifts, we deliver the training in several sessions to ensure that all team members, including the team leads, quality reviewers and trainers, are able to attend and have sufficient opportunity to discuss questions and debatable examples (also collected in advance). In addition, we deliver ad hoc legal training as required, which is delivered by YouTube’s legal team with the assistance of members of the legal department of Google Germany GmbH. We use a 'train-the-trainer' model, training the designated trainers of the NetzDG team, who are then responsible for delivering the training to the rest of the NetzDG team. Both the refresher training and the ad hoc legal training address current developments, trends, new case law and types of request that were difficult to assess in the previous half-year term. We deliver ad hoc legal training, for example, on subjects such as religious defamation, violent content and insults directed at public figures and politicians.
The NetzDG team also benefits from frequent community guidelines refresher training. This refresher training deals with new developments and trends relevant for the team, such as those around hate speech and child safety. In addition, there is specialized training for specific areas, for example content relating to weapons, harmful and dangerous pranks and challenges, digital security and hoax content. The community guidelines-based training is delivered by policy enforcement managers in conjunction with members of YouTube’s legal team. In addition, the NetzDG team receives weekly and urgent updates on the community guidelines when relevant for the team.
Robust well-being programmes and psychological support are offered to the NetzDG team members, such as regular and on-request training and individual counseling sessions in German, through a dedicated team of German-speaking psychologists, therapists and trainers. The team has access to 24/7 counseling via a support hotline. We also provide facilities that support wellness, including breakout spaces and dedicated private space for individual counseling sessions. This is consistent with the well-being programme that we provide for all review teams across Google and YouTube. In addition, the team receives reduced rates for fitness and gym memberships.
YouTube is represented via Google in the following associations that are relevant to NetzDG:
The FSM and eco operate hotlines where consumers may call and file reports, which are then forwarded to our review teams for evaluation. In each case, we send detailed feedback about our decision to the reporting hotline.
We believe that collaboration is key. We work closely with civil society groups whose mission is to eradicate hate speech and discrimination, and with governments, to build our understanding of local context and develop solutions. We regularly review our policy enforcement practices with partners and experts. We also invite NGOs to participate in local or cross-country workshops where we educate them on our policy and product updates, train them in using Google services as well as security measures, and discuss recent challenges and core issues with them.
There are various initiatives and projects by which Google and YouTube tackle hate speech online and support victims of illegal content.
The Google.org Impact Challenge on Safety is a €10m fund to support organizations across Europe that are working on challenges related to hate, extremism and child safety, both online and offline. By funding new and existing community projects across Europe, Google hopes to support initiatives to counter hate and extremism, and help young people to become confident digital citizens. Winners include the German-based organizations HateAid and Gefangene helfen Jugendlichen e.V. With the grant, HateAid aims to improve the support available for victims of hate speech and hate crime online, to build resilience amongst users who experience this and to re-empower them to go back online so that they don't feel like they have been silenced by hateful sentiment. In order to reduce the recidivism rates among vulnerable German youth, the team from Gefangene helfen Jugendlichen provides youngsters with professional support and programmes, including visits to prisons and discussions with ex-convicts so that vulnerable youth learn about the real life harms of crimes.
For many years, Google has also been committed to promoting media literacy, empowering young people and users of all ages, and supporting teachers with resources for the classroom. We constantly strive to raise awareness about the flagging mechanisms and educate users about Google’s and YouTube’s policies, settings and digital wellbeing tools.
Since 2013, Google has provided funding and support to the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM e. V.) to create and further develop the teaching resources compendium 'Medien in die Schule', which includes lesson plans on shaping opinion online, countering hate speech, reality vs. fiction in the media, and information literacy.
Back in 2009, YouTube, together with local partners and under the patronage of Chancellor Merkel, first launched the '361 Grad Respekt' initiative, which ran multiple times between 2009 and 2014 (from 2010 under the patronage of the acting Family Minister) and was revived under the name #NichtEgal in 2016, with two iterations (2016 & 2018). These initiatives aimed to foster respectful behavior both online and offline and encouraged young people to take a stand against hate and discrimination with video competitions and school workshops.
Funded by a grant from Google’s philanthropic arm, Google.org, the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM e. V.) launched the project 'Weitklick' in May 2020 which aims to develop a blended learning platform to educate teachers about the phenomenon of disinformation and how to address this in class as well as to bring teachers, schools and journalists together.
We continue to invest in a global network of over 300 academics, government partners and NGOs who bring valuable expertise to our enforcement systems, such as, in Germany, the Amadeu Antonio Foundation, Nummer gegen Kummer and the Violence Prevention Network.
In order to enforce the YouTube Community Guidelines YouTube relies on a 'people + machine' framework to flag inappropriate content and allow for an assessment based on the community guidelines. Flags based on potential violations of the community guidelines can come from YouTube’s automated flagging systems, from members of the Trusted Flagger programme (NGOs, government agencies and individuals), or from users in the broader YouTube community.
YouTube has always used a mix of human reviewers and technology to address violative content on the platform, and in 2017 YouTube started applying more advanced machine learning technology to flag content for review by the review teams. This combination of smart detection technology and highly trained human reviewers has enabled YouTube to consistently enforce the policies with increasing speed.
Machines are allowing YouTube to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And YouTube’s investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).
You can access up-to-date removal data on the YouTube Community Guidelines enforcement report here.
Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies.
When we detect a video that violates our guidelines, we remove the video and apply a strike to the channel. We terminate entire channels if they are dedicated to posting content prohibited by our community guidelines or contain a single egregious violation, like child sexual exploitation. The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content.
To read more about how we are fighting this type of content and our investment’s impact, click here.
We’ve also built tools that allow creators to moderate comments on their videos. For example, creators can choose to hold all comments for review, or to automatically hold comments that have links or may contain offensive content. Over one million creators now use these tools to moderate their channel’s comments.
YouTube's Terms of Service are clearly structured and include, among other things, the chapter 'Your content and conduct', which explains to users in user-friendly language what content and what behavior is prohibited on the platform. The Terms of Service expressly state that use of the service is governed by the Terms of Service, YouTube Community Guidelines, YouTube’s Policy, Safety and Copyright Policies (together, the 'Agreement'). According to the Terms of Service, no content may be posted that violates the Agreement or the law. The community guidelines incorporated into the Agreement are structured in a clear and comprehensible way and cover five subject areas: spam and deceptive practices, sensitive content, violent or dangerous content, regulated goods and misinformation. Each of these subject areas is subdivided into further sections that contain the respective policies. Each of these policies starts with a general description of what content is not allowed on YouTube and the reasons for this. Under the heading 'What this policy means for you', the policy then informs users what this specifically means for their conduct on the platform. This is usually followed by an 'Examples' subsection that lists specific examples of content that is not allowed on YouTube under that policy. Finally, in a chapter 'What happens if content violates this policy', users are informed about possible consequences of violations. Thus, these provisions regulate the admissibility of the distribution of content on YouTube in a user-friendly design and language. They contain detailed descriptions of inadmissible conduct and add ostensive examples. The provisions are easy to find, they are worded in a clear and comprehensive manner and thus meet the requirements of §§ 307 Para. 1, 308 and 309 BGB. The provisions in these guidelines are based on objective, verifiable criteria and thus meet the standards of relevant case law. In addition, users are informed in detail how they are notified about any violations of the Terms of Service or community guidelines and how and where they can submit statements or appeal actions.
Each reporting period, we collect a set of examples that better illustrate the decisions we have made regarding allegedly illegal content in order to paint a picture of the breadth of content covered by NetzDG and the types of content we are being asked to remove.
A music video showing a naked man coming out of the ocean was reported several times for illegal sexual content.
A speech by Angela Merkel about whether Islam belongs to Germany was reported for hate speech as well as terrorist content because, according to the submitters, Merkel would endanger the country and lead Germany into a catastrophe.
A video by Jan Böhmermann, in which he subjects a far-right group to critical and satirical discussion, received several reports alleging hate speech, violence, spam, defamation and insults.
A comment that praises “cuddling at home” was flagged for “harmful or dangerous acts”.
A video showing a known German Islamic “hate preacher” calling for “dua” and “tawaf”; “takbir” shouts can be heard from those present. The video was reported under NetzDG for terrorist content several times. As the Da’wa spectrum also allows an interpretation other than pure propaganda, and propaganda could not be identified in the video, outside counsel assessed it as legal.
Google provides the information about removals, policies, and procedures contained in this report in accordance with Germany’s Network Enforcement Law.
YouTube report archive