Our Content Removal Transparency Report for January to June 2021
Courts and government agencies around the world regularly require that we remove content and information from various Google services like Google Search and YouTube.
We review these demands carefully to determine whether the content at issue violates a specific local legal requirement. Because we value access to information, we work to minimize overreaching removals wherever possible by seeking to narrow the scope of government demands and to ensure that they are authorized by the relevant laws.
For over a decade, we’ve also published a transparency report on Government Requests for Content Removal. This report includes only demands made by governments and courts. We report separately on requests made by private actors under content-removal systems established by various governments, such as the Digital Millennium Copyright Act (DMCA) in the United States or the Right to be Forgotten under the General Data Protection Regulation (GDPR) in the EU.
Over the years, as use of our services has grown, our transparency report has shown a rise in government demands for content removal, both in the volume of requests we receive and in the number of individual items of content we are asked to remove. Today’s transparency report, covering January to June 2021, reflects the highest volumes we’ve seen on both measures to date.
January to June 2021 Data
Top countries by volume of requests:
- Russia
- India
- South Korea
- Turkey
- Pakistan
- Brazil
- United States
- Australia
- Vietnam
- Indonesia
Top countries by volume of items:
- Indonesia
- Russia
- Kazakhstan
- Pakistan
- South Korea
- India
- Vietnam
- United States
- Turkey
- Brazil
See the full report here.
As research by organizations like Freedom House makes clear, all online platforms are seeing a similar trend.
We’re also seeing a significant increase in the number of laws that require information to be removed from online services. These laws vary by country and region, and they require the removal of content on a wide range of issues, from hate speech, adult content, and obscenity to medical misinformation, privacy violations, and intellectual property infringement.
Many of these laws seek to protect people online and align with Google's own platform policies and community guidelines, which help ensure people have a good experience when using our services. But laws in some countries can also go significantly beyond those policies, affecting access to information on a range of topics.
Coupled with this, we’ve also seen new laws that impose individual liability on local employees for actions taken by a company offering online services. These types of laws have drawn concern from organizations like the Global Network Initiative because individuals can be pressured, prosecuted, and held personally liable, even when they are not responsible for the content decisions of the company they work for.
While content removal and local representative laws are often associated with repressive regimes, they are increasingly being adopted beyond such nations. Findings from entities like the UN Office of the High Commissioner for Human Rights (OHCHR), our own transparency report data, and any survey of laws introduced internationally over the past few years all suggest that these types of laws are likely to continue to spread to more countries around the world.