Our Shared Responsibility: YouTube’s response to the Government’s proposal to address harmful content online
At YouTube, our mission is to “give everyone a voice and show them the world.” Implicit in that mission is a sense of responsibility to our community: our users, our creators, and our advertisers. Responsibility is our number one priority, and we want to protect our community while enabling new and diverse voices to break through.
The Government of Canada is drafting legislation to address “Harmful Content Online,” and we are committed to helping it achieve that objective. Everyone deserves to feel safe online. At YouTube, we feel a deep responsibility to keep our users safe and to remove content that violates our policies. Part of that responsibility includes working with governments and other stakeholders to get regulatory frameworks right.
It has been encouraging to see so many voices contribute to the government’s consultation, and transparency around this issue is incredibly important. That’s why, like so many others, we are publicly releasing the submission we made to the Government of Canada. You can read our entire submission here.
YouTube already has robust policies in place for content hosted on our platform, including prohibitions on hate speech, terrorist content, nudity, harassment, and incitement to violence. Through the Global Internet Forum to Counter Terrorism (GIFCT) and the Technology Coalition, we work closely with government partners around the world to tackle illegal content online. We understand the calls for increased transparency, which is why we publish quarterly reports on how YouTube deals with content that violates our Community Guidelines. More than 20,000 people across YouTube and Google work to tackle abuse of our platforms and keep our users safe. You can read more about our approach to moderating content online here. We are committed to sharing our experience and expertise and to offering constructive recommendations to the Canadian government as it develops this new regime.
Our starting point is that the same standards should apply to expression online and offline. Under the Government’s proposal as currently drafted, what is legal to say offline may not be permissible to share online. We believe it is critical that the content regulated by the proposed framework be precisely defined and limited to illegal content, in order to avoid undermining access to information, restricting the exchange of ideas and viewpoints necessary in a democratic society, and creating a legal framework that could be used to censor political speech in the future.
There are aspects of the Government’s proposal that could be vulnerable to abuse and lead to the over-removal of legitimate content. Some categories of content, such as defamation, are highly context-dependent and require nuanced decision-making; we need time to properly review and assess them. Yet the government’s proposal contains a provision that would require platforms to take down user-flagged content within 24 hours. YouTube receives hundreds of thousands of content flags every day. While many are good-faith attempts to flag problematic content, a large number are inaccurate or reflect mere disagreement with views expressed in legitimate content. As Emily Laidlaw and Darryl Carmichael of the University of Calgary’s Faculty of Law point out, user-submitted flags can be used as a tool to harass and infringe on the expression of others, and would “disproportionately impact marginalized, racialized and intersectional groups.” In other words, the proposal could harm the diverse voices we hope will thrive on YouTube.
We act rapidly in response to user flags; however, it is essential to strike the right balance between speed and accuracy. User flags are best treated as signals of potentially violative content, rather than as definitive findings of violation. In Q2 2021, users flagged 17.2 million videos. In that same period, we removed over 6.2 million videos for violating our Community Guidelines, and of those removals, 296,454 were first flagged by users. In other words, user flags led directly to fewer than 300,000 of the more than 6 million removals that quarter, which underscores why flags serve better as signals than as automatic triggers for removal.
We also strongly recommend that the legislation not impose a proactive monitoring requirement, a system in which content is pre-emptively scanned for potentially offensive material before it can be posted. The European Union has already taken a strong stand against proactive monitoring: the EU Commission stated that requiring monitoring “could disproportionately limit users’ freedom of expression and freedom to receive information, and could burden service providers excessively and thus unduly interfere with their freedom to conduct a business.” Imposing proactive monitoring obligations could result in the suppression of lawful expression, including content intended to educate and inform the public about societal challenges, and would be out of step with international democratic norms.
We appreciate the opportunity to share our submission with the Government and with Canadians. Many have voiced concerns with the proposal, and while we share some of those concerns, we also believe there is a path forward. We’re at the table, ready to work hand-in-hand with the government, civil society, and Canadians on this critical issue. We all deserve to feel and be safe online.