How we're addressing explicit fake content in Search
Every new technological advancement brings new opportunities to help people, but also new forms of abuse that we need to combat. As generative imagery technology has continued to improve in recent years, there has been a concerning increase in generated images and videos that portray people in sexually explicit contexts, distributed on the web without their consent.
This content is sometimes referred to as explicit “deepfakes,” and it can be deeply distressing for the people it portrays. That's why we've long invested in policies and systems that help people gain more control over this content.
Today, we're sharing a few significant updates, developed based on feedback from experts and victim-survivors, to further protect people: updates to our removal processes that make it easier for people to remove this content from Search, and updates to our ranking systems that keep this type of content from appearing high up in Search results.
Easier ways to remove content
For many years, people have been able to request the removal of non-consensual fake explicit imagery from Search under our policies. We’ve now developed systems to make the process easier, helping people address this issue at scale.
When someone successfully requests the removal of explicit non-consensual fake content featuring them from Search, Google’s systems will also aim to filter all explicit results on similar searches about them. In addition, when someone successfully removes an image from Search under our policies, our systems will scan for – and remove – any duplicates of that image that we find.
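Google hasn't published how its duplicate detection works, but the general technique is well established: compute a perceptual hash for each image, so that near-identical copies (re-encodes, minor crops) produce similar bit strings and can be matched by Hamming distance. Here is a minimal, purely illustrative sketch using a simple average hash over a grayscale pixel grid; all names and thresholds are assumptions, not Google's implementation:

```python
# Illustrative sketch of duplicate detection via perceptual hashing.
# Near-identical images hash to similar bit strings, so copies of a
# removed image can be found by comparing Hamming distances.

def average_hash(pixels):
    """Hash a grayscale pixel grid (rows of 0-255 brightness values).

    Each bit records whether a pixel is brighter than the image mean,
    so small re-encoding artifacts barely change the hash.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    return sum(a != b for a, b in zip(h1, h2))

def is_duplicate(h1, h2, threshold=2):
    """Treat images as duplicates when their hashes differ by few bits."""
    return hamming_distance(h1, h2) <= threshold

# A removed image, a lightly re-encoded copy, and an unrelated image:
original = [[10, 200], [220, 30]]
near_copy = [[12, 198], [223, 28]]   # small compression artifacts
unrelated = [[200, 10], [30, 220]]

print(is_duplicate(average_hash(original), average_hash(near_copy)))  # True
print(is_duplicate(average_hash(original), average_hash(unrelated)))  # False
```

Production systems use far more robust fingerprints, but the shape is the same: once one image is removed under policy, its fingerprint can be matched against newly crawled images at scale.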
These protections have already proven to be successful in addressing other types of non-consensual imagery, and we've now built the same capabilities for fake explicit images as well. These efforts are designed to give people added peace of mind, especially if they’re concerned about similar content about them popping up in the future.
Improved ranking systems
With so much content created online every day, the best protection against harmful content is to build systems that rank high-quality information at the top of Search. So in addition to improving our processes for reporting and removing this content, we are updating our ranking systems for queries where there’s a higher risk of explicit fake content appearing in Search.
First, we’re rolling out ranking updates that will lower explicit fake content for many searches. For queries that are specifically seeking this content and include people’s names, we'll aim to surface high-quality, non-explicit content — like relevant news articles — when it’s available. The updates we’ve made this year have reduced exposure to explicit image results on these types of queries by over 70%. With these changes, people can read about the impact deepfakes are having on society, rather than see pages with actual non-consensual fake images.
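Conceptually, a ranking adjustment like this can be pictured as a reranking layer that penalizes explicit results when a query is flagged as high-risk, letting non-explicit pages such as news articles rise to the top. The sketch below is a toy illustration of that idea only; the function names, scores, and penalty are hypothetical and not Google's actual system:

```python
# Toy illustration (not Google's system) of demoting explicit results
# on queries flagged as seeking fake explicit content about a person.

def rerank(results, query_is_high_risk, explicit_penalty=0.9):
    """Sort results by score, penalizing explicit pages on high-risk
    queries so non-explicit pages (e.g. news coverage) rank higher."""
    def adjusted(result):
        score = result["score"]
        if query_is_high_risk and result["explicit"]:
            score *= (1 - explicit_penalty)  # heavy demotion
        return score
    return sorted(results, key=adjusted, reverse=True)

results = [
    {"url": "example.com/fake-explicit", "score": 0.95, "explicit": True},
    {"url": "example.com/news-article", "score": 0.80, "explicit": False},
]
print([r["url"] for r in rerank(results, query_is_high_risk=True)])
# ['example.com/news-article', 'example.com/fake-explicit']
```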
There’s also a need to distinguish explicit content that’s real and consensual (like an actor’s nude scenes) from explicit fake content (like deepfakes featuring that actor). While differentiating between the two is a technical challenge for search engines, we're making ongoing improvements to better surface legitimate content and downrank explicit fake content.
Generally, if a site has a lot of pages that we've removed from Search under our policies, that's a pretty strong signal that it's not a high-quality site, and we should factor that into how we rank other pages from that site. So we’re demoting sites that have received a high volume of removals for fake explicit imagery. This approach has worked well for other types of harmful content, and our testing shows that it will be a valuable way to reduce fake explicit content in search results.
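The site-level signal described above can be sketched as a simple demotion factor: the higher the share of a site's pages removed under policy, the more its remaining pages are scaled down. Everything in this sketch, including the names and thresholds, is a hypothetical illustration of the idea, not the real ranking formula:

```python
# Hypothetical sketch of a site-level demotion signal: sites with a
# high proportion of pages removed under policy get their other pages
# demoted. Names and thresholds here are illustrative assumptions.

def site_demotion_factor(removed_pages, indexed_pages, threshold=0.05):
    """Return a multiplier applied to a site's ranking scores.

    If more than `threshold` of a site's indexed pages were removed
    for fake explicit imagery, scale its scores down in proportion.
    """
    if indexed_pages == 0:
        return 1.0
    removal_rate = removed_pages / indexed_pages
    if removal_rate <= threshold:
        return 1.0  # ordinary sites are unaffected
    # Demote more heavily as the removal rate rises, never below 0.1.
    return max(0.1, 1.0 - removal_rate)

print(site_demotion_factor(removed_pages=2, indexed_pages=100))   # 1.0
print(site_demotion_factor(removed_pages=40, indexed_pages=100))  # 0.6
```

The key design property is that the signal is per-site rather than per-page: a page that hasn't itself been reported still ranks lower if it lives on a site with a track record of policy removals.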
These changes are major updates to our protections on Search, but there's more work to do to address this issue, and we’ll keep developing new solutions to help people affected by this content. And given that this challenge goes beyond search engines, we’ll continue investing in industry-wide partnerships and expert engagement to tackle it as a society.