Improving Search to better protect people from harassment
Over the past two decades of building Google Search, we’ve continued to improve and refine our ability to provide the highest quality results for the billions of queries we see every day. Our core principles guide every improvement, as we constantly update Search to work better for you. One area we’d like to shed more light on is how we balance maximizing access to information with the responsibility to protect people from online harassment.
We design our ranking systems to surface high-quality results for as many queries as possible, but some types of queries are more susceptible to bad actors and require specialized solutions. One such example is websites that employ exploitative removal practices: sites that charge people to take down content about them. Since 2018, we've had a policy that enables people to request that pages about them on these sites be removed from our results.
Beyond removing these pages from Google Search results, we also use these removals as a demotion signal, so that sites with these exploitative practices rank lower in results. This industry-leading approach has been effective in helping people who are victims of harassment from these sites.
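As a rough illustration of what a demotion signal can mean in practice, here is a minimal sketch in Python. It is not a description of Google's ranking systems; the function, the flagged-site list and the demotion factor are all hypothetical, chosen only to show how a site-level flag could lower the scores of pages from sites with exploitative practices.

```python
# Illustrative sketch only: the signal name, weight and scoring model here are
# hypothetical, not a description of Google's production ranking systems.

EXPLOITATIVE_DEMOTION = 0.4  # hypothetical multiplier applied to flagged sites


def apply_demotion(base_score: float, site: str, flagged_sites: set[str]) -> float:
    """Lower the score of pages hosted on sites flagged for exploitative
    removal practices (sites that charge people to take content down)."""
    if site in flagged_sites:
        return base_score * EXPLOITATIVE_DEMOTION
    return base_score


# Example: a page from a flagged site drops below an unflagged competitor.
flagged = {"exploitative-example.test"}
results = [
    ("exploitative-example.test/page",
     apply_demotion(0.9, "exploitative-example.test", flagged)),
    ("reputable-example.test/page",
     apply_demotion(0.7, "reputable-example.test", flagged)),
]
results.sort(key=lambda r: r[1], reverse=True)
print(results)  # the reputable site now ranks first (0.7 vs. 0.36)
```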
However, we have found extraordinary cases of repeated harassment. The New York Times highlighted one such case and shed light on some limitations of our approach.
To help people who are dealing with extraordinary cases of repeated harassment, we're improving our approach to further protect known victims. Now, once someone has requested a removal from one site with predatory practices, we will automatically apply ranking protections to help prevent content from other, similar low-quality sites from appearing in search results for their name. We're also looking to expand these protections further as part of our ongoing work in this space.
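To make the idea concrete, here is a hypothetical sketch of how such a protection could work: a single verified removal request registers the person's name, and later queries for that name demote results from any site known to use similar predatory practices. The names, data structures and demotion factor below are illustrative assumptions, not Google's implementation.

```python
# Hypothetical sketch: after one verified removal from a predatory site, demote
# content from other known similar sites for queries on the person's name.

known_predatory_sites = {"pay-to-remove-a.test", "pay-to-remove-b.test"}
protected_names: set[str] = set()


def register_removal_request(person_name: str) -> None:
    """Record that this person has a verified removal, which switches on
    name-query protections against similar low-quality sites."""
    protected_names.add(person_name.lower())


def score_result(query: str, site: str, base_score: float) -> float:
    """Demote results from known predatory sites when the query is a protected name."""
    if query.lower() in protected_names and site in known_predatory_sites:
        return base_score * 0.2  # hypothetical demotion factor
    return base_score


# One removal request now protects the person's name across similar sites.
register_removal_request("Jane Doe")
print(score_result("Jane Doe", "pay-to-remove-b.test", 0.8))  # 0.16 (demoted)
print(score_result("Jane Doe", "news-site.test", 0.8))        # 0.8  (unchanged)
```

The point of the sketch is the trigger: one removal request changes how an entire class of similar sites is ranked for that person's name, rather than requiring a separate request for each site.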
This change was inspired by a similar approach we've taken for victims of non-consensual explicit content, commonly known as revenge porn. While no solution is perfect, our evaluations show that these changes meaningfully improve the quality of our results.
Over the years of building Search, our approach has remained consistent: we take examples of queries where we're not doing the best job of providing high-quality results, and look for ways to improve our algorithms. In this way, we don't “fix” individual queries; a poor result is often a symptom of a class of problems that affects many different queries. Our ability to address these issues continues to lead the industry, and the advanced technology, tools and quality signals we've deployed over the last two decades make Search work better every day.
Search is never a solved problem, and there are always new challenges we face as the web and the world change. We’re committed to listening to feedback and looking for ways to improve the quality of our results.