
A principled approach to evolving choice and control for web content



At Google I/O, we announced new AI-driven products and experiments that build on our years of research in the field. We also spoke about Google’s commitment to developing AI responsibly in ways that maximize the benefits to society while addressing the challenges, guided by our AI Principles and in line with our customer privacy commitment.

We believe everyone benefits from a vibrant content ecosystem. Key to that is web publishers having choice and control over their content, and opportunities to derive value from participating in the web ecosystem. However, we recognize that existing web publisher controls were developed before new AI and research use cases emerged.

As new technologies emerge, they present opportunities for the web community to evolve the standards and protocols that support the web’s future development. One such community-developed web standard, robots.txt, was created nearly 30 years ago and has proven to be a simple and transparent way for web publishers to control how search engines crawl their content. We believe it’s time for the web and AI communities to explore additional machine-readable means of choice and control for web publishers in emerging AI and research use cases.
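For context, robots.txt works by letting a publisher place a plain-text file at the root of a site that tells crawlers which paths they may or may not fetch. A minimal, illustrative example is sketched below; the crawler name "ExampleBot" and the paths are hypothetical, not directives tied to any specific product.

    # https://www.example.com/robots.txt
    # Rules for one hypothetical crawler
    User-agent: ExampleBot
    Disallow: /private/
    Allow: /public/

    # Default rules for all other crawlers
    User-agent: *
    Disallow: /drafts/

Any complementary protocol for AI and research use cases would aim for the same qualities: machine-readable, simple for publishers to express, and transparent about which uses are permitted.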

Today, we’re kicking off a public discussion, inviting members of the web and AI communities to weigh in on approaches to complementary protocols. We’d like a broad range of voices to join the discussion, from web publishers, civil society, academia and other fields around the world, and we will be convening those interested in participating over the coming months.

You can join the web and AI communities’ discussion by signing up on our website, and we’ll share more information about this process soon.
