Search On 2022: Search and explore information in new ways

Illustration symbolizing search boxes, videos and images searched for with Lens

At Search On today, we shared how we’re getting closer to making search experiences that reflect how we as people make sense of the world, thanks to advancements in machine learning. With a deeper understanding of information in its many forms — from language, to images, to things in the real world — we’re able to unlock entirely new ways to help people gather and explore information.

We're advancing visual search to be far more natural than ever before, and we're helping people navigate information more intuitively. Here's a closer look.

Helping you search outside the box

With Lens, you can search the world around you with your camera or an image. (People now use it to answer more than 8 billion questions every month!) Earlier this year, we made visual search even more natural with the introduction of multisearch, a major milestone in how you can search for information. With multisearch, you can take a picture or use a screenshot and then add text to it — similar to the way you might naturally point at something and ask a question about it. Multisearch is available globally in English, and will expand to more than 70 languages in the next few months.
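Google hasn't published how multisearch is implemented, but the general pattern it describes (embedding an image and a text refinement into a shared space, then ranking candidates by similarity) can be sketched with open-source tools. The sketch below uses the open-source sentence-transformers CLIP model and a made-up candidate list purely for illustration; it is not Google's system, and the file name and candidates are hypothetical.

```python
# Minimal sketch of an image + text ("multisearch"-style) query.
# Assumption: this is NOT Google's implementation; the open-source
# sentence-transformers CLIP model stands in to show the general idea of
# fusing an image embedding with a text refinement and ranking candidates.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # joint image/text embedding space

def multisearch(image_path: str, refinement: str, candidates: list[str]):
    """Embed the query image and the added text, average them, and rank
    candidate descriptions by cosine similarity to the combined query."""
    image_emb = model.encode(Image.open(image_path), convert_to_tensor=True)
    text_emb = model.encode(refinement, convert_to_tensor=True)
    query_emb = (image_emb + text_emb) / 2  # naive late fusion of the two signals
    cand_embs = model.encode(candidates, convert_to_tensor=True)
    scores = util.cos_sim(query_emb, cand_embs)[0]
    return sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1])

# Hypothetical example: a photo of a floral shirt, refined with the word "tie".
# print(multisearch("floral_shirt.jpg", "tie",
#                   ["floral-print tie", "plain blue tie", "floral dress"]))
```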

At Google I/O, we previewed how we’re supercharging this capability with “multisearch near me,” enabling you to snap a picture or take a screenshot of a dish or an item, then find it nearby instantly. This new way of searching will help you find and connect with local businesses, whether you’re looking to support your neighborhood shop, or just need something right now. “Multisearch near me” will start rolling out in English in the U.S. later this fall.

Phone showing multisearch for a patterned floral-print shirt and a tie.

One of the most powerful aspects of visual understanding is its ability to break down language barriers. With Lens, we’ve already gone beyond translating text to translating pictures. In fact, every month, people use Google to translate text in images over 1 billion times, across more than 100 languages.

With major advancements in machine learning, we’re now able to blend translated text into complex images, so it looks and feels much more natural. We’ve even optimized our machine learning models so we're able to do all this in just 100 milliseconds — shorter than the blink of an eye. This uses generative adversarial networks (GANs), the same technology that helps power Magic Eraser on Pixel. This improved experience is launching later this year.
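As a rough illustration of the pipeline described above, the sketch below masks the region of the original text, fills in the background underneath, and draws the translated string back on top. It uses OpenCV's classical Telea inpainting as a stand-in for the GAN-based inpainting described in the post, and the image path, bounding box and translated string are hypothetical.

```python
# Conceptual "erase and re-render" sketch for blending translated text into an image.
# Assumption: the production system uses GAN-based inpainting; classical OpenCV
# inpainting stands in here purely to show the steps:
# (1) mask the original text, (2) reconstruct the background, (3) draw the translation.
import cv2
import numpy as np

def overlay_translation(image_path: str, text_box: tuple, translated: str) -> np.ndarray:
    x, y, w, h = text_box  # bounding box of the source-language text
    img = cv2.imread(image_path)

    # 1. Mask the pixels occupied by the original text.
    mask = np.zeros(img.shape[:2], dtype=np.uint8)
    mask[y:y + h, x:x + w] = 255

    # 2. Reconstruct the background under the text (Telea's algorithm as a stand-in).
    clean = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)

    # 3. Render the translated string over the reconstructed background.
    cv2.putText(clean, translated, (x, y + h), cv2.FONT_HERSHEY_SIMPLEX,
                1.0, (0, 0, 0), 2, cv2.LINE_AA)
    return clean

# Hypothetical example: replace text in a 300x60 box at (40, 120) with its translation.
# result = overlay_translation("poster.jpg", (40, 120, 300, 60), "Welcome")
```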

Mobile screengrabs of translating text on an image about the NASA James Webb telescope

With the new Lens translation update, you can point your camera at a poster in another language, for example, and you’ll now see translated text realistically overlaid onto the pictures underneath.

And now, we’re putting some of our most helpful tools directly at your fingertips, beginning with the Google app for iOS. Starting today, you’ll see shortcuts right under the search bar to shop your screenshots, translate text with your camera, hum to search and more.

New shortcuts underneath the Google Search bar prompting a user to translate text, get homework help or search inside a photo.

New ways to explore information

As we redefine how people search for and interact with information, we’re working toward a future where you can ask questions with fewer words — or even none at all — and we’ll still understand exactly what you mean, or surface things you might find helpful. And you can explore information organized in a way that makes sense to you — whether that’s going deeper on a topic as it unfolds, or discovering new points of view that expand your perspective.

An important part of this is being able to quickly find the results you’re looking for. So in the coming months, we’re rolling out an even faster way to find what you need. When you begin to type in a question, we can provide relevant content straight away, before you’ve even finished typing.
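One simple way to picture serving results before a query is finished is a prefix lookup over an index of known queries with precomputed results. The toy sketch below uses a hypothetical query list and says nothing about how Google actually implements this; it only illustrates returning matches as each keystroke arrives.

```python
# Toy sketch of query-as-you-type lookup: binary search into a sorted list of
# indexed queries, then collect entries sharing the typed prefix.
# Assumption: the query list and result limit are hypothetical illustrations.
from bisect import bisect_left

INDEXED_QUERIES = sorted([
    "fort funston dog beach",
    "fort funston hang gliding",
    "fort funston hiking trails",
    "fort point san francisco",
])

def results_while_typing(prefix: str, limit: int = 3) -> list[str]:
    """Return indexed queries that start with the partially typed text."""
    prefix = prefix.lower()
    start = bisect_left(INDEXED_QUERIES, prefix)
    matches = []
    for query in INDEXED_QUERIES[start:]:
        if not query.startswith(prefix) or len(matches) == limit:
            break
        matches.append(query)
    return matches

print(results_while_typing("fort fun"))  # -> the three "fort funston ..." entries
```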

Mobile search for Fort Funston with detailed new auto-completing results.

But sometimes you don't know what angle you want to explore until you see it. So we’re introducing new search experiences to help you more naturally explore topics you care about when you come to Google.

As you start typing in the search box, we’ll provide keyword or topic options to help you craft your question. Say you're looking for a destination in Mexico. We’ll help you specify your question — for example, “best cities in Mexico for families” — so you can navigate to more relevant results for you.

Mobile search for “Best Mexico Cities” with “for families” auto-completing.

Maybe you hadn’t considered Oaxaca, but it looks like a great place to visit with the kids. And as you’re learning about a topic, like a new city, you might find yourself wondering what it will look and feel like. So we’re also making it easier to explore a subject by highlighting the most relevant and helpful information, including content from creators on the open web. For topics like cities, you may see visual stories and short videos from people who have visited, tips on how to explore the city, things to do, how to get there and other important aspects you might want to know about as you plan your travels.

Scrolling through new, more visual search results for Oaxaca, Mexico.

Additionally, with our deep understanding of how people search, we’ll soon show you topics to help you go deeper or find a new direction on a subject. And you can add or remove topics when you want to zoom in and out. The best part is this can help you discover things that you might not have thought about. For example, you might not have known that Oaxacan beaches were one of Mexico’s best-kept secrets.

We're also reimagining the way we display results to better reflect the ways people explore topics. You’ll see the most relevant content, from a variety of sources, no matter what format the information comes in — whether that's text, images or video. And as you continue scrolling, you’ll see a new way to get inspired by topics related to your search. For instance, you may never have thought to visit the historic sites in Oaxaca or find live music while you’re there.

New, visual, inspiration-rich results for historic sites in Oaxaca.

These new ways to explore information will be available in the coming months, to help wherever your curiosity takes you.

We hope you’re excited to search outside the box, and we look forward to continuing to build the future of search together.
