100 things we announced at I/O
And that’s a wrap on I/O 2022! We returned to our live keynote event, packed in more than a few product surprises, showed off some experimental projects and… actually, let’s just dive right in. Here are 100 things we announced at I/O 2022.
Gear news galore
1. Let’s start at the very beginning — with some previews. We showed off a first look at the upcoming Pixel 7 and Pixel 7 Pro, powered by the next version of Google Tensor.
2. We showed off an early look at Google Pixel Watch! It’s our first-ever all-Google-built watch: 80% recycled stainless steel, Wear OS, Fitbit integration, Assistant access… and it’s coming this fall.
3. Fitbit is coming to Google Pixel Watch. More experiences built for your wrist are coming later this year from apps like Deezer and SoundCloud.
4. Later this year, you’ll start to see more devices powered with Wear OS from Samsung, Fossil Group, Montblanc and others.
5. Google Assistant is coming soon to the Samsung Galaxy Watch 4 series.
6. The new Pixel Buds Pro use Active Noise Cancellation (ANC), a feature powered by a custom 6-core audio chip and Google algorithms to put the focus on your music — and nothing else.
7. Silent Seal™ helps Pixel Buds Pro adapt to the shape of your ear for better sound. Later this year, Pixel Buds Pro will also support spatial audio to put you in the middle of the action when watching a movie or TV show with a compatible device and supported content.
8. They also come in new colors: Charcoal, Fog, Coral and Lemongrass. Ahem, multiple colors — the Pixel Buds Pro have a two-tone design.
9. With Multipoint connectivity, Pixel Buds Pro can automatically switch between your previously paired Bluetooth devices — including compatible laptops, tablets, TVs, and Android and iOS phones.
10. Plus, the earbuds and their case are water-resistant.
11. …And you can preorder them on July 21.
12. Then there’s the brand-new Pixel 6a, which comes with the full Material You experience.
13. The new Pixel 6a has the same Google Tensor processor and hardware security architecture with Titan M2 as the Pixel 6 and Pixel 6 Pro.
14. It also has dual rear cameras — main and ultrawide lenses.
15. You’ve got three Pixel 6a color options: Chalk, Charcoal and Sage. The options keep going if you pair it with one of the new translucent cases.
16. It costs $449 and will be available for pre-order on July 21.
17. We also showed off an early look at the upcoming Pixel tablet, which we’re aiming to make available next year.
Android updates
18. In the last year, over 1 billion new Android phones have been activated.
19. You’ll no longer need to grant location to apps to enable Wi-Fi scanning in Android 13.
20. Android 13 will automatically delete your clipboard history after a short time to preemptively block apps from seeing old copied information.
21. Android 13’s new photo picker lets you select the exact photos or videos you want to grant access to, without needing to share your entire media library with an app (there’s a code sketch for developers after this list).
22. You’ll soon be able to copy a URL or picture from your phone and paste it on your tablet in Android 13.
23. Android 13 allows you to select different language preferences for different apps (also sketched after this list).
24. The latest Android OS will also require apps to get your permission before sending you notifications (the last sketch after this list shows the new runtime prompt).
25. And later this year, you’ll see a new Security & Privacy settings page with Android 13.
26. Google’s Messages app already has half a billion monthly active users with RCS, a new standard that enables you to share high-quality photos, see typing indicators, message over Wi-Fi and get a better group messaging experience.
27. Messages is getting a public beta of end-to-end encryption for group conversations.
28. Early earthquake warnings are coming to more high-risk regions around the world.
29. On select headphones, you’ll soon be able to automatically switch audio between the devices you’re listening on with Android.
30. Stream and use messaging apps from your Android phone to laptop with Chromebook’s Phone Hub, and you won’t even have to install any apps.
31. Google Wallet is here! It’s a new home for things like your student ID, transit tickets, vaccine card, credit cards and debit cards.
32. You can even use Google Wallet to hold your Walt Disney World park pass.
33. Google Wallet is coming to Wear OS, too.
34. Improved app experiences are coming for Android tablets: YouTube Music, Google Maps and Messages will take advantage of the extra screen space, and more apps coming soon include TikTok, Zoom, Facebook, Canva and many others.
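For developers wondering what item 21’s photo picker looks like in practice, here’s a minimal Kotlin sketch using the Jetpack Activity result contract that backs the system picker. The activity name and the image-only filter are illustrative choices, not part of the announcement.

```kotlin
import android.net.Uri
import android.os.Bundle
import androidx.activity.result.PickVisualMediaRequest
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity

class PickerActivity : AppCompatActivity() {

    // The app only receives the URIs the user explicitly selects —
    // it never gets blanket access to the media library.
    private val pickMedia =
        registerForActivityResult(ActivityResultContracts.PickVisualMedia()) { uri: Uri? ->
            if (uri != null) {
                // Use the selected image, e.g. load it into an ImageView.
            }
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // Launch the picker restricted to images only (normally wired to a button).
        pickMedia.launch(
            PickVisualMediaRequest(ActivityResultContracts.PickVisualMedia.ImageOnly)
        )
    }
}
```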
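And for item 23, a hedged sketch of how an app can opt a user into a per-app language using the AndroidX AppCompat API that supports this feature; the Hindi language tag is just an example.

```kotlin
import androidx.appcompat.app.AppCompatDelegate
import androidx.core.os.LocaleListCompat

// Ask the system (or the AppCompat backport on older versions, appcompat 1.6+)
// to run this app in Hindi, independent of the device-wide language setting.
fun setAppLanguageToHindi() {
    AppCompatDelegate.setApplicationLocales(
        LocaleListCompat.forLanguageTags("hi")
    )
}

// Reading the current per-app preference back:
fun currentAppLocales(): LocaleListCompat = AppCompatDelegate.getApplicationLocales()
```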
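Item 24’s notification permission becomes a standard runtime prompt on Android 13 (API 33). A minimal sketch, assuming an AndroidX activity; the class name is illustrative.

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import android.os.Build
import android.os.Bundle
import androidx.activity.result.contract.ActivityResultContracts
import androidx.appcompat.app.AppCompatActivity
import androidx.core.content.ContextCompat

class MainActivity : AppCompatActivity() {

    // 'granted' tells us whether the user allowed notifications.
    private val requestNotificationPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            // Enable or skip notification features based on the user's choice.
        }

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        // POST_NOTIFICATIONS only exists on Android 13 (API 33) and above.
        if (Build.VERSION.SDK_INT >= 33 &&
            ContextCompat.checkSelfPermission(this, Manifest.permission.POST_NOTIFICATIONS)
                != PackageManager.PERMISSION_GRANTED
        ) {
            requestNotificationPermission.launch(Manifest.permission.POST_NOTIFICATIONS)
        }
    }
}
```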
Developer deep dive
35. The Google Home and Google Home Mobile software developer kit (SDK) for Matter will be launching in June as developer previews.
36. The Google Home SDK introduces Intelligence Clusters, which make intelligence features like Home and Away available to developers.
37. Developers can even generate QR codes for Google Wallet to create their own passes for any use case they’d like.
38. Matter support is coming to the Nest Thermostat.
39. The Google Home Developer Center has lots of updates to check out.
40. There’s now built-in support for Matter on Android, so you can use Fast Pair to quickly connect Matter-enabled smart home devices to your network, Google Home and other accompanying apps in just a few taps.
41. The ARCore Geospatial API makes Google Maps’ Live View technology available to developers for free. Companies like Lime are using it to help people find parking spots for their scooters and save time (see the code sketch after this list).
42. DOCOMO and Curiosity are using the ARCore Geospatial API to build a new game that lets you fend off virtual dragons with robot companions in front of iconic Tokyo landmarks, like the Tokyo Tower.
43. AlloyDB is a new, fully-managed PostgreSQL-compatible database service designed to help developers manage enterprise database workloads — in our performance tests, it’s more than four times faster for transactional workloads and up to 100 times faster for analytical queries than standard PostgreSQL.
44. AlloyDB uses the same infrastructure building blocks that power large-scale products like YouTube, Search, Maps and Gmail.
45. Google Cloud’s machine learning cluster powered by Cloud TPU v4 Pods is super powerful — in fact, we believe it’s the world’s largest publicly available machine learning hub in terms of compute power…
46. …and it operates at 90% carbon-free energy.
47. We also announced a preview of Cloud Run jobs, which reduces the time developers spend running administrative tasks like database migration or batch data transformation.
48. We announced Flutter 3.0, which will enable developers to publish production-ready apps to six platforms at once, from one code base (Android, iOS, web, Windows, macOS and Linux).
49. To help developers build beautiful Wear apps, we announced the beta of Jetpack Compose for Wear OS (there’s a sketch of it after this list).
50. We’re making it faster and easier for developers to build modern, high-quality apps with new Live Edit features in Android Studio.
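To make item 41 a bit more concrete, here’s a minimal Kotlin sketch of placing a world-anchored object with the ARCore Geospatial API, assuming a Session already configured with geospatial mode enabled. The coordinates are placeholder values, not anything Lime or DOCOMO shipped.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Place an anchor at a real-world latitude/longitude once Earth tracking is ready.
fun placeGeospatialAnchor(session: Session): Anchor? {
    val earth = session.earth ?: return null
    if (earth.trackingState != TrackingState.TRACKING) return null

    val latitude = 35.6586     // placeholder: near Tokyo Tower
    val longitude = 139.7454
    val altitude = earth.cameraGeospatialPose.altitude

    // Identity quaternion (0, 0, 0, 1): no extra rotation applied.
    return earth.createAnchor(latitude, longitude, altitude, 0f, 0f, 0f, 1f)
}
```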
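And for item 49, a small sketch of what a Compose for Wear OS screen can look like with the Wear-specific Material components. The screen content is invented for illustration, and exact package locations may vary between library versions.

```kotlin
import androidx.compose.runtime.Composable
import androidx.wear.compose.material.Chip
import androidx.wear.compose.material.MaterialTheme
import androidx.wear.compose.material.ScalingLazyColumn
import androidx.wear.compose.material.Text

// A tiny Wear OS screen: a round-display-friendly scrolling list with a header
// and two tappable chips.
@Composable
fun WorkoutScreen(onStart: () -> Unit, onHistory: () -> Unit) {
    MaterialTheme {
        ScalingLazyColumn {
            item { Text("Workouts") }
            item { Chip(onClick = onStart, label = { Text("Start run") }) }
            item { Chip(onClick = onHistory, label = { Text("History") }) }
        }
    }
}
```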
Help for the home
51. Many Nest Devices will become Matter controllers, which means they can serve as central hubs to control Matter-enabled devices both locally and remotely from the Google Home app.
52. Works with Hey Google is now Works with Google Home.
53. The new home.google is your hub for finding out everything you can do with your Google Home system.
54. Nest Hub Max is getting Look and Talk, where you can simply look at your device to ask a question without saying “Hey Google.”
55. Look and Talk works when Voice Match and Face Match recognize that it’s you.
56. And video from Look and Talk interactions is processed entirely on-device, so it isn’t shared with Google or anyone else.
57. Look and Talk is opt-in. Oh, and FYI, you can still say “Hey Google” whenever you want!
58. Want to learn more about it? Just say “Hey Google, what is Look and Talk?” or “Hey Google, how do you enable Look and Talk?”
59. We’re also expanding quick phrases to Nest Hub Max, so you can skip saying “Hey Google” for some of your most common daily tasks – things like “set a timer for 10 minutes” or “turn off the living room lights.”
60. You can choose the quick phrases you want to turn on.
61. Your quick phrases will work when Voice Match recognizes it’s you.
62. And looking ahead, Assistant will be able to better understand the imperfections of human speech without getting tripped up — including the pauses, “umms” and interruptions — making your interactions feel much closer to a natural conversation.
Taking care of business
63. Google Meet video calls will now look better thanks to portrait restore and portrait light, which use AI and machine learning to improve quality and lighting on video calls.
64. Later this year we’re scaling the phishing and malware protections that guard Gmail to Google Docs, Sheets and Slides.
65. Live sharing is coming to Google Meet, meaning users will be able to share controls and interact directly within the meeting, whether it’s watching an icebreaker video from YouTube or sharing a playlist.
66. Automated built-in summaries are coming to Spaces so you can get a helpful digest of conversations to catch up quickly.
67. De-reverberation for Google Meet will filter out echoes in spaces with hard surfaces, giving you conference-room audio quality whether you’re in a basement, a kitchen, or a big empty room.
68. Later this year, we're bringing automated transcriptions of Google Meet meetings to Google Workspace, so people can catch up quickly on meetings they couldn't attend.
Apps for on-the-go
69. Google Wallet users will be able to check the balance of transit passes and top up within Google Maps.
70. Google Translate added 24 new languages.
71. As part of this update, Indigenous languages of the Americas (Quechua, Guarani and Aymara) and an English dialect (Sierra Leonean Krio) have also been added to Translate for the first time.
72. Google Translate now supports a total of 133 languages used around the globe.
73. These are the first languages we’ve added using Zero-resource Machine Translation, where a machine learning model only sees monolingual text — meaning, it learns to translate into another language without ever seeing an example.
74. Google Maps’ new immersive view is a whole new way to explore so you can see what an area truly looks and feels like.
75. Immersive view will work on nearly any phone or tablet; you don’t need the fanciest or newest device.
76. Immersive view will first be available in L.A., London, New York, San Francisco and Tokyo — with more places coming soon.
77. Last year we launched eco-friendly routing in the U.S. and Canada. Since then, people have used it to travel 86 billion miles, which saved more than half a million metric tons of carbon emissions — that’s like taking 100,000 cars off the road.
78. And we’re expanding eco-friendly routing to more places, like Europe.
All in on AI
[Image: The 10 shades of the Monk Skin Tone Scale.]
79. A team at Google Research partnered with Harvard’s Dr. Ellis Monk to openly release the Monk Skin Tone Scale, a new tool for measuring skin tone that can help build more inclusive products.
80. Google Search will use the Monk Skin Tone Scale to make it easier to find more relevant results — for instance, if you search for “bridal makeup,” you’ll see an option to filter by skin tone so you can refine to results that meet your needs.
81. Oh, and the Monk Skin Tone Scale was used to evaluate a new set of Real Tone filters for Photos that are designed to work well across skin tones. These filters were created and tested in partnership with artists like Kennedi Carter and Joshua Kissi.
82. We’re releasing LaMDA 2 as part of the AI Test Kitchen, a new space to learn, improve and innovate responsibly on this technology together.
83. PaLM is a new language model that can solve complex math word problems and even explain its thought process, step by step.
84. Nest Hub Max’s new Look and Talk feature uses six machine learning models to process more than 100 signals in real time to detect whether you’re intentionally making eye contact with your device so you can talk to Google Assistant, or just giving it a passing glance.
85. We recently launched multisearch in the Google app, which lets you search by taking a photo and asking a question at the same time. At I/O, we announced that later this year, you'll be able to take a picture or screenshot and add "near me" to get local results from restaurants, retailers and more.
86. We introduced you to an advancement called “scene exploration,” where in the future, you’ll be able to use multisearch to pan your camera and instantly glean insights about multiple objects in a wider scene.
Privacy, security and information
87. We’ve expanded our support for Project Shield to protect the websites of 200+ Ukrainian government agencies, news outlets and more.
88. Account Safety Status will add a simple yellow alert icon to flag actions you should take to secure your Google Account.
89. Phishing protections in Google Workspace are expanding to Docs, Slides and Sheets.
90. My Ad Center is now giving you even more control over the ads you see on YouTube, Search, and your Discover feed.
91. Virtual cards are coming to Chrome and Android this summer, adding an additional layer of security and eliminating the need to enter certain card details at checkout.
92. In the coming months, you’ll be able to request removal of Google Search results that have your contact info with an easy-to-use tool.
93. We also introduced Protected Computing, a toolkit that helps minimize your data footprint, de-identify your data and restrict access to your sensitive data.
94. On-device encryption is now available for Google Password Manager.
95. We’re continuing to auto-enroll people in 2-Step Verification to reduce phishing risks.
What else?!
96. A new Google Store is opening in Williamsburg.
97. This is our first “neighborhood store” — it’s in a more intimate setting that highlights the community. You can find it at 134 N 6th St., opening on June 16.
98. The store will feature an installation by Brooklyn-based artist Olalekan Jeyifous.
99. Visitors there can picture everyday life with Google products through interactive displays that show how our hardware and services work together, and even get hands-on help with devices from Google experts.
100. We showed a prototype of what happens when we bring technologies like transcription and translation to your line of sight.