
Google’s AI hardware, in new hands

Article's hero media
10:25

A robot that turns portraits into manga. A vest that helps people navigate. Gloves that transmit the feeling of bird songs. These aren’t props from the set of a sci-fi movie; they’re a few of the finalist projects presented in New Orleans during the ACM Symposium on User Interface Software and Technology (UIST), the premier venue for innovations in human-computer interfaces.

UIST’s Student Innovation Contest is a forum where university students demonstrate how novel input, interaction, actuation and output technologies can create interactive experiences. UIST partnered with two teams from Google Research (Google Coral and Bio Interfaces) to sponsor the event, whose theme was Interactive Systems for Social Impact: See, Feel, Hear the Invisible.

Google Coral’s mission is to build beneficial and privacy-preserving AI by providing a platform to strengthen society, improve the environment and enrich lives. One way we do this is by giving students access to the same machine learning hardware and toolchains used by AI researchers and industry practitioners. The Bio Interfaces team focuses on developing AI hardware to help people better communicate and interact with their world.

Our two teams worked together to provide contest participants with 15 pre-release hardware kits that embed machine learning into custom electronics. Students then used TensorFlow Lite and an assortment of sensors to build machine learning-powered experiences drawing on inputs such as camera feeds and readings of temperature, atmospheric pressure, ambient light and humidity.
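To give a sense of what working with such a kit looks like, here is a minimal sketch of running a TensorFlow Lite model through the Coral Edge TPU delegate in Python. The model file, the dummy input and the delegate library name (libedgetpu.so.1, the standard name on Linux) are assumptions for illustration, not details from the contest kits.

```python
# Minimal sketch: run a (hypothetical) Edge TPU-compiled TensorFlow Lite model.
# Assumes the tflite_runtime package and the Edge TPU runtime are installed.
import numpy as np
from tflite_runtime.interpreter import Interpreter, load_delegate

interpreter = Interpreter(
    model_path="model_edgetpu.tflite",  # placeholder model file
    experimental_delegates=[load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()[0]
output_details = interpreter.get_output_details()[0]

# In a real project this would be a camera frame or a vector of sensor readings,
# resized and cast to the shape and dtype the model expects.
dummy_input = np.zeros(input_details["shape"], dtype=input_details["dtype"])
interpreter.set_tensor(input_details["index"], dummy_input)
interpreter.invoke()

scores = interpreter.get_tensor(output_details["index"])[0]
print("Top class index:", int(np.argmax(scores)))
```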

The nine finalist teams had three months to use these kits to create interactive experiences with cardboard prototypes. The contest had two types of awards: People’s Choice awards, based on votes from conference attendees, and Jury’s Choice awards, determined by Jessica Cauchard (Ben-Gurion University of the Negev), Alexandra Ion (ETH Zurich) and me (Google Research).

  • The Jury’s Choice Award went to Team Visionaries (Dennis Dietz, Stefan Langer, Kyrill Schmid, Daniel Neumann and Felix Dietz) from Ludwig Maximilian University of Munich in Germany, who developed a haptic vest with a camera. “The vest is intended for visually impaired people,” the team explained. “We have vibration sensors in the back, and it detects objects in front of the person wearing it so that they can then navigate around these obstacles.”

  • Team MountainSoil (Keitaro Tsuchiya, Hiroo Yamamura and Daisuke Yamamoto) from Keio University Graduate School of Media Design in Japan received a Jury’s Choice Honorable Mention award for a system that monitored a person’s facial expression and used a robotic arm to draw it as a manga character on a transparent pane in front of their face. They also received a People’s Choice Honorable Mention award.

  • Team Greens (Shou-En Tsai, Shang-Hsun Lu and Hsin-Yu Yao), from National Chiao Tung University and National Tsing Hua University in Taiwan, received a Jury’s Choice Honorable Mention award for their robotic shape display, which used projection mapping to visualize environmental impact. “Our goal was to raise people’s awareness to show them how humans affect the environment,” the students said. They also received the People’s Choice Award.

  • “Our project is about enabling people who are suffering from hearing loss to experience nature,” said Team Nature in a Box. The team, Wagram Airiian and Marla Narazani from TU Munich in Germany, envisioned providing haptic experiences using a machine learning pipeline that transformed the sounds of bird songs into tactile sensations in gloves worn by the user.

  • Team UCLA-HCI (Ruolin Wang and Hsuan-Wei Fan), from the University of California, Los Angeles, presented a related accessibility project in which a robotic camera monitored the surroundings to provide guidance for people with tunnel vision. “Our basic concept is that we use a camera that rotates from left to right to detect the objects,” the team explained.


Our collaboration with the UIST Student Innovation Contest was a wonderful opportunity to support the next generation of researchers as they develop ideas for social impact through machine learning, hardware and interaction techniques. It was particularly inspiring to see that almost half of the projects were related to accessibility or sustainability.

And now, it could be your turn: the hardware the students used in their projects is publicly available, and you can get started at coral.ai. Maybe we’ll see your ideas come to life at the next Student Innovation Contest.

Special thanks to the Coral team, especially Ajay K. Nair, Billy Ruthledge, Noli Grutas, Bill Luan, Kirsten Climer, Vikram Tank and Julie Sohn, and to Google Perception for their support. Special thanks also to David Lindlbauer and Pascal E. Fortin, the Student Innovation Contest chairs, for the collaboration, and to François Guimbretière, the conference chair. Thanks to all the student teams for their hard work and wonderful projects. Thanks to Molly Moker, Daniel Yadin and Beatriz Browne for video documentation and production.

