How Google AI is helping scientists protect humpback whales in Australia
Every year, humpback whales migrate up the east coast of Australia to breed, and journey back to Antarctica to feed. During their migration, the whales make calls and sing songs – a grand chorus in the symphony of their ecosystems.
This underwater soundscape is a vital window into the health of this species and their habitats. By analysing this audio data, scientists can understand migration patterns, mating calls, competitive behaviours and more.
As part of the Digital Future Initiative, Google Australia is teaming up with Griffith University to implement more precise, comprehensive and efficient monitoring of whale migrations and their ecosystems in Australia – enabled by Google AI and automatic audio detection.
Researchers Dr Olaf Meynecke from Griffith University’s Whales and Climate Program and Dr Lauren Harrell from Google Research are leading this collaboration.
Traditional whale research methods have faced limitations in both data collection and analysis. Researchers logged sightings and manually analysed audio recordings, a process that is time-consuming and does not give a continuous view of whale activity. Moreover, visual sightings can only be logged during daylight, and tracking the evolving vocal dialects of whales across different regions and seasons is a complex task.
With this new collaboration, researchers have deployed hydrophones — underwater microphones — and Google AI-powered audio detection systems to monitor the sounds and songs of humpback whales and their habitats.
A seal swimming around a hydrophone off the South Coast, NSW
Hydrophones allow us to tune into marine soundscapes and continuously collect underwater audio data all day and all night, through the entire humpback migration season. Google's AI technology processes this data, automatically detecting whale sounds, timestamping them and classifying the species. This frees researchers from laborious manual work, so they can look at the big picture, uncover insights and explore new research frontiers.
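To give a flavour of what automated detection in a hydrophone stream involves, here is a minimal, self-contained sketch. It is not the project's actual model — Google's system uses a trained classifier, whereas this toy version simply flags time windows whose energy in a humpback-relevant frequency band stands out against the recording's baseline. All function names, band limits and thresholds below are illustrative assumptions.

```python
import numpy as np

def detect_calls(audio, sr, band=(100, 2000), frame_len=1024, hop=512, threshold=5.0):
    """Toy detector: flag frames whose energy in `band` (Hz) exceeds
    `threshold` times the median band energy across the recording.
    A real system would replace this heuristic with a trained classifier.
    Returns a list of (timestamp_seconds, energy_ratio) detections."""
    n_frames = 1 + (len(audio) - frame_len) // hop
    freqs = np.fft.rfftfreq(frame_len, d=1.0 / sr)
    band_mask = (freqs >= band[0]) & (freqs <= band[1])
    window = np.hanning(frame_len)

    energies = np.empty(n_frames)
    for i in range(n_frames):
        frame = audio[i * hop : i * hop + frame_len] * window
        spectrum = np.abs(np.fft.rfft(frame)) ** 2
        energies[i] = spectrum[band_mask].sum()

    baseline = np.median(energies)  # typical background (ocean noise) level
    return [
        (i * hop / sr, e / baseline)  # frame start time, loudness vs baseline
        for i, e in enumerate(energies)
        if e > threshold * baseline
    ]

# Synthetic example: 10 s of faint noise with a 500 Hz "call" from 4-5 s.
sr = 8000
t = np.arange(10 * sr) / sr
rng = np.random.default_rng(0)
audio = 0.01 * rng.standard_normal(len(t))
call = (t >= 4) & (t < 5)
audio[call] += 0.5 * np.sin(2 * np.pi * 500 * t[call])

hits = detect_calls(audio, sr)
print(f"{len(hits)} frames flagged; first at {hits[0][0]:.2f} s")
```

In a real deployment the same shape applies at scale: continuous audio is sliced into frames, each frame is scored, and detections come out already timestamped — which is what lets researchers skip the manual listening and work directly with when and where calls occurred.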
Dr Olaf Meynecke deploying a hydrophone in Terrigal, NSW
Curtin University’s Centre for Marine Science and Technology is supporting the collection and labelling of acoustic data, and a range of local citizen science groups will assist with monitoring each of the hydrophone sites. The AI model will eventually be open-sourced on Kaggle and GitHub, benefiting other whale and marine researchers worldwide.
While our current focus is on monitoring humpback whale sounds, the potential of this AI model extends far beyond. We'll look to build on the model to detect the sounds of diverse marine species, from fish to dolphins and seals. These advancements will open up uncharted territories of research that could help protect these magnificent creatures and their habitats for generations to come.