
A new AI tool to help monitor coral reef health

A researcher deploys a hydrophone on a coral reef in Sulawesi

Coral reefs cover only 0.1% of the ocean's surface, yet they host 25% of all known marine species. It is critical that we greatly scale up our efforts to monitor, manage, protect and restore reefs around the world that are in crisis as a result of threats such as overfishing, disease, coastal construction and heatwaves.

Emerging research demonstrates how ecoacoustics, the natural sounds that characterize an ecosystem, can help us better understand reef health. Over the past year we asked people from around the world to participate in "Calling in Our Corals," a project created in collaboration with Google Arts & Culture that invites the public to listen to reef audio recordings and build a bioacoustic data library on the health of reefs. Today we're taking a further step and introducing SurfPerch, a new AI-powered tool created with Google Research and Google DeepMind that can automatically process thousands of hours of audio to build new understanding of coral reef ecosystems.

Why listening to coral reefs matters

By listening to the diversity and patterns of behavior of animals on reefs, we can hear reef health from the inside, track activity at night, and even survey reefs in deep and murky waters. Yet analyzing the countless hours of underwater recordings has been a manual process that scientists cannot keep up with. This is why we're excited about the work we have done with Calling in Our Corals, bringing together marine biologists, creatives, programmers and citizen scientists to monitor reef health, assess biodiversity, identify new behaviors and measure restoration success.

From a listening collective to a trained AI model

Last year, visitors to Calling in Our Corals listened to over 400 hours of reef audio from coral reef sites around the world. Members of this open listening collective clicked whenever they heard a fish sound, bringing thousands of eyes and ears to data that would otherwise take bioacousticians months to analyze. The results provided a wealth of fascinating new fish sounds that we've been using to fine-tune SurfPerch. SurfPerch was trained and rigorously tested so that it can be adapted to detect any new reef sound from just a handful of examples. This lets us analyze new datasets far more efficiently than previously possible, removes the need for training on expensive GPU processors, and opens new opportunities to understand reef communities and their conservation.
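For readers curious about the mechanics, here is a minimal Python sketch of that "handful of examples, no GPU" idea: embed each short clip with a pretrained bioacoustic model, then fit a lightweight linear classifier on the fixed embeddings. The embed_clip function and the random example data below are placeholders for illustration, not the actual SurfPerch code or API.

```python
# Hypothetical sketch: few-shot detection of a new reef sound by fitting a
# small linear classifier on precomputed audio embeddings (no GPU training).
import numpy as np
from sklearn.linear_model import LogisticRegression

def embed_clip(waveform):
    """Stand-in embedding function: returns a 128-dim vector per clip.

    In a real pipeline this would be a forward pass through a pretrained
    bioacoustic model such as SurfPerch, computed once per clip on a CPU.
    """
    rng = np.random.default_rng(abs(hash(waveform.tobytes())) % (2**32))
    return rng.normal(size=128)

# A handful of labeled clips: 1 = the target fish call, 0 = background reef noise.
rng = np.random.default_rng(0)
clips = [(rng.normal(size=5 * 16000), 1) for _ in range(8)] + \
        [(rng.normal(size=5 * 16000), 0) for _ in range(8)]

X = np.stack([embed_clip(w) for w, _ in clips])   # (16, 128) embedding matrix
y = np.array([label for _, label in clips])

# Fitting a linear probe on 16 embeddings takes milliseconds on a laptop CPU.
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new window from a long, unlabeled reef recording.
new_clip = rng.normal(size=5 * 16000)
score = clf.predict_proba(embed_clip(new_clip).reshape(1, -1))[0, 1]
print(f"Probability of target call: {score:.2f}")
```

Because the heavy lifting lives in the precomputed embeddings, adding a new sound class only means refitting the small classifier, which runs comfortably on an ordinary laptop.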

  • Ben conducting an analysis of a coral reef audio waveform from the ReefSet data (credit: Ben Williams)

  • Coral attached to a Reef Star on a newly restored reef (credit: Tim Lamont, University of Exeter)

  • Reef Stars are installed in degraded areas to stabilize loose rubble and kickstart rapid coral growth (credit: The Ocean Agency)

An exciting discovery we made along the way was that we could significantly boost our model's performance by drawing on the large diversity of publicly available bird recordings. Although birdsong and fish sounds are very different, they share enough common patterns for the model to learn from one and improve its performance on the other.

From a lab experiment to real-world insights

Our first trial combining Calling in Our Corals with SurfPerch has already revealed differences between protected and unprotected reefs in the Philippines, charted restoration outcomes in Indonesia, and surfaced relationships with the fish community on the Great Barrier Reef.

The best part: you can still contribute by listening to brand-new audio on Calling in Our Corals, which helps further train the model.

To learn more about our work supporting reef restoration, visit https://www.buildingcoral.com.

About the authors: Steve Simpson, Professor of Marine Biology at the University of Bristol, UK, is dedicated to mapping the impact of anthropogenic noise on marine and coral ecosystems and how it relates to climate change. Ben Williams, of University College London, is a marine biologist whose research focuses on ways AI can support coral reef conservation and restoration.
