Green Tech

To monitor coral reefs, Google DeepMind listens to the fishes

The AI-powered SurfPerch tool measures fish sounds to determine the health of reefs.

Google DeepMind is tapping deep learning to learn more about the watery deep.

The search giant’s AI research arm unveiled a pair of projects this month that apply AI to monitoring various aspects of the Earth’s oceans.

One, an AI-powered tool called SurfPerch, aims to better understand coral-reef health through audio signals. It’s trained on thousands of hours of real reef recordings, annotated by crowdsourced volunteers who listened to them online and identified fish sounds.

DeepMind also recently announced the release of two AI-derived datasets for researchers that could offer them a fuller picture of human activity at sea, including fishing, shipping, or mining vessels that might not be publicly broadcasting their positions.

The two efforts are the latest examples of how AI is increasingly being used to help size up and predict problems related to the climate crisis. For instance, IBM and NASA rolled out a foundation model trained on wildfire and flood data last year, and Google published a paper on predicting flooding with AI in March.

Reef rhythms: For the past year, Google has been asking the public to listen to recordings from underwater microphones placed near coral reefs around the world on a website called Calling in Our Corals.

The site includes handy instructions from marine biologists on how to listen for the sounds of fish life, which tend to indicate a healthier reef. A damaged reef might sound like radio static, but a vibrant one is much noisier. You might hear distinct patterns like “the purring of damsel fish and the characteristic scraping sound produced by parrot fish feeding,” according to Ben Williams, a marine biologist who worked with Google on the project.
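For readers curious how “noisier” gets quantified: eco-acousticians often score recordings with soundscape metrics such as the acoustic complexity index, which rewards sound that varies a lot over time (fish chatter) over sound that doesn’t (static-like hiss). Here’s a minimal Python sketch of that index, assuming a mono WAV file; it’s a generic illustration of the metric, not the method SurfPerch itself uses.

```python
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def acoustic_complexity_index(path):
    """Minimal acoustic complexity index: higher values indicate a more
    variable, biologically 'busier' soundscape. A standard eco-acoustics
    metric, shown here for illustration only."""
    rate, audio = wavfile.read(path)  # assumes a mono WAV recording
    _, _, sxx = spectrogram(audio.astype(float), fs=rate, nperseg=1024)
    # For each frequency bin, sum the intensity changes between adjacent
    # time frames, normalized by that bin's total intensity.
    diffs = np.abs(np.diff(sxx, axis=1)).sum(axis=1)
    totals = sxx.sum(axis=1) + 1e-12  # guard against division by zero
    return (diffs / totals).sum()

# A lively reef recording should score higher than near-static noise.
```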

Google also augmented the fish sounds with bird recordings, which emphasized the “fundamental similarities between these different fields,” Williams said in a blog post.

Now, after running 400 hours of reef audio through that process, Google has collected enough data to train an AI model to detect fish sounds automatically. Google said the tool will make it easier to track activity among reefs in dark and murky water or at night.

Williams said the tool will save scientists time because it can now be easily fine-tuned to new datasets, a process that took “days or weeks” of labeling for previous AI projects of this kind.
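Williams’s description matches a common pattern in applied machine learning: freeze a pretrained audio encoder and fit a lightweight classifier on the embeddings it produces for each clip. Here’s a rough sketch of that pattern, with simulated embeddings standing in for the encoder’s output; the dimensions and setup are illustrative assumptions, not SurfPerch’s actual API.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Stand-in: a frozen pretrained encoder would normally produce one
# embedding vector per audio clip; we simulate 1,280-dim vectors here.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(200, 1280))  # one vector per clip
labels = rng.integers(0, 2, size=200)      # 1 = fish sound present

# Fitting a linear classifier on frozen embeddings takes seconds,
# which is what makes adapting to a new reef dataset so fast.
clf = LogisticRegression(max_iter=1000).fit(embeddings, labels)
print("training accuracy:", clf.score(embeddings, labels))
```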

Dark vessels: Google’s other big maritime announcement focuses on using AI to map human activity on the ocean. The world relies on its oceans for food, shipping, and—increasingly—offshore energy, yet Google researchers claim in a research paper that most industrial fishing vessels (72%–76%) and a sizable chunk of transport and energy vessel activity (21%–30%) are not publicly tracked.

The newly released datasets, built by using AI to track that activity, should give environmental scientists a better read on the carbon emissions and harm to ocean life it causes, and better inform environmental protection policies, according to Google.
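To give a feel for how scientists might put such a release to work: the core analysis is comparing satellite detections against public position broadcasts. Here’s a hypothetical Python sketch; the file name and column labels are invented for illustration and won’t match the real datasets.

```python
import pandas as pd

# Hypothetical file and column names, invented for illustration; the
# real release will differ.
detections = pd.read_csv("vessel_detections.csv")

# "Dark" vessels: spotted in satellite imagery but not broadcasting an
# AIS (Automatic Identification System) position publicly.
fishing = detections[detections["vessel_class"] == "fishing"]
dark_share = 1 - fishing["matched_to_ais"].mean()
print(f"Detected fishing vessels not publicly tracked: {dark_share:.0%}")
```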
