Comment & Analysis
Dec 13, 2025

How AI is Transforming Conservationism with Bioacoustics

Google DeepMind’s latest Perch model helps scientists analyse audio faster to preserve threatened wildlife

Liam Power, Contributing writer

The chorus of the rainforest’s orchestra is constant and intense. Sharp bursts of insect chirps mask a worrying decline in one group – rare bird species are losing numbers at an unprecedented rate. In 2025, BirdLife International reported that 61 per cent of bird populations are declining – so how are scientists using AI to stem the tide of extinction?

In the midst of the AI revolution, Perch, a model developed by the Google DeepMind team, helps conservationists analyse audio and interpret bioacoustic data faster.

Perch relies on basic equipment – scientists place simple audio recorders in an environment for long periods of time to capture “sound diaries”. The audio is then sliced into snapshots of noise known as spectrograms: visual representations of how the frequencies in an audio signal change over time. Sound classification then lets computers process these spectrograms at scale, recognising recurring patterns – similar noises – across hours of recordings.
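The spectrogram step described above can be sketched in a few lines of Python. This is an illustrative example, not Perch's actual pipeline: the sample rate, window length, and the synthetic chirp standing in for a field recording are all assumptions.

```python
# Illustrative sketch: turning a recording into a spectrogram with SciPy.
# The sample rate, window length, and synthetic signal below are
# assumptions for demonstration, not details taken from Perch itself.
import numpy as np
from scipy import signal

sample_rate = 32_000  # assumed recorder sample rate, in Hz
duration_s = 5.0

# Stand-in for a real field recording: a synthetic chirp sweeping from
# 2 kHz to 8 kHz, roughly the range of many bird calls.
t = np.linspace(0, duration_s, int(sample_rate * duration_s), endpoint=False)
audio = signal.chirp(t, f0=2_000, t1=duration_s, f1=8_000)

# Short-time Fourier transform: how the signal's frequency content
# evolves over time. Each column of `power` is one time slice.
freqs, times, power = signal.spectrogram(audio, fs=sample_rate, nperseg=1024)

print(power.shape)  # (frequency bins, time frames)
```

Plotting `power` on a time–frequency grid gives the familiar spectrogram image that a classifier such as Perch takes as input.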


What separates Perch from earlier audio classification tools is the neural network at its heart. Simply put, Perch takes spectrograms as input, compares those audio patterns to ones it has learned, and recognises similar sounds. Based on how similar one sound is to another, Perch makes a calculated guess that the sounds belong to the same species, accurately identifying the bird in question. It works much like the large language models (LLMs) powering popular consumer tools such as OpenAI’s ChatGPT: an LLM reads in a prompt, compares the input against the billions of parameters it was trained on, and returns the most likely reply.
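The “calculated guess” described above boils down to comparing learned representations. Here is a minimal sketch of that idea using cosine similarity; the species names and embedding vectors are made up for illustration, since a real system would produce embeddings from the model's audio encoder.

```python
# Minimal sketch of similarity-based classification, the idea behind
# matching an unknown call to known species. The vectors here are
# invented; a real system would get them from a learned audio encoder.
import numpy as np

def cosine_similarity(a, b):
    """How aligned two embedding vectors are, from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical reference embeddings for two known species.
references = {
    "wood thrush": np.array([0.9, 0.1, 0.2]),
    "barred owl": np.array([0.1, 0.8, 0.5]),
}

# Embedding of an unlabelled recording (assumed, for illustration).
query = np.array([0.85, 0.15, 0.25])

# Label the query with the most similar reference: a calculated guess
# that the two sounds come from the same species.
best = max(references, key=lambda name: cosine_similarity(query, references[name]))
print(best)  # → wood thrush
```

The same nearest-match logic scales to thousands of reference calls, which is what makes this approach practical for the huge “sound diaries” field recorders produce.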

Classical methods of bioacoustics were held back by inefficiency: they relied on manual matching techniques, struggled with the tangled noise of saturated environments, and scaled poorly to large datasets. Neural networks overcome these limits by identifying similar spectrogram patterns and processing them in parallel, with modern hardware capable of billions of operations every single second.

No longer held back by classical methods, Perch isn’t limited to just the avian echoes of the rainforest – it can be applied to any surroundings, from rolling tundra to red sand dunes. It can even handle environments once thought impossible to monitor, particularly underwater habitats such as coral reefs. On top of its remarkable use cases, Perch is open-source: anyone can download and use it for their own purposes, in the spirit of true scientific endeavour.

Neural networks began as groundbreaking work in mathematics and computation, and the world has yet to see where their greatest effects will be felt. Perch serves as an impressive case study of how AI is advancing the science of bioacoustics to save endangered species. Despite its roots in heavily theoretical fields, AI’s impact is now being felt in the practical realm of conservation, helping living creatures fight on through the fastest technological revolution yet.

A decade ago, if you wanted to work on a niche interest like artificial intelligence, you might have needed to bundle it in with endangered species research. Nowadays, if you don’t mention AI, you can’t win funding. The buzz around artificial intelligence can help researchers and students follow their passions by lending a small portion of its vast investment to their projects: AI research funding has risen tenfold since the early 2000s. As the world storms on to profit as much as possible, tools like Perch demonstrate the good that AI can do.

Get The University Times into your inbox twice a week.