
This article was published on March 2, 2020

Alphabet’s AI-powered camera system will help fish live their best lives

The Tidal project aims to make fish farming more environmentally friendly


Image by: Jin Kemoole

Google’s parent company, Alphabet, is bringing AI under the sea to help fish farmers develop more sustainable practices.

The project, dubbed Tidal, uses underwater cameras connected to computer vision software to analyze fish behaviors that the human eye can’t see.

In a blog post, Tidal general manager Neil Davé said the system could monitor thousands of individual fish to understand their eating patterns and movements. It will also collect information on their environment, such as water temperature and oxygen levels.

Farmers can use the insights to keep their fish healthy, optimize their feeding, and reduce waste.
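
Alphabet hasn’t published how Tidal turns those observations into recommendations, but the general idea of rolling per-fish feeding events and environmental readings up into pen-level indicators can be sketched in a few lines of Python. Everything below is a hypothetical illustration: the FishObservation fields, the 5 mg/L oxygen threshold, and the summary metrics are stand-ins, not Tidal’s actual data model.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class FishObservation:
    fish_id: str            # identifier assigned by the tracking software
    pellets_eaten: int      # feeding events detected for this fish
    water_temp_c: float     # temperature at the time of observation
    oxygen_mg_l: float      # dissolved oxygen at the time of observation

def summarize_pen(observations: list[FishObservation],
                  low_oxygen_mg_l: float = 5.0) -> dict:
    """Roll per-fish observations up into simple pen-level indicators."""
    per_fish: dict[str, list[int]] = {}
    for obs in observations:
        per_fish.setdefault(obs.fish_id, []).append(obs.pellets_eaten)

    avg_intake = mean(sum(counts) for counts in per_fish.values())
    low_oxygen = [obs for obs in observations if obs.oxygen_mg_l < low_oxygen_mg_l]

    return {
        "fish_tracked": len(per_fish),
        "avg_pellets_per_fish": round(avg_intake, 1),
        "low_oxygen_readings": len(low_oxygen),
        "avg_temp_c": round(mean(obs.water_temp_c for obs in observations), 1),
    }

if __name__ == "__main__":
    sample = [
        FishObservation("fish-001", 4, 11.8, 7.2),
        FishObservation("fish-001", 2, 12.1, 6.9),
        FishObservation("fish-002", 0, 12.0, 4.6),  # ate nothing, low oxygen
    ]
    print(summarize_pen(sample))
```

The real system presumably works at a far larger scale, fusing continuous video-derived behavior tracks with live sensor streams rather than a handful of records, but the farmer-facing output is the same kind of aggregate signal.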



“Really what we are hoping to do is provide these tools to farmers so that they can move their operations towards more sustainability,” Davé told The Financial Times.

“There may be an opportunity there to relieve some pressure on wild fishing if we made aquaculture very compelling from an operational and environmental perspective.” 

Developing Tidal

Tidal is the latest initiative of X, the Alphabet moonshot factory best known as the birthplace of self-driving vehicle subsidiary Waymo.

The opportunities and challenges of using tech under the sea made Tidal a natural fit for the experimental lab.

The ocean covers more than 70% of the planet’s surface, but more than 80% of it remains unexplored. This is partly due to the difficulties of developing technology that can function in the freezing temperatures, saltwater, crushing pressure, and darkness found deep underwater.

Fishing practices are therefore typically guided by manually inspecting a few fish pulled out of the sea.

The X team recognized that computer vision could do a more efficient job. But first, they needed to teach the system to see underwater.

They did this by training their algorithms on a new dataset of fish filmed in a paddling pool in Silicon Valley. The software was then hooked up to a stereo camera rig that’s lowered into a fish farming enclosure.
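
X hasn’t described its software stack, but a stereo camera rig like the one Tidal lowers into the pens generally recovers depth by matching the two views, which is what lets vision software measure and follow individual fish in three dimensions. Here is a minimal sketch of that building block using OpenCV’s semi-global block matcher; the file names, focal length, and baseline are hypothetical calibration values, not Tidal’s.

```python
import cv2
import numpy as np

# Hypothetical file names and calibration values, for illustration only.
LEFT_IMAGE, RIGHT_IMAGE = "left_frame.png", "right_frame.png"
FOCAL_LENGTH_PX = 1400.0   # focal length in pixels, from camera calibration
BASELINE_M = 0.12          # distance between the two cameras, in meters

# Load a rectified stereo pair captured by the rig.
left = cv2.imread(LEFT_IMAGE, cv2.IMREAD_GRAYSCALE)
right = cv2.imread(RIGHT_IMAGE, cv2.IMREAD_GRAYSCALE)

# Semi-global block matching estimates per-pixel disparity between the views.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # SGBM returns fixed-point values

# Depth in meters follows from disparity: Z = f * B / d.
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = FOCAL_LENGTH_PX * BASELINE_M / disparity[valid]

print(f"Median scene depth: {np.median(depth_m[valid]):.2f} m")
```

Underwater footage adds complications this sketch ignores, such as refraction, turbidity, and low light, which is part of why the team built its own training data before putting cameras into real pens.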

The system will initially be used to help fish farmers in Europe and Asia run their operations in more efficient and environmentally friendly ways.

The Tidal team believes their work could help wean consumers off meat and onto fish, which have a lower carbon footprint than other sources of animal protein. Ultimately, they want to apply the lessons from this project to other environmental applications that could help protect the ocean and the food, jobs, and air quality that depend on its health.


