These robots could help uncover the ocean's mysteries

A look at autonomous tools from three groups, used for monitoring marine mammals, mapping the seafloor, exploring shipwrecks, and more.
[Cover illustration: Francis Scialabba]

Humans have long used technology to help us understand the ocean. In the Middle Ages, we used mariner’s astrolabes to help us determine a ship’s location at sea. In the 1900s, we used early-stage sonar systems to get an idea of the ocean floor. And today, robots are helping with everything from monitoring endangered species to mapping the seafloor.

Startups specializing in autonomous ocean robots have slowly been gaining traction—and funding. Let’s take a look at three applications of ocean robots that could help humans explore the ocean and uncover its secrets.

For keeping an eye on marine mammals

Julie Angus, a Canadian adventurer-turned-robotics-entrepreneur, founded Open Ocean Robotics in 2018 as a way to bring together autonomous boats, solar energy, and long-term ocean monitoring.

The startup’s unmanned surface vehicles (USVs) are designed to be deployed for months at a time, primarily to monitor marine mammals and other species. In the past, it has worked on illegal-fishing enforcement and seafloor mapping, but today Open Ocean focuses mainly on observing endangered whale populations off the Canadian coast.

There’s not only an academic call for this type of monitoring, but also a commercial one: Many governments have specific requirements for protecting marine life in cases of ocean development, transportation, and even naval operations. Open Ocean has raised a total of $3.6 million to date, with its latest funding round in December.

The startup’s boats are loaded with sensors, thermal and visual cameras, and machine-learning algorithms that use those inputs to detect nearby vessels and marine mammals. During development, the startup swapped out lidar sensors in favor of automatic identification system (AIS) receivers, owing to lidar’s limited range and poor performance in rough conditions.
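For a sense of how that kind of sensor fusion might work, here is a minimal Python sketch of cross-checking camera detections against AIS contacts. The class names, bearing-matching rule, and thresholds are illustrative assumptions, not Open Ocean's actual system.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str           # "vessel" or "mammal", as classified by an onboard model
    bearing_deg: float  # bearing from the boat to the detected object
    confidence: float   # model confidence, 0.0 to 1.0

@dataclass
class AISContact:
    bearing_deg: float  # bearing of a vessel that is broadcasting AIS

def triage(detections, ais_contacts, min_conf=0.6, bearing_tol=5.0):
    """Cross-check camera detections against AIS: a 'vessel' detection that
    lines up with a known AIS contact needs no alert; anything else above
    the confidence floor gets flagged for the human operator."""
    alerts = []
    for d in detections:
        if d.confidence < min_conf:
            continue  # too uncertain to bother the operator
        matched = d.kind == "vessel" and any(
            abs(d.bearing_deg - c.bearing_deg) <= bearing_tol
            for c in ais_contacts
        )
        if not matched:
            alerts.append(d)  # mammal sighting, or a vessel with no AIS track
    return alerts
```

In this toy version, a marine-mammal sighting always reaches the operator, while a detected vessel that matches an AIS broadcast is treated as already accounted for.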

[Image: An Open Ocean Robotics employee monitors an ocean drone from a bank of screens. Credit: Open Ocean Robotics]

Even when the boats are autonomously collecting data on a pre-programmed route, there’s always a human in the loop: someone observing their operations from Open Ocean’s control room in Victoria, British Columbia.

“This kind of onboard autonomy is increasingly important when you’re on satellite connection, when you have limited bandwidth, because you’re not able to see the full suite of data coming in—so you become more dependent on the vessel for being able to make detection,” Angus said. “It also eases the impact on the human on the loop.”

Open Ocean began operating its first late-stage prototype boat in 2020, but the startup has since moved on to five commercial boats. Later this year, it plans to add 10 more.

For mapping the ocean floor

To date, just over 20% of the ocean floor has been mapped. Terradepth, an Austin-based robotics startup, wants to plot out the rest of it.

Founded by two former Navy SEALs, Terradepth uses autonomous underwater vehicles (AUVs) to collect data that fuels the startup’s cloud-based data-visualization platform. It raised a $20 million funding round earlier this month, on top of $10 million in 2019.

Terradepth currently operates three commercial vehicles, which it bought in January from Teledyne, an AUV manufacturer. The startup’s business model is based on collecting and mapping ocean data “in places previously impossible or impractical” and gathering the insights in a database with analytics tools.

Its game plan is to eventually pair two of its AUVs for at-sea missions: one runs on the surface, handling navigation, communication, and lithium-battery recharging, while the submersed vehicle collects data; when the submersed vehicle needs to charge, the two swap roles. At the moment, the startup is practicing with just one vehicle in both roles, but it hopes to begin paired missions in late 2023 or 2024. Supply-chain constraints, with component hardware taking two to three times as long to arrive as before, contribute to that timeline.
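The planned role swap can be pictured as a simple state machine: whichever vehicle is surveying drains its battery until it crosses a threshold, then trades places with its surfaced partner. The sketch below is a toy model under assumed drain and recharge rates, not Terradepth's actual control logic.

```python
SWAP_THRESHOLD = 0.2  # assumed battery fraction at which the vehicles swap roles

class AUV:
    def __init__(self, name, battery=1.0):
        self.name = name
        self.battery = battery
        self.role = None  # "survey" (submersed, collecting data) or "surface" (charging)

    def step(self):
        if self.role == "survey":
            # propulsion and sonar draw down the battery
            self.battery = max(0.0, self.battery - 0.1)
        else:
            # the surfaced vehicle recharges (rates are illustrative)
            self.battery = min(1.0, self.battery + 0.05)

def run_mission(a, b, steps):
    """Simulate a paired mission; returns how many role swaps occurred."""
    a.role, b.role = "survey", "surface"
    swaps = 0
    for _ in range(steps):
        a.step()
        b.step()
        surveyor = a if a.role == "survey" else b
        if surveyor.battery <= SWAP_THRESHOLD:
            a.role, b.role = b.role, a.role
            swaps += 1
    return swaps
```

Run over a few dozen time steps, the pair settles into a rhythm of alternating survey and recharge legs, which is the behavior the two-vehicle concept is after.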

Each vehicle is about nine meters long and one meter in diameter, allowing it to carry sensor payloads—like different types of sonars, magnetometers, optical cameras, and more. The startup is exploring radar payloads as well.


“But it doesn’t matter if you can fit a lot of stuff in your vehicle if you can’t power the stuff you put in your vehicle,” Joe Wolfel, Terradepth’s co-founder, told us. “So that’s where we did a lot of work on autonomous energy recharging.”

To parse the data it collects, Terradepth uses algorithms originally designed for lidar sensor data. The startup runs two different machine-learning models on the robots while at sea. But there’s a human in the loop, too: After analyzing the data onboard, the robot flags any anomalies, or objects of interest, and passes them to a human operator for review. Then, the human operator can order the robot to go back and re-scan an area, for example.
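That flag-and-review loop might look something like the Python sketch below; the scoring model, threshold, and re-scan queue are stand-ins for illustration, not Terradepth's pipeline.

```python
def onboard_triage(tiles, score_fn, threshold=0.8):
    """Run the onboard model over sonar tiles and flag likely anomalies.
    Only flagged tiles travel over the limited at-sea link for review."""
    return [t for t in tiles if score_fn(t) >= threshold]

def operator_review(flagged, approve_fn):
    """Human-in-the-loop step: the operator decides which flagged tiles
    warrant sending the robot back for a re-scan."""
    return [t for t in flagged if approve_fn(t)]

# Example: tiles are (x, y, intensity) tuples; a stand-in "model" scores by
# intensity, and the operator approves anything in a region of interest.
tiles = [(0, 0, 0.2), (1, 0, 0.9), (2, 1, 0.95)]
flagged = onboard_triage(tiles, score_fn=lambda t: t[2])
rescan_queue = operator_review(flagged, approve_fn=lambda t: t[0] >= 2)
```

The point of the split is bandwidth and trust: the robot does the heavy filtering at sea, and the human only rules on the short list.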

One big obstacle Terradepth and its competitors face: the lack of training data available for maritime target recognition. There’s a data-labeling problem as well, since the acoustic data that undersea robots collect isn’t as easily labeled as optical imagery.

“Ocean data is so expensive to collect that it’s by definition a precious asset,” Wolfel said. “So getting your hands on it for training purposes is actually more difficult than you’d think, especially at the resolution that you need.”

For climate monitoring and data collection

How’s this for a nickname: CARL-Bot, short for Caltech Autonomous Reinforcement Learning Robot.

That’s Caltech researchers’ new ocean-exploration tool—a capsule-shaped robot that could one day autonomously study the depths of the sea.

“Our specific project is somewhat inspired by the way animals navigate in the ocean,” Peter Gunnarson, a PhD student working on the CARL-Bot project, told us. “Many animals, like seals, can navigate in the dark just using their whiskers. And fish have a variety of ways to sense the water moving around them, and use that to navigate through currents and around obstacles. So we’re trying to see if we can replicate that style of flow-based navigation using AI.”

[Image: The CARL-Bot, a small ocean-mapping robot. Credit: California Institute of Technology]

The researchers are employing reinforcement learning—a machine-learning approach that works like trial and error—to help the robot “learn” to use the flow of water to navigate, collect data, track certain species’ biosignatures, and seek out potential research areas of interest, Gunnarson said. The researchers’ eventual aspiration: deploy a large group of robots to investigate the ocean together. Though each robot’s algorithm runs in isolation underwater, the robots can share data whenever they surface, improving the group’s collective performance.

“The nice thing about some of the modern reinforcement learning algorithms is that you can kind of have it learn on the fly: So basically, you can plop it in the ocean, and maybe you did a little bit of training in a simulation beforehand, but for the most part, it can just kind of learn as it’s navigating,” Gunnarson said. “So over time, it might get better at surveying the ocean—and doing it using less battery, or doing it in less time.”
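The trial-and-error loop Gunnarson describes can be illustrated with tabular Q-learning on a toy one-dimensional "current" world, where an agent must learn to thrust against a current that otherwise pushes it backward. This is a generic textbook sketch, not Caltech's algorithm.

```python
import random

N = 5                      # goal position; the agent starts at 0
ACTIONS = [0, 1]           # 0 = drift (save battery), 1 = thrust forward
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1  # learning rate, discount, exploration rate

def step(pos, action):
    if action == 1:
        pos = min(N, pos + 1)   # thrust overcomes the current
    else:
        pos = max(0, pos - 1)   # while drifting, the current pushes back
    reward = 1.0 if pos == N else -0.01  # small cost per step, prize at the goal
    return pos, reward, pos == N

def train(episodes=500, seed=0):
    random.seed(seed)
    q = {(s, a): 0.0 for s in range(N + 1) for a in ACTIONS}
    for _ in range(episodes):
        pos = 0
        for _ in range(200):  # cap episode length
            # epsilon-greedy: mostly exploit the current estimates, sometimes explore
            if random.random() < EPS:
                a = random.choice(ACTIONS)
            else:
                a = max(ACTIONS, key=lambda x: q[(pos, x)])
            nxt, r, done = step(pos, a)
            # standard Q-learning update toward reward plus discounted best next value
            q[(pos, a)] += ALPHA * (r + GAMMA * max(q[(nxt, b)] for b in ACTIONS)
                                    - q[(pos, a)])
            pos = nxt
            if done:
                break
    return q
```

After training, the learned values favor thrusting at every position short of the goal, mirroring the "learn as it's navigating" behavior Gunnarson describes: the agent starts with no model of the current and discovers a battery-spending policy purely from experience.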

Gunnarson and the Caltech team have spent two years on CARL-Bot, training it via simulation and—due to quarantine lab closures—testing it in Gunnarson’s own bathtub. But from a software-development standpoint, the robot’s small size has proven challenging.

“When you think of machine learning, usually [you] might think of desktop computers with high-powered graphics cards doing a lot of number-crunching, but what we’re trying to do is make an algorithm that could go on even the smallest of ocean robots,” Gunnarson said. He added, “It took a lot of work to write software at a lower level to fit onto microcontrollers, so that we could actually do all the learning on board this tiny little robot.”

Now, the team is working to choose sensors that will fit onto CARL-Bot, testing the algorithm they developed in simulation, and observing the robot’s performance in a brand-new, two-story water tank—a far cry from Gunnarson’s bathtub.
