Aquaculture Europe 2025

September 22 - 25, 2025

Valencia, Spain

ROBUST LOCALIZATION AND MAPPING FOR UUVS OPERATING IN INDUSTRIAL SCALE FISH FARMS

M. Job1*, D. Botta1, V. Reijgwart1, L. Ebner2, A. Studer2, R. Siegwart1 and E. Kelasidi1,3

1Autonomous Systems Lab, Institute of Robotics and Intelligent Systems, Department of Mechanical and Process Engineering, ETH Zurich, Switzerland

2Tethys Robotics, Zurich, Switzerland

3Department of Mechanical and Industrial Engineering, NTNU and Aquaculture Robotics and Automation Group, SINTEF Ocean, Trondheim, Norway

* Email: mjob30488@gmail.com



Introduction

The aquaculture industry has seen rapid growth over the last decades. This growth, however, presents new challenges in terms of ensuring efficient, safe, and sustainable operations [1]. Fish farming operations often involve a significant amount of manual labor, which can be physically demanding and dangerous. Tasks such as net inspection, maintenance, and repairs expose workers to hazardous underwater conditions, including rough seas, low visibility, and the presence of potentially harmful marine life. Addressing some of these problems, interest in robotic systems for aquaculture has also grown significantly in recent years [1]. However, robust operation in such dynamic environments requires increasing the level of autonomy of the underwater vehicles deployed in fish farms. This paper presents a general framework that integrates visual and acoustic sensor data to enhance localization and mapping in complex, highly dynamic underwater environments, with a particular focus on fish farming.

This work was financed by the Research Council of Norway through the projects CHANGE [2] and ResiFarm [3].

Materials and methods

Fig. 1 provides an overview of the proposed underwater localization and mapping framework, which allows UUVs to operate in dynamic underwater environments [4]. The framework investigates vision-based methods to: 1) obtain the relative 3D pose of a UUV with respect to a flexible and deformable structure, facilitating control strategies for autonomous net inspection operations in fish farms; 2) construct a depth map from mono-vision data, which is crucial both for collision-free autonomous UUV operations in dynamic environments and for obtaining a real-time map of the inspected area to identify irregularities such as holes or biofouling in net pens; 3) estimate the global pose of the UUV within the net pen using the available sensor measurements and the relative poses of the robot; and 4) create a detailed 3D map of the net-pen environment using data obtained from an industrial-scale fish farm (Fig. 1).
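To make step 2 concrete, the following minimal Python sketch (not the authors' implementation; the camera parameters, mesh size, and function names are illustrative assumptions) shows how an FFT-based estimate of the camera-to-net distance, derived from the dominant spatial frequency of the repeating net pattern, could serve as a metric prior to scale an otherwise up-to-scale monocular depth map.

# Minimal sketch, assuming a fronto-parallel net plane, a pinhole camera with known
# focal length fx, and a known net mesh size. The FFT-based prior rescales a relative
# (up-to-scale) monocular depth map to metric depth.
import numpy as np

def net_distance_from_fft(gray: np.ndarray, fx: float, mesh_size_m: float) -> float:
    """Estimate camera-to-net distance from the dominant spatial frequency
    (cycles/pixel) of the repeating net pattern."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(gray - gray.mean())))
    h, w = gray.shape
    cy, cx = h // 2, w // 2
    spec[cy - 2:cy + 3, cx - 2:cx + 3] = 0.0            # suppress the DC neighbourhood
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    cycles_per_px = np.hypot((py - cy) / h, (px - cx) / w)
    period_px = 1.0 / max(cycles_per_px, 1e-6)          # pixels per mesh period
    return fx * mesh_size_m / period_px                 # pinhole model: Z = f * X / x

def metric_depth(rel_depth: np.ndarray, gray: np.ndarray,
                 fx: float, mesh_size_m: float) -> np.ndarray:
    """Scale a relative monocular depth map so its median matches the FFT prior."""
    prior = net_distance_from_fft(gray, fx, mesh_size_m)
    return rel_depth * (prior / np.median(rel_depth))

if __name__ == "__main__":
    # Synthetic example: a periodic "net" pattern 2 m in front of the camera.
    fx, mesh = 800.0, 0.025                              # hypothetical intrinsics / mesh size
    u = np.arange(480)[:, None] + np.zeros((1, 640))     # row index per pixel
    v = np.arange(640)[None, :] + np.zeros((480, 1))     # column index per pixel
    period_px = fx * mesh / 2.0                          # mesh period in pixels at 2 m
    gray = np.sin(2 * np.pi * u / period_px) + np.sin(2 * np.pi * v / period_px)
    print(f"estimated net distance: {net_distance_from_fft(gray, fx, mesh):.2f} m")
    rel = 1.0 + 0.1 * np.random.default_rng(0).random((480, 640))   # fake relative depth
    print(f"median metric depth:    {np.median(metric_depth(rel, gray, fx, mesh)):.2f} m")

In the actual framework this role is played by feeding FFT-based priors to the TRU-Depth and Radar Meets Vision methods; the sketch only illustrates the underlying geometry of such a prior.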

Results

This section presents evaluations across four key areas: relative poses, depth image predictions, global pose estimates, and mapping [4]. Field trials were conducted at the SINTEF ACE facilities to obtain the datasets used for validation of the method. Fig. 2 shows the estimated net-relative distances, where the distance measurements from the DVL and the forward-facing ping echo sounder are compared with the estimates obtained from the modified FFT-based method (Method 1), TRU-Depth (Method 2), and Radar Meets Vision (Method 3), while Fig. 3 and Fig. 4 show the depth predictions and the obtained map.
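As an illustration of how such a comparison can be quantified, the short Python sketch below (hypothetical; it uses a synthetic stand-in for the time-aligned field-trial logs rather than the actual dataset) computes simple error statistics of each method's net-relative distance estimate against the DVL reference.

# Minimal sketch, assuming time-aligned distance tracks; the data below are synthetic
# placeholders, not results from the field trials.
import numpy as np

def error_stats(estimate: np.ndarray, reference: np.ndarray) -> dict:
    """Mean absolute error and RMSE of a distance estimate against a reference track."""
    err = estimate - reference
    return {"MAE [m]": float(np.mean(np.abs(err))),
            "RMSE [m]": float(np.sqrt(np.mean(err ** 2)))}

if __name__ == "__main__":
    t = np.linspace(0.0, 120.0, 1200)
    dvl = 1.5 + 0.3 * np.sin(0.05 * t)                    # reference net distance [m]
    rng = np.random.default_rng(0)
    methods = {                                           # noise levels are placeholders
        "FFT-based (Method 1)":           dvl + rng.normal(0.0, 0.05, t.size),
        "TRU-Depth (Method 2)":           dvl + rng.normal(0.0, 0.08, t.size),
        "Radar Meets Vision (Method 3)":  dvl + rng.normal(0.0, 0.10, t.size),
    }
    for name, est in methods.items():
        print(name, error_stats(est, dvl))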

Conclusion and future work

 This paper presents a vision-based framework for underwater localization and mapping, using a large dataset from industrial fish farms. The approach employs FFT-based priors to support TRU-Depth and Radar Meets Vision methods, enabling depth prediction from monocular images for 3D mapping. It also proposes techniques for estimating both net-relative and global poses of UUVs. Results show that combining depth prediction, wavemap methods, and pose estimation produces accurate volumetric maps, demonstrating real-world potential in underwater environments. Future work could integrate priors based on fish or other structures to enhance 3D scene reconstruction.

References

[1] Kelasidi, E., Svendsen, E. (2023). Robotics for Sea-Based Fish Farming. In: Zhang, Q. (eds) Encyclopedia of Smart Agriculture Technologies. Springer, Cham. https://doi.org/10.1007/978-3-030-89123-7_202-1

[2] CHANGE – An Underwater Robotics Concept for Dynamically Changing Environments. https://www.sintef.no/en/projects/2021/change-an-underwater-robotics-concept-for-dynamically-changing-environments/

[3] ResiFarm – Resilient Robotic Autonomy for Underwater Operations in Fish Farms. https://www.sintef.no/en/projects/2021/resifarm-resilient-robotic-autonomy-for-underwater-operations-in-fish-farms/

[4] Botta, D., Ebner, L., Studer, A., Reijgwart, V., Siegwart, R., Kelasidi, E. (2024). Framework for Robust Localization of UUVs and Mapping of Net Pens. In: IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024), Abu Dhabi, United Arab Emirates.