Aquaculture Europe 2022

September 27 - 30, 2022

Rimini, Italy

AUTOMATED FISH MONITORING AND SAMPLING IN RAS USING OAK-D

Ievgen Koliada*, Petr Císař, Jan Urban, Oleksandr Movchan

University of South Bohemia in České Budějovice, Faculty of Fisheries and Protection of Waters, South Bohemian Research Center of Aquaculture and Biodiversity of Hydrocenoses, Institute of Complex Systems, Zámek 136, 373 33 Nové Hrady, Czech Republic

E-mail: koliada@frov.jcu.cz

 



Introduction

Monitoring of fish for stock assessment in aquaculture and commercial fisheries is essential for economic and environmental management. Measurement of fish mass, one of the key stock assessment features, is among the most common and important practices in aquaculture management. Fish mass information plays an important role in managing feeding regimes, calculating oxygen consumption, dosing antibiotics, and deciding on grading time and the optimal time of harvest. Moreover, fish mass measurement has become even more critical in recirculating aquaculture systems (RAS), which have recently become popular among fish farmers (Bergheim et al., 2009). The most conventional method to estimate the mass of a fish population is to net a sample of fish from a tank and weigh them. This method is labour-intensive and stressful for the fish, so the possibility of measuring fish in a tank without human intervention is of great interest to the aquaculture community. Machine vision, as a non-invasive technique, has attracted researchers and aquaculturists over the past three decades for remote estimation of fish mass and size during cultivation without causing stress (Saberioon et al., 2016; Zion, 2012). However, it is usually complicated to transfer images from tank cameras to a central computer at the aquaculture facility, and this becomes more problematic as the number of tanks grows. The need for an individual camera for each tank and the processing of all data on a central computer make such systems difficult to apply under real aquaculture conditions. A solution is to detect the fish and estimate the weight directly on the camera device and send only the resulting information to the central computer (a wireless connection can be used). The OpenCV AI Kit with Depth (OAK-D) can effectively solve the described problems: individual fish are detected using a CNN, and the built-in stereo vision system is used to determine the distance to the fish and estimate its weight. The main objective of this study was to develop a real-time monitoring system using OAK-D for remote and automatic estimation of fish biomass in RAS.
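For illustration, a minimal sketch of such an on-device pipeline with the DepthAI Python API is given below; the blob file name, input size and confidence threshold are placeholders, not the exact configuration used in this study.

import depthai as dai

# Minimal OAK-D pipeline: RGB preview -> SSD MobileNet spatial detector,
# with stereo depth fused on-device so each detection carries a 3D position.
pipeline = dai.Pipeline()

cam = pipeline.create(dai.node.ColorCamera)
cam.setPreviewSize(300, 300)          # SSD MobileNet V2 input resolution
cam.setInterleaved(False)

mono_left = pipeline.create(dai.node.MonoCamera)
mono_right = pipeline.create(dai.node.MonoCamera)
mono_left.setBoardSocket(dai.CameraBoardSocket.LEFT)
mono_right.setBoardSocket(dai.CameraBoardSocket.RIGHT)

stereo = pipeline.create(dai.node.StereoDepth)
stereo.setDepthAlign(dai.CameraBoardSocket.RGB)
mono_left.out.link(stereo.left)
mono_right.out.link(stereo.right)

nn = pipeline.create(dai.node.MobileNetSpatialDetectionNetwork)
nn.setBlobPath("fish_ssd_mobilenet.blob")   # placeholder path to a compiled detector
nn.setConfidenceThreshold(0.5)              # placeholder threshold
cam.preview.link(nn.input)
stereo.depth.link(nn.inputDepth)

xout = pipeline.create(dai.node.XLinkOut)
xout.setStreamName("detections")
nn.out.link(xout.input)

with dai.Device(pipeline) as device:
    queue = device.getOutputQueue("detections", maxSize=4, blocking=False)
    while True:
        for det in queue.get().detections:
            # spatialCoordinates.z is the distance to the detected fish in millimetres
            print(det.label, det.confidence, det.spatialCoordinates.z)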

Materials and methods

The study was divided into several phases:

  1. Data collection

The OAK-D was mounted on top of a fish tank for initial data collection. First, the OAK-D was calibrated with a checkerboard (9×6 internal corners, square size 2.35 cm) according to the DepthAI documentation. The resulting disparity map was not satisfactory, especially at distances around 2.5 m, which is the depth range in which the OAK-D is used in our system. The calibration procedure was therefore redone with a larger checkerboard (8×6 internal corners, square size 9 cm), because the DepthAI calibration procedure could not detect the internal corners of the smaller checkerboard at that distance.
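Whether a given checkerboard is detectable at the working distance can be checked with OpenCV before running the full DepthAI calibration; the sketch below is illustrative only, and the image path is a placeholder.

import cv2

# Pattern size is given as (columns, rows) of INTERNAL corners;
# here the larger board used in the study: 8 x 6 corners, 9 cm squares.
PATTERN_SIZE = (8, 6)

img = cv2.imread("checkerboard_at_2_5m.png", cv2.IMREAD_GRAYSCALE)  # placeholder file
found, corners = cv2.findChessboardCorners(
    img, PATTERN_SIZE,
    flags=cv2.CALIB_CB_ADAPTIVE_THRESH | cv2.CALIB_CB_NORMALIZE_IMAGE)

if found:
    # Refine corner locations to sub-pixel accuracy
    corners = cv2.cornerSubPix(
        img, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.01))
    print("All %d internal corners detected" % len(corners))
else:
    print("Checkerboard not detected at this distance/resolution")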

The light conditions under which the system works are crucial. Since the water has to be transparent enough for reliable fish detection, the light sources were adjusted accordingly. Common carp was used as the test species. To estimate biomass and to establish the individual weight and length/width conversions, all fish were sampled (Table 1). Each individual fish was captured by the OAK-D at the depth closest to the surface. Several videos were recorded with different light sources. These data were manually processed and labelled in LabelImg v1.8.2. Two training data sets of 800 photos each were created for individual fish detection, with different resolutions: a Tiny YOLO v3 fish detector was trained at 416×416, and a second detector based on SSD MobileNet V2 was trained at 300×300.
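LabelImg stores each annotation as a Pascal VOC XML file next to the image; a small parser along the following lines (illustrative only, with a placeholder path) collects the bounding boxes for detector training.

import xml.etree.ElementTree as ET

def load_voc_boxes(xml_path):
    """Return a list of (label, xmin, ymin, xmax, ymax) from a LabelImg Pascal VOC file."""
    root = ET.parse(xml_path).getroot()
    boxes = []
    for obj in root.findall("object"):
        name = obj.find("name").text
        bb = obj.find("bndbox")
        boxes.append((name,
                      int(bb.find("xmin").text), int(bb.find("ymin").text),
                      int(bb.find("xmax").text), int(bb.find("ymax").text)))
    return boxes

# Example: collect all carp bounding boxes from one annotated frame (placeholder path)
print(load_voc_boxes("frames/fish_0001.xml"))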

A new data set of 2000 photos (300×300 resolution) under different light conditions was then collected. All photos were annotated in imglab, a simple graphical tool that is part of the dlib library. The data were randomly split into training and testing sets in the proportion 90% and 10%, respectively, and the SSD MobileNet fish detector was trained on this data set.
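The 90/10 split can be reproduced with a simple shuffle, as in the following sketch (placeholder directory, fixed seed for repeatability).

import random
from pathlib import Path

random.seed(42)  # fixed seed so the split is reproducible

images = sorted(Path("dataset_2000").glob("*.png"))  # placeholder directory
random.shuffle(images)

split = int(0.9 * len(images))
train_set, test_set = images[:split], images[split:]
print(f"{len(train_set)} training images, {len(test_set)} testing images")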

To localize individual landmarks on the detected fish, we used dlib's well-known shape predictor. dlib's find_min_global function was used to optimize the options and hyperparameters of the shape predictor training. dlib computes the mean average error (MAE) between the predicted and ground-truth landmark coordinates.
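A sketch of this optimization with the dlib Python API is given below; the dataset file names, the two tuned hyperparameters and their bounds are illustrative assumptions, not the settings used in the study.

import dlib

TRAIN_XML = "fish_landmarks_train.xml"  # placeholder imglab dataset files
TEST_XML = "fish_landmarks_test.xml"

def test_params(tree_depth, nu):
    """Train a shape predictor with the given hyperparameters and return its test MAE."""
    options = dlib.shape_predictor_training_options()
    options.tree_depth = int(tree_depth)
    options.nu = nu
    options.num_threads = 4
    dlib.train_shape_predictor(TRAIN_XML, "candidate_predictor.dat", options)
    # dlib reports the mean average error between predicted and ground-truth landmarks
    return dlib.test_shape_predictor(TEST_XML, "candidate_predictor.dat")

# Let dlib's global optimizer search the hyperparameter box in 30 training runs
best_params, best_mae = dlib.find_min_global(
    test_params,
    [2, 0.01],   # lower bounds: tree_depth, nu
    [5, 0.25],   # upper bounds
    30)
print("best (tree_depth, nu):", best_params, "test MAE:", best_mae)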

To convert pixel size to real length, the system was calibrated with a special white plate. Ten photos were captured at each of three locations in the tank (middle, edge, and in between), lowering the plate in 10 cm steps. The distance to the plate was measured with a 5 m ruler, and the conversion equation was obtained by fitting a linear model.
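Such a linear model can be fitted, for example, with numpy; in the sketch below the plate width, distances and pixel measurements are invented placeholders, and only the form of the calibration (centimetres per pixel as a linear function of distance) is assumed.

import numpy as np

# Hypothetical calibration records: measured distance to the white plate (cm)
# and the plate's apparent width in pixels at that distance (placeholder values).
PLATE_WIDTH_CM = 30.0                        # assumed real plate width
distance_cm = np.array([100, 150, 200, 250])
plate_width_px = np.array([310, 205, 152, 121])

# Centimetres represented by one pixel at each distance
cm_per_px = PLATE_WIDTH_CM / plate_width_px

# Fit a linear model: cm_per_px = a * distance + b
a, b = np.polyfit(distance_cm, cm_per_px, 1)

def pixels_to_cm(length_px, depth_cm):
    """Convert a measured pixel length to centimetres at the given depth."""
    return length_px * (a * depth_cm + b)

print(pixels_to_cm(180, 220))  # e.g. a 180 px long fish detected 2.2 m from the camera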

  2. Implementation and performance testing

The SSD MobileNet model was compiled for 4, 5, and 6 SHAVE cores. After applying both detectors, we observed detections of reflections without fish and incorrect landmark detections. To avoid misestimation of size, we used several conditional statements derived from our data set; these conditions constrain the distance to the bounding box and the distance between landmarks, since the real size range of our fish is known. Stereo vision provided the distance to each detected fish, and the weight of each detected fish was calculated with the conversion models. The number of fish (N) in the tank on the day the video was recorded was 21, and total biomass was calculated by multiplying the average fish weight by N.
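The filtering and biomass step could look roughly like the following sketch; the plausibility limits, the length-weight coefficients and the helper functions are hypothetical placeholders, and only the overall logic (reject implausible detections, convert length to weight, multiply the average weight by N) follows the text.

# Hypothetical plausibility limits and length-weight model; the real values
# in the study were derived from the sampled fish (Table 1).
MIN_DEPTH_MM, MAX_DEPTH_MM = 500, 2500      # accepted distance to the bounding box
MIN_LEN_CM, MAX_LEN_CM = 15.0, 60.0         # accepted fish length between landmarks
LW_A, LW_B = 0.015, 3.0                     # assumed coefficients of W = a * L^b (g, cm)
N_FISH = 21                                 # number of fish in the tank that day

def estimate_weight(length_cm):
    """Length-to-weight conversion (allometric form assumed for illustration)."""
    return LW_A * length_cm ** LW_B

def plausible(depth_mm, length_cm):
    """Reject reflections and bad landmark fits that fall outside the expected ranges."""
    return MIN_DEPTH_MM <= depth_mm <= MAX_DEPTH_MM and MIN_LEN_CM <= length_cm <= MAX_LEN_CM

def tank_biomass(detections):
    """detections: iterable of (depth_mm, length_cm) pairs from one recording session."""
    weights = [estimate_weight(l) for d, l in detections if plausible(d, l)]
    if not weights:
        return 0.0
    return sum(weights) / len(weights) * N_FISH   # average weight multiplied by N

# Example with made-up measurements (depth in mm, fish length in cm)
print(tank_biomass([(1800, 32.0), (2100, 35.5), (900, 4.0)]))  # the 4 cm "fish" is filtered out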

Results and discussion

This automated monitoring and sampling system can help farmers monitor fish welfare and growth rates continuously and in real time in high-intensity cultivation systems such as RAS. The SSD MobileNet fish detector showed better performance than Tiny YOLO; the loss at the last training step of the SSD MobileNet was 1.32. The best performance was obtained with the blob compiled for 6 SHAVE cores, at around 16 FPS. The dlib MAE on the training and testing sets was 3.5 and 15, respectively. Despite the limited quality of the underwater depth map, the OAK-D shows sufficient performance. As far as we know, there is no such system on the market. It can increase culture efficiency and productivity. More statistical data will be acquired, and the results will be published in a scientific journal.

Acknowledgement

The study was financially supported by the Ministry of Education, Youth and Sports of the Czech Republic - project „CENAKVA“ (LM2018099), the CENAKVA Centre Development [No. CZ.1.05/2.1.00/19.0380] and GAJU 013/2019/Z.

References:

Bergheim, A., Drengstig, A., Ulgenes, Y., Fivelstad, S., 2009. Production of Atlantic salmon smolts in Europe - current characteristics and future trends. Aquacultural Engineering 41, 46-52.

Saberioon, M., Gholizadeh, A., Cisar, P., Pautsina, A., Urban, J., 2016. Application of machine vision systems in aquaculture with emphasis on fish: state-of-the-art and key issues. Reviews in Aquaculture, doi: 10.1111/raq.12143.

Zion, B., 2012. The use of computer vision technologies in aquaculture – A review. Computers and Electronics in Agriculture 88, 125.