Aquaculture Europe 2025

September 22 - 25, 2025

Valencia, Spain

Session: 24/09/2025, 16:30 – 16:45 (Europe/Vienna) | Goleta, Hotel - Floor 14

UTILIZING ARTIFICIAL INTELLIGENCE OBJECT RECOGNITION FOR ANALYZING AND TRACKING HEALTH OF ARTEMIA, ROTIFER, SHRIMP, AND FISH IN HATCHERIES

1D. Johanson*, 2S. Calloni, 3L. Chiappi, 3T. De Wolf, 4R. Den Boer, 3G. Franchi, 4K. Fransen, 1F. Nagels, 4P. Obels, 1G. Rombaut

 

1INVE Technologies NV, Hoogveld 93, 9200 Dendermonde, Belgium

2INVE Asia Services Ltd., 471 Bond Street Tambon Bangpood Amphur Pakkred, Nonthaburi 11120, Thailand

3INVE Aquaculture Research Centre, Via P. Gigli snc, 57016 Rosignano Solvay (LI), Italy

4Aris B.V., Esp 300, 5633 AE Eindhoven, The Netherlands

*Email: d.johanson@inveaquaculture.com



Introduction

Fish and shrimp hatcheries worldwide use a variety of methods to monitor the growth of their animals and live feed. This is often a labor-intensive process that relies on the subjective judgment of field operators to produce accurate results. The ability of artificial intelligence (AI) models to recognize specified objects in digital images is well established and could be applied to counting animals and live feed in hatcheries, as well as to tracking their developmental stages. The aim of this study was to create an AI model, trained on images labeled by experts, together with digital support systems, that can recognize and track Artemia developmental stages, rotifers, rotifer eggs, early shrimp developmental stages, fish eggs, and fish egg quality for use in hatcheries around the world.

Materials and Methods

Samples containing examples of the target animals and life stages were prepared in 3 × 4 cell culture well plates. Images of the wells were taken using a digital camera and lens system that uploaded the files into a computer vision annotation tool (CVAT). The labels used in this image library include Artemia embryos, Artemia umbrellas, Artemia nauplii instar I, Artemia nauplii instar II, live rotifers, dead rotifers, rotifer eggs, shrimp nauplius, zoea, mysis and PL stages, and live and dead seabass and seabream eggs. The labels were applied manually by experts, and the AI model was trained on this dataset. A separate labeled dataset was used to validate the training. The model is derived from a You Only Look Once (YOLO) object recognition base, chosen for its speed and accuracy in tandem with a proper microscope-optics setup. The model was trained iteratively against the known results of the validation set until accuracy reached an asymptote. The trained model was then installed, with appropriate optics, on a custom device named the SnappArt 360 and tested in the field by technicians. The cycle of labeling, training, technician testing, and feedback was repeated until the technicians reported satisfaction with the model accuracy. A user-friendly data management system for the SnappArt 360 was also created so that technicians can track trends in animal growth from past measurements.
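The counting step downstream of object detection can be illustrated with a small sketch. The detection model itself is not reproduced here; this assumes a hypothetical list of (label, confidence) detections returned for one well image and aggregates them into per-class counts and densities for the 250 µL sample volume. All names are illustrative and are not the actual SnappArt 360 API.

```python
from collections import Counter

SAMPLE_VOLUME_ML = 0.25  # 250 µL per well, as in the sampling protocol


def count_detections(detections, min_confidence=0.5):
    """Aggregate (label, confidence) detections into per-class counts.

    `detections` is a hypothetical model output: a list of
    (class_label, confidence) tuples for one well image.
    Detections below the confidence threshold are discarded.
    """
    return dict(Counter(
        label for label, conf in detections if conf >= min_confidence
    ))


def to_density_per_ml(counts):
    """Convert raw per-well counts to organisms per mL of sample."""
    return {label: n / SAMPLE_VOLUME_ML for label, n in counts.items()}


# Illustrative detections for one well (labels follow the study's classes)
dets = [
    ("artemia_nauplius_I", 0.92),
    ("artemia_nauplius_I", 0.88),
    ("artemia_umbrella", 0.75),
    ("artemia_embryo", 0.40),   # below threshold, excluded
]
counts = count_detections(dets)
print(counts)                    # {'artemia_nauplius_I': 2, 'artemia_umbrella': 1}
print(to_density_per_ml(counts)) # {'artemia_nauplius_I': 8.0, 'artemia_umbrella': 4.0}
```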

Results and Discussion

Raw counts per sampled well (250 µL sample volume) of Artemia nauplii, umbrellas, and embryos were used to compare the accuracy of the SnappArt AI model against technician results for Artemia stage objects. Figure 1 shows the distributions of the differences between the technician-measured and SnappArt-measured counts. The average differences are 0.02, -0.03, and 0.03, with standard deviations of 1.00, 0.53, and 0.41, and there is no statistically significant difference between technicians and the SnappArt AI model (p-values of 0.728, 0.355, and 0.238 for nauplii, umbrella, and embryo counts, respectively).
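A comparison of this kind can be set up as a one-sample t-test on the per-well differences (technician count minus SnappArt count) against a mean of zero; the abstract does not name the exact test used, so this is an assumption. The sketch below computes the summary statistics and the t statistic with the standard library; converting t to a p-value would require a t-distribution CDF (e.g. SciPy) and is omitted.

```python
import statistics
from math import sqrt


def difference_t_statistic(tech_counts, ai_counts):
    """One-sample t statistic testing whether the mean per-well
    difference (technician - AI) differs from zero.

    Returns (mean_diff, sd_diff, t, dof).
    """
    diffs = [t - a for t, a in zip(tech_counts, ai_counts)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)      # sample standard deviation (n - 1)
    t_stat = mean_d / (sd_d / sqrt(n))  # standard error of the mean difference
    return mean_d, sd_d, t_stat, n - 1


# Illustrative per-well counts (not data from the study)
tech = [12, 10, 11, 9, 13, 10]
ai = [12, 11, 10, 9, 12, 10]
mean_d, sd_d, t, dof = difference_t_statistic(tech, ai)
print(f"mean diff = {mean_d:.2f}, sd = {sd_d:.2f}, t = {t:.2f}, dof = {dof}")
```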

Results for the other objects reflect the accuracy found for Artemia, with 95% of seabass and seabream eggs and over 95% of rotifers and rotifer eggs identified by the SnappArt AI model under the correct live or dead annotation. Figure 2 shows the accuracy of rotifer animal and egg counting.
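The live/dead annotation accuracy reported here reduces to per-object agreement between the model's label and the expert label. A minimal sketch, assuming paired lists of expert and model labels (not the actual evaluation code):

```python
def annotation_accuracy(expert_labels, model_labels):
    """Fraction of objects where the model's live/dead label
    matches the expert annotation for the same object."""
    if len(expert_labels) != len(model_labels):
        raise ValueError("label lists must be paired per object")
    matches = sum(e == m for e, m in zip(expert_labels, model_labels))
    return matches / len(expert_labels)


# Illustrative labels for five detected rotifers
expert = ["live", "live", "dead", "live", "dead"]
model = ["live", "live", "dead", "dead", "dead"]
print(annotation_accuracy(expert, model))  # 0.8
```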

Furthermore, results for shrimp show an accuracy of 99%, with no statistically significant difference between technician and SnappArt AI counts across all larval stages.

These results indicate that AI modeling could be a promising option for future automation of counting and health monitoring of hatchery animals across a wide variety of species.