Introduction
In aquaculture, the monitoring and analysis of fish behaviour provide crucial insights into fish welfare. Notably, operational welfare indicators (OWIs) and laboratory-based welfare indicators (LABWIs) have emerged as promising tools for welfare auditing, as highlighted by Noble et al. (2018). Recent advances in computer vision have opened up possibilities for improving the monitoring of fish behaviour, making it an invaluable resource for welfare evaluation, as it enables non-invasive inspection across diverse applications. Despite notable progress in computer vision, particularly in object detection (Wen et al. 2021), high-density scenarios remain a persistent challenge. This study explored the potential of pose estimation (Jocher et al. 2023) in Atlantic salmon aquaculture to automatically assess swimming behaviour during feeding events in tanks.
Material and Methods
Video recordings captured with GoPro Hero4 Black cameras were used to investigate the behaviour of Atlantic salmon (S. salar) reared in hexagonal 3300-litre tanks, each holding approximately 90 fish (> 500 g). The water current in the tanks was clockwise. Fig. 1 gives an overview of the method. A key point annotation scheme, shown in Fig. 1(a), was used to annotate each fish's snout, dorsal fin, and tail base. The training dataset was prepared by subsampling videos of four tanks and annotating them with the CVAT annotation tool (CVAT 2023) in COCO key point format. Fig. 1(c) depicts the overall pipeline of the employed method. The COCO key point annotations were converted into YOLOv8 format to train the pose model, which was then used to predict fish poses in the experimental/test videos. The predicted key points were used to calculate orientation flows. The orientation flow is the angle of the fish's heading relative to the direction opposing the water current, as shown in Fig. 1(b); it is calculated as the angle between the tangential vector at the dorsal fin and the vector from the dorsal fin to the snout. As shown in Fig. 1(d), a spatio-temporal analysis is performed by visualising the orientation flow over time. A common observation is that fish tend to swim facing the water current unless disturbed by some event. Finally, the results are summarised as an orientation score, defined as the absolute value of the mean of the orientation flows over time, reflecting the spread of the distribution. The YOLOv8 pose model was trained on a workstation with an Intel(R) Core(TM) i9-10885H CPU, 64 GB RAM, and an NVIDIA Quadro RTX 4000 GPU, using Python 3.9.
Results and discussion
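As a minimal sketch of the orientation-flow and orientation-score calculations described in the Material and Methods: the tank-centre coordinate and the sign convention for the clockwise current (image coordinates) are assumptions not specified above.

```python
import math

def orientation_flow(snout, dorsal, tank_centre):
    """Orientation flow (degrees): signed angle between the fish heading
    (dorsal fin -> snout) and the direction opposing the water current
    at the dorsal-fin position; 0 means the fish faces the current."""
    # Radial vector from the tank centre to the dorsal fin.
    rx = dorsal[0] - tank_centre[0]
    ry = dorsal[1] - tank_centre[1]
    # Tangential direction opposing a clockwise current (the sign
    # convention for image coordinates is an assumption).
    tx, ty = ry, -rx
    # Fish heading vector.
    hx = snout[0] - dorsal[0]
    hy = snout[1] - dorsal[1]
    # Signed angle between heading and the anti-current tangent.
    return math.degrees(math.atan2(hx * ty - hy * tx, hx * tx + hy * ty))

def orientation_score(flows):
    """Absolute value of the mean orientation flow over a time window."""
    return abs(sum(flows) / len(flows))
```

Under this convention, a fish heading straight into the current yields a flow of 0 degrees, so a score near zero indicates the school is aligned against the current, while larger scores reflect a more dispersed distribution of headings.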
A few results from the analyses are presented here to illustrate the relevance of the tool for documenting and understanding behaviour. Fig. 2(a) shows the pose model evaluation metrics, and a snapshot of the predicted key points is shown in Fig. 2(b). Fig. 2(c) demonstrates the orientation scores during two key events: feeding and disturbance caused by the presence of persons around the tanks. The study demonstrates that the proposed method can interpret the dispersion of fish from their swimming orientation, making it a potential tool for welfare auditing in experimental studies. The results also show that the tool has the potential to explore behaviour in different scenarios and settings. We intend to apply it in different contexts, develop it further, and assess its utility for welfare documentation.
References
Noble, C., Gismervik, K., Iversen, M. H., Kolarevic, J., Nilsson, J., Stien, L. H. & Turnbull, J. F. (Eds.) (2018). Welfare Indicators for farmed Atlantic salmon: tools for assessing fish welfare. 351 pp. ISBN 978-82-8296-556-9.
Wen, L., Du, D., Zhu, P., Hu, Q., Wang, Q., Bo, L., & Lyu, S. (2021). Detection, tracking, and counting meets drones in crowds: A benchmark. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (pp. 7812-7821).
Jocher, G., Chaurasia, A. & Qiu, J. (2023). Ultralytics YOLOv8, https://github.com/ultralytics/ultralytics
OpenCV (2023). CVAT: Computer Vision Annotation Tool. GitHub. https://github.com/opencv/cvat
Acknowledgements
This research has been funded by the Nofima AI-WELL project, a spin-off from the Nofima project DigitalAqua https://nofima.com/projects/digitalaqua/. This research has used a dataset kindly provided by the CrowdMonitor project, funded by the Norwegian Seafood Research Fund (FHF), project number 901595.