
A computer vision framework for barn-wide monitoring of dairy cows in free-traffic systems with robotic milking

Fiorilla E.; Vitturini D.; Bovo M.; Grangetto M.; Ozella L.
2026-01-01

Abstract

Free-traffic cow barns equipped with robotic milking systems are increasingly recognized for supporting better animal welfare, as cows can move according to their own schedule. However, this freedom may also introduce management challenges, such as congestion, uneven cow flow, and inefficient use of barn space. This study aimed to develop and test a non-invasive computer-vision framework capable of continuously monitoring cow movement in a free-traffic barn under real commercial conditions. To manage the large amount of video produced, an entropy-based selection process condensed >216,000 frames into 600 diverse, representative images. Subsequently, three YOLO models (v8x, v11x, and v12x) were trained and compared across 675 training experiments to find the most accurate and efficient setup for detecting cows and supporting barn-wide spatial analysis. Across these tests, image resolution had the largest effect on performance, while factors such as optimizer choice and batch size played a smaller role. On an independent test set, YOLOv11x offered the best overall balance between accuracy and computational load (mAP@[0.50:0.95] = 0.735; precision = 0.862; recall = 0.897). YOLOv12x achieved a slightly higher mAP (0.739) but required more processing power. Cow centroids were projected onto a calibrated barn map to create movement trajectories and highlight areas where cows tended to cluster or slow down. Overall, the framework performed efficiently under real farm conditions, reducing annotation work and providing a scalable, non-invasive way to track spatial behaviour. The findings show that computer vision can support data-driven decisions about barn layout and management, improving efficiency in automated dairy systems.
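The entropy-based frame selection mentioned in the abstract can be sketched as follows. The paper's exact scoring and diversity criteria are not given in this record, so this is a minimal illustration assuming Shannon entropy over an 8-bit grayscale histogram and a simple top-k ranking; the function names are illustrative, not the authors' code.

```python
import math

def shannon_entropy(pixels):
    """Shannon entropy (in bits) of a sequence of 8-bit grayscale pixels."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    # Sum -p*log2(p) over non-empty histogram bins
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

def select_representative(frames, k):
    """Keep the k highest-entropy frames.

    frames: list of (frame_id, pixels) pairs; returns the chosen frame ids.
    """
    ranked = sorted(frames, key=lambda f: shannon_entropy(f[1]), reverse=True)
    return [frame_id for frame_id, _ in ranked[:k]]
```

In the study this step reduced >216,000 frames to 600 images; a real pipeline would likely also enforce diversity (for example, temporal spacing between selected frames) rather than rank by entropy alone.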
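Projecting detected cow centroids from image coordinates onto a calibrated barn map is, in the common planar case, a homography transform. The record does not describe the authors' calibration method, so this is a generic sketch: the 3x3 matrix H is assumed to come from a prior calibration step (e.g. matched reference points on the barn floor).

```python
def project_point(H, x, y):
    """Map an image point (x, y) to barn-map coordinates via a 3x3 homography H.

    H is a 3x3 matrix given as nested lists; the result is obtained after
    the perspective division by the third homogeneous coordinate.
    """
    u = H[0][0] * x + H[0][1] * y + H[0][2]
    v = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return u / w, v / w
```

Applying this per frame to each detected centroid yields the map-space trajectories from which clustering and slowdown areas can be derived.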
2026
13
1
13
Automated milking systems; Computer vision; Dairy cows; Free cow traffic; Multi-camera monitoring; Precision livestock farming
Fiorilla E.; Vitturini D.; Bovo M.; Grangetto M.; Ozella L.
Files in this record:
Fiorilla_et_al_2026.pdf (Open access, Adobe PDF, 3.87 MB)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/2137992
Citations
  • PMC: n/a
  • Scopus: 0
  • Web of Science: 0