Discrimination between the facial gestures of vocalising and non-vocalising lemurs and small apes using deep learning

Carugati, Filippo (first author); Friard, Olivier; Protopapa, Elisa; De Gregorio, Chiara; Valente, Daria; Ferrario, Valeria; Cristiano, Walter; Raimondi, Teresa; Torti, Valeria; Miaretsoa, Longondraza; Giacoma, Cristina; Gamba, Marco (last author)
2024-01-01

Abstract

Studies of facial expressions are essential to animal communication research, but manual inspection is practical only for small datasets. Deep learning techniques can help discriminate facial configurations associated with vocalisations across large datasets. We extracted and labelled frames of different primate species, trained deep-learning models to identify key points on their faces, and computed distances between those points to characterise facial gestures. We then used machine learning algorithms to classify vocalised and non-vocalised gestures across species. The algorithms achieved higher-than-chance classification rates, some exceeding 90%. Our work employs deep learning to map primate facial gestures and offers an innovative application of pose-estimation systems. Our approach facilitates the investigation of facial repertoires across primate species and behavioural contexts, enabling comparative research in primate communication.
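The pipeline the abstract describes — key points estimated per frame by a pose-estimation model, pairwise distances between them used as features, then a supervised classifier separating vocalised from non-vocalised gestures — can be sketched as below. This is an illustrative reconstruction, not the authors' code: the toy key-point layout and the nearest-centroid classifier are assumptions (the study used DeepLabCut for pose estimation and does not specify this classifier).

```python
import numpy as np

def keypoint_distances(keypoints):
    """Flat vector of the unique pairwise Euclidean distances between
    facial key points.

    keypoints: array of shape (n_points, 2) holding (x, y) coordinates,
    e.g. as estimated for one video frame by a pose-estimation model
    such as DeepLabCut.
    """
    diffs = keypoints[:, None, :] - keypoints[None, :, :]
    dist = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(keypoints), k=1)   # upper triangle, no diagonal
    return dist[iu]

def fit_centroids(X, y):
    """Per-class mean feature vector (a minimal stand-in classifier)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(centroids, X):
    """Assign each row of X to the class with the nearest centroid."""
    labels = list(centroids)
    d = np.stack([np.linalg.norm(X - centroids[c], axis=1) for c in labels])
    return np.array(labels)[d.argmin(axis=0)]

# Hypothetical frame with 4 key points (e.g. mouth corners, upper/lower lip).
frame = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0], [3.0, 4.0]])
features = keypoint_distances(frame)   # 6 pairwise distances
```

In the study's setting, each labelled frame would yield one such distance vector, and the classifier would be trained on vectors labelled "vocalised" vs "non-vocalised".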
https://www.sciencedirect.com/science/article/pii/S1574954124003893
Keywords: Primate face, Indri indri, Propithecus diadema, Nomascus gabriellae, DeepLabCut, Acoustic communication
Carugati, Filippo; Friard, Olivier; Protopapa, Elisa; Mancassola, Camilla; Rabajoli, Emanuela; De Gregorio, Chiara; Valente, Daria; Ferrario, Valeria; ...
Files in this product:
File: 1-s2.0-S1574954124003893-main-3.pdf
Open access
Description: PREPROOF
File type: PREPRINT (FIRST DRAFT)
Size: 1.71 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/2032511
Citations
  • PMC: not available
  • Scopus: 2
  • Web of Science: 2