Discrimination between the facial gestures of vocalising and non-vocalising lemurs and small apes using deep learning
Carugati, Filippo; Friard, Olivier; Protopapa, Elisa; De Gregorio, Chiara; Valente, Daria; Ferrario, Valeria; Cristiano, Walter; Raimondi, Teresa; Torti, Valeria; Miaretsoa, Longondraza; Giacoma, Cristina; Gamba, Marco
2024-01-01
Abstract
Studies of facial expression are essential to animal communication research, but manual inspection methods are practical only for small datasets. Deep learning techniques can help discriminate the facial configurations associated with vocalisations across large datasets. We extracted and labelled frames of different primate species, trained deep-learning models to identify key points on their faces, and computed distances between them to characterise facial gestures. We then used machine learning algorithms to classify vocalised and non-vocalised gestures across species. The algorithms achieved higher-than-chance correct classification rates, with some exceeding 90%. Our work employs deep learning to map primate facial gestures and offers an innovative application of pose estimation systems. Our approach facilitates the investigation of facial repertoires across primate species and behavioural contexts, enabling comparative research in primate communication.

| File | Size | Format | Access |
|---|---|---|---|
| 1-s2.0-S1574954124003893-main-3.pdf (preprint, "PREPROOF") | 1.71 MB | Adobe PDF | Open access |
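The workflow described in the abstract (facial keypoints from pose estimation, pairwise distances as features, a classifier for vocalised vs. non-vocalised gestures) can be sketched as follows. This is an illustrative sketch only: the keypoint layout, synthetic data, and the random-forest classifier are assumptions, not the authors' actual pipeline or models.

```python
# Hedged sketch of the abstract's pipeline: keypoints -> pairwise
# distances -> binary gesture classifier. All names and parameters
# here are illustrative assumptions, not the paper's implementation.
from itertools import combinations

import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)


def pairwise_distances(keypoints):
    """Flatten all pairwise Euclidean distances between facial keypoints.

    keypoints: array of shape (n_points, 2) holding (x, y) coordinates,
    e.g. as output by a pose-estimation model.
    """
    return np.array([
        np.linalg.norm(keypoints[i] - keypoints[j])
        for i, j in combinations(range(len(keypoints)), 2)
    ])


# Synthetic stand-in data: 200 video frames, 8 facial keypoints each.
n_frames, n_points = 200, 8
frames = rng.normal(size=(n_frames, n_points, 2))
labels = rng.integers(0, 2, size=n_frames)  # 1 = vocalised, 0 = not

# 8 keypoints yield C(8, 2) = 28 distance features per frame.
X = np.stack([pairwise_distances(f) for f in frames])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
```

With real labelled frames, the distance features would be computed from model-predicted keypoints rather than random data, and classification rates would be compared against chance as in the study.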
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.



