Artificial Intelligence and Child Abuse and Neglect: A Systematic Review

Lupariello, Francesco; Sussetto, Luca; Di Trani, Sara; Di Vella, Giancarlo
2023-01-01

Abstract

All societies should carefully address the phenomenon of child abuse and neglect because of its acute and chronic sequelae. Although artificial intelligence (AI) could be helpful in this field, the state of the art of its implementation is not known: no studies have comprehensively reviewed the types of AI models that have been developed and validated, and no indications about the risk of bias of those studies are available. For these reasons, the authors conducted a systematic review of the PubMed database to answer the following questions: “What is the state of the art of the development and/or validation of AI predictive models aimed at countering the child abuse and neglect phenomenon?” and “What is the risk of bias of the included articles?”. The inclusion criteria were: articles written in English and published from January 1985 to 31 March 2023; publications that used a medical and/or protective-service dataset to develop and/or validate AI prediction models. The reviewers screened 413 articles, of which seven were included. Their analysis showed that the types of input data were heterogeneous; artificial neural networks, convolutional neural networks, and natural language processing were used; the datasets had a median size of 2600 cases; and the risk of bias was high for all studies. The results of the review point out that the implementation of AI in the child abuse and neglect field lags behind other medical fields. Furthermore, the evaluation of the risk of bias suggests that future studies should provide an appropriate choice of sample size, proper validation, and adequate management of overfitting, optimism, and missing data.
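As a purely illustrative aside (not taken from any of the seven included papers), the sketch below shows, on a synthetic tabular dataset of 2600 cases (the review's median dataset size), how a small artificial neural network classifier could address three of the risk-of-bias concerns named above: explicit handling of missing data, internal validation, and control of optimism through cross-validation. All feature values, labels, and model settings are assumptions made only for the example.

import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in data: 2600 cases (the review's median dataset size),
# 12 hypothetical tabular features, a binary outcome, ~10% missing values.
n_cases, n_features = 2600, 12
X = rng.normal(size=(n_cases, n_features))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n_cases) > 0).astype(int)
X[rng.random(X.shape) < 0.10] = np.nan  # inject missing values after labelling

model = Pipeline([
    ("impute", SimpleImputer(strategy="median")),   # explicit missing-data handling
    ("scale", StandardScaler()),
    ("ann", MLPClassifier(hidden_layer_sizes=(32, 16),
                          early_stopping=True,      # curbs overfitting
                          max_iter=500,
                          random_state=0)),
])

# 5-fold cross-validated AUC is a less optimistic performance estimate
# than scoring the model on the data it was trained on.
auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated AUC: {auc.mean():.2f} +/- {auc.std():.2f}")

Note that the imputer and scaler sit inside the cross-validated pipeline, so preprocessing is refit on each training fold and no information leaks from the validation folds into the estimate.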
2023, Children, Vol. 10, Article No. 1659, pp. 1–14
https://www.mdpi.com/2227-9067/10/10/1659
artificial intelligence; child abuse; machine learning; deep learning; neglect; child sexual abuse; violence against children
File: children-10-01659-v2.pdf (open access, publisher's PDF, Adobe PDF, 550 kB)


Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1943370
Citations
  • PubMed Central: 1
  • Scopus: 3
  • Web of Science (ISI): 0