
The Role of Activation Function in Neural NER for a Large Semantically Annotated Corpus

Amin, Muhammad Saad;Anselma, Luca;Mazzei, Alessandro
2022-01-01

Abstract

Information extraction is one of the core tasks of natural language processing. Various recurrent neural network-based models have been implemented to perform text classification tasks such as named entity recognition (NER). Several factors influence the performance of recurrent networks, and the choice of activation function is one of them. Yet, no study has thoroughly analyzed the effect of the activation function on NER-based classification of textual data. In this paper, we implement a Bi-LSTM-based CRF model for named entity recognition on a semantically annotated corpus, the Groningen Meaning Bank (GMB), and analyze the impact of all non-linear activation functions on the performance of the neural network. Our analysis shows that only the Sigmoid, Exponential, SoftPlus, and SoftMax activation functions perform efficiently in the NER task, achieving average accuracies of 95.17%, 95.14%, 94.38%, and 94.76%, respectively.
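A minimal sketch (not the authors' code) of the kind of setup the abstract describes: a Keras Bi-LSTM token tagger whose intermediate dense activation can be swapped to compare the functions under study. The CRF output layer is simplified here to a per-token softmax, and the vocabulary size, tag count, sequence length, and layer widths are assumed placeholders.

# Minimal sketch, assuming Keras; hyperparameters and sizes are illustrative.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 35000   # assumed GMB vocabulary size (placeholder)
NUM_TAGS = 17        # assumed number of NER tags (placeholder)
MAX_LEN = 50         # assumed padded sentence length (placeholder)

def build_bilstm_tagger(activation="sigmoid"):
    """Bi-LSTM sequence tagger; `activation` is the non-linearity under study."""
    inputs = layers.Input(shape=(MAX_LEN,))
    x = layers.Embedding(input_dim=VOCAB_SIZE, output_dim=64)(inputs)
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)
    # Dense layer whose activation is swapped per experiment
    # (sigmoid, exponential, softplus, softmax, relu, tanh, ...).
    x = layers.TimeDistributed(layers.Dense(64, activation=activation))(x)
    # Simplification: a softmax classifier stands in for the CRF output layer.
    outputs = layers.TimeDistributed(
        layers.Dense(NUM_TAGS, activation="softmax"))(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Example: instantiate one model per activation reported as performing best.
for act in ["sigmoid", "exponential", "softplus", "softmax"]:
    model = build_bilstm_tagger(activation=act)
    model.summary()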
Year: 2022
Conference: 2022 International Conference on Emerging Trends in Electrical, Control, and Telecommunication Engineering (ETECTE)
Location: Lahore, Pakistan
Dates: 02-04 December 2022
Published in: 2022 International Conference on Emerging Trends in Electrical, Control, and Telecommunication Engineering (ETECTE) Conference Proceedings
Publisher: IEEE
Pages: 1-6
ISBN: 978-1-6654-9242-3
URL: https://ieeexplore.ieee.org/document/10007317
Keywords: activation functions, Groningen Meaning Bank (GMB), named entity recognition, recurrent neural networks
Files in this record:
File: IEEE_Role of Activation Function in Neural NER (1).pdf
Access: Open Access
Size: 836.93 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2318/1891092
Citations
  • Scopus: 0