The Role of Activation Function in Neural NER for a Large Semantically Annotated Corpus
Amin, Muhammad Saad; Anselma, Luca; Mazzei, Alessandro
2022-01-01
Abstract
Information extraction is a core task of natural language processing. Various recurrent neural network-based models have been applied to text classification tasks such as named entity recognition (NER). Several factors affect the performance of recurrent networks, and the choice of activation function is one of them. Yet, no study has thoroughly analyzed the effect of the activation function on NER-based classification of textual data. In this paper, we implement a Bi-LSTM-based CRF model for named entity recognition on a semantically annotated corpus, the Groningen Meaning Bank (GMB), and analyze the impact of all non-linear activation functions on the performance of the neural network. Our analysis shows that only the Sigmoid, Exponential, SoftPlus, and SoftMax activation functions performed efficiently in the NER task, achieving average accuracies of 95.17%, 95.14%, 94.38%, and 94.76%, respectively.
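The four activation functions the abstract singles out follow standard formulas, which can be sketched in plain Python for reference (this is an illustrative sketch only; the paper's actual Bi-LSTM-CRF implementation is not reproduced here):

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def exponential(x):
    # exp(x): unbounded, strictly positive output.
    return math.exp(x)

def softplus(x):
    # Smooth approximation of ReLU: log(1 + exp(x)).
    return math.log1p(math.exp(x))

def softmax(xs):
    # Normalises a score vector into a probability distribution,
    # subtracting the max for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

Sigmoid, Exponential, and SoftPlus act element-wise on a unit's pre-activation, while SoftMax operates over a whole vector of scores, which is why it is the typical choice for the per-token label distribution in a tagging model.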
File: IEEE_Role of Activation Function in Neural NER (1).pdf (open access, 836.93 kB, Adobe PDF)
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.