Open Access
Research of Attention-Based Bi-GRU-CRF for Slot Filling
Author(s) - Luwang Zhou, Zhimin Huang, Ying Nie
Publication year - 2021
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1757/1/012077
Subject(s) - conditional random field , utterance , computer science , exploit , benchmark (surveying) , artificial intelligence , key (lock) , field (mathematics) , sequence labeling , sequence (biology) , recurrent neural network , natural language processing , artificial neural network , mathematics , computer security , geodesy , biology , pure mathematics , genetics , task (project management) , geography , management , economics
Slot Filling (SF) is a critical part of spoken language understanding (SLU) that aims to capture the semantic constituents of a given utterance. It is typically treated as a sequence labeling problem, and recurrent neural networks have recently shown promising effectiveness on this task. Considering the interrelated information carried by adjacent words and labels, we present a novel approach that combines a bi-directional gated recurrent unit (Bi-GRU), an attention mechanism, and a conditional random field (CRF). Our model can utilize interrelated information from neighboring words, highlight key information, and exploit the dependencies among the labels of surrounding words. Empirical experiments show that our model significantly boosts the F1 score, with around 1% and 5.1% relative improvement on the two public benchmark datasets ATIS and SNIPS, respectively.
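
The abstract only names the three components (Bi-GRU encoder, attention, CRF output layer), so the following PyTorch sketch is an illustrative reading of that architecture, not the authors' exact model: the layer sizes, the additive self-attention formulation, and the use of the third-party pytorch-crf package are all assumptions.

import torch
import torch.nn as nn
from torchcrf import CRF  # pip install pytorch-crf (assumed dependency)


class BiGRUAttnCRF(nn.Module):
    """Sketch of a Bi-GRU + attention + CRF slot-filling tagger."""

    def __init__(self, vocab_size, num_tags, emb_dim=100, hidden_dim=128):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        # Bidirectional GRU captures context from both neighboring directions.
        self.bigru = nn.GRU(emb_dim, hidden_dim, batch_first=True,
                            bidirectional=True)
        # Additive attention over the hidden states, used to highlight
        # key information in the utterance (one global context vector here).
        self.attn_score = nn.Linear(2 * hidden_dim, 1)
        self.emissions = nn.Linear(4 * hidden_dim, num_tags)
        # CRF layer models dependencies between adjacent slot labels.
        self.crf = CRF(num_tags, batch_first=True)

    def _encode(self, tokens, mask):
        h, _ = self.bigru(self.embedding(tokens))            # (B, T, 2H)
        scores = self.attn_score(h).squeeze(-1)              # (B, T)
        scores = scores.masked_fill(~mask, float("-inf"))
        alpha = torch.softmax(scores, dim=-1).unsqueeze(1)   # (B, 1, T)
        context = (alpha @ h).expand(-1, h.size(1), -1)      # (B, T, 2H)
        return self.emissions(torch.cat([h, context], dim=-1))  # (B, T, tags)

    def loss(self, tokens, tags, mask):
        # Negative log-likelihood of the gold tag sequence under the CRF.
        return -self.crf(self._encode(tokens, mask), tags, mask=mask)

    def predict(self, tokens, mask):
        # Viterbi decoding of the best slot-label sequence.
        return self.crf.decode(self._encode(tokens, mask), mask=mask)


if __name__ == "__main__":
    model = BiGRUAttnCRF(vocab_size=1000, num_tags=10)
    tokens = torch.randint(1, 1000, (2, 7))
    tags = torch.randint(0, 10, (2, 7))
    mask = torch.ones(2, 7, dtype=torch.bool)
    print(model.loss(tokens, tags, mask).item())
    print(model.predict(tokens, mask))

In this sketch the attention produces a single utterance-level context vector that is concatenated to every timestep before the CRF emissions; the paper's attention mechanism may instead compute a separate context per position, but the overall Bi-GRU -> attention -> CRF pipeline matches the description in the abstract.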
