Open Access
Tying of embeddings for improving regularization in neural networks for named entity recognition task
Author(s) -
M. Bevza
Publication year - 2018
Publication title -
Vìsnik Kiïvsʹkogo nacìonalʹnogo unìversitetu ìmenì Tarasa Ševčenka. Serìâ fìziko-matematičnì nauki (Bulletin of Taras Shevchenko National University of Kyiv. Series: Physics and Mathematics)
Language(s) - English
Resource type - Journals
eISSN - 2218-2055
pISSN - 1812-5409
DOI - 10.17721/1812-5409.2018/3.8
Subject(s) - computer science , named entity recognition , natural language processing , artificial intelligence , artificial neural network , machine learning , regularization (linguistics) , entity linking , speech recognition
We analyze neural network architectures that yield state-of-the-art results on the named entity recognition task and propose a new architecture that improves those results further. We surveyed a number of ideas and approaches that researchers have used to reach state-of-the-art results on a variety of NLP tasks, and in this work we present those we consider most likely to improve existing state-of-the-art solutions for named entity recognition. The architecture is inspired by recent developments in the language modeling task, and the suggested solution is based on a multi-task learning approach. We feed part-of-speech tags, produced by a strong external tagger, into the network as an additional input, and we also ask the network to predict those tags alongside the main named entity recognition tags. In this way, knowledge is distilled from the strong part-of-speech tagger into our smaller network. We hypothesize that designing the neural network architecture in this way improves the generalizability of the system, and we provide arguments to support this statement.
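The multi-task setup described in the abstract can be sketched in PyTorch. This is a minimal illustration, not the authors' implementation: all layer sizes, the BiLSTM encoder, and the auxiliary-loss weight are hypothetical choices; the essential points are that part-of-speech tags enter the network as an extra input embedding and that a second output head re-predicts them next to the main NER head.

```python
import torch
import torch.nn as nn

class MultiTaskNER(nn.Module):
    """Sketch of a multi-task NER tagger (hypothetical sizes):
    a BiLSTM that receives POS tags from an external tagger as an
    extra input feature and is also trained to re-predict them."""

    def __init__(self, vocab_size=1000, n_pos=17, n_ner=9,
                 word_dim=50, pos_dim=10, hidden=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        # POS tags from the external tagger, used as input features
        self.pos_emb = nn.Embedding(n_pos, pos_dim)
        self.lstm = nn.LSTM(word_dim + pos_dim, hidden,
                            batch_first=True, bidirectional=True)
        self.ner_head = nn.Linear(2 * hidden, n_ner)  # main task head
        self.pos_head = nn.Linear(2 * hidden, n_pos)  # auxiliary head

    def forward(self, words, pos):
        # Concatenate word and POS embeddings before the encoder
        x = torch.cat([self.word_emb(words), self.pos_emb(pos)], dim=-1)
        h, _ = self.lstm(x)
        return self.ner_head(h), self.pos_head(h)

def joint_loss(ner_logits, pos_logits, ner_gold, pos_gold, aux_weight=0.3):
    # Multi-task objective: main NER loss plus a down-weighted POS loss,
    # so gradients from the auxiliary task regularize the shared encoder.
    ce = nn.CrossEntropyLoss()
    main = ce(ner_logits.flatten(0, 1), ner_gold.flatten())
    aux = ce(pos_logits.flatten(0, 1), pos_gold.flatten())
    return main + aux_weight * aux
```

Because the two heads share the encoder, the auxiliary POS objective acts as a regularizer: the shared representation must remain useful for POS tagging, which is the distillation effect the abstract hypothesizes.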
