Open Access
Learning Simpler Language Models with the Differential State Framework
Author(s) - Alexander G. Ororbia, Tomáš Mikolov, David Reitter
Publication year - 2017
Publication title - Neural Computation
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.235
H-Index - 169
eISSN - 1530-888X
pISSN - 0899-7667
DOI - 10.1162/neco_a_01017
Subject(s) - recurrent neural network , computer science , language model , artificial intelligence , deep learning , reservoir computing , artificial neural network , machine learning , algorithm
Abstract - Learning useful information across long time lags is a critical and difficult problem for temporal neural models in tasks such as language modeling. Existing architectures that address the issue are often complex and costly to train. The differential state framework (DSF) is a simple and high-performing design that unifies previously introduced gated neural models. DSF models maintain longer-term ...
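
The abstract is truncated above, but the mechanism it alludes to, a state that changes only incrementally rather than being recomputed from scratch, can be illustrated. The following is a minimal NumPy sketch, not the paper's exact Delta-RNN equations: it assumes a data-driven candidate state and an input-conditioned elementwise gate that interpolates between that candidate and the previous state. The class name DeltaRNNCell and the parameter names (W, U, Wr, br) are illustrative choices, not the authors' notation.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class DeltaRNNCell:
        """Sketch of a differential-state update: the next hidden state
        is a gated interpolation between a fast, data-driven candidate
        and the slowly changing previous state."""

        def __init__(self, input_dim, hidden_dim, seed=0):
            rng = np.random.default_rng(seed)
            s = 0.1
            self.W = rng.normal(0, s, (hidden_dim, input_dim))   # input -> candidate
            self.U = rng.normal(0, s, (hidden_dim, hidden_dim))  # state -> candidate
            self.Wr = rng.normal(0, s, (hidden_dim, input_dim))  # input -> gate
            self.br = np.zeros(hidden_dim)                       # gate bias

        def step(self, x, h_prev):
            # Fast-changing, data-driven candidate representation.
            h_tilde = np.tanh(self.W @ x + self.U @ h_prev)
            # Elementwise interpolation gate r in (0, 1), driven by the input.
            r = sigmoid(self.Wr @ x + self.br)
            # Differential state update: r near 1 preserves the old state
            # (longer-term memory); r near 0 adopts the new candidate.
            return (1.0 - r) * h_tilde + r * h_prev

    # Toy usage: fold a short random sequence into a hidden state.
    cell = DeltaRNNCell(input_dim=8, hidden_dim=16)
    h = np.zeros(16)
    for x in np.random.default_rng(1).normal(size=(5, 8)):
        h = cell.step(x, h)

Note the parameter count: compared with a simple recurrent network, this sketch adds only the gate parameters Wr and br, which is consistent with the abstract's claim that the framework yields simpler models than fully gated designs such as the LSTM or GRU.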
