Open Access
On the Derivational Entropy of Left-to-Right Probabilistic Finite-State Automata and Hidden Markov Models
Author(s) - Joan Andreu Sánchez, Martha Rocha, Verónica Romero, Mauricio Villegas
Publication year - 2017
Publication title - Computational Linguistics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.314
H-Index - 98
eISSN - 1530-9312
pISSN - 0891-2017
DOI - 10.1162/coli_a_00306
Subject(s) - quantum finite automata , probabilistic automaton , finite state , deterministic finite automaton , finite state machine , nondeterministic finite automaton , hidden markov model , probabilistic logic , entropy (arrow of time) , entropy rate , mathematics , computer science , automaton , regular language , principle of maximum entropy , markov model , maximum entropy markov model , markov chain , computation , algorithm , automata theory , theoretical computer science , variable order markov model , binary entropy function , artificial intelligence , machine learning , physics , quantum mechanics
Probabilistic finite-state automata are a formalism widely used in many problems of automatic speech recognition and natural language processing. Probabilistic finite-state automata are closely related to other finite-state models such as weighted finite-state automata, word lattices, and hidden Markov models, and therefore share many similar properties and problems. Entropy measures of finite-state models have been investigated in the past in order to study the information capacity of these models. The derivational entropy quantifies the uncertainty that the model has about the probability distribution it represents. The derivational entropy of a finite-state automaton is computed from the probability that is accumulated over all of its individual state sequences. The computation of the entropy from a weighted finite-state automaton requires a normalized model. This article studies an efficient computation of the derivational entropy of left-to-right probabilistic finite-state automata, and it introduces an efficient algorithm for normalizing weighted finite-state automata. The efficient computation of the derivational entropy is also extended to continuous hidden Markov models.
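The two computations described in the abstract can be illustrated with a minimal sketch. This is not the paper's algorithm; it is an illustration of the standard relations involved, assuming a hypothetical dict-of-dicts encoding of a left-to-right automaton (transitions `weights[i][j]` only with `j >= i`, self-loops allowed, a single final state). Normalization follows the usual weight-pushing idea, with `beta(i)` the total weight of all paths from state `i` to the final state; the derivational entropy is then `H = sum_i xi(i) * h(i)`, where `xi(i)` is the expected number of visits to state `i` and `h(i)` the entropy of its outgoing transition distribution.

```python
import math

def normalize(weights, n_states, final=None):
    """Normalize a left-to-right WFSA into a PFSA by weight pushing.

    weights[i] is a dict {j: w} with j >= i (self-loops allowed);
    'final' is the single final state (assumed last if not given).
    """
    if final is None:
        final = n_states - 1
    # beta[i]: total weight of all paths from state i to the final state,
    # computed right-to-left; a self-loop of weight w contributes 1/(1 - w).
    beta = [0.0] * n_states
    beta[final] = 1.0
    for i in range(final - 1, -1, -1):
        out = sum(w * beta[j] for j, w in weights.get(i, {}).items() if j > i)
        beta[i] = out / (1.0 - weights.get(i, {}).get(i, 0.0))
    # Pushed probabilities p(i -> j) = w(i, j) * beta(j) / beta(i)
    # sum to one at every state.
    return {i: {j: w * beta[j] / beta[i] for j, w in ws.items()}
            for i, ws in weights.items()}

def derivational_entropy(trans, n_states):
    """Derivational entropy (in nats) of a left-to-right PFSA:
    H = sum_i xi(i) * h(i), with xi(i) the expected number of visits
    to state i and h(i) the entropy of its transition distribution."""
    xi = [0.0] * n_states
    xi[0] = 1.0  # the initial state is always reached once
    H = 0.0
    for i in range(n_states):  # topological (left-to-right) order
        dist = trans.get(i, {})
        p_self = dist.get(i, 0.0)
        if p_self > 0.0:
            # A self-loop of probability p multiplies visits by 1/(1 - p).
            xi[i] /= (1.0 - p_self)
        for j, p in dist.items():
            if j > i:
                xi[j] += xi[i] * p
        h = -sum(p * math.log(p) for p in dist.values() if p > 0.0)
        H += xi[i] * h
    return H
```

As a sanity check, a two-state automaton whose initial state takes a self-loop or exits with probability 0.5 each has entropy `2 * log 2` nats: the expected number of visits to state 0 is 2, and each visit contributes one bit.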
