Computational Modeling of Statistical Learning: Effects of Transitional Probability Versus Frequency and Links to Word Learning
Author(s) - Daniel Mirman, Katharine Graf Estes, James S. Magnuson
Publication year - 2010
Publication title - Infancy
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.361
H-Index - 69
eISSN - 1532-7078
pISSN - 1525-0008
DOI - 10.1111/j.1532-7078.2009.00023.x
Subject(s) - statistical learning, language acquisition, word (group theory), artificial intelligence, computer science, artificial neural network, psychology, statistical model, text segmentation, simple (philosophy), natural language processing, cognitive psychology, machine learning, segmentation, linguistics, philosophy, mathematics education, epistemology
Statistical learning mechanisms play an important role in theories of language acquisition and processing. Recurrent neural network models have provided important insights into how these mechanisms might operate. We examined whether such networks capture two key findings in human statistical learning. In Simulation 1, a simple recurrent network (SRN) performed much like human learners: it was sensitive to both transitional probability and frequency, with frequency dominating early in learning and probability emerging as the dominant cue later in learning. In Simulation 2, an SRN captured links between statistical segmentation and word learning in infants and adults, and suggested that these links arise because phonological representations are more distinctive for syllables with higher transitional probability. Beyond simply simulating general phenomena, these models provide new insights into underlying mechanisms and generate novel behavioral predictions.
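
The abstract's central mechanism is a simple recurrent network (SRN, an Elman network) trained to predict the next syllable in a continuous stream, so that prediction error falls within words (high transitional probability) and spikes at word boundaries (low transitional probability). The sketch below illustrates that idea only; it is not the authors' implementation, and the toy lexicon, network size, and training settings are illustrative assumptions.

```python
# Minimal SRN sketch: next-syllable prediction over an unsegmented stream.
# The lexicon and hyperparameters are assumptions, not values from the paper.
import random
import torch
import torch.nn as nn

# Toy "language": three words built from a small syllable inventory.
words = [["pa", "bi", "ku"], ["go", "la", "tu"], ["da", "ro", "pi"]]
syllables = sorted({s for w in words for s in w})
idx = {s: i for i, s in enumerate(syllables)}
n = len(syllables)

# Concatenate randomly ordered words into one unsegmented stream.
stream = [s for _ in range(300) for s in random.choice(words)]
inputs = torch.eye(n)[[idx[s] for s in stream[:-1]]]   # one-hot current syllable
targets = torch.tensor([idx[s] for s in stream[1:]])   # next syllable to predict

class SRN(nn.Module):
    """Elman-style network: recurrent hidden layer plus a prediction layer."""
    def __init__(self, n_syll, hidden=16):
        super().__init__()
        self.rnn = nn.RNN(n_syll, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_syll)

    def forward(self, x):
        h, _ = self.rnn(x)
        return self.out(h)

model = SRN(n)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

# Train the network to predict each upcoming syllable from the stream so far.
for epoch in range(50):
    opt.zero_grad()
    logits = model(inputs.unsqueeze(0)).squeeze(0)
    loss = loss_fn(logits, targets)
    loss.backward()
    opt.step()

# After training, the predicted probability of the next syllable should be
# high within words and low across word boundaries, which is the cue a
# statistical learner can use to segment the stream.
with torch.no_grad():
    probs = torch.softmax(model(inputs.unsqueeze(0)).squeeze(0), dim=1)
    for t in range(9):
        print(f"{stream[t]} -> {stream[t + 1]}  p = {probs[t, targets[t]].item():.2f}")
```

In this setup, frequency effects can be probed by making some words appear more often in the stream, and transitional-probability effects by comparing within-word versus cross-boundary predictions over the course of training; that is the general logic the abstract describes, realized here only as an assumed toy configuration.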
