
Learning Recursion: Multiple Nested and Crossed Dependencies
Author(s) -
M.H. de Vries,
Morten H. Christiansen,
Karl Magnus Petersson
Publication year - 2011
Publication title -
Biolinguistics
Language(s) - English
Resource type - Journals
ISSN - 1450-3417
DOI - 10.5964/bioling.8825
Subject(s) - computer science , recursion (computer science) , cache language model , natural language , artificial intelligence , sequence learning , natural language processing , language acquisition , sequence (biology) , algorithmic learning theory , mechanism (biology) , cognitive science , universal networking language , comprehension approach , linguistics , active learning (machine learning) , psychology , algorithm , biology , philosophy , epistemology , genetics
Language acquisition, in both natural and artificial language learning settings, crucially depends on extracting information from ordered sequences. A shared sequence learning mechanism is thus assumed to underlie both natural and artificial language learning, and a growing body of empirical evidence is consistent with this hypothesis. Artificial language learning experiments may therefore offer further insight into this shared mechanism. In this paper, we review empirical evidence from artificial language learning and computational modeling studies, as well as natural language data, and suggest that two key factors help determine processing complexity in sequence learning, and thus in natural language processing: the specific ordering of non-adjacent dependencies (i.e. nested or crossed), and the number of non-adjacent dependencies to be resolved simultaneously (i.e. two or three). These factors are important for probing the boundaries of human sequence learning and, by extension, of natural language processing. The implications for theories of linguistic competence are discussed.
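To make the two dependency orderings concrete, here is a minimal illustrative sketch (not from the paper itself): pairs A_i ... B_i are nested when the B-elements resolve their A-elements in last-in, first-out order (A1 A2 B2 B1), so a stack suffices, whereas crossed dependencies resolve in first-in, first-out order (A1 A2 B1 B2), requiring a queue. The function names and token encoding below are hypothetical choices for the sake of the example.

```python
from collections import deque

def matches_nested(seq):
    """True if B-tokens close A-tokens in LIFO (nested) order, e.g. A1 A2 B2 B1."""
    stack = []
    for kind, idx in seq:
        if kind == "A":
            stack.append(idx)          # open a dependency
        elif not stack or stack.pop() != idx:
            return False               # B must match the most recent open A
    return not stack                   # all dependencies resolved

def matches_crossed(seq):
    """True if B-tokens close A-tokens in FIFO (crossed) order, e.g. A1 A2 B1 B2."""
    queue = deque()
    for kind, idx in seq:
        if kind == "A":
            queue.append(idx)          # open a dependency
        elif not queue or queue.popleft() != idx:
            return False               # B must match the earliest open A
    return not queue                   # all dependencies resolved

nested  = [("A", 1), ("A", 2), ("B", 2), ("B", 1)]  # A1 A2 B2 B1
crossed = [("A", 1), ("A", 2), ("B", 1), ("B", 2)]  # A1 A2 B1 B2
```

With two simultaneous dependencies, `matches_nested` accepts the nested sequence but rejects the crossed one, and vice versa for `matches_crossed`; the two orderings demand different memory disciplines, which is one way of framing why they may differ in processing complexity.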