Cue Combinatorics in Memory Retrieval for Anaphora
Author(s) - Dan Parker
Publication year - 2019
Publication title - Cognitive Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.498
H-Index - 114
eISSN - 1551-6709
pISSN - 0364-0213
DOI - 10.1111/cogs.12715
Subject(s) - computer science , set (abstract data type) , reading (process) , cue dependent forgetting , nonlinear system , artificial intelligence , natural language processing , information retrieval , psychology , communication , linguistics , philosophy , physics , quantum mechanics , programming language
Many studies have shown that memory retrieval for real‐time language processing relies on a cue‐based access mechanism, which allows the cues available at the retrieval site to directly access the target representation in memory. An open question is how different types of cues are combined at retrieval to create a single retrieval probe (“cue combinatorics”). This study addresses this question by testing whether retrieval for antecedent‐reflexive dependencies combines cues in a linear (i.e., additive) or nonlinear (i.e., multiplicative) fashion. Results from computational simulations and a reading time experiment show that target items that match all of the reflexive's cues are favored over target items that mismatch those cues, and that different degrees of mismatch slow reading times by comparable amounts. This profile is consistent with the predictions of a nonlinear cue combination and provides evidence against models in which all cues combine in a linear fashion. A follow‐up set of simulations shows that a nonlinear rule also captures previous demonstrations of interference from nontarget items during retrieval for reflexive licensing. Taken together, these results shed new light on how different types of cues combine at the retrieval site and reveal how the method of cue combination impacts the accessibility of linguistic information in memory.
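
To illustrate the contrast at issue, the following is a minimal, self-contained sketch (not the article's simulation code) comparing an additive (linear) and a multiplicative (nonlinear) rule for combining cue matches into a single retrieval score. The cue names, weights, and scoring functions are illustrative assumptions only, not the retrieval model used in the reported simulations.

    # Toy comparison of additive vs. multiplicative cue combination.
    # Cues, weights, and scoring rules are illustrative assumptions.

    def linear_combination(matches, weights):
        """Additive rule: each matching cue contributes its weight independently."""
        return sum(w * m for w, m in zip(weights, matches))

    def nonlinear_combination(matches, weights):
        """Multiplicative rule: a single mismatching cue scales down the whole score."""
        score = 1.0
        for w, m in zip(weights, matches):
            score *= w if m else (1.0 - w)  # mismatch multiplies in a small factor
        return score

    # Hypothetical cues for a reflexive like "herself": [+local, +feminine, +singular]
    weights = [0.8, 0.8, 0.8]       # assumed equal cue weights
    full_match   = [1, 1, 1]        # target matches every cue
    one_mismatch = [1, 0, 1]        # e.g., gender mismatch only
    two_mismatch = [0, 0, 1]        # e.g., structural and gender mismatch

    for label, m in [("full match", full_match),
                     ("one mismatch", one_mismatch),
                     ("two mismatches", two_mismatch)]:
        print(f"{label:15s} linear={linear_combination(m, weights):.2f} "
              f"nonlinear={nonlinear_combination(m, weights):.3f}")

Under the multiplicative rule, any mismatching cue drives the score toward the floor, so items with one or two mismatches come out comparably low relative to a full match, whereas the additive rule predicts graded differences across degrees of mismatch. This is the qualitative contrast that the simulations and reading-time profile described above are used to adjudicate.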
