Open Access
Hierarchical structure and memory mechanisms in agreement attraction
Author(s) - Julie Franck, Matthew Wagers
Publication year - 2020
Publication title - PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0232163
Subject(s) - attraction , sentence , computer science , attractor , focus (optics) , sentence processing , artificial intelligence , natural language processing , position (finance) , linguistics , mathematics , physics , mathematical analysis , philosophy , finance , economics , optics
Speakers occasionally produce verbs that agree with an element that is not the subject, a so-called ‘attractor’; likewise, comprehenders occasionally fail to notice agreement errors when the verb agrees with the attractor. Cross-linguistic studies converge in showing that attraction is modulated by the hierarchical position of the attractor in the sentence structure. We report two experiments exploring the link between structural position and memory representations in attraction. The method is innovative in two respects: we used jabberwocky materials to control for semantic influences and to focus on structural agreement processing, and we used a Speed-Accuracy Trade-off (SAT) design combined with a memory probe recognition task of the kind classically used in list memorization studies. SAT enabled the joint measurement of retrieval speed and retrieval accuracy for subjects and attractors in sentences that typically elicit attraction errors. Experiment 1 first established that attraction arises in jabberwocky sentences to a similar extent, and with the same structure-dependency effects, as in natural sentences. Experiment 2 showed a close alignment between the attraction profiles found in Experiment 1 and the memory parameters. Results support a content-addressable architecture of memory representations for sentences in which nouns’ accessibility depends on their syntactic position, while subjects are kept in the focus of attention.
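In SAT designs of this kind, accuracy measured at a range of response deadlines is typically summarized by a shifted-exponential approach to an asymptote, whose parameters separate retrieval accuracy (asymptote) from retrieval speed (rate and intercept). The sketch below is only an illustration of that standard fitting logic, not the authors' analysis code; the function form, the fitting routine, and all data values and names are assumptions for demonstration.

```python
# Illustrative sketch (not the authors' code): fit the standard SAT function
#   d'(t) = lambda * (1 - exp(-beta * (t - delta)))  for t > delta, else 0
# to hypothetical d-prime values measured at several response deadlines.
import numpy as np
from scipy.optimize import curve_fit

def sat_curve(t, lam, beta, delta):
    """Shifted-exponential approach to an asymptote.

    lam   -- asymptotic accuracy (d'), accuracy given unlimited retrieval time
    beta  -- rate of rise (1/s), one index of retrieval speed
    delta -- intercept (s), earliest time at which accuracy departs from chance
    """
    return np.where(t > delta, lam * (1.0 - np.exp(-beta * (t - delta))), 0.0)

# Hypothetical d-prime values at each response lag (seconds) for one condition.
lags = np.array([0.1, 0.3, 0.5, 0.8, 1.2, 2.0, 3.0])
dprime = np.array([0.05, 0.60, 1.20, 1.70, 2.00, 2.20, 2.25])

# Estimate the three SAT parameters; bounds keep them in plausible ranges.
params, _ = curve_fit(sat_curve, lags, dprime,
                      p0=[2.0, 2.0, 0.2],
                      bounds=([0.0, 0.1, 0.0], [5.0, 20.0, 1.0]))
lam, beta, delta = params
print(f"asymptote={lam:.2f}, rate={beta:.2f} 1/s, intercept={delta:.2f} s")
```

Comparing fitted asymptotes across conditions indexes differences in retrieval accuracy, while differences in rate or intercept index differences in retrieval speed, which is how such designs can relate attraction profiles to memory parameters.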
