
Optimal forgetting: Semantic compression of episodic memories
Author(s) - Dávid Nagy, Balázs Török, Gergő Orbán
Publication year - 2020
Publication title - PLOS Computational Biology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.628
H-Index - 182
eISSN - 1553-7358
pISSN - 1553-734X
DOI - 10.1371/journal.pcbi.1008367
Subject(s) - computer science , generative grammar , lossy compression , forgetting , semantic memory , recall , artificial intelligence , variety (cybernetics) , episodic memory , generative model , memory model , machine learning , natural language processing , cognition , cognitive psychology , psychology , neuroscience , shared memory , operating system
It has been extensively documented that human memory exhibits a wide range of systematic distortions, which have been associated with resource constraints. Resource constraints on memory can be formalised in the normative framework of lossy compression; however, traditional lossy compression algorithms produce distortions that are qualitatively different from those found in experiments with humans. We argue that the form of these distortions is characteristic of compression that relies on a generative model adapted to the environment. We show that this semantic compression framework can provide a unifying explanation of a wide variety of memory phenomena. We harness recent advances in learning deep generative models, which yield powerful tools for approximating generative models of complex data. We use three datasets, chess games, natural text, and hand-drawn sketches, to demonstrate the effects of semantic compression on memory performance. Our model accounts for memory distortions related to domain expertise, gist-based distortions, contextual effects, and delayed recall.
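The core idea of semantic compression, storing an episode only as a compressed latent code under a generative model of the environment and reconstructing it at recall, can be illustrated with a toy model. The sketch below is an assumption-laden simplification, not the authors' implementation (the paper relies on deep generative models); it uses a linear-Gaussian generative model, where the analytic posterior makes the prior-driven ("gist"-like) bias of reconstructions easy to see. All names (`W`, `encode`, `decode`) are hypothetical.

```python
import numpy as np

# Minimal sketch (NOT the paper's model): semantic compression with a
# linear-Gaussian generative model. An episode x is stored only as the
# posterior mean of a low-dimensional latent code z; recall reconstructs
# x from z. Reconstructions are shrunk toward the prior mean, illustrating
# the kind of systematic, prior-biased distortion attributed to compression
# with a generative model adapted to the environment.

rng = np.random.default_rng(0)

d_obs, d_lat = 8, 2                      # observation and latent dimensionality
W = rng.normal(size=(d_obs, d_lat))      # generative weights: x ~ N(W z, sigma2 I)
sigma2 = 0.5                             # observation noise variance

def encode(x):
    """Memory trace: posterior mean of z given x under the linear-Gaussian model."""
    # p(z | x) is Gaussian with mean M^{-1} W^T x / sigma2, where M = W^T W / sigma2 + I
    M = W.T @ W / sigma2 + np.eye(d_lat)
    return np.linalg.solve(M, W.T @ x / sigma2)

def decode(z):
    """Recall: reconstruct the episode from the compressed latent code."""
    return W @ z

# One 'episode' sampled from the environment the model is adapted to
z_true = rng.normal(size=d_lat)
x = W @ z_true + np.sqrt(sigma2) * rng.normal(size=d_obs)

x_recalled = decode(encode(x))
print("reconstruction error:", np.linalg.norm(x - x_recalled))
# The recalled episode is systematically pulled toward the prior mean (zero here):
print("norm of recalled episode:", np.linalg.norm(x_recalled),
      "<= norm of original:", np.linalg.norm(x))
```

In this toy setting the detail lost by compression is exactly the component of the episode that the generative model treats as noise, while the retained latent code captures its gist; the same trade-off, realised with far richer deep generative models, underlies the memory distortions discussed in the abstract.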