Open Access
From image descriptions to visual denotations: New similarity metrics for semantic inference over event descriptions
Author(s) - Peter Young, Alice Lai, Micah Hodosh, Julia Hockenmaier
Publication year - 2014
Publication title - Transactions of the Association for Computational Linguistics
Language(s) - English
Resource type - Journals
ISSN - 2307-387X
DOI - 10.1162/tacl_a_00166
Subject(s) - computer science, inference, natural language processing, denotation (semiotics), artificial intelligence, similarity (geometry), set (abstract data type), semantic similarity, metric (unit), graph, event (particle physics), construct (python library), image (mathematics), theoretical computer science, linguistics, programming language, philosophy, operations management, economics, semiotics, physics, quantum mechanics
We propose to use the visual denotations of linguistic expressions (i.e. the set of images they describe) to define novel denotational similarity metrics, which we show to be at least as beneficial as distributional similarities for two tasks that require semantic inference. To compute these denotational similarities, we construct a denotation graph, i.e. a subsumption hierarchy over constituents and their denotations, based on a large corpus of 30K images and 150K descriptive captions.
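The core idea — treating a phrase's denotation as the set of images it describes, and measuring similarity by set overlap — can be sketched with a simple Jaccard-style metric. This is a minimal illustrative sketch, not the paper's actual metrics (which are derived from the denotation graph): the function name and the toy image IDs below are hypothetical.

```python
# Hypothetical sketch: denotational similarity as set overlap.
# Assumption: each phrase maps to the set of image IDs it describes
# (its visual denotation); the paper's metrics are more elaborate.

def jaccard_denotational_sim(den_a, den_b):
    """Jaccard overlap between two visual denotations (image-ID sets)."""
    if not den_a and not den_b:
        return 0.0
    return len(den_a & den_b) / len(den_a | den_b)

# Toy denotations (made-up image IDs, not from the real corpus).
den_running = {1, 2, 3, 5}   # images captioned "a dog running"
den_playing = {2, 3, 5, 8}   # images captioned "a dog playing"

print(jaccard_denotational_sim(den_running, den_playing))  # 3/5 = 0.6
```

Phrases with heavily overlapping denotations (here 0.6) are judged semantically close even when their surface forms differ, which is what makes such metrics useful for semantic inference tasks.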
