Learning overhypotheses with hierarchical Bayesian models
Author(s) - Charles Kemp, Andrew Perfors, Joshua B. Tenenbaum
Publication year - 2007
Publication title - Developmental Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.801
H-Index - 127
eISSN - 1467-7687
pISSN - 1363-755X
DOI - 10.1111/j.1467-7687.2007.00585.x
Subject(s) - bayesian probability , bayesian inference , bayesian statistics , inductive bias , word learning , feature (linguistics) , psychology , cognitive psychology , cognitive science , artificial intelligence , machine learning , multi-task learning , computer science , linguistics
Inductive learning is impossible without overhypotheses, or constraints on the hypotheses considered by the learner. Some of these overhypotheses must be innate, but we suggest that hierarchical Bayesian models can help to explain how the rest are acquired. To illustrate this claim, we develop models that acquire two kinds of overhypotheses – overhypotheses about feature variability (e.g. the shape bias in word learning) and overhypotheses about the grouping of categories into ontological kinds like objects and substances.
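
The feature-variability case can be illustrated with a small numerical sketch. The paper's hierarchy places category-level feature proportions under a Dirichlet prior whose higher-level parameters (a concentration controlling within-category variability and a distribution over features) are themselves learned; inferring a low concentration from several shape-homogeneous categories is, in effect, acquiring the shape bias. The Python below is a minimal sketch under that Dirichlet-multinomial assumption, not the authors' implementation; the function names, parameter grid, and toy counts are illustrative assumptions, and it fixes the feature distribution beta rather than inferring it jointly as the full model does.

# Minimal sketch (not the authors' code) of level-2 inference over the
# "overhypothesis" about feature variability: each category i has feature
# proportions theta_i ~ Dirichlet(alpha * beta); the learner infers alpha
# from counts pooled across categories. A small alpha means members of a
# category tend to share a feature value (e.g. shape), which is the
# statistical signature of the shape bias.
import numpy as np
from scipy.special import gammaln

def dirichlet_multinomial_logpdf(counts, alpha_vec):
    """Log marginal likelihood of one category's feature counts under
    theta ~ Dirichlet(alpha_vec), counts ~ Multinomial(theta),
    up to the multinomial coefficient (constant in alpha_vec)."""
    n = counts.sum()
    return (gammaln(alpha_vec.sum()) - gammaln(alpha_vec.sum() + n)
            + np.sum(gammaln(alpha_vec + counts) - gammaln(alpha_vec)))

def posterior_over_alpha(count_matrix, alpha_grid, beta):
    """Grid posterior over the concentration alpha given feature counts for
    many categories (rows), with the feature distribution beta held fixed
    (an assumption; the full model also infers beta)."""
    log_post = np.array([
        sum(dirichlet_multinomial_logpdf(row, a * beta) for row in count_matrix)
        for a in alpha_grid
    ])
    log_post -= log_post.max()          # stabilize before exponentiating
    post = np.exp(log_post)
    return post / post.sum()

# Toy data: 4 categories, 3 possible shapes; each category is dominated by
# one shape, so low-alpha (high within-category homogeneity) hypotheses
# should receive most of the posterior mass.
counts = np.array([[9, 1, 0],
                   [0, 10, 0],
                   [1, 0, 9],
                   [10, 0, 0]])
beta = np.ones(3) / 3.0                 # uniform guess at which shapes occur overall
alpha_grid = np.array([0.1, 0.5, 1.0, 5.0, 20.0])
print(dict(zip(alpha_grid, np.round(posterior_over_alpha(counts, alpha_grid, beta), 3))))

Running the sketch concentrates the posterior on the smallest alpha values: having seen a few categories whose members share a shape, the learner concludes that categories in general are shape-homogeneous, and can apply that overhypothesis to a new category after a single example.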
