Open Access
The neural code of reward anticipation in human orbitofrontal cortex
Author(s) -
Thorsten Kahnt,
Jakob Heinzle,
Soyoung Q. Park,
John-Dylan Haynes
Publication year - 2010
Publication title -
Proceedings of the National Academy of Sciences of the United States of America
Language(s) - English
Resource type - Journals
eISSN - 1091-6490
pISSN - 0027-8424
DOI - 10.1073/pnas.0912838107
Subject(s) - orbitofrontal cortex, coding (social sciences), psychology, anticipation (artificial intelligence), cognitive psychology, neuroscience, computer science, prefrontal cortex, artificial intelligence, cognition, mathematics, statistics
An optimal choice among alternative behavioral options requires precise anticipatory representations of their possible outcomes. A fundamental question is how such anticipated outcomes are represented in the brain. Reward coding at the level of single cells in the orbitofrontal cortex (OFC) follows a more heterogeneous coding scheme than suggested by studies using functional MRI (fMRI) in humans. Using a combination of multivariate pattern classification and fMRI we show that the reward value of sensory cues can be decoded from distributed fMRI patterns in the OFC. This distributed representation is compatible with previous reports from animal electrophysiology that show that reward is encoded by different neural populations with opposing coding schemes. Importantly, the fMRI patterns representing specific values during anticipation are similar to those that emerge during the receipt of reward. Furthermore, we show that the degree of this coding similarity is related to subjects’ ability to use value information to guide behavior. These findings narrow the gap between reward coding in humans and animals and corroborate the notion that value representations in OFC are independent of whether reward is anticipated or actually received.
