Dopamine neurons encode the better option in rats deciding between differently delayed or sized rewards.

PubWeight™: 3.96 | Rank: Top 1%

View Article (PMC 2562672)

Published in Nat Neurosci on November 18, 2007

Authors

Matthew R Roesch¹, Donna J Calu, Geoffrey Schoenbaum

Author Affiliations

1: Department of Anatomy and Neurobiology, University of Maryland School of Medicine, 20 Penn Street, HSF-2 S251, Baltimore, Maryland 21201, USA. mroes001@umaryland.edu

Articles citing this

(truncated to the top 100)

Dopamine in motivational control: rewarding, aversive, and alerting. Neuron (2010) 6.26

The neurobiology of decision: consensus and controversy. Neuron (2009) 4.77

Dopamine signals for reward value and risk: basic and recent data. Behav Brain Funct (2010) 3.42

Decision making in recurrent neuronal circuits. Neuron (2008) 3.19

A new perspective on the role of the orbitofrontal cortex in adaptive behaviour. Nat Rev Neurosci (2009) 3.08

Value representations in the primate striatum during matching behavior. Neuron (2008) 3.02

A unified framework for addiction: vulnerabilities in the decision process. Behav Brain Sci (2008) 2.83

Influence of reward delays on responses of dopamine neurons. J Neurosci (2008) 2.78

Start/stop signals emerge in nigrostriatal circuits during sequence learning. Nature (2010) 2.76

A causal link between prediction errors, dopamine neurons and learning. Nat Neurosci (2013) 2.68

Effort-based cost-benefit valuation and the human brain. J Neurosci (2009) 2.65

Evaluating choices by single neurons in the frontal lobe: outcome value encoded across multiple decision variables. Eur J Neurosci (2009) 2.43

Midbrain dopamine neurons signal preference for advance information about upcoming rewards. Neuron (2009) 2.42

Prefrontal coding of temporally discounted values during intertemporal choice. Neuron (2008) 2.39

Opponency revisited: competition and cooperation between dopamine and serotonin. Neuropsychopharmacology (2010) 2.34

Understanding dopamine and reinforcement learning: the dopamine reward prediction error hypothesis. Proc Natl Acad Sci U S A (2011) 2.33

Game theory and neural basis of social decision making. Nat Neurosci (2008) 2.24

The orbitofrontal cortex and ventral tegmental area are necessary for learning from unexpected outcomes. Neuron (2009) 2.21

Updating dopamine reward signals. Curr Opin Neurobiol (2012) 2.14

Delays conferred by escalating costs modulate dopamine release to rewards but not their predictors. J Neurosci (2010) 2.07

Neural computations underlying action-based decision making in the human brain. Proc Natl Acad Sci U S A (2009) 1.98

Phasic nucleus accumbens dopamine encodes risk-based decision-making behavior. Biol Psychiatry (2011) 1.96

Selective activation of cholinergic interneurons enhances accumbal phasic dopamine release: setting the tone for reward processing. Cell Rep (2012) 1.91

Ventral striatal neurons encode the value of the chosen action in rats deciding between differently delayed or sized rewards. J Neurosci (2009) 1.85

Leptin regulates the reward value of nutrient. Nat Neurosci (2011) 1.83

Expectancy-related changes in firing of dopamine neurons depend on orbitofrontal cortex. Nat Neurosci (2011) 1.81

Are you or aren't you? Challenges associated with physiologically identifying dopamine neurons. Trends Neurosci (2012) 1.78

Neural correlates of variations in event processing during learning in basolateral amygdala. J Neurosci (2010) 1.75

Encoding of marginal utility across time in the human brain. J Neurosci (2009) 1.74

Heterogeneous coding of temporally discounted values in the dorsal and ventral striatum during intertemporal choice. Neuron (2011) 1.72

Neural dissociation of delay and uncertainty in intertemporal choice. J Neurosci (2008) 1.68

Phasic nucleus accumbens dopamine release encodes effort- and delay-related costs. Biol Psychiatry (2010) 1.68

Human reinforcement learning subdivides structured action spaces by learning effector-specific values. J Neurosci (2009) 1.66

Adaptation of reward sensitivity in orbitofrontal neurons. J Neurosci (2010) 1.66

Disentangling the roles of approach, activation and valence in instrumental and pavlovian responding. PLoS Comput Biol (2011) 1.63

Risk-dependent reward value signal in human prefrontal cortex. Proc Natl Acad Sci U S A (2009) 1.63

A pallidus-habenula-dopamine pathway signals inferred stimulus values. J Neurophysiol (2010) 1.60

Dopaminergic genes predict individual differences in susceptibility to confirmation bias. J Neurosci (2011) 1.58

Does the orbitofrontal cortex signal value? Ann N Y Acad Sci (2011) 1.53

Neurons in the ventral striatum exhibit cell-type-specific representations of outcome during learning. Neuron (2014) 1.51

Lateral habenula neurons signal errors in the prediction of reward information. Nat Neurosci (2011) 1.48

Neurogenetics and pharmacology of learning, motivation, and cognition. Neuropsychopharmacology (2010) 1.45

Endocannabinoids promote cocaine-induced impulsivity and its rapid dopaminergic correlates. Biol Psychiatry (2013) 1.44

Distinct subtypes of basolateral amygdala taste neurons reflect palatability and reward. J Neurosci (2009) 1.38

Temporally extended dopamine responses to perceptually demanding reward-predictive stimuli. J Neurosci (2010) 1.38

Convergent processing of both positive and negative motivational signals by the VTA dopamine neuronal populations. PLoS One (2011) 1.36

Reward-dependent modulation of working memory in lateral prefrontal cortex. J Neurosci (2009) 1.36

New insights on the subcortical representation of reward. Curr Opin Neurobiol (2008) 1.35

Rapid dopamine dynamics in the accumbens core and shell: learning and action. Front Biosci (Elite Ed) (2013) 1.34

Neuromodulation of reward-based learning and decision making in human aging. Ann N Y Acad Sci (2011) 1.34

More is less: a disinhibited prefrontal cortex impairs cognitive flexibility. J Neurosci (2010) 1.34

Encoding of reward and space during a working memory task in the orbitofrontal cortex and anterior cingulate sulcus. J Neurophysiol (2009) 1.33

Optogenetic mimicry of the transient activation of dopamine neurons by natural reward is sufficient for operant reinforcement. PLoS One (2012) 1.32

Dopamine and effort-based decision making. Front Neurosci (2011) 1.31

Role of prefrontal cortex and the midbrain dopamine system in working memory updating. Proc Natl Acad Sci U S A (2012) 1.29

Signals in human striatum are appropriate for policy update rather than value prediction. J Neurosci (2011) 1.29

Prefrontal cortex and impulsive decision making. Biol Psychiatry (2010) 1.25

Model-based and model-free Pavlovian reward learning: revaluation, revision, and revelation. Cogn Affect Behav Neurosci (2014) 1.22

All that glitters ... dissociating attention and outcome expectancy from prediction errors signals. J Neurophysiol (2010) 1.20

Multiple timescales of memory in lateral habenula and dopamine neurons. Neuron (2010) 1.17

Short-term temporal discounting of reward value in human ventral striatum. J Neurophysiol (2009) 1.16

Neural correlates of variations in event processing during learning in central nucleus of amygdala. Neuron (2010) 1.16

Neural correlates of stimulus-response and response-outcome associations in dorsolateral versus dorsomedial striatum. Front Integr Neurosci (2010) 1.13

Neural mechanisms of acquired phasic dopamine responses in learning. Neurosci Biobehav Rev (2009) 1.12

Surprise! Neural correlates of Pearce-Hall and Rescorla-Wagner coexist within the brain. Eur J Neurosci (2012) 1.12

Model-based learning and the contribution of the orbitofrontal cortex to the model-free world. Eur J Neurosci (2012) 1.12

Attention for learning signals in anterior cingulate cortex. J Neurosci (2011) 1.10

Decision making and reward in frontal cortex: complementary evidence from neurophysiological and neuropsychological studies. Behav Neurosci (2011) 1.08

Orexin/hypocretin modulates response of ventral tegmental dopamine neurons to prefrontal activation: diurnal influences. J Neurosci (2010) 1.07

Reliability in the identification of midbrain dopamine neurons. PLoS One (2010) 1.07

The role of the basal ganglia in learning and memory: insight from Parkinson's disease. Neurobiol Learn Mem (2011) 1.07

Dissociable reward and timing signals in human midbrain and ventral striatum. Neuron (2011) 1.06

Feedback timing modulates brain systems for learning in humans. J Neurosci (2011) 1.06

The role of the nucleus accumbens core in impulsive choice, timing, and reward processing. Behav Neurosci (2010) 1.06

Fast dopamine release events in the nucleus accumbens of early adolescent rats. Neuroscience (2010) 1.02

Action-specific value signals in reward-related regions of the human brain. J Neurosci (2012) 1.02

Sucrose-predictive cues evoke greater phasic dopamine release than saccharin-predictive cues. Synapse (2011) 1.01

Dopamine neurons learn to encode the long-term value of multiple future rewards. Proc Natl Acad Sci U S A (2011) 1.01

Putting desire on a budget: dopamine and energy expenditure, reconciling reward and resources. Front Integr Neurosci (2012) 1.00

Neural signals of extinction in the inhibitory microcircuit of the ventral midbrain. Nat Neurosci (2012) 1.00

The problem with value. Neurosci Biobehav Rev (2014) 0.99

Computational models of reinforcement learning: the role of dopamine as a reward signal. Cogn Neurodyn (2010) 0.98

A biologically plausible embodied model of action discovery. Front Neurorobot (2013) 0.97

Modulation of taste responsiveness and food preference by obesity and weight loss. Physiol Behav (2012) 0.97

A role for the medial temporal lobe in feedback-driven learning: evidence from amnesia. J Neurosci (2013) 0.96

Reward prediction error signaling in posterior dorsomedial striatum is action specific. J Neurosci (2012) 0.96

Impact of expected value on neural activity in rat substantia nigra pars reticulata. Eur J Neurosci (2011) 0.96

Establishing causality for dopamine in neural function and behavior with optogenetics. Brain Res (2012) 0.96

Limited encoding of effort by dopamine neurons in a cost-benefit trade-off task. J Neurosci (2013) 0.95

Mesolimbic dopamine dynamically tracks, and is causally linked to, discrete aspects of value-based decision making. Biol Psychiatry (2014) 0.95

Sex, ADHD symptoms, and smoking outcomes: an integrative model. Med Hypotheses (2012) 0.95

Rapid signalling in distinct dopaminergic axons during locomotion and reward. Nature (2016) 0.94

Modelling individual differences in the form of Pavlovian conditioned approach responses: a dual learning systems approach with factored representations. PLoS Comput Biol (2014) 0.94

Cue-evoked dopamine release in the nucleus accumbens shell tracks reinforcer magnitude during intracranial self-stimulation. Neuroscience (2010) 0.94

Subsecond dopamine fluctuations in human striatum encode superposed error signals about actual and counterfactual reward. Proc Natl Acad Sci U S A (2015) 0.93

Model-based analyses: Promises, pitfalls, and example applications to the study of cognitive control. Q J Exp Psychol (Hove) (2011) 0.93

Foundations of neuroeconomics: from philosophy to practice. PLoS Biol (2008) 0.93

Brief optogenetic inhibition of dopamine neurons mimics endogenous negative reward prediction errors. Nat Neurosci (2015) 0.93

Conjunctive encoding of movement and reward by ventral tegmental area neurons in the freely navigating rodent. Behav Neurosci (2010) 0.93

Midbrain dopamine neurons compute inferred and cached value prediction errors in a common framework. Elife (2016) 0.93

Articles cited by this

Relative and absolute strength of response as a function of frequency of reinforcement. J Exp Anal Behav (1961) 101.16

Dopamine, learning and motivation. Nat Rev Neurosci (2004) 11.97

Dissociable roles of ventral and dorsal striatum in instrumental conditioning. Science (2004) 11.67

Getting formal with dopamine and reward. Neuron (2002) 9.83

A framework for mesencephalic dopamine systems based on predictive Hebbian learning. J Neurosci (1996) 9.54

Discrete coding of reward probability and uncertainty by dopamine neurons. Science (2003) 9.54

Midbrain dopamine neurons encode a quantitative reward prediction error signal. Neuron (2005) 7.56

A neostriatal habit learning system in humans. Science (1996) 6.78

Encoding predictive reward value in human amygdala and orbitofrontal cortex. Science (2003) 6.76

Adaptive coding of reward value by dopamine neurons. Science (2005) 6.71

Striatonigrostriatal pathways in primates form an ascending spiral from the shell to the dorsolateral striatum. J Neurosci (2000) 6.02

Lesions of dorsolateral striatum preserve outcome expectancy but disrupt habit formation in instrumental learning. Eur J Neurosci (2004) 5.62

Dopamine responses comply with basic assumptions of formal learning theory. Nature (2001) 5.57

Dopamine neurons report an error in the temporal prediction of reward during learning. Nat Neurosci (1998) 5.15

Associative learning mediates dynamic shifts in dopamine signaling in the nucleus accumbens. Nat Neurosci (2007) 4.97

Midbrain dopamine neurons encode decisions for future action. Nat Neurosci (2006) 4.39

Reward, motivation, and reinforcement learning. Neuron (2002) 4.28

Impulsive choice induced in rats by lesions of the nucleus accumbens core. Science (2001) 4.25

Orbitofrontal cortex and representation of incentive value in associative learning. J Neurosci (1999) 3.69

Contrasting roles of basolateral amygdala and orbitofrontal cortex in impulsive choice. J Neurosci (2004) 3.48

Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation. Neuron (2006) 3.45

Neurotoxic lesions of basolateral, but not central, amygdala interfere with Pavlovian second-order conditioning and reinforcer devaluation effects. J Neurosci (1996) 3.21

Control of response selection by reinforcer value requires interaction of amygdala and orbital prefrontal cortex. J Neurosci (2000) 3.09

The pharmacology of impulsive behaviour in rats: the effects of drugs on response choice with varying delays of reinforcement. Psychopharmacology (Berl) (1996) 2.97

Dopamine cells respond to predicted events during classical conditioning: evidence for eligibility traces in the reward-learning network. J Neurosci (2005) 2.97

Central amygdala ERK signaling pathway is critical to incubation of cocaine craving. Nat Neurosci (2005) 2.90

Effects of lesions of the orbitofrontal cortex on sensitivity to delayed and probabilistic reinforcement. Psychopharmacology (Berl) (2002) 2.77

Importance of unpredictability for reward responses in primate dopamine neurons. J Neurophysiol (1994) 2.57

Dopamine neurons can represent context-dependent prediction error. Neuron (2004) 2.51

Coding of predicted reward omission by dopamine neurons in a conditioned inhibition paradigm. J Neurosci (2003) 2.26

The connections of the dopaminergic system with the striatum in rats and primates: an analysis with respect to the functional and compartmental organization of the striatum. Neuroscience (2000) 2.25

Limbic corticostriatal systems and delayed reinforcement. Ann N Y Acad Sci (2004) 2.24

The effects of d-amphetamine, chlordiazepoxide, alpha-flupenthixol and behavioural manipulations on choice of signalled and unsignalled delayed reinforcement in rats. Psychopharmacology (Berl) (2000) 2.16

Effects of dopaminergic drugs on delayed reward as a measure of impulsive behavior in rats. Psychopharmacology (Berl) (2000) 2.16

Theory and method in the quantitative analysis of "impulsive choice" behaviour: implications for psychopharmacology. Psychopharmacology (Berl) (1999) 1.85

Reward-predicting activity of dopamine and caudate neurons--a possible mechanism of motivational control of saccadic eye movement. J Neurophysiol (2003) 1.66

Comparison of effects of L-dopa, amphetamine and apomorphine on firing rate of rat dopaminergic neurones. Nat New Biol (1973) 1.61

Previous cocaine exposure makes rats hypersensitive to both delay and reward magnitude. J Neurosci (2007) 1.61

Effects of orbital prefrontal cortex dopamine depletion on inter-temporal choice: a quantitative analysis. Psychopharmacology (Berl) (2004) 1.37

Limbic cortical-ventral striatal systems underlying appetitive conditioning. Prog Brain Res (2000) 1.20

Dopamine auto- and postsynaptic receptors: electrophysiological evidence for differential sensitivity to dopamine agonists. Science (1979) 1.19

Single units in the pigeon brain integrate reward amount and time-to-reward in an impulsive choice task. Curr Biol (2005) 1.17

Heterogeneity of ventral tegmental area neurons: single-unit recording and iontophoresis in awake, unrestrained rats. Neuroscience (1998) 1.08

Choice values. Nat Neurosci (2006) 1.04

Articles by these authors

Different roles for orbitofrontal cortex and basolateral amygdala in a reinforcer devaluation task. J Neurosci (2003) 3.51

Encoding of time-discounted rewards in orbitofrontal cortex is independent of value representation. Neuron (2006) 3.45

Basolateral amygdala lesions abolish orbitofrontal-dependent reversal impairments. Neuron (2007) 2.82

Ventral striatum and orbitofrontal cortex are both required for model-based, but not model-free, reinforcement learning. J Neurosci (2011) 2.55

Rapid associative encoding in basolateral amygdala depends on connections with orbitofrontal cortex. Neuron (2005) 2.52

Neural encoding in ventral striatum during olfactory discrimination learning. Neuron (2003) 2.42

Double dissociation of the effects of medial and orbital prefrontal cortical lesions on attentional and affective shifts in mice. J Neurosci (2008) 2.22

The orbitofrontal cortex and ventral tegmental area are necessary for learning from unexpected outcomes. Neuron (2009) 2.21

The reinstatement model of drug relapse: recent neurobiological findings, emerging research topics, and translational research. Psychopharmacology (Berl) (2013) 2.01

What we know and do not know about the functions of the orbitofrontal cortex after 20 years of cross-species studies. J Neurosci (2007) 1.93

The role of the orbitofrontal cortex in the pursuit of happiness and more specific rewards. Nature (2008) 1.89

Ventral striatal neurons encode the value of the chosen action in rats deciding between differently delayed or sized rewards. J Neurosci (2009) 1.85

Expectancy-related changes in firing of dopamine neurons depend on orbitofrontal cortex. Nat Neurosci (2011) 1.81

Neural correlates of variations in event processing during learning in basolateral amygdala. J Neurosci (2010) 1.75

Dialogues on prediction errors. Trends Cogn Sci (2008) 1.74

Differential roles of human striatum and amygdala in associative learning. Nat Neurosci (2011) 1.71

Previous cocaine exposure makes rats hypersensitive to both delay and reward magnitude. J Neurosci (2007) 1.61

Withdrawal from cocaine self-administration produces long-lasting deficits in orbitofrontal-dependent reversal learning in rats. Learn Mem (2007) 1.59

Orbitofrontal cortex supports behavior and learning using inferred but not cached values. Science (2012) 1.53

The impact of orbitofrontal dysfunction on cocaine addiction. Nat Neurosci (2012) 1.50

Associative encoding in anterior piriform cortex versus orbitofrontal cortex during odor discrimination and reversal learning. Cereb Cortex (2006) 1.36

More is less: a disinhibited prefrontal cortex impairs cognitive flexibility. J Neurosci (2010) 1.34

Abnormal associative encoding in orbitofrontal neurons in cocaine-experienced rats during decision-making. Eur J Neurosci (2006) 1.33

Neural substrates of cognitive inflexibility after chronic cocaine exposure. Neuropharmacology (2008) 1.23

All that glitters ... dissociating attention and outcome expectancy from prediction errors signals. J Neurophysiol (2010) 1.20

Risk-responsive orbitofrontal neurons track acquired salience. Neuron (2013) 1.18

Neural correlates of variations in event processing during learning in central nucleus of amygdala. Neuron (2010) 1.16

Neural correlates of stimulus-response and response-outcome associations in dorsolateral versus dorsomedial striatum. Front Integr Neurosci (2010) 1.13

Surprise! Neural correlates of Pearce-Hall and Rescorla-Wagner coexist within the brain. Eur J Neurosci (2012) 1.12

Should I stay or should I go? Transformation of time-discounted rewards in orbitofrontal cortex and associated brain circuits. Ann N Y Acad Sci (2007) 1.12

Model-based learning and the contribution of the orbitofrontal cortex to the model-free world. Eur J Neurosci (2012) 1.12

Associative encoding in posterior piriform cortex during odor discrimination and reversal learning. Cereb Cortex (2006) 1.11

Cocaine-induced decision-making deficits are mediated by miscoding in basolateral amygdala. Nat Neurosci (2007) 1.10

Orbitofrontal inactivation impairs reversal of Pavlovian learning by interfering with 'disinhibition' of responding for previously unrewarded cues. Eur J Neurosci (2009) 1.10

Optogenetic inhibition of dorsal medial prefrontal cortex attenuates stress-induced reinstatement of palatable food seeking in female rats. J Neurosci (2013) 1.09

Cocaine exposure shifts the balance of associative encoding from ventral to dorsolateral striatum. Front Integr Neurosci (2007) 1.05

Medial prefrontal cortex neuronal activation and synaptic alterations after stress-induced reinstatement of palatable food seeking: a study using c-fos-GFP transgenic female rats. J Neurosci (2012) 1.05

Prior cocaine exposure disrupts extinction of fear conditioning. Learn Mem (2006) 1.04

Silencing the critics: understanding the effects of cocaine sensitization on dorsolateral and ventral striatum in the context of an actor/critic model. Front Neurosci (2008) 1.03

Nucleus Accumbens Core and Shell are Necessary for Reinforcer Devaluation Effects on Pavlovian Conditioned Responding. Front Integr Neurosci (2010) 1.03

Neural estimates of imagined outcomes in the orbitofrontal cortex drive behavior and learning. Neuron (2013) 1.01

A role for BDNF in cocaine reward and relapse. Nat Neurosci (2007) 0.99

Effect of fenfluramine on reinstatement of food seeking in female and male rats: implications for the predictive validity of the reinstatement model. Psychopharmacology (Berl) (2011) 0.98

Reward prediction error signaling in posterior dorsomedial striatum is action specific. J Neurosci (2012) 0.96

Conditioned reinforcement can be mediated by either outcome-specific or general affective representations. Front Integr Neurosci (2007) 0.92

Inactivation of the central but not the basolateral nucleus of the amygdala disrupts learning in response to overexpectation of reward. J Neurosci (2010) 0.91

Willingness to wait and altered encoding of time-discounted reward in the orbitofrontal cortex with normal aging. J Neurosci (2012) 0.89

Attention-related Pearce-Kaye-Hall signals in basolateral amygdala require the midbrain dopaminergic system. Biol Psychiatry (2012) 0.89

Toward a model of impaired reality testing in rats. Schizophr Bull (2009) 0.84

Impaired reality testing in an animal model of schizophrenia. Biol Psychiatry (2011) 0.84

Transition from 'model-based' to 'model-free' behavioral control in addiction: Involvement of the orbitofrontal cortex and dorsolateral striatum. Neuropharmacology (2013) 0.84

Neural correlates of inflexible behavior in the orbitofrontal-amygdalar circuit after cocaine exposure. Ann N Y Acad Sci (2007) 0.83

Normal Aging does Not Impair Orbitofrontal-Dependent Reinforcer Devaluation Effects. Front Aging Neurosci (2011) 0.82

Cocaine-paired cues activate aversive representations in accumbens neurons. Neuron (2008) 0.82

Disruption of model-based behavior and learning by cocaine self-administration in rats. Psychopharmacology (Berl) (2013) 0.81

The dorsal raphe nucleus is integral to negative prediction errors in Pavlovian fear. Eur J Neurosci (2014) 0.81

Linking affect to action: critical contributions of the orbitofrontal cortex. Preface. Ann N Y Acad Sci (2007) 0.81

Contrasting Effects of Lithium Chloride and CB1 Receptor Blockade on Enduring Changes in the Valuation of Reward. Front Behav Neurosci (2011) 0.80

Normal aging alters learning and attention-related teaching signals in basolateral amygdala. J Neurosci (2012) 0.80

Affect, action, and ambiguity and the amygdala-orbitofrontal circuit. Focus on "combined unilateral lesions of the amygdala and orbital prefrontal cortex impair affective processing in rhesus monkeys". J Neurophysiol (2004) 0.79

The role of the nucleus accumbens in knowing when to respond. Learn Mem (2011) 0.77

Alcohol reward, dopamine depletion, and GDNF. J Neurosci (2011) 0.75

Corrigendum: Dopamine transients are sufficient and necessary for acquisition of model-based associations. Nat Neurosci (2017) 0.75