Attention-Guided Visual Learning: A Computational Model
Summary
Our behavior is driven by a small subset of all the information available to us. Because our processing resources are limited, we select which information to attend to in order to learn from our environment. Visual attention has been studied for many decades, yet current attentional models do not explain how attentional modulations affect trial-and-error learning in the visual cortex. This study is the first to define synaptic plasticity as a function of attentional modulations observed prior to the delivery of reward. We use this attention-modulated Hebbian plasticity rule to simulate attention-guided learning on a series of classification tasks. Despite receiving only reward feedback for the predicted label, our attention-guided reinforcement learning framework performs comparably to supervised error backpropagation, and this holds for datasets with up to 100 class labels. These results are obtained by redefining learning to reflect the biological mechanisms that ultimately govern behavior.
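To make the idea concrete, the kind of update rule described here can be sketched as a three-factor Hebbian rule in which the pre/post correlation term is gated by a per-unit attention signal and by a scalar reward. This is a minimal illustrative sketch under assumed names and an assumed functional form, not the paper's actual implementation:

```python
# Hypothetical three-factor, attention-modulated Hebbian update (a sketch,
# not the paper's code). The weight change is the classic Hebbian product
# pre[i] * post[j], gated multiplicatively by attention to unit j and by
# the scalar reward received for the trial:
#   dw[i][j] = lr * reward * attention[j] * pre[i] * post[j]
def attention_hebbian_update(w, pre, post, attention, reward, lr=0.01):
    """Update weight matrix w in place; unattended units (attention 0) and
    unrewarded trials (reward 0) produce no plasticity."""
    for i, x in enumerate(pre):
        for j, y in enumerate(post):
            w[i][j] += lr * reward * attention[j] * x * y
    return w

# Toy example: two presynaptic units, three postsynaptic units.
w = [[0.0] * 3 for _ in range(2)]
pre = [1.0, 0.5]
post = [0.8, 0.2, 0.6]
attention = [1.0, 0.3, 0.0]  # the third unit is unattended and never learns
w = attention_hebbian_update(w, pre, post, attention, reward=1.0)
```

Because both attention and reward enter multiplicatively, learning is confined to attended synapses on rewarded trials, which is what lets the rule train from reward feedback alone.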