Abstract
Attention serves to represent selectively relevant information at the expense of competing and irrelevant information, but the mechanisms and effects of attention are not unitary. The great variety of methods and techniques used to study automaticity and attention for facial expressions suggests that the time is now ripe to break down the concepts of automaticity and attention into elementary constituents that are more tractable to investigation in cognitive neuroscience. This article reviews both the behavioral and neuroimaging literature on the automatic perception of facial expressions of emotion in healthy volunteers and patients with brain damage. It focuses on aspects of automaticity in face perception that relate to task goals, attentional control, and conscious awareness. Behavioral and neuroimaging findings converge to support some degree of automaticity in processing facial expressions, which is likely to reflect distinct components that should be better disentangled at both the behavioral and the neural level.