Abstract
Attentional capture refers to the phenomenon that task-irrelevant stimuli involuntarily attract attention. In visual search, many researchers have found that an angry facial expression captures more attention and interferes more with target identification than a neutral facial expression does, a phenomenon termed the anger superiority effect. In previous studies, however, participants were usually instructed to identify a target face defined by a specific feature (e.g., a red face) among other faces while a task-irrelevant angry face was present. It is therefore unclear whether the impaired target identification was due to attentional capture by the angry face or to disruption of target selection. To dissociate attentional capture by emotional faces from target selection, and to further explore the factors that modulate attentional capture by task-irrelevant emotional faces, the present study used a visual search task in which participants judged the orientation of the nose of a target face at one of six positions equidistant from central fixation, while an emotional face (angry, happy, or neutral as the baseline) appeared at another position. By defining the target feature as a tilted nose, we could separate selective attention to the target from attentional capture by the task-irrelevant emotional distractor. In Experiment 1, we manipulated three types of emotional facial expression (angry, happy, neutral) to investigate attentional capture by different emotional expressions. Moreover, we manipulated face orientation (upright vs. inverted) to investigate whether the irrelevant facial expression captures attention through holistic processing of the expression or through processing of isolated features. The results of Experiment 1 showed that, compared with the happy face, accuracy of target identification was significantly lower and response time significantly longer when the angry face served as the distractor. These results suggest that the angry face captured more attention even as a task-irrelevant stimulus, indicating an anger superiority effect. In addition, target identification was significantly worse with upright than with inverted faces, and the anger superiority effect disappeared when the faces were inverted. These results indicate that the anger superiority effect depends on the whole emotional expression rather than on simple perceptual features. In Experiment 2, we manipulated the duration of the search display to investigate the effect of temporal task demand on target identification and on attentional capture by the task-irrelevant emotional facial expression. The results showed that when the temporal task demand was low (display duration of 1000 ms), target identification was significantly worse than when the temporal task demand was high (display duration of 200 ms), indicating an effect of temporal task demand on target identification. However, the anger superiority effect still occurred under both levels of temporal task demand, indicating that it may reflect automatic processing rather than top-down attentional control settings. Overall, the present study employed a visual search task to investigate the mechanism by which task-irrelevant emotional facial expressions capture attention and whether this capture depends on temporal task demand. The results of the two experiments confirmed that an angry facial expression can capture attention even when it is entirely task-irrelevant. Moreover, this anger superiority effect appears to arise from the whole expression rather than from specific features. Finally, the anger superiority effect was not influenced by temporal task demand, indicating that it may reflect automatic processing.
Key words
attentional capture /
emotional facial expressions /
temporal task demand /
anger superiority effect
Cite this article
The Effect of Temporal Task Demand on Task-Irrelevant Emotion Facial Expressions in Attentional Capture [J]. Journal of Psychological Science, 2020, 43(1): 2-8.