Generalization of the Features of Emotional Faces
Summary
Emotional faces convey important social information and draw attention automatically. Some
emotional expressions draw attention more rapidly than others; however, studies disagree
as to which expression consistently draws attention fastest, with both positive (e.g.,
happy) and negative (e.g., angry) expressions showing faster reaction times depending
on the study and task. This raises the question of whether differences in reaction time to
emotional faces are due to valence alone or to low-level image features such as contrast and
orientation. Moreover, if these low-level features, particularly spatial frequencies, are
involved in the rapid processing of emotional faces, then non-face objects with similar spatial
frequency content should show similar reaction-time effects. In this study, we examined the role
of spatial frequency content in access to awareness for images of emotional faces, using car
images to test whether any such effects generalize on the basis of low-level features. Drawing
on the spatial frequency content of angry, happy, and neutral faces, we used machine learning to
classify car images, in both frontal and side views, as "happy" or "angry".
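
As an illustrative sketch of this kind of approach (not necessarily the study's actual pipeline), one could summarize each image by the radially averaged amplitude spectrum of its Fourier transform and train a standard classifier on the face spectra before applying it to car images. The function and variable names below (amplitude_spectrum_features, faces, labels, cars) are hypothetical placeholders:

    import numpy as np
    from sklearn.svm import SVC

    def amplitude_spectrum_features(img, n_bins=32):
        """Radially averaged Fourier amplitude spectrum of a grayscale image."""
        f = np.abs(np.fft.fftshift(np.fft.fft2(img)))
        h, w = img.shape
        y, x = np.indices((h, w))
        r = np.hypot(y - h / 2, x - w / 2)
        bins = np.linspace(0, r.max(), n_bins + 1)
        feats = []
        for lo, hi in zip(bins[:-1], bins[1:]):
            m = (r >= lo) & (r < hi)  # pixels in this spatial-frequency band
            feats.append(f[m].mean() if m.any() else 0.0)
        return np.array(feats)

    # Train on face spectra labeled by expression, then label the car images.
    # faces, labels, cars are hypothetical arrays of images and expression tags.
    X_faces = np.array([amplitude_spectrum_features(im) for im in faces])
    clf = SVC(kernel="linear").fit(X_faces, labels)
    car_labels = clf.predict(
        np.array([amplitude_spectrum_features(im) for im in cars]))
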
Using breaking continuous flash suppression (b-CFS) and a forced-choice task, we measured the
time taken for images to access awareness, as well as participants' subjective ratings, for
images of emotional faces and for the classified "emotional" car fronts and sides. In the b-CFS
task, no significant differences were found across image type or emotion, and notably, faces
did not access awareness faster than car images. In the rating task, however, human faces were
rated as expected (e.g., happy faces as happy), whereas car images were rated as neutral.