The contribution of upper and lower faces in facial expression classification
Date Issued
2009
Author(s)
Chen, Mei-Yen
Abstract
We used a composite face paradigm and a visual classification task to investigate the properties of facial expression processors in the visual system and the decision boundary for happy/sad classification in a stimulus space. We measured how facial expression classification depends on the intensity of a particular expression in either the upper or the lower face. We then applied a model based on multidimensional signal detection theory to estimate the statistical properties of the internal representations, or the tuning properties of the expression processors. There were three test conditions: two foveal viewing conditions, in which the upper and lower parts of the test image were either aligned or had a lateral shift of 44'' of visual angle; and one peripheral condition, in which the aligned face was placed at 6° to the left of fixation. The observers were asked to determine whether a test image was happy or sad. The alignment conditions had no effect on happy/sad classification, suggesting that facial expression classification is an analytic process. The model analysis also showed no interaction between upper and lower facial features in foveal expression classification. It was easier to classify a face as sad in the periphery than in the fovea, suggesting different mechanisms for expression perception at different eccentricities.
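The multidimensional signal detection analysis described above can be illustrated with a minimal sketch. Assuming (hypothetically, since the thesis does not publish its code) that each face evokes a bivariate Gaussian internal representation with one axis for upper-face expression intensity and one for lower-face intensity, perceptual independence between the two features corresponds to zero covariance, and a linear decision boundary separates "happy" from "sad" responses:

```python
import numpy as np

rng = np.random.default_rng(0)

def p_classified_happy(mean, cov, w, b, n=200_000):
    """Estimate the probability that an internal representation drawn
    from a bivariate Gaussian falls on the 'happy' side of the linear
    decision boundary w @ x + b > 0 (Monte Carlo estimate)."""
    samples = rng.multivariate_normal(mean, cov, size=n)
    return np.mean(samples @ w + b > 0)

# Hypothetical example: a mildly happy upper face with a neutral lower
# face. A diagonal covariance encodes independent upper/lower channels
# (the "analytic" case suggested by the null alignment effect).
mean = np.array([0.5, 0.0])   # (upper-face intensity, lower-face intensity)
cov = np.diag([1.0, 1.0])     # zero covariance = perceptual independence
w = np.array([1.0, 1.0])      # boundary weighting upper and lower cues equally
p = p_classified_happy(mean, cov, w, 0.0)
```

With these assumed parameters the projection onto the boundary normal has mean 0.5 and standard deviation sqrt(2), so roughly 64% of trials land on the "happy" side; an interaction between upper and lower features would instead show up as a nonzero covariance term.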
Subjects
multidimensional signal detection
psychophysics
eccentricity
emotion
File(s)
Name
index.html
Size
23.49 KB
Format
HTML
Checksum
(MD5):f6295c54e873ebfbb561b2f7e606caef