
Face-responsive regions differ in their ability to discriminate facial expressions

Thursday, October 11, 2012 — Poster Session IV

2:00 p.m. – 4:00 p.m.

Natcher Conference Center, Building 45

NIMH

NEURO/BEHAV/SENSYS-27

Authors

  • H Zhang
  • R Nolan
  • C Chu
  • S Japee
  • L.G. Ungerleider

Abstract

Recognition and classification of facial expressions are crucial for effective social functioning. Yet, it is unclear how various face-responsive regions in the human brain discriminate between different facial expressions. Multi-voxel pattern analysis (MVPA) is a powerful tool that can be used to decode information from patterns of brain activity. In this study, we used a support-vector machine (SVM) to investigate the ability of different face-responsive regions to discriminate between facial expressions. Subjects were shown repeated presentations of facial stimuli that varied across 4 expressions: fearful, angry, neutral, and happy. The multi-voxel patterns evoked within the face-responsive regions of the amygdala, STS, and fusiform gyrus were used to train and test an SVM classifier. In the right amygdala, SVM classification yielded significantly above-chance accuracy for discriminating both fearful and neutral faces from the other expressions. In the right STS, accuracy was significantly above chance for discriminating neutral faces from all emotional faces. These results indicate that both the right amygdala and right STS are able to discriminate between different facial expressions, albeit in different ways, suggesting that different face-responsive regions in the human brain play distinct roles in the processing of the emotional content within faces.
