Wednesday, January 31, 2018

Visual and auditory brain areas share a neural code for perceived emotion

Beau Sievers, Thalia Wheatley, Visual and auditory brain areas share a neural code for perceived emotion, bioRxiv, https://doi.org/10.1101/254961
Abstract: Emotion communication must be robust to interference from a noisy environment. One safeguard against interference is crossmodal redundancy: for example, conveying the same information using both sound and movement. Emotion perceivers should therefore be adapted to efficiently detect crossmodal correspondences, increasing the likelihood that emotion signals will be understood. One such possible adaptation is the use of a single neural code for both auditory and visual information. To investigate this, we tested two hypotheses: (1) that distinct auditory and visual brain areas represent emotion expressions using the same parameters, and (2) that auditory and visual expressions of emotion are represented together in one brain area using a supramodal neural code. We presented emotion expressions during functional magnetic resonance imaging (N=20, 3 scan hours/participant) and tested these hypotheses using representational similarity analysis (Kriegeskorte & Kievit, 2013). A single model of stimulus features and emotion content fit brain activity in both auditory and visual areas, supporting hypothesis (1), and posterior superior temporal gyrus represented both auditory and visual emotion expressions, supporting hypothesis (2). These results hold for both discrete and mixed (e.g., Happy-Sad) emotional expressions. Surprisingly, further exploratory analysis showed that auditory and visual areas represent stimulus features and emotion content even when stimuli are presented in each area's non-preferred modality.
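The core method here is representational similarity analysis (RSA). For readers unfamiliar with it, here is a minimal sketch in Python of how such a test is typically run: build representational dissimilarity matrices (RDMs) from a set of neural activity patterns and from a model of stimulus/emotion features, then rank-correlate the two. The array names, shapes, and random data below are illustrative placeholders, not the authors' actual data or analysis pipeline.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

# Hypothetical data: one row per stimulus.
# neural_patterns: (n_stimuli, n_voxels) responses from a region of interest
# model_features:  (n_stimuli, n_features) stimulus/emotion feature descriptors
rng = np.random.default_rng(0)
neural_patterns = rng.standard_normal((40, 500))
model_features = rng.standard_normal((40, 6))

# Representational dissimilarity matrices (RDMs): pairwise distances between
# stimuli, returned by pdist as the vectorized upper triangle.
neural_rdm = pdist(neural_patterns, metric="correlation")
model_rdm = pdist(model_features, metric="euclidean")

# RSA test statistic: rank correlation between the model RDM and neural RDM.
rho, p = spearmanr(neural_rdm, model_rdm)
print(f"model-brain RDM correlation: rho={rho:.3f}, p={p:.3f}")
```

In the paper's terms, hypothesis (1) amounts to the same model RDM fitting the neural RDMs of distinct auditory and visual regions, while hypothesis (2) amounts to a single region (posterior superior temporal gyrus) whose neural RDM fits emotion expressions from both modalities.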
