Relationship between auditory processing and affective prosody in schizophrenia.
Patients with schizophrenia have well-established deficits in their ability to identify emotion from facial expressions and tone of voice. In the visual modality, there is strong evidence that basic processing deficits contribute to impaired facial affect recognition in schizophrenia. However, few studies have examined the auditory modality for mechanisms underlying affective prosody identification. In this study, we explored links between different stages of auditory processing, indexed by event-related potentials (ERPs), and affective prosody detection in schizophrenia. Thirty-six schizophrenia patients and 18 healthy control subjects completed tasks of affective prosody identification, facial emotion identification, and tone matching, as well as two auditory oddball paradigms: a passive paradigm for mismatch negativity (MMN) and an active paradigm for P300. Relative to healthy controls, patients had significantly reduced MMN and P300 amplitudes, impaired auditory and visual emotion recognition, and poorer tone matching performance. Correlations between ERP and behavioral measures within the patient group revealed significant associations between affective prosody recognition and both MMN and P300 amplitudes. These relationships were modality specific, as MMN and P300 did not correlate with facial emotion recognition. In a regression analysis, the two ERP waves together accounted for 49% of the variance in affective prosody recognition. Our results support previous suggestions of a relationship between basic auditory processing abnormalities and affective prosody dysfunction in schizophrenia, and indicate that both relatively automatic pre-attentive processes (MMN) and later attention-dependent processes (P300) are involved in accurate auditory emotion identification. These findings provide support for bottom-up (e.g., perceptually based) cognitive remediation approaches. Published by Elsevier B.V.