Paper Title


Towards Intercultural Affect Recognition: Audio-Visual Affect Recognition in the Wild Across Six Cultures

Paper Authors

Leena Mathur, Ralph Adolphs, Maja J. Matarić

Paper Abstract


In our multicultural world, affect-aware AI systems that support humans need the ability to perceive affect across cultural variations in emotion expression patterns. These systems must perform well in cultural contexts without annotated affect datasets available for training models. A standard assumption in affective computing is that affect recognition models trained and used within the same culture (intracultural) will perform better than models trained on one culture and used on different cultures (intercultural). We test this assumption and present the first systematic study of intercultural affect recognition models using videos of real-world dyadic interactions from six cultures. We develop an attention-based feature selection approach under temporal causal discovery to identify behavioral cues that can be leveraged in intercultural affect recognition models. Across all six cultures, our findings demonstrate that intercultural affect recognition models were as effective as or more effective than intracultural models. We identify and contribute useful behavioral features for intercultural affect recognition; facial features from the visual modality were more useful than features from the audio modality in this study's context. Our paper presents a proof-of-concept and motivation for the future development of intercultural affect recognition systems, especially those deployed in low-resource situations without annotated data.
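The abstract names an attention-based feature selection approach under temporal causal discovery but does not spell it out, so the following is only a minimal, hypothetical sketch of the general idea: learn one attention weight per behavioral feature channel inside a causal temporal convolution that predicts affect, then read off the highest-weighted channels as the selected cues. All feature names (AU06, pitch, mfcc_1, ...), dimensions, and training details are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentionTemporalSelector(nn.Module):
    """Per-feature attention over multivariate behavioral time series (sketch)."""

    def __init__(self, num_features, hidden=32, kernel_size=5):
        super().__init__()
        # One learnable attention logit per behavioral feature channel.
        self.attn_logits = nn.Parameter(torch.zeros(num_features))
        self.pad = kernel_size - 1  # left padding keeps the convolution causal
        self.conv = nn.Conv1d(num_features, hidden, kernel_size)
        self.head = nn.Linear(hidden, 1)  # e.g., a continuous affect score

    def forward(self, x):
        # x: (batch, num_features, time)
        attn = torch.softmax(self.attn_logits, dim=0)   # (num_features,)
        x = x * attn.view(1, -1, 1)                     # re-weight each channel
        x = torch.relu(self.conv(F.pad(x, (self.pad, 0))))
        return self.head(x.mean(dim=-1)).squeeze(-1)    # pool over time

    def top_features(self, names, k=3):
        # Channels with the largest attention weights are the "selected" cues.
        attn = torch.softmax(self.attn_logits, dim=0)
        idx = torch.topk(attn, k).indices.tolist()
        return [(names[i], round(float(attn[i]), 3)) for i in idx]


if __name__ == "__main__":
    # Hypothetical facial (AU*, gaze) and audio (pitch, loudness, mfcc) channels.
    names = ["AU06", "AU12", "gaze_x", "pitch", "loudness", "mfcc_1"]
    model = AttentionTemporalSelector(num_features=len(names))
    x = torch.randn(8, len(names), 100)   # 8 clips, 100 frames each
    y = torch.randn(8)                    # synthetic affect labels
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(50):
        opt.zero_grad()
        loss = F.mse_loss(model(x), y)
        loss.backward()
        opt.step()
    print(model.top_features(names, k=3))
```

Under these assumptions, such attention weights could be inspected per culture to compare which visual and audio cues carry over to intercultural models; the paper's actual feature-selection procedure may differ.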
