Paper title
Generating near-infrared facial expression datasets with dimensional affect labels
Paper authors
Abstract
Facial expression analysis has long been an active research area in computer vision. Traditional methods mainly analyse images for prototypical discrete emotions; as a result, they do not accurately depict the complex emotional states of humans. Furthermore, illumination variance remains a challenge for face analysis in the visible-light spectrum. To address these issues, we propose using a dimensional model based on valence and arousal to represent a wider range of emotions, in combination with near-infrared (NIR) imagery, which is more robust to illumination changes. Since no existing NIR facial expression datasets with valence-arousal labels are available, we present two complementary data augmentation methods (face morphing and a CycleGAN-based approach) to create NIR image datasets with dimensional emotion labels from existing categorical and/or visible-light datasets. Our experiments show that these generated NIR datasets are comparable to existing datasets in terms of data quality and baseline prediction performance.
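The face-morphing augmentation mentioned above can be illustrated with a minimal sketch: blend two aligned face images and interpolate their valence-arousal labels accordingly. This is a simplified cross-dissolve under the assumption of pre-aligned faces (a full morph would also warp facial landmarks); the function and variable names here are illustrative, not the authors' implementation.

```python
import numpy as np

def morph_faces(img_a, img_b, va_a, va_b, alpha):
    """Cross-dissolve two aligned face images with blend factor alpha
    in [0, 1], and linearly interpolate their (valence, arousal) labels.
    Simplified sketch: a real morph would also warp landmark geometry."""
    img = (1.0 - alpha) * img_a + alpha * img_b
    label = tuple((1.0 - alpha) * a + alpha * b for a, b in zip(va_a, va_b))
    return img, label

# Toy 2x2 grayscale "faces" with hypothetical (valence, arousal) labels.
face_a = np.zeros((2, 2))          # e.g. a low-valence expression
face_b = np.ones((2, 2))           # e.g. a high-valence expression
img, (val, aro) = morph_faces(face_a, face_b,
                              va_a=(-1.0, 0.2), va_b=(1.0, 0.8),
                              alpha=0.5)
```

Sampling many alpha values between pairs of labelled faces is one way such a scheme could densify the valence-arousal space from a small set of categorical source images.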