Paper Title
Towards Multimodal MIR: Predicting individual differences from music-induced movement
Paper Authors
Paper Abstract
As the field of Music Information Retrieval grows, it is important to consider the multi-modality of music and how aspects of musical engagement such as movement and gesture might be taken into account. Bodily movement is universally associated with music and reflects important individual features related to music preference, such as personality, mood, and empathy. Future multimodal MIR systems may benefit from taking these aspects into account. The current study addresses this by identifying individual differences, specifically Big Five personality traits and scores on the Empathy and Systemizing Quotients (EQ/SQ), from participants' free dance movements. Our model successfully generalized to unseen participants for personality as well as EQ and SQ, the latter of which had not previously been accomplished. R2 scores for personality, EQ, and SQ were 76.3%, 77.1%, and 86.7%, respectively. As a follow-up, we investigated which bodily joints were most important in defining these traits. We discuss how further research may explore how the mapping of these traits to movement patterns can be used to build more personalized, multimodal recommendation systems, as well as potential therapeutic applications.
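The abstract describes the overall workflow only at a high level: movement features derived from participants' dance recordings are mapped to trait scores, predictions are evaluated with R2 on unseen participants, and a follow-up analysis asks which joints carry the most predictive weight. The sketch below illustrates that general workflow with scikit-learn; it is not the authors' model, and the random-forest regressor, the toy feature matrix, and the joint/statistic layout are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' model): predicting a trait score from
# movement features and estimating per-joint importance.
# Assumes a feature matrix X of shape (n_participants, n_joints * n_stats)
# built from motion-capture joint data, and a target vector y of trait
# scores (e.g., one Big Five dimension, EQ, or SQ). All values are random
# placeholders, so the printed scores are only illustrative.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

n_participants, n_joints, n_stats = 60, 20, 4                 # toy dimensions
X = rng.normal(size=(n_participants, n_joints * n_stats))     # placeholder movement features
y = rng.normal(size=n_participants)                           # placeholder trait scores

model = RandomForestRegressor(n_estimators=200, random_state=0)

# Cross-validated predictions stand in for "unseen" participants; R2 is
# then computed on those held-out predictions.
y_pred = cross_val_predict(model, X, y, cv=5)
print("cross-validated R2:", r2_score(y, y_pred))

# Refit on all data and aggregate feature importances per joint by summing
# over the statistics that belong to each joint.
model.fit(X, y)
joint_importance = model.feature_importances_.reshape(n_joints, n_stats).sum(axis=1)
print("most informative joints:", np.argsort(joint_importance)[::-1][:5])
```

On real data, the feature matrix would come from per-joint movement descriptors extracted from motion capture, and the cross-validation scheme would need to keep all of a participant's data within a single fold so that evaluation truly reflects unseen individuals.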