Using emotional context from article for contextual music recommendation
Journal
MM 2013 - Proceedings of the 2013 ACM Multimedia Conference
ISBN
9781450324045
Date Issued
2013-11-18
Author(s)
Abstract
This paper proposes a context-aware approach that recommends music to a user based on the user's emotional state predicted from the article the user writes. We analyze the association between user-generated text and music by using a real-world dataset with tripartite information collected from the social blogging website LiveJournal. The audio information represents various perceptual dimensions of music listening, including danceability, loudness, mode, and tempo; the emotional text information consists of bag-of-words and three-dimensional affective states within an article: valence, arousal, and dominance. To combine these factors for music recommendation, a factorization machine-based approach is taken. Our evaluation shows that the emotional context information mined from user-generated articles does improve the quality of recommendation, compared to either the collaborative filtering approach or the content-based approach. Copyright © 2013 ACM.
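The abstract refers to a second-order factorization machine that scores a (user, song) pair under the contextual features mined from the article. The sketch below is not the authors' implementation; it is a minimal illustration of the standard factorization machine prediction rule applied to a hypothetical feature layout (one-hot user and song indicators concatenated with the article's valence/arousal/dominance and the song's danceability, loudness, mode, and tempo). All dimensions and values are assumptions for illustration only.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine score for one feature vector x.

    w0 : global bias (scalar)
    w  : linear weights, shape (n_features,)
    V  : latent factor matrix, shape (n_features, k)
    """
    linear = w0 + w @ x
    # Pairwise interactions via the O(k*n) identity:
    # sum_{i<j} <v_i, v_j> x_i x_j
    #   = 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ]
    interactions = 0.5 * np.sum((V.T @ x) ** 2 - (V.T ** 2) @ (x ** 2))
    return linear + interactions

# Hypothetical layout: 2 users + 3 songs + 3 affect dims + 4 audio dims = 12 features
rng = np.random.default_rng(0)
n_features, k = 12, 4
x = np.zeros(n_features)
x[0] = 1.0                          # active user indicator
x[2 + 1] = 1.0                      # candidate song indicator
x[5:8] = [0.3, -0.2, 0.1]           # article valence, arousal, dominance
x[8:12] = [0.7, -5.0, 1.0, 120.0]   # danceability, loudness, mode, tempo

score = fm_predict(x, 0.0, rng.normal(size=n_features), rng.normal(size=(n_features, k)))
```

In practice the parameters would be learned from the LiveJournal listening data; here they are random, and the snippet only demonstrates how contextual and audio features enter the same interaction model.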
Subjects
Emotion-based music recommendation | Listening context
Type
conference paper
