Exploiting Metadata for Music Emotion Classification
Date Issued
2009
Author(s)
Lin, Yu-Ching
Abstract
With the explosive growth of social tagging systems and music web services, abundant musical metadata are readily obtainable from the Internet. Since most metadata relate to the human perception of music, they can be utilized to bridge the so-called semantic gap between audio signals and high-level semantics in content-based music classification. In this thesis, we first examine the correlation between emotion and musical metadata with a statistical association test, and then exploit this correlation for emotion classification. We propose to divide songs according to their metadata and to build a metadata-specific model that concentrates on classification within each group. Because a song can be associated with several types of metadata, such as genre and style, we further propose a novel adaptive fusion scheme that utilizes all types of metadata. While existing methods of exploiting metadata are hampered by the noise and sparseness inherent in metadata, the proposed scheme overcomes these difficulties and significantly improves the accuracy of emotion classification.
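The general idea described in the abstract, dividing songs by a metadata field, training one emotion classifier per group, and fusing group-specific predictions across several metadata types, might be sketched roughly as below. This is an illustrative outline only: the SVM classifier, the per-group weight (training accuracy), and the weighted-average fusion rule are placeholder assumptions for the sketch, not the adaptive fusion scheme actually proposed in the thesis.

```python
# Illustrative sketch: metadata-specific emotion models plus a simple
# confidence-weighted fusion across metadata types (e.g. genre and style).
# The classifier, features, and weighting are placeholders, not the
# thesis's actual method.
from collections import defaultdict

import numpy as np
from sklearn.svm import SVC


class MetadataSpecificEmotionClassifier:
    """Trains one emotion classifier per metadata value (e.g. per genre)."""

    def __init__(self):
        self.models = {}   # metadata value -> fitted classifier
        self.weights = {}  # metadata value -> fusion weight (assumption)

    def fit(self, features, emotions, metadata_values):
        # features: (n_songs, n_features) array; emotions: label array;
        # metadata_values: one metadata tag per song (e.g. its genre).
        groups = defaultdict(list)
        for i, tag in enumerate(metadata_values):
            groups[tag].append(i)
        for tag, idx in groups.items():
            X, y = features[idx], emotions[idx]
            if len(set(y)) < 2:
                continue  # cannot train a classifier on a single-emotion group
            clf = SVC(probability=True).fit(X, y)
            self.models[tag] = clf
            # Hypothetical weight: the group model's training accuracy.
            self.weights[tag] = clf.score(X, y)
        return self

    def predict_proba(self, x, metadata_value):
        clf = self.models.get(metadata_value)
        if clf is None:
            return None, 0.0
        proba = clf.predict_proba([x])[0]
        # Key probabilities by emotion label so models trained on
        # different song groups can be fused safely.
        return dict(zip(clf.classes_, proba)), self.weights[metadata_value]


def fuse_predictions(x, models_and_tags):
    """Weighted average of per-metadata-type predictions for one song."""
    scores, total = defaultdict(float), 0.0
    for model, tag in models_and_tags:  # e.g. (genre_model, "rock"), (style_model, "ballad")
        proba, weight = model.predict_proba(x, tag)
        if proba is None:
            continue
        for emotion, p in proba.items():
            scores[emotion] += weight * p
        total += weight
    if total == 0.0:
        raise ValueError("no metadata-specific model covers this song")
    return max(scores, key=scores.get)
```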
Subjects
music emotion classification
metadata exploitation
adaptive fusion
Type
thesis
File(s)
Name
ntu-98-R96942052-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5): d36c9bf4a91c8859e4d03f7ef7acc7e1
