Emotion Prediction from User-Generated Videos by Emotion Wheel Guided Deep Learning
Date Issued
2015
Author(s)
Ho, Che-Ting
Abstract
Predicting emotions in videos is important for many applications that rely on user reactions. The growing number of web services on the Internet now allows users to upload and share videos conveniently. Building a robust system for predicting emotions in such user-generated videos is a challenging problem, due to the diversity of their content and the high-level abstraction of human emotions. Motivated by the success of Convolutional Neural Networks (CNNs) in several visual recognition competitions, they are a promising solution for bridging this affective gap. In this paper, we propose a multimodal framework to predict emotions in user-generated videos based on CNN-extracted features. A psychological emotion wheel is incorporated to learn better representations than its simple transfer-learning counterpart. We also show through experiments that traditional encoding methods for local features can improve prediction performance. Experiments conducted on a real-world dataset collected from YouTube and Flickr demonstrate that our proposed framework outperforms previous related work in prediction accuracy, 54.2% versus 46.1%.
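The abstract mentions applying traditional encoding methods to local (e.g., per-frame CNN) features to produce a video-level representation. As a minimal illustration of one such classical encoding, the hedged NumPy sketch below shows bag-of-words histogram encoding over a pre-learned codebook; the function name, the plain-NumPy nearest-codeword assignment, and the example dimensions are assumptions for illustration, not the thesis's actual pipeline.

```python
import numpy as np

def bag_of_words_encode(features, codebook):
    """Encode a set of local feature vectors as a normalized histogram
    over a codebook (illustrative bag-of-words encoding, not the
    thesis's exact method).

    features : (n, d) array of local descriptors (e.g., frame features)
    codebook : (k, d) array of codewords (e.g., from k-means)
    returns  : (k,) L1-normalized histogram of codeword assignments
    """
    # Pairwise L2 distances between each feature and each codeword,
    # computed via broadcasting: result shape is (n, k).
    dists = np.linalg.norm(features[:, None, :] - codebook[None, :, :], axis=2)
    # Assign each local feature to its nearest codeword.
    assignments = dists.argmin(axis=1)
    # Histogram of assignments, normalized to sum to 1.
    hist = np.bincount(assignments, minlength=len(codebook)).astype(float)
    return hist / hist.sum()
```

A video-level vector produced this way can then be fed to any standard classifier (e.g., a linear SVM) for emotion prediction.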
Subjects
deep convolutional neural network
emotion prediction
user-generated video
Type
thesis
File(s)
Name
ntu-104-R02922109-1.pdf
Size
23.32 KB
Format
Adobe PDF
Checksum
(MD5): b74178ffc972c39d4ffb64bb6f077920
