We report results from a pilot study on mood-based classification of TV programmes. Short video clips from a range of programmes were labelled on three mood axes, capturing the subjects' opinions of how happy, serious, and exciting each clip was. These data were used for mood classification based on automatically extracted audio and video features in a machine learning framework. Particular attention was given to the challenges of working with a small dataset, as is common in pilot studies; we show that a thorough evaluation was nevertheless possible and produced useful results. Introducing a new feature based on face detection and combining it with other signal-processing features led to good classification accuracies: between 85% and 100% in the simplest setting, and still above 70% when a finer three-point mood scale was used. Overall, the results are promising and show that automatic mood classification of video material is feasible, so moods can serve as additional metadata to facilitate search in large archives.
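The abstract does not specify the classifier or evaluation protocol, but a common way to get a reliable estimate from a small pilot-study dataset is leave-one-out cross-validation over per-clip features. The sketch below is illustrative only: the feature set (audio energy, cut rate, a face-detection statistic), the SVM classifier, and the synthetic data are all assumptions, not the paper's actual method.

```python
# Hypothetical sketch: per-clip mood classification with an SVM,
# evaluated with leave-one-out cross-validation to make the most of
# a small dataset. Features and labels are synthetic stand-ins.
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy stand-in for per-clip features, e.g. mean audio energy,
# shot-cut rate, and a face-detection statistic such as the
# average number of detected faces per frame.
n_clips = 40
X = rng.normal(size=(n_clips, 3))

# Binary labels on one mood axis (e.g. happy vs. not happy);
# a three-point scale would simply use three classes here.
y = (X[:, 0] + 0.5 * X[:, 2] + 0.1 * rng.normal(size=n_clips) > 0).astype(int)

# Standardise features, then classify; each clip is held out once.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")
```

With only a handful of labelled clips per class, leave-one-out evaluation uses every clip for both training and testing without ever testing on training data, which is one way a "thorough evaluation" remains possible on pilot-study amounts of data.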