Music mood describes the inherent emotional expression of a music clip, and is useful in music understanding, music retrieval, and other music-related applications. In this paper, a hierarchical framework is presented to automate the task of mood detection from acoustic music data, following music-psychological theories from Western cultures. The hierarchical framework has the advantage of emphasizing the most suitable features at each stage of detection. Three feature sets, covering intensity, timbre, and rhythm, are extracted to represent the characteristics of a music clip: the intensity set is represented by the energy in each frequency subband; the timbre set comprises spectral shape and spectral contrast features; and the rhythm set captures three aspects closely related to an individual's mood response, namely rhythm strength, rhythm regularity, and tempo. Furthermore, since mood often changes over the course of a classical piece, the approach is extended from mood detection to mood tracking by dividing the music into independent segments, each containing a homogeneous emotional expression. Preliminary evaluations indicate that the proposed algorithms produce satisfactory results: on a testing database of 800 representative music clips, mood detection reaches an average accuracy of 86.3%, and on average 84.1% of the mood boundaries in nine testing music pieces are recalled.
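As an illustration of the intensity feature set described above, the sketch below computes the energy in each frequency subband of a single audio frame. The frame size, sampling rate, number of bands, and octave-scaled band edges are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a subband-energy intensity feature (assumed parameters,
# not the paper's exact setup).
import numpy as np

def subband_energies(frame, sr=22050, n_bands=7):
    """Split the magnitude spectrum of one frame into octave-like
    subbands and return the energy in each band."""
    spectrum = np.abs(np.fft.rfft(frame)) ** 2
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sr)
    # Octave-scaled band edges from ~100 Hz up to the Nyquist frequency.
    edges = np.geomspace(100.0, sr / 2, n_bands + 1)
    energies = np.empty(n_bands)
    for i in range(n_bands):
        mask = (freqs >= edges[i]) & (freqs < edges[i + 1])
        energies[i] = spectrum[mask].sum()
    return energies

# Usage: one 1024-sample frame of a synthetic 440 Hz tone; most of the
# energy should land in the band whose edges bracket 440 Hz.
t = np.arange(1024) / 22050
frame = np.sin(2 * np.pi * 440.0 * t)
print(subband_energies(frame))
```

In the full system such frame-level energies would be aggregated over a clip (e.g., by mean and variance) before classification; that aggregation step is omitted here.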