A method is described that analyzes the basic pattern of beats in a piece of music, the musical meter. The analysis is performed jointly at three different time scales: at the temporally atomic tatum pulse level, at the tactus pulse level, which corresponds to the tempo of a piece, and at the musical measure level. Acoustic signals from arbitrary musical genres are considered. For the initial time-frequency analysis, a new technique is proposed that measures the degree of musical accent as a function of time at four different frequency ranges. This is followed by a bank of comb filter resonators that extracts features for estimating the periods and phases of the three pulses. The features are processed by a probabilistic model that represents primitive musical knowledge and uses the low-level observations to perform joint estimation of the tatum, tactus, and measure pulses. The model takes into account the temporal dependencies between successive estimates and enables both causal and noncausal analysis. The method is validated on a manually annotated database of 474 music signals from various genres. The method works robustly for different types of music and improves over two state-of-the-art reference methods in simulations.
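To make the resonator stage concrete, the following is a minimal sketch, not the authors' implementation, of a comb filter bank that scores candidate pulse periods against an accent signal: each resonator y[n] = (1 - a) x[n] + a y[n - tau] accumulates energy when its delay tau matches a periodicity in the input. The accent signal, the half-time rule for the feedback gain, the period search range, and all function names are illustrative assumptions; the full method additionally normalizes these features and resolves ambiguities (resonators at integer multiples of the true period also respond) with the probabilistic model described above.

```python
# Illustrative sketch of the comb-filter-resonator idea; the gain rule,
# search range, and accent signal are assumptions, not the paper's exact setup.
import numpy as np

def comb_filter_bank(accent, periods, half_time=1.0, frame_rate=100.0):
    """Mean output power of one comb resonator per candidate period.

    accent    : 1-D accent (onset-strength) signal, one value per frame
    periods   : candidate pulse periods, in frames
    half_time : seconds for a resonator's response to halve; sets the
                feedback gain a = 0.5 ** (tau / (half_time * frame_rate))
    """
    energies = np.zeros(len(periods))
    for i, tau in enumerate(periods):
        a = 0.5 ** (tau / (half_time * frame_rate))  # period-dependent gain
        y = np.zeros(len(accent))
        for n in range(len(accent)):
            feedback = y[n - tau] if n >= tau else 0.0  # delayed output term
            y[n] = (1.0 - a) * accent[n] + a * feedback
        energies[i] = np.mean(y ** 2)  # resonator output power
    return energies

# Toy usage: an impulse train with a 50-frame period (120 BPM at a
# 100 Hz frame rate) should make the tau = 50 resonator dominate.
accent = np.zeros(1000)
accent[::50] = 1.0
periods = np.arange(30, 90)  # roughly 67-200 BPM search range
energies = comb_filter_bank(accent, periods)
best = periods[np.argmax(energies)]
print(f"estimated tactus period: {best} frames "
      f"({60.0 * 100.0 / best:.1f} BPM)")
```

The same bank, run with period grids at different scales, would yield periodicity features for the tatum, tactus, and measure levels; the phase of each pulse could then be read from the resonator output peaks, which is roughly the role these features play before the probabilistic stage.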