A musical system for emotional expression
Knowledge-Based Systems
This research investigates the use of emotion data, derived from analyzing changes in autonomic nervous system (ANS) activity as revealed by brainwave production, to support the creative music compositional intelligence of an adaptive interface. A relational model of the influence of musical events on the listener's affect is first induced using inductive logic programming, with the emotion data and musical score features as inputs to the induction task. The components of composition, such as interval and scale, instrumentation, chord progression, and melody, are automatically combined using a genetic algorithm and melodic transformation heuristics that depend on the predictive knowledge and character of the induced model. For the four targeted basic emotional states, namely stress, joy, sadness, and relaxation, the empirical results reported here show that the system is able to compose tunes that successfully convey the intended affective state.
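The composition step described above, a genetic algorithm whose fitness comes from an induced affect model, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `affect_score` function is a hypothetical stand-in for the ILP-induced relational model (here it simply rewards minor-mode chords as a surrogate for "sadness"), the chord vocabulary is invented, and the melodic transformation heuristics are omitted.

```python
import random

random.seed(0)  # deterministic run for the illustration

CHORDS = ["C", "Am", "F", "G", "Dm", "Em"]  # toy chord vocabulary (assumption)

def affect_score(tune, target_emotion):
    """Stand-in for the induced relational model: in the actual system this
    predictive knowledge is learned with inductive logic programming from
    emotion data and score features. Here, 'sadness' is crudely approximated
    by the fraction of minor chords in the candidate tune."""
    if target_emotion == "sadness":
        return sum(1 for chord in tune if chord.endswith("m")) / len(tune)
    return random.random()  # placeholder for the other emotions

def random_tune(length=8):
    return [random.choice(CHORDS) for _ in range(length)]

def crossover(a, b):
    cut = random.randrange(1, len(a))  # single-point crossover
    return a[:cut] + b[cut:]

def mutate(tune, rate=0.1):
    return [random.choice(CHORDS) if random.random() < rate else c for c in tune]

def evolve(target_emotion, pop_size=30, generations=40):
    """Evolve chord progressions toward the target emotion, scoring each
    candidate with the (here simulated) induced model."""
    population = [random_tune() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda t: affect_score(t, target_emotion),
                        reverse=True)
        parents = population[: pop_size // 2]  # keep the fitter half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=lambda t: affect_score(t, target_emotion))

best = evolve("sadness")
```

Under this surrogate fitness the population converges toward progressions dominated by minor chords; in the actual system, the induced model's relational rules over interval, scale, instrumentation, chord progression, and melody would play this scoring role instead.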