Determining mental state from EEG signals using parallel implementations of neural networks. Scientific Programming - On applications analysis.
Pass-thoughts: authenticating with our minds. NSPW '05 Proceedings of the 2005 workshop on New security paradigms.
Pattern Recognition and Machine Learning (Information Science and Statistics).
Optimising Kernel Parameters and Regularisation Coefficients for Non-linear Discriminant Analysis. The Journal of Machine Learning Research.
Audiovisual laughter detection based on temporal features. ICMI '08 Proceedings of the 10th international conference on Multimodal interfaces.
ISM '08 Proceedings of the 2008 Tenth IEEE International Symposium on Multimedia.
Affective video content representation and modeling. IEEE Transactions on Multimedia.
Proceedings of the international conference on Multimedia information retrieval.
Finding the user's interest level from their eyes. Proceedings of the 2010 ACM workshop on Social, adaptive and personalized multimedia interaction and access.
Implicit image tagging via facial information. Proceedings of the 2nd international workshop on Social signal processing.
BI'10 Proceedings of the 2010 international conference on Brain informatics.
Affect recognition based on physiological changes during the watching of music videos. ACM Transactions on Interactive Intelligent Systems (TiiS) - Special Issue on Affective Interaction in Natural Environments.
NeuroPlace: making sense of a place. Proceedings of the 4th Augmented Human International Conference.
In multimedia content sharing social networks, tags assigned to content play an important role in search and retrieval. That is, by annotating multimedia content, users associate a word or phrase (tag) with that resource so that it can be searched for efficiently. Implicit tagging refers to assigning tags by observing subjects' behavior during consumption of multimedia content; it is an alternative to traditional explicit tagging, which requires an explicit action by subjects. In this paper we propose a brain-computer interface (BCI) system, based on the P300 evoked potential, for implicit emotional tagging of multimedia content. We show that our system can successfully perform implicit emotional tagging, and that naïve subjects who did not participate in training the system can also use it efficiently. Moreover, we introduce a subjective metric called "emotional taggability" to analyze the recognition performance of the system, given the degree of ambiguity that exists in the emotional values associated with multimedia content.
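The abstract does not specify the signal-processing pipeline, but P300-based systems typically average EEG epochs time-locked to a stimulus and score the amplitude of the positive deflection around 300 ms post-stimulus. The following is a minimal, hypothetical sketch of that idea on synthetic data; the sampling rate, window, threshold, and P300-like bump are all illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative sketch only: simulate EEG epochs and score them with a
# simple P300-style amplitude criterion. All parameters are assumptions.

rng = np.random.default_rng(0)
fs = 256                       # assumed sampling rate (Hz)
t = np.arange(0, 0.8, 1 / fs)  # 800 ms epoch, time-locked to stimulus onset

def make_epoch(target: bool) -> np.ndarray:
    """Simulate one noisy epoch; target stimuli add a P300-like bump near 300 ms."""
    epoch = rng.normal(0.0, 1.0, t.size)
    if target:
        epoch += 3.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05**2))
    return epoch

def p300_score(epochs: np.ndarray) -> float:
    """Average repeated epochs, then take the mean amplitude in 250-450 ms."""
    avg = epochs.mean(axis=0)
    window = (t >= 0.25) & (t <= 0.45)
    return float(avg[window].mean())

# Averaging over repeated presentations raises the signal-to-noise ratio,
# as in standard oddball paradigms.
target_epochs = np.stack([make_epoch(True) for _ in range(20)])
other_epochs = np.stack([make_epoch(False) for _ in range(20)])

threshold = 0.5  # illustrative decision threshold
print(p300_score(target_epochs) > threshold)  # target stimuli exceed it
print(p300_score(other_epochs) > threshold)   # non-targets do not
```

A deployed system would replace the fixed threshold with a trained classifier over multiple channels, but the epoch-average-and-score structure above is the core of most P300 detectors.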