Users react differently to relevant and non-relevant tags associated with content, and these spontaneous reactions can be used to label large multimedia databases. We present a method for assessing the relevance of tags to images from non-verbal bodily responses, namely electroencephalogram (EEG) signals, facial expressions, and eye gaze. We conducted experiments in which 28 images were shown to 28 subjects, once with a correct tag and once with an incorrect tag. The goal of our system is to detect responses to non-relevant tags and consequently filter those tags out. To this end, we trained classifiers to detect tag relevance from the bodily responses. We evaluated the system's performance using a subject-independent approach, calculating the precision at the top 5% and top 10% of detections and comparing results across modalities and classifiers. The results show that eye gaze outperforms the other modalities in tag relevance detection, both overall and among the top-ranked results.
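The evaluation protocol described above — subject-independent (leave-one-subject-out) classification followed by precision at the top 5% and 10% of ranked detections — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`precision_at_top`, `leave_one_subject_out`, `centroid_score`) are hypothetical, and a toy nearest-centroid scorer stands in for the classifiers actually used in the study.

```python
import numpy as np

def precision_at_top(scores, labels, fraction):
    """Precision among the top-`fraction` highest-scoring detections."""
    k = max(1, int(round(fraction * len(scores))))
    top = np.argsort(scores)[::-1][:k]  # indices of the k highest scores
    return float(labels[top].mean())

def leave_one_subject_out(features, labels, subjects, score_fn):
    """Subject-independent evaluation: each subject is held out in turn,
    and scored by a model trained on all remaining subjects."""
    scores = np.empty(len(labels), dtype=float)
    for s in np.unique(subjects):
        test = subjects == s
        train = ~test
        scores[test] = score_fn(features[train], labels[train], features[test])
    return scores

def centroid_score(X_train, y_train, X_test):
    """Toy stand-in for a trained classifier: score is the margin between
    distances to the 'non-relevant' (0) and 'relevant' (1) class centroids."""
    c1 = X_train[y_train == 1].mean(axis=0)
    c0 = X_train[y_train == 0].mean(axis=0)
    return (np.linalg.norm(X_test - c0, axis=1)
            - np.linalg.norm(X_test - c1, axis=1))
```

A usage example on synthetic data: generate well-separated two-class features for four subjects, run the leave-one-subject-out loop, and report precision at the top 10% of detections.

```python
rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)                      # tag relevance labels
X = y[:, None] * 3.0 + rng.normal(size=(n, 2))  # separable synthetic features
subjects = np.arange(n) % 4                    # four hypothetical subjects
scores = leave_one_subject_out(X, y, subjects, centroid_score)
p10 = precision_at_top(scores, y, 0.10)
```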