User Modelling for Interactive User-Adaptive Collection Structuring
Adaptive Multimedia Retrieval: Retrieval, User, and Semantics
Users of a multimedia retrieval system are sometimes unable to state their information need explicitly; instead, they want to browse a collection to get an overview and to discover interesting content. Exploratory retrieval tools support such search scenarios, where the retrieval goal cannot be expressed as a query. In previous work, we have presented Adaptive SpringLens --- an interactive visualization technique building upon popular neighborhood-preserving projections of multimedia collections. It uses a complex multi-focus fish-eye distortion of a projection to visualize neighborhoods and automatically adapts the distortion to the user's current focus of interest. This paper investigates how far knowledge about the retrieval task collected during interaction can be used to adapt the underlying similarity measure that defines the neighborhoods.
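One common way to adapt a similarity measure from interaction data is to learn per-facet weights of a weighted distance from relative feedback ("item a is more similar to b than to c"). The following is a minimal illustrative sketch of that general idea, not the paper's actual algorithm; all names and the simple violation-driven update rule are assumptions for illustration.

```python
import numpy as np

def weighted_dist(w, x, y):
    # Weighted Euclidean distance with non-negative per-facet weights w.
    return np.sqrt(np.sum(w * (x - y) ** 2))

def adapt_weights(w, triplets, lr=0.1, epochs=50):
    # Hypothetical adaptation loop: for each violated relative constraint
    # (a, b, c) meaning "a should be closer to b than to c", shift weight
    # toward facets where (a, c) differ more than (a, b).
    w = w.astype(float).copy()
    for _ in range(epochs):
        for a, b, c in triplets:
            if weighted_dist(w, a, b) >= weighted_dist(w, a, c):
                w += lr * ((a - c) ** 2 - (a - b) ** 2)
                w = np.clip(w, 0.0, None)  # keep weights non-negative
        w /= w.sum()  # renormalize so weights stay comparable
    return w
```

With feedback indicating that only the first facet matters for the user's current task, the learned weight vector concentrates on that facet, which in turn reshapes the neighborhoods shown in the projection.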