We present a system for making community-driven websites easily accessible from the latest mobile devices. Many of these new devices contain an ensemble of sensors such as cameras, GPS, and inertial sensors. We demonstrate how these sensors can be used to bring the information contained in sites like Wikipedia to users in a much more immersive manner than text or maps. We have collected a large database of images and articles from Wikipedia and show how a user can query this database by simply snapping a photo. Our system uses the location sensors to assist with image matching and the inertial sensors to provide a unique and intuitive user interface for browsing results.
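The location-assisted matching described above can be sketched as a GPS pre-filter: before any image comparison, the database is reduced to entries geotagged near the query photo, and visual matching runs only on that candidate set. This is a minimal illustration, not the authors' implementation; the function names, the toy database, and the search radius are all assumptions made for the example.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two GPS fixes.
    earth_radius_m = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def location_filtered_candidates(query_gps, database, radius_m=300.0):
    # Keep only database entries whose geotag lies within radius_m of the
    # query photo's GPS fix; visual feature matching would then run on
    # this much smaller candidate set instead of the whole database.
    lat, lon = query_gps
    return [entry for entry in database
            if haversine_m(lat, lon, entry["lat"], entry["lon"]) <= radius_m]

# Hypothetical toy database: article id plus the geotag of its image.
db = [
    {"id": "Eiffel_Tower", "lat": 48.8584, "lon": 2.2945},
    {"id": "Louvre",       "lat": 48.8606, "lon": 2.3376},
]
near = location_filtered_candidates((48.8583, 2.2950), db, radius_m=500.0)
```

With the query taken a few tens of meters from the first entry, only that entry survives the 500 m filter; the second, roughly 3 km away, is pruned before any image matching would occur.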