ec(h)o is an "augmented reality interface" that combines spatialized soundscapes with a semantic web approach to information. The initial prototype is designed for a natural history and science museum. The platform creates a museum experience consisting of a physical installation and an interactive virtual layer of three-dimensional soundscapes that are physically mapped onto the museum displays. The audio data comes from digital sound objects, which originate in a network of object repositories connecting one museum's digital content with other museums' collections. The interface lets visitors interact with the system through movement and object-manipulation gestures, without the direct use of a computer device. The focus of this paper is the mechanism that retrieves sound objects for the museum visitor. The retrieval mechanism is built on a user model and on conceptual descriptions of the sound objects and museum artifacts, expressed as ontologies for sound and psychoacoustics, a topic ontology, and a Conceptual Reference Model for museum information. The retrieval criteria are represented as inference rules that encode knowledge from psychoacoustics, the cognitive domain, and compositional aspects of interaction. The system will be demonstrated in an exhibition space at the Nature Museum in Ottawa in January 2003.
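To make the retrieval idea concrete, the following is a minimal sketch of how inference-rule-based selection of sound objects over a topic ontology and a user model might look. All names, topic tags, and rule weights here are hypothetical illustrations; the paper does not publish its rule engine, and the actual system reasons over richer ontologies (sound, psychoacoustics, and museum-information models) than this toy scoring function.

```python
# Hypothetical sketch only: identifiers and rule weights are illustrative,
# not taken from the ec(h)o implementation.
from dataclasses import dataclass, field

@dataclass
class SoundObject:
    name: str
    topics: set            # concepts drawn from a topic ontology
    duration_s: float      # used by a psychoacoustic pacing rule

@dataclass
class UserModel:
    # topic -> interest weight, updated from the visitor's movement and gestures
    interests: dict = field(default_factory=dict)

def score(obj: SoundObject, user: UserModel, artifact_topics: set) -> float:
    """Apply three example retrieval rules: topical overlap with the nearby
    artifact, accumulated user interest, and a pacing penalty for long clips."""
    overlap = len(obj.topics & artifact_topics)
    interest = sum(user.interests.get(t, 0.0) for t in obj.topics)
    pacing_penalty = 0.5 if obj.duration_s > 30 else 0.0
    return overlap + interest - pacing_penalty

def retrieve(objects, user, artifact_topics, k=3):
    """Return the k best-matching sound objects for the visitor's context."""
    ranked = sorted(objects, key=lambda o: score(o, user, artifact_topics),
                    reverse=True)
    return ranked[:k]
```

In the real system such rules would be expressed declaratively over ontology concepts rather than hard-coded in Python, which lets the same inference machinery combine psychoacoustic, cognitive, and compositional criteria.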