Reviren: augmenting virtual environments with personal digital assistants
SAICSIT '04: Proceedings of the 2004 Annual Research Conference of the South African Institute of Computer Scientists and Information Technologists on IT Research in Developing Countries
In this paper, we describe a project that uses PDAs to provide an interactive experience with a virtual environment. In particular, we focus on the navigational aspects of allowing users to move through, and view, the environment. As the system will be deployed in a museum, it was crucial that navigation be as intuitive as possible. To that end, we developed and evaluated two prototypes: one based purely on gesture, and the other using a combination of gesture and keypad. For our application, the combination of keypad and gesture proved the most effective.
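The hybrid scheme described above could be sketched as follows, with keypad presses driving translation and tilt gestures driving the viewing direction. This is a minimal illustration, not the paper's implementation: the class name, step sizes, tilt gain, and key labels are all assumptions introduced here.

```python
import math

class HybridNavigator:
    """Sketch of gesture + keypad navigation for a handheld device.

    Assumed division of labour: keypad moves the viewpoint, tilt
    gestures rotate it. All constants are illustrative.
    """

    MOVE_STEP = 0.5   # distance per keypad press (assumed units)
    TILT_GAIN = 0.8   # radians of yaw per unit of tilt (assumed)

    def __init__(self):
        self.x = self.y = 0.0   # position in the virtual environment
        self.yaw = 0.0          # viewing direction, radians

    def on_tilt(self, tilt_x):
        """Map a left/right tilt reading in [-1, 1] to a yaw change."""
        self.yaw += self.TILT_GAIN * tilt_x

    def on_keypad(self, key):
        """Map keypad presses to movement along the current heading."""
        if key == "up":
            self.x += self.MOVE_STEP * math.cos(self.yaw)
            self.y += self.MOVE_STEP * math.sin(self.yaw)
        elif key == "down":
            self.x -= self.MOVE_STEP * math.cos(self.yaw)
            self.y -= self.MOVE_STEP * math.sin(self.yaw)

nav = HybridNavigator()
nav.on_keypad("up")   # step forward along the initial heading
nav.on_tilt(1.0)      # tilt right: viewing direction rotates
nav.on_keypad("up")   # step forward along the new heading
print(round(nav.x, 3), round(nav.yaw, 2))
```

Separating the two input channels this way is one plausible reading of why the combined prototype worked better: discrete keypad presses give precise, repeatable movement, while continuous tilt handles the coarser task of looking around.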