Bimanual handheld mixed reality interfaces for urban planning

  • Authors:
  • Markus Sareika; Dieter Schmalstieg

  • Affiliations:
  • Graz University of Technology, Graz; Graz University of Technology, Graz

  • Venue:
  • Proceedings of the International Conference on Advanced Visual Interfaces
  • Year:
  • 2010


Abstract

Tabletop models are common in architectural and urban planning tasks. We report on an investigation of view navigation and manipulation in tracked tabletop models using a handheld Mixed Reality interface, targeted at a user group with varying professional backgrounds and skill levels. Users were asked to complete three basic task types in a mixed reality scene: searching, inserting, and creating content, each requiring the user to navigate the scene while interacting. The study was designed as a natural progression from classic problems such as travel, selection, and manipulation to an applied urban planning scenario. The novel bimanual interface configurations use a handheld touch screen display for Mixed Reality, with the camera/viewpoint either attached to the display or held separately. Usability aspects and user satisfaction are examined in a user study aimed at optimizing usability and supporting the user's intentions in a natural way. We present results from the user study showing significant differences in task completion times, as well as user preferences and practical issues concerning both interface and view navigation design.