Mobile Augmented Reality: Aroundplot: Focus+context interface for off-screen objects in 3D environments

  • Authors:
  • Hyungeun Jo; Sungjae Hwang; Hyunwoo Park; Jung-hee Ryu

  • Affiliations:
  • Graduate School of Culture Technology, KAIST, 335 Gwahak-ro, Yuseong-gu, Daejeon, Republic of Korea (all authors)

  • Venue:
  • Computers & Graphics
  • Year:
  • 2011

Abstract

When exploring a 3D environment from a first-person viewpoint, the narrow field of view makes it difficult to search for an off-screen object, a task that becomes even harder when the user is looking through the small screen of a mobile phone. This paper presents Aroundplot, a novel focus+context interface that provides multiple location cues for off-screen objects in an immersive 3D environment. One part of the technique is a mapping from 3D spherical coordinates to a 2D orthogonal fisheye, which addresses problems of existing 3D location cue displays such as occlusion among the cues and discordance with the human frame of reference. The other part is a dynamic magnification method that magnifies the context in the direction of view movement, alleviating the distortion of the orthogonal fisheye and thus supporting precise movement. In an evaluation, participants found the target object for a given location cue faster and more accurately with Aroundplot than with a top-down 2D radar. They were also more accurate with Aroundplot than with a 3D arrow cluster when the number of objects was large; however, accuracy with a small number of objects and search speed with any number of objects did not differ significantly.
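
The mapping described above keeps the in-view portion of the sphere undistorted on the screen while compressing every off-screen direction into a border around the screen edge. The sketch below shows one way such a focus+context mapping could be written; the function name, the numeric defaults, and the linear compression of the context region are illustrative assumptions, not the formulation published in the paper.

    import math

    def spherical_to_orthogonal_fisheye(azimuth_deg, elevation_deg,
                                        fov_h_deg=40.0, fov_v_deg=30.0,
                                        half_w=240.0, half_h=400.0,
                                        border=60.0):
        """Map an object direction (relative to the view direction) to a point
        in a focus+context layout: directions inside the field of view land on
        the inner screen rectangle, directions outside it are compressed into
        a fixed-width border (the context region) around the screen edge.

        All names and numeric defaults are assumptions for illustration only.
        """
        def map_axis(angle, half_fov, half_size):
            if abs(angle) <= half_fov:
                # Focus region: linear mapping of angle to screen coordinate.
                return (angle / half_fov) * half_size
            # Context region: compress the remaining angular range (up to 180
            # degrees behind the user) into the fixed border width.
            overflow = (abs(angle) - half_fov) / (180.0 - half_fov)
            return math.copysign(half_size + overflow * border, angle)

        x = map_axis(azimuth_deg, fov_h_deg / 2.0, half_w)
        y = map_axis(elevation_deg, fov_v_deg / 2.0, half_h)
        return x, y  # pixel offsets from the screen centre

    if __name__ == "__main__":
        # An object 90 degrees to the right and 10 degrees above the view
        # direction falls in the right-hand context border.
        print(spherical_to_orthogonal_fisheye(90.0, 10.0))

The paper's dynamic magnification would additionally widen the context band on the side toward which the view is moving; that behaviour is not modelled in this sketch.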