Modeling context in haptic perception, rendering, and visualization

  • Authors:
  • Kanav Kahol; Priyamvada Tripathi; Troy McDaniel; Laura Bratton; Sethuraman Panchanathan

  • Affiliations:
  • Arizona State University, Tempe, AZ (all authors)

  • Venue:
  • ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)
  • Year:
  • 2006

Abstract

Haptic perception refers to the human ability to perceive spatial properties through touch-based sensations. In haptics, contextual cues about the material, shape, size, texture, and weight of an object are perceived by individuals, leading to recognition of the object and its spatial features. In this paper, we present strategies and algorithms for modeling context in haptic applications that allow users to haptically explore objects in virtual reality/augmented reality environments. Initial results show significant improvements in the accuracy and efficiency of haptic perception in augmented reality environments when compared to conventional approaches that do not model context in haptic rendering.
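
The abstract does not spell out the rendering algorithms themselves, so the following is only a minimal Python sketch of what "modeling context" in haptic rendering could look like: contextual beliefs about an object's material are blended into the parameters (stiffness, friction, texture gain) that drive force feedback. The material values, the `MaterialContext` structure, and the belief-weighting scheme are all illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of context-conditioned haptic rendering.
# All material values and names here are illustrative assumptions,
# not the algorithm described in the paper.

from dataclasses import dataclass


@dataclass
class MaterialContext:
    """Contextual prior over an object's haptic material properties."""
    stiffness: float     # N/m, spring constant for penalty-based contact force
    friction: float      # dimensionless Coulomb friction coefficient
    texture_gain: float  # amplitude scaling for texture vibration cues


# Illustrative context library keyed by material label (assumed values).
CONTEXTS = {
    "wood":  MaterialContext(stiffness=800.0,  friction=0.45, texture_gain=0.6),
    "metal": MaterialContext(stiffness=2500.0, friction=0.20, texture_gain=0.3),
    "cloth": MaterialContext(stiffness=150.0,  friction=0.70, texture_gain=0.9),
}


def blend_contexts(beliefs: dict) -> MaterialContext:
    """Blend material contexts by current belief weights.

    `beliefs` maps material labels to nonnegative weights, standing in
    for whatever contextual inference the haptic system performs.
    """
    total = sum(beliefs.values())
    s = sum(w / total * CONTEXTS[m].stiffness for m, w in beliefs.items())
    f = sum(w / total * CONTEXTS[m].friction for m, w in beliefs.items())
    g = sum(w / total * CONTEXTS[m].texture_gain for m, w in beliefs.items())
    return MaterialContext(stiffness=s, friction=f, texture_gain=g)


def contact_force(ctx: MaterialContext, penetration_depth: float) -> float:
    """Penalty-based normal force: a linear spring scaled by context."""
    return ctx.stiffness * max(penetration_depth, 0.0)


if __name__ == "__main__":
    # As exploration proceeds, belief shifts toward "wood", and the
    # rendered stiffness (and thus the contact force) shifts with it.
    early = blend_contexts({"wood": 0.4, "metal": 0.4, "cloth": 0.2})
    late = blend_contexts({"wood": 0.9, "metal": 0.05, "cloth": 0.05})
    print(f"early stiffness: {early.stiffness:.0f} N/m")
    print(f"late stiffness:  {late.stiffness:.0f} N/m")
    print(f"force at 2 mm penetration: {contact_force(late, 0.002):.2f} N")
```

Under these assumptions, context acts as a prior that continuously reshapes the rendering parameters as the user's exploration refines the belief over materials, which is one plausible reading of how contextual modeling could improve perception accuracy over context-free rendering.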