Saliency-Driven Tactile Effect Authoring for Real-Time Visuotactile Feedback

  • Authors:
  • Myongchan Kim; Sungkil Lee; Seungmoon Choi

  • Affiliations:
  • Pohang University of Science and Technology, Korea; Sungkyunkwan University, Korea; Pohang University of Science and Technology, Korea

  • Venue:
  • EuroHaptics '12: Proceedings of the 2012 International Conference on Haptics: Perception, Devices, Mobility, and Communication - Volume Part I
  • Year:
  • 2012


Abstract

New-generation media such as 4D film have recently emerged to deliver immersive physical experiences, yet their authoring has relied on the manual work of content artists, impeding the popularization of such media. An automated authoring approach is therefore increasingly important for lowering production costs and reducing manual intervention. This paper presents a fully automated framework for authoring tactile effects from existing video, rendering synchronized visuotactile stimuli in real time. The spatiotemporal features of the video are analyzed in terms of visual saliency and translated into tactile cues that are rendered on tactors installed on a chair. A user study was conducted to evaluate the usability of visuotactile rendering against visual-only presentation. The results indicated that visuotactile rendering can make a movie more interesting, immersive, appealing, and understandable.
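
To make the saliency-to-tactor pipeline concrete, the sketch below illustrates the general idea of analyzing each frame for visual saliency and pooling the result into per-tactor intensities. It is a minimal illustration under stated assumptions, not the authors' actual method: the saliency model (OpenCV's spectral-residual detector), the 3x3 tactor grid, the mean-pooling mapping, and the send_to_tactors driver call are all placeholders for whatever the paper's framework and chair hardware actually use.

    # Minimal sketch: per-frame visual saliency pooled onto a tactor grid.
    # Assumptions (not from the paper): OpenCV spectral-residual saliency,
    # a 3x3 tactor layout, mean-pooled linear intensities in [0, 1].
    import cv2
    import numpy as np

    GRID_ROWS, GRID_COLS = 3, 3  # hypothetical tactor layout on the chair

    saliency = cv2.saliency.StaticSaliencySpectralResidual_create()

    def frame_to_tactor_intensities(frame: np.ndarray) -> np.ndarray:
        """Map one video frame to per-tactor vibration intensities."""
        ok, sal_map = saliency.computeSaliency(frame)
        if not ok:
            return np.zeros((GRID_ROWS, GRID_COLS))
        h, w = sal_map.shape
        # Crop to grid-aligned size, then average saliency per grid cell.
        cells = sal_map[: h - h % GRID_ROWS, : w - w % GRID_COLS]
        cells = cells.reshape(GRID_ROWS, h // GRID_ROWS,
                              GRID_COLS, w // GRID_COLS)
        return cells.mean(axis=(1, 3))

    cap = cv2.VideoCapture("movie.mp4")  # placeholder input video
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        intensities = frame_to_tactor_intensities(frame)
        # send_to_tactors(intensities)  # hypothetical tactor driver call
    cap.release()

In this sketch, each grid cell's mean saliency directly drives one tactor's amplitude; the paper's framework additionally handles temporal features and synchronization, which are omitted here for brevity.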