Haptically augmented remote speech communication: a study of user practices and experiences
Proceedings of the 7th Nordic Conference on Human-Computer Interaction: Making Sense Through Design
Haptic gestures and sensations conveyed through the sense of touch are currently unavailable in remote communication, for two main reasons: good-quality haptic technology has not been widely available, and knowledge of how to use it is limited. To address these challenges, we studied how users would like to create spatial haptic information by gesturing, and how well they managed to do so. Two scenario-based experiments were carried out: an observation study without technological limitations, and a study of gesturing with a functional prototype equipped with haptic actuators. The first study identified three distinct strategies for using the device; the most common gestures were shaking, smoothing and tapping. Participants requested multimodality both to establish the context of the communication and to aid the interpretation of haptic stimuli. The second study showed that users were able to exploit spatiality in haptic messages (e.g., a forward-backward gesture for agreement), although presenting more complex information via remote haptic communication remains challenging. The results offer guidance on which communication activities are suitable for spatial haptic communication and on how this form of communication could be realized in practice.