Today, mobile phone technology is mature enough to let us interact effectively with mobile phones using our three major senses: vision, hearing, and touch. Just as the camera adds interest and utility to the mobile experience, the vibration motor in a mobile phone opens a new way to improve the interactivity and usability of mobile phones. In this chapter, we show that by carefully controlling vibration patterns, more than one bit of information can be rendered with a vibration motor. We demonstrate how to turn a mobile phone into a social interface for the blind, so that they can sense the emotional information of others. Technical details are given on how to extract emotional information, design vibrotactile coding schemes, and render vibrotactile patterns, as well as how to carry out user tests to evaluate usability. Experimental studies and user tests have shown that users can receive and interpret more than one bit of emotional information. This demonstrates the potential to enrich communication among mobile phone users through the touch channel.
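The vibrotactile coding idea can be sketched as follows; this is a minimal illustration, not the chapter's actual scheme. The emotion classes, pulse durations, and off/on pattern convention are assumptions, modeled on the millisecond pattern arrays that typical mobile vibration APIs accept. With four distinguishable patterns, each rendered pattern carries two bits, i.e. more than one bit per vibration event:

```python
import math

# Hypothetical vibrotactile coding scheme: each emotion class maps to an
# alternating off/on pattern in milliseconds (off, on, off, on, ...), in the
# style of mobile vibration APIs. All names and durations are illustrative.
EMOTION_PATTERNS = {
    "neutral": [0, 200],                     # one short pulse
    "happy":   [0, 100, 100, 100],           # two quick pulses
    "sad":     [0, 600],                     # one long pulse
    "angry":   [0, 100, 50, 100, 50, 100],   # three sharp bursts
}

def bits_per_pattern(n_classes: int) -> float:
    """Information one pattern conveys, assuming equiprobable classes."""
    return math.log2(n_classes)

def pattern_duration_ms(pattern: list[int]) -> int:
    """Total playback time of an off/on pattern in milliseconds."""
    return sum(pattern)
```

Keeping the patterns short and clearly distinct is what the user tests would need to verify: `bits_per_pattern(len(EMOTION_PATTERNS))` gives 2.0 bits, and the "happy" pattern above plays for 300 ms in total.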