Improvements to a speech-enabled user assistance system based on pilot study results
SIGDOC '07: Proceedings of the 25th Annual ACM International Conference on Design of Communication
Non-speech sounds and haptics play an important role in enabling access to user assistance material in ubiquitous computing scenarios; in particular, they can cue assistance material that is then presented to users via speech. In this paper, we report on a study that examines user perception of the duration of the pause between a cue (a non-speech sound, a haptic stimulus, or a combined non-speech sound plus haptic stimulus) and the subsequent spoken delivery of the assistance material.
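The cue-then-speech sequence the abstract describes can be sketched as a simple timeline. This is a minimal illustration only, not the authors' experimental apparatus: the function name, the cue-type labels, and the default cue duration are all assumptions made for the example.

```python
def assistance_timeline(cue_type, pause_ms, cue_ms=200):
    """Return (event, onset_ms) pairs for one assistance presentation.

    cue_type: "sound", "haptic", or "sound+haptic" (labels assumed
              for illustration; the paper compares these cue conditions).
    pause_ms: the pause between cue offset and speech onset -- the
              variable whose perceived duration the study examines.
    cue_ms:   hypothetical cue duration; 200 ms is an arbitrary default.
    """
    if cue_type not in {"sound", "haptic", "sound+haptic"}:
        raise ValueError(f"unknown cue type: {cue_type!r}")
    # Cue starts at t=0; speech starts after the cue plus the pause.
    return [
        (f"cue:{cue_type}", 0),
        ("speech", cue_ms + pause_ms),
    ]
```

For example, `assistance_timeline("haptic", 500)` places speech onset at 700 ms, i.e. a 200 ms cue followed by a 500 ms pause.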