Multimodal error correction for speech user interfaces
ACM Transactions on Computer-Human Interaction (TOCHI)
Monitoring user distraction in a car by segmentation of experimental data
UAHCI'11 Proceedings of the 6th International Conference on Universal Access in Human-Computer Interaction: Context Diversity - Volume Part III
Dictating and editing short texts while driving: distraction and task completion
Proceedings of the 3rd International Conference on Automotive User Interfaces and Interactive Vehicular Applications
Impact of word error rate on driving performance while dictating short texts
Proceedings of the 4th International Conference on Automotive User Interfaces and Interactive Vehicular Applications
We describe a prototype dictation UI for use in cars and evaluate it by measuring (1) driver distraction, (2) task completion time, and (3) task completion quality. We use a simulated lane change test (LCT) to assess driving quality while using the prototype, while texting on a cell phone, and while just driving. The prototype was used in two modes: with and without a display (eyes-free). Several statistics were collected from the reference and distracted-driving LCT trips of a group of 11 test subjects, including the driver's mean deviation from the ideal path, the standard deviation of the driver's lateral position on the road, reaction times, and the amount and quality of the entered text. We confirm that driving performance was significantly better with the speech-enabled UI than while texting on a cell phone. Interestingly, we also measured a significant improvement in driving quality when the same dictation prototype was used in eyes-free mode.
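Two of the LCT driving-quality statistics named in the abstract (mean deviation from the ideal path and the standard deviation of lateral position) can be sketched as follows. This is a minimal illustration, assuming the driven and ideal paths are given as lists of lateral-position samples in metres; the variable names and sample data are hypothetical, not from the study.

```python
# Sketch of two LCT driving-quality metrics from lateral-position samples.
# The sampling setup and example values below are assumptions for illustration.
from statistics import mean, pstdev

def mean_deviation(actual, ideal):
    """Mean absolute deviation of the driven path from the ideal path (m)."""
    return mean(abs(a - i) for a, i in zip(actual, ideal))

def lateral_sd(actual):
    """Standard deviation of the driver's lateral position on the road (m)."""
    return pstdev(actual)

# Hypothetical lateral positions sampled along one LCT trip (metres):
ideal_path  = [0.0, 0.0, 3.5, 3.5, 0.0, 0.0]
driven_path = [0.1, 0.3, 3.0, 3.6, 0.4, 0.1]

print(round(mean_deviation(driven_path, ideal_path), 3))
print(round(lateral_sd(driven_path), 3))
```

Comparing these metrics between a reference trip (just driving) and a distracted trip quantifies the driving-quality degradation attributable to the secondary task.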