Some necessary background in speech recognition and window systems is given, with an analysis of how the two might be combined. Xspeak, a navigation application, is described, together with its operation and a field study of its use. With Xspeak, window navigation tasks usually performed with a mouse can instead be controlled by voice. An improved version, Xspeak II, which incorporates a language for translating spoken commands, is introduced.