Multimodal interaction combines input from multiple sensors, such as pointing devices or speech recognition systems, to achieve more fluid and natural interaction. Two-handed interaction has recently been used to enrich graphical interaction. Building applications that use such combined input requires new software techniques and frameworks. Supporting additional devices means that user interface toolkits must be more flexible with regard to input devices and event types. The possibility of parallel interactions must also be taken into account, with consequences for the structure of toolkits. Finally, frameworks must be provided for combining the events and status of several devices. This paper reports on the extensions we made to the direct manipulation interface toolkit Whizz in order to experiment with two-handed interaction. These extensions range from structural adaptations of the toolkit to new techniques for specifying the time-dependent fusion of events.
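The time-dependent fusion mentioned above can be illustrated with a minimal sketch (in Python; the `Event` class, `fuse` function, and device names are hypothetical and not taken from the paper): events from two different devices are paired into a single combined action only when their timestamps fall within a short time window.

```python
from dataclasses import dataclass

@dataclass
class Event:
    device: str       # e.g. "mouse" or "trackball" (illustrative names)
    action: str       # e.g. "press", "move"
    timestamp: float  # seconds

def fuse(events, window=0.1):
    """Pair events from two *different* devices whose timestamps lie
    within `window` seconds of each other. Unpaired events are left
    out; each event is used in at most one pair."""
    fused, used = [], set()
    for i, a in enumerate(events):
        if i in used:
            continue
        for j in range(i + 1, len(events)):
            if j in used:
                continue
            b = events[j]
            if a.device != b.device and abs(a.timestamp - b.timestamp) <= window:
                fused.append((a, b))
                used.update({i, j})
                break
    return fused

# Two near-simultaneous presses fuse; the later move stays unpaired.
stream = [
    Event("mouse", "press", 0.00),
    Event("trackball", "press", 0.03),
    Event("mouse", "move", 0.50),
]
pairs = fuse(stream)
```

In a real toolkit the window and pairing policy would depend on the interaction technique; this sketch only shows the core idea of conditioning fusion on temporal proximity.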