Current multitouch frameworks require application developers to write recognition code for custom gestures; this code is split across multiple event-handling callbacks. As the number of custom gestures grows it becomes increasingly difficult to 1) know if new gestures will conflict with existing gestures, and 2) know how to extend existing code to reliably recognize the complete gesture set. Proton is a novel framework that addresses both of these problems. Using Proton, the application developer declaratively specifies each gesture as a regular expression over a stream of touch events. Proton statically analyzes the set of gestures to report conflicts, and it automatically creates gesture recognizers for the entire set. To simplify the creation of complex multitouch gestures, Proton introduces gesture tablature, a graphical notation that concisely describes the sequencing of multiple interleaved touch actions over time. Proton contributes a graphical editor for authoring tablatures and automatically compiles tablatures into regular expressions. We present the architecture and implementation of Proton, along with three proof-of-concept applications. These applications demonstrate the expressiveness of the framework and show how Proton simplifies gesture definition and conflict resolution.
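To make the core idea concrete, here is a minimal sketch of declaring gestures as regular expressions over a tokenized touch-event stream. The encoding (tokens like `D1` for touch 1 down, `M1` for move, `U1` for up) and the helper `matches` are illustrative assumptions for this sketch, not Proton's actual syntax or API; Proton additionally performs static conflict analysis across the whole gesture set, which this fragment does not show.

```python
import re

# Assumed encoding (not Proton's real syntax): each touch event is one
# token -- D<i> = touch i down, M<i> = touch i move, U<i> = touch i up --
# and a gesture is a regular expression over the concatenated token stream.
TAP        = r"D1(M1)*U1"                   # one finger: down, optional moves, up
TWO_FINGER = r"D1(M1)*D2(M1|M2)*U1(M2)*U2"  # second finger joins; first lifts first

def matches(gesture_regex, events):
    """Return True if the event stream completes the gesture."""
    return re.fullmatch(gesture_regex, "".join(events)) is not None

stream = ["D1", "M1", "M1", "U1"]  # a single-finger tap with two moves
print(matches(TAP, stream))         # True: the stream completes TAP
print(matches(TWO_FINGER, stream))  # False: it is only a prefix of TWO_FINGER
```

Because both gestures share the prefix `D1(M1)*`, a recognizer cannot tell them apart until `U1` or `D2` arrives; detecting exactly this kind of overlap ahead of time is what the paper's static conflict analysis automates.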