Human body gestures can serve as input in a Natural User Interface, but they must be evaluated in order to improve their design. Such evaluation can be carried out with a quantitative model. To that end, this work begins the analysis of touchless hand gestures, basing the analysis on significant gesture attributes. From the large number of possible attributes we select the trajectory, and we propose to describe it using distance units and directions. We evaluate the proposal on two gesture data sets and find that it can be used to describe and quantify gesture trajectories. Moreover, we find that well-defined changes of direction (corners) reduce the speed at which a gesture is performed, so users need more time to execute gestures that contain them. The proposed method allows a trajectory to be quantified in a simple manner.
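The idea of describing a trajectory with distance units and directions can be illustrated with a minimal sketch. The encoding below is an assumption for illustration only (the paper's exact scheme is not given here): each segment between consecutive sample points is quantized to one of eight compass directions, Freeman-chain-code style, and its length is expressed in multiples of a chosen unit. Corners are then detectable as large jumps between consecutive direction codes.

```python
import math

def encode_trajectory(points, unit=1.0):
    """Encode a 2D trajectory as (direction, distance) pairs.

    Directions are quantized to 8 sectors of 45 degrees each
    (0 = east, counting counter-clockwise); distances are expressed
    in multiples of `unit`. This encoding is an illustrative
    assumption, not the paper's exact scheme.
    """
    encoded = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        direction = round(math.atan2(dy, dx) / (math.pi / 4)) % 8
        distance = math.hypot(dx, dy) / unit
        encoded.append((direction, distance))
    return encoded

def count_corners(encoded, min_turn=2):
    """Count well-defined direction changes: turns spanning at least
    `min_turn` sectors (>= 90 degrees by default)."""
    corners = 0
    for (d0, _), (d1, _) in zip(encoded, encoded[1:]):
        turn = min((d0 - d1) % 8, (d1 - d0) % 8)  # smallest angular step
        if turn >= min_turn:
            corners += 1
    return corners

# An L-shaped stroke: right 2 units, then up 2 units -> one corner.
stroke = [(0, 0), (2, 0), (2, 2)]
codes = encode_trajectory(stroke)
```

With such an encoding, the finding that corners slow gesture execution could be tested by correlating `count_corners` with measured completion times per gesture.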