In this paper we present two approaches for intuitive, interactive modelling of special object attributes using specific sensor hardware. After a brief overview of the state of the art in interactive, intuitive object modelling, we motivate the modelling task by deriving the different object attributes to be modelled from an analysis of important interactions with objects. As an example domain, we chose the setting of a service robot in a kitchen. Tasks from this domain were used to derive important basic actions, from which in turn the necessary object attributes were inferred. In the main section of the paper, two of the derived attributes are presented, each with an intuitive interactive modelling method. The object attributes to be modelled are stable object positions and movement restrictions for objects. Both of the intuitive interaction methods were evaluated with a group of test subjects, and the results are discussed. The paper ends with conclusions on the discussed results and a preview of future work in this area, in particular of potential applications.