The ability to guide human motion through automatically generated feedback has significant potential for applications such as motor learning, human-computer interaction, telepresence, and augmented reality. This paper approaches the design and development of such systems from a human cognition and perception perspective. We analyze the dimensions of the design space for motion guidance systems, spanned by technologies and human information processing, and identify opportunities for new feedback techniques. Based on these insights, we present a novel motion guidance system that provides feedback on position, direction, and continuous velocity. It uses motion capture to track the user in space and guides them through visual, vibrotactile, and pneumatic actuation. The system also introduces motion retargeting through time warping, motion dynamics, and prediction, allowing greater flexibility and adaptability to user performance.
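The abstract mentions retargeting a reference motion through time warping so that guidance can adapt to how fast the user actually moves. As a minimal illustrative sketch of that idea (the function name and the uniform linear-resampling approach are assumptions here, not the authors' implementation):

```python
import numpy as np

def time_warp(trajectory, speed_factor):
    """Resample a motion trajectory so it plays back at a different speed.

    trajectory: (n_frames, n_dof) array of poses sampled at uniform times.
    speed_factor: > 1 speeds the motion up, < 1 slows it down.
    """
    n_frames, n_dof = trajectory.shape
    # Original frame times, normalized to [0, 1].
    t_src = np.linspace(0.0, 1.0, n_frames)
    # Number of output frames after warping (more frames when slowing down).
    n_out = max(2, int(round(n_frames / speed_factor)))
    t_out = np.linspace(0.0, 1.0, n_out)
    # Linearly interpolate each degree of freedom at the new sample times.
    warped = np.stack([np.interp(t_out, t_src, trajectory[:, d])
                       for d in range(n_dof)], axis=1)
    return warped

# Example: a 100-frame, 2-DOF reference motion slowed to half speed,
# so a learner who moves slowly can still follow the guidance cues.
ref = np.stack([np.sin(np.linspace(0, np.pi, 100)),
                np.cos(np.linspace(0, np.pi, 100))], axis=1)
slow = time_warp(ref, speed_factor=0.5)
```

A real system would combine such warping with the dynamics and prediction components the abstract mentions, rather than applying a single global speed factor.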