Multi-Touch Interface for Character Motion Control Using Model-Based Approach
CW '13: Proceedings of the 2013 International Conference on Cyberworlds
In this paper, we propose an interactive motion control interface based on hand manipulation. Using both hands and fingers, the user can control a large number of degrees of freedom simultaneously. Our interface is inspired by the control mechanism of a real string puppet, which is suspended by about ten strings attached between different parts of the puppet's body and a handheld controller. The puppeteer can move the puppet in a variety of ways by holding and moving the controller with the right hand while manipulating the strings with the left hand. Our interface design follows this mechanism: the character's pelvis translation and head/body rotation are controlled by the user's right hand, the character's legs by the right-hand fingers, and the character's arms by the user's left hand and its fingers. We have developed a method to control a character's motion based on this interface design.

For our prototype, we used a depth-camera-based hand motion sensing device. Such a camera-based device obviates the need for the user to wear any equipment; however, this technology has some limitations in hand motion sensing, and our method is designed to work well even under these constraints. In this paper, we present our interface design and implementation, together with experimental results and a discussion thereof.
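The puppet-style mapping described in the abstract (right palm drives the pelvis and body, right-hand fingers drive the legs, left hand and fingers drive the arms) can be sketched as a simple per-frame transfer function. This is a minimal illustration only: all class names, fields, and the choice of which finger maps to which limb are hypothetical assumptions, not the paper's actual implementation, which depends on its model-based controller and depth-camera tracker.

```python
from dataclasses import dataclass, field

@dataclass
class HandPose:
    """One hand as reported by a (hypothetical) depth-camera tracker."""
    position: tuple        # (x, y, z) palm position
    rotation: tuple        # (roll, pitch, yaw) palm orientation
    finger_flexion: list   # per-finger flexion in [0, 1], thumb..pinky

@dataclass
class CharacterPose:
    """Control targets passed on to the character's motion controller."""
    pelvis_translation: tuple = (0.0, 0.0, 0.0)
    body_rotation: tuple = (0.0, 0.0, 0.0)
    leg_targets: list = field(default_factory=lambda: [0.0, 0.0])
    arm_targets: list = field(default_factory=lambda: [0.0, 0.0])

def map_hands_to_character(right: HandPose, left: HandPose,
                           scale: float = 1.0) -> CharacterPose:
    """Puppet-style mapping: right palm -> pelvis/body,
    right fingers -> legs, left hand fingers -> arms."""
    pose = CharacterPose()
    # Right palm position/orientation drive pelvis translation and body rotation.
    pose.pelvis_translation = tuple(scale * p for p in right.position)
    pose.body_rotation = right.rotation
    # Assumed mapping: right index/middle finger flexion as left/right leg targets.
    pose.leg_targets = [right.finger_flexion[1], right.finger_flexion[2]]
    # Assumed mapping: left index/middle finger flexion as left/right arm targets.
    pose.arm_targets = [left.finger_flexion[1], left.finger_flexion[2]]
    return pose
```

In a real system the resulting targets would be fed to an IK or physics-based controller each frame; here the mapping itself is the only point being illustrated.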