In this paper we present a method for allowing arbitrary physical objects to interact with virtual content in an augmented reality (AR) environment. A Microsoft Kinect is used to track objects in six degrees of freedom, enabling realistic interaction between them and virtual content in a tabletop AR context. We propose a point-cloud-based method for achieving such interaction. An adaptive per-pixel depth threshold is used to extract foreground objects, which are then grouped by connected-component analysis. Objects are tracked with a variant of the Iterative Closest Point (ICP) algorithm that uses randomised projective correspondences. Our algorithm tracks objects moving at typical tabletop speeds with median drifts of 8.5% (rotational) and 4.8% (translational). The point cloud representation of each foreground object is refined as additional views of the object become visible to the Kinect. Physics-based AR interaction is achieved by fitting a collection of spheres to the point cloud model and passing them to the Bullet physics engine as a physics proxy for the object. We demonstrate our method in an AR application in which the user interacts with a virtual tennis ball, illustrating our proposed method's potential for physics-based AR interaction.
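The first two stages of the pipeline described above can be sketched in a few lines. The following is a minimal illustrative sketch, not the paper's implementation: it assumes a background depth map captured at startup, treats a pixel as foreground when it is at least some margin closer to the sensor than that background (the `margin` value and all function names are our own, hypothetical choices), and then groups foreground pixels with a simple BFS connected-component labelling.

```python
import numpy as np
from collections import deque

def extract_foreground(depth, background, margin=30.0):
    """Per-pixel adaptive threshold: a pixel is foreground when it is
    at least `margin` depth units closer to the sensor than the
    background depth recorded for that pixel at startup.
    The margin value is illustrative, not taken from the paper."""
    valid = depth > 0  # the Kinect reports 0 for invalid measurements
    return valid & (depth < background - margin)

def label_components(mask, min_size=50):
    """4-connected component labelling by BFS flood fill; components
    smaller than `min_size` pixels are discarded as sensor noise
    (their labels are reset to 0, so label IDs may be non-contiguous)."""
    h, w = mask.shape
    labels = np.zeros((h, w), dtype=np.int32)
    next_label = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and labels[sy, sx] == 0:
                next_label += 1
                labels[sy, sx] = next_label
                queue = deque([(sy, sx)])
                pixels = [(sy, sx)]
                while queue:
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = next_label
                            queue.append((ny, nx))
                            pixels.append((ny, nx))
                if len(pixels) < min_size:
                    for y, x in pixels:
                        labels[y, x] = 0  # component too small: drop as noise
    return labels
```

Each surviving component's pixels can then be back-projected through the Kinect's intrinsics into a point cloud, which is what the ICP tracking and sphere-fitting stages consume. In production one would use an optimised labelling routine (e.g. `scipy.ndimage.label` or a contour-tracing algorithm) rather than this per-pixel Python loop.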