HoloWall: designing a finger, hand, body, and object sensitive wall
Proceedings of the 10th annual ACM symposium on User interface software and technology
A user interface using fingerprint recognition: holding commands and data objects on fingers
Proceedings of the 11th annual ACM symposium on User interface software and technology
DiamondTouch: a multi-user touch technology
Proceedings of the 14th annual ACM symposium on User interface software and technology
The smart floor: a mechanism for natural user identification and tracking
CHI '00 Extended Abstracts on Human Factors in Computing Systems
Distinctive Image Features from Scale-Invariant Keypoints
International Journal of Computer Vision
SIDES: a cooperative tabletop computer game for social skills development
CSCW '06 Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work
2D and 3D face recognition: A survey
Pattern Recognition Letters
Speeded-Up Robust Features (SURF)
Computer Vision and Image Understanding
SurfaceFusion: unobtrusive tracking of everyday objects in tangible user interfaces
GI '08 Proceedings of graphics interface 2008
The design and evaluation of multitouch marking menus
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The IR ring: authenticating users' touches on a multi-touch display
UIST '10 Proceedings of the 23rd annual ACM symposium on User interface software and technology
HandsDown: hand-contour-based user identification for interactive surfaces
Proceedings of the 6th Nordic Conference on Human-Computer Interaction: Extending Boundaries
IdWristbands: IR-based user identification on multi-touch surfaces
ACM International Conference on Interactive Tabletops and Surfaces
Medusa: a proximity-aware multi-touch tabletop
Proceedings of the 24th annual ACM symposium on User interface software and technology
Carpus: a non-intrusive user identification technique for interactive surfaces
Proceedings of the 25th annual ACM symposium on User interface software and technology
Magic finger: always-available input through finger instrumentation
Proceedings of the 25th annual ACM symposium on User interface software and technology
Personal clipboards for individual copy-and-paste on shared multi-user surfaces
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
GravitySpace: tracking users and their poses in a smart room using a pressure-sensing floor
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Fiberio: a touchscreen that senses fingerprints
Proceedings of the 26th annual ACM symposium on User interface software and technology
TransformTable: a self-actuated shape-changing digital table
Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces
To enable personalized functionality, such as logging tabletop activity per user, tabletop systems need to recognize users. DiamondTouch does so reliably, but requires users to stay in assigned seats and cannot recognize users across sessions. We propose a different approach based on distinguishing users' shoes. While users interact with the table, our system, Bootstrapper, observes their shoes using one or more depth cameras mounted at the edge of the table. It then identifies users by matching the camera images against a database of known shoe images. When multiple users interact, Bootstrapper associates touches with shoes based on hand orientation. The approach can be implemented with consumer depth cameras because (1) shoes offer large, distinct features such as color, and (2) shoes naturally align themselves with the ground, giving the system a well-defined perspective and thus reduced ambiguity. We report two simple studies in which Bootstrapper recognized participants from a database of 18 users with 95.8% accuracy.
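The matching step the abstract describes can be sketched as nearest-neighbor matching of local feature descriptors (e.g., the SIFT or SURF features cited above) against a per-user shoe database. The sketch below is hypothetical, not the authors' implementation: `identify_user` and its ratio-test voting scheme are illustrative assumptions, and in practice the descriptors would come from a feature extractor run on the depth/color camera images.

```python
import numpy as np

def identify_user(query_descriptors, database, ratio=0.75):
    """Hypothetical sketch: match a query shoe's feature descriptors
    (e.g., SIFT/SURF vectors) against a database of known shoes.

    database maps user name -> (n_i, d) array of stored descriptors.
    Each query descriptor votes for the user holding its nearest
    stored descriptor, accepted only if that distance is clearly
    smaller than the runner-up user's (a cross-user ratio test).
    The user with the most accepted votes is returned.
    """
    votes = {user: 0 for user in database}
    for q in query_descriptors:
        best_user, best, second = None, np.inf, np.inf
        for user, descs in database.items():
            d = np.linalg.norm(descs - q, axis=1).min()
            if d < best:
                best_user, second, best = user, best, d
            elif d < second:
                second = d
        # Accept only unambiguous matches (Lowe-style ratio test).
        if best_user is not None and best < ratio * second:
            votes[best_user] += 1
    return max(votes, key=votes.get)

# Toy usage with 2-D "descriptors" (real ones are 64/128-D):
database = {
    "user_a": np.array([[0.0, 0.0], [1.0, 0.0]]),
    "user_b": np.array([[10.0, 10.0], [11.0, 10.0]]),
}
query = np.array([[0.1, 0.0], [1.0, 0.1]])  # resembles user_a's shoe
print(identify_user(query, database))  # -> user_a
```

A production system would replace the exhaustive per-user scan with an approximate nearest-neighbor index, and would combine several camera frames before committing to an identity.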