Motion sensing is of fundamental importance for user interfaces and input devices. In applications where optical sensing is preferred, traditional camera-based approaches can be prohibitive due to limited resolution, low frame rates, and the computational power required for image processing. We introduce a novel set of motion-sensing configurations based on laser speckle sensing that are particularly suitable for human-computer interaction. The underlying principles allow these configurations to be fast, precise, extremely compact, and low cost. We provide an overview of and design guidelines for laser speckle sensing for user interaction, and introduce four general speckle projector/sensor configurations. We describe a set of prototypes and applications that demonstrate the versatility of our laser speckle sensing techniques.
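At the core of speckle-based motion sensing is a simple principle: as the sensor (or laser) translates, the quasi-random speckle intensity pattern translates across the detector, so displacement can be recovered by correlating successive frames. The sketch below is not the paper's implementation; it is a minimal illustrative example (pure Python, 1-D frames, integer shifts, synthetic data standing in for real speckle intensities) of displacement estimation by maximizing normalized cross-correlation.

```python
import random

def estimate_displacement(prev, curr, max_shift):
    """Return the integer displacement d (|d| <= max_shift) that best
    explains curr[i] ~= prev[i + d], by maximizing the cross-correlation
    over the overlapping samples of the two 1-D intensity frames."""
    n = len(prev)
    best_d, best_score = 0, float("-inf")
    for d in range(-max_shift, max_shift + 1):
        lo, hi = max(0, -d), min(n, n - d)   # valid overlap for this shift
        # Mean product over the overlap, so shorter overlaps compare fairly.
        score = sum(curr[i] * prev[i + d] for i in range(lo, hi)) / (hi - lo)
        if score > best_score:
            best_d, best_score = d, score
    return best_d

# Synthetic stand-in for a speckle pattern: random intensities, then a
# laterally shifted crop of the same pattern as the "next" sensor frame.
random.seed(42)
pattern = [random.random() for _ in range(260)]
true_shift = 7
frame_a = pattern[:200]
frame_b = pattern[true_shift:200 + true_shift]  # sensor moved 7 samples

print(estimate_displacement(frame_a, frame_b, max_shift=20))  # → 7
```

Real speckle sensors apply the same idea at very high frame rates on tiny linear or 2-D detectors; because the speckle pattern is statistically unique, a short correlation window suffices, which is what makes such compact, low-cost configurations possible.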