We present Minput, a sensing and input method that enables intuitive and accurate interaction on very small devices -- ones too small for practical touchscreen use and with limited space to accommodate physical buttons. We achieve this by incorporating two inexpensive, high-precision optical sensors (like those found in optical mice) into the underside of the device. This allows the entire device, rather than the screen, to be used as the input mechanism, avoiding occlusion by the fingers. In addition to x/y translation, our system also captures twisting motion, enabling many interaction opportunities typically found only in larger and far more complex systems.
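The abstract does not give the fusion math, but two rigid-mounted displacement sensors are enough to recover both translation and twist under a small-angle planar-motion model: the translation is the mean of the two displacement reports, and the twist is the differential motion divided by the sensor baseline. The sketch below illustrates this; the function name, the assumption that the sensors lie along the device's x-axis, and the small-angle approximation are ours, not from the paper.

```python
import math

def rigid_motion(left, right, baseline):
    """Estimate planar device motion from two optical-sensor reports.

    left, right -- (dx, dy) displacement reported by each sensor,
                   assumed mounted 'baseline' units apart along the
                   device's x-axis (a modeling assumption, not the
                   paper's stated geometry).
    Returns (tx, ty, theta): translation of the device center and
    twist in radians, using a small-angle rigid-motion approximation.
    """
    # Translation: both sensors see the common motion, so average them.
    tx = (left[0] + right[0]) / 2.0
    ty = (left[1] + right[1]) / 2.0
    # Twist: a rotation moves the two sensors in opposite y-directions;
    # the differential over the baseline approximates the angle.
    theta = (right[1] - left[1]) / baseline
    return tx, ty, theta
```

For example, a pure slide gives identical reports at both sensors and zero twist, while equal-and-opposite y-motion at the two sensors reads as a pure twist about the device center.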