We present TapSense, an enhancement to touch interaction that allows conventional surfaces to identify the type of object being used for input. This is achieved by segmenting and classifying the sounds resulting from an object's impact. For example, the diverse anatomy of the human finger allows different parts to be recognized, including the tip, pad, nail, and knuckle, without having to instrument the user. This opens several new and powerful interaction opportunities for touch input, especially on mobile devices, where input is extremely constrained. Our system can also identify different sets of passive tools. We conclude with a comprehensive investigation of classification accuracy and training implications. Results show our proof-of-concept system can support sets of four input types at around 95% accuracy. Small but useful input sets of two (e.g., pen versus finger discrimination) can operate in excess of 99% accuracy.
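As a rough illustration of the pipeline the abstract describes (segment the impact sound, extract features, classify), the sketch below segments a tap by an amplitude threshold, uses the log-magnitude spectrum as a feature vector, and classifies with a simple nearest-centroid classifier. This is not the authors' implementation; the thresholds, window size, and classifier choice are all illustrative assumptions, and the synthetic "pad" and "nail" taps stand in for real microphone recordings.

```python
import numpy as np

def segment_tap(signal, threshold=0.1, window=256):
    """Return a fixed-length window starting where the amplitude
    first crosses the threshold (a crude onset detector)."""
    onset = int(np.argmax(np.abs(signal) > threshold))
    return signal[onset:onset + window]

def spectral_features(segment):
    """Log-magnitude spectrum as a fixed-length feature vector."""
    return np.log1p(np.abs(np.fft.rfft(segment)))

class NearestCentroid:
    """Minimal classifier: label of the closest class-mean feature vector."""
    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {
            lbl: np.mean([x for x, l in zip(X, y) if l == lbl], axis=0)
            for lbl in self.labels
        }
        return self

    def predict(self, x):
        return min(self.labels,
                   key=lambda lbl: np.linalg.norm(x - self.centroids[lbl]))

def make_tap(freq, rng, sr=8000, n=1024, onset=100):
    """Synthetic stand-in for a recorded tap: a damped sinusoid
    (soft 'pad' taps ring low, hard 'nail' taps ring high) plus noise."""
    sig = np.zeros(n)
    t = np.arange(n - onset) / sr
    sig[onset:] = np.exp(-30 * t) * np.sin(2 * np.pi * freq * t)
    return sig + 0.01 * rng.standard_normal(n)

rng = np.random.default_rng(0)
train_X, train_y = [], []
for _ in range(5):
    train_X.append(spectral_features(segment_tap(make_tap(200, rng))))
    train_y.append("pad")
    train_X.append(spectral_features(segment_tap(make_tap(2000, rng))))
    train_y.append("nail")

clf = NearestCentroid().fit(train_X, train_y)
print(clf.predict(spectral_features(segment_tap(make_tap(200, rng)))))
print(clf.predict(spectral_features(segment_tap(make_tap(2000, rng)))))
```

In practice a system like the one described would use a richer feature set and a trained classifier (e.g., an SVM), but the same segment/featurize/classify structure applies.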