When interacting with wall-sized, high-resolution tiled displays, users typically stand or move in front of the display rather than sit at fixed locations. Using a mouse in this context is inconvenient: it must be carried around and often requires a surface. Even with devices that work in mid-air, accuracy suffers when hitting small or distant targets. Ideally, the user should need no device at all to interact with applications on the display wall. We have developed a hybrid vision- and sound-based system for device-free interaction with software running on a 7×4-tile, 220-inch display wall. The system comprises three components that together enable interaction with both distal and proximal targets: (i) a camera determines the direction in which a user is pointing, allowing distal targets to be selected; the direction is found by edge detection followed by a Hough transform. (ii) Four microphones detect and locate a user double-snapping their fingers, after which the selected target is moved to the location of the snap; this is implemented using cross-correlation and multilateration. (iii) Sixteen cameras detect objects (fingers, hands) in front of the display wall; the 1D positions of detected objects are used to triangulate object positions, enabling touch-free multi-point interaction with proximal content. The system is used on the display wall in three contexts: (i) moving and interacting with windows in a traditional desktop interface, (ii) interacting with a whiteboard-style application, and (iii) playing two games.
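To illustrate the first component, the classic Hough transform votes each edge pixel into a (ρ, θ) accumulator, where a line is represented as x·cos θ + y·sin θ = ρ; the accumulator's peak gives the dominant line, from which a pointing direction can be read off. The sketch below is a minimal, generic implementation on a binary edge image, not the paper's actual code; the function name `hough_lines` and the accumulator resolution are illustrative choices.

```python
import math
import numpy as np

def hough_lines(edges, n_theta=180):
    """Vote each edge pixel of a binary image into a (rho, theta)
    accumulator and return the dominant line as (rho, theta).

    rho is quantized to whole pixels; theta is sampled in [0, pi).
    """
    h, w = edges.shape
    diag = int(math.hypot(h, w))           # max possible |rho|
    thetas = np.linspace(0.0, math.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(edges)             # coordinates of edge pixels
    for x, y in zip(xs, ys):
        for t_idx, theta in enumerate(thetas):
            # Offset rho by diag so negative values index the array.
            rho = int(round(x * math.cos(theta) + y * math.sin(theta))) + diag
            acc[rho, t_idx] += 1
    rho_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
    return rho_idx - diag, thetas[t_idx]
```

For example, an edge image containing only the horizontal row y = 5 yields ρ = 5 and θ ≈ π/2. In practice the edge image would come from an edge detector (e.g. Canny) applied to the camera frame.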
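The second component combines two standard techniques: cross-correlation estimates how much later the snap arrives at one microphone than another, and multilateration turns those time differences of arrival (TDOAs) into a position. The sketch below is a simplified 2D illustration under assumed names (`estimate_delay`, `locate_snap`) and an assumed speed of sound; the paper's implementation is not shown here, and a real system would solve the multilateration algebraically rather than by grid search.

```python
import math
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed at room temperature

def estimate_delay(sig_a, sig_b, rate):
    """Delay (seconds) of sig_b relative to sig_a, found as the peak
    of their full cross-correlation."""
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag = int(corr.argmax()) - (len(sig_a) - 1)   # lag in samples
    return lag / rate

def locate_snap(mics, tdoas, x_range, y_range, step=0.05):
    """Grid-search 2D multilateration: return the point whose predicted
    TDOAs (relative to mics[0]) best match the measured ones."""
    best, best_err = None, float("inf")
    x = x_range[0]
    while x <= x_range[1]:
        y = y_range[0]
        while y <= y_range[1]:
            d = [math.hypot(x - mx, y - my) for mx, my in mics]
            pred = [(d[i] - d[0]) / SPEED_OF_SOUND
                    for i in range(1, len(mics))]
            err = sum((p - t) ** 2 for p, t in zip(pred, tdoas))
            if err < best_err:
                best, best_err = (x, y), err
            y += step
        x += step
    return best
```

With four microphones placed at the corners of the wall, three TDOAs (each remaining microphone relative to the first) suffice to pin down the snap position in the wall plane.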
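For the third component, each camera observing the plane in front of the wall reduces a detected object to a 1D position, which corresponds to a bearing angle from that camera; two such bearings can be intersected to recover the object's 2D position. The following is a minimal sketch of that ray-intersection step under an assumed name (`intersect_rays`); the full system fuses sixteen cameras and must also match detections across cameras, which is not shown.

```python
import math

def intersect_rays(c1, theta1, c2, theta2):
    """Intersect two bearing rays c_i + t_i * (cos theta_i, sin theta_i).

    Solves t1*d1 - t2*d2 = c2 - c1 by Cramer's rule; returns the 2D
    intersection point, or None if the rays are (nearly) parallel.
    """
    a, c = math.cos(theta1), math.sin(theta1)
    b, d = -math.cos(theta2), -math.sin(theta2)
    ex, ey = c2[0] - c1[0], c2[1] - c1[1]
    det = a * d - b * c
    if abs(det) < 1e-12:
        return None                      # bearings do not intersect usefully
    t1 = (ex * d - b * ey) / det
    return (c1[0] + t1 * math.cos(theta1),
            c1[1] + t1 * math.sin(theta1))
```

For instance, cameras at (0, 0) and (4, 0) seeing an object at bearings atan2(1, 2) and atan2(1, −2) respectively place it at (2, 1). With many camera pairs, the per-pair intersections can be averaged or fed to a least-squares fit for robustness.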