In recent years, augmented reality (AR) systems have become quite common. Many marker-based AR systems can capture the positions of physical markers and overlay computer-generated graphics on the real world. However, few systems can recognize additional information such as fingertip motions. The objective of our study is to create AR environments in which users can manipulate virtual objects with natural finger motions. Toward this end, we propose a novel marker-based AR system called "ARForce." ARForce measures the 3D positions of markers as well as the distribution of force vectors applied by the user. Using this system, users can manipulate virtual objects with various finger motions. The proposed system comprises a camera and an input device. The input device is an elastic body containing two types of markers. One is a square marker that allows the system to detect the position of the device. The others are small circular markers embedded within the elastic body; these markers shift when the user applies force to the device, which enables force detection.
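The force-sensing idea described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes a simple linear (Hooke's-law) mapping from the in-plane displacement of each tracked circular marker to a 2D force vector, whereas the actual device would require a calibrated elastic model. The stiffness constant and function name here are hypothetical.

```python
from typing import List, Tuple

# Hypothetical stiffness constant (force units per pixel of marker
# displacement); a real system would calibrate this per device.
STIFFNESS = 0.05

def force_vectors(rest: List[Tuple[float, float]],
                  observed: List[Tuple[float, float]],
                  stiffness: float = STIFFNESS) -> List[Tuple[float, float]]:
    """Approximate the 2D force at each circular marker.

    Each marker's displacement from its rest position (in the camera
    image, after compensating for device pose via the square marker)
    is scaled linearly to a force vector.
    """
    return [((ox - rx) * stiffness, (oy - ry) * stiffness)
            for (rx, ry), (ox, oy) in zip(rest, observed)]

# Example: one marker pushed 10 px to the right yields a small
# rightward force vector.
forces = force_vectors(rest=[(0.0, 0.0)], observed=[(10.0, 0.0)])
```

Aggregating these per-marker vectors over the marker grid yields the force distribution over the device surface.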