Researchers in robot vision have access to several excellent image-processing packages (e.g., Khoros, Vista, Susan, MIL, and XVision, to name only a few) as a base for any new vision software needed in most navigation and recognition tasks. Our work in autonomous robot control and human-robot interaction, however, has demanded a new level of run-time flexibility and performance: on-the-fly configuration of visual routines that exploit up-to-the-second context from the task, image, and environment. The result is Gargoyle: an extendible, on-board, real-time vision software package that allows a robot to configure, parameterize, and execute image-processing pipelines at run-time. Each operator in a pipeline works at a level of resolution and over regions of interest that are computed by upstream operators or set by the robot according to task constraints. Pipeline configurations and operator parameters can be stored as a library of visual methods appropriate for different sensing tasks and environmental conditions. Beyond this, a robot may reason about the current task and environmental constraints to construct novel visual routines that are too specialized to work under general conditions, but that are well-suited to the immediate environment and task. We use the RAP reactive plan-execution system to select and configure pre-compiled processing pipelines, and to modify them for specific constraints determined at run-time.
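The pipeline model described above can be illustrated with a minimal sketch. This is not the Gargoyle API; all names (`Frame`, `Pipeline`, the example operators, the `"coarse-center"` method) are hypothetical, and the "image" is a placeholder. The sketch shows the two ideas the abstract emphasizes: each operator can adjust the resolution level and region of interest seen by downstream operators, and a named pipeline configuration can be stored in a library of visual methods.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical sketch of a Gargoyle-style configurable vision pipeline.
# Names and structure are illustrative assumptions, not the actual system.

@dataclass
class Frame:
    image: list            # placeholder for pixel data
    roi: tuple             # (x, y, w, h) region of interest
    scale: int = 1         # resolution level, coarsened by upstream operators

Operator = Callable[[Frame], Frame]

class Pipeline:
    """A sequence of operators; each may narrow the ROI or change the
    resolution level for the operators downstream of it."""
    def __init__(self, operators: List[Operator]):
        self.operators = list(operators)

    def run(self, frame: Frame) -> Frame:
        for op in self.operators:
            frame = op(frame)
        return frame

# Example operators: one coarsens resolution, one narrows the ROI so that
# later (more expensive) operators only touch the center of the image.
def downsample(frame: Frame) -> Frame:
    return Frame(frame.image, frame.roi, frame.scale * 2)

def focus_center(frame: Frame) -> Frame:
    x, y, w, h = frame.roi
    return Frame(frame.image, (x + w // 4, y + h // 4, w // 2, h // 2),
                 frame.scale)

# A stored "visual method" is just a named pipeline configuration that a
# task-level executive (the RAP system, in Gargoyle's case) could select
# and re-parameterize at run-time.
methods = {"coarse-center": Pipeline([downsample, focus_center])}

result = methods["coarse-center"].run(Frame(image=[], roi=(0, 0, 640, 480)))
print(result.roi, result.scale)   # (160, 120, 320, 240) 2
```

The design choice worth noting is that operators communicate context (ROI, resolution) through the data flowing between them rather than through global state, which is what makes run-time reconfiguration of the pipeline cheap.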