Integrating Vision-Based Behaviors with an Autonomous Robot

Christian Schlegel, Jörg Illmann, Heiko Jaberg, Matthias Schuster, and Robert Wörz
Research Institute for Applied Knowledge Processing (FAW)



Abstract

Although many different vision algorithms and systems have been developed, their integration into the complex intelligent control architecture of a mobile robot remains in most cases an open problem. In this paper, we describe the integration of different vision-based behaviors into our architecture for sensorimotor systems. This raises new questions and requires the consideration of significant constraints that are often not in the main focus of vision research but nonetheless play a major role in the overall success. By means of different scenarios, such as person tracking, searching for different objects, and achieving different object configurations within stock areas, the structure of the vision system and its interaction with the overall architecture are explained. The interaction of vision-based modules with the task-level control and the symbolic world model is an especially important topic. The architecture is successfully used on different mobile robots in natural indoor environments.

Keywords: robotics, autonomous robots, system architecture, vision integration, object recognition, person following.