Active Vision During Coordinated Head/Eye Movements in a Humanoid Robot
While looking at a point in the scene, humans continually perform smooth eye movements to compensate for involuntary head rotations. Since the optical nodal points of the eyes do not lie on the head's rotation axes, this behavior yields useful 3-D information in the form of visual parallax. Here, we describe the replication of this behavior in a humanoid robot. We have developed a method for egocentric distance estimation based on the parallax that emerges during compensatory head/eye movements. This method was tested on a robotic platform equipped with an anthropomorphic neck and two binocular pan-tilt units specifically designed to reproduce the visual input signals experienced by humans. We show that this approach yields accurate and robust estimation of egocentric distance in the space near the agent. These results provide a further demonstration of how behavior facilitates the solution of complex perceptual problems.
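The geometric core of the method can be illustrated with a small sketch. Because the eye's nodal point sits at some offset from the head's rotation axis, a head rotation translates the nodal point, creating a short baseline; the extra eye rotation needed to keep the target foveated (beyond the pure rotational compensation) is the parallax angle, and distance follows as baseline over parallax. The function below is a simplified illustration under small-angle assumptions, not the authors' actual algorithm, and all names and parameters are hypothetical.

```python
import math

def estimate_distance(head_rotation_rad, eye_offset_m, parallax_rad):
    """Illustrative parallax-based distance estimate (sketch only).

    head_rotation_rad : head rotation about the neck axis (radians)
    eye_offset_m      : distance from the head rotation axis to the
                        eye's optical nodal point (meters)
    parallax_rad      : residual eye rotation needed to keep the target
                        foveated, beyond pure rotational compensation
    """
    # A small head rotation translates the nodal point by roughly r * theta,
    # creating a short stereo-like baseline.
    baseline = eye_offset_m * head_rotation_rad
    # For a target much farther than the baseline, the parallax angle is
    # approximately baseline / distance, so distance ~ baseline / parallax.
    return baseline / parallax_rad


# Simulated check: eye 0.1 m in front of the neck axis, target 0.5 m
# from the eye, head rotates by 0.02 rad.
r, D, theta = 0.1, 0.5, 0.02
eye_x, eye_y = r * math.sin(theta), r * math.cos(theta)   # new nodal point
# Residual angle to the target after perfect rotational compensation:
parallax = math.atan2(eye_x, D + r - eye_y)
print(estimate_distance(theta, r, parallax))  # close to 0.5
```

Note that accuracy degrades with distance: the parallax angle shrinks as the target recedes, which is consistent with the abstract's claim that the method works well in the space near the agent.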