Visually stimulated motor control for a robot with a pair of LGMD visual neural networks
In this paper, we propose a visually stimulated motor control (VSMC) system for the autonomous navigation of mobile robots. Inspired by a motion-sensitive interneuron in the locust, the lobula giant movement detector (LGMD), the presented VSMC system enables a robot to explore local paths or interact with dynamic objects effectively using visual input only. The VSMC system consists of a pair of LGMD visual neural networks and a simple motor command generator. Each LGMD processes images covering part of the wide field of view and extracts the relevant visual cues. The outputs of the two LGMDs are compared and translated directly into executable motor commands. These commands are then executed by the robot's wheel control system in real time to produce the corresponding motion adjustments. Our experiments showed that this bio-inspired VSMC system worked well in different scenarios.
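The control loop described above can be sketched in a few lines of code. The sketch below is illustrative only: it replaces the full multi-layer LGMD model with a simple thresholded frame-difference as a stand-in for each LGMD's excitation, and the mapping from the two excitations to differential wheel speeds (function names, `gain`, and `base_speed` included) is a hypothetical example of how the comparison could drive the wheels, not the paper's exact scheme.

```python
import numpy as np

def lgmd_excitation(prev_frame, curr_frame, threshold=0.1):
    """Simplified LGMD-like response: sum of above-threshold luminance
    changes between consecutive frames (a stand-in for the full
    layered LGMD network described in the paper)."""
    diff = np.abs(curr_frame - prev_frame)
    return float(np.sum(diff[diff > threshold]))

def vsmc_wheel_commands(prev_frame, curr_frame, base_speed=1.0, gain=0.5):
    """Split the view into left/right halves, feed each half to one
    LGMD, and map the normalized difference of the two excitations to
    differential wheel speeds, so the robot steers away from the side
    with stronger visual stimulation (hypothetical mapping)."""
    w = curr_frame.shape[1] // 2
    left = lgmd_excitation(prev_frame[:, :w], curr_frame[:, :w])
    right = lgmd_excitation(prev_frame[:, w:], curr_frame[:, w:])
    total = left + right
    if total == 0:
        # no salient motion on either side: drive straight ahead
        return base_speed, base_speed
    # positive turn term -> speed up left wheel, i.e. turn right,
    # away from stronger stimulation on the left side
    turn = gain * (left - right) / total
    return base_speed + turn, base_speed - turn
```

For example, a stimulus appearing only in the left half of the frame raises the left LGMD's excitation, which speeds up the left wheel and slows the right one, turning the robot away from the stimulated side.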