Multimodal control of reaching: simulating the role of tactile feedback
By the onset of reaching, young infants are already able to keep track of the position of their hand by using visual feedback from the target and proprioceptive feedback from the arm. How is this multimodal coordination achieved? We propose that infants learn to coordinate vision and proprioception by using tactile feedback from the target. To evaluate this hypothesis, we employ an evolution-based learning algorithm as a proxy for trial-and-error sensorimotor development in young infants. A series of simulation studies illustrates how touch: 1) helps coordinate vision and proprioception; 2) facilitates an efficient reaching strategy; and 3) promotes intermodal recalibration when the coordination is perturbed. We present two developmental predictions generated by the model and discuss the relative importance of visual and tactile feedback while learning to reach.
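To make the idea concrete, the following is a minimal, hypothetical sketch of the kind of trial-and-error evolutionary search the abstract describes, not the authors' actual model. The two-link planar arm, the touch radius, and the (1+1)-evolution strategy are all illustrative assumptions: reaching is scored by visual error (distance from hand to target), and a tactile bonus is granted only when the hand actually contacts the target.

```python
import math
import random

def forward_kinematics(angles, lengths=(1.0, 1.0)):
    """Proprioceptive model: hand position from the joint angles
    of a planar two-link arm (an illustrative stand-in for the infant arm)."""
    a1, a2 = angles
    l1, l2 = lengths
    x = l1 * math.cos(a1) + l2 * math.cos(a1 + a2)
    y = l1 * math.sin(a1) + l2 * math.sin(a1 + a2)
    return x, y

def fitness(angles, target, touch_radius=0.05):
    """Visual error (negative distance to target) plus a tactile bonus:
    touch fires only when the hand is within touch_radius of the target."""
    hx, hy = forward_kinematics(angles)
    dist = math.hypot(hx - target[0], hy - target[1])
    touched = dist < touch_radius
    return -dist + (1.0 if touched else 0.0), touched

def evolve(target, generations=2000, sigma=0.1, seed=0):
    """(1+1)-evolution strategy as a proxy for trial-and-error learning:
    mutate the joint angles and keep the mutation only if fitness improves."""
    rng = random.Random(seed)
    best = [rng.uniform(-math.pi, math.pi) for _ in range(2)]
    best_fit, _ = fitness(best, target)
    for _ in range(generations):
        child = [a + rng.gauss(0, sigma) for a in best]
        child_fit, _ = fitness(child, target)
        if child_fit >= best_fit:
            best, best_fit = child, child_fit
    return best, best_fit

if __name__ == "__main__":
    angles, fit = evolve(target=(1.2, 0.8))
    print(forward_kinematics(angles), fit)
```

In this toy setting the tactile bonus creates a sharp fitness jump at contact, so mutations that achieve touch are strongly retained, which is one simple way a binary touch signal could anchor the calibration between the visual target location and the proprioceptively sensed arm configuration.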