A Neural Model for Coordinated Control of Horizontal and Vertical Alignment of the Eyes in Three-Dimensional Space
Description
The ability of a real robot system to interact with the surrounding environment is subordinate to its ability to perceive it correctly. Since the perception of depth directly relates to the ability to look at the same point in space with both cameras, i.e. to verge on a target object, it is essential for the system to gain a proper fixational posture. The vector disparity pattern that arises from the vergent geometry is the source of information used to actively control the alignment of the optical axes, both horizontally and vertically. The proposed control strategy resorts to a computational substrate of modeled V1 binocular cells that provide a distributed representation of disparity information. The cells' responses are directly exploited to obtain a signal able to nullify the binocular disparity in a foveal region, in terms of both its horizontal and vertical components, and in this way to ensure the intersection of the optical axes. The robustness and flexibility of the distributed representation prove instrumental in coping with the vertical disparity that arises in the different mechanical systems adopted in real robot heads. Experimental tests in a simulated environment demonstrated that the vergence control executes effective vergence movements on a visual stimulus placed in the peripersonal space, regardless of the geometry of the mechanical system. The eye posture resulting from a combined horizontal and vertical control of vergence movements allows us to reduce the search zone in the perifoveal area, and eventually to obtain a better perception of the 3D structure of the environment from binocular disparity.
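The core of the strategy above is a servo loop that drives the horizontal and vertical disparity in a foveal window toward zero. The following is a minimal sketch of that idea, not the paper's actual implementation: it assumes the horizontal and vertical disparity maps have already been estimated (e.g. from the distributed V1-like cell population), and the window radius, gain, and function names are illustrative choices.

```python
import numpy as np

def foveal_mean(disparity, fovea_radius):
    """Average a disparity component over a circular foveal window
    centered on the image. `disparity` is a 2-D array in pixels."""
    h, w = disparity.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= fovea_radius ** 2
    return disparity[mask].mean()

def vergence_command(dh, dv, fovea_radius=8, gain=0.5):
    """Proportional commands that reduce residual foveal disparity.

    dh, dv : horizontal and vertical disparity maps (pixels).
    Returns (delta_vergence, delta_tilt) updates to the eye posture;
    the negative sign steers each axis against the measured error,
    so the optical axes converge on the fixated target.
    """
    return (-gain * foveal_mean(dh, fovea_radius),
            -gain * foveal_mean(dv, fovea_radius))

# Hypothetical example: a uniform 2-pixel horizontal disparity and no
# vertical disparity yield a pure horizontal vergence correction.
dh = np.full((64, 64), 2.0)
dv = np.zeros((64, 64))
delta_vergence, delta_tilt = vergence_command(dh, dv)
print(delta_vergence, delta_tilt)  # -1.0 0.0
```

Iterating this update, with the disparity maps re-estimated after each eye movement, corresponds to the closed-loop behavior described above: as the foveal disparity shrinks, so do the commands, and the posture settles with the optical axes intersecting on the target.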
Additional details
- URL
- http://hdl.handle.net/11567/391911
- URN
- urn:oai:iris.unige.it:11567/391911
- Origin repository
- UNIGE