Stereo Vision from Color Images Based on Competitive and Cooperative Neural Networks
Saved in:
Published in: | Shisutemu Seigyo Jouhou Gakkai rombunshi Control and Information Engineers, 2004/10/15, Vol.17(10), pp.435-443 |
---|---|
Main authors: | , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | Stereo vision is a passive method of obtaining depth information about visible surfaces by measuring the disparity of corresponding points in the left and right stereo images. Stereo matching, which belongs to the class of ill-posed problems, is the key issue in stereo vision. In the literature, almost all matching techniques use only gray-scale images. Color information is usually neglected, yet it can undoubtedly help robot stereo systems perform better. In this paper, we use a two-layered self-organizing neural network model to simulate the competitive and cooperative interaction of binocular neurons. We propose a special similarity function as the initial input in order to make full use of the color information. In RGB color space, the similarity map is established by taking the logical AND of the red, green, and blue color-value similarities. In the stereo matching experiments, we first consider a color random-dot stereogram for stereo correspondence. We then carry out real-image experiments with different color stereo matching approaches. The experimental results show that both the quality and the convergence speed of stereo matching can be efficiently improved by using an appropriate color matching algorithm, compared with the conventional gray-value algorithm. |
---|---|
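The abstract's core idea, a similarity map built by taking the logical AND of per-channel color-value similarities in RGB space, can be sketched as follows. This is an illustrative interpretation only: the threshold `tau`, the function name, and the per-channel "similar if absolute difference below threshold" criterion are assumptions, not details taken from the paper.

```python
import numpy as np

def rgb_similarity_map(left, right, disparity, tau=10):
    """Hypothetical sketch of an RGB AND-similarity map.

    left, right : H x W x 3 uint8 color stereo images.
    disparity   : candidate integer horizontal shift to test.
    tau         : assumed per-channel similarity threshold.
    Returns an H x W boolean map that is True where all three
    channels agree under the candidate disparity.
    """
    # Shift the right image horizontally by the candidate disparity.
    shifted = np.roll(right, disparity, axis=1)
    # A pixel pair is 'similar' in one channel if the absolute
    # value difference is below tau (threshold is an assumption).
    diff = np.abs(left.astype(int) - shifted.astype(int))
    channel_sim = diff < tau
    # Logical AND across R, G, B: all three channels must agree.
    return np.logical_and.reduce(channel_sim, axis=2)
```

In a competitive-cooperative network of the kind the paper describes, such a map would serve as the initial input for each candidate disparity, after which the two-layer dynamics select a consistent disparity per pixel.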
ISSN: | 1342-5668 2185-811X |
DOI: | 10.5687/iscie.17.435 |