
3D reconstruction and classification of natural environments by an autonomous vehicle using multi-baseline stereo

Reina G.
2014-01-01

Abstract

In natural outdoor settings, advanced perception systems and learning strategies are a major requirement for an autonomous vehicle to sense and understand the surrounding environment, recognizing artificial and natural structures, topology, vegetation, and drivable paths. Stereo vision has been used extensively for this purpose; however, conventional single-baseline stereo does not scale well to different depths of perception. In this paper, a multi-baseline stereo frame is introduced to perform accurate 3D scene reconstruction from near range up to several meters away from the vehicle. A classifier that segments the scene into navigable and non-navigable areas based on 3D data is also described. It incorporates geometric features within an online self-learning framework to model and identify traversable ground, without any a priori assumption on the terrain characteristics. The ground model is automatically retrained during robot motion, thus ensuring adaptation to environmental changes. The proposed strategy is of general applicability for robot perception and can be implemented with any range sensor; here, it is demonstrated on stereo data acquired by the multi-baseline device. Experimental tests, carried out in a rural environment with an off-road vehicle, are presented. It is shown that the use of a multi-baseline stereo frame allows for accurate reconstruction and scene segmentation over a wide range of visible distances, thus increasing the overall flexibility and reliability of the perception system. © 2014 Springer-Verlag Berlin Heidelberg.
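The motivation for multiple baselines can be illustrated with the standard stereo geometry (this sketch is not taken from the paper; the focal length, baselines, and disparity error below are assumed values for illustration only). Depth is Z = f·B/d, so depth uncertainty grows roughly as ΔZ ≈ Z²·Δd/(f·B): a longer baseline B reduces error at long range, while a shorter baseline keeps nearby points within matchable disparity limits.

```python
# Illustrative sketch of stereo depth uncertainty vs. baseline.
# All numeric values (focal length, baselines, disparity error) are
# assumptions for illustration, not parameters from the paper.

def depth_error(z, focal_px, baseline_m, disparity_err_px=0.5):
    """Approximate depth uncertainty dZ = z^2 * disparity_err / (f * B)."""
    return (z ** 2) * disparity_err_px / (focal_px * baseline_m)

f_px = 700.0  # assumed focal length in pixels
for z in (2.0, 10.0, 20.0):
    short = depth_error(z, f_px, baseline_m=0.12)  # short baseline: 12 cm
    wide = depth_error(z, f_px, baseline_m=0.50)   # wide baseline: 50 cm
    print(f"range {z:4.1f} m: short-baseline err {short:.3f} m, "
          f"wide-baseline err {wide:.3f} m")
```

At 20 m the wide baseline cuts the depth error by roughly a factor of four relative to the short one, which is why a single fixed baseline cannot serve both near-field and far-field reconstruction well.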
2014
3D reconstruction and classification of natural environments by an autonomous vehicle using multi-baseline stereo / Milella, A.; Reina, G.. - In: INTELLIGENT SERVICE ROBOTICS. - ISSN 1861-2776. - STAMPA. - 7:2(2014), pp. 79-92. [10.1007/s11370-014-0146-x]
Files in this record:
No files are associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11589/238756
Citations
  • Scopus: 23
  • Web of Science: 22