Environment-aware Sensor Fusion using Deep Learning

Author(s):
Silva, Caio Fischer; Borges, Paulo V. K.; Castanho, Jose E. C.; Gusikhin, O.; Madani, K.; Zaytoon, J.
Total Authors: 6
Document type: Journal article
Source: ICINCO 2019: Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics, Vol. 2, 9 pp., 2019.
Abstract

A reliable perception pipeline is crucial to the operation of a safe and efficient autonomous vehicle. Fusing information from multiple sensors has become a common practice to increase robustness, given that different types of sensors have distinct sensing characteristics. Furthermore, a sensor's performance can vary with the operating environment. Most systems rely on a rigid sensor fusion strategy that considers only the sensor inputs (e.g., signals and corresponding covariances), without incorporating the influence of the environment, which often causes poor performance in mixed scenarios. In our approach, the sensor fusion strategy is adjusted according to a classification of the scene around the vehicle. A convolutional neural network is employed to classify the environment, and this classification is used to select the best sensor configuration accordingly. We present experiments with a full-size autonomous vehicle operating in a heterogeneous environment. The results illustrate the applicability of the method, with enhanced odometry estimation when compared to a rigid sensor fusion scheme. (AU)
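The abstract outlines the core mechanism: a CNN classifies the environment around the vehicle, and the resulting label selects the sensor fusion configuration used for odometry. The sketch below illustrates that idea only; it is not the paper's implementation, and the environment labels, network architecture, and covariance values are hypothetical placeholders (PyTorch assumed).

# Minimal sketch of environment-aware fusion (illustrative, not the authors' code).
# The environment classifier picks the measurement covariances used by a simple
# covariance-weighted fusion of two position estimates.
import numpy as np
import torch
import torch.nn as nn

ENV_CLASSES = ["open_outdoor", "vegetated", "indoor_warehouse"]  # hypothetical labels

class EnvClassifier(nn.Module):
    """Tiny CNN mapping a camera frame to an environment class (placeholder sizes)."""
    def __init__(self, num_classes=len(ENV_CLASSES)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):
        z = self.features(x).flatten(1)
        return self.head(z)

# Per-environment sensor weighting: larger covariance = less trust.
# Values are placeholders (e.g., GPS heavily downweighted indoors).
FUSION_CONFIG = {
    "open_outdoor":     {"gps_cov": 0.5, "visual_odom_cov": 2.0},
    "vegetated":        {"gps_cov": 3.0, "visual_odom_cov": 1.0},
    "indoor_warehouse": {"gps_cov": 1e6, "visual_odom_cov": 0.5},
}

def select_fusion_config(model, frame):
    """Classify the current camera frame and return the matching sensor configuration."""
    with torch.no_grad():
        logits = model(frame.unsqueeze(0))           # frame: (3, H, W) tensor
        env = ENV_CLASSES[int(logits.argmax(dim=1))]
    return env, FUSION_CONFIG[env]

def fuse(gps_pos, vo_pos, cfg):
    """Toy fusion step: covariance-weighted average of two position estimates."""
    w_gps = 1.0 / cfg["gps_cov"]
    w_vo = 1.0 / cfg["visual_odom_cov"]
    return (w_gps * gps_pos + w_vo * vo_pos) / (w_gps + w_vo)

if __name__ == "__main__":
    model = EnvClassifier().eval()
    frame = torch.rand(3, 120, 160)                  # stand-in camera image
    env, cfg = select_fusion_config(model, frame)
    fused = fuse(np.array([1.0, 2.0]), np.array([1.2, 1.9]), cfg)
    print(env, cfg, fused)

In a full system the fusion step would be an EKF or similar filter, with the selected covariances injected into its measurement models; the weighted average above merely stands in for that stage.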

FAPESP's process: 18/02122-0 - Convolutional neural networks in environment-aware sensor fusion
Grantee: Caio Fischer Silva
Support Opportunities: Scholarships abroad - Research Internship - Scientific Initiation