
Fusion Drive: End-to-End Multi Modal Sensor Fusion for Guided Low-Cost Autonomous Vehicle

Title
Fusion Drive: End-to-End Multi Modal Sensor Fusion for Guided Low-Cost Autonomous Vehicle
Author
Il Hong Suh (서일홍)
Keywords
Navigation; Robot sensing systems; Autonomous vehicles; Neural networks; Task analysis; Sensor fusion; Feature extraction
Issue Date
2020-06
Publisher
Korea Robotics Society (한국로봇학회)
Citation
2020 17th International Conference on Ubiquitous Robots
Abstract
In this paper, we present a supervised learning-based mixed-input sensor fusion neural network, referred to as Fusion Drive, for autonomous navigation on a designed track. The proposed method combines RGB images and LiDAR laser sensor data for guided navigation along the track and for avoidance of both learned and previously unobserved obstacles on a low-cost embedded navigation system. The network merges separate CNN-based processing of each sensor into a single fused network that learns throttle and steering-angle labels end-to-end, and it outputs navigation commands that reproduce the behavior learned from human demonstrations. Experiments on a validation dataset and in a real environment exhibit the desired behavior, and the recorded performance shows an improvement over similar approaches.
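The abstract describes a two-branch architecture: a CNN processes the camera image, a separate branch processes the LiDAR scan, and the fused features regress steering and throttle. Below is a minimal PyTorch sketch of that idea; the layer sizes, the 1D-convolutional LiDAR branch, and the input shapes (a 120x160 RGB frame and a 360-beam scan) are illustrative assumptions, not the configuration published in the paper.

# Minimal sketch of the two-branch fusion idea described in the abstract.
# All layer sizes and input shapes are illustrative assumptions.
import torch
import torch.nn as nn


class FusionDriveSketch(nn.Module):
    def __init__(self):
        super().__init__()
        # Image branch: small CNN over the RGB camera frame.
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d((4, 4)),
            nn.Flatten(),                      # -> 64 * 4 * 4 = 1024 features
        )
        # LiDAR branch: 1D convolutions over the laser range scan.
        self.lidar_branch = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
            nn.Flatten(),                      # -> 32 * 8 = 256 features
        )
        # Fusion head: concatenated features -> [steering, throttle].
        self.head = nn.Sequential(
            nn.Linear(1024 + 256, 128), nn.ReLU(),
            nn.Linear(128, 2),
        )

    def forward(self, image: torch.Tensor, scan: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(image)    # image: (B, 3, H, W)
        lidar_feat = self.lidar_branch(scan)   # scan:  (B, 1, num_beams)
        return self.head(torch.cat([img_feat, lidar_feat], dim=1))


if __name__ == "__main__":
    model = FusionDriveSketch()
    image = torch.randn(1, 3, 120, 160)        # dummy RGB frame
    scan = torch.randn(1, 1, 360)              # dummy laser scan
    print(model(image, scan).shape)            # torch.Size([1, 2])

Training such a sketch would follow the supervised scheme outlined in the abstract, e.g. regressing the two outputs against recorded human steering and throttle commands with a mean-squared-error loss.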
URI
https://ieeexplore.ieee.org/document/9144707
https://repository.hanyang.ac.kr/handle/20.500.11754/167324
ISBN
978-1-7281-5715-3
ISSN
2325-033X
DOI
10.1109/UR49135.2020.9144707
Appears in Collections:
COLLEGE OF ENGINEERING[S](공과대학) > ELECTRONIC ENGINEERING(융합전자공학부) > Articles
Files in This Item:
There are no files associated with this item.