Advanced driver assistance systems (ADAS) play an important role in protecting people from vehicle collisions. The collision warning system is a key part of ADAS, guarding against accidents caused by fatigue, drowsiness, and other human errors. Multiple sensors, such as cameras, radar, and light detection and ranging (LiDAR), are widely used in ADAS for environment perception. When fusing two such sensors, the relative orientation and translation between them must be taken into account. In this paper, we discuss a real-time collision warning system that uses a 2D LiDAR and a camera for environment perception and estimates the distance (depth) and angle of obstacles. We propose a fusion of the camera and the 2D LiDAR to obtain the distance and angle of an obstacle in front of the vehicle, implemented on an Nvidia Jetson Nano using the Robot Operating System (ROS). A calibration process between the camera and the 2D LiDAR is therefore required, which is presented in Section III. Integration and testing are then carried out using static and dynamic scenarios in the relevant environment. For the fusion, we convert the LiDAR measurements from degree (polar) form into Cartesian coordinates. Based on the experiments, we obtained an average error of 0.197 meters.
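The degree-to-coordinate conversion used in the fusion amounts to transforming each LiDAR range/bearing pair into Cartesian coordinates that can be related to the camera frame. The following is a minimal sketch of that conversion, assuming hypothetical function names and placeholder extrinsic offsets; the actual offsets would come from the camera–LiDAR calibration described in Section III.

```python
import math

# Placeholder extrinsic translation (meters) between the 2D LiDAR and the
# camera frame; the real values are obtained from the calibration in
# Section III (assumed here for illustration only).
LIDAR_TO_CAMERA_X = 0.0   # forward offset (assumed)
LIDAR_TO_CAMERA_Y = 0.0   # lateral offset (assumed)

def polar_to_cartesian(distance_m, angle_deg):
    """Convert a single 2D LiDAR reading (range in meters, bearing in
    degrees) into Cartesian coordinates in the LiDAR frame."""
    angle_rad = math.radians(angle_deg)
    x = distance_m * math.cos(angle_rad)  # forward
    y = distance_m * math.sin(angle_rad)  # lateral
    return x, y

def lidar_point_in_camera_frame(distance_m, angle_deg):
    """Apply the (assumed) translation from the LiDAR frame to the camera
    frame so the point can be associated with detections in the image."""
    x, y = polar_to_cartesian(distance_m, angle_deg)
    return x + LIDAR_TO_CAMERA_X, y + LIDAR_TO_CAMERA_Y

if __name__ == "__main__":
    # Example: an obstacle reported 3.0 m away at a bearing of 10 degrees.
    print(lidar_point_in_camera_frame(3.0, 10.0))
```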