Accurate Alignment Inspection System for Low-resolution Automotive and Mobility LiDAR

24 Aug 2020 · Seontake Oh, Ji-Hwan You, Azim Eskandarian, Young-Keun Kim

A LiDAR misalignment of only a few degrees can produce significant errors in obstacle detection and mapping, leading to safety and quality issues. In this paper, an accurate inspection system is proposed for estimating the LiDAR alignment error after the sensor is attached to a mobility system such as a vehicle or robot. The proposed method uses only a single target board at a fixed position to estimate the three orientations (roll, tilt, and yaw) and the horizontal position of the LiDAR attachment with sub-degree and millimeter-level accuracy. After the proposed preprocessing steps, the feature beam points closest to each target corner are extracted and used to calculate the sensor attachment pose with respect to the target board frame via a nonlinear optimization method at a low computational cost. The performance of the proposed method is evaluated on a test bench that can control the reference yaw and horizontal translation of the LiDAR within ranges of 3 degrees and 30 millimeters, respectively. The experimental results for a low-resolution 16-channel LiDAR (Velodyne VLP-16) confirm that the misalignment can be estimated to within 0.2 degrees and 4 mm. The high accuracy and simplicity of the proposed system make it practical for large-scale industrial applications, such as automobile or robot manufacturing processes that inspect sensor attachment for safety quality control.
