
October 2018, Vol. 29, Issue 5, pp. 204-214
Abstract


Since LIDAR enables precise measurement, it is advantageous for accurate information acquisition and for realizing high-resolution 3D images, and it has therefore become essential to autonomous navigation systems, which must acquire and judge accurate peripheral information without user intervention. Recently, as autonomous navigation systems employing LIDAR have been deployed in human living spaces, such systems must both address the eye-safety problem and make reliable judgments through accurate obstacle recognition in various environments. In this paper, we construct a single-shot LIDAR system (SSLs) using a 1550-nm eye-safe light source, and report a method and results for analyzing LIDAR signals under various measurement environments, reflective materials, and material angles. Using a 5% Al reflector and a building wall located at a distance of 25 m, we analyze the signals of materials with different reflectance under indoor, daytime, and nighttime conditions. In addition, considering real obstacles oriented at various angles, we analyze how the signal changes with the angle of the reflective material. This signal analysis makes it possible to confirm the correlation between the measurement environment, the reflection conditions, and the LIDAR signal, by using the SNR, which indicates the reliability of the received information, and the timing jitter, which is an index of the accuracy of the distance information.
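The two quality metrics named in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' measurement code: it simulates repeated single-shot return pulses from a target at 25 m (as in the paper), then estimates SNR from the peak amplitude versus the pre-pulse noise floor, timing jitter from the shot-to-shot spread of detected arrival times, and range from the mean time of flight. The sampling rate, pulse width, and noise level are assumed values for illustration only.

```python
import numpy as np

C = 299_792_458.0  # speed of light (m/s)
rng = np.random.default_rng(0)

# --- Simulated single-shot return pulses (all parameters hypothetical) ---
fs = 5e9                          # sampling rate, 5 GS/s (assumed)
t = np.arange(0.0, 400e-9, 1 / fs)  # 400 ns record
target_range = 25.0               # target distance, 25 m as in the paper
tof = 2 * target_range / C        # round-trip time of flight (~166.8 ns)

def return_pulse(amplitude, noise_sigma):
    """One noisy Gaussian return pulse centered at the time of flight."""
    pulse = amplitude * np.exp(-0.5 * ((t - tof) / 2e-9) ** 2)
    return pulse + rng.normal(0.0, noise_sigma, t.size)

shots = np.array([return_pulse(1.0, 0.05) for _ in range(200)])

# --- SNR: mean peak amplitude over the baseline noise floor ---
peak = shots.max(axis=1).mean()
noise = shots[:, : t.size // 4].std()   # pre-pulse samples as noise estimate
snr_db = 20 * np.log10(peak / noise)

# --- Timing jitter: spread of detected arrival times across shots ---
arrival = t[shots.argmax(axis=1)]       # simple peak-detection timing
jitter_ps = arrival.std() * 1e12

# --- Range recovered from the mean time of flight ---
range_m = C * arrival.mean() / 2
```

A real receiver would time-stamp pulses with a discriminator or centroid fit rather than a bare argmax, but the relationship shown here holds: lower reflectance or stronger ambient light lowers the SNR, which in turn widens the arrival-time spread (timing jitter) and degrades range accuracy.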

References
  1. S. Laible, Y. N. Khan, K. Bohlmann, and A. Zell, “3D LIDAR- and camera-based terrain classification under different lighting conditions,” Autonomous Mobile Systems 2012, 21-29 (2012). doi: 10.1007/978-3-642-32217-4_3
  2. R. Li, J. Liu, L. Zhang, and Y. Hang, “LIDAR/MEMS IMU integrated navigation (SLAM) method for a small UAV in indoor environments,” in Proc. Inertial Sensors and Systems Symposium (Germany, Sept. 2014), pp. 1-15. doi: 10.1109/InertialSensors.2014.7049479
  3. G. D. Choi, M. H. Han, M. H. Song, H. S. Seo, C. Y. Kim, S. C. Hong, and B. K. Mheen, “Development trends and expectation of three-dimensional imager based on LIDAR technology for autonomous smart car navigation,” Electronics and Telecommunications Trend, 86-97 (2016).
  4. K. H. An, S. W. Lee, W. Y. Han, and J. C. Son, “Technology trends of self-driving vehicles,” Electronics and Telecommunications Trend, 35-44 (2013).
  5. J. Kim, K. H. Kwak, and K. S. Bae, “Experimental analysis and internal calibration of the 3D LIDAR reflectivity,” J. Inst. Control Robot Syst. 23, 574-582 (2017). doi: 10.5302/J.ICROS.2017.17.0086
  6. S. Chen, D. Liu, W. Zhang, L. You, Y. He, W. Zhang, X. Yang, G. Wu, M. Ren, H. Zeng, Z. Wang, X. Xie, and M. Jiang, “Time-of-flight laser ranging and imaging at 1550 nm using low-jitter superconducting nanowire single-photon detection system,” Appl. Opt. 52, 3241-3245 (2013). doi: 10.1364/AO.52.003241
  7. Q. Zhu, L. Chen, Q. Li, M. Li, A. Nuchter, and J. Wang, “3D LIDAR point cloud based intersection recognition for autonomous driving,” in Proc. Intelligent Vehicles Symposium (Spain, Jun. 2012), pp. 456-461. doi: 10.1109/IVS.2012.6232219
  8. Y. Y. Markushin, G. S. Pati, and R. Tripathi, “Multi-pulse detection technique to improve the timing/range resolution in a scanning LADAR system,” Proc. SPIE 9342, 934201-934208 (2015).
  9. K. Atkinson, “Modelling a road using spline interpolation,” Reports on Computational Mathematics 145, 1-17 (2002).
  10. H. Wang, J. K. Kearney, and K. Atkinson, “Robust and efficient computation of the closest point on a spline curve,” in Proc. 5th International Conference on Curves and Surfaces (2002), pp. 397-406.
  11. R. Chandrasekhar, Fundamentals of Photonics (Society of Photo-optical Instrumentation Engineers, SPIE, 2008).
  12. M. Pfennigbauer and A. Ullrich, “Multi-wavelength airborne laser scanning,” in Proc. International Lidar Mapping Forum (2011).
  13. C. Y. Park, D. B. Kim, C. H. Kim, Y. J. Kwon, and E. C. Kang, “Wideband receiver module for LADAR using large area InGaAs avalanche photodiode,” Korean J. Opt. Photon. 24, 1-8 (2013). doi: 10.3807/KJOP.2013.24.1.001
  14. S. Schiemann, W. Hogervorst, and W. Ubachs, “Fourier-transform-limited laser pulse tunable in wavelength and in duration (400-2000 ps),” IEEE J. Quantum Electron. 34, 407-412 (1998). doi: 10.1109/3.661446
  15. M. Wollenhaupt, A. Assion, and T. Baumert, Femtosecond Laser Pulses: Linear Properties, Manipulation, Generation and Measurement (Springer Handbook of Lasers and Optics, 2007), pp. 937-983.
  16. W. Kai, “A study of cubic spline interpolation,” InSight: River Acad. J. 9, 1-15 (2013).
  17. J. R. Janesick, K. P. Klaasen, and T. Elliott, “Charge-coupled-device charge-collection efficiency and photon-transfer technique,” Opt. Eng. 26, 972-980 (1987). doi: 10.1117/12.7974183
  18. G. David, “Characterizing digital cameras with the photon transfer curve,” Summit Imaging, <http://www.couriertronics.com/docs/notes/cameras_application_notes/Photon_Transfer_Curve_Charactrization_Method.pdf>, Accessed: 26 June 2012.
Information
  • Publisher: Optical Society of Korea
  • Publisher (Ko): 한국광학회
  • Journal Title: Korean Journal of Optics and Photonics
  • Journal Title (Ko): 한국광학회지
  • Volume: 29
  • No.: 5
  • Pages: 204-214
  • Received Date: 2018. 02. 28
  • Accepted Date: 2018. 10. 05