
Heterogeneous Multisensor Fusion for Mobile Platform Three-Dimensional Pose Estimation

Author and Article Information
Hanieh Deilamsalehy

Department of Electrical and
Computer Engineering,
Michigan Technological University,
Houghton, MI 49931
e-mail: hdeilams@mtu.edu

Timothy C. Havens

William and Gloria Jackson Associate Professor
Department of Electrical and
Computer Engineering;
Department of Computer Science,
Michigan Technological University,
Houghton, MI 49931
e-mail: thavens@mtu.edu

Joshua Manela

Department of Electrical and
Computer Engineering,
Michigan Technological University,
Houghton, MI 49931
e-mail: jmmanela@mtu.edu

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received July 12, 2016; final manuscript received November 29, 2016; published online April 18, 2017. Assoc. Editor: Davide Spinello.

J. Dyn. Sys., Meas., Control 139(7), 071002 (Apr 18, 2017) (8 pages); Paper No. DS-16-1346; doi: 10.1115/1.4035452

Precise, robust, and consistent localization is an important subject in many areas of science, such as vision-based control, path planning, and simultaneous localization and mapping (SLAM). To estimate the pose of a platform, sensors such as inertial measurement units (IMUs), the global positioning system (GPS), and cameras are commonly employed. Each of these sensors has its strengths and weaknesses. Sensor fusion is a known approach that combines the data measured by different sensors to achieve a more accurate or complete pose estimate and to cope with sensor outages. In this paper, a three-dimensional (3D) pose estimation algorithm is presented for an unmanned aerial vehicle (UAV) in an unknown, GPS-denied environment. A UAV can be fully localized by three position coordinates and three orientation angles. The proposed algorithm fuses the data from an IMU, a camera, and a two-dimensional (2D) light detection and ranging (LiDAR) sensor using an extended Kalman filter (EKF) to achieve accurate localization. Among the employed sensors, LiDAR has not received proper attention in the past, mostly because a 2D LiDAR can only provide pose estimation in its scanning plane and thus cannot obtain a full pose estimate in a 3D environment. A novel method is introduced in this paper that employs a 2D LiDAR to improve the accuracy of the full 3D pose estimate acquired from an IMU and a camera, and it is shown that this method can significantly improve the precision of the localization algorithm. The proposed approach is evaluated and validated by simulation and real-world experiments.
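To make the fusion step concrete, the sketch below shows the generic predict/update structure of an EKF that propagates a pose estimate with IMU data and corrects it with camera and 2D-LiDAR measurements. This is a minimal illustration of the technique named in the abstract, not the authors' implementation: the six-element pose state, the linear process model, the measurement matrices, and all noise values are simplifying assumptions chosen for brevity (a real system would integrate accelerometer and gyroscope readings and handle orientation on the rotation manifold).

```python
# Illustrative EKF skeleton for IMU/camera/2D-LiDAR pose fusion.
# A sketch of the general technique only; state layout, models, and
# noise values are assumptions, not the paper's actual design.
import numpy as np

class PoseEKF:
    """State x = [px, py, pz, roll, pitch, yaw] (meters, radians)."""

    def __init__(self):
        self.x = np.zeros(6)            # pose estimate
        self.P = np.eye(6) * 1e-2       # state covariance
        self.Q = np.eye(6) * 1e-4       # process noise (assumed)

    def predict(self, imu_delta):
        # IMU propagation: a simple additive motion increment stands in
        # for full inertial integration, so the Jacobian F is identity.
        F = np.eye(6)
        self.x = self.x + imu_delta
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z, H, R):
        # Standard EKF measurement update for a linear(ized) model z = Hx.
        y = z - H @ self.x                   # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

ekf = PoseEKF()
ekf.predict(imu_delta=np.array([0.1, 0, 0, 0, 0, 0.01]))

# Camera: observes the full 6-DOF pose (e.g., from visual odometry).
H_cam = np.eye(6)
ekf.update(z=np.array([0.11, 0, 0, 0, 0, 0.012]),
           H=H_cam, R=np.eye(6) * 1e-3)

# 2D LiDAR: scan matching constrains only the in-plane states
# (x, y, yaw). It cannot localize alone in 3D, but the partial
# measurement still tightens the full estimate.
H_lidar = np.zeros((3, 6))
H_lidar[0, 0] = H_lidar[1, 1] = H_lidar[2, 5] = 1.0
ekf.update(z=np.array([0.10, 0, 0.011]),
           H=H_lidar, R=np.eye(3) * 1e-4)
print(ekf.x)
```

The LiDAR update reflects the paper's key observation: a 2D scanner constrains only a subset of the state, yet fusing that partial measurement into the filter still improves the precision of the full 3D pose estimate.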

Copyright © 2017 by ASME


Figures

Fig. 1: Different sensor fusion approaches for pose estimation

Fig. 2: UAV principal axes and orientation geometry. Forward is indicated by the dot.

Fig. 3: UAV-simulated paths: (a) simulated path 1, (b) simulated path 1 true position, (c) simulated path 1 true orientation, (d) simulated path 2, (e) simulated path 2 true position, and (f) simulated path 2 true orientation

Fig. 4: Path 1 errors using only camera (solid plots) and using camera and LiDAR (dotted plots): (a) position error comparison (cm) and (b) attitude error comparison (rad)

Fig. 5: Path 2 errors using only camera (solid plots) and using camera and LiDAR (dotted plots): (a) position error comparison (cm) and (b) attitude error comparison (rad)

Fig. 6: Sensor platform: (a) sensors mounted on the UAV, (b) front view, and (c) top view

Fig. 7: Sensor platform trajectory: (a) sensor platform path, (b) true position, and (c) true orientation
