Research Papers

Heterogeneous Multisensor Fusion for Mobile Platform Three-Dimensional Pose Estimation

Author and Article Information
Hanieh Deilamsalehy

Department of Electrical and
Computer Engineering,
Michigan Technological University,
Houghton, MI 49931
e-mail: hdeilams@mtu.edu

Timothy C. Havens

William and Gloria Jackson Associate Professor
Department of Electrical and
Computer Engineering;
Department of Computer Science,
Michigan Technological University,
Houghton, MI 49931
e-mail: thavens@mtu.edu

Joshua Manela

Department of Electrical and
Computer Engineering,
Michigan Technological University,
Houghton, MI 49931
e-mail: jmmanela@mtu.edu

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received July 12, 2016; final manuscript received November 29, 2016; published online April 18, 2017. Assoc. Editor: Davide Spinello.

J. Dyn. Sys., Meas., Control 139(7), 071002 (Apr 18, 2017) (8 pages) Paper No: DS-16-1346; doi: 10.1115/1.4035452 History: Received July 12, 2016; Revised November 29, 2016

Precise, robust, and consistent localization is an important subject in many areas of science such as vision-based control, path planning, and simultaneous localization and mapping (SLAM). To estimate the pose of a platform, sensors such as inertial measurement units (IMUs), global positioning system (GPS), and cameras are commonly employed. Each of these sensors has its strengths and weaknesses. Sensor fusion is a known approach that combines the data measured by different sensors to achieve a more accurate or complete pose estimate and to cope with sensor outages. In this paper, a three-dimensional (3D) pose estimation algorithm is presented for an unmanned aerial vehicle (UAV) in an unknown GPS-denied environment. A UAV can be fully localized by three position coordinates and three orientation angles. The proposed algorithm fuses the data from an IMU, a camera, and a two-dimensional (2D) light detection and ranging (LiDAR) sensor using an extended Kalman filter (EKF) to achieve accurate localization. Among the employed sensors, LiDAR has not received proper attention in the past, mostly because a 2D LiDAR can only provide pose estimation in its scanning plane and thus cannot obtain a full pose estimate in a 3D environment. A novel method is introduced in this paper that employs a 2D LiDAR to improve the full 3D pose estimate acquired from an IMU and a camera, and it is shown that this method can significantly improve the precision of the localization algorithm. The proposed approach is evaluated and justified by simulation and real-world experiments.
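The abstract describes fusing IMU, camera, and LiDAR measurements through an EKF. The sketch below is not the paper's filter; it is an illustrative, generic Kalman predict/update cycle with an assumed constant-velocity state `[x, y, z, vx, vy, vz]` (the full filter would also carry the three orientation angles), IMU acceleration as the prediction input, and a camera position fix as the correction. All noise values and measurement models here are hypothetical.

```python
import numpy as np

class SimpleEKF:
    """Minimal illustrative predict/update filter (state: position + velocity).

    Assumptions (not from the paper): constant-velocity motion model,
    IMU acceleration as a control input, made-up noise magnitudes.
    """

    def __init__(self, dt):
        self.dt = dt
        n = 6
        self.x = np.zeros(n)          # state estimate [x, y, z, vx, vy, vz]
        self.P = np.eye(n)            # state covariance
        self.F = np.eye(n)            # constant-velocity transition model
        self.F[:3, 3:] = dt * np.eye(3)
        self.Q = 0.01 * np.eye(n)     # process noise (assumed value)

    def predict(self, accel):
        # High-rate IMU step: acceleration drives position and velocity.
        u = np.concatenate([0.5 * self.dt ** 2 * accel, self.dt * accel])
        self.x = self.F @ self.x + u
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, H, R):
        # Lower-rate correction from any exteroceptive sensor (camera/LiDAR).
        y = z - H @ self.x                      # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

# Usage: one IMU prediction, then a camera position correction.
ekf = SimpleEKF(dt=0.01)
ekf.predict(np.array([0.0, 0.0, 9.81]))             # IMU acceleration sample
H_cam = np.hstack([np.eye(3), np.zeros((3, 3))])    # camera observes position only
ekf.update(np.array([0.0, 0.0, 0.0005]), H_cam, 0.05 * np.eye(3))
```

A 2D LiDAR correction would plug into the same `update` call with its own measurement model `H` restricted to the quantities observable in its scanning plane, which is how a planar sensor can still tighten a full 3D estimate.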

Copyright © 2017 by ASME

Figures

Fig. 1

Different sensor fusion approaches for pose estimation

Fig. 2

UAV principal axes and orientation geometry. Forward is indicated by the dot.

Fig. 3

UAV-simulated paths: (a) simulated path 1, (b) simulated path 1 true position, (c) simulated path 1 true orientation, (d) simulated path 2, (e) simulated path 2 true position, and (f) simulated path 2 true orientation

Fig. 4

Path 1 errors using only camera (solid plots) and using camera and LiDAR (dotted plots): (a) position error comparison (cm) and (b) attitude error comparison (rad)

Fig. 5

Path 2 errors using only camera (solid plots) and using camera and LiDAR (dotted plots): (a) position error comparison (cm) and (b) attitude error comparison (rad)

Fig. 6

Sensor platform: (a) sensors mounted on the UAV, (b) front view, and (c) top view

Fig. 7

Sensor platform trajectory: (a) sensor platform path, (b) true position, and (c) true orientation
