Precise, robust, and consistent localization is a pertinent subject for many areas of science, such as vision-based control, path planning, and SLAM. To estimate the pose of a platform, sensors such as IMUs, GPS receivers, and cameras are commonly employed. Each of these sensors has strengths and weaknesses. Sensor fusion is a well-known approach that combines the measurements of different sensors to achieve a more accurate or complete pose estimate and to cope with sensor outages. In this paper, a 3D pose estimation algorithm is presented for a UAV in an unknown, GPS-denied environment. A UAV is fully localized by six quantities: three coordinates representing its position and three angles describing its orientation. The proposed algorithm fuses the data from an IMU, a camera, and a 2D LiDAR using an extended Kalman filter (EKF) to achieve better localization accuracy than any individual sensor. Among the employed sensors, the 2D LiDAR has received little attention in the past, mostly because it can only provide pose estimates in its scanning plane and therefore cannot supply a full pose estimate in a 3D environment. In this paper, a novel method is introduced that employs a 2D LiDAR to improve the full 3D pose estimate obtained from an IMU and a camera, and it is shown that doing so significantly improves the precision of the localization algorithm. After reviewing previous work on this topic, we describe the proposed method and then present simulation and real-world experiment results to assess its performance.
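To make the fusion idea concrete, the sketch below shows the generic EKF predict/update cycle that such a multi-sensor scheme relies on. For brevity the models here are linear and low-dimensional; the state, motion model, and noise values are illustrative assumptions, not those of the proposed algorithm.

```python
import numpy as np

# Minimal Kalman predict/update cycle (linear models for brevity).
# State: [position, velocity]. All models and noise values below are
# illustrative assumptions, not the paper's actual 3D pose filter.

def predict(x, P, F, Q):
    """Propagate the state and covariance through the motion model."""
    x = F @ x
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    """Correct the prediction with a sensor measurement z."""
    y = z - H @ x                    # innovation (measurement residual)
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity motion model
Q = 0.01 * np.eye(2)                   # process noise
H = np.array([[1.0, 0.0]])             # this sensor observes position only
R = np.array([[0.05]])                 # measurement noise

x = np.array([0.0, 1.0])               # initial state: at origin, 1 m/s
P = np.eye(2)                          # initial covariance

x, P = predict(x, P, F, Q)
x, P = update(x, P, np.array([0.12]), H, R)  # fuse one position reading
```

In a multi-sensor setting, each sensor contributes its own `update` step with its own observation matrix `H` and noise `R`, which is how an EKF lets a planar sensor such as a 2D LiDAR correct only the state components it can actually observe.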