Research Papers

Appearance-Based Localization of Mobile Robots Using Group LASSO Regression

Author and Article Information
Huan N. Do

School of Computer Science,
University of Adelaide,
Adelaide 5005, South Australia, Australia
e-mail: huan.do@adelaide.edu.au

Jongeun Choi

School of Mechanical Engineering,
Yonsei University,
Seoul 03722, South Korea
e-mail: jongeunchoi@yonsei.ac.kr

Chae Young Lim

Department of Statistics,
Seoul National University,
Seoul 08826, South Korea
e-mail: limc@stats.snu.ac.kr

Tapabrata Maiti

Professor
Department of Statistics and Probability,
Michigan State University,
East Lansing, MI 48824
e-mail: maiti@stt.msu.edu

1 Corresponding author.

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received June 10, 2017; final manuscript received January 20, 2018; published online April 30, 2018. Assoc. Editor: Hashem Ashrafiuon.

J. Dyn. Sys., Meas., Control 140(9), 091016 (Apr 30, 2018) (9 pages) Paper No: DS-17-1297; doi: 10.1115/1.4039286 History: Received June 10, 2017; Revised January 20, 2018

Appearance-based localization is a robot self-navigation technique that integrates visual appearance and kinematic information. To analyze the visual appearance, we build a regression model that uses visual features extracted from raw images as predictors to estimate the robot's location in two-dimensional (2D) coordinates. Given the training data, our first problem is to find the optimal subset of the features that maximizes the localization performance. To achieve appearance-based localization of a mobile robot, we propose an integrated localization model that consists of two main components: group least absolute shrinkage and selection operator (LASSO) regression and sequential Bayesian filtering. We project the output of the LASSO regression onto the kinematics of the mobile robot via sequential Bayesian filtering. In particular, we examine two candidates for the Bayesian estimator: the extended Kalman filter (EKF) and the particle filter (PF). Our method is implemented on both an indoor mobile robot and an outdoor vehicle, each equipped with an omnidirectional camera. The results validate the effectiveness of the proposed approach.
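
As a rough illustration of this two-stage pipeline (not the authors' implementation), the sketch below uses scikit-learn's MultiTaskLasso, whose penalty zeroes out each feature's coefficients for the x and y outputs jointly, matching the group structure described here; the data, dimensions, and penalty value are placeholders.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

# Placeholder data standing in for the real inputs:
# X: n x p matrix of visual features extracted from omnidirectional images
# Y: n x 2 matrix of ground-truth robot positions (x, y)
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 100))
Y = X[:, :5] @ rng.standard_normal((5, 2))

# MultiTaskLasso applies an l1/l2 penalty that keeps or drops a feature's
# coefficients for both coordinates together: the grouping used in the paper.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.linalg.norm(model.coef_, axis=0) > 1e-8)
print(f"{selected.size} of {X.shape[1]} feature groups retained")

# The raw regression output would then serve as the 2D position
# measurement fed to the EKF or PF.
z = model.predict(X[:1])[0]
```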

Copyright © 2018 by ASME

Figures

Fig. 1

(a) and (b) show the wrapped and unwrapped omnidirectional images, respectively, and (c) shows the FFT magnitude plot in three dimensions.
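
On the feature side, one useful property is that a horizontal shift of the unwrapped panorama (a robot rotation) changes only the phase of each row's DFT, so row-wise FFT magnitudes act as rotation-invariant signatures. A minimal numpy sketch, with the image size and number of retained coefficients chosen arbitrarily:

```python
import numpy as np

def fft_magnitude_features(unwrapped, n_coeffs=15):
    """Row-wise FFT magnitudes of an unwrapped panoramic image.

    A horizontal shift of the unwrapped image (robot rotation) only
    changes the phase of each row's DFT, so the magnitudes are
    rotation-invariant signatures of the position.
    """
    spectrum = np.fft.rfft(unwrapped.astype(float), axis=1)
    return np.abs(spectrum)[:, :n_coeffs].ravel()

# Example on a synthetic 64 x 256 grayscale panorama
image = np.random.default_rng(1).random((64, 256))
features = fft_magnitude_features(image)
```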

Fig. 2

(a) Shrinkage of the entries of the estimate vector b̂ with respect to the penalty λ. (b) The optimal value of λ is chosen at the minimum point of the validation error curve.
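
A minimal sketch of this validation-based choice of λ, assuming a held-out validation set and again using scikit-learn's MultiTaskLasso as a stand-in group LASSO solver:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

def select_lambda(X_tr, Y_tr, X_val, Y_val, lambdas):
    """Return the penalty with the smallest validation RMSE (cf. Fig. 2(b))."""
    errors = []
    for lam in lambdas:
        model = MultiTaskLasso(alpha=lam).fit(X_tr, Y_tr)
        resid = Y_val - model.predict(X_val)
        errors.append(np.sqrt(np.mean(np.sum(resid ** 2, axis=1))))
    best = int(np.argmin(errors))
    return lambdas[best], errors

# A log-spaced grid is a common choice:
# lam_opt, curve = select_lambda(X_tr, Y_tr, X_val, Y_val, np.logspace(-3, 1, 30))
```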

Fig. 3

(a) Indoor experiment environment, with a zoomed-in picture of the mobile robot in the upper left corner. (b) Outdoor experiment site on the campus of Michigan State University, shown on a Google map.

Fig. 4

Plot of P[1,1] (dashed line) and P[2,2] (solid line) for iterations 20 to 80 for group LASSO-based localization with the EKF
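
For reference, here is a minimal linear Kalman-filter sketch that treats the group LASSO output z_k as a direct 2D position measurement and logs the covariance diagonal plotted above. The random-walk motion model and the noise levels Q and R are assumptions; the paper's EKF uses the actual robot kinematics.

```python
import numpy as np

rng = np.random.default_rng(2)
measurements = rng.random((80, 2))  # placeholder group-LASSO outputs z_k

Q = 0.05 * np.eye(2)  # process noise covariance (assumed)
R = 0.50 * np.eye(2)  # measurement noise of the regression (assumed)

x = np.zeros(2)       # state estimate (x, y)
P = np.eye(2)         # estimate covariance
P_diag = []

for z in measurements:
    P = P + Q                           # predict (random walk, so F = I)
    K = P @ np.linalg.inv(P + R)        # Kalman gain (H = I)
    x = x + K @ (z - x)                 # update with the LASSO measurement
    P = (np.eye(2) - K) @ P
    P_diag.append(P.diagonal().copy())  # cf. P[1,1] and P[2,2] in Fig. 4
```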

Fig. 5

Indoor experiment: the true trajectory (solid line), group LASSO (squares), group LASSO + PF (dotted line), and group LASSO + EKF (dash-dotted line) predictions

Fig. 6

(a) Data acquisition circuit, (b) panoramic camera, and (c) vehicle equipped with the camera on top

Fig. 7

The training (dashed line) and testing (solid line) paths are plotted in meters

Fig. 8

Outdoor experiment: the true trajectory (solid line), group LASSO (squares), group LASSO + PF (dotted line), and group LASSO + EKF (dash-dotted line) predictions are plotted in meters

Fig. 9

Outdoor experiment: (a) the evolution of the entries of the estimate matrix B versus the penalty λ and (b) all 200 entries of the optimal matrix B plotted as bars

Fig. 10

A snapshot of the particle filter at t = 50: the true trajectory (solid line), group LASSO (squares), and group LASSO + PF (dashed line) predictions are plotted in meters. Each particle is plotted in grayscale, with intensity corresponding to its probability weight; the weights of all particles sum to 1.
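
A minimal bootstrap particle filter sketch consistent with this picture (not the authors' exact filter; the motion input u and the noise scales are assumptions). The weights are renormalized to sum to 1 after each measurement update, as in the figure.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 500
particles = rng.uniform(-10.0, 10.0, size=(N, 2))  # candidate (x, y) positions
weights = np.full(N, 1.0 / N)                      # sums to 1, as in Fig. 10

def pf_step(particles, weights, u, z, sigma_q=0.3, sigma_r=1.0):
    """One predict/update/resample cycle of a basic bootstrap filter."""
    # Predict: propagate particles through the motion model with process noise
    particles = particles + u + rng.normal(0.0, sigma_q, particles.shape)
    # Update: reweight by the Gaussian likelihood of the LASSO measurement z
    d2 = np.sum((particles - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / sigma_r ** 2)
    weights = weights / weights.sum()   # renormalize so the weights sum to 1
    # Resample when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < N / 2:
        idx = rng.choice(N, size=N, p=weights)
        particles, weights = particles[idx], np.full(N, 1.0 / N)
    return particles, weights, weights @ particles  # weighted-mean estimate
```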

Fig. 11

Visualization of RMSEs of two experiments from Table 1
