Research Papers

A Robust Visual Tracking Method for Unmanned Mobile Systems

Author and Article Information
Xiongfeng Yi

Bio-Inspired Robotics and Controls Lab,
Department of Mechanical Engineering,
University of Houston,
Houston, TX 77204
e-mail: xyi6@uh.edu

Zheng Chen

Bio-Inspired Robotics and Controls Lab,
Department of Mechanical Engineering,
University of Houston,
Houston, TX 77204
e-mail: zchen43@central.uh.edu

Corresponding author.

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received May 5, 2018; final manuscript received January 8, 2019; published online March 25, 2019. Assoc. Editor: Scott Fish.

J. Dyn. Sys., Meas., Control 141(7), 071005 (Mar 25, 2019) (8 pages). Paper No: DS-18-1221; doi: 10.1115/1.4043119. History: Received May 05, 2018; Revised January 08, 2019.

Abstract

This paper introduces a robust method for visually tracking objects in complex environments with blocking obstacles and light-reflection noise. The method uses a transfer matrix to project image pixels back to real-world coordinates. During image processing, a color and shape test recognizes the object, which is then represented by a vector containing its orientation and body length. If the object is partially blocked by obstacles or by reflection from the water surface, this vector is used to predict the object's position. During real-time tracking, a Kalman filter optimizes the result. To validate the method, the tracking algorithm was tested on a submarine and a robotic fish moving on the water surface of a tank, above which three pieces of blurred glass acted as blocking obstacles between the camera and the object. The method also avoids interference from reflections off the side glass and from fluctuations of the water surface.
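The pixel-to-world projection can be illustrated with a short sketch. It assumes the transfer matrix is a planar homography estimated from the four reference points of Fig. 1; the coordinate values below are hypothetical placeholders, not the paper's calibration:

```python
# Sketch of the pixel-to-world "transfer matrix" as a planar homography
# computed from four reference points (all coordinate values hypothetical).
import numpy as np
import cv2

# Pixel coordinates of the four reference points in the image.
pixel_pts = np.float32([[102, 88], [538, 91], [531, 402], [99, 396]])
# The same points in real-world tank coordinates (e.g., centimeters).
world_pts = np.float32([[0, 0], [120, 0], [120, 80], [0, 80]])

# 3x3 matrix mapping image pixels to world coordinates.
H = cv2.getPerspectiveTransform(pixel_pts, world_pts)

def pixel_to_world(u, v):
    """Project one pixel (u, v) back to real-world coordinates."""
    p = np.float32([[[u, v]]])       # shape (1, 1, 2), as OpenCV expects
    return cv2.perspectiveTransform(p, H)[0, 0]

print(pixel_to_world(320, 240))      # world (x, y) of the image center
```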

Copyright © 2019 by ASME

Figures

Fig. 1

The tracking domain with four reference points and the initial point for the first searching window to locate the object

Fig. 2

The process of locating the object. The first image is the original frame, the second shows the edges of the object, the third shows the edge points in the region R, and the last shows the result of the minimum-volume fit.
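The locating steps of Fig. 2 might look as follows in OpenCV; this is a hedged sketch in which the HSV color bounds are hypothetical and a least-squares ellipse fit stands in for the minimum-volume fit:

```python
# Sketch of Fig. 2's pipeline: color test, edge detection, ellipse fit.
import cv2

def locate_object(frame_bgr):
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Color test: keep pixels in a hypothetical blue range.
    mask = cv2.inRange(hsv, (100, 80, 40), (130, 255, 255))
    edges = cv2.Canny(mask, 50, 150)     # edges of the object
    pts = cv2.findNonZero(edges)         # edge points in the region R
    if pts is None or len(pts) < 5:
        return None                      # object not found in this window
    # Least-squares ellipse fit standing in for the minimum-volume fit;
    # returns center, axis lengths, and orientation angle.
    (cx, cy), (w, h), angle = cv2.fitEllipse(pts)
    return (cx, cy), (w, h), angle
```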

Fig. 3

The location given by the vector prediction method. The red line is the vector representing the object, and the black cross marks the predicted location of the object.
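One way to realize the vector prediction: if the vector's orientation and the body length are known from previous frames, a visible part of the object fixes the rest. A minimal sketch, assuming the head stays visible; all names here are hypothetical:

```python
import numpy as np

def predict_center(head_xy, theta, body_length):
    """Predict the body center from the visible head, the last known
    orientation theta (radians), and the known body length."""
    direction = np.array([np.cos(theta), np.sin(theta)])
    return np.asarray(head_xy, float) - 0.5 * body_length * direction

print(predict_center((52.0, 31.0), np.deg2rad(30.0), 18.0))
```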

Fig. 4

The decision tree in the tracking algorithm
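Read together with the abstract and the four curves of Fig. 7, the decision tree plausibly chooses among normal tracking, vector prediction, and constant-velocity prediction before the Kalman update. A hedged sketch, where locate_object and predict_center are the earlier sketches and the state fields are hypothetical (state.kf is a Kalman filter like the one sketched after Fig. 7):

```python
def track_step(frame, state):
    result = locate_object(frame)                  # color and shape test
    if result is not None and state.fully_visible(result):
        measurement = result[0]                    # normal tracking
    elif result is not None:
        # Partially blocked: fall back to the vector prediction.
        measurement = predict_center(result[0], state.theta,
                                     state.body_length)
    else:
        # Fully blocked: propagate the last position at constant velocity.
        measurement = state.last_position + state.velocity * state.dt
    return state.kf.step(measurement)              # Kalman filter smoothing
```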

Fig. 5

The setup of the experiment. The gray squares are the blurred glass blocking the view of the camera. The blue square is the water.

Fig. 6

The tracking result as the blue submarine passes through the blocked area. The black cross represents the location of the object.

Fig. 7

The path of the submarine in the simulation. The black line is the normal tracking result, the green line is the result given by the vector prediction, the red line is the result of location prediction with constant velocity, and the blue line is the result of the Kalman filter.
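A standard constant-velocity Kalman filter over the tracked (x, y) positions would produce the blue curve; this is a textbook formulation standing in for the paper's filter, with hypothetical noise levels q and r:

```python
import numpy as np

class CVKalman:
    """Constant-velocity Kalman filter on state [x, y, vx, vy]."""
    def __init__(self, dt=1 / 30, q=1e-2, r=1.0):
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q, self.R = q * np.eye(4), r * np.eye(2)
        self.x, self.P = np.zeros(4), np.eye(4)

    def step(self, z):
        # Predict with the constant-velocity model.
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # Correct with the measured (or predicted) position z = (x, y).
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (np.asarray(z, float) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                          # filtered position
```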

Fig. 8

The velocity of the submarine in the x direction (left) and in the y direction (right), based on the tracking result.
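The velocity traces can be recovered by differencing the tracked positions over the frame period; a minimal sketch, with hypothetical positions and frame rate:

```python
import numpy as np

fps = 30.0                                  # hypothetical camera frame rate
xs = np.array([0.0, 0.5, 1.1, 1.8, 2.6])    # tracked x positions (e.g., cm)
ys = np.array([0.0, 0.2, 0.3, 0.5, 0.6])    # tracked y positions
vx = np.gradient(xs) * fps                  # velocity in the x direction
vy = np.gradient(ys) * fps                  # velocity in the y direction
```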

Fig. 9

The tracking result of the black robotic fish passing through the blocked area. The red cross represents the location of the fish.

Fig. 10

The path of the robotic fish in the simulation. The black line is the normal tracking result, the green line is the result given by the vector prediction, the red line is the result of location prediction with constant velocity, and the blue line is the result of the Kalman filter.

Fig. 11

The velocity of the robotic fish in the x direction (left) and in the y direction (right), based on the tracking result.
