Research Papers

Fast and Precise Glass Handling Using Visual Servo With Unscented Kalman Filter Dual Estimation

Author and Article Information
Xiaowen Yu

Mechanical System Control Laboratory,
Department of Mechanical Engineering,
University of California,
Berkeley, CA 94720
e-mail: aliceyu@berkeley.edu

Thomas Baker

Department of Mechanical Engineering,
Technische Universität München,
Munich D-80333, Germany
e-mail: tom.baker@tum.de

Yu Zhao

Mechanical System Control Laboratory,
Department of Mechanical Engineering,
University of California,
Berkeley, CA 94720
e-mail: yzhao334@berkeley.edu

Masayoshi Tomizuka

Professor
Department of Mechanical Engineering,
University of California,
Berkeley, CA 94720
e-mail: tomizuka@berkeley.edu

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received March 9, 2017; final manuscript received August 12, 2017; published online November 10, 2017. Assoc. Editor: Heikki Handroos.

J. Dyn. Sys., Meas., Control 140(4), 041008 (Nov 10, 2017), 10 pages. Paper No: DS-17-1145; doi: 10.1115/1.4037734. History: Received March 09, 2017; Revised August 12, 2017

In the manufacturing of protective glass for cell phones, placing glass pieces into the slots of the grinder requires submillimeter accuracy that, to date, only human workers have been able to achieve, creating a bottleneck in the production line. To address this issue, an industrial robot equipped with vision sensors is proposed to support human workers. High placing performance is achieved by a two-step approach. In the first step, an eye-to-hand camera detects the glass piece and the slot with a robust vision algorithm, allowing the robot to bring the glass piece close to the slot with coarse precision. In the second step, a closed-loop visual servo controller with dual eye-in-hand cameras guides the glass piece into the slot. However, vision sensors suffer from low frame rates and slow image processing, which slows down placement, and the placing performance is further limited by uncertainty in the system parameters. To compensate for these limitations, a dual-rate unscented Kalman filter (UKF) with dual estimation is adopted for sensor data filtering and online parameter identification, without requiring any linear parameterization of the model. Experimental results are presented to confirm the effectiveness of the proposed approach.
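For a concrete picture of the dual-estimation scheme mentioned above, the following is a minimal illustrative sketch in Python using the open-source filterpy library: two UKFs run side by side, one estimating the state and one identifying an unknown model parameter online, in the style of Wan and van der Merwe's dual-estimation framework cited by the paper. The toy scalar plant, noise covariances, and filter tuning below are assumptions for illustration only; they are not the paper's visual-servo model or its dual-rate implementation.

```python
# Minimal sketch of UKF dual estimation: a state filter and a
# parameter filter run side by side on the same measurements.
# The scalar plant x[k] = a*x[k-1] + u[k], y = x + noise is a toy
# stand-in (assumption), NOT the paper's visual-servo dynamics.
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

dt = 0.1            # sample time of the measurement channel
a_true = 0.95       # "unknown" plant parameter to identify online
rng = np.random.default_rng(0)

a_est = [0.5]       # shared holder for the current parameter estimate
x_prev = [0.0]      # previous state estimate, used by the param filter
u_cur = [0.0]       # input applied at the current step

# --- state filter: propagates x with the current parameter estimate --
def fx_state(x, dt, u=0.0):
    return np.array([a_est[0] * x[0] + u])

def hx_state(x):
    return np.array([x[0]])     # the state is measured directly

pts_x = MerweScaledSigmaPoints(n=1, alpha=0.1, beta=2.0, kappa=0.0)
ukf_x = UnscentedKalmanFilter(dim_x=1, dim_z=1, dt=dt,
                              fx=fx_state, hx=hx_state, points=pts_x)
ukf_x.Q *= 1e-4
ukf_x.R *= 0.05**2

# --- parameter filter: 'a' follows a random walk; its "measurement"
# is the plant output predicted from the previous state estimate ------
def fx_param(w, dt):
    return np.copy(w)           # random-walk parameter model

def hx_param(w):
    return np.array([w[0] * x_prev[0] + u_cur[0]])

pts_w = MerweScaledSigmaPoints(n=1, alpha=0.1, beta=2.0, kappa=0.0)
ukf_w = UnscentedKalmanFilter(dim_x=1, dim_z=1, dt=dt,
                              fx=fx_param, hx=hx_param, points=pts_w)
ukf_w.x = np.array([a_est[0]])
ukf_w.P *= 0.5
ukf_w.Q *= 1e-6
ukf_w.R *= 0.05**2

# --- run both filters on simulated noisy measurements ----------------
x = 1.0
for k in range(500):
    u = 0.2 * np.sin(0.05 * k)            # persistent excitation
    x = a_true * x + u                    # true plant
    y = x + rng.normal(scale=0.05)        # noisy (camera-rate) sample

    x_prev[0] = ukf_x.x[0]                # freeze values for hx_param
    u_cur[0] = u
    ukf_w.predict()
    ukf_w.update(np.array([y]))
    a_est[0] = ukf_w.x[0]                 # refreshed parameter estimate

    ukf_x.predict(u=u)                    # uses the new a_est via fx_state
    ukf_x.update(np.array([y]))

print(f"identified a = {a_est[0]:.3f} (true value {a_true})")
```

In the paper's setting, the state filter would additionally run open-loop predictions at the fast robot-control rate and apply measurement updates only when a new, delayed image arrives (the dual-rate aspect); the loop above collapses both rates into one purely for brevity.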

Copyright © 2018 by ASME


Figures

Fig. 3: Eye-to-hand camera and image acquired: (a) eye-to-hand camera setup and (b) image acquired from camera (without checkerboard)
Fig. 4: Clustering of the rectified eye-to-hand image
Fig. 5: Fitting with discretized rectangle
Fig. 6: Identifying glass and slots from regression results (bad fit: middle left; good fit: middle right; black lines indicate distance)
Fig. 7: Glass piece picked up off-center
Fig. 8: Glass frame and camera frame
Fig. 9: Detection of the glass and slot separately
Fig. 10: Changed lighting conditions of the workspace
Fig. 11: Detection of the glass edges
Fig. 12: Detection of the slot edges
Fig. 13: Glass pickup accuracy of the first step
Fig. 14: Glass placing accuracy of the first step
Fig. 15: Visual servo frame
Fig. 16: Line detection (back camera) and point approximation (slot's top edge)
Fig. 17: Dual-rate UKF for low sampling rate and large latency compensation
Fig. 18: UKF with dual-estimation filtering
Fig. 19: Parameter adaptation with UKF dual estimation
Fig. 20: Error convergence comparison using raw data, UKF, and UKF with dual estimation
