Technical Brief

Accurate, Robust, and Real-Time Pose Estimation of Finger

[+] Author and Article Information
Youngmok Yun

Department of Mechanical Engineering,
The University of Texas at Austin,
Austin, TX 78712
e-mail: yunyoungmok@utexas.edu

Priyanshu Agarwal

Department of Mechanical Engineering,
The University of Texas at Austin,
Austin, TX 78712
e-mail: mail2priyanshu@utexas.edu

Ashish D. Deshpande

Assistant Professor
Department of Mechanical Engineering,
The University of Texas at Austin,
Austin, TX 78712
e-mail: ashish@austin.utexas.edu

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received December 31, 2013; final manuscript received July 30, 2014; published online October 21, 2014. Assoc. Editor: Jongeun Choi.

J. Dyn. Sys., Meas., Control 137(3), 034505 (Oct 21, 2014) (6 pages) Paper No: DS-13-1536; doi: 10.1115/1.4028162 History: Received December 31, 2013; Revised July 30, 2014

Many robotic applications need an accurate, robust, and fast estimate of finger pose. We present a novel finger pose estimation method that uses a motion capture system. The method combines a system identification stage and a state tracking stage in a unified framework. The system identification stage develops an accurate model of the finger, and the state tracking stage tracks the finger pose with the extended Kalman filter (EKF) algorithm based on the model obtained in the identification stage. The algorithm is validated through simulation and through experiments with a human subject and a robotic finger. The experimental results show that the method robustly estimates the finger pose at a high frequency (greater than 1 kHz) in the presence of measurement noise, marker occlusion, and fast movement.
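The predict/correct structure of the tracking stage can be sketched in outline. The following is a minimal, hypothetical EKF sketch, not the paper's implementation: the state vector `q` holds four joint angles, the measurement `z` stacks marker coordinates, a toy forward-kinematics function `fk` stands in for the identified finger model, and occlusion is handled by updating with the visible measurement rows only. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

def fk(q):
    # Toy "forward kinematics": maps 4 joint angles to 6 measurement values.
    # A real model would come from the system identification stage.
    return np.concatenate([np.sin(q + 0.1), np.cos(q[:2])])

def numerical_jacobian(f, q, eps=1e-6):
    # Central-difference Jacobian of f at q.
    m = f(q).size
    J = np.zeros((m, q.size))
    for i in range(q.size):
        dq = np.zeros(q.size)
        dq[i] = eps
        J[:, i] = (f(q + dq) - f(q - dq)) / (2 * eps)
    return J

def ekf_step(q, P, z, Q, R, visible):
    # Predict: constant-pose motion model; process noise Q inflates P.
    q_pred = q
    P_pred = P + Q
    # Update: use only the visible (non-occluded) measurement rows.
    H = numerical_jacobian(fk, q_pred)[visible]
    innovation = z[visible] - fk(q_pred)[visible]
    S = H @ P_pred @ H.T + R[np.ix_(visible, visible)]
    K = P_pred @ H.T @ np.linalg.inv(S)
    q_new = q_pred + K @ innovation
    P_new = (np.eye(q.size) - K @ H) @ P_pred
    return q_new, P_new
```

When markers are occluded, the update simply skips those rows, so the estimate coasts on the predictor and the covariance grows, which matches the qualitative behavior the paper reports.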

Copyright © 2015 by ASME


Wege, A., and Hommel, G., 2005, “Development and Control of a Hand Exoskeleton for Rehabilitation of Hand Injuries,” IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, Aug. 2–6, pp. 3046–3051. [CrossRef]
Worsnopp, T., Peshkin, M., Colgate, J., and Kamper, D., 2007, “An Actuated Finger Exoskeleton for Hand Rehabilitation Following Stroke,” IEEE International Conference on Rehabilitation Robotics, Noordwijk, The Netherlands, June 13–15, pp. 896–901. [CrossRef]
Lelieveld, M. J., and Maeno, T., 2006, “Design and Development of a 4 DOF Portable Haptic Interface With Multi-point Passive Force Feedback for the Index Finger,” IEEE International Conference on Robotics and Automation, Orlando, FL, May 15–19, pp. 3134–3139. [CrossRef]
Koike, H., Sato, Y., and Kobayashi, Y., 2001, “Integrating Paper and Digital Information on Enhanced Desk: A Method for Realtime Finger Tracking on an Augmented Desk System,” ACM Trans. Comput. Hum. Interact., 8(4), pp. 307–322. [CrossRef]
Moeslund, T. B., and Granum, E., 2001, “A Survey of Computer Vision-Based Human Motion Capture,” Comput. Vision Image Understanding, 81(3), pp. 231–268. [CrossRef]
Erol, A., Bebis, G., Nicolescu, M., Boyle, R. D., and Twombly, X., 2007, “Vision-Based Hand Pose Estimation: A Review,” Comput. Vision Image Understanding, 108(1), pp. 52–73. [CrossRef]
Landsmeer, J., 1963, “The Coordination of Finger-Joint Motions,” J. Bone Joint Surg., 45(8), pp. 1654–1662. Available at: http://jbjs.org/content/45/8/1654
Ryu, J. H., Miyata, N., Kouchi, M., Mochimaru, M., and Lee, K. H., 2006, “Analysis of Skin Movement With Respect to Flexional Bone Motion Using MR Images of a Hand,” J. Biomech., 39(5), pp. 844–852. [CrossRef] [PubMed]
Kuo, P., and Deshpande, A. D., 2012, “Muscle–Tendon Units Provide Limited Contributions to the Passive Stiffness of the Index Finger Metacarpophalangeal Joint,” J. Biomech., 45(15), pp. 2531–2538. [CrossRef] [PubMed]
Braido, P., and Zhang, X., 2004, “Quantitative Analysis of Finger Motion Coordination in Hand Manipulative and Gestic Acts,” Hum. Mov. Sci., 22(6), pp. 661–678. [CrossRef] [PubMed]
Baker, N. A., Cham, R., Cidboy, E. H., Cook, J., and Redfern, M. S., 2007, “Kinematics of the Fingers and Hands During Computer Keyboard Use,” Clin. Biomech., 22(1), pp. 34–43. [CrossRef]
Zhang, X., Lee, S.-W., and Braido, P., 2003, “Determining Finger Segmental Centers of Rotation in Flexion–Extension Based on Surface Marker Measurement,” J. Biomech., 36(8), pp. 1097–1102. [CrossRef] [PubMed]
Maycock, J., Steffen, J., Haschke, R., and Ritter, H., 2011, “Robust Tracking of Human Hand Postures for Robot Teaching,” IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, Sept. 25–30, pp. 2947–2952. [CrossRef]
Fu, Q., and Santello, M., 2010, “Tracking Whole Hand Kinematics Using Extended Kalman Filter,” Annual International Conference of the IEEE on Engineering in Medicine and Biology Society (EMBC), Buenos Aires, Argentina, Aug. 31–Sept. 4, pp. 4606–4609.
Todorov, E., 2007, “Probabilistic Inference of Multijoint Movements, Skeletal Parameters and Marker Attachments From Diverse Motion Capture Data,” IEEE Trans. Biomed. Eng., 54(11), pp. 1927–1939. [CrossRef] [PubMed]
Yun, Y., Agarwal, P., and Deshpande, A. D., 2013, “Accurate, Robust, and Real-Time Estimation of Finger Pose With a Motion Capture System,” IEEE/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan, Nov. 3–7, pp. 1626–1631.
An, K.-N., Chao, E. Y., Cooney, W., and Linscheid, R. L., 1979, “Normative Model of Human Hand for Biomechanical Analysis,” J. Biomech., 12(10), pp. 775–788. [CrossRef] [PubMed]
Thrun, S., Burgard, W., and Fox, D., 2005, Probabilistic Robotics, MIT Press, Cambridge, MA.
Yun, Y., Park, B., and Chung, W. K., 2008, “Odometry Calibration Using Home Positioning Function for Mobile Robot,” IEEE International Conference on Robotics and Automation, Pasadena, CA, May 19–23, pp. 2116–2121. [CrossRef]
Flash, T., and Hogan, N., 1985, “The Coordination of Arm Movements: An Experimentally Confirmed Mathematical Model,” J. Neurosci., 5(7), pp. 1688–1703. Available at: http://www.jneurosci.org/content/5/7/1688 [PubMed]
Todorov, E., and Jordan, M. I., 1998, “Smoothness Maximization Along a Predefined Path Accurately Predicts the Speed Profiles of Complex Arm Movements,” J. Neurophysiol., 80(2), pp. 696–714. Available at: http://jn.physiology.org/content/80/2/696 [PubMed]
Uno, Y., Kawato, M., and Suzuki, R., 1989, “Formation and Control of Optimal Trajectory in Human Multijoint Arm Movement,” Biol. Cybern., 61(2), pp. 89–101. [CrossRef] [PubMed]
Gill, P. E., Murray, W., and Wright, M. H., 1981, Practical Optimization, Academic, New York.
Yun, Y., Agarwal, P., and Deshpande, A. D., 2013, “Experiment Result Video for the Finger Pose Estimation,” http://youtube/TrXl82i_txA
Deshpande, A., Xu, Z., Weghe, M., Brown, B., Ko, J., Chang, L., Wilkinson, D., Bidic, S., and Matsuoka, Y., 2013, “Mechanisms of the Anatomically Correct Testbed Hand,” IEEE/ASME Trans. Mechatronics, 18(1), pp. 238–250. [CrossRef]


Fig. 1

Finger configuration for kinematic modeling. The MCP joint is modeled as a saddle joint, and the PIP and DIP joints as hinge joints. A total of seven markers are attached to the metacarpal and the phalanges. Each phalange and the metacarpal has its own local coordinate frame.

Fig. 2

Results of system identification in the simulation. The height of each bar indicates the average error of 100 optimized model parameter sets compared with the ground-truth values.

Fig. 3

Results of state estimation in the simulation. A virtual finger moved along a predetermined arbitrary trajectory, and a virtual motion capture system provided marker positions corrupted with Gaussian noise (standard deviation 3 mm). The estimation was performed twice, once with 10% marker occlusion and once with 50%. (a)–(d) show the tracking results for the four joint motions, (e) shows the tracking error averaged over the four joint poses, and (f) shows the magnitude of the covariance matrix via its second norm.
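The simulated measurement model described in this caption, Gaussian noise plus random marker occlusion, can be sketched as follows. The function name and interface are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)  # seeded for reproducibility

def simulate_measurement(markers_mm, occlusion_rate, sigma_mm=3.0):
    # Corrupt each 3D marker position with zero-mean Gaussian noise
    # (sigma = 3 mm, as in the caption) and randomly occlude a fraction
    # of the markers. Occluded markers are flagged rather than used.
    noisy = markers_mm + rng.normal(0.0, sigma_mm, markers_mm.shape)
    visible = rng.random(len(markers_mm)) >= occlusion_rate
    return noisy, visible
```

A tracker consuming this output would skip the rows where `visible` is false, relying on the motion-model predictor for those markers.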

Fig. 4

We attached seven active markers to a subject's index finger. The motion capture system (PhaseSpace, Inc., Ref. [5]) provides the three-dimensional marker positions at 480 Hz. For visual validation, a program renders the estimated finger pose in real time behind the finger.

Fig. 5

For experimental validation, we used a robotic hand, the ACT hand [25]. The system identification and tracking stages were performed in a manner similar to that used for the human finger.

Fig. 6

Results of system identification for a human finger. The height of each bar indicates the standard deviation of the optimized kinematic model parameters.

Fig. 7

Estimation results from the experiments with a human subject. The first row shows the experimental environment: a subject moved his finger freely, and an OpenGL program displayed the estimated finger pose behind the subject's hand in real time. The second row compares the estimated marker positions with the actual marker positions to show the accuracy of the estimation. The third row shows the uncertainty of the estimated finger joint angles via their standard deviations. (a) At the start, the initial joint angles are zero and their covariance values are very large. (b) Within 0.1 s, all estimated joint angles converged, and the comparison with the actual motion and actual marker positions indicates that the algorithm tracks accurately. (c) A researcher intentionally occluded some of the markers to test the robustness of the algorithm. The occlusion increased the uncertainty of the estimation, but the algorithm still tracked the actual motion through the predictor. (d) A disturbance introduced by the researcher produced unrealistic measurements (noise), but the stochastic Kalman gain selectively corrected the state. (e) Finally, the subject moved his finger at high speed. Although the uncertainty of the estimation increased slightly, the method tracked the finger pose (see the attached video in Ref. [24]).

Fig. 8

Results of finger pose estimation. The EKF algorithm tracked the system state. The gap between the actual and reconstructed marker positions, ‖z − ẑ‖, indirectly illustrates the performance of the finger pose estimation algorithm, including both the system identification stage and the tracking stage. (a) Human finger and (b) ACT hand's finger.
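The reconstruction-error metric in this caption compares measured marker positions against those reconstructed from the estimated pose through the identified model. A small helper in its assumed form (not the authors' code):

```python
import numpy as np

def reconstruction_error(z, z_hat):
    # Euclidean norm of the gap between the measured marker vector z and
    # the marker vector z_hat reconstructed from the estimated pose.
    return float(np.linalg.norm(np.asarray(z) - np.asarray(z_hat)))
```

Evaluated per frame, this scalar summarizes the combined accuracy of the identified model and the tracker without requiring ground-truth joint angles.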



