Research Papers

Whole-Body Pose Estimation in Physical Rider–Bicycle Interactions With a Monocular Camera and Wearable Gyroscopes

Author and Article Information
Xiang Lu

Department of Mechanical and
Aerospace Engineering,
Rutgers University,
Piscataway, NJ 08854
e-mail: lux@mail.nankai.edu.cn

Kaiyan Yu

Department of Mechanical and
Aerospace Engineering,
Rutgers University,
Piscataway, NJ 08854
e-mail: kaiyan.yu@rutgers.edu

Yizhai Zhang

Research Center of Intelligent Robotics,
School of Astronautics,
Northwestern Polytechnical University,
Xi'an, Shaanxi 710072, China
e-mail: zhangyizhai@nwpu.edu.cn

Jingang Yi

Professor
Department of Mechanical and
Aerospace Engineering,
Rutgers University,
Piscataway, NJ 08854
e-mail: jgyi@rutgers.edu

Jingtai Liu

Institute of Robotics
and Automatic Information System,
Nankai University,
Tianjin 300071, China
e-mail: liujingtai@nankai.edu.cn

Qijie Zhao

School of Mechatronic Engineering
and Automation,
Shanghai University,
Shanghai 200072, China
e-mail: zqj@shu.edu.cn

1Present address: Institute of Robotics and Automatic Information Systems, Nankai University, Tianjin 300071, China.

2Corresponding author.

3Author is a visiting professor with the School of Mechatronic Engineering and Automation, Shanghai University, Shanghai 200072, China.

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received April 19, 2016; final manuscript received January 9, 2017; published online May 9, 2017. Assoc. Editor: Sergey Nersesov.

J. Dyn. Sys., Meas., Control 139(7), 071005 (May 09, 2017) (11 pages); Paper No. DS-16-1199; doi: 10.1115/1.4035760. History: Received April 19, 2016; Revised January 09, 2017.

Pose estimation of human–machine interactions such as bicycling plays an important role in understanding and studying human motor skills. In this paper, we report the development of a human whole-body pose estimation scheme with application to rider–bicycle interactions. The pose estimation scheme is built on the fusion of measurements from a monocular camera mounted on the bicycle and a set of small wearable gyroscopes attached to the rider's upper limbs, lower limbs, and trunk. A single feature point is collocated with each wearable gyroscope and also placed on each body segment link where a gyroscope is not attached. An extended Kalman filter (EKF) is designed to fuse the visual-inertial measurements and obtain drift-free whole-body poses. The pose estimation design also incorporates a set of constraints from human anatomy and the physical rider–bicycle interactions. The performance of the estimation design is validated through riding experiments with ten subjects. The results show that the maximum errors of all joint angle estimates obtained by the proposed scheme are within 3 deg. The pose estimation scheme can be further extended and used in other types of physical human–machine interactions.
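To make the fusion idea concrete, the following minimal Python sketch shows a Kalman-filter-style predict/update loop of the kind described in the abstract: gyroscope angular rates drive the prediction (strapdown integration), and camera-derived angle observations correct the accumulated drift. This is not the authors' implementation; the state layout, the linear measurement model, the matrices Q, R, and H, and the sample rates are illustrative assumptions only.

# Minimal sketch of the visual-inertial fusion idea (assumptions, not the paper's EKF design).
# State x: one segment's Euler angles [rad]; gyro rates feed the prediction step,
# and a camera-derived angle observation feeds the correction step.
import numpy as np

class SimpleFusionFilter:
    def __init__(self, x0, P0, Q, R):
        self.x = np.asarray(x0, dtype=float)   # segment Euler angles [rad]
        self.P = np.asarray(P0, dtype=float)   # state covariance
        self.Q = Q                             # process noise (gyro integration drift)
        self.R = R                             # measurement noise (camera features)

    def predict(self, omega, dt):
        """Propagate angles with gyroscope rates (strapdown integration)."""
        self.x = self.x + omega * dt           # simple Euler integration of rates
        F = np.eye(len(self.x))                # Jacobian of this (linear) motion model
        self.P = F @ self.P @ F.T + self.Q * dt

    def update(self, z, H):
        """Correct drift with a camera-derived angle observation z = H x + v."""
        y = z - H @ self.x                     # innovation
        S = H @ self.P @ H.T + self.R          # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)    # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

# Example: fuse 100 Hz gyro predictions with 30 Hz camera corrections for one angle triple.
f = SimpleFusionFilter(x0=np.zeros(3), P0=np.eye(3) * 0.1,
                       Q=np.eye(3) * 1e-4, R=np.eye(3) * 1e-2)
f.predict(omega=np.array([0.02, -0.01, 0.0]), dt=0.01)       # gyroscope step
f.update(z=np.array([0.0002, -0.0001, 0.0]), H=np.eye(3))    # camera step
print(f.x)

In the paper's setting, the actual filter additionally enforces human-anatomy and rider–bicycle contact constraints and handles the full set of body segments; the sketch above only illustrates the drift-correction mechanism of the visual-inertial fusion.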

Copyright © 2017 by ASME
Topics: Bicycles, Design, Errors

Figures

Fig. 1. (a) Riding experiments with the instrumented bicycle and wearable gyroscopes; the circular dot is the vision feature marker, and the square feature is used to obtain the pose benchmark in the outdoor experiments. (b) Schematic of the whole-body frames.

Fig. 2. (a) Upper-limb and trunk feature frames and gyroscope locations and (b) lower-limb gyroscopes with the feature markers and the body frames.

Fig. 3. Pose estimation results compared with strapdown gyroscope integration and the ground truth from the Vicon system for one subject's indoor riding experiment: (a) right wrist Euler angle estimation, (b) right elbow angle γre, (c) trunk Euler angle estimation, (d) right thigh Euler angle estimation, (e) right ankle angle δra and knee angle γrk estimation, and (f) bicycle roll angle φb estimation.

Fig. 4. Pose estimation errors from one subject's experiment, with the Vicon system measurements taken as the ground truth: (a) right wrist Euler angle estimation errors, (b) right elbow angle error Δγre, (c) trunk Euler angle estimation errors, (d) right thigh Euler angle estimation errors, (e) right ankle angle estimation error Δδra and knee angle estimation error Δγrk, and (f) bicycle roll angle estimation error Δφb.

Fig. 5. Mean and standard deviation (SD) of the pose estimation errors of the body segments from the ten subjects' indoor experiments. The mean values of the EKF design are plotted as solid lines and the one-SD values as dashed lines: (a) left-wrist Euler angle estimation statistics, (b) left-thigh Euler angle estimation statistics, (c) right-wrist Euler angle estimation statistics, (d) right-thigh Euler angle estimation statistics, and (e) trunk Euler angle estimation statistics.

Fig. 6. Pose estimation results compared with strapdown gyroscope integration and the camera-based ground truth for one subject's outdoor riding experiment: (a) right wrist Euler angle estimation, (b) trunk Euler angle estimation, (c) right thigh Euler angle estimation, and (d) bicycle roll angle φb estimation.

Fig. 7. Mean and SD of the pose estimation errors of the body segments from the ten subjects' outdoor experiments. The mean values of the EKF design are plotted as solid lines and the one-SD values as dashed lines: (a) left-wrist Euler angle estimation statistics, (b) left-thigh Euler angle estimation statistics, (c) right-wrist Euler angle estimation statistics, (d) right-thigh Euler angle estimation statistics, and (e) trunk Euler angle estimation statistics.

Fig. 8. Estimation errors for the right wrist pose angles under various loss percentages of the camera images.
