Research Papers

Measurement and Modeling of the Effect of Sensory Conflicts on Driver Steering Control

Author and Article Information
Christopher J. Nash

Department of Engineering,
University of Cambridge,
Cambridge CB2 1PZ, UK

David J. Cole

Department of Engineering,
University of Cambridge,
Cambridge CB2 1PZ, UK
e-mail: djc13@cam.ac.uk

Contributed by the Dynamic Systems Division of ASME for publication in the JOURNAL OF DYNAMIC SYSTEMS, MEASUREMENT, AND CONTROL. Manuscript received August 17, 2017; final manuscript received February 4, 2019; published online March 13, 2019. Assoc. Editor: Tesheng Hsiao.

J. Dyn. Sys., Meas., Control 141(6), 061012 (Mar 13, 2019) (11 pages) Paper No: DS-17-1412; doi: 10.1115/1.4042876 History: Received August 17, 2017; Revised February 04, 2019

In previous work, a new model of driver steering control incorporating sensory dynamics was derived and used to explain the performance of drivers in a simulator with full-scale motion feedback. This paper describes further experiments investigating how drivers steer with conflicts between their visual and vestibular measurements, caused by scaling or filtering the physical motion of the simulator relative to the virtual environment. The predictions of several variations of the new driver model are compared with the measurements to understand how drivers perceive sensory conflicts. Drivers are found to adapt well in general, unless the conflict is large, in which case they ignore the physical motion and rely on visual measurements. Drivers make greater use of physical motion that they rate as more helpful, achieving better tracking performance. Sensory measurement noise is shown to be signal-dependent, allowing a single set of parameters to be identified that fits the results of all the trials. The model fits measured linear steering behavior with an average "variance accounted for" (VAF) of 86%.
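The VAF figure quoted in the abstract is a standard goodness-of-fit metric: the percentage of the measured signal's variance explained by the model output. A minimal sketch of the usual definition (not the authors' code; the function name and example data are illustrative):

```python
import numpy as np

def variance_accounted_for(measured, modelled):
    """VAF (%): share of the measured signal's variance that the
    model output explains; 100% means a perfect fit."""
    measured = np.asarray(measured, dtype=float)
    modelled = np.asarray(modelled, dtype=float)
    residual = measured - modelled
    return 100.0 * (1.0 - np.var(residual) / np.var(measured))

# Illustrative steering-angle samples (assumed data).
y = np.array([0.0, 1.0, 0.5, -0.5, -1.0])
print(variance_accounted_for(y, y))        # 100.0 (perfect fit)
print(variance_accounted_for(y, 0.9 * y))  # just under 99
```

With this definition an average VAF of 86% means the residual variance is 14% of the measured steering variance, averaged over trials.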

Copyright © 2019 by ASME


References

Nash, C. J., Cole, D. J., and Bigler, R. S., 2016, "A Review of Human Sensory Dynamics for Application to Models of Driver Steering and Speed Control," Biol. Cybern., 110(2–3), pp. 91–116.
Nash, C. J., and Cole, D. J., 2018, "Modelling the Influence of Sensory Dynamics on Linear and Nonlinear Driver Steering Control," Veh. Syst. Dyn., 56(5), pp. 689–718.
Nash, C. J., and Cole, D. J., 2018, "Identification and Validation of a Driver Steering Control Model Incorporating Human Sensory Dynamics," Veh. Syst. Dyn., in press.
Butler, J. S., Smith, S. T., Campos, J. L., and Bülthoff, H. H., 2010, "Bayesian Integration of Visual and Vestibular Signals for Heading," J. Vision, 10(11), p. 23.
Fetsch, C. R., DeAngelis, G. C., and Angelaki, D. E., 2010, "Visual-Vestibular Cue Integration for Heading Perception: Applications of Optimal Cue Integration Theory," Eur. J. Neurosci., 31(10), pp. 1721–1729.
Wolpert, D. M., and Ghahramani, Z., 2000, "Computational Principles of Movement Neuroscience," Nat. Neurosci., 3, pp. 1212–1217.
Pick, A. J., and Cole, D. J., 2007, "Dynamic Properties of a Driver's Arms Holding a Steering Wheel," Proc. Inst. Mech. Eng., Part D, 221(12), pp. 1475–1486.
Telban, R. J., and Cardullo, F., 2005, "Motion Cueing Algorithm Development: Human-Centered Linear and Nonlinear Approaches," National Aeronautics and Space Administration, Binghamton, NY, Report No. CR-2005-213747. https://ntrs.nasa.gov/archive/nasa/casi.ntrs.nasa.gov/20050180246.pdf
Sharp, R. S., and Valtetsiotis, V., 2001, "Optimal Preview Car Steering Control," Veh. Syst. Dyn., 35, pp. 101–117.
Cole, D. J., Pick, A. J., and Odhams, A. M. C., 2006, "Predictive and Linear Quadratic Methods for Potential Application to Modelling Driver Steering Control," Veh. Syst. Dyn., 44(3), pp. 259–284.
Sharp, R., Casanova, D., and Symonds, P., 2000, "A Mathematical Model for Driver Steering Control, With Design, Tuning and Performance Results," Veh. Syst. Dyn., 33(5), pp. 289–326.
Timings, J. P., and Cole, D. J., 2012, "Vehicle Trajectory Linearisation to Enable Efficient Optimisation of the Constant Speed Racing Line," Veh. Syst. Dyn., 50(6), pp. 883–901.
Authié, C. N., and Mestre, D. R., 2012, "Path Curvature Discrimination: Dependence on Gaze Direction and Optical Flow Speed," PLoS One, 7(2), p. e31479.
Naseri, A. R., and Grant, P. R., 2012, "Human Discrimination of Translational Accelerations," Exp. Brain Res., 218(3), pp. 455–464.
Mallery, R. M., Olomu, O. U., Uchanski, R. M., Militchin, V. A., and Hullar, T. E., 2010, "Human Discrimination of Rotational Velocities," Exp. Brain Res., 204(1), pp. 11–20.
Bigler, R. S., 2013, "Automobile Driver Sensory System Modeling," Ph.D. thesis, University of Cambridge, Cambridge, UK.
Girshick, A. R., and Banks, M. S., 2009, "Probabilistic Combination of Slant Information: Weighted Averaging and Robustness as Optimal Percepts," J. Vision, 9(9), p. 8.
Soyka, F., Robuffo Giordano, P., Beykirch, K. A., and Bülthoff, H. H., 2011, "Predicting Direction Detection Thresholds for Arbitrary Translational Acceleration Profiles in the Horizontal Plane," Exp. Brain Res., 209(1), pp. 95–107.
Barnett-Cowan, M., 2013, "Vestibular Perception Is Slow: A Review," Multisensory Res., 26(4), pp. 387–403.
Chattington, M., Wilson, M., Ashford, D., and Marple-Horvat, D. E., 2007, "Eye-Steering Coordination in Natural Driving," Exp. Brain Res., 180(1), pp. 1–14.
Groen, E. L., and Bles, W., 2004, "How to Use Body Tilt for the Simulation of Linear Self Motion," J. Vestibular Res. Equilib. Orientat., 14(5), pp. 375–385. https://content.iospress.com/articles/journal-of-vestibular-research/ves00205
Valente Pais, A. R., Pool, D. M., de Vroome, A. M., van Paassen, M. M., and Mulder, M., 2012, "Pitch Motion Perception Thresholds During Passive and Active Tasks," J. Guid. Control Dyn., 35(3), pp. 904–918.


Figures

Fig. 1: Structure of the driver model, reproduced from Ref. [2]

Fig. 2: Structure of plant in the driver model (adapted from Ref. [2])

Fig. 3: Model of the driver's visual system

Fig. 4: Visual display of target line to drivers, with and without preview. Note that the display used in the experiments was much more realistic than these illustrative images.

Fig. 5: Bode diagram of motion filters HHP1(s) and HHP2(s)

Fig. 6: VAF values found for each model variation using the results of the trials with scaled motion, without preview

Fig. 7: Identified measurement noise amplitudes versus RMS signal amplitudes, using model M2. RMS values correspond to perceived signals, filtered by sensory transfer functions. Vestibular noise amplitudes Va and Vω are not plotted for trials with no translational or rotational motion. Visual noise amplitudes Ve and σϕ are only plotted for the three no-motion trials. Trend lines ignore the high values at low amplitudes for Va and Ve.

Fig. 8: Variance accounted for using a single parameter set identified to fit all trials, with scaled motion and without preview

Fig. 9: Variance accounted for values found for each model variation using the results of the trials with scaled motion, with preview

Fig. 10: Correlation between metrics for the scaled motion experiment with preview: RMS path-following error; difference in VAF values between models M2 and M0; and average driver subjective ratings

Fig. 11: Variance accounted for values found for each model variation using the results of the trials with filtered motion

Fig. 12: Variance accounted for values found for each model variation using the results of the trials with full motion

Fig. 13: Variance accounted for values for all trials using a single set of parameter values, compared with VAFs for parameters found individually for each trial and the Box–Jenkins upper bound

Fig. 14: Bounds for the ratio of measured to predicted noise amplitude. The predicted noise amplitude is defined by the identified single set of parameter values; the measured noise amplitude is defined as RMS(δsim − δexp) for the upper bound and RMS(δBJ − δexp) for the lower bound.
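The abstract's finding that drivers down-weight physical motion when visual-vestibular conflict is large is commonly modelled with maximum-likelihood (inverse-variance) cue combination, the framework of the Bayesian integration studies cited above. A minimal sketch under assumed, illustrative noise levels (the paper identifies its own signal-dependent values, which these numbers do not represent):

```python
import math

def fuse_cues(visual_est, vestibular_est, sigma_vis, sigma_vest):
    """Maximum-likelihood fusion of two noisy estimates of the same
    quantity: each cue is weighted by the inverse of its variance,
    so the noisier cue contributes less."""
    w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
    w_vest = 1.0 - w_vis
    combined = w_vis * visual_est + w_vest * vestibular_est
    # The fused estimate is less noisy than either cue alone.
    combined_sigma = math.sqrt(
        (sigma_vis**2 * sigma_vest**2) / (sigma_vis**2 + sigma_vest**2)
    )
    return combined, combined_sigma

# Hypothetical yaw-rate estimates (rad/s) and noise levels.
est, sigma = fuse_cues(visual_est=1.0, vestibular_est=0.0,
                       sigma_vis=0.2, sigma_vest=0.5)
```

With these assumed numbers the fused estimate sits much closer to the (more reliable) visual cue, mirroring the observed behaviour: as vestibular noise grows, its weight shrinks towards zero and the driver effectively relies on vision alone.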


