Function Space BFGS Quasi-Newton Learning Algorithm for Time-Varying Recurrent Neural Networks

Author and Article Information
Lilai Yan

Department of Mechanical Engineering, Columbia University, New York, NY

C. James Li, Tung-Yung Huang

Department of Mechanical Engineering, Aeronautical Engineering and Mechanics, Rensselaer Polytechnic Institute, Troy, NY

J. Dyn. Sys., Meas., Control 118(1), 132-138 (Mar 01, 1996) (7 pages) doi:10.1115/1.2801133 History: Received July 22, 1992; Online December 03, 2007


This paper describes a new learning algorithm for time-varying recurrent neural networks, whose weights are functions of time rather than constant scalars. First, an objective functional of the weight functions is formulated that quantifies the discrepancy between the desired outputs and the network’s outputs. Then, dynamical optimization is used to derive the necessary conditions for an extremum of this functional. These necessary conditions take the form of a two-point boundary-value problem, which is subsequently solved by a Hilbert function space BFGS quasi-Newton algorithm, obtained by using the dyadic operator to extend the Euclidean-space BFGS method to an infinite-dimensional, real Hilbert space. Finally, the capabilities of the network and the learning algorithm are demonstrated by identifying three simulated nonlinear systems and a resistance spot welding process.
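The function-space BFGS idea in the abstract can be sketched in discretized form: a weight function w(t) is sampled on a time grid, the L2 inner product becomes a quadrature sum, and the inverse-Hessian operator is built from identity plus dyadic (outer-product) updates. The sketch below is illustrative only; the grid, the quadratic test functional, and all names are assumptions, not the paper's network functional or implementation.

```python
import numpy as np

# Discretize the time interval; a weight function w(t) becomes a vector of
# samples, and <u, v> = ∫ u(t) v(t) dt becomes dt * (u · v).
dt = 0.01
t = np.arange(0.0, 1.0, dt)
n = t.size

def inner(u, v):
    """Discretized L2 inner product on the time grid."""
    return dt * np.dot(u, v)

def dyadic(u, v):
    """Dyadic operator u ⊗ v, acting by x ↦ u <v, x>, as a matrix."""
    return np.outer(u, dt * v)

# Illustrative quadratic objective functional (NOT the paper's network
# functional): J[w] = ∫ [(w - target)^2 + alpha w^2] dt, whose exact
# minimizer is w*(t) = target(t) / (1 + alpha).
target = np.sin(2.0 * np.pi * t)
alpha = 0.1

def J(w):
    return inner(w - target, w - target) + alpha * inner(w, w)

def gradJ(w):
    # Functional (L2) gradient of J
    return 2.0 * (w - target) + 2.0 * alpha * w

# Hilbert-space BFGS: maintain an inverse-Hessian operator H as identity
# plus dyadic updates, with all inner products taken in L2.
I = np.eye(n)
H = I.copy()
w = np.zeros(n)
g = gradJ(w)
for _ in range(50):
    if np.sqrt(inner(g, g)) < 1e-8:   # stop when the L2 gradient norm is tiny
        break
    d = -H @ g                        # search direction: H applied to gradient
    step = 1.0                        # backtracking (Armijo) line search
    while J(w + step * d) > J(w) + 1e-4 * step * inner(g, d):
        step *= 0.5
    s = step * d
    g_new = gradJ(w + s)
    y = g_new - g
    rho = 1.0 / inner(y, s)
    # Inverse-Hessian BFGS update written with dyadic operators:
    #   H ← (I − ρ s⊗y) H (I − ρ y⊗s) + ρ s⊗s
    H = (I - rho * dyadic(s, y)) @ H @ (I - rho * dyadic(y, s)) + rho * dyadic(s, s)
    w, g = w + s, g_new
```

Because the test functional is quadratic with a Hessian proportional to the identity, the iteration recovers w ≈ target / (1 + alpha) in a few steps; in the paper's setting the gradient would instead come from integrating the two-point boundary-value problem.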

Copyright © 1996 by The American Society of Mechanical Engineers