TECHNICAL PAPERS

An Efficient Algorithm for Computing QFT Bounds

Author and Article Information
J. M. Rodrigues, Y. Chait

Mechanical and Industrial Engineering Department, University of Massachusetts, Amherst, MA 01003

C. V. Hollot

Electrical and Computer Engineering Department, University of Massachusetts, Amherst, MA 01003

J. Dyn. Sys., Meas., Control 119(3), 548-552 (Sep 01, 1997) (5 pages) doi:10.1115/1.2801292 History: Received October 29, 1996; Online December 03, 2007

Abstract

An important step in Quantitative Feedback Theory (QFT) design is the translation of closed-loop performance specifications into QFT bounds. These bounds, domains in a Nichols chart, serve as a guide for shaping the nominal loop response. Traditionally, QFT practitioners relied on manual manipulations of plant templates on Nichols charts to construct such bounds, a tedious process which has recently been replaced with numerical algorithms. However, since the plant template is approximated by a finite number of points, the QFT bound computation grows exponentially with the fineness of the plant template approximation. As a result, the designer is forced to choose between a coarse approximation to lessen the computational burden and a finer one to obtain more accurate QFT bounds. To help mitigate this tradeoff, this paper introduces a new algorithm to more efficiently compute QFT bounds. Examples are given to illustrate the numerical efficiency of this new algorithm.
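The computational tradeoff described in the abstract can be illustrated with a brute-force sketch: at a single frequency, grid over candidate controller phases and gains on the Nichols chart, and at each grid point check a closed-loop magnitude specification against every point of the (finitely approximated) plant template. This is a minimal illustration of why cost scales with template fineness, not the paper's algorithm; the function name, the gridding scheme, and the particular spec `|PC/(1+PC)| <= Ws` are all assumptions for the sake of the example.

```python
import numpy as np

def bound_at_phase(template, phases_deg, gains_db, Ws):
    """Brute-force QFT-style bound at one frequency (illustrative sketch).

    For each candidate controller phase, search upward in gain for the
    smallest controller gain (dB) such that the closed-loop magnitude
    |P*C / (1 + P*C)| <= Ws holds for EVERY plant P in the template.
    Cost is O(len(phases) * len(gains) * len(template)), so a finer
    template approximation directly increases the work per bound point.
    """
    bound = []
    for ph in phases_deg:
        found = np.nan                      # NaN marks "no admissible gain on this grid"
        for g in gains_db:
            C = 10 ** (g / 20.0) * np.exp(1j * np.deg2rad(ph))
            L = template * C                # open loop at every template point
            if np.all(np.abs(L / (1 + L)) <= Ws):
                found = g                   # first (smallest) gain that works
                break
        bound.append(found)
    return bound
```

For example, with a one-point template `P = 1`, a spec `Ws = 1.3`, and gains gridded from -20 dB upward, the bound at 0 deg phase is the lowest gridded gain, since the spec is already satisfied there. Refining the template (more points) multiplies the inner check, which is the burden the paper's algorithm is designed to reduce.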

Copyright © 1997 by The American Society of Mechanical Engineers