Research Papers

Convergence Properties of a Computational Learning Model for Unknown Markov Chains

Author and Article Information
Andreas A. Malikopoulos

Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109; amaliko@umich.edu

J. Dyn. Sys., Meas., Control 131(4), 041011 (May 20, 2009) (7 pages) doi:10.1115/1.3117202 History: Received March 18, 2008; Revised February 04, 2009; Published May 20, 2009

The increasing complexity of engineering systems has motivated continuing research on computational learning methods aimed at making autonomous intelligent systems that can learn to improve their performance over time while interacting with their environment. Such systems must not only sense their environment, but also integrate information from the environment into every decision they make. The evolution of such a system is modeled as an unknown controlled Markov chain. In previous research, the predictive optimal decision-making (POD) model was developed to learn, in real time, the unknown transition probabilities and associated costs over a varying finite time horizon. In this paper, the convergence of the POD model to the stationary distribution of a Markov chain is proven, establishing the POD as a robust model for building autonomous intelligent systems. The paper provides the conditions under which the POD model is valid, together with an interpretation of its underlying structure.
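The abstract's central claim, that a learner's estimates converge to the chain's stationary distribution, can be illustrated with a minimal sketch. This is not the authors' POD model; it is a generic maximum-likelihood estimator over a small hypothetical 3-state chain (the matrix `P` below is invented for illustration): count observed transitions to estimate the unknown transition probabilities, and compare the empirical visit frequencies against the stationary distribution computed from `P`.

```python
import numpy as np

# Hypothetical 3-state chain; P is unknown to the learner and used
# only to generate observations and to compute the true stationary
# distribution for comparison.
rng = np.random.default_rng(0)
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])
n_states = P.shape[0]

# Learn in real time: count transitions as they are observed.
counts = np.zeros((n_states, n_states))
state = 0
for _ in range(20000):
    nxt = rng.choice(n_states, p=P[state])
    counts[state, nxt] += 1
    state = nxt

# Maximum-likelihood estimate of the transition matrix.
P_hat = counts / counts.sum(axis=1, keepdims=True)

# Stationary distribution pi solves pi P = pi: the left eigenvector
# of P associated with eigenvalue 1, normalized to sum to 1.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()

# For an ergodic chain, empirical visit frequencies converge to pi.
freq = counts.sum(axis=1) / counts.sum()
print("max |P_hat - P|  :", np.max(np.abs(P_hat - P)))
print("max |freq - pi|  :", np.max(np.abs(freq - pi)))
```

Both printed errors shrink as the number of observed transitions grows, which is the elementary version of the convergence behavior the paper proves for the POD model over a varying finite horizon.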

Copyright © 2009 by American Society of Mechanical Engineers
Topics: Chain, Probability



Figure 1: Stochastic system model schematic



