Walter H. Chung

About the Authors:

Jason L. Speyer is a Distinguished Professor in the Mechanical and Aerospace Engineering Department and the Electrical Engineering Department at the University of California, Los Angeles. Dr. Speyer has twice been an elected member of the Board of Governors of the IEEE Control Systems Society and has served as an Associate Editor for IEEE and AIAA journals. He is a Fellow of the AIAA and a Life Fellow of the IEEE and has been honored with awards from both organizations. He is also a member of the National Academy of Engineering. Walter H. Chung currently works in the aerospace industry. He has taught graduate courses in stochastic processes, estimation, and control at UCLA since 1997.

Book Information for Stochastic Processes, Estimation, and Control


Synopsis:

Uncertainty and risk are integral to engineering because real systems have inherent ambiguities that arise naturally or from our inability to model complex physics. The authors discuss probability theory, stochastic processes, estimation, and stochastic control strategies, and show how probability can be used to model uncertainty in control and estimation problems. The material is practical and rich in research opportunities.

The authors provide a comprehensive treatment of stochastic systems, from the foundations of probability to stochastic optimal control. The book covers discrete- and continuous-time stochastic dynamic systems, leading to the derivation of the Kalman filter, its properties, and its relation to the frequency-domain Wiener filter. It also presents the dynamic programming derivation of the linear quadratic Gaussian (LQG) and linear exponential Gaussian (LEG) controllers and their relation to H2 and H-infinity controllers and system robustness.

Stochastic Processes, Estimation, and Control is divided into three related sections. First, the authors present the concepts of probability theory, random variables, and stochastic processes, which lead to the topics of expectation, conditional expectation, discrete-time estimation, and the Kalman filter. After establishing this foundation, stochastic calculus and continuous-time estimation are introduced. Finally, dynamic programming for both discrete-time and continuous-time systems leads to the solution of optimal stochastic control problems, resulting in controllers with significant practical application.

This book is suitable for first-year graduate students in electrical, mechanical, chemical, and aerospace engineering specializing in systems and control. Students in computer science, economics, and possibly business will also find it useful, and professionals in all these fields will find it of interest.
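
The discrete-time Kalman filter mentioned above is the centerpiece of the book's estimation material. As a rough illustration only (not the authors' derivation), the sketch below runs the predict/update recursion for a scalar linear-Gaussian system; the gains a, c and noise variances q, r are assumed values chosen purely for the example.

```python
# Minimal scalar Kalman filter sketch (illustrative only; all parameters are assumed).
# Model: x[k+1] = a*x[k] + w[k],  y[k] = c*x[k] + v[k],
# with process noise w ~ N(0, q) and measurement noise v ~ N(0, r).
import numpy as np

rng = np.random.default_rng(0)

a, c = 0.95, 1.0    # assumed state-transition and measurement gains
q, r = 0.01, 0.25   # assumed process and measurement noise variances

# Simulate the true state and noisy measurements.
n_steps = 100
x_true = np.zeros(n_steps)
y_meas = np.zeros(n_steps)
for k in range(1, n_steps):
    x_true[k] = a * x_true[k - 1] + rng.normal(0.0, np.sqrt(q))
    y_meas[k] = c * x_true[k] + rng.normal(0.0, np.sqrt(r))

# Kalman filter recursion: predict, then update with each new measurement.
x_hat, p = 0.0, 1.0   # initial estimate and error variance
x_est = np.zeros(n_steps)
for k in range(1, n_steps):
    # Predict: propagate the estimate and its variance through the dynamics.
    x_pred = a * x_hat
    p_pred = a * p * a + q
    # Update: weigh the measurement residual by the Kalman gain.
    gain = p_pred * c / (c * p_pred * c + r)
    x_hat = x_pred + gain * (y_meas[k] - c * x_pred)
    p = (1.0 - gain * c) * p_pred
    x_est[k] = x_hat

print("RMS estimation error:", np.sqrt(np.mean((x_est - x_true) ** 2)))
```

The vector case follows the same predict/update structure with matrix-valued covariances, which is the setting in which the book develops the filter and relates it to the Wiener filter and to the LQG and LEG controllers.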