Our Editors also recommend:
Control System Design: An Introduction to State-Space Methods by Bernard Friedland Introduction to state-space methods covers feedback control; state-space representation of dynamic systems and dynamics of linear systems; frequency-domain analysis; controllability and observability; shaping the dynamic response; and more. 1986 edition.
Adaptive Control: Stability, Convergence and Robustness by Shankar Sastry, Marc Bodson Clear, conceptual presentation surveys major results, techniques of analysis, and research. Focuses chiefly on linear, continuous time, and single-input, single-output systems, including relevant algorithms, dynamic properties, and tools for analysis. 1989 edition.
Adaptive Control: Second Edition by Karl J. Åström, Björn Wittenmark Suitable for advanced undergraduates and graduate students, this overview introduces theoretical and practical aspects of adaptive control, with emphasis on deterministic and stochastic viewpoints. 1995 edition.
Introduction to Stochastic Control Theory by Karl J. Åström Exploration of stochastic control theory in terms of analysis, parametric optimization, and optimal stochastic control. Limited to linear systems with quadratic criteria; covers discrete time and continuous time systems. 1970 edition.
Computer-Controlled Systems: Theory and Design, Third Edition by Karl J. Åström, Björn Wittenmark This volume features computational tools that can be applied directly and are explained with simple calculations, plus an emphasis on control system principles and ideas. Includes worked examples, MATLAB macros, and solutions manual.
Linear Robust Control by Michael Green, David J.N. Limebeer Recent decades have witnessed enormous strides in the field of robust control of dynamical systems. This text for students and control engineers provides an in-depth examination of modern advances. 1995 edition.
Adaptive Filtering Prediction and Control by Graham C. Goodwin, Kwai Sang Sin This unified survey focuses on linear discrete-time systems and explores natural extensions to nonlinear systems. It emphasizes discrete-time systems, summarizing theoretical and practical aspects of a large class of adaptive algorithms. 1984 edition.
Algebras of Holomorphic Functions and Control Theory by Amol Sasane Accessible, undergraduate-level text illustrates the role of algebras of holomorphic functions in the stabilization of a linear control system. Concise, self-contained treatment avoids advanced mathematics. 2009 edition.
Decentralized Control of Complex Systems by Dragoslav D. Siljak This book explores optimization, output feedback, manipulative power of graphs, overlapping decompositions and the underlying inclusion principle, and reliability design. An appendix provides efficient graph algorithms. 1991 edition.
Feedback Control Theory by John C. Doyle, Bruce A. Francis, Allen R. Tannenbaum This excellent introduction to feedback control system design offers a theoretical approach that captures the essential issues and can be applied to a wide range of practical problems. 1992 edition.
Functional Analysis and Linear Control Theory by J. R. Leigh Functional analysis provides a concise conceptual framework for linear control theory. This self-contained text demonstrates the subject's unity with a wide range of powerful theorems. 1980 edition.
Lyapunov Matrix Equation in System Stability and Control by Zoran Gajic, Muhammad Tahir Javed Qureshi This book provides solutions to many engineering and mathematical problems related to the Lyapunov matrix equation. Its considerations of development and applications make it practical for problem solving and research. 1995 edition.
Optimal Control: Linear Quadratic Methods by Brian D. O. Anderson, John B. Moore Numerous examples highlight this treatment of linear quadratic Gaussian methods and control system design. It explores linear optimal control theory and applications from an engineering viewpoint. Complete solutions. 1990 edition.
Optimal Control and Estimation by Robert F. Stengel Graduate-level text provides introduction to optimal control theory for stochastic systems, emphasizing application of basic concepts to real problems. "Invaluable as a reference for those already familiar with the subject." — Automatica.
Optimal Control Theory: An Introduction by Donald E. Kirk Upper-level undergraduate text introduces aspects of optimal control theory: dynamic programming, Pontryagin's minimum principle, and numerical techniques for trajectory optimization. Numerous figures, tables. Solution guide available upon request. 1970 edition.
Digital Filters by Richard W. Hamming Introductory text examines role of digital filtering in many applications, particularly computers. Focus on linear signal processing; some consideration of roundoff effects, Kalman filters. Only calculus, some statistics required.
Digital Processing of Random Signals: Theory and Methods by Boaz Porat This excellent advanced text rigorously covers several topics. Geared toward students of electrical engineering, its material is sufficiently general to be applicable to other engineering fields. 1994 edition.
Principles of Digital Communication and Coding by Andrew J. Viterbi, Jim K. Omura This classic by two digital communications experts is geared toward students of communications theory and to designers of channels, links, terminals, modems, or networks used to transmit and receive digital messages. 1979 edition.
This tutorial-style presentation of the fundamental techniques and algorithms in adaptive control is designed to meet the needs of a wide audience without sacrificing mathematical depth or rigor. The text explores the design, analysis, and application of a wide variety of algorithms that can be used to manage dynamical systems with unknown parameters. Topics include models for dynamic systems, stability, online parameter estimation, parameter identifiers, model reference adaptive control, adaptive pole placement control, and robust adaptive laws. Engineers and students interested in learning how to design, simulate, and implement parameter estimators and adaptive control schemes will find that this treatment does not require a full understanding of the analytical and technical proofs. This volume will also serve graduate students who wish to examine the analysis of simple schemes and discover the steps involved in more complex proofs. Advanced students and researchers will find it a guide to grasping longer, more technical proofs. Numerous examples demonstrating design procedures and the techniques of basic analysis enrich the text.
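To give a flavor of the "online parameter estimation" topic mentioned above, here is a minimal illustrative sketch (not taken from the book): a discrete-time normalized gradient estimator for a static plant y = θ·u with unknown gain θ. The values of the true parameter, the adaptation gain γ, and the input sequence are arbitrary choices for the demonstration.

```python
# Illustrative sketch of a normalized gradient parameter estimator.
# Plant: y = theta_true * u, with theta_true unknown to the estimator.
# theta_true, gamma, and the input sequence are arbitrary demo values.

def estimate(theta_true=2.5, gamma=0.5, steps=200):
    theta_hat = 0.0  # initial parameter guess
    for k in range(steps):
        u = 1.0 + 0.5 * ((-1) ** k)   # alternating (persistently exciting) input
        y = theta_true * u            # measured plant output (noise-free)
        e = y - theta_hat * u         # prediction error
        # Normalized gradient update: the 1 + u^2 term bounds the step size.
        theta_hat += gamma * u * e / (1.0 + u * u)
    return theta_hat

print(round(estimate(), 3))  # converges to the true parameter, 2.5
```

With this input the estimation error shrinks by a constant factor every two steps, so the estimate converges to the true value; with noisy measurements or time-varying parameters, the robust adaptive laws discussed in the book become necessary.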
Reprint of the Prentice-Hall, Inc., Upper Saddle River, New Jersey, 1996 edition.