Abstract
This paper presents a method for obtaining optimal stable control of general nonlinear nonautonomous dynamical systems. The approach is inspired by recent developments in analytical dynamics and by the observation that the Lyapunov criterion for the stability of dynamical systems can be recast as a constraint imposed on the system. A closed-form expression for the control is obtained that simultaneously minimizes a user-defined control cost at each instant of time and enforces the Lyapunov constraint. The derivation of this expression closely mirrors the development of the fundamental equation of motion used in the study of constrained motion. For the method to work, the positive definite functions used in the Lyapunov constraint must satisfy a consistency condition, and a class of positive definite functions that meets this condition is provided for mechanical systems. To illustrate the broad scope of the method, it is shown that for linear systems a proper choice of these positive definite functions recovers conventional LQR control. Control of the Lorenz system and of a multi-degree-of-freedom nonlinear mechanical system is then considered, and numerical examples demonstrating the efficacy and simplicity of the method are provided.
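A minimal sketch of the central idea, under simplifying assumptions not taken from the paper itself: full actuation (the control enters every state equation), an identity control-cost weight, and the simple choice V = ½‖x‖². The Lyapunov requirement dV/dt = −kV then becomes a single linear equation in the control u, and the minimum-norm u enforcing it follows from the Moore–Penrose pseudoinverse, echoing the fundamental equation of constrained motion. The Lorenz system is used as the plant, as in the paper's examples, though the gains and step size here are illustrative.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact formulation):
# Lorenz dynamics with assumed full actuation, x_dot = f(x) + u.
sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0

def f(x):
    """Uncontrolled Lorenz vector field."""
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def control(x, k=4.0):
    """Minimum-norm u enforcing the Lyapunov constraint
    dV/dt = -k * V  with  V = 0.5 * ||x||^2.
    The constraint  x . (f(x) + u) = -k * V  is one linear
    equation A u = b; its least-norm solution is u = A^+ b."""
    A = x.reshape(1, -1)                 # gradient of V, as a row
    b = -0.5 * k * (x @ x) - x @ f(x)    # required dV/dt minus the drift term
    return (np.linalg.pinv(A) * b).ravel()

# Forward-Euler simulation from a generic initial state.
x, dt = np.array([1.0, 1.0, 1.0]), 1e-3
for _ in range(5000):
    x = x + dt * (f(x) + control(x))

print(np.linalg.norm(x))  # state norm is driven toward zero
```

With a general weighted cost uᵀRu, the same scalar constraint A u = b has the closed-form minimizer u = R⁻¹Aᵀ(A R⁻¹Aᵀ)⁺b; the paper's construction is of this weighted, pseudoinverse-based form, and its consistency condition on the positive definite functions is what guarantees the constraint remains enforceable when the system is not fully actuated.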
Conflict of interest
None.
Udwadia, F.E., Koganti, P.B. Optimal stable control for nonlinear dynamical systems: an analytical dynamics based approach. Nonlinear Dyn 82, 547–562 (2015). https://doi.org/10.1007/s11071-015-2175-1