
Optimal stable control for nonlinear dynamical systems: an analytical dynamics based approach

  • Original Paper
  • Nonlinear Dynamics

Abstract

This paper presents a method for obtaining optimal stable control for general nonlinear, nonautonomous dynamical systems. The approach is inspired by recent developments in analytical dynamics and by the observation that the Lyapunov criterion for stability of dynamical systems can be recast as a constraint imposed on the system. A closed-form expression for the control is obtained that, at each instant of time, minimizes a user-defined control cost while simultaneously enforcing the Lyapunov constraint. The derivation of this expression closely mirrors the development of the fundamental equation of motion used in the study of constrained motion. For the method to work, the positive definite functions used in the Lyapunov constraint must satisfy a consistency condition, and a class of positive definite functions that satisfies this condition is provided for mechanical systems. To illustrate the broad scope of the method, it is shown that for linear systems a proper choice of these positive definite functions recovers conventional LQR control. Control of the Lorenz system and of a multi-degree-of-freedom nonlinear mechanical system is also considered. Numerical examples demonstrate the efficacy and simplicity of the method.
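
To make the central idea concrete, the sketch below illustrates one way the Lyapunov criterion can be imposed as an instantaneous constraint and a minimum-cost control recovered in closed form. It is an illustrative reconstruction, not the paper's exact formulation: it assumes a fully actuated Lorenz system with standard parameters, the candidate function V(x) = ½‖x‖², a prescribed exponential decay rate α (so that dV/dt = −αV), and a control-cost weight N, with the weighted minimum-norm solution written via a pseudoinverse in the spirit of the fundamental equation of constrained motion. All parameter values and function names are hypothetical.

```python
# Illustrative sketch (assumptions as stated above, not the paper's exact equations):
# at each instant choose u minimizing u^T N u subject to the Lyapunov constraint
#   dV/dt = grad V . (f(x) + B u) = -alpha * V,
# i.e. A u = b with A = grad V^T B and b = -alpha*V - grad V^T f(x).
import numpy as np
from scipy.integrate import solve_ivp

sigma, rho, beta = 10.0, 28.0, 8.0 / 3.0   # standard Lorenz parameters (assumed)
alpha = 2.0                                 # prescribed decay rate for V (assumed)
B = np.eye(3)                               # fully actuated, for illustration only
N = np.eye(3)                               # control-cost weight in u^T N u

def f(x):
    """Uncontrolled Lorenz vector field."""
    return np.array([sigma * (x[1] - x[0]),
                     x[0] * (rho - x[2]) - x[1],
                     x[0] * x[1] - beta * x[2]])

def control(x):
    """Minimum-cost control enforcing dV/dt = -alpha*V for V = 0.5*||x||^2."""
    grad_v = x                                   # gradient of V(x) = 0.5 x.x
    v = 0.5 * x @ x
    a_row = (grad_v @ B).reshape(1, -1)          # constraint row: a_row @ u = b
    b = np.array([-alpha * v - grad_v @ f(x)])
    # Weighted minimum-norm solution: with N = L L^T, u = L^{-T} (a_row L^{-T})^+ b
    m = np.linalg.inv(np.linalg.cholesky(N)).T
    return m @ (np.linalg.pinv(a_row @ m) @ b)

def closed_loop(t, x):
    """Controlled dynamics x' = f(x) + B u(x)."""
    return f(x) + B @ control(x)

sol = solve_ivp(closed_loop, (0.0, 10.0), [1.0, 1.0, 1.0], max_step=0.01)
print("final state:", sol.y[:, -1])              # driven toward the origin
```

Because the constraint forces V(t) = V(0)e^{−αt}, the state is driven to the origin. The consistency condition mentioned in the abstract, which guarantees that the Lyapunov constraint can actually be met with the available actuation, is not checked in this toy, fully actuated example.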



Conflict of interest

None.

Author information

Correspondence to Firdaus E. Udwadia.

Rights and permissions

Reprints and permissions

About this article

Cite this article

Udwadia, F.E., Koganti, P.B. Optimal stable control for nonlinear dynamical systems: an analytical dynamics based approach. Nonlinear Dyn 82, 547–562 (2015). https://doi.org/10.1007/s11071-015-2175-1
