University of Wollongong

Optimal control: theory and application to science, engineering, and social sciences

Journal contribution
posted on 2024-11-15, 12:37, authored by Davide La Torre, Herb Kunze, Manuel Ruiz-Galan, Tufail Malik and Simone Marsiglio
[extract] An optimal control problem entails the identification of a feasible scheme, policy, program, strategy, or campaign that achieves the best possible outcome for a system. More formally, an optimal control problem consists in endogenously steering a parameter in a mathematical model so as to produce an optimal output, using some optimization technique. The problem comprises an objective (or cost) functional, which depends on the state and control variables, and a set of constraints. The goal is to optimize this functional subject to the constraints imposed by the model describing the evolution of the underlying system. The two most widely used solution techniques for optimal control problems are Pontryagin's maximum principle and the Hamilton-Jacobi-Bellman equation.
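To fix ideas, a minimal sketch of the standard finite-horizon problem and of the two solution routes mentioned above is given below; the symbols (state x, control u, costate \lambda, dynamics f, running cost L, terminal cost \Phi) are generic textbook notation and are not taken from the issue itself.

\begin{align*}
& \min_{u(\cdot)} \; J(u) = \int_0^T L\big(x(t),u(t),t\big)\,dt + \Phi\big(x(T)\big) \\
& \text{subject to } \dot{x}(t) = f\big(x(t),u(t),t\big), \qquad x(0) = x_0 .
\end{align*}

With the Hamiltonian H(x,u,\lambda,t) = L(x,u,t) + \lambda^{\top} f(x,u,t), Pontryagin's maximum principle asserts that an optimal pair (x^{*},u^{*}) admits a costate \lambda satisfying

\begin{align*}
\dot{\lambda}(t) = -\frac{\partial H}{\partial x}\big(x^{*}(t),u^{*}(t),\lambda(t),t\big), \qquad \lambda(T) = \frac{\partial \Phi}{\partial x}\big(x^{*}(T)\big),
\end{align*}

with u^{*}(t) minimizing H pointwise in u (or maximizing it, depending on the sign convention). The dynamic-programming route instead introduces the value function V(x,t) and solves the Hamilton-Jacobi-Bellman equation

\begin{align*}
-\frac{\partial V}{\partial t}(x,t) = \min_{u}\Big[\, L(x,u,t) + \frac{\partial V}{\partial x}(x,t)^{\top} f(x,u,t) \Big], \qquad V(x,T) = \Phi(x),
\end{align*}

from which the optimal control is recovered as the minimizer in the bracket, evaluated along the optimal trajectory.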

History

Citation

La Torre, D., Kunze, H., Ruiz-Galan, M., Malik, T. & Marsiglio, S. (2015). Optimal control: theory and application to science, engineering, and social sciences. Abstract and Applied Analysis, 2015, 90527-1-90527-2.

Journal title

Abstract and Applied Analysis

Volume

2015

Language

English

RIS ID

100765
