• a dynamical system is simply a system (a set of variables) whose state (the value of all its variables) can be predicted by looking at the previous state alone, nothing else. In other words, the state is simply a function of the previous state.
• optimal control theory is the field all about the problem and solution(s) of controlling dynamical systems optimally with respect to some objective function.

A Dynamical System is a system (a set of variables) that changes with respect to something (usually time). Time can be continuous or discrete. So why don’t we just say that a dynamical system is simply something as a function of something else? LOL. Well, because what makes a dynamical system special is that its state (the value of its variables) at a particular time can always predict (i.e. tell you) the state at the next time step. So if at any point in discrete time you know the state of the system, you can calculate its state at the next discrete time step.

In other words, the state (the value of all the variables in the system) can be expressed simply as a function of the previous state!
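To make that concrete, here’s a minimal sketch in Python (the particular update rule `f` below is just an arbitrary illustration, not anything canonical):

```python
def f(state):
    # Any rule mapping the current state to the next one will do; this
    # particular rule (halving) is just an arbitrary illustration.
    return 0.5 * state

state = 10.0
for step in range(5):
    print(f"step {step}: state = {state}")
    state = f(state)  # the next state depends ONLY on the current state
```

The key property is that the loop body never consults anything except `state` itself: no external inputs, no history beyond the previous step.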

One easy-to-understand example of a dynamical system is the population of bacteria over time. Again, we’ll consider discrete time steps because it’s easier to explain this way. Our state will simply be one variable: the number of cells. The number of cells at any time step is simply twice the number of cells at the previous time step (of course, we are assuming that each cell divides into 2 every time step).
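Here’s that doubling rule in a few lines of Python (assuming, as above, one division per cell per time step and no cell deaths):

```python
cells = 1  # start with a single cell
for step in range(8):
    print(f"step {step}: {cells} cells")
    cells = 2 * cells  # every cell divides into 2, so the count doubles
```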

Note that your state doesn’t have to evolve with respect to time; it can evolve with respect to something else, maybe space. As long as at each “space” step your system can be predicted by just looking at the previous “space” step, you’ve still got yourself a dynamical system!
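For instance (a made-up illustration), think of a signal weakening as it travels down a wire: the strength at each position depends only on the strength at the previous position, the same structure as before, just indexed by space instead of time:

```python
strength = 100.0
for position in range(5):
    print(f"position {position}: strength = {strength}")
    strength = 0.8 * strength  # next position sees 80% of the previous one
```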

There is nothing special about time! It just happens to be the domain of a lot of our functions, but don’t think it has any special meaning in mathematics!

People often want to control dynamical systems in such a way as to maximize some objective function. This field is known as “optimal control theory”.

Optimal Control Theory is a field all about the problem and solution(s) of controlling dynamical systems optimally with respect to some objective function.
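To make the idea concrete, here’s a toy sketch (the dynamics, objective, and brute-force search are all invented for this illustration; real optimal control uses tools like dynamic programming or Pontryagin’s maximum principle). We pick the sequence of control inputs that maximizes the objective:

```python
import itertools

def step(x, u):
    return x + u  # made-up dynamics: the control input nudges the state

def objective(trajectory, target=5.0):
    # Higher is better: we reward trajectories that stay near a target value.
    return -sum((x - target) ** 2 for x in trajectory)

controls = [-1.0, 0.0, 1.0]  # allowed control inputs at each step
horizon = 4                  # number of time steps to plan over

best_score, best_seq = float("-inf"), None
for seq in itertools.product(controls, repeat=horizon):
    x, trajectory = 0.0, []
    for u in seq:
        x = step(x, u)
        trajectory.append(x)
    score = objective(trajectory)
    if score > best_score:
        best_score, best_seq = score, seq

print("best control sequence:", best_seq)
```

Brute force only works here because the problem is tiny (3 controls, 4 steps = 81 candidate sequences), but it captures the essence: a dynamical system, a knob you can turn at each step, and an objective that scores the result.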