Wednesday, February 28, 2007

Definition - Dynamical Systems

This post is part of a series, marked by the tag "Dynamical Systems", in which I will offer thoughts on the nature of Dynamical Systems as a mathematical discipline.

Today I will briefly give my definition of the subject, and stop there.

Definition: Dynamical Systems is the formal study of the properties of mathematical objects by examining how those objects behave under transformations.

Usually the types of transformations involved are defined in one of two ways:

  • By a continuous variable (as with differential equations, where time is viewed as an action of the real numbers on the space of solutions of the ODEs; dynamical systems of this kind are called flows), or
  • by a discrete variable (think of the behavior of points of a space under repeated application of a single map from the space to itself; this is viewed as an integer action on the domain of the map, where each integer n corresponds to the n-fold composition of the map with itself). Both cases are sketched formally below.
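
To make the two cases concrete, here is a minimal formal sketch in standard notation (the symbols X, T, and φ below are my own choices, not part of the definition above):

```latex
% Discrete time: a map T : X \to X generates an action of the integers
% (or of the natural numbers, if T is not invertible) by iteration:
T^{n} \;=\; \underbrace{T \circ T \circ \cdots \circ T}_{n\ \text{times}}, \qquad n \ge 0.

% Continuous time: a flow is a map \varphi : \mathbb{R} \times X \to X with
\varphi(0, x) = x, \qquad \varphi(s + t, x) = \varphi\bigl(s, \varphi(t, x)\bigr),

% as arises, for example, from an ODE \dot{x} = f(x), where \varphi(t, x_0)
% is the solution at time t with initial condition x_0.
```

In both pictures the transformations assemble into an action of the "time" variable (integers or reals) on the space, which is exactly what the definition above is meant to capture.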

This definition encompasses a very broad interpretation of Dynamical Systems, and it reflects the subject's use across many areas of mathematics, from algebra and analysis to probability and statistics, to topology and geometry, to number theory.

It is also my favorite...
