In mathematics, a measure-preserving dynamical system is an object of study in the abstract formulation of dynamical systems, and ergodic theory in particular. Measure-preserving systems obey the Poincaré recurrence theorem, and are a special case of conservative systems. They provide the formal, mathematical basis for a broad range of physical systems, including many systems from classical mechanics (in particular, most non-dissipative systems) as well as systems in thermodynamic equilibrium.

## Definition

A measure-preserving dynamical system is defined as a probability space and a measure-preserving transformation on it. In more detail, it is a system

${\displaystyle (X,{\mathcal {B}},\mu ,T)}$

with the following structure:

• ${\displaystyle X}$ is a set,
• ${\displaystyle {\mathcal {B}}}$ is a σ-algebra over ${\displaystyle X}$,
• ${\displaystyle \mu :{\mathcal {B}}\rightarrow [0,1]}$ is a probability measure, so that ${\displaystyle \mu (X)=1}$ and ${\displaystyle \mu (\varnothing )=0}$,
• ${\displaystyle T:X\rightarrow X}$ is a measurable transformation which preserves the measure ${\displaystyle \mu }$, i.e., ${\displaystyle \forall A\in {\mathcal {B}}\;\;\mu (T^{-1}(A))=\mu (A)}$.
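On a finite set, every map is measurable and the σ-algebra is the full power set, so the defining condition can be checked exhaustively. A minimal sketch (the set, measure, and map below are invented purely for illustration): the rotation x ↦ (x + 1) mod 3 preserves the uniform measure.

```python
from itertools import chain, combinations

# Toy example (invented for illustration): X = {0, 1, 2} with the uniform
# probability measure, and T the rotation x -> (x + 1) mod 3.
X = {0, 1, 2}
mu = {x: 1.0 / 3.0 for x in X}

def T(x):
    return (x + 1) % 3

def measure(A):
    return sum(mu[x] for x in A)

def preimage(A):
    return {x for x in X if T(x) in A}   # T^{-1}(A)

# On a finite set the sigma-algebra is the full power set, so the
# condition mu(T^{-1}(A)) = mu(A) can be checked for every subset A.
subsets = chain.from_iterable(combinations(sorted(X), r) for r in range(4))
ok = all(abs(measure(preimage(set(A))) - measure(set(A))) < 1e-12 for A in subsets)
```

Note that the check uses the preimage, not the image; as discussed below, the preimage formulation is the one that generalizes correctly.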

## Discussion

One may ask why the measure-preserving transformation is defined in terms of the inverse ${\displaystyle \mu (T^{-1}(A))=\mu (A)}$ instead of the forward transformation ${\displaystyle \mu (T(A))=\mu (A)}$. This can be understood intuitively.

Consider the typical measure on the unit interval ${\displaystyle [0,1]}$, and a map ${\displaystyle Tx=2x\mod 1={\begin{cases}2x&{\text{if }}x<1/2\\2x-1&{\text{if }}1/2\leq x\end{cases}}}$. This is the Bernoulli map. Now, distribute an even layer of paint on the unit interval ${\displaystyle [0,1]}$, and then map the paint forward. The paint on the ${\displaystyle [0,1/2]}$ half is spread thinly over all of ${\displaystyle [0,1]}$, and so is the paint on the ${\displaystyle [1/2,1]}$ half. The two thin layers, stacked together, recreate exactly the original paint thickness.

More generally, the paint that would arrive at subset ${\displaystyle A\subset [0,1]}$ comes from the subset ${\displaystyle T^{-1}(A)}$. For the paint thickness to remain unchanged (measure-preserving), the mass of incoming paint should be the same: ${\displaystyle \mu (A)=\mu (T^{-1}(A))}$.
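For the Bernoulli map this bookkeeping can be checked directly: the preimage of an interval [a, b) consists of two intervals, one in each half of [0, 1), whose lengths sum to b - a. A quick numerical sketch (the interval endpoints are chosen arbitrarily):

```python
# Preimage of [a, b) under T(x) = 2x mod 1: the two inverse branches
# x -> x/2 and x -> (x + 1)/2 each contribute an interval of length (b - a)/2.
def preimage_length(a, b):
    left = b / 2 - a / 2                  # preimage piece inside [0, 1/2)
    right = (1 + b) / 2 - (1 + a) / 2     # preimage piece inside [1/2, 1)
    return left + right

# Lebesgue measure is preserved: total incoming length equals b - a.
assert abs(preimage_length(0.2, 0.7) - 0.5) < 1e-12
```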

Consider a mapping ${\displaystyle {\mathcal {T}}}$ of power sets:

${\displaystyle {\mathcal {T}}:P(X)\to P(X)}$

Consider now the special case of maps ${\displaystyle {\mathcal {T}}}$ which preserve intersections, unions and complements (so that it is a map of Borel sets) and also sends ${\displaystyle X}$ to ${\displaystyle X}$ (because we want it to be conservative). Every such conservative, Borel-preserving map can be specified by some surjective map ${\displaystyle T:X\to X}$ by writing ${\displaystyle {\mathcal {T}}(A)=T^{-1}(A)}$. Of course, one could also define ${\displaystyle {\mathcal {T}}(A)=T(A)}$, but this is not enough to specify all such possible maps ${\displaystyle {\mathcal {T}}}$. That is, conservative, Borel-preserving maps ${\displaystyle {\mathcal {T}}}$ cannot, in general, be written in the form ${\displaystyle {\mathcal {T}}(A)=T(A)}$.

${\displaystyle \mu (T^{-1}(A))}$ has the form of a pushforward, whereas ${\displaystyle \mu (T(A))}$ is generically called a pullback. Almost all properties and behaviors of dynamical systems are defined in terms of the pushforward. For example, the transfer operator is defined in terms of the pushforward of the transformation map ${\displaystyle T}$; the measure ${\displaystyle \mu }$ can now be understood as an invariant measure: it is the Frobenius–Perron eigenvector of the transfer operator (recall that the Frobenius–Perron eigenvector is the eigenvector corresponding to the largest eigenvalue of a matrix; in this case, it is the eigenvector with eigenvalue one, i.e. the invariant measure).
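A finite-state analogue makes the eigenvector picture concrete. In the sketch below (the matrix is invented for illustration), the transfer operator becomes a column-stochastic matrix, and power iteration finds its eigenvalue-one eigenvector, which plays the role of the invariant measure.

```python
# Finite-state sketch of a transfer operator: P[i][j] is the probability
# of moving from state j to state i, so each column of P sums to 1.
P = [[0.9, 0.2],
     [0.1, 0.8]]

mu = [0.5, 0.5]                       # any initial probability vector
for _ in range(1000):                 # transient modes (eigenvalue < 1) decay away
    mu = [sum(P[i][j] * mu[j] for j in range(2)) for i in range(2)]

# The limit is the Frobenius-Perron eigenvector, satisfying P mu = mu;
# for this particular P, the fixed point is mu = (2/3, 1/3).
```

The repeated application of P mirrors the physical picture below: everything with eigenvalue less than one decays away, and only the invariant measure survives.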

There are two classification problems of interest. One, discussed below, fixes ${\displaystyle (X,{\mathcal {B}},\mu )}$ and asks about the isomorphism classes of a transformation map ${\displaystyle T}$. The other, discussed in transfer operator, fixes ${\displaystyle (X,{\mathcal {B}})}$ and ${\displaystyle T}$, and asks about maps ${\displaystyle \mu }$ that are measure-like. Measure-like, in that they preserve the Borel properties, but are no longer invariant; they are in general dissipative and so give insights into dissipative systems and the route to equilibrium.

In terms of physics, the measure-preserving dynamical system ${\displaystyle (X,{\mathcal {B}},\mu ,T)}$ often describes a physical system that is in equilibrium, for example, thermodynamic equilibrium. One might ask: how did it get that way? Often, the answer is by stirring, mixing, turbulence, thermalization or other such processes. If a transformation map ${\displaystyle T}$ describes this stirring, mixing, etc. then the system ${\displaystyle (X,{\mathcal {B}},\mu ,T)}$ is all that is left, after all of the transient modes have decayed away. The transient modes are precisely those eigenvectors of the transfer operator that have eigenvalue less than one; the invariant measure ${\displaystyle \mu }$ is the one mode that does not decay away. The rate of decay of the transient modes is given by (the logarithm of) their eigenvalues; the eigenvalue one corresponds to an infinite half-life.

## Informal example

The microcanonical ensemble from physics provides an informal example. Consider, for example, a fluid, gas or plasma in a box of width, length and height ${\displaystyle w\times l\times h,}$ consisting of ${\displaystyle N}$ atoms. A single atom in that box might be anywhere, having arbitrary velocity; it would be represented by a single point in ${\displaystyle w\times l\times h\times \mathbb {R} ^{3}.}$ A given collection of ${\displaystyle N}$ atoms would then be a single point somewhere in the space ${\displaystyle (w\times l\times h)^{N}\times \mathbb {R} ^{3N}.}$ The "ensemble" is the collection of all such points, that is, the collection of all such possible boxes (of which there are an uncountably-infinite number). This ensemble of all-possible-boxes is the space ${\displaystyle X}$ above.

In the case of an ideal gas, the measure ${\displaystyle \mu }$ is given by the Maxwell–Boltzmann distribution. It is a product measure, in that if ${\displaystyle p_{i}(x,y,z,v_{x},v_{y},v_{z})\,d^{3}x\,d^{3}p}$ is the probability of atom ${\displaystyle i}$ having position and velocity ${\displaystyle x,y,z,v_{x},v_{y},v_{z}}$, then, for ${\displaystyle N}$ atoms, the probability is the product of ${\displaystyle N}$ of these. This measure is understood to apply to the ensemble. So, for example, one of the possible boxes in the ensemble has all of the atoms on one side of the box. One can compute the likelihood of this, in the Maxwell–Boltzmann measure. It will be enormously tiny, of order ${\displaystyle {\mathcal {O}}\left(2^{-3N}\right).}$ Of all possible boxes in the ensemble, this is a ridiculously small fraction.

The only reason that this is an "informal example" is because writing down the transition function ${\displaystyle T}$ is difficult, and, even if written down, it is hard to perform practical computations with it. Difficulties are compounded if the interaction is not an ideal-gas billiard-ball type interaction, but is instead a van der Waals interaction, or some other interaction suitable for a liquid or a plasma; in such cases, the invariant measure is no longer the Maxwell–Boltzmann distribution. The art of physics is finding reasonable approximations.

This system does exhibit one key idea from the classification of measure-preserving dynamical systems: two ensembles, having different temperatures, are inequivalent. The entropy for a given canonical ensemble depends on its temperature; as physical systems, it is "obvious" that when the temperatures differ, so do the systems. This holds in general: systems with different entropy are not isomorphic.

## Examples

Unlike the informal example above, the examples below are sufficiently well-defined and tractable that explicit, formal computations can be performed.

## Generalization to groups and monoids

The definition of a measure-preserving dynamical system can be generalized to the case in which T is not a single transformation that is iterated to give the dynamics of the system, but instead is a monoid (or even a group, in which case we have the action of a group upon the given probability space) of transformations Ts : X → X parametrized by s ∈ Z (or R, or N ∪ {0}, or [0, +∞)), where each transformation Ts satisfies the same requirements as T above.[1] In particular, the transformations obey the rules:

• ${\displaystyle T_{0}=\mathrm {id} _{X}:X\rightarrow X}$, the identity function on X;
• ${\displaystyle T_{s}\circ T_{t}=T_{t+s))$, whenever all the terms are well-defined;
• ${\displaystyle T_{s}^{-1}=T_{-s))$, whenever all the terms are well-defined.

The earlier, simpler case fits into this framework by defining Ts = T^s for s ∈ N.
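The monoid generated by iterating a single map can be sketched directly. Below, T is the doubling map (chosen only for illustration), T_s is its s-th iterate, and the composition rule T_s ∘ T_t = T_{s+t} is checked at a sample point:

```python
# Sketch: the monoid generated by iterating a single map T, here the
# doubling map (chosen only for illustration), with T_s = T^s for s in N.
def T(x):
    return (2 * x) % 1.0

def T_s(s, x):
    for _ in range(s):
        x = T(x)
    return x

x0 = 0.3
# T_0 is the identity, and T_s ∘ T_t = T_{s+t}:
assert T_s(0, x0) == x0
assert abs(T_s(2, T_s(3, x0)) - T_s(5, x0)) < 1e-12
```

Since T is not invertible, the T_s form a monoid rather than a group: the rule for T_s^{-1} does not apply here.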

## Homomorphisms

The concepts of a homomorphism and of an isomorphism may be defined.

Consider two dynamical systems ${\displaystyle (X,{\mathcal {A}},\mu ,T)}$ and ${\displaystyle (Y,{\mathcal {B}},\nu ,S)}$. Then a mapping

${\displaystyle \varphi :X\to Y}$

is a homomorphism of dynamical systems if it satisfies the following three properties:

1. The map ${\displaystyle \varphi \ }$ is measurable.
2. For each ${\displaystyle B\in {\mathcal {B))}$, one has ${\displaystyle \mu (\varphi ^{-1}B)=\nu (B)}$.
3. For ${\displaystyle \mu }$-almost all ${\displaystyle x\in X}$, one has ${\displaystyle \varphi (Tx)=S(\varphi x)}$.

The system ${\displaystyle (Y,{\mathcal {B}},\nu ,S)}$ is then called a factor of ${\displaystyle (X,{\mathcal {A}},\mu ,T)}$.

The map ${\displaystyle \varphi \;}$ is an isomorphism of dynamical systems if, in addition, there exists another mapping

${\displaystyle \psi :Y\to X}$

that is also a homomorphism, which satisfies

1. for ${\displaystyle \mu }$-almost all ${\displaystyle x\in X}$, one has ${\displaystyle x=\psi (\varphi x)}$;
2. for ${\displaystyle \nu }$-almost all ${\displaystyle y\in Y}$, one has ${\displaystyle y=\varphi (\psi y)}$.

Hence, one may form a category of dynamical systems and their homomorphisms.
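A classical concrete instance of the equivariance condition (property 3) can be sketched numerically: the map φ(x) = sin²(πx/2) intertwines the tent map T with the logistic map S(y) = 4y(1 - y), i.e. φ(Tx) = S(φx). This is a standard example; the check below only verifies the identity at a few sample points.

```python
import math

# Standard example, stated here as a sketch: phi conjugates the tent map
# T to the logistic map S(y) = 4y(1 - y).
def tent(x):
    return 2 * x if x < 0.5 else 2 - 2 * x

def logistic(y):
    return 4 * y * (1 - y)

def phi(x):
    return math.sin(math.pi * x / 2) ** 2

# Equivariance (property 3): phi(T x) = S(phi(x)) at sample points.
ok = all(abs(phi(tent(x)) - logistic(phi(x))) < 1e-12
         for x in [0.1, 0.3, 0.45, 0.7, 0.9])
```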

## Generic points

A point x ∈ X is called a generic point if the orbit of the point is distributed uniformly according to the measure.

## Symbolic names and generators

Consider a dynamical system ${\displaystyle (X,{\mathcal {B}},T,\mu )}$, and let Q = {Q1, ..., Qk} be a partition of X into k measurable pairwise-disjoint sets. Given a point x ∈ X, clearly x belongs to exactly one of the Qi. Similarly, the iterated point Tnx can belong to only one of the parts as well. The symbolic name of x, with respect to the partition Q, is the sequence of integers {an} such that

${\displaystyle T^{n}x\in Q_{a_{n}}.}$

The set of symbolic names with respect to a partition is called the symbolic dynamics of the dynamical system. A partition Q is called a generator or generating partition if μ-almost every point x has a unique symbolic name.
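For the doubling map with the partition {[0, 1/2), [1/2, 1)}, the symbolic name of a point is its binary expansion, which is exactly why this partition is generating. A short sketch:

```python
# Sketch: symbolic names for the doubling map T(x) = 2x mod 1 with the
# partition Q_0 = [0, 1/2), Q_1 = [1/2, 1).  The name recovers the binary
# expansion of x, so almost every point has a unique symbolic name.
def symbolic_name(x, n):
    name = []
    for _ in range(n):
        name.append(0 if x < 0.5 else 1)
        x = (2 * x) % 1.0
    return name

# 0.625 = 0.101 in binary:
assert symbolic_name(0.625, 4) == [1, 0, 1, 0]
```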

## Operations on partitions

Given a partition Q = {Q1, ..., Qk} and a dynamical system ${\displaystyle (X,{\mathcal {B)),T,\mu )}$, define the T-pullback of Q as

${\displaystyle T^{-1}Q=\{T^{-1}Q_{1},\ldots ,T^{-1}Q_{k}\}.}$

Further, given two partitions Q = {Q1, ..., Qk} and R = {R1, ..., Rm}, define their refinement as

${\displaystyle Q\vee R=\{Q_{i}\cap R_{j}\mid i=1,\ldots ,k,\ j=1,\ldots ,m,\ \mu (Q_{i}\cap R_{j})>0\}.}$

With these two constructs, the refinement of an iterated pullback is defined as

{\displaystyle {\begin{aligned}\bigvee _{n=0}^{N}T^{-n}Q&=\{Q_{i_{0}}\cap T^{-1}Q_{i_{1}}\cap \cdots \cap T^{-N}Q_{i_{N}}\\&\qquad {\text{ where }}i_{\ell }=1,\ldots ,k,\ \ell =0,\ldots ,N,\\&\qquad \qquad \mu \left(Q_{i_{0}}\cap T^{-1}Q_{i_{1}}\cap \cdots \cap T^{-N}Q_{i_{N}}\right)>0\}\end{aligned}}}

which plays a crucial role in the construction of the measure-theoretic entropy of a dynamical system.
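Both operations are easy to sketch numerically. Below, partitions of [0, 1) are represented as sets of points on a fine grid (a discretization invented purely for illustration); the refinement is formed by intersecting parts pairwise and discarding empty cells, and the pullback is computed for the doubling map:

```python
# Sketch: partitions of [0, 1) represented as sets of grid points
# (the grid discretization is invented for illustration).
N = 1200                                    # divisible by 2 and 3, so parts are exact
grid = [i / N for i in range(N)]

Q = [{x for x in grid if x < 0.5},          # Q = {[0, 1/2), [1/2, 1)}
     {x for x in grid if x >= 0.5}]
R = [{x for x in grid if x < 1/3},          # R = thirds of [0, 1)
     {x for x in grid if 1/3 <= x < 2/3},
     {x for x in grid if x >= 2/3}]

# Refinement Q ∨ R: pairwise intersections, keeping only nonempty cells.
join = [q & r for q in Q for r in R if q & r]
# Four cells survive: [0,1/3), [1/3,1/2), [1/2,2/3), [2/3,1).

# T-pullback of a part under the doubling map T(x) = 2x mod 1:
def T(x):
    return (2 * x) % 1.0

pullback_Q0 = {x for x in grid if T(x) < 0.5}   # = [0,1/4) ∪ [1/2,3/4)
```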

## Measure-theoretic entropy

The entropy of a partition ${\displaystyle {\mathcal {Q}}}$ is defined as[2][3]

${\displaystyle H({\mathcal {Q}})=-\sum _{Q\in {\mathcal {Q}}}\mu (Q)\log \mu (Q).}$

The measure-theoretic entropy of a dynamical system ${\displaystyle (X,{\mathcal {B}},T,\mu )}$ with respect to a partition Q = {Q1, ..., Qk} is then defined as

${\displaystyle h_{\mu }(T,{\mathcal {Q}})=\lim _{N\rightarrow \infty }{\frac {1}{N}}H\left(\bigvee _{n=0}^{N}T^{-n}{\mathcal {Q}}\right).}$

Finally, the Kolmogorov–Sinai entropy (also called the metric or measure-theoretic entropy) of a dynamical system ${\displaystyle (X,{\mathcal {B}},T,\mu )}$ is defined as

${\displaystyle h_{\mu }(T)=\sup _{\mathcal {Q}}h_{\mu }(T,{\mathcal {Q}})}$

where the supremum is taken over all finite measurable partitions. A theorem of Yakov Sinai in 1959 shows that the supremum is actually obtained on partitions that are generators. Thus, for example, the entropy of the Bernoulli process is log 2, since almost every real number has a unique binary expansion. That is, one may partition the unit interval into the intervals [0, 1/2) and [1/2, 1]. Every real number x is either less than 1/2 or not; and likewise so is the fractional part of 2nx.
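The Bernoulli-process computation can be sketched directly: for the doubling map with Q = {[0, 1/2), [1/2, 1)}, the refinement of iterated pullbacks up to N is the dyadic partition into 2^(N+1) intervals of equal measure, so H = (N + 1) ln 2 and H/N tends to ln 2.

```python
import math

# H of the dyadic refinement: 2^(N+1) parts, each of measure 2^-(N+1).
def refined_entropy(N):
    parts = 2 ** (N + 1)
    mu = 1.0 / parts
    return -sum(mu * math.log(mu) for _ in range(parts))

# (1/N) H = ((N + 1)/N) ln 2, which tends to ln 2 as N grows:
assert abs(refined_entropy(9) / 9 - (10 / 9) * math.log(2)) < 1e-9
assert abs(refined_entropy(19) / 19 - math.log(2)) < 0.04
```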

If the space X is compact and endowed with a topology, or is a metric space, then the topological entropy may also be defined.

If ${\displaystyle T}$ is ergodic, piecewise expanding, and Markov on ${\displaystyle X\subset \mathbb {R} }$, and ${\displaystyle \mu }$ is absolutely continuous with respect to the Lebesgue measure, then we have the Rokhlin formula[4] (section 4.3 and section 12.3 of [5]):

${\displaystyle h_{\mu }(T)=\int \ln |dT/dx|\,\mu (dx).}$

This allows the calculation of the entropy of many interval maps, such as the logistic map.

Ergodic means that ${\displaystyle T^{-1}(A)=A}$ implies that ${\displaystyle A}$ has either full measure or zero measure. Piecewise expanding means that there is a partition of ${\displaystyle X}$ into finitely many open intervals such that, for some ${\displaystyle \epsilon >0}$, ${\displaystyle |T'|\geq 1+\epsilon }$ on each interval. Markov means that, for each pair ${\displaystyle I_{i},I_{j}}$ of those open intervals, either ${\displaystyle T(I_{i})\cap I_{j}=\emptyset }$ or ${\displaystyle T(I_{i})\cap I_{j}=I_{j}}$.
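As a numerical sanity check of the Rokhlin formula for the logistic map T(x) = 4x(1 - x): its invariant density is 1/(π√(x(1 - x))), and the integral of ln|T'| against this measure should equal ln 2, the known entropy. (This is a sketch; strictly speaking the derivative of this map vanishes at x = 1/2, so it is not literally piecewise expanding, but the formula is classically applied here.)

```python
import math

# Parametrize the invariant measure of T(x) = 4x(1-x) by x = sin^2(theta)
# with theta uniform on (0, pi/2); this pushes forward to the density
# 1/(pi sqrt(x(1-x))).  Then h = ∫ ln|T'(x)| dmu = ∫ ln|4 - 8x| dmu.
n = 100_000
est = 0.0
for i in range(n):
    theta = (i + 0.5) * (math.pi / 2) / n   # midpoint quadrature nodes
    x = math.sin(theta) ** 2
    est += math.log(abs(4 - 8 * x))
est /= n
# Analytically the integral equals ln 2, the entropy of the logistic map.
```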

## Classification and anti-classification theorems

One of the primary activities in the study of measure-preserving systems is their classification according to their properties. That is, let ${\displaystyle (X,{\mathcal {B}},\mu )}$ be a measure space, and let ${\displaystyle U}$ be the set of all measure-preserving systems ${\displaystyle (X,{\mathcal {B}},\mu ,T)}$. An isomorphism ${\displaystyle S\sim T}$ of two transformations ${\displaystyle S,T}$ defines an equivalence relation ${\displaystyle {\mathcal {R}}\subset U\times U.}$ The goal is then to describe the relation ${\displaystyle {\mathcal {R}}}$. A number of classification theorems have been obtained; but, quite interestingly, a number of anti-classification theorems have been found as well. The anti-classification theorems state that there are more than countably many isomorphism classes, and that a countable amount of information is not sufficient to classify isomorphisms.[6][7]

The first anti-classification theorem, due to Hjorth, states that if ${\displaystyle U}$ is endowed with the weak topology, then the set ${\displaystyle {\mathcal {R}}}$ is not a Borel set.[8] There are a variety of other anti-classification results. For example, replacing isomorphism with Kakutani equivalence, it can be shown that there are uncountably many non-Kakutani equivalent ergodic measure-preserving transformations of each entropy type.[9]

These anti-classification results stand in contrast to the classification theorems, which include:

Krieger finite generator theorem[14] (Krieger 1970): Let ${\textstyle T}$ be an invertible, measure-preserving, ergodic transformation of a Lebesgue space of measure 1.

If ${\displaystyle h_{T}<\ln k}$ for some integer ${\displaystyle k}$, then the system has a generator of size ${\displaystyle k}$.

If the entropy is exactly equal to ${\displaystyle \ln k}$, then such a generator exists iff the system is isomorphic to the Bernoulli shift on ${\displaystyle k}$ symbols with equal measures.

## References

1. Walters, Peter (2000). An Introduction to Ergodic Theory. Springer. ISBN 0-387-95152-0.
2. Sinai, Ya. G. (1959). "On the Notion of Entropy of a Dynamical System". Doklady Akademii Nauk SSSR. 124: 768–771.
3. Sinai, Ya. G. (2007). "Metric Entropy of Dynamical System" (PDF).
4. The Shannon–McMillan–Breiman Theorem.
5. Pollicott, Mark; Yuri, Michiko (1998). Dynamical Systems and Ergodic Theory. London Mathematical Society Student Texts. Cambridge: Cambridge University Press. ISBN 978-0-521-57294-1.
6. Foreman, Matthew; Weiss, Benjamin (2019). "From Odometers to Circular Systems: A Global Structure Theorem". Journal of Modern Dynamics. 15: 345–423. arXiv:1703.07093. doi:10.3934/jmd.2019024.
7. Foreman, Matthew; Weiss, Benjamin (2022). "Measure preserving Diffeomorphisms of the Torus are unclassifiable". Journal of the European Mathematical Society. 24 (8): 2605–2690. arXiv:1705.04414. doi:10.4171/JEMS/1151.
8. Hjorth, G. (2001). "On invariants for measure preserving transformations". Fundamenta Mathematicae. 169 (1): 51–84. doi:10.4064/fm169-1-2.
9. Ornstein, D.; Rudolph, D.; Weiss, B. (1982). Equivalence of Measure Preserving Transformations. Memoirs of the American Mathematical Society. Vol. 37. ISBN 0-8218-2262-4.
10. Halmos, P.; von Neumann, J. (1942). "Operator methods in classical mechanics. II". Annals of Mathematics. 43 (2): 332–350. doi:10.2307/1968872. JSTOR 1968872.
11. Sinai, Ya. (1962). "A weak isomorphism of transformations with invariant measure". Doklady Akademii Nauk SSSR. 147: 797–800.
12. Ornstein, D. (1970). "Bernoulli shifts with the same entropy are isomorphic". Advances in Mathematics. 4 (3): 337–352. doi:10.1016/0001-8708(70)90029-0.
13. Katok, A.; Hasselblatt, B. (1995). Introduction to the Modern Theory of Dynamical Systems. Encyclopedia of Mathematics and its Applications. Vol. 54. Cambridge University Press.
14. Downarowicz, Tomasz (2011). Entropy in Dynamical Systems. New Mathematical Monographs. Cambridge: Cambridge University Press. p. 106. ISBN 978-0-521-88885-1.