Friedland metric entropy

MathOverflow Asked by user502940 on November 16, 2021

I would like to ask whether it is possible to extend the definition of topological Friedland entropy for continuous $\mathbb{Z}^d$-actions to measure-preserving actions.

The topological Friedland entropy is constructed as follows. Let $T$ be a $\mathbb{Z}^d$-action on a compact topological space $X$, with generators $T_1,\dots,T_d$. Consider the sequence space
$$
\mathcal{X}=\mathcal{X}_T=\left\{(x_n)_{n\in \mathbb{N}} \in \prod_{n \in \mathbb{N}}X : T_i(x_n)=x_{n+1} \text{ for some } i=1,\dots,d\right\}.
$$

This is a compact space. There is a natural shift on $\mathcal{X}$, given by $\sigma((x_n)_{n\in \mathbb{N}})=(x_{n+1})_{n \in \mathbb{N}}$. One checks that $\sigma$ is continuous, and then defines the entropy of the action $T$ as $\mathrm{ent}_{top}(T)=h_{top}(\sigma)$.
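As a concrete illustration (not part of the original question), here is a small Python sketch for a hypothetical $\mathbb{Z}^2$-action generated by the commuting circle maps $T_1(x)=2x \bmod 1$ and $T_2(x)=3x \bmod 1$ on $[0,1)$: a finite window of a point of $\mathcal{X}_T$ satisfies the defining constraint $T_i(x_n)=x_{n+1}$, and so does its image under the shift $\sigma$.

```python
# Hypothetical example: the Z^2-action on [0,1) generated by the commuting
# maps T_1(x) = 2x mod 1 and T_2(x) = 3x mod 1.
T = {1: lambda x: (2 * x) % 1.0, 2: lambda x: (3 * x) % 1.0}

def trajectory(x, choices):
    """Build a finite window (x_0, ..., x_n) of a point of X_T:
    x_{k+1} = T_i(x_k) for the chosen generator i at step k."""
    seq = [x]
    for i in choices:
        seq.append(T[i](seq[-1]))
    return seq

def in_X_T(seq, tol=1e-9):
    """Check the defining condition: each x_{k+1} equals T_i(x_k) for some i."""
    return all(
        any(abs(T[i](a) - b) < tol for i in T) for a, b in zip(seq, seq[1:])
    )

seq = trajectory(0.1, [1, 2, 2, 1])  # a finite window of a point of X_T
assert in_X_T(seq)                   # the window satisfies the constraint
assert in_X_T(seq[1:])               # the shifted window sigma(seq) stays in X_T
```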

This definition is motivated by the fact that when $d=1$, $h_{top}(\sigma)=h_{top}(T)$. This follows because the orbit map $\phi: X \ni x \mapsto \mathrm{orb}(x)=(T^{i}(x))_{i \in \mathbb{N}}$ is a topological conjugacy between $T$ and $\sigma$.
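The intertwining relation $\phi(Tx)=\sigma(\phi(x))$ behind this conjugacy can be checked on finite orbit windows; the following sketch does so for a hypothetical $d=1$ example, the doubling map on $[0,1)$.

```python
# For d = 1: the orbit map phi(x) = (x, Tx, T^2 x, ...) intertwines T with
# the shift sigma, i.e. phi(T x) = sigma(phi(x)).
T = lambda x: (2 * x) % 1.0    # hypothetical example: the doubling map

def phi(x, n):
    """Finite window (x, Tx, ..., T^n x) of the orbit map phi."""
    seq = [x]
    for _ in range(n):
        seq.append(T(seq[-1]))
    return seq

x, n = 0.3, 6
lhs = phi(T(x), n)             # phi(Tx)
rhs = phi(x, n + 1)[1:]        # sigma(phi(x)): drop the first coordinate
assert all(abs(a - b) < 1e-9 for a, b in zip(lhs, rhs))
```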

This definition can also be given for non-compact metric spaces, using Bowen's topological entropy.

Is it possible to define an extension of the Kolmogorov metric entropy using this method? Considering measure-preserving actions on probability spaces, the problems I am facing are the following:

  1. How can I define a probability measure on $\mathcal{X}$? I have the product measure on $\prod_{n \in \mathbb{N}}X$, but I cannot see a way to restrict it to $\mathcal{X}$.
  2. How can I see that $\sigma$ is measure-preserving?
  3. How can I see that the two definitions agree in the case $d=1$, using a conjugacy argument?

Thank you for your suggestions!

One Answer

The space $\mathcal{X}$ can be identified with the product of the base space $X$ and the space $\mathcal{I}$ of (one-sided) sequences $(i_1,i_2,\dots)$ of symbols $i_k\in I$ (where $I=\{1,2,\dots,d\}$) by the map $$ (x;i_1,i_2,\dots) \mapsto (x,\, T_{i_1}x,\, T_{i_1}T_{i_2} x,\dots). $$ Therefore, a measure on $\mathcal{X}$ is uniquely determined by its projection onto $X$ (for which one can naturally take an invariant measure $m$) and a family of measures on $\mathcal{I}$ parameterized by the points of $X$. The choice here is enormous. The most "natural" one is to take the Bernoulli measure on $\mathcal{I}$ corresponding to the uniform distribution on $I$. The related notions of entropy (in a somewhat more general situation) are discussed, for instance, in the paper by Tim Austin, "Entropy of probability kernels from the backwards tail boundary".
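A numerical sketch of this suggestion, under the same hypothetical assumptions as before: a $\mathbb{Z}^2$-action generated by $T_1(x)=2x \bmod 1$ and $T_2(x)=3x \bmod 1$, both of which preserve Lebesgue measure $m$ on $[0,1)$. Sampling $x\sim m$ and i.i.d. uniform symbols $i_k$ realizes the measure $m\times\mathrm{Bernoulli}$ on $X\times\mathcal{I}$; the check below verifies a necessary condition for $\sigma$-invariance of the lifted measure, namely that every coordinate marginal of the trajectory is again $m$.

```python
import random

# Hypothetical Z^2-action: T_1(x) = 2x mod 1, T_2(x) = 3x mod 1 on [0,1),
# both preserving Lebesgue measure m.
T = {1: lambda x: (2 * x) % 1.0, 2: lambda x: (3 * x) % 1.0}

def sample_point(n, rng):
    """Sample a finite window of a point of X via the identification
    X ~ X x I^N: draw x ~ m (Lebesgue) and i_1, i_2, ... i.i.d. uniform
    on I = {1, 2} (Bernoulli measure), then unfold the trajectory."""
    seq = [rng.random()]                    # x ~ Lebesgue on [0, 1)
    for _ in range(n):
        i = rng.choice((1, 2))              # i_k ~ uniform on I
        seq.append(T[i](seq[-1]))
    return seq

rng = random.Random(0)
samples = [sample_point(3, rng) for _ in range(20000)]
# Each T_i preserves m and the i_k are independent of x, so every coordinate
# x_k is again Lebesgue-distributed; test this via the first moment.
for k in range(4):
    mean_k = sum(s[k] for s in samples) / len(samples)
    assert abs(mean_k - 0.5) < 0.02         # E[x_k] = 1/2 under Lebesgue
```

Checking full $\sigma$-invariance would require comparing joint distributions, but the coordinatewise check already illustrates why the invariance of $m$ under each generator is the key ingredient.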

Answered by R W on November 16, 2021
