Is there a meaning to $\mathrm{e}^{H(p_{i})}$ or $2^{H(p_{i})}$?

Asked by frencho on September 11, 2020

In my research I find an equation featuring the "exponential entropy" term $\mathrm{e}^{H(p_{i})}$, and I wonder if it has a specific meaning. I have found only rare references to that term (usually in terms of dispersion or the "spread of the distribution"), so I am looking for more insight. I work with natural logarithms, and in my case the entropy is Shannon's: $H(p_{i})=-\sum_{i} p_{i}\ln p_{i}$. My question is: what is $\mathrm{e}^{H(p_{i})}$?

Note: I assume that the same question would arise if I were to work in log base 2. So is there a meaning to $2^{H(p_{i})}$ when the entropy is instead defined by $H(p_{i})=-\sum_{i} p_{i}\log_{2} p_{i}$?

One Answer

"Spread" is a good name. First, notice that the value is indendent of the base of the log:

$$2^{-\sum_i p_i\log_2(p_i)}=\exp\left(\ln(2)\cdot\left(-\sum_i p_i\frac{\ln(p_i)}{\ln(2)}\right)\right)=\exp\left(-\sum_i p_i\ln(p_i)\right)$$
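
As a quick numerical sanity check, here is a minimal Python/NumPy sketch (the distribution is chosen arbitrarily):

    import numpy as np

    # An arbitrary normalized discrete distribution
    p = np.array([0.5, 0.25, 0.125, 0.125])

    H_bits = -np.sum(p * np.log2(p))  # Shannon entropy in bits
    H_nats = -np.sum(p * np.log(p))   # Shannon entropy in nats

    # Both exponentials give the same "spread", here ~3.364
    print(2.0 ** H_bits, np.exp(H_nats))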

I call $p=(p_i)$ a distribution, as in probability theory. Entropy is also called the quantity of information.

For discrete entropy, $\exp(H)$ is the essential number of possible values (possible values of $i$) of the distribution. It is the number of possible values of a (discrete) uniform distribution that would have the same entropy.

$p$ is like a histogram. If you want to draw a histogram for a distribution containing the same information but with bars of constant height, you will draw $\exp(H)$ bars. Of course, $\exp(H)$ is not always an integer, but that's the idea.
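
Here is a small Python/NumPy sketch of that "equal-height bars" reading (the helper name effective_support is my own, not standard terminology):

    import numpy as np

    def effective_support(p):
        """exp(H): number of equal-probability values with the same entropy."""
        p = np.asarray(p, dtype=float)
        p = p[p > 0]  # treat 0 * log(0) as 0
        return np.exp(-np.sum(p * np.log(p)))

    print(effective_support(np.full(8, 1 / 8)))          # uniform over 8 values -> 8.0
    print(effective_support([0.5, 0.25, 0.125, 0.125]))  # skewed -> ~3.36 bars
    print(effective_support([0.97, 0.01, 0.01, 0.01]))   # nearly deterministic -> ~1.18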

For differential entropy (relative to the Lebesgue measure, for example), it's more interesting: it is the essential volume. $\exp(H)$ is the volume of the support of a uniform distribution having the same entropy. Think of a distribution as a "fuzzy" object with shades of grey; $\exp(H)$ is somehow the object's essential volume. Mathematically, it generalizes the notion of the measure of a set to a distribution.
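
To make the "essential volume" reading concrete, here is the standard computation (my addition, not part of the original answer) for a uniform density on a set $A$ of volume $V$:

$$p(x)=\frac{\mathbf{1}_A(x)}{V},\qquad H(p)=-\int_A \frac{1}{V}\ln\frac{1}{V}\,dx=\ln V,\qquad \exp(H(p))=V.$$

For a Gaussian with standard deviation $\sigma$, the same recipe gives $\exp(H)=\sigma\sqrt{2\pi e}\approx 4.13\,\sigma$, an "effective width" of the bell curve.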

I find it especially useful for understanding Liouville's theorem. Liouville's theorem says that the Lebesgue measure on phase space is preserved by the motion: if the system is in a certain subset of the phase space at time $0$, it will be in a subset with the same volume at time $t$.

It is natural to generalize this to (the exponential of) entropy as a generalized volume: if the distribution at time $0$ has a certain (exponential of) entropy, it has the same (exponential of) entropy at time $t$. It's easy to prove. This can be a first step toward proving the second law.
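
For completeness, here is a sketch of that proof (my own filling-in, under the standard assumptions that the phase-space density $\rho$ satisfies the Liouville equation $\partial_t\rho + v\cdot\nabla\rho = 0$ with an incompressible flow, $\nabla\cdot v = 0$, and vanishes at the boundary). Writing $H=-\int \rho\ln\rho\,d\Gamma$,

$$\frac{dH}{dt}=-\int (\partial_t\rho)(1+\ln\rho)\,d\Gamma=\int (v\cdot\nabla\rho)(1+\ln\rho)\,d\Gamma=\int \nabla\cdot\left(v\,\rho\ln\rho\right)d\Gamma=0,$$

using $(1+\ln\rho)\nabla\rho=\nabla(\rho\ln\rho)$, $\nabla\cdot v=0$, and the divergence theorem. Hence $H$, and therefore $\exp(H)$, is constant along the motion.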

Answered by Benoit on September 11, 2020
