
Boltzmann entropy of Gaussian distribution

Asked by MsTais on April 9, 2021

Recently I have been looking into different definitions of entropy and came across Boltzmann (unconditional) entropy:

$$S = -\int_{\mathbb{R}} dx\, p(x) \ln p(x)$$

I have tried to calculate the entropy of a regular Gaussian distribution. My logic is that if this definition makes sense, then for nice distributions it should produce reasonable results. However, it turns out that even in this case the entropy can be negative; see the plot below. The equation I got is:

$$S(\sigma) = \frac{1}{2}\left(\ln(2\pi\sigma^2) + 1\right).$$
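A quick numerical check confirms this closed form and the sign change (a minimal sketch assuming NumPy and SciPy are available; the function names are mine, not from any library):

```python
# Numerically integrate -p(x) ln p(x) for a Gaussian and compare
# with the closed form (1/2)(ln(2*pi*sigma^2) + 1).
import numpy as np
from scipy.integrate import quad

def gaussian_entropy_numeric(sigma, mu=0.0):
    """Differential entropy -∫ p ln p dx for N(mu, sigma^2), by quadrature."""
    def integrand(x):
        p = np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
        return -p * np.log(p)
    val, _ = quad(integrand, mu - 20 * sigma, mu + 20 * sigma)
    return val

def gaussian_entropy_closed(sigma):
    """Closed form: (1/2)(ln(2*pi*sigma^2) + 1)."""
    return 0.5 * (np.log(2 * np.pi * sigma ** 2) + 1)

for sigma in (0.1, 1.0, 10.0):
    print(sigma, gaussian_entropy_numeric(sigma), gaussian_entropy_closed(sigma))
# The entropy crosses zero at sigma = 1/sqrt(2*pi*e) ≈ 0.242,
# and is negative for any narrower Gaussian.
```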

Forgive my ignorance of stat. mech. and thermodynamics, but I cannot make sense of it. If this entropy doesn't work even in this simple case, then how and under which conditions does this definition make sense? Literature and references are very welcome! Thanks!
[Plot of $S(\sigma)$, showing the entropy going negative for small $\sigma$]

One Answer

For a discrete distribution,
$$S = -\sum_i p_i \ln p_i$$
is always non-negative, since $0 \le p_i \le 1$. The formula you give for $S$ has problems, though. Since
$$1 = \int dx\, p(x),$$
whatever units $x$ has (meters, say, if $x$ is a length), $p(x)$ has the inverse units. As a consequence, $\ln p(x)$ makes no sense, as you cannot take the logarithm of a dimensionful quantity. A valid formula must be
$$S = -\int dx\, p(x) \ln\big(a\, p(x)\big)$$
for some $a$ with the same dimensions as $x$. One would like $S$ to be zero when $p(x) = \delta(x - a)$ (no uncertainty), but that is not possible, so there is no preferred choice of $a$. As a consequence, as @Yvan Velenik says, only differences in the entropy of continuous distributions make sense.
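To make this concrete: the scale $a$ only shifts the entropy by $\ln a$, so the *difference* between the entropies of two distributions is independent of $a$. A minimal sketch (assuming SciPy; the helper name is illustrative):

```python
# S = -∫ p(x) ln(a p(x)) dx shifts by -ln(a), so entropy
# differences between two Gaussians do not depend on a.
import numpy as np
from scipy.integrate import quad

def entropy_with_scale(sigma, a):
    """S = -∫ p(x) ln(a p(x)) dx for a zero-mean Gaussian of width sigma."""
    def integrand(x):
        p = np.exp(-x ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
        return -p * np.log(a * p)
    val, _ = quad(integrand, -20 * sigma, 20 * sigma)
    return val

for a in (0.5, 1.0, 2.0):
    s1 = entropy_with_scale(1.0, a)
    s2 = entropy_with_scale(3.0, a)
    # s1 and s2 each change with a, but their difference is always ln(3).
    print(f"a={a}: S(3)-S(1) = {s2 - s1:.4f}  (ln 3 = {np.log(3):.4f})")
```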

Answered by mike stone on April 9, 2021
