Minimize the limit of the Kullback–Leibler (KL) divergence for a given conditional probability distribution $p(y|x)$?

Cross Validated, asked on January 7, 2021

Let $p(x)$ and $p(y)$ be the probability density functions of the random variables $X$ and $Y$, and suppose the conditional probability $p(y|x)$ is given, e.g.

$p(y|x) = Q(x + 2y),$

where $Q(x) = \frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-t^2/2}\, dt$ is the Gaussian Q-function.
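For reference, here is a minimal numerical sketch of this $Q$-function, using the standard identity $Q(x) = \tfrac{1}{2}\operatorname{erfc}(x/\sqrt{2})$ (the function name `gaussian_q` is just illustrative, not from the question):

```python
import math

def gaussian_q(x: float) -> float:
    """Gaussian Q-function: upper-tail probability of a standard normal.

    Uses the identity Q(x) = 0.5 * erfc(x / sqrt(2)).
    """
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Sanity checks: Q(0) = 0.5, and Q(-x) = 1 - Q(x).
assert abs(gaussian_q(0.0) - 0.5) < 1e-12
assert abs(gaussian_q(-2.0) - (1.0 - gaussian_q(2.0))) < 1e-12
```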

Problem statement: Let $y \in [-1, 1]$, and suppose the distribution $p(x)$ is unknown (continuous or discrete). I would like to solve the following minimization problem:

$Z = \min \left( \lim_{x \to 0} \frac{D(f(x)\,\|\,f(y))}{x^2} \right)$

where $D(\cdot\|\cdot)$ denotes the Kullback–Leibler divergence (relative entropy).
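One possibly relevant standard fact (not stated in the question, and applying it here assumes $f(y)$ plays the role of $f(x)$ at $x = 0$ and that the family is smooth in $x$): the KL divergence is locally quadratic in the parameter, with the Fisher information as its curvature, which makes the limiting ratio in $Z$ half the Fisher information at $0$:

```latex
% Second-order Taylor expansion of the KL divergence about \theta_0
% (standard result; using it here assumes f(y) = f(0) and smoothness in x).
D(f_\theta \,\|\, f_{\theta_0})
  = \tfrac{1}{2}\, I(\theta_0)\,(\theta - \theta_0)^2
    + o\!\left((\theta - \theta_0)^2\right),
\qquad
I(\theta_0) = \mathbb{E}_{f_{\theta_0}}\!\left[
  \left(\frac{\partial}{\partial \theta}\log f_\theta \Big|_{\theta_0}\right)^{\!2}
\right].
% Hence, taking \theta = x and \theta_0 = 0:
\lim_{x \to 0} \frac{D(f(x)\,\|\,f(0))}{x^2} = \frac{I(0)}{2}.
```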

Solution method:

  1. Find a distribution $p(x)$ that minimizes $D(f(x)\|f(y))$.
  2. I know that minimizing $D(f(x)\|f(y))$ does not minimize $Z$. How should I decompose $D(f(x)\|f(y))$ so that it minimizes (lower-bounds) $Z$ for any $p(x)$ (or, for a fixed distribution $p(x_0)$, so that $Z$ attains its minimum (lower bound))?

I have heard of a concept called 'Gibbs free energy' (I don't know whether it helps to find the distribution).
Any insight (or hint) toward finding the distribution $p(x)$ (or the lower bound of $Z$) would be helpful. Thank you.
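As a numerical illustration of the limiting ratio (not the asker's specific $f$; this sketch assumes a simple Gaussian location family $f(x) = \mathcal{N}(x, 1)$, whose Fisher information is $1$, so the ratio should approach $1/2$):

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def kl_over_x2(x: float) -> float:
    """Compute D(N(x,1) || N(0,1)) / x^2 by numerical integration."""
    p = lambda t: norm.pdf(t, loc=x)    # density f(x): N(x, 1)
    q = lambda t: norm.pdf(t, loc=0.0)  # density f(0): N(0, 1)
    integrand = lambda t: p(t) * np.log(p(t) / q(t))
    kl, _ = quad(integrand, -10.0 + x, 10.0 + x)  # truncate negligible tails
    return kl / x**2

for x in [0.5, 0.1, 0.01]:
    print(f"x = {x:5}: D/x^2 = {kl_over_x2(x):.6f}")
# For this family D(N(x,1)||N(0,1)) = x^2/2 exactly, so every ratio
# prints 0.5, matching I(0)/2 with Fisher information I = 1.
```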
