Relative entropy and mutual information

Consider some unknown distribution p(x), and suppose that we have modelled it using an approximating distribution q(x). If we use q(x) to construct a coding scheme for the purpose of transmitting values of x to a receiver, then the average additional amount of information required to specify the value of x as a result of using q(x) instead of the true distribution p(x) is given by

KL(p||q) = -\int p(x)\ln q(x)\,dx - \left(-\int p(x)\ln p(x)\,dx\right) = -\int p(x)\ln\frac{q(x)}{p(x)}\,dx

This quantity is known as the relative entropy, or Kullback-Leibler divergence (KL divergence), between p(x) and q(x). For a discrete variable it can also be written as

KL(p(x)||q(x)) = \sum_{x \in X} p(x) \cdot \log\frac{p(x)}{q(x)}

From this definition we can draw a few conclusions:

1: The value of the KL divergence is zero if p(x) and q(x) are exactly the same distribution.
2: The larger the difference between p(x) and q(x), the larger the relative entropy; the smaller the difference, the smaller the relative entropy (see the worked example after this list).
3: If p(x) and q(x) are distribution functions, the relative entropy can be used to measure the dissimilarity between them.
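
For example, let p(x) be a fair coin, p(0) = p(1) = 0.5, and let q(x) be a biased coin with q(1) = 0.75, q(0) = 0.25. Then

KL(p||q) = 0.5\ln\frac{0.5}{0.75} + 0.5\ln\frac{0.5}{0.25} \approx 0.144 \text{ nats}

whereas a closer approximation with q(1) = 0.6, q(0) = 0.4 gives KL(p||q) \approx 0.020 nats, and q = p gives exactly zero.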

One thing to point out is that relative entropy is not a symmetric quantity, that is to say KL(p||q) \neq KL(q||p) in general.
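
As a minimal sketch in Python with NumPy (the function name kl_divergence is just an illustrative choice, not from the text), the discrete definition above can be evaluated directly, and reversing the arguments gives a different value:

```python
import numpy as np

def kl_divergence(p, q):
    """Discrete KL divergence KL(p||q) = sum_x p(x) * ln(p(x) / q(x)), in nats."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0  # terms with p(x) = 0 contribute nothing, since 0 * ln 0 = 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p = np.array([0.5, 0.5])    # "true" distribution: a fair coin
q = np.array([0.75, 0.25])  # approximating distribution: a biased coin

print(kl_divergence(p, p))  # 0.0     -- identical distributions
print(kl_divergence(p, q))  # ~0.144  -- KL(p||q)
print(kl_divergence(q, p))  # ~0.131  -- KL(q||p), a different value
```

The last two calls return different numbers, which is exactly the asymmetry noted above, and also the reason the KL divergence is not a true distance metric.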

Now consider the joint distribution between two sets of variables x and y given by p(x, y). If the sets of variables are independent, then their joint distribution will factorize into the product of their marginals, p(x, y) = p(x)p(y). If the variables are not independent, we can gain some idea of whether they are 'close' to being independent by considering the KL divergence between the joint distribution and the product of the marginals. This quantity is called the mutual information and is given by

I[x, y] = \sum_{x \in X, y \in Y} p(x, y)\log\frac{p(x, y)}{p(x)p(y)}

or, equivalently, $I(X; Y) = H(X) - H(X|Y)$, the reduction in the uncertainty about X obtained once we learn the value of Y.
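
As a rough sketch of this relationship in Python/NumPy (assuming the joint distribution is given as a 2-D table of probabilities summing to one; the function names are illustrative), the mutual information can be computed as I(X;Y) = H(X) + H(Y) - H(X,Y), which equals H(X) - H(X|Y) because H(X|Y) = H(X,Y) - H(Y):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, skipping zero-probability entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(joint):
    """I(X;Y) = KL( p(x,y) || p(x)p(y) ) for a joint distribution given as a 2-D table."""
    joint = np.asarray(joint, dtype=float)
    px = joint.sum(axis=1)  # marginal p(x)
    py = joint.sum(axis=0)  # marginal p(y)
    return entropy(px) + entropy(py) - entropy(joint.ravel())

# A joint distribution in which x and y are dependent
joint = np.array([[0.30, 0.10],
                  [0.15, 0.45]])
print(mutual_information(joint))             # ~0.126 > 0

# Combining the same marginals independently gives (numerically) zero mutual information
px, py = joint.sum(axis=1), joint.sum(axis=0)
print(mutual_information(np.outer(px, py)))  # ~0.0
```

When the joint distribution factorizes into the product of its marginals, the mutual information is zero, matching the intuition that I[x, y] measures how far the variables are from being independent.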
