Kim H. Parker edited subsection_mutual_information_The_concept__.tex (commit b58c77a12bfb21072606d049cf71f7489ac8816b).
\subsection{information theory}
Information theory was originally introduced in the context of transmitting information through noisy channels, and it introduced the idea of the entropy of a signal as a measure of its uncertainty.[refs] The concept is also useful in statistical physics, where it is related to the entropy of the system originally introduced in thermodynamics. The theory is very well developed, and the reader is referred to almost any text on information theory for a thorough discussion of the concepts involved. We will use only a small fraction of the results of the theory: a measure of the `distance' between two probability density functions which can be related to their entropy.
Given a signal $X(x)$, its entropy $H(X)$ is defined as
\[
H(X) = -\sum_{x \in X} \phi(x) \log \phi(x)
\]
where $\phi(x)$ is the probability density function of $x$. The entropy is a measure of the uncertainty of $x$, and its units depend on the base of the logarithm; we will use log base 2, so that entropy is measured in bits.
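As a simple illustration, a binary signal whose two values occur with equal probability, $\phi(0) = \phi(1) = 1/2$, has entropy
\[
H(X) = -\frac{1}{2}\log_2\frac{1}{2} - \frac{1}{2}\log_2\frac{1}{2} = 1~\mathrm{bit},
\]
the maximum possible for a binary signal, while a signal that takes a single value with probability one has $H(X) = 0$, reflecting the complete absence of uncertainty.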
Given two probability density functions $X(x)$ and $Y(x)$ which are defined over the same variable $x$, the distance between them can be measured in several different ways. One of the first measures of the difference is the Kullback-Leibler divergence
\[
D(X\|Y) = \sum_{x} \phi(X_x) \log \frac{\phi(X_x)}{\phi(Y_x)}
\]
where $\phi(X_x)$ and $\phi(Y_x)$ are the values of the two probability density functions at $x$.
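As a worked example, for two binary distributions with $\phi(X_0) = \phi(X_1) = 1/2$ and $\phi(Y_0) = 1/4$, $\phi(Y_1) = 3/4$ (values chosen purely for illustration),
\[
D(X\|Y) = \frac{1}{2}\log_2\frac{1/2}{1/4} + \frac{1}{2}\log_2\frac{1/2}{3/4} \approx 0.208~\mathrm{bits},
\]
whereas $D(Y\|X) \approx 0.189~\mathrm{bits}$; the Kullback-Leibler divergence is not symmetric, which is why we refer to it only loosely as a `distance' between the two distributions.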