Cato edited Gibbs.tex
over 10 years ago
Commit id: f5c0b30e9d6a26178f0f63fb58fa7a01b662b002
When bits are written randomly and independently, the entropy of the source is given by
\begin{equation}
\label{eqn:indep_entropy}
H = -\sum_i p_i \log_2 p_i.
\end{equation}
However, there may be larger-scale patterns in the string, and then we have to make more effort: equation~(\ref{eqn:indep_entropy}) treats each bit as independent, so it is blind to correlations between bits. For example, \texttt{0101010101}\dots{} has the same entropy under that formula as \texttt{010011000111}\dots, but clearly there's something different going on! I see two cases, which have to be treated differently:
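To make the example concrete, here is a minimal sketch (the helper name \texttt{symbol\_entropy} is mine, not from the text) that evaluates the independent-bit entropy formula for both strings; since each contains equal numbers of 0s and 1s, the formula assigns both exactly 1 bit per symbol, even though the first is perfectly periodic.

```python
import math
from collections import Counter

def symbol_entropy(s):
    """Per-symbol iid entropy H = -sum_i p_i log2(p_i) of a string."""
    n = len(s)
    return -sum((c / n) * math.log2(c / n) for c in Counter(s).values())

# Both strings have p(0) = p(1) = 1/2, so the formula gives 1 bit
# per symbol, blind to the obvious periodicity of the first string.
periodic = "01" * 6          # 010101010101
aperiodic = "010011000111"
print(symbol_entropy(periodic))   # 1.0
print(symbol_entropy(aperiodic))  # 1.0
```

This is precisely the failure mode described above: the formula sees only symbol frequencies, never the order in which the symbols appear.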