Shannon introduced entropy as a measure of the quantity of information carried by a random variable. Let $X: P \rightarrow E$ be a random variable, where $E$ is a discrete set of outcomes. The entropy of $X$ is defined as:
$$ S(X) = -\sum_{x \in E} P_{X}(x)\log\big(P_{X}(x)\big) $$

\subsection{MI-based metric for trajectories}
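As a quick sanity check of the entropy definition above, consider the simplest nontrivial case: a fair coin toss, i.e. $E = \{\mathrm{heads}, \mathrm{tails}\}$ with $P_{X}(\mathrm{heads}) = P_{X}(\mathrm{tails}) = \tfrac{1}{2}$. Then
$$ S(X) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2, $$
which equals one bit when the logarithm is taken in base $2$. Note that the minus sign is essential: since $P_{X}(x) \in [0,1]$, each term $\log P_{X}(x)$ is nonpositive, and the minus sign makes the entropy nonnegative.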