In our case, the feature space will be the joint distribution of the pairs of positions along two players' trajectories over a time window of a few minutes. It therefore corresponds to a 4-dimensional distribution. The Mutual Information of this distribution will be the linchpin of our similarity measure for trajectories.

\subsubsection{Entropy}

Shannon introduced entropy as a measure of the amount of information carried by the distribution of a random variable. Let $X: P \rightarrow E$ be a random variable taking values in a discrete set $E$. The entropy of the distribution of $X$, denoted $S(X)$, is defined as:
\[ S(X) = -\sum_{x \in E} P_{X}(x) \log\big(P_{X}(x)\big) \]

In particular, the entropy of the joint distribution of two random variables $X : P_{1} \rightarrow E_{1}$ and $Y : P_{2} \rightarrow E_{2}$ is defined as:
\[ S(X,Y) = -\sum_{(x,y) \in E_{1} \times E_{2}} P_{(X,Y)}(x,y) \log\big(P_{(X,Y)}(x,y)\big) \]
(A worked example and an estimation sketch are given after the list below.)

The entropy has three interpretations:
\begin{itemize}
\item the amount of information of a random variable (or of an event)
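As a quick sanity check of the definition, consider the simplest case of a fair coin flip, $E = \{\mathrm{heads}, \mathrm{tails}\}$ with $P_{X}(\mathrm{heads}) = P_{X}(\mathrm{tails}) = \frac{1}{2}$:
\[ S(X) = -\tfrac{1}{2}\log\tfrac{1}{2} - \tfrac{1}{2}\log\tfrac{1}{2} = \log 2 \]
More generally, a uniform distribution over $n$ outcomes attains the maximal entropy $\log n$, while a deterministic variable has entropy $0$: the less predictable the variable, the higher its entropy.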
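To make the connection with trajectories concrete, the following minimal sketch shows how the joint entropy of the 4-dimensional distribution of paired positions could be estimated by histogram binning. It is only an illustration of the definition above: the function names, the binning scheme, and the array layout are our assumptions, not part of the method described here.

\begin{verbatim}
import numpy as np

def entropy_from_counts(counts):
    """Shannon entropy S = -sum p * log(p) of an empirical distribution."""
    p = counts / counts.sum()
    p = p[p > 0]  # convention: 0 * log(0) = 0, so drop empty bins
    return -np.sum(p * np.log(p))

def joint_entropy(traj_a, traj_b, bins=10):
    """Estimate S(X, Y) for the 4-D distribution of paired positions.

    traj_a, traj_b: arrays of shape (T, 2) holding the (x, y)
    positions of the two players at T synchronized time steps.
    """
    samples = np.hstack([traj_a, traj_b])          # shape (T, 4)
    counts, _ = np.histogramdd(samples, bins=bins)
    return entropy_from_counts(counts)

# Illustrative usage on two synthetic random-walk trajectories
rng = np.random.default_rng(0)
traj_a = np.cumsum(rng.normal(size=(600, 2)), axis=0)
traj_b = np.cumsum(rng.normal(size=(600, 2)), axis=0)
print(joint_entropy(traj_a, traj_b))
\end{verbatim}

The histogram estimator is the simplest choice; its output depends on the number of bins, and finer estimators could be substituted without changing the definition being illustrated.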