
\subsection{Mutual Information: definition and properties}

Mutual Information is widely used, for instance, for the registration of medical images, as described in \cite{Pluim_2003}. The main idea is to introduce a feature space (or a joint probability distribution) for the two trajectories we want to compare, and to evaluate the amount of ``information'' shared by the two trajectories based on this feature space. This quantity is computed with Mutual Information.

In our case, the feature space is the distribution of the pairs of positions of two players' trajectories over a time window of a few minutes; it is therefore a four-dimensional distribution. The Mutual Information of this distribution will be the lynchpin of our similarity measure for trajectories.

\subsubsection{Entropy}

Shannon introduced entropy as a measure of the quantity of information carried by a random variable.

Let $X: P \rightarrow E$ be a random variable taking values in a discrete probability space $E$. The entropy of $X$ is defined as:
$$ S(X) = -\sum_{x \in E} P_{X}(x) \log\big(P_{X}(x)\big) $$
For instance, a uniform distribution over $n$ outcomes attains the maximal entropy $\log n$, while a deterministic variable has entropy $0$.

Entropy has three interpretations:
\begin{itemize}
\item the amount of information of a random variable (or of an event);
\item the uncertainty about the outcome of a random variable (or of an event);
\item the dispersion of the probability law of a random variable (or of the probabilities with which the events take place).
\end{itemize}

For more information about entropy, the reader can refer to \cite{Pluim_2003} or \href{http://www.yann-ollivier.org/entropie/entropie1}{La théorie de l'information : l'origine de l'entropie} (``Information theory: the origin of entropy'').

\subsection{MI-based metric for trajectories}

\subsection{Empirical MI-based metric for trajectories}
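To make the idea concrete, the following is a minimal sketch of how an empirical, histogram-based Mutual Information estimate between two sampled player trajectories could be computed. It is an illustration only: the function name \texttt{empirical\_mutual\_information}, the grid-based discretisation, and the number of bins are our own assumptions, not the metric defined in this work.

```python
import numpy as np

def empirical_mutual_information(x, y, bins=4):
    """Plug-in MI estimate (in nats) between two 2-D position sequences.

    x, y: arrays of shape (T, 2), the positions of two players over T frames.
    Illustrative sketch: each trajectory is discretised on a bins-by-bins
    grid, then MI is computed from the empirical joint histogram.
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)

    def discretise(traj):
        # Map each 2-D position to a single cell index on a regular grid.
        cell = np.zeros(len(traj), dtype=int)
        for d in range(traj.shape[1]):
            edges = np.linspace(traj[:, d].min(), traj[:, d].max() + 1e-9,
                                bins + 1)
            idx = np.clip(np.digitize(traj[:, d], edges) - 1, 0, bins - 1)
            cell = cell * bins + idx
        return cell

    xd, yd = discretise(x), discretise(y)

    # Empirical joint distribution over pairs of cells, and its marginals.
    joint = np.zeros((bins * bins, bins * bins))
    for a, b in zip(xd, yd):
        joint[a, b] += 1.0
    joint /= joint.sum()
    px, py = joint.sum(axis=1), joint.sum(axis=0)

    # MI = sum_{a,b} p(a,b) * log( p(a,b) / (p(a) p(b)) ), nonzero cells only.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / np.outer(px, py)[nz])))
```

Because the estimate is computed from a finite joint histogram, it is always nonnegative and is largest when one trajectory determines the other (e.g.\ when the two trajectories coincide, the estimate equals the entropy of the discretised trajectory).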