
Mutual Information is widely used, for instance, for the registration of medical images, as described in \cite{Pluim_2003}. The main idea is to introduce a feature space (or a joint probability distribution) for the two trajectories we want to compare and to evaluate the quantity of ``information'' shared by the two trajectories based on this feature space. This quantity is measured by the Mutual Information.

In our case, the feature space is the distribution of the pairs of positions of two players' trajectories over a time window of a few minutes. It is therefore a 4-dimensional distribution. The Mutual Information of this distribution will be the lynchpin of our similarity measure for trajectories.

\subsubsection{Entropy}

Shannon introduced the entropy as a measure of the quantity of information carried by a random variable.

Let $X: P \rightarrow E$

\subsection{MI-based metric for trajectories}

\subsection{Empirical MI-based metric for trajectories}
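As a point of reference for the empirical estimate that follows, recall the standard definition of mutual information: for two discrete random variables $A$ and $B$ with joint distribution $p_{AB}$ and marginals $p_A$, $p_B$,
\begin{equation}
I(A;B) \;=\; \sum_{a,\,b} p_{AB}(a,b) \, \log \frac{p_{AB}(a,b)}{p_A(a)\, p_B(b)} \;=\; H(A) + H(B) - H(A,B).
\end{equation}
In our setting, $A$ and $B$ are the (discretised) positions of the two players within the time window, so $p_{AB}$ is the 4-dimensional joint distribution described above.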
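A minimal numerical sketch of this empirical estimate is given below. It assumes that each trajectory is an array of $(x, y)$ positions sampled at the same timestamps within the time window; the function name, the $8 \times 8$ grid of spatial cells and the NumPy-based implementation are illustrative choices, not fixed by the method described above.

\begin{verbatim}
import numpy as np

def empirical_mutual_information(traj_a, traj_b, n_bins=8):
    """Estimate the mutual information between two 2-D trajectories.

    traj_a, traj_b: arrays of shape (T, 2) with (x, y) positions sampled
    at the same T timestamps (e.g. a window of a few minutes of play).
    The pitch is discretised into an n_bins x n_bins grid, so the joint
    distribution of the pair of positions is a 4-dimensional histogram.
    """
    traj_a = np.asarray(traj_a, dtype=float)
    traj_b = np.asarray(traj_b, dtype=float)

    # Common spatial bins so both players are discretised on the same grid.
    xy = np.vstack([traj_a, traj_b])
    x_edges = np.linspace(xy[:, 0].min(), xy[:, 0].max(), n_bins + 1)
    y_edges = np.linspace(xy[:, 1].min(), xy[:, 1].max(), n_bins + 1)

    def discretise(traj):
        # Map each (x, y) position to a single cell index of the grid.
        ix = np.clip(np.digitize(traj[:, 0], x_edges) - 1, 0, n_bins - 1)
        iy = np.clip(np.digitize(traj[:, 1], y_edges) - 1, 0, n_bins - 1)
        return ix * n_bins + iy

    cells_a = discretise(traj_a)
    cells_b = discretise(traj_b)

    # Joint histogram of the pair of cells -> empirical joint distribution.
    joint = np.zeros((n_bins * n_bins, n_bins * n_bins))
    np.add.at(joint, (cells_a, cells_b), 1.0)
    p_ab = joint / joint.sum()
    p_a = p_ab.sum(axis=1, keepdims=True)
    p_b = p_ab.sum(axis=0, keepdims=True)

    # I(A; B) = sum p(a,b) * log( p(a,b) / (p(a) p(b)) ), with 0 log 0 := 0.
    mask = p_ab > 0
    return float(np.sum(p_ab[mask] * np.log(p_ab[mask] / (p_a @ p_b)[mask])))
\end{verbatim}

A histogram plug-in estimator is the simplest choice here; a finer grid or a dedicated entropy estimator could be substituted without changing the overall scheme.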