% Lucas Fidon edited section_Metrics_for_trajectories_Most__.tex (commit 374827949c1af870f2771f23a4b9e607073636cc)
In our case, the feature space is the joint distribution of the positions of two players' trajectories over a time window of a few minutes; it is therefore a 4-dimensional distribution. The mutual information of this distribution will be the linchpin of our similarity measure for trajectories.
\subsubsection{Entropy}
Shannon introduced entropy as a measure of the amount of information carried by a random variable.
Let $X: P \rightarrow E$ be a random variable taking values in a discrete set $E$.
The entropy of $X$ is defined as:
$ S(X) = -\sum_{x \in E}P_{X}(x)\log(P_{X}(x))$
The entropy has three interpretations:
\begin{itemize}
\item the amount of information of a random variable (or of an event)
\item the uncertainty about the outcome of a random variable (or of an event)
\item the dispersion of the probability law of a random variable (or of the probabilities with which the events take place)
\end{itemize}
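The definition above can be sketched in code. The following is a minimal illustration (not part of the original text) that estimates the entropy of the empirical distribution of a sample; the function name and the coin-toss example are ours.

```python
import math
from collections import Counter

def entropy(samples):
    """Shannon entropy (in nats) of the empirical distribution of samples."""
    counts = Counter(samples)
    n = len(samples)
    # S(X) = -sum_x P_X(x) * log(P_X(x)), with P_X estimated by relative frequencies
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# A fair coin has entropy log(2): the most uncertain two-outcome variable.
print(entropy(["H", "T", "H", "T"]))  # -> 0.6931... = log(2)
```

A deterministic variable (a single repeated outcome) gives entropy 0, matching the "uncertainty" interpretation above.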
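Before turning to the metric itself, here is a hedged sketch of how the mutual information of the 4-dimensional joint distribution of two players' positions could be estimated. This is a standard plug-in histogram estimator, not necessarily the estimator used in this work; the bin count and the pitch coordinate range are hypothetical.

```python
import math
from collections import Counter

def mutual_information(xs, ys, bins=4, lo=0.0, hi=100.0):
    """Histogram estimate of MI (in nats) between two 2-D position streams.

    xs, ys: lists of (x, y) positions for player 1 and player 2, sampled at
    the same time steps. Coordinates are assumed to lie in [lo, hi)
    (hypothetical pitch range).
    """
    def bin2d(p):
        # Discretize a 2-D position onto a bins x bins grid.
        w = (hi - lo) / bins
        return (min(int((p[0] - lo) / w), bins - 1),
                min(int((p[1] - lo) / w), bins - 1))

    a = [bin2d(p) for p in xs]
    b = [bin2d(p) for p in ys]
    n = len(a)
    pa, pb, pab = Counter(a), Counter(b), Counter(zip(a, b))
    # I(A;B) = sum_{u,v} P(u,v) * log( P(u,v) / (P(u) P(v)) )
    return sum((c / n) * math.log((c / n) / ((pa[u] / n) * (pb[v] / n)))
               for (u, v), c in pab.items())
```

If one player stands still, the estimate is exactly 0 (knowing that player's position tells us nothing about the other), while comparing a trajectory with itself recovers the entropy of its binned distribution.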
\subsection{MI-based metric for trajectories}