Exercise 1
part 1.
\(D_2\left(x_i,x_j\right)=\sum_{l=1}^d\left(x_i\left(l\right)-x_j\left(l\right)\right)^2\)
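As a quick numerical illustration of the definition above (the point values are arbitrary examples), \(D_2\) is just the squared Euclidean distance, summed coordinate by coordinate:

```python
import numpy as np

# D_2 as defined above: the sum over coordinates l of the
# squared differences (i.e. the squared Euclidean distance).
def d2(xi, xj):
    return float(np.sum((xi - xj) ** 2))

# Illustrative points in R^3 (not from the exercise).
xi = np.array([1.0, 2.0, 3.0])
xj = np.array([4.0, 0.0, 3.0])
assert d2(xi, xj) == 13.0  # (1-4)^2 + (2-0)^2 + (3-3)^2 = 9 + 4 + 0
```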
\(x^{\ast}=\arg\min_{x\in\mathbb{R}^d}\sum_{x_i\in X}D_2\left(x_i,x\right)\)
In order to find the minimum, we set the first-order derivative equal to zero:
\(\Rightarrow\frac{d}{dx}\sum_{x_i\in X}\left(x_i-x\right)^2=\sum_{x_i\in X}\frac{d}{dx}\left(x_i-x\right)^2=0\)
\(\Rightarrow0=\sum_{i=1}^n-2\left(x_i-x\right)\) \(\because x_i\in X\) runs over all points \(x_1\) to \(x_n\); dividing through by \(-2\),
\(\Rightarrow0=\left(x_1-x\right)+\left(x_2-x\right)+\cdots+\left(x_n-x\right)\)
\(\Rightarrow0=x_1+x_2+\cdots+x_n-nx\)
\(\Rightarrow nx=\sum_{i=1}^n x_i\)
\(\Rightarrow x=\frac{1}{n}\sum_{i=1}^n x_i\)
The objective is therefore minimized at \(x=\frac{1}{n}\sum_{i=1}^n x_i\) (the second derivative, \(2n>0\), confirms this critical point is a minimum).
Therefore \(x^{\ast}=\frac{1}{n}\sum_{i=1}^n x_i\), the mean of the points.
Hence Proved.
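The result above can be sanity-checked numerically: perturbing the mean in any direction should only increase the sum of squared distances. A minimal sketch (the sample points and perturbation scale are arbitrary choices for illustration):

```python
import numpy as np

# Random sample of n = 50 points in R^3 (illustrative data).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))

def total_d2(x, X):
    # Objective: sum over all x_i of D_2(x_i, x) = sum_l (x_i(l) - x(l))^2
    return float(np.sum((X - x) ** 2))

mean = X.mean(axis=0)

# Try many small random perturbations of the mean; since the mean is
# the unique minimizer, none of them should achieve a lower objective.
best_perturbed = min(
    total_d2(mean + 0.01 * rng.normal(size=3), X) for _ in range(1000)
)
assert total_d2(mean, X) < best_perturbed
```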
part 2.
\(D_1\left(x_i,x_j\right)=\sum_{l=1}^d\left|x_i\left(l\right)-x_j\left(l\right)\right|\)
\(x^{\ast}=\arg\min_{x\in\mathbb{R}^d}\sum_{x_i\in X}D_1\left(x_i,x\right)\)
Again, we set the first-order derivative equal to zero (noting that the objective is not differentiable at points where \(x\) coincides with some \(x_i\), so the condition holds piecewise):
\(\Rightarrow\frac{d}{dx}\sum_{x_i\in X}\left|x_i-x\right|=\sum_{x_i\in X}\frac{d}{dx}\left|x_i-x\right|=0\)
We know that \(\frac{d}{dx}\left|x\right|=\operatorname{sign}\left(x\right)\), which is \(-1\) when \(x<0\) and \(1\) when \(x>0\) (and undefined at \(x=0\)).
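The sign-function fact used here can be verified numerically with a central difference away from \(x=0\) (the test points below are arbitrary):

```python
import numpy as np

# Numerical check that d/dx |x| = sign(x) for x != 0,
# using a symmetric (central) finite difference.
def abs_derivative(x, h=1e-6):
    return (abs(x + h) - abs(x - h)) / (2 * h)

# At any nonzero x (with |x| > h), the central difference is
# exactly -1 on the negative side and +1 on the positive side.
for x in [-3.0, -0.5, 0.7, 4.2]:
    assert np.isclose(abs_derivative(x), np.sign(x))
```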