In this document, we present an analytical study of "Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment," published in the Journal of Shanghai University in 2004 by Zhenyue Zhang, Professor of Mathematics at Zhejiang University, and Hongyuan Zha, Professor at the Georgia Institute of Technology, both of whom have worked extensively on machine learning. To summarize the article, I first expose the scientific context; secondly, I highlight the most important existing works and the position of the article among them; thirdly, I describe its scientific contribution; and finally I end my summary by presenting the results and experiments of the method.

Context of the work

Manifold learning is a subfield of machine learning, which is itself a subfield of computer science. Before defining manifold learning, I start with machine learning itself: Arthur Samuel described it as the field that "gives computers the ability to learn without being explicitly programmed" (Arthur Samuel, 1959) [1]. Machine learning explores the study and construction of algorithms that can learn from and make predictions on data [2]. It is divided into two principal approaches: supervised learning and unsupervised learning, of which manifold learning is a part. From there we can define manifold learning: it is the approach that attempts to reduce the dimension of a dataset to ease its representation and interpretation (this family of approaches is called representation learning). The main issue with high-dimensional datasets is that the higher the dimension, the more difficult it becomes to sample the space. This causes many problems: algorithms that operate on high-dimensional data tend to have very high time complexity, and many machine learning algorithms struggle with such data. This has become known as the curse of dimensionality. Reducing data to fewer dimensions often makes analysis algorithms more efficient and can help machine learning algorithms make more accurate predictions. Many algorithms were therefore developed to reduce dimensionality; below, we present the position of the local tangent space alignment algorithm among these existing works.


The difficulty of visualizing and making predictions on high-dimensional datasets has pushed researchers to develop several algorithms in this field; each of them tries to solve part of the problem of high dimensionality and to aid visualization of the structure of a dataset. In this part of the paper we present the most important algorithms dealing with manifold learning and nonlinear dimensionality reduction.


Isometric Mapping

Isomap, short for Isometric Mapping, is one of the earliest algorithms treating manifold learning. It can be viewed as an extension of Multi-Dimensional Scaling (MDS) or Kernel PCA; it seeks a lower-dimensional embedding which maintains geodesic distances between all points.
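As an illustration, here is a minimal sketch of Isomap using scikit-learn's implementation; the Swiss-roll dataset and the parameter values are my own illustrative choices, not taken from the article:

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Sample 1000 points from a 3-D "Swiss roll", a standard test manifold.
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Embed into 2 dimensions while preserving geodesic distances,
# which Isomap approximates with shortest paths over a
# 10-nearest-neighbor graph.
isomap = Isomap(n_neighbors=10, n_components=2)
Y = isomap.fit_transform(X)
print(Y.shape)  # (1000, 2)
```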

Locally Linear Embedding

The LLE algorithm seeks a lower-dimensional projection of the data which preserves distances within local neighborhoods. It can be thought of as a series of local Principal Component Analyses which are globally compared to find the best nonlinear embedding [4].
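A minimal sketch of standard LLE with scikit-learn; the S-curve dataset and parameters are illustrative assumptions on my part:

```python
from sklearn.datasets import make_s_curve
from sklearn.manifold import LocallyLinearEmbedding

# 3-D "S-curve" data, another classic manifold-learning benchmark.
X, _ = make_s_curve(n_samples=1000, random_state=0)

# Standard LLE: express each point as a weighted combination of its
# 10 nearest neighbors, then find the 2-D embedding that best
# preserves those local reconstruction weights.
lle = LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                             method="standard")
Y = lle.fit_transform(X)
print(Y.shape)  # (1000, 2)
```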

Hessian Eigenmapping

Hessian Eigenmapping (also known as Hessian-based LLE: HLLE) is another method of solving the regularization problem of LLE. It revolves around a Hessian-based quadratic form at each neighborhood which is used to recover the locally linear structure [5].
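In scikit-learn, HLLE is selected via the same estimator as LLE; a minimal sketch (dataset and parameter values are my own choices):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# Hessian LLE requires n_neighbors > n_components * (n_components + 3) / 2,
# i.e. more than 5 neighbors for a 2-D embedding; 12 satisfies this.
hlle = LocallyLinearEmbedding(n_neighbors=12, n_components=2,
                              method="hessian")
Y = hlle.fit_transform(X)
print(Y.shape)  # (1000, 2)
```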

Local Tangent Space Alignment

Local tangent space alignment (LTSA) is algorithmically similar enough to LLE that it can be put in this category. Rather than focusing on preserving neighborhood distances as in LLE, LTSA seeks to characterize the local geometry at each neighborhood via its tangent space, and performs a global optimization to align these local tangent spaces to learn the embedding [6].
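scikit-learn exposes LTSA through the same LocallyLinearEmbedding estimator; the following is a minimal sketch of unrolling a Swiss roll with it (dataset and parameters are illustrative choices, not taken from the article):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# LTSA: approximate the tangent space at each point via a local PCA
# over its 10 nearest neighbors, then align these local coordinate
# systems into a single global 2-D embedding.
ltsa = LocallyLinearEmbedding(n_neighbors=10, n_components=2,
                              method="ltsa")
Y = ltsa.fit_transform(X)
print(Y.shape)  # (1000, 2)
```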

Multi-dimensional Scaling

MDS seeks a low-dimensional representation of the data in which the distances respect well the distances in the original high-dimensional space. In general, it is a technique used for analyzing similarity or dissimilarity data: MDS attempts to model similarity or dissimilarity data as distances in a geometric space. The data can be ratings of similarity between objects, interaction frequencies of molecules, or trade indices between countries [7].
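A minimal sketch of metric MDS with scikit-learn; the random 10-dimensional data here is a stand-in I chose for any similarity/dissimilarity dataset:

```python
import numpy as np
from sklearn.manifold import MDS

# Synthetic data: 100 random points in 10 dimensions.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))

# Metric MDS: find 2-D coordinates whose pairwise Euclidean distances
# approximate the pairwise distances of the original 10-D points.
mds = MDS(n_components=2, random_state=0)
Y = mds.fit_transform(X)
print(Y.shape)  # (100, 2)
```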


The introduction of tangent space alignment to learn the local geometry of the manifold, together with an error analysis establishing second-order accuracy, are the major differences from the other approaches. These contributions reduce the computational complexity of the algorithm and improve its performance and precision.