Online Dictionary Learning for Sparse Coding

1 Introduction
This article, "Online Dictionary Learning for Sparse Coding", was published in the proceedings of the 26th International Conference on Machine Learning (ICML), held in Montreal, Canada, in 2009. The authors are Julien Mairal and Francis Bach, researchers at INRIA in France; Jean Ponce, from the Ecole Normale Superieure; and Guillermo Sapiro, from the Department of Electrical and Computer Engineering at the University of Minnesota, Minneapolis, USA.
In this report, we present the article from several angles. First, we describe the general context of image and video processing; then we focus on the article's contributions, namely a new online optimization algorithm for dictionary learning. We then present the results of the paper's experiments, and we conclude with an application to inpainting.
In its introduction, the article explains what sparse coding and dictionary learning are and why one should care about them. It then covers optimization techniques for sparse coding, dictionary learning for reconstruction, learning for a specific task, and finally the proposed new sparse models.

2 Context of the work
The context of this work is learning dictionaries for sparse coding efficiently.
The authors propose a new online optimization algorithm for dictionary learning based on stochastic approximations, which scales up gracefully to large datasets with millions of training samples. A proof of convergence is presented, along with experiments on natural images demonstrating that the method leads to faster performance and better dictionaries than classical batch algorithms for both small and large datasets.

3 Positioning
As with all such problems, many algorithms already exist, and the authors position their work against this prior research. Their intent is not to evaluate the learning procedure on inpainting tasks as such, which would require a thorough comparison with state-of-the-art techniques on standard datasets. They do, however, compare with stochastic gradient descent: their experiments show that obtaining good performance with stochastic gradient descent requires a carefully chosen learning rate, whereas the proposed online method does not depend on such tuning.
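The stochastic-gradient baseline discussed above can be sketched as a projected gradient step on the dictionary for each incoming sample. The sketch below is our own minimal illustration, not the authors' code: it uses a few ISTA iterations for the sparse-coding step and hand-picked penalty and step-size values, and it projects each dictionary atom back onto the unit ball after every update.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def sparse_code_ista(D, x, lam, n_steps=50):
    # A few ISTA iterations approximating the lasso solution of
    # min_a 0.5*||x - D a||^2 + lam*||a||_1.
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(D.shape[1])
    for _ in range(n_steps):
        a = soft_threshold(a - (D.T @ (D @ a - x)) / L, lam / L)
    return a

def sgd_dictionary_update(D, x, a, lr):
    # One stochastic gradient step on 0.5*||x - D a||^2 with respect to D,
    # followed by projecting each column (atom) onto the unit ball.
    D = D - lr * np.outer(D @ a - x, a)
    norms = np.maximum(np.linalg.norm(D, axis=0), 1.0)
    return D / norms

# Toy usage: one pass of online SGD over a random dataset
# (lam and lr are illustrative values, not tuned).
rng = np.random.RandomState(0)
X = rng.randn(200, 16)                 # 200 signals of dimension 16
D = rng.randn(16, 32)
D /= np.linalg.norm(D, axis=0)         # start from unit-norm atoms
for x in X:
    a = sparse_code_ista(D, x, lam=0.1)
    D = sgd_dictionary_update(D, x, a, lr=0.01)
```

The sensitivity of this loop to `lr` is precisely the tuning burden the authors highlight; their online algorithm replaces the raw gradient step with an update derived from accumulated sufficient statistics, avoiding a hand-chosen learning rate.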