Denes Csala added MTD_would_obviously_qualify_into__.md  over 8 years ago

Commit id: 86fe8d0237bac1f7b19e1af92e2f0bf00c42766e


MTD would obviously qualify for the lower branch of Unsupervised Transfer Learning, illustrating that this is a relatively unexplored field. One of the earliest and most elegant applications of unsupervised transfer learning to topic modeling is that of \cite{De_Smet_2009}: interlingual topic modeling through cross-language linking of news stories on the web. In this paper, the researchers adapt the standard LDA algorithm to work with two languages (English and Dutch): they run a standard LDA mining process on each of the two document collections and then cross-correlate the word distributions of the resulting topics using the Kullback-Leibler divergence measure. The outcome is potentially very powerful and has a wide array of applications, from basic translation to news validation and event prediction through document (news) clustering.

Another interesting way to improve this approach would be to adapt \cite{Shi_2013} to MTD. Instead of treating two languages, or _k_ languages in a pair-by-pair manner, one could take a number of languages and build a Bayesian mixture for each one from the set of the remaining languages. The great advantage of this is that even if the accuracy (e.g. the Kullback-Leibler measure) of the transfer learning between any two languages is low, given a large enough number of these low-quality language pairs, a good prediction could eventually be obtained even without a single good translation pair.
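To make the cross-correlation step concrete, the sketch below pairs topics from two independently trained LDA models by the Kullback-Leibler divergence between their word distributions. This is only an illustration, not the implementation of \cite{De_Smet_2009}: it assumes the two languages' topic-word distributions have already been mapped onto a shared (e.g. dictionary-linked) vocabulary, and it uses random Dirichlet draws in place of real LDA output.

```python
import numpy as np
from scipy.stats import entropy  # entropy(p, q) gives the KL divergence D(p || q)

def kl_matrix(topics_a, topics_b, eps=1e-12):
    """Pairwise KL divergences between two sets of topic-word distributions.

    topics_a, topics_b: arrays of shape (n_topics, vocab_size), rows summing to 1.
    Assumes both languages' topics are expressed over a shared (e.g. translated
    or linked) vocabulary -- that alignment step is a simplifying assumption here.
    """
    n_a, n_b = len(topics_a), len(topics_b)
    D = np.zeros((n_a, n_b))
    for i in range(n_a):
        for j in range(n_b):
            # smooth and renormalize to avoid zeros in the KL computation
            p = topics_a[i] + eps
            q = topics_b[j] + eps
            D[i, j] = entropy(p / p.sum(), q / q.sum())
    return D

# toy example: 3 "English" topics vs. 3 "Dutch" topics over a 5-word shared vocabulary
rng = np.random.default_rng(0)
topics_en = rng.dirichlet(np.ones(5), size=3)
topics_nl = rng.dirichlet(np.ones(5), size=3)
D = kl_matrix(topics_en, topics_nl)
# for each "English" topic, the best-matching "Dutch" topic has the smallest divergence
print(D.argmin(axis=1))
```

In the multi-language extension sketched above, the same divergence matrix could be computed for every language pair and the per-pair matches combined into a mixture, so that many weak pairings can still yield a confident overall alignment.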