Denes Csala edited One_of_the_self_declared__.md  over 8 years ago

Commit id: a0fbf03abca5604c86e11a5e58494d29b4c5f749

One of the self-declared strengths of MTD is that it does not label the prior topics, and can therefore run without human input. One of the most influential works on topic modeling with transfer learning is \cite{Thrun_1998}. It defines transfer learning, or "learning to learn" as the book calls it, as taking place when an algorithm's performance on each task improves both with experience and with the number of tasks. Under this definition, MTD is hard to bracket as transfer learning. The book references \cite{Blumer_1987} for the equation that defines learning a function from noise-free examples.
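The result from \cite{Blumer_1987} is usually stated as a sample-complexity bound for consistent learning from noise-free examples. A sketch of its standard form, where $\epsilon$ is the error tolerance, $\delta$ the failure probability, $H$ a finite hypothesis class, and $m$ the number of examples (the symbols here are conventional notation, not necessarily the book's):

$$ m \geq \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right) $$

Informally: any learner that outputs a hypothesis consistent with at least $m$ noise-free examples will, with probability at least $1-\delta$, have true error at most $\epsilon$.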