Alexander Kirillov edited bf_Abstract_The_goal_of__.tex
almost 8 years ago
Commit id: a3e8666cbce5fdf051a0a545de28cd381de746fb
diff --git a/bf_Abstract_The_goal_of__.tex b/bf_Abstract_The_goal_of__.tex
index acc6910..3cfa341 100644
--- a/bf_Abstract_The_goal_of__.tex
+++ b/bf_Abstract_The_goal_of__.tex
...
, obtained by means of neural networks,
word embedding in Euclidean space,
The word2vec software of Tomas Mikolov and colleagues has gained a lot
of traction lately, and provides state-of-the-art word embeddings. The learning
models behind the software are described in two research papers [1, 2]. We
found the description of the models in these papers to be somewhat cryptic
and hard to follow. While the motivations and presentation may be obvious to
the neural-networks language-modeling crowd, we had to struggle quite a bit to
figure out the rationale behind the equations.
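The embeddings the software produces are simply vectors in Euclidean space, one per word, and semantic relatedness is then measured geometrically, most commonly by cosine similarity. A minimal sketch of that idea, using hand-picked toy vectors rather than actual trained word2vec output:

```python
import math

# Toy 3-dimensional "embeddings" for illustration only -- real word2vec
# vectors are learned from a corpus and typically have 100-300 dimensions.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity: the usual measure of closeness between
    embedding vectors in Euclidean space."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# With trained embeddings, semantically related words end up closer
# together than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))
print(cosine(embeddings["king"], embeddings["apple"]))
```

Here "king" and "queen" score higher than "king" and "apple", which is the behavior a trained model exhibits; the vectors above are merely chosen to mimic it.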