Monday, February 12, 2018

Deep Learning Made Simple (): word2vec

2018/01/13


Chinese (Traditional)
Chinese (Simplified)
English
Papers


Explanations
Discussions
Source code
Theory



-----

References

Chinese (Traditional)

Vector Space of Semantics - Mark Chang's Blog
https://ckmarkoh.github.io/blog/2016/07/10/nlp-vector-space-semantics/

Word2vec (Part 1 : Overview) - Mark Chang's Blog
https://ckmarkoh.github.io/blog/2016/07/12/neural-network-word2vec-part-1-overview/

Word2vec (Part 2 : Backward Propagation) - Mark Chang's Blog
https://ckmarkoh.github.io/blog/2016/07/12/-word2vec-neural-networks-part-2-backward-propagation/

Word2vec (Part 3 : Implementation) - Mark Chang's Blog
https://ckmarkoh.github.io/blog/2016/08/29/neural-network-word2vec-part-3-implementation/

Word2vec - Wikipedia, the free encyclopedia (Chinese)
https://zh.wikipedia.org/wiki/Word2vec

科技大擂台 (Formosa Grand Challenge): Introduction to Word Vectors
https://fgc.stpi.narl.org.tw/activity/videoDetail/4b1141305ddf5522015de5479f4701b1

Word Embedding and Word2Vec - Saowen
https://hk.saowen.com/a/170e16f48e0ac5d1caf38d1a294739b105fe08b1cc0724b6eed98efc7d49f401

Implementing TensorFlow (5): Word2Vec - YC Note
http://www.ycc.idv.tw/YCNote/post/44

A More Elegant Word Vector Model: simpler glove (Part 1) - Bangqu
http://bangqu.com/ADv99g.html

Deep Learning for Natural Language Processing with Python | Soft & Share
https://softnshare.wordpress.com/2017/09/25/python-natural-language-process-deep-learning/

Solving Large-Scale Text Classification with Deep Learning (CNN, RNN, Attention): Survey and Practice | Hong Kong Silicon Valley
https://www.hksilicon.com/articles/1459305

Singular Value Decomposition (SVD) | 線代啟示錄
https://ccjou.wordpress.com/2009/09/01/奇異值分解-svd/
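The SVD article above connects to the count-based side of word vectors: factoring a word-word co-occurrence matrix with a truncated SVD already yields dense embeddings, which is the classical baseline that word2vec is usually compared against. A minimal numpy sketch (the toy corpus and window size of 1 are illustrative assumptions):

```python
import numpy as np

# Toy corpus; in practice this would be a large tokenized text.
corpus = [["i", "like", "deep", "learning"],
          ["i", "like", "nlp"],
          ["i", "enjoy", "flying"]]

vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a window of 1.
X = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if j != i:
                X[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-k singular directions as word vectors.
U, S, Vt = np.linalg.svd(X)
k = 2
word_vectors = U[:, :k] * S[:k]   # one k-dimensional vector per word
print(word_vectors.shape)         # (vocab_size, k)
```

Unlike word2vec's iterative training, this is a one-shot matrix factorization, which is why it appears under "linear algebra" rather than "neural networks."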





Chinese (Simplified)

[NLP] Understand the Essence of Word2vec in Seconds
https://zhuanlan.zhihu.com/p/26306795

Deep Learning word2vec Notes: Basics - CSDN Blog
http://blog.csdn.net/mytestmy/article/details/26961315

What I Have to Say About word2vec - Tencent Cloud Community - Tencent Cloud
https://www.qcloud.com/community/article/404897

The Mathematics of word2vec Explained (1): Contents and Preface - CSDN Blog
http://blog.csdn.net/itplus/article/details/37969519



How Do Word Vectors (Distributed Representations) Work? - Zhihu
https://www.zhihu.com/question/21714667/answer/19433618

What Makes word2vec Better Than Earlier Word Embedding Methods? - Zhihu
https://www.zhihu.com/question/53011711

An Analysis of Google's Open-Source word2vec Project? - Zhihu
https://www.zhihu.com/question/21661274

What Are the Applications of word2vec? - Zhihu
https://www.zhihu.com/question/25269336

Application Scenarios of word2vec in Industry - Big Data Algorithms
http://x-algo.cn/index.php/2016/03/12/281/



Word2vec: Neural Word Embeddings in Java - Deeplearning4j: Open-source, Distributed Deep Learning for the JVM
https://deeplearning4j.org/cn/word2vec

word2vec Source Code Analysis: word2vec.c - CSDN Blog
http://blog.csdn.net/lingerlanlan/article/details/38232755


English

Word2vec - Wikipedia
https://en.wikipedia.org/wiki/Word2vec

The amazing power of word vectors | the morning paper
https://blog.acolyer.org/2016/04/21/the-amazing-power-of-word-vectors/

Deep Learning Weekly | Demystifying Word2Vec
https://www.deeplearningweekly.com/blog/demystifying-word2vec

Deep Learning, NLP, and Representations - colah's blog
http://colah.github.io/posts/2014-07-NLP-RNNs-Representations/




Word2Vec Tutorial - The Skip-Gram Model · Chris McCormick
http://mccormickml.com/2016/04/19/word2vec-tutorial-the-skip-gram-model/
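The skip-gram tutorial above rests on one simple preprocessing step: turning a token sequence into (center, context) training pairs within a fixed window. A minimal sketch (the window size is an illustrative assumption; real implementations often also subsample frequent words):

```python
def skipgram_pairs(tokens, window=2):
    """Yield (center, context) pairs for every position, where the
    context word lies within `window` tokens of the center word."""
    pairs = []
    for i, center in enumerate(tokens):
        for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
            if j != i:
                pairs.append((center, tokens[j]))
    return pairs

print(skipgram_pairs(["the", "quick", "brown", "fox"], window=1))
# [('the', 'quick'), ('quick', 'the'), ('quick', 'brown'),
#  ('brown', 'quick'), ('brown', 'fox'), ('fox', 'brown')]
```

Each pair becomes one training example: the model is asked to predict the context word from the center word.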


Vector Representations of Words | TensorFlow
https://www.tensorflow.org/tutorials/word2vec

Learn Word2Vec by implementing it in tensorflow – Towards Data Science
https://towardsdatascience.com/learn-word2vec-by-implementing-it-in-tensorflow-45641adaf2ac

Word2Vec word embedding tutorial in Python and TensorFlow - Adventures in Machine Learning
http://adventuresinmachinelearning.com/word2vec-tutorial-tensorflow/

gensim: models.word2vec – Deep learning with word2vec
https://radimrehurek.com/gensim/models/word2vec.html





Intuitive Understanding of Word Embeddings: Count Vectors to Word2Vec
https://www.analyticsvidhya.com/blog/2017/06/word-embeddings-count-word2veec/




Approximating the Softmax for Learning Word Embeddings
http://ruder.io/word-embeddings-softmax/
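Ruder's article above surveys alternatives to the full softmax; the most common one, negative sampling, replaces the V-way normalization with one positive and a few sampled binary logistic terms. A minimal numpy sketch of the per-pair loss (the embedding size and number of negatives are illustrative assumptions):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_loss(v_center, v_context, v_negatives):
    """Negative-sampling objective for one (center, context) pair:
    -log sigma(u_o . v_c) - sum_k log sigma(-u_k . v_c)."""
    pos = np.log(sigmoid(v_context @ v_center))
    neg = np.sum(np.log(sigmoid(-(v_negatives @ v_center))))
    return -(pos + neg)

rng = np.random.default_rng(0)
d, k = 50, 5                      # embedding size, negative samples
loss = neg_sampling_loss(rng.normal(size=d),
                         rng.normal(size=d),
                         rng.normal(size=(k, d)))
print(loss)                       # a positive scalar
```

The cost per training pair is O(k·d) instead of O(V·d), which is what makes training on billion-word corpora practical.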


Stop Using word2vec | Stitch Fix Technology – Multithreaded
https://multithreaded.stitchfix.com/blog/2017/10/18/stop-using-word2vec/



Papers

-----

Bengio, Yoshua, et al. "A neural probabilistic language model." Journal of machine learning research 3.Feb (2003): 1137-1155.

-----

Le, Quoc, and Tomas Mikolov. "Distributed representations of sentences and documents." Proceedings of the 31st International Conference on Machine Learning (ICML-14). 2014.

-----

Mikolov, Tomas, et al. "Efficient estimation of word representations in vector space." arXiv preprint arXiv:1301.3781 (2013).

-----

Mikolov, Tomas, Wen-tau Yih, and Geoffrey Zweig. "Linguistic regularities in continuous space word representations." HLT-NAACL. Vol. 13. 2013.
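This regularities paper is the source of the famous analogy arithmetic (king - man + woman ≈ queen): the answer is the vocabulary word whose vector is closest, by cosine similarity, to the offset vector. A minimal numpy sketch over a hypothetical hand-made 2-d vocabulary (real vectors would come from a trained word2vec model):

```python
import numpy as np

# Hypothetical "pre-trained" vectors chosen so the analogy holds.
vecs = {
    "king":  np.array([0.9, 0.8]),
    "queen": np.array([0.9, 0.2]),
    "man":   np.array([0.1, 0.8]),
    "woman": np.array([0.1, 0.2]),
}

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def analogy(a, b, c):
    """Solve a : b :: c : ? by nearest cosine neighbor of b - a + c,
    excluding the three query words themselves."""
    target = vecs[b] - vecs[a] + vecs[c]
    candidates = {w: v for w, v in vecs.items() if w not in (a, b, c)}
    return max(candidates, key=lambda w: cosine(candidates[w], target))

print(analogy("man", "king", "woman"))   # -> "queen"
```

Excluding the query words matters in practice, since b - a + c is often nearest to b or c itself.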

-----

Goldberg, Yoav, and Omer Levy. "word2vec Explained: deriving Mikolov et al.'s negative-sampling word-embedding method." arXiv preprint arXiv:1402.3722 (2014).

-----

Rong, Xin. "word2vec parameter learning explained." arXiv preprint arXiv:1411.2738 (2014).

-----

Lai, Siwei (来斯惟). "Word and Document Semantic Vector Representation Methods Based on Neural Networks" (in Chinese). 2016.
https://arxiv.org/ftp/arxiv/papers/1611/1611.05962.pdf

-----

Pennington, Jeffrey, Richard Socher, and Christopher Manning. "GloVe: Global vectors for word representation." Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP). 2014.


-----
