Getting Started with Word2Vec

1. Sources from Google

Project with Code: https://code.google.com/archive/p/word2vec/

Blog: Learning the Meaning Behind Words

Paper:

  1. Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. Efficient Estimation of Word Representations in Vector Space. In Proceedings of Workshop at ICLR, 2013.
  2. Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. In Proceedings of NIPS, 2013.
  3. Tomas Mikolov, Wen-tau Yih, and Geoffrey Zweig. Linguistic Regularities in Continuous Space Word Representations. In Proceedings of NAACL HLT, 2013.
  4. Tomas Mikolov, Quoc V. Le, and Ilya Sutskever. Exploiting Similarities among Languages for Machine Translation. arXiv preprint, 2013.
  5. Tomas Mikolov et al. Neural Networks for Text. NIPS Deep Learning Workshop slides. https://docs.google.com/file/d/0B7XkCwpI5KDYRWRnd1RzWXQ2TWc/edit
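Paper 3 above introduced the now-famous analogy test ("king" − "man" + "woman" ≈ "queen"), answered by simple vector arithmetic plus nearest-neighbor search. A minimal sketch of that idea, using hand-made toy vectors rather than real word2vec output:

```python
import numpy as np

# Toy embeddings, chosen by hand for illustration only; real word2vec
# vectors are learned from a corpus and have hundreds of dimensions.
emb = {
    "king":  np.array([0.9, 0.8, 0.1]),
    "man":   np.array([0.1, 0.8, 0.1]),
    "woman": np.array([0.1, 0.1, 0.9]),
    "queen": np.array([0.9, 0.1, 0.9]),
    "apple": np.array([0.5, 0.5, 0.5]),
}

def cos(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def analogy(a, b, c):
    """Return the word closest to emb[b] - emb[a] + emb[c], excluding a, b, c."""
    target = emb[b] - emb[a] + emb[c]
    return max((w for w in emb if w not in (a, b, c)),
               key=lambda w: cos(emb[w], target))
```

With these toy vectors, `analogy("man", "king", "woman")` returns `"queen"`, mirroring the test the paper runs on real embeddings.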

2. Best Explanation

The clearest explanation of the original models, the optimization methods, and the back-propagation background, together with a word embedding visual inspector:

Paper: word2vec Parameter Learning Explained

Slides: Word Embedding Explained and Visualized

YouTube Video: Word Embedding Explained and Visualized – word2vec and wevi

Demo: wevi: word embedding visual inspector

3. Word2Vec Tutorials

Word2Vec Tutorial by Chris McCormick

Chris McCormick http://mccormickml.com/

Note: these tutorials skip over the usual introductory and abstract insights about Word2Vec and get into more of the details.

Word2Vec Tutorial – The Skip-Gram Model

Word2Vec Tutorial Part 2 – Negative Sampling
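Part 2 covers negative sampling, which replaces the full softmax with a handful of binary logistic updates: the observed pair gets label 1 and a few sampled "noise" words get label 0. A hedged numpy sketch (toy sizes; the negatives are fixed here, whereas the real implementation draws them from a unigram^(3/4) noise distribution):

```python
import numpy as np

rng = np.random.default_rng(1)
V, N = 10, 4                                 # toy vocabulary size, embedding dim
W_in = rng.normal(scale=0.1, size=(V, N))    # input (center-word) vectors
W_out = rng.normal(scale=0.1, size=(V, N))   # output (context-word) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_sampling_step(center, context, negatives, lr=0.1):
    """Push the true (center, context) pair together and k noise pairs apart."""
    h = W_in[center]
    grad_h = np.zeros_like(h)
    for word, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        g = sigmoid(W_out[word] @ h) - label  # gradient of the log-sigmoid loss
        grad_h += g * W_out[word]
        W_out[word] -= lr * g * h             # only k+1 output vectors change
    W_in[center] -= lr * grad_h

# Train repeatedly on one positive pair (0, 1) against negatives 3 and 4.
for _ in range(200):
    neg_sampling_step(0, 1, negatives=[3, 4])
```

After training, the positive pair scores higher than either negative, and each step touched only k+1 output vectors instead of all V, which is the whole point of the trick.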

Alex Minnaar’s Tutorials

Alex Minnaar http://alexminnaar.com/

Word2Vec Tutorial Part I: The Skip-Gram Model

Word2Vec Tutorial Part II: The Continuous Bag-of-Words Model

4. Learning by Coding

Distributed Representations of Sentences and Documents http://nbviewer.jupyter.org/github/fbkarsdorp/doc2vec/blob/master/doc2vec.ipynb

An Anatomy of Key Tricks in word2vec project with examples http://nbviewer.jupyter.org/github/dolaameng/tutorials/blob/master/word2vec-abc/poc/pyword2vec_anatomy.ipynb
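One of the key tricks covered there is subsampling of frequent words: following the formula in the word2vec paper, each occurrence of word w is kept with probability sqrt(t / f(w)) (capped at 1), where f(w) is the word's relative corpus frequency and t is a threshold around 1e-5. A small sketch of that formula (note the C implementation uses a slightly different variant):

```python
import math

def keep_probability(freq, t=1e-5):
    """Probability of keeping one occurrence of a word with relative frequency freq."""
    if freq <= 0.0:
        return 1.0                    # unseen/rare words are always kept
    return min(1.0, math.sqrt(t / freq))
```

Rare words are always kept, while very frequent words (e.g. stop words at a few percent of the corpus) are aggressively discarded, which both speeds up training and improves the vectors for the remaining words.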

Gensim word2vec tutorials:

  1. Deep learning with word2vec and gensim, Part One
  2. Word2vec in Python, Part Two: Optimizing
  3. Parallelizing word2vec in Python, Part Three
  4. Gensim word2vec documentation: models.word2vec – Deep learning with word2vec
  5. Word2vec Tutorial by Radim Řehůřek (Note: a simple but very practical tutorial on training a word2vec model in gensim.)

5. Other Word2Vec Resources

Word2Vec Resources by Chris McCormick

Posted by TextProcessing

References

  1. https://textprocessing.org/getting-started-with-word2vec