Word2Vec
- Shallow two-layer neural network that maps each word to a dense vector (embedding), typically of a few hundred dimensions
- Words that share common contexts in the corpus end up close together in the vector space, so semantic relations can be captured with vector arithmetic (e.g., king - man + woman ≈ queen)
- Skip-gram (predict context words from the center word) is one training architecture; continuous bag-of-words (CBOW, predict the center word from its context) is the other
- Google provides a Word2vec model pretrained on the Google News corpus; we can also train a domain-specific Word2vec model on our own text
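To make the skip-gram idea concrete, below is a minimal NumPy sketch: a two-layer network (an embedding lookup and a softmax output layer) trained with cross-entropy to predict context words from a center word on a toy corpus. All names and hyperparameters here are illustrative, not from the source; in practice one would use a library such as gensim and a full softmax would be replaced by negative sampling or hierarchical softmax for speed.

```python
import numpy as np

# Toy corpus; real Word2vec training uses millions of sentences.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D, window = len(vocab), 8, 2  # vocab size, embedding dim, context window

# Build (center, context) pairs: the skip-gram training signal.
pairs = [(w2i[corpus[i]], w2i[corpus[j]])
         for i in range(len(corpus))
         for j in range(max(0, i - window), min(len(corpus), i + window + 1))
         if j != i]

rng = np.random.default_rng(0)
W_in = rng.normal(0.0, 0.1, (V, D))   # input embeddings: the word vectors
W_out = rng.normal(0.0, 0.1, (V, D))  # output (context) embeddings

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

lr, losses = 0.05, []
for epoch in range(100):
    total = 0.0
    for c, o in pairs:
        h = W_in[c].copy()          # "hidden layer" is just a row lookup
        probs = softmax(W_out @ h)  # predicted distribution over context words
        total += -np.log(probs[o])
        err = probs.copy()
        err[o] -= 1.0               # gradient of cross-entropy w.r.t. scores
        W_in[c] -= lr * (W_out.T @ err)
        W_out -= lr * np.outer(err, h)
    losses.append(total / len(pairs))

print(f"avg loss per pair: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

After training, the rows of `W_in` are the word vectors; words appearing in similar contexts (e.g., "mat" and "rug" here) are pushed toward similar directions.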