Word2vec
tags: #NLP-technique
Word2vec is a technique for natural language processing (NLP) published in 2013.
The word2vec algorithm uses a neural network model to learn word associations from a large corpus of text.
Once trained, such a model can detect synonymous words or suggest additional words for a partial sentence.
As the name implies, word2vec represents each distinct word with a particular list of numbers called a vector.
The vectors are learned such that they capture the semantic and syntactic qualities of words; as a result, a simple mathematical function (cosine similarity) can indicate the level of semantic similarity between the words represented by those vectors.
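To make the cosine-similarity idea concrete, here is a minimal sketch in plain Python. The three-dimensional vectors for "king", "queen", and "apple" are invented toy values for illustration only; real word2vec embeddings typically have 100-300 dimensions and are learned from a corpus.

```python
import math

def cosine_similarity(u, v):
    # cos(theta) = dot(u, v) / (||u|| * ||v||)
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy vectors; not taken from a trained model.
king = [0.80, 0.65, 0.10]
queen = [0.75, 0.70, 0.15]
apple = [0.10, 0.20, 0.90]

# Semantically related words should score closer to 1.0
# than unrelated ones.
print(cosine_similarity(king, queen))
print(cosine_similarity(king, apple))
```

Values near 1.0 indicate vectors pointing in nearly the same direction (similar meaning), while values near 0 indicate little semantic relation under this measure.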