Word Embedding
1 Concept:
A word embedding maps each word in a vocabulary to a real-valued vector. Ideally the mapping is injective (distinct words get distinct vectors) and structure-preserving (semantic relationships between words show up as geometric relationships between their vectors).
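A toy illustration of the structure-preserving idea, using hand-picked 3-dimensional vectors (the words and numbers here are hypothetical, chosen only to make the geometry visible; real embeddings are learned and have hundreds of dimensions):

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity: near 1.0 for similar directions, near 0.0 for unrelated.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Hypothetical toy embeddings: "cat" and "dog" sit close together,
# "car" points elsewhere, mirroring their semantic relationships.
emb = {
    "cat": np.array([0.9, 0.8, 0.1]),
    "dog": np.array([0.8, 0.9, 0.2]),
    "car": np.array([0.1, 0.2, 0.9]),
}

print(cosine(emb["cat"], emb["dog"]))  # high: semantically similar words
print(cosine(emb["cat"], emb["car"]))  # low: semantically unrelated words
```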
2 Applications:
Tomas Mikolov's word2vec, Stanford University's GloVe, the Gensim library, etc.
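As a concrete example, Gensim ships a ready-made word2vec implementation. A minimal sketch, assuming Gensim 4.x and a toy in-memory corpus (real training data would be far larger):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word (the toy corpus is tiny)
    sg=1,            # 1 = skip-gram, 0 = CBOW
    negative=5,      # number of negative samples per positive pair
)

vec = model.wv["cat"]                # the learned 50-d vector for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbours in vector space
```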
3 Methods:
These include neural networks, dimensionality reduction applied to the word co-occurrence matrix, probabilistic models, and explicit representation of words in terms of the contexts in which they appear.
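A minimal sketch of the dimensionality-reduction route, assuming a toy corpus and plain truncated SVD (this is the idea behind LSA-style embeddings, not any particular library's pipeline):

```python
import numpy as np

corpus = [
    ["the", "cat", "sat"],
    ["the", "dog", "sat"],
    ["a", "cat", "ran"],
]

# Build the vocabulary and a symmetric co-occurrence matrix counting
# word pairs that appear within one position of each other.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - 1), min(len(sent), i + 2)):
            if i != j:
                C[idx[w], idx[sent[j]]] += 1

# Truncated SVD: keep the top-k singular directions; each row of
# U[:, :k] * S[:k] is then a k-dimensional embedding of one word.
k = 2
U, S, Vt = np.linalg.svd(C)
embeddings = U[:, :k] * S[:k]
for word, vec in zip(vocab, embeddings):
    print(word, vec.round(2))
```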
4 Details on neural networks: word2vec:
CBOW (continuous bag-of-words): predicts the center word from the average of its surrounding context-word vectors (see the first sketch after this list).
Skip-Gram model: the reverse of CBOW; predicts each surrounding context word from the center word.
Negative Sampling: a training trick that avoids computing the full softmax over the vocabulary; the model instead learns to distinguish each true (center, context) pair from k randomly sampled noise words (see the second sketch after this list).
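First, a minimal CBOW forward pass in NumPy, with toy sizes and random weights (a sketch of the architecture only; a real implementation would add a training loop over a corpus):

```python
import numpy as np

V, D = 10, 4                      # toy vocabulary size and embedding dimension
rng = np.random.default_rng(0)
W_in = rng.normal(size=(V, D))    # input embeddings, one row per word
W_out = rng.normal(size=(D, V))   # output projection back to the vocabulary

def cbow_forward(context_ids):
    # Average the context-word vectors, then score every vocabulary
    # word as a candidate center word via a (stable) softmax.
    h = W_in[context_ids].mean(axis=0)
    scores = h @ W_out
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

probs = cbow_forward([1, 3, 5, 7])  # hypothetical context word ids
print(probs.argmax())               # the model's guess for the center word
```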
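Second, a sketch of one skip-gram training step with negative sampling, under the same toy assumptions. It follows the standard SGNS objective: push σ(v_c·v_o) toward 1 for the true context word and toward 0 for each sampled noise word (real word2vec draws negatives from a smoothed unigram distribution rather than uniformly):

```python
import numpy as np

V, D, lr, k = 10, 4, 0.05, 3       # vocab size, dimension, learning rate, negatives
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.1, size=(V, D))   # center-word embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sgns_step(center, context):
    # One stochastic update for a (center, context) pair plus k negatives.
    negatives = rng.integers(0, V, size=k)
    v_c = W_in[center].copy()
    grad_c = np.zeros(D)
    pairs = [(context, 1.0)] + [(int(n), 0.0) for n in negatives]
    for word, label in pairs:
        v_o = W_out[word]
        g = sigmoid(v_c @ v_o) - label  # logistic-loss gradient on the score
        grad_c += g * v_o
        W_out[word] -= lr * g * v_c     # update the context-side vector
    W_in[center] -= lr * grad_c         # update the center vector once

sgns_step(center=2, context=5)          # hypothetical word ids from a corpus
```

After training, W_in (or the average of W_in and W_out) serves as the embedding table.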