Compositional phrase embeddings
PyTorch implementation of the skip-gram model that embeds phrases compositionally.
For each phrase, the embeddings of its individual words are passed through an encoder (RNN, CNN, etc.) to produce a single phrase embedding; the surrounding context words are then predicted from that phrase embedding.
This yields an embedding for any phrase made up of known individual words.
Based on the following papers:
- Exploring phrase-compositionality in skip-gram models (Compositional Phrase Embeddings)
- Distributed Representations of Words and Phrases and their Compositionality (Negative Sampling & Subsampling of the frequent words)
- Efficient Estimation of Word Representations in Vector Space (Skip-Gram model)
Supports RNN and CNN encoders for composing phrase embeddings.
Uses negative sampling and is built on the AllenNLP deep-learning-for-NLP library.
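The training objective described above can be sketched as follows. This is a minimal, hypothetical illustration (class and method names are not from this repo) assuming a GRU encoder composes the phrase's word embeddings, and a standard negative-sampling loss predicts context words from the resulting phrase embedding:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PhraseSkipGram(nn.Module):
    """Sketch of compositional phrase skip-gram (hypothetical names).

    A phrase's word embeddings are composed by a GRU into one phrase
    embedding, which is trained to score true context words above
    randomly drawn negative samples (negative sampling).
    """
    def __init__(self, vocab_size, dim):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)      # input-side (phrase-word) embeddings
        self.ctx_emb = nn.Embedding(vocab_size, dim)       # output-side (context-word) embeddings
        self.encoder = nn.GRU(dim, dim, batch_first=True)  # compositional encoder (an RNN here)

    def phrase_embedding(self, phrase_word_ids):
        # phrase_word_ids: (batch, phrase_len) -> phrase embedding (batch, dim)
        _, h = self.encoder(self.word_emb(phrase_word_ids))
        return h.squeeze(0)

    def loss(self, phrase_word_ids, context_ids, negative_ids):
        # Negative-sampling objective: raise sigmoid(p . c_pos),
        # lower sigmoid(p . c_neg) for each negative sample.
        p = self.phrase_embedding(phrase_word_ids)               # (B, D)
        pos = self.ctx_emb(context_ids)                          # (B, D)
        neg = self.ctx_emb(negative_ids)                         # (B, K, D)
        pos_score = torch.sum(p * pos, dim=-1)                   # (B,)
        neg_score = torch.bmm(neg, p.unsqueeze(-1)).squeeze(-1)  # (B, K)
        return -(F.logsigmoid(pos_score).mean()
                 + F.logsigmoid(-neg_score).sum(dim=1).mean())

model = PhraseSkipGram(vocab_size=100, dim=16)
phrases = torch.randint(0, 100, (4, 3))     # batch of 4 three-word phrases
contexts = torch.randint(0, 100, (4,))      # one context word per phrase
negatives = torch.randint(0, 100, (4, 5))   # 5 negative samples per phrase
loss = model.loss(phrases, contexts, negatives)
print(float(loss))
```

Because the phrase embedding is produced by an encoder rather than looked up in a table, any unseen phrase whose words are in the vocabulary still gets an embedding.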