
Natural language processing topics, including RNNs, attention, LSTMs, gated recurrent units, etc.; representation learning for natural language, such as embedding techniques and language models


xliu93/NLP_with_Representation_Learning


NLP with Representation Learning

Practice Jupyter notebooks covering RNNs, encoder-decoder systems, attention, LSTMs, and more, implemented in PyTorch.

Overview

1. Tutorials

    1. PyTorch Basics
    2. Logistic Regression (sklearn/PyTorch)
    3. Deep Learning Training Workflow
    4. Building a Bag-of-Words Model
    5. Document Classification with FastText
    6. RNN/CNN-based Language Classification
    7. Intrinsic Evaluation of Word Vectors (GloVe/FastText)
    8. Language Modeling with KenLM
    9. Language Modeling with Byte Pair Encoding
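As a taste of the notebooks above, a bag-of-words document classifier can be sketched in a few lines of PyTorch. This is an illustrative sketch, not the notebook code; the class name, vocabulary size, and dimensions are made up:

```python
import torch
import torch.nn as nn

class BagOfWordsClassifier(nn.Module):
    """Averages word embeddings into one document vector, then applies a linear layer."""
    def __init__(self, vocab_size, embed_dim, num_classes):
        super().__init__()
        # EmbeddingBag computes the mean of the word embeddings in one shot
        self.embedding = nn.EmbeddingBag(vocab_size, embed_dim, mode="mean")
        self.fc = nn.Linear(embed_dim, num_classes)

    def forward(self, token_ids, offsets):
        return self.fc(self.embedding(token_ids, offsets))

# Two documents packed into one flat tensor; offsets mark where each doc starts
model = BagOfWordsClassifier(vocab_size=100, embed_dim=16, num_classes=2)
tokens = torch.tensor([3, 14, 15, 9, 2, 6])  # doc1: [3, 14, 15], doc2: [9, 2, 6]
offsets = torch.tensor([0, 3])
logits = model(tokens, offsets)              # shape: (2, 2) — one row of class scores per doc
```

`nn.EmbeddingBag` avoids padding: variable-length documents are concatenated and separated by offsets, which keeps the model simple and fast for bag-of-words baselines.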

2. Experiments

  • Bag-of-Ngram and Document Classification
  • RNN/CNN-based Natural Language Inference
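The RNN-based NLI experiment can be sketched roughly as a shared recurrent encoder applied to both premise and hypothesis, whose final states are combined and classified. This is only an illustrative sketch under assumed names and dimensions, not the experiment code:

```python
import torch
import torch.nn as nn

class RNNEncoderNLI(nn.Module):
    """Encodes premise and hypothesis with a shared GRU, classifies their concatenation."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def encode(self, token_ids):
        # Keep only the final hidden state as the sentence representation
        _, h = self.encoder(self.embedding(token_ids))
        return h.squeeze(0)  # (batch, hidden_dim)

    def forward(self, premise, hypothesis):
        combined = torch.cat([self.encode(premise), self.encode(hypothesis)], dim=1)
        return self.classifier(combined)

model = RNNEncoderNLI(vocab_size=100, embed_dim=16, hidden_dim=32)
premise = torch.randint(0, 100, (4, 7))     # batch of 4 premises, 7 tokens each
hypothesis = torch.randint(0, 100, (4, 5))  # matching hypotheses, 5 tokens each
logits = model(premise, hypothesis)         # (4, 3): entailment / neutral / contradiction
```

Sharing the encoder between the two sentences halves the parameter count and forces premise and hypothesis into the same representation space, a common choice for NLI baselines.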

Get started

(for macOS)

  1. Download and install conda (Python 3.6)

    • Xcode is assumed
    • run `bash [installer_file_that_ends_with.sh]`
    • run `conda list` to confirm that the installation succeeded
  2. Setup conda environment and install jupyter notebook

    $ conda create -n learn_nlp python=3.6
    $ conda activate learn_nlp
    $ conda install jupyter notebook matplotlib scikit-learn
    $ conda install -c conda-forge jupyterlab  # very helpful
    
  3. Install PyTorch

    conda install pytorch torchvision -c pytorch
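Once installed, a quick sanity check from inside the `learn_nlp` environment confirms that PyTorch imports and runs:

```python
import torch

# Build a small tensor and run one operation to verify the install
x = torch.ones(2, 3)
print(torch.__version__)
print((x + x).sum().item())  # 12.0
```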
    
