This repository stores implementations of various sequence-to-sequence models in PyTorch.

10-zin/Seq2Seq

Seq2Seq

A collection of various types of seq2seq models. A sequence-to-sequence model takes a sequence from one domain as input and produces a sequence in another domain as output. It uses an encoder-decoder architecture: the encoder maps the input sequence to a context representation, which the decoder then unrolls into the output sequence.
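The encoder-decoder flow can be sketched as follows. This is a minimal illustration, not the repository's actual code; all class names, vocabulary sizes, and dimensions are made up for the example.

```python
import torch
import torch.nn as nn

# Hypothetical hyperparameters, chosen only for illustration.
SRC_VOCAB, TGT_VOCAB, EMB, HID = 100, 120, 32, 64

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(SRC_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)

    def forward(self, src):
        # src: (batch, src_len) token ids -> context: (1, batch, HID)
        _, context = self.rnn(self.embed(src))
        return context

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(TGT_VOCAB, EMB)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, TGT_VOCAB)

    def forward(self, tgt, context):
        # The encoder's context vector seeds the decoder's hidden state.
        output, _ = self.rnn(self.embed(tgt), context)
        return self.out(output)  # (batch, tgt_len, TGT_VOCAB) logits

src = torch.randint(0, SRC_VOCAB, (2, 7))  # batch of 2 source sequences
tgt = torch.randint(0, TGT_VOCAB, (2, 5))  # shifted target tokens
logits = Decoder()(tgt, Encoder()(src))
print(logits.shape)  # torch.Size([2, 5, 120])
```

During training the decoder typically receives the shifted target sequence (teacher forcing); at inference it would feed its own predictions back in, one token at a time.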

A prominent application is machine translation, where the model learns to translate input text from a source language into a target language.

The encoder and decoder are each recurrent networks, an LSTM or a GRU in this repository.
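The two cell types are nearly interchangeable in PyTorch; the only interface difference is that an LSTM carries a cell state alongside the hidden state. A quick comparison with illustrative sizes (not taken from the repo):

```python
import torch
import torch.nn as nn

x = torch.randn(2, 7, 32)  # (batch, seq_len, input_size)

lstm = nn.LSTM(32, 64, num_layers=2, batch_first=True)
gru = nn.GRU(32, 64, num_layers=2, batch_first=True)

# An LSTM returns both a hidden state and a cell state...
out_l, (h_l, c_l) = lstm(x)
# ...while a GRU returns only a hidden state; otherwise the API matches.
out_g, h_g = gru(x)

print(out_l.shape, h_l.shape, c_l.shape)  # (2, 7, 64) (2, 2, 64) (2, 2, 64)
print(out_g.shape, h_g.shape)             # (2, 7, 64) (2, 2, 64)
```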

Here, I implement three types of sequence to sequence models with increasing complexity and performance.

  1. Seq2SeqExample - multi-layer LSTM-based encoder-decoder.
  2. Seq2SeqPR - context-vector-dependent GRU-based encoder-decoder.
  3. Seq2SeqAttention - attention-based GRU encoder-decoder.

Each directory includes the script for the respective implementation.
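The attention variant (model 3) replaces the single fixed context vector with a weighted sum over all encoder states, recomputed at every decoding step. A minimal dot-product attention sketch, with names and sizes that are illustrative rather than the repo's:

```python
import torch
import torch.nn.functional as F

# encoder_outputs: one hidden state per source position, (batch, src_len, hid)
# decoder_hidden:  current decoder state, (batch, hid)
encoder_outputs = torch.randn(2, 7, 64)
decoder_hidden = torch.randn(2, 64)

# Score each source position against the decoder state (dot-product attention).
scores = torch.bmm(encoder_outputs, decoder_hidden.unsqueeze(2)).squeeze(2)  # (2, 7)
weights = F.softmax(scores, dim=1)  # attention distribution over source positions

# Context for this step: attention-weighted sum of the encoder states.
context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)  # (2, 64)

print(weights.shape, context.shape)
```

Because the context is rebuilt at each step, the decoder can focus on different source positions for different output tokens, which is the main performance gain over the fixed-context models.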
