
OpenSeq2Seq

OpenSeq2Seq: toolkit for distributed and mixed precision training of sequence-to-sequence models

This is a research project, not an official NVIDIA product.

Documentation: https://nvidia.github.io/OpenSeq2Seq/

OpenSeq2Seq's main goal is to let researchers explore sequence-to-sequence models as efficiently as possible. That efficiency comes from full support for distributed and mixed-precision training. OpenSeq2Seq is built on TensorFlow and provides the building blocks needed to train encoder-decoder models for neural machine translation and automatic speech recognition. We plan to extend it to other modalities in the future.
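
Concretely, a model is described by a Python configuration file that selects a model class and its hyperparameters, and training is launched through a runner script. The sketch below is a minimal, hypothetical config: the class name, parameter keys, and launch command are assumptions based on the documentation linked above, not a verbatim example.

# example_config.py -- minimal, hypothetical OpenSeq2Seq config sketch.
# Class names and parameter keys are assumptions; see the documentation
# for real, complete example configs.
from open_seq2seq.models import Text2Text

base_model = Text2Text              # encoder-decoder model for translation

base_params = {
    "batch_size_per_gpu": 32,       # per-replica batch size
    "num_gpus": 4,                  # data-parallel training on 4 GPUs
    "max_steps": 100000,
    "logdir": "experiments/nmt-small",
}

# Launch (assumed invocation):
#   python run.py --config_file=example_config.py --mode=train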

Features

  1. Sequence to sequence learning
    1. Neural Machine Translation
    2. Automatic Speech Recognition
  2. Data-parallel distributed training
    1. Multi-GPU
    2. Multi-node
  3. Mixed precision training for NVIDIA Volta GPUs (see the configuration sketch after this list)
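
Distributed execution and mixed precision are enabled through configuration rather than model-code changes. The fragment below is a hedged sketch of what such a configuration might look like; the parameter keys ("use_horovod", "dtype", "loss_scaling") are assumptions based on the documentation, not guaranteed API.

# Hypothetical fragment of a config's base_params (keys are assumptions):
base_params = {
    "use_horovod": True,        # data-parallel training driven by Horovod/MPI
    "dtype": "mixed",           # float16 compute with float32 master weights
    "loss_scaling": "Backoff",  # dynamic loss scaling guards against fp16 underflow
}

With Horovod, the same training script is launched once per GPU via mpirun, and each process works on its own shard of the data.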

Requirements

  1. TensorFlow >= 1.7
  2. Horovod >= 0.12.0 (Horovod is not required, but is highly recommended for multi-GPU setups; a quick sanity check follows this list)
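
To confirm both requirements are met in the active environment, a quick check (a sketch; assumes Horovod was built with TensorFlow support):

# Sanity-check the requirements above inside your Python environment.
import tensorflow as tf
import horovod.tensorflow as hvd

print("TensorFlow:", tf.__version__)   # expect >= 1.7

hvd.init()                             # initialize Horovod (one process per GPU)
print("Horovod rank", hvd.rank(), "of", hvd.size())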

Acknowledgments

The speech-to-text workflow uses some parts of the Mozilla DeepSpeech project.

The text-to-text workflow uses some functions from Tensor2Tensor and the Neural Machine Translation (seq2seq) Tutorial.

Paper

If you use OpenSeq2Seq, please cite this paper:

@article{openseq2seq,
  title={OpenSeq2Seq: extensible toolkit for distributed and mixed precision training of sequence-to-sequence models},
  author={Kuchaiev, Oleksii and Ginsburg, Boris and Gitman, Igor and Lavrukhin, Vitaly and Case, Carl and Micikevicius, Paulius},
  journal={arXiv preprint arXiv:1805.10387},
  year={2018}
}
