morphological-reinflection

Source code for the paper "Sequence to Sequence Transduction with Hard Monotonic Attention" (also titled "Morphological Inflection Generation with Hard Monotonic Attention").

Requires DyNet (https://github.com/clab/dynet).
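
DyNet is available from PyPI, so an installation along the following lines should work; the exact DyNet version this code expects is not specified in the README, so treat this as a starting point rather than a verified setup:

pip install dynet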

Usage:

hard_attention.py [--dynet-mem MEM] [--input=INPUT] [--hidden=HIDDEN] [--feat-input=FEAT] [--epochs=EPOCHS] [--layers=LAYERS] [--optimization=OPTIMIZATION] [--reg=REGULARIZATION] [--learning=LEARNING] [--plot] [--eval] [--ensemble=ENSEMBLE] TRAIN_PATH DEV_PATH TEST_PATH RESULTS_PATH SIGMORPHON_PATH...

Arguments:

  • TRAIN_PATH train set path (see the data format note after this list)
  • DEV_PATH development set path
  • TEST_PATH test set path
  • RESULTS_PATH results file (to be written)
  • SIGMORPHON_PATH sigmorphon root containing data, src dirs
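
The train/dev/test files are assumed to follow the SIGMORPHON 2016 task 1 format: one example per line, with the lemma, the target feature string, and the inflected form separated by tabs. The line below is an illustrative, made-up example of that layout, not actual shared task data:

fliegen	pos=V,mood=IND,tense=PRS,per=3,num=SG	fliegt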

Options:

  • -h --help show this help message and exit
  • --dynet-mem MEM memory (in MB) to allocate for DyNet
  • --input=INPUT input embeddings dimension
  • --hidden=HIDDEN lstm hidden layer dimension
  • --feat-input=FEAT feature embeddings dimension
  • --epochs=EPOCHS number of training epochs
  • --layers=LAYERS number of LSTM layers
  • --optimization=OPTIMIZATION chosen optimization method ADAM/SGD/ADAGRAD/MOMENTUM/ADADELTA
  • --reg=REGULARIZATION regularization parameter for optimization
  • --learning=LEARNING learning rate parameter for optimization
  • --plot draw a learning curve plot while training each model
  • --eval run evaluation on an existing model (without training)
  • --ensemble=ENSEMBLE ensemble model paths, separated by comma

For example:

python hard_attention.py --dynet-mem 4096 --input=100 --hidden=100 --feat-input=20 --epochs=100 --layers=2 --optimization=ADADELTA /Users/roeeaharoni/research_data/sigmorphon2016-master/data/navajo-task1-train /Users/roeeaharoni/research_data/sigmorphon2016-master/data/navajo-task1-dev /Users/roeeaharoni/research_data/sigmorphon2016-master/data/navajo-task1-test /Users/roeeaharoni/Dropbox/phd/research/morphology/inflection_generation/results/navajo_results.txt /Users/roeeaharoni/research_data/sigmorphon2016-master/
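
To evaluate previously trained models without retraining, --eval can be combined with --ensemble. The following is a sketch rather than a tested invocation: the model paths after --ensemble are hypothetical placeholders, and the positional arguments and network dimensions are assumed to match those used at training time:

python hard_attention.py --dynet-mem 4096 --input=100 --hidden=100 --feat-input=20 --layers=2 --eval --ensemble=/path/to/model1,/path/to/model2 TRAIN_PATH DEV_PATH TEST_PATH RESULTS_PATH SIGMORPHON_PATH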

If you use this code for research purposes, please use the following citation:

@article{aharoni2016sequence,
    title={Sequence to Sequence Transduction with Hard Monotonic Attention},
    author={Aharoni, Roee and Goldberg, Yoav},
    journal={arXiv preprint arXiv:1611.01487},
    year={2016}
}
