This is the code used for the experiments in the paper "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks". The sentiment analysis experiment relies on a fork of keras which implements Bayesian LSTM, Bayesian GRU, embedding dropout, and MC dropout. The language model experiment extends wojzaremba's lua code.
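The MC dropout part works by keeping dropout switched on at test time and averaging the predictions of several stochastic forward passes. As a rough illustration of the idea (not the fork's API), here is a minimal sketch assuming the modern tf.keras interface, where calling a model with training=True keeps the dropout masks active:

```python
# Minimal MC-dropout sketch (assumes tf.keras, not the keras fork used for the paper).
import numpy as np
import tensorflow as tf

def mc_dropout_predict(model, x, n_samples=20):
    """Average predictions over stochastic forward passes with dropout kept on."""
    # training=True keeps dropout active at test time, giving one sample per pass.
    preds = np.stack([model(x, training=True).numpy() for _ in range(n_samples)])
    return preds.mean(axis=0), preds.std(axis=0)  # predictive mean and spread
```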

Update 1 (Feb 22):

keras now supports dropout in RNNs following the implementation above. A simplified example of the sentiment analysis experiment using the latest keras implementation is given here.
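For reference, a minimal sketch of such a sentiment model using the built-in arguments (dropout for the inputs, recurrent_dropout for the recurrent connections, with one mask reused across time steps); the dataset, layer sizes and dropout rates below are illustrative rather than the exact settings of the original experiment:

```python
# Illustrative sentiment model with recurrent (variational) dropout in tf.keras.
# Hyperparameter values are placeholders, not the paper's exact settings.
import tensorflow as tf
from tensorflow.keras import layers, models

max_features, maxlen = 20000, 80
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.imdb.load_data(num_words=max_features)
x_train = tf.keras.preprocessing.sequence.pad_sequences(x_train, maxlen=maxlen)
x_test = tf.keras.preprocessing.sequence.pad_sequences(x_test, maxlen=maxlen)

model = models.Sequential([
    layers.Embedding(max_features, 128),
    # dropout acts on the layer inputs, recurrent_dropout on the recurrent state;
    # both reuse a single mask across time steps, following the variational scheme.
    layers.LSTM(128, dropout=0.25, recurrent_dropout=0.25),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam", metrics=["accuracy"])
model.fit(x_train, y_train, batch_size=128, epochs=2, validation_data=(x_test, y_test))
```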

Update 2 (March 28):

The script main_new_dropout_SOTA implements Bayesian LSTM (Gal, 2015) for the large model of Zaremba et al. (2014). In the setting of Zaremba et al., the states are not reset and testing is done with a single pass through the test set. The only changes I've made to the setting of Zaremba et al. are:

  1. the dropout technique (a Bayesian LSTM is used instead)
  2. weight decay (which was chosen to be zero in Zaremba et al.)
  3. a slightly smaller network (1250 units per layer instead of 1500), to fit on my GPU

All other hyperparameters are identical to Zaremba et al.'s: the learning rate decay schedule was not tuned for my setting and follows Zaremba et al., and the sequences are initialised with the previous state, following Zaremba et al. (unlike in main_dropout.lua). Dropout parameters were optimised with grid search (tying dropout_x & dropout_h, and dropout_i & dropout_o) over validation perplexity (the optimal values are 0.3 and 0.5, compared to Zaremba et al.'s 0.6).
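To make the tying explicit, here is a small sketch of that grid search; train_and_validate is a hypothetical stand-in for running the language-model training with the given dropout rates and returning validation perplexity, and is not part of this repository:

```python
# Sketch of the tied grid search over dropout rates.
import itertools

def train_and_validate(dropout_x, dropout_h, dropout_i, dropout_o):
    # Hypothetical stand-in: train the Bayesian LSTM with these dropout rates
    # and return the resulting validation perplexity.
    raise NotImplementedError

candidates = [0.1, 0.2, 0.3, 0.4, 0.5, 0.6]
best = None
for p_xh, p_io in itertools.product(candidates, repeat=2):
    # Tie dropout_x with dropout_h, and dropout_i with dropout_o.
    ppl = train_and_validate(dropout_x=p_xh, dropout_h=p_xh,
                             dropout_i=p_io, dropout_o=p_io)
    if best is None or ppl < best[0]:
        best = (ppl, p_xh, p_io)

print("best validation perplexity %.2f at dropout_x=dropout_h=%.1f, dropout_i=dropout_o=%.1f" % best)
```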

Single-model test perplexity improves from Zaremba et al.'s 78.4 to 76.5, and validation perplexity is reduced from 82.2 to 79.1; see the log.

References:

  • Gal, Y., "A Theoretically Grounded Application of Dropout in Recurrent Neural Networks", 2015.
  • Zaremba, W., Sutskever, I., Vinyals, O., "Recurrent Neural Network Regularization", 2014.
