VictorProkhorov/HSVAE

This repository contains an implementation of the Hierarchical Sparse Variational Autoencoder (HSVAE) model presented in the paper "Learning Sparse Sentence Encoding without Supervision: An Exploration of Sparsity in Variational Autoencoders".

Table of contents

  1. Types of Sparsity
  2. Model
  3. HSVAE usage
  4. MAT-VAE usage
  5. Numerical stability
  6. Citing
  7. Licence
  8. Contact info

Types of Sparsity

In this paper we explore adaptive (a.k.a. ephemeral) sparsity. (See the figure in the repository illustrating the types of sparsity.)

Model

(See the model architecture figure in the repository.)

HSVAE usage

To train a new HSVAE model:

$ cd ./Scripts/Models/

then run:

python3 hsvae.py --z_reg_weight 0.01 --gamma_reg_weight 0.01 --temperature 0.5  --alpha 4.  --beta 1. --iter 1

--z_reg_weight is the weight of the second term of the ELBO
--gamma_reg_weight is the weight of the third term of the ELBO
Adjust --alpha and --beta to achieve the desired level of sparsity. Consult the paper for more information.
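As a rough illustration of how these flags enter the objective, here is a minimal, hypothetical sketch of a weighted ELBO-style loss. The function and term names are assumptions for illustration only, not the repository's actual code:

```python
# Hypothetical sketch: how --z_reg_weight and --gamma_reg_weight scale
# the second and third terms of the training objective.
def weighted_elbo_loss(rec_loss, kl_z, kl_gamma,
                       z_reg_weight=0.01, gamma_reg_weight=0.01):
    # first term: reconstruction loss
    # second term (scaled by z_reg_weight): regulariser over z
    # third term (scaled by gamma_reg_weight): regulariser over gamma
    return rec_loss + z_reg_weight * kl_z + gamma_reg_weight * kl_gamma

loss = weighted_elbo_loss(rec_loss=1.2, kl_z=0.3, kl_gamma=0.5)
# loss == 1.2 + 0.01 * 0.3 + 0.01 * 0.5 == 1.208
```

Setting either weight to 0 removes the corresponding regularisation term entirely, which can be useful for ablations.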

MAT-VAE usage

To train a new MAT-VAE model:

$ cd ./Scripts/Models/

then run:

python3 mat_vae.py  --alpha 4.  --beta 1. --iter 1

Adjust --alpha (weight on the third term of the ELBO) and --beta (weight on the second term of the ELBO) to achieve the desired level of sparsity. Consult the paper for more information.

Numerical stability

During training of an HSVAE model you may experience a numerical stability problem: the loss function becomes NaN. We discovered this while experimenting with various encoder and decoder architectures and latent embedding dimensions. To rectify the problem, consider replacing lines 278 and 279 in hsvae.py with:

q_alpha = tfp.math.clip_by_value_preserve_gradient(self.q_alpha(output), 0.5, 100)
q_beta = tfp.math.clip_by_value_preserve_gradient(self.q_beta(output), 0.5, 100)
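For intuition, clip_by_value_preserve_gradient clips values in the forward pass but lets gradients flow through as if no clipping had occurred, so the concentration parameters stay in a numerically safe range without cutting off the learning signal. Below is a minimal NumPy sketch of this behaviour; the identity gradient shown as ones is a conceptual stand-in, not TFP's actual implementation:

```python
import numpy as np

def clip_preserve_gradient(x, lo=0.5, hi=100.0):
    """Forward pass: clip values to [lo, hi].
    Backward pass (conceptually): identity gradient, so values pushed
    outside the range still receive a learning signal."""
    y = np.clip(x, lo, hi)
    grad = np.ones_like(x)  # straight-through: gradient passes unchanged
    return y, grad

y, g = clip_preserve_gradient(np.array([-1.0, 2.0, 500.0]))
# y -> [0.5, 2.0, 100.0]; g -> [1.0, 1.0, 1.0]
```

A plain tf.clip_by_value would instead zero the gradient for any clipped value, which can stall training when the network drives the parameters out of range early on.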

Citing

If you find this material useful in your research, please cite:

@inproceedings{prokhorov2020hierarchical,
    title = "Learning Sparse Sentence Encoding without Supervision: An Exploration of Sparsity in Variational Autoencoders",
    author = "Prokhorov, Victor and
      Li, Yingzhen and
      Shareghi, Ehsan and
      Collier, Nigel",
    booktitle = "Proceedings of the 6th Workshop on Representation Learning for NLP (RepL4NLP-2021)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.repl4nlp-1.5",
    doi = "10.18653/v1/2021.repl4nlp-1.5",
    pages = "34--46"
}

Licence

The code in this repository is free software: you can redistribute it and/or modify it under the terms of the GNU General Public License version 3 as published by the Free Software Foundation. The code is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.

Contact info

For questions or more information, please use the following:
