LG Paper Review Q&A Session - Knowledge Distillation
- Distilling the Knowledge in a Neural Network, NIPS'15 (https://arxiv.org/abs/1503.02531) (see the loss sketch after this list)
- Knowledge Distillation by On-the-Fly Native Ensemble, NIPS'18 (https://arxiv.org/abs/1806.04606) (ONE)
- Regularizing Class-wise Predictions via Self-knowledge Distillation, CVPR'20 (https://arxiv.org/abs/2003.13964)
- Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation, ICCV'19 (https://arxiv.org/abs/1905.08094)
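
For reference during the Q&A, here is a minimal sketch of the temperature-scaled soft-target loss that the first paper (Hinton et al.) introduces and that the self-distillation papers build on. It assumes PyTorch; the values of `T` and `alpha` are illustrative hyperparameters, not numbers taken from any of the papers above.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, targets, T=4.0, alpha=0.9):
    """Soft-target distillation loss in the style of Hinton et al. (2015).

    T (temperature) and alpha (soft/hard mixing weight) are illustrative
    defaults, not values reported in the papers listed above.
    """
    # Softened teacher distribution and student log-distribution at temperature T.
    soft_teacher = F.softmax(teacher_logits / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    # KL term is scaled by T^2 so its gradient magnitude stays comparable
    # to the hard-label term as T changes.
    soft_loss = F.kl_div(log_soft_student, soft_teacher,
                         reduction="batchmean") * (T ** 2)
    # Ordinary cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, targets)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

Here `student_logits` and `teacher_logits` are the pre-softmax outputs of the two networks on the same batch; in the self-distillation settings (ONE, CS-KD, Be Your Own Teacher) the "teacher" signal instead comes from the model itself (an ensemble branch, a same-class sample, or a deeper layer).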