
jaychoi12/LG_KD


LG_KD

LG paper review QnA session - Knowledge Distillation

Paper List (Code)

  1. Distilling the Knowledge in a Neural Network, 15'NIPS (https://arxiv.org/abs/1503.02531)

  2. Knowledge Distillation by On-the-Fly Native Ensemble, 18'NIPS (https://arxiv.org/abs/1806.04606) (ONE/)

  3. Regularizing Class-wise Predictions via Self-knowledge Distillation, 20'CVPR

  4. Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation, 19'ICCV
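The core idea shared by these papers is the soft-target loss introduced in paper 1: the student is trained to match the teacher's temperature-scaled output distribution, with the KL term scaled by T² so its gradient magnitude stays comparable to the hard-label loss. A minimal sketch in plain Python (function names and the example logits are illustrative, not taken from any of the listed codebases):

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T produces a softer distribution,
    # exposing the "dark knowledge" in the teacher's non-target logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # multiplied by T^2 as in Hinton et al. (2015) so that gradient
    # magnitudes are roughly independent of the chosen temperature.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# Illustrative logits for a 3-class problem.
teacher = [10.0, 2.0, 1.0]
student = [8.0, 3.0, 2.0]
loss = distillation_loss(student, teacher)
```

In practice this term is combined with the ordinary cross-entropy on the ground-truth labels via a weighting factor; papers 2–4 vary where the teacher signal comes from (an on-the-fly ensemble, the model's own other heads, or its deeper layers) rather than the loss itself.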


TAs
