
bert

Modification of the official BERT for downstream tasks.

Supports OQMRC, LCQMC, knowledge distillation, adversarial perturbation (see the sketch below), and BERT+ESIM for multiple-choice, classification, and semantic matching tasks.
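As a rough illustration of adversarial perturbation on embeddings, here is a minimal FGM-style training step in PyTorch. The repo itself builds on the TensorFlow BERT codebase, so everything here (`fgm_adversarial_step`, `embedding_name`, `epsilon`) is an illustrative assumption, not the project's actual API:

```python
# Illustrative FGM-style adversarial training step (PyTorch sketch, not the
# repo's TensorFlow code); names and defaults are assumptions.
import torch

def fgm_adversarial_step(model, loss_fn, batch,
                         embedding_name="word_embeddings", epsilon=1.0):
    # 1) Clean forward/backward to obtain gradients on the embedding table.
    loss = loss_fn(model(**batch))
    loss.backward()

    # 2) Perturb embedding weights along the (normalized) gradient direction.
    backup = {}
    for name, param in model.named_parameters():
        if param.requires_grad and embedding_name in name and param.grad is not None:
            backup[name] = param.data.clone()
            norm = torch.norm(param.grad)
            if norm != 0:
                param.data.add_(epsilon * param.grad / norm)

    # 3) Adversarial forward/backward; gradients accumulate with the clean pass.
    adv_loss = loss_fn(model(**batch))
    adv_loss.backward()

    # 4) Restore the original embeddings before the optimizer step.
    for name, param in model.named_parameters():
        if name in backup:
            param.data = backup[name]
    return loss.item(), adv_loss.item()
```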

On OQMRC we reach 0.787 on the dev set; on LCQMC we reach 0.864 on the test set. Knowledge distillation also supports self-distillation (a sketch of the loss follows).
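A minimal sketch of a distillation loss usable for self-distillation (where the teacher is, e.g., an earlier snapshot of the same model): soften both sets of logits with a temperature and mix the KL term with the hard-label cross-entropy. The function and argument names are assumptions for illustration, not the repo's interface:

```python
# Distillation loss sketch: KL on temperature-softened logits plus
# cross-entropy on gold labels. Names/defaults are illustrative assumptions.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    # Soft targets from the teacher (for self-distillation: the same model
    # at an earlier training stage).
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # KL divergence scaled by T^2 to keep gradient magnitudes comparable.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    # Standard cross-entropy on the gold labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```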

Supports task pretraining + fine-tuning.

For a downstream task, we add masked LM as an auxiliary loss; this can be seen as denoising, similar to word dropout, and yields more robust performance.
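A minimal sketch of combining the downstream loss with a masked-LM auxiliary term; the `mlm_weight` coefficient and the two heads are illustrative assumptions, not the project's actual code:

```python
# Joint loss sketch: downstream cross-entropy plus a weighted masked-LM
# (denoising) term. Names and the 0.1 weight are illustrative assumptions.
import torch
import torch.nn.functional as F

def joint_loss(task_logits, labels, mlm_logits, mlm_labels, mlm_weight=0.1):
    # Downstream task loss (e.g., classification over [CLS]).
    task_loss = F.cross_entropy(task_logits, labels)
    # Masked-LM loss: mlm_labels holds original token ids at masked
    # positions and -100 elsewhere, so unmasked tokens are ignored.
    mlm_loss = F.cross_entropy(
        mlm_logits.view(-1, mlm_logits.size(-1)),  # [batch*seq, vocab]
        mlm_labels.view(-1),                       # [batch*seq]
        ignore_index=-100,
    )
    return task_loss + mlm_weight * mlm_loss
```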

