rdiehlmartinez/attention-analysis


Low-Resource Prediction Using BERT Attention Scores

This repository contains the code accompanying our in-progress workshop paper. We show that the attention distributions of trained BERT models carry enough signal to serve, on their own, as inputs to downstream shallow neural networks. Using these distributions as features reduces the amount of data needed to train classification models.
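To illustrate the general idea, the sketch below summarizes a BERT-style attention tensor into a fixed-size feature vector and passes it through a shallow one-hidden-layer network. This is a minimal, hypothetical example, not the paper's pipeline: the attention tensor is random stand-in data (in practice it would come from a trained BERT model, e.g. via `output_attentions=True` in HuggingFace Transformers), and the pooling scheme, layer sizes, and weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in attention tensor for a 12-layer, 12-head BERT run on a 32-token
# input: shape (layers, heads, seq_len, seq_len). Random data for
# illustration only; rows are normalized so each is a distribution.
attn = rng.random((12, 12, 32, 32))
attn /= attn.sum(axis=-1, keepdims=True)

# Summarize each (layer, head) attention map by its mean row entropy,
# giving a fixed-size vector of 12 * 12 = 144 features. (The entropy
# pooling is one illustrative choice of summary, not the paper's.)
row_entropy = -(attn * np.log(attn)).sum(axis=-1)   # (12, 12, 32)
features = row_entropy.mean(axis=-1).reshape(-1)    # (144,)

# A shallow one-hidden-layer classifier, forward pass only with untrained
# random weights; sizes (32 hidden units, 2 classes) are assumptions.
W1 = rng.standard_normal((144, 32)) * 0.1
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 2)) * 0.1
b2 = np.zeros(2)

hidden = np.maximum(features @ W1 + b1, 0.0)        # ReLU
logits = hidden @ W2 + b2
probs = np.exp(logits - logits.max())
probs /= probs.sum()                                 # softmax over 2 classes

print(features.shape, probs.shape)
```

Because the feature vector has a fixed, small dimensionality regardless of vocabulary or input length, the downstream classifier stays shallow and can be trained on comparatively little labeled data.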

The data can be found here: https://drive.google.com/uc?id=1dfN-WvFMiAWuOXq1VJ_EnpTDGQruWuxm&export=download

About

Use attention distributions to classify types of linguistic bias.
