Closed Book Question Answering using pre-trained Transformers.
This repository collects language models trained to generate answers to questions that require world knowledge, without explicitly providing an external knowledge source (see Roberts et al., 2020, "How Much Knowledge Can You Pack Into the Parameters of a Language Model?").
- Python > 3.6
$ pip install -r requirements.txt
Available models:
- BART (trained by Sewon Min)
Download:
$ chmod +x download_models.sh; ./download_models.sh
To download the Natural Questions dataset in JSON format, run:
$ chmod +x download_data.sh; ./download_data.sh
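The exact schema of the downloaded JSON is not documented here; a common layout for the open-domain NQ splits is a list of records, each with a question string and a list of acceptable answers. A minimal sketch assuming that layout (the field names and sample record are assumptions, not taken from the repository):

```python
import json

# Hypothetical sample mimicking the assumed nqopen JSON layout:
# a list of {"question": str, "answer": [str, ...]} records.
sample = [
    {"question": "who wrote the opera carmen", "answer": ["Georges Bizet"]},
]

# Round-trip through JSON, as a loader for data/nqopen-test.json might.
records = json.loads(json.dumps(sample))
for r in records:
    print(r["question"], "->", r["answer"][0])
```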
To evaluate a pre-trained model on an NQ dataset file, run:
$ python3 main.py --model bart --predict_file data/nqopen-test.json
This script parses the dataset JSON, loads the downloaded model's state, runs each question through the model, and prints the prediction alongside the correct answer.
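Open-domain QA predictions are typically scored with normalized exact match: prediction and gold answers are lowercased, stripped of punctuation and articles, and compared. The repository's own metric code is not shown here; this is a sketch of the standard normalization (function names are illustrative):

```python
import re
import string

def normalize(text):
    # Lowercase, drop punctuation, drop the articles a/an/the, squeeze whitespace.
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction, gold_answers):
    # A prediction counts as correct if it matches any acceptable gold answer.
    return any(normalize(prediction) == normalize(g) for g in gold_answers)

print(exact_match("The Beatles!", ["Beatles", "the beatles"]))  # -> True
```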
If a prediction run on a file completes successfully, a log file is written to the logs directory.
To run in interactive mode:
$ python3 main.py --model bart --interactive
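In interactive mode the script presumably reads questions from stdin in a loop and prints the model's answer for each. A minimal sketch of such a loop, with a placeholder standing in for the real BART predictor (`answer_question` is a stand-in, not the repository's API):

```python
def answer_question(question):
    # Placeholder: the real implementation would tokenize the question,
    # run BART's generate(), and decode the output text.
    return "<model answer for: %s>" % question

def interactive_loop(read=input, write=print):
    # Read questions until EOF, an empty line, or "quit"/"exit".
    while True:
        try:
            question = read("Q: ").strip()
        except EOFError:
            break
        if not question or question.lower() in {"quit", "exit"}:
            break
        write("A:", answer_question(question))
```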