The code of my master's thesis: Benchmarking and architectural analysis of state-of-the-art transformer models for book review summarization
The code inside sum_models/finetuning is taken from the Hugging Face Transformers repository (https://github.com/huggingface/transformers) and adjusted to fine-tune the BART-large-CNN model. The Jupyter notebooks inside sum_models/experiments document the progress of the experiments. sum_models/experiments/revPrep contains the code required for the notebooks to run.