Abstractive Text Summarization Using Sequence-to-Sequence Model With Modified Attention

A project written to obtain the License Degree

How To Use The Model

Everything you need to know is included in the tutorial.ipynb file. To start training, simply run all the cells in the tutorial notebook. However, if you want to use pre-trained embeddings, you need to download the embedding files manually from https://nlp.stanford.edu/projects/glove/. Extract the files and put them in the glove directory. Bear in mind that only glove.6B is supported.
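
The snippet below is a minimal sketch (not part of the repository) of how the manually downloaded archive could be unpacked into the glove directory and one of the glove.6B files loaded. The file names glove.6B.zip and glove.6B.100d.txt, and the placement of the glove directory at the repository root, are assumptions based on the standard GloVe release and the instructions above.

```python
import os
import zipfile

import numpy as np

GLOVE_DIR = "glove"
ARCHIVE = "glove.6B.zip"  # downloaded manually from https://nlp.stanford.edu/projects/glove/
EMBEDDING_FILE = os.path.join(GLOVE_DIR, "glove.6B.100d.txt")  # assumed 100-dimensional variant

# Extract the archive into the glove/ directory if it has not been extracted yet.
if not os.path.exists(EMBEDDING_FILE) and os.path.exists(ARCHIVE):
    with zipfile.ZipFile(ARCHIVE) as zf:
        zf.extractall(GLOVE_DIR)

# Load the embeddings into a simple {word: vector} dictionary.
embeddings = {}
with open(EMBEDDING_FILE, encoding="utf-8") as f:
    for line in f:
        parts = line.rstrip().split(" ")
        embeddings[parts[0]] = np.asarray(parts[1:], dtype="float32")

print(f"Loaded {len(embeddings)} word vectors of dimension "
      f"{len(next(iter(embeddings.values())))}")
```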
