A web-based text summarization tool built with Flask and Hugging Face Transformers. Users can input articles and select from multiple fine-tuned models — T5 (CNN/DailyMail, PubMed, XSum), BART, and Pegasus — to generate concise summaries.
- Summarize free-form articles
- Choose from multiple Transformer models:
  - T5
    - CNN/DailyMail
    - PubMed
    - XSum
  - BART (trained on all datasets combined)
  - Pegasus (trained on all datasets combined)
- Clean, responsive user interface
- Efficient backend using PyTorch and Transformers
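
Under the hood, each model choice resolves to a Hugging Face checkpoint directory. Below is a minimal sketch of the summarization step, assuming the checkpoints live in the directories listed under "Download or fine-tune models" further down; the directory name and generation settings are illustrative, not the app's exact code:

```python
# Minimal summarization sketch with a locally saved fine-tuned T5 checkpoint.
# The directory name and generation parameters are illustrative.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_dir = "t5_cnn_dailymail_finetuned"
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForSeq2SeqLM.from_pretrained(model_dir)

article = "Your article text here..."
# T5 checkpoints are conventionally prompted with a task prefix.
inputs = tokenizer("summarize: " + article, return_tensors="pt",
                   truncation=True, max_length=512)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_length=150, num_beams=4)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```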
- **Clone the repository**

  ```bash
  git clone https://github.com/HassanSiddique2967/Abstractive-Text-Summarization-for-News-Research-Papers.git
  cd text-summarization-app
  ```
- **Create and activate a virtual environment**

  ```bash
  python -m venv summarizer

  # On Windows
  summarizer\Scripts\activate

  # On macOS/Linux
  source summarizer/bin/activate
  ```
- **Install dependencies**

  ```bash
  pip install -r requirements.txt
  ```
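
  For reference, a minimal `requirements.txt` consistent with the dependency list later in this README; the file shipped in the repository is authoritative and may pin exact versions:

  ```
  flask
  torch
  transformers
  ```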
- **Download or fine-tune models**

  Place your fine-tuned models in the following directories:

  ```
  t5_cnn_dailymail_finetuned/
  t5_pubmed_finetuned/
  t5_xsum_finetuned/
  bart_finetuned_all/
  pegasus_finetuned_all/
  ```
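
  If you fine-tune your own checkpoints, `save_pretrained` produces directories in the layout shown above. A sketch of the export step only; the base model and the elided training loop are placeholders, not the project's actual recipe:

  ```python
  # Export a fine-tuned model and tokenizer into one of the expected
  # directories. "t5-small" is an illustrative base checkpoint.
  from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

  tokenizer = AutoTokenizer.from_pretrained("t5-small")
  model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

  # ... fine-tune `model` on CNN/DailyMail here ...

  model.save_pretrained("t5_cnn_dailymail_finetuned")
  tokenizer.save_pretrained("t5_cnn_dailymail_finetuned")
  ```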
- **Run the app**

  ```bash
  python app.py
  ```

  The app will start at: http://127.0.0.1:5000/
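
To give a sense of the request flow, here is a hedged sketch of what a Flask endpoint wiring the input form to the models could look like. The route, the form field names ("text" and "model"), and the MODELS mapping are assumptions for illustration; the real app.py may be structured differently:

```python
# Hypothetical Flask endpoint: maps the user's model choice to a local
# checkpoint directory and runs a summarization pipeline on the input text.
from flask import Flask, render_template, request
from transformers import pipeline

app = Flask(__name__)

# Assumed mapping from form values to the checkpoint directories above.
MODELS = {
    "t5-cnn": "t5_cnn_dailymail_finetuned",
    "t5-pubmed": "t5_pubmed_finetuned",
    "t5-xsum": "t5_xsum_finetuned",
    "bart": "bart_finetuned_all",
    "pegasus": "pegasus_finetuned_all",
}

@app.route("/", methods=["GET", "POST"])
def index():
    summary = None
    if request.method == "POST":
        text = request.form["text"]
        model_dir = MODELS[request.form["model"]]
        # Loaded per request for brevity; a real app would cache the pipeline.
        summarizer = pipeline("summarization", model=model_dir)
        summary = summarizer(text, max_length=150, min_length=30)[0]["summary_text"]
    return render_template("index.html", summary=summary)

if __name__ == "__main__":
    app.run(debug=True)
```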
Model | Dataset Used | Description
---|---|---
T5 | CNN/DailyMail | News article summarization
T5 | PubMed | Scientific/biomedical summarization
T5 | XSum | Extreme summarization into a single sentence
BART | All datasets | Combined training for general summarization
Pegasus | All datasets | Combined training for high-quality abstractive summarization
- Python 3.8+
- Flask
- torch
- transformers
Refer to `requirements.txt` for the full list.
- Hugging Face Transformers
- Datasets: CNN/DailyMail, PubMed, XSum