News Summarizer

Overview

News Summarizer is an abstractive text summarization project built on the T5 transformer model. The implementation fine-tunes T5 for news summarization: users input a news article and the model generates an abstractive summary. With clear instructions and step-by-step guidance, the project offers an accessible tool for extracting key information from news content and for exploring transformer models in natural language processing.

Model Architecture

(model architecture diagram)

Features

  • T5 Transformer Model: Utilizes the T5 transformer model, a powerful architecture for sequence-to-sequence tasks.
  • News Article Summarization: Given a news article, the model generates an abstractive summary.
  • Easy-to-Use Jupyter Notebook: The implementation is presented in a Jupyter Notebook, making it accessible and easy to understand.

Prerequisites

  • Python 3.7 or higher
  • Jupyter Notebook
  • PyTorch and Hugging Face Transformers library

Usage

  1. Clone the Repository:

    git clone https://github.com/rishii100/News-Summarizer.git
    
    cd News-Summarizer
    
  2. Install Dependencies:

    pip install -r requirements.txt
    
  3. Open the Jupyter Notebook:

    jupyter notebook News-Summarizer.ipynb
    
  4. Run the Notebook:

    Execute each cell in the Jupyter Notebook to load the model, input a news article, and generate a summary.
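For reference, the notebook's flow looks roughly like the minimal sketch below, assuming PyTorch and Transformers are installed. The `t5-small` checkpoint is used here only as a stand-in for the model loaded in the notebook; the notebook itself remains the authoritative reference.

```python
# Minimal sketch of the summarization flow the notebook walks through.
# "t5-small" is a placeholder; substitute the checkpoint used in the notebook.
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "t5-small"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

article = "Your news article text goes here..."

# T5 expects a task prefix for summarization.
inputs = tokenizer("summarize: " + article,
                   return_tensors="pt",
                   max_length=512,
                   truncation=True)

summary_ids = model.generate(inputs["input_ids"],
                             max_length=150,
                             min_length=40,
                             num_beams=4,
                             length_penalty=2.0,
                             early_stopping=True)

print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```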

Model Details

This project employs the T5 transformer model, pretrained on a large corpus of text data. The model is fine-tuned for the specific task of news summarization.
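Fine-tuning T5 for summarization with the Transformers `Trainer` API typically looks roughly like the sketch below. The dataset (`cnn_dailymail`), column names, and hyperparameters here are illustrative assumptions, not the exact settings used in this project.

```python
# Illustrative fine-tuning sketch using the Hugging Face Seq2SeqTrainer.
# Dataset, column names, and hyperparameters are assumptions for demonstration;
# the notebook defines the actual setup.
from datasets import load_dataset
from transformers import (T5Tokenizer, T5ForConditionalGeneration,
                          DataCollatorForSeq2Seq, Seq2SeqTrainingArguments,
                          Seq2SeqTrainer)

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

dataset = load_dataset("cnn_dailymail", "3.0.0")

def preprocess(batch):
    # Prefix inputs with the T5 summarization task token and tokenize targets.
    inputs = tokenizer(["summarize: " + a for a in batch["article"]],
                       max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["highlights"],
                       max_length=150, truncation=True)
    inputs["labels"] = labels["input_ids"]
    return inputs

tokenized = dataset.map(preprocess, batched=True,
                        remove_columns=dataset["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-news-summarizer",
    per_device_train_batch_size=8,
    num_train_epochs=1,
    learning_rate=3e-4,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["validation"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```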

Results

The model's performance can be assessed by inspecting the generated summaries against the original articles.
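For a quantitative check in addition to reading the summaries, one common option (an illustration, not part of the notebook itself) is to score generated summaries against reference summaries with ROUGE via the Hugging Face `evaluate` library:

```python
# Illustrative only: scoring generated summaries against references with ROUGE.
# ROUGE is a common summarization metric; the texts below are dummy placeholders.
import evaluate

rouge = evaluate.load("rouge")
predictions = ["the model generated this summary"]
references = ["the human wrote this reference summary"]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # {'rouge1': ..., 'rouge2': ..., 'rougeL': ..., 'rougeLsum': ...}
```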

Accuracy

Accuracy: 97%

User Flow

(user flow diagram)

Contributors

License

This project is licensed under the Apache License 2.0. See the LICENSE file for details.

Acknowledgments

  • Special thanks to the Hugging Face community for providing the transformer models.
  • Inspired by prior work on abstractive text summarization.