A Question Answering (QA) system built on a pre-trained BERT model to generate accurate, context-aware responses. By encoding both questions and context with BERT embeddings, the system identifies the most relevant answer spans, delivering fast and high-quality results across various domains like customer support and information retrieval.


Question Answering System Using DistilBERT-NQ

This is a web-based Question Answering (QA) system built on the pretrained DistilBERT model fine-tuned on the Natural Questions dataset. Users paste in a passage, ask a question about it, and receive the most probable answer span extracted by the model.

🧠 Features

  • Uses AsmaAwad/distilbert-base-uncased-NaturalQuestions transformer model
  • Accepts user-provided paragraph and question inputs
  • Predicts and highlights the most relevant answer
  • Flask backend for inference API
  • Simple, user-friendly web interface
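Under the hood, extractive QA models such as DistilBERT score every token position as a potential answer start and a potential answer end; the predicted answer is the span whose combined start and end scores are highest. A minimal sketch of that span-selection step in plain Python (the scores here are toy values standing in for the model's logits):

```python
def best_span(start_scores, end_scores, max_answer_len=30):
    """Pick (start, end) with start <= end maximizing the combined score.

    start_scores[i] / end_scores[j] stand in for the model's start/end
    logits at each token position.
    """
    best, best_score = (0, 0), float("-inf")
    for i, s in enumerate(start_scores):
        # only consider spans up to max_answer_len tokens long
        for j in range(i, min(i + max_answer_len, len(end_scores))):
            if s + end_scores[j] > best_score:
                best_score, best = s + end_scores[j], (i, j)
    return best

# Toy scores: position 1 is the best start, position 2 the best end.
print(best_span([0.0, 5.0, 1.0], [1.0, 0.0, 9.0]))  # → (1, 2)
```

The `max_answer_len` cap mirrors what real QA pipelines do: without it, a spuriously high end score far from the start could produce an implausibly long answer.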

🔧 Tech Stack

  • Frontend: HTML, CSS, Bootstrap
  • Backend: Python, Flask
  • AI Engine: Hugging Face Transformers, DistilBERT
  • Model: AsmaAwad/distilbert-base-uncased-NaturalQuestions
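On the model side, the whole stack reduces to a single Transformers pipeline call. The snippet below is a sketch assuming the standard `question-answering` pipeline API, with the model name taken from the list above; the first call downloads the checkpoint from the Hugging Face Hub:

```python
from transformers import pipeline

# Loads the fine-tuned DistilBERT checkpoint (downloaded on first use).
qa = pipeline(
    "question-answering",
    model="AsmaAwad/distilbert-base-uncased-NaturalQuestions",
)

result = qa(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a wrought-iron lattice tower in Paris, France.",
)
answer = result["answer"]  # the extracted answer span as a string
```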

🚀 Getting Started

1. Clone the repository

```shell
git clone https://github.com/yourusername/Question-Answering-System-Using-DistilBERT-NQ.git
cd Question-Answering-System-Using-DistilBERT-NQ
```

2. Create and activate a virtual environment

```shell
python -m venv venv
source venv/bin/activate  # For Linux/macOS
venv\Scripts\activate     # For Windows
```

3. Install the dependencies

```shell
pip install -r requirements.txt
pip install flask transformers torch
```

4. Run the application

```shell
python app.py
```
📁 Project Structure

```
QA System/
│
├── app.py                # Flask application
├── templates/
│   └── index.html        # HTML form for user input
├── static/
│   └── styles.css        # Styling
├── model/
│   └── [optional model loading or tokenizer cache]
└── README.md             # Project overview
```
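An `app.py` along these lines could look like the sketch below. The `/answer` route name and the `make_app` factory are illustrative choices, not necessarily what this repository uses; the factory takes the answering function as an argument so the route logic can be exercised without loading the model:

```python
from flask import Flask, request, jsonify

def make_app(answer_fn):
    """Build the Flask app around any answer_fn(question, context) -> str.

    In the real app, answer_fn would wrap the Hugging Face QA pipeline;
    injecting it keeps the route testable without downloading the model.
    """
    app = Flask(__name__)

    @app.route("/answer", methods=["POST"])
    def answer():
        data = request.get_json()
        result = answer_fn(data["question"], data["context"])
        return jsonify({"answer": result})

    return app

if __name__ == "__main__":
    from transformers import pipeline
    qa = pipeline("question-answering",
                  model="AsmaAwad/distilbert-base-uncased-NaturalQuestions")
    app = make_app(lambda q, c: qa(question=q, context=c)["answer"])
    app.run(debug=True)
```

Keeping model loading inside the `__main__` guard means the heavyweight pipeline is only constructed when the server actually starts, not when the module is imported.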
