ā—11 reads

Marouf (Trivia QA RAG System)

Table of contents

Marouf Chatbot - Intelligent Q&A Assistant 🤖

Marouf Chatbot Demo

📌 Table of Contents

  1. Overview
  2. Competition Highlights
  3. Key Features
  4. Technical Stack
  5. Installation
  6. Usage
  7. Architecture
  8. Project Structure
  9. Customization
  10. Contact

šŸ” Overview

Marouf Chatbot is an intelligent question-answering assistant leveraging RAG (Retrieval-Augmented Generation) architecture. It integrates FAISS for vector search, LLMs for response generation, and Redis caching to provide fast, accurate, and context-aware answers. Built with FastAPI and Streamlit, this chatbot ensures smooth interaction and seamless deployment via Docker.
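At its core, a request is answered by embedding the question, looking up the nearest trivia entries in the FAISS index, and handing that context to the LLM. The snippet below is a minimal sketch of the retrieval step only, assuming the Sentence Transformers model all-MiniLM-L6-v2 and the faiss_index.index artifact under scripts/chatbot/; the model name and function shape are illustrative, not copied from the repo.

# Minimal sketch of the retrieval step (not the repo's actual code).
import faiss
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")                # embedding model is an assumption
index = faiss.read_index("scripts/chatbot/faiss_index.index")    # artifact listed in Project Structure

def retrieve(query: str, k: int = 3):
    """Embed the query and return the ids and distances of the k nearest trivia entries."""
    query_vec = encoder.encode([query], convert_to_numpy=True)
    distances, ids = index.search(query_vec, k)
    return ids[0], distances[0]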


šŸ† Competition Highlights

  • Innovative RAG Architecture combining LLMs with vector search.
  • Optimized for low latency (~300 ms average response time).
  • Modular design allowing easy model/dataset swaps.
  • Production-ready deployment with Docker.

🌟 Key Features

Feature              Technology                     Benefit
-------------------  -----------------------------  ------------------
Context-aware Q&A    FAISS + Sentence Transformers  Response accuracy
Conversational Flow  Deepseek-Llama-70B             Human-like responses
Persistent Memory    PostgreSQL                     Session continuity
High Performance     Groq LPU                       300 tokens/sec
Caching System       Redis                          68% cache hit rate
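
The caching row above boils down to a query-keyed lookup in Redis. The sketch below shows the general idea using redis-py; the key scheme and one-hour TTL are illustrative assumptions, not the exact logic in scripts/cache/caching.py.

# Illustrative caching layer (assumed behaviour; see scripts/cache/caching.py for the real one).
import hashlib
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def cache_key(query: str) -> str:
    """Derive a stable key from the normalized query text."""
    return "marouf:answer:" + hashlib.sha256(query.strip().lower().encode()).hexdigest()

def get_cached_answer(query: str):
    return r.get(cache_key(query))           # None on a cache miss

def store_answer(query: str, answer: str, ttl: int = 3600):
    r.set(cache_key(query), answer, ex=ttl)  # TTL value is an assumption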

šŸ› ļø Technical Stack

Core Components

pie title Technology Distribution
    "NLP Processing" : 35
    "Database" : 25
    "API Server" : 20
    "Frontend" : 15
    "DevOps" : 5

🚀 Installation

# Clone repository
git clone https://github.com/Mkaljermy/marouf_chatbot.git
cd marouf_chatbot

# Setup environment
cp scripts/.env.example scripts/.env
nano scripts/.env  # Add your API keys

# Build and run
docker-compose up --build -d

💻 Usage

import requests

response = requests.post(
    "http://localhost:8000/chat",
    json={"query": "What's the capital of France?"}
)
print(response.json())

šŸ—ļø Architecture

graph LR
    A[User] --> B[Streamlit]
    B --> C[FastAPI]
    C --> D{Redis?}
    D -->|Cache Hit| E[Return Response]
    D -->|Cache Miss| F[FAISS Search]
    F --> G[PostgreSQL]
    G --> H[LLM Processing]
    H --> C
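
In code, the path through the diagram could look roughly like the endpoint below. It reuses the hypothetical retrieve, get_cached_answer, and store_answer helpers from the earlier sketches and calls the LLM through Groq's chat-completions client; the endpoint shape, model id, prompt, and the fetch_rows_from_postgres helper are illustrative assumptions, not the actual contents of scripts/api/api.py.

# Rough sketch of the /chat flow from the diagram (assumptions noted inline).
from fastapi import FastAPI
from groq import Groq
from pydantic import BaseModel

app = FastAPI()
llm = Groq()  # reads GROQ_API_KEY from the environment

class ChatRequest(BaseModel):
    query: str

@app.post("/chat")
def chat(req: ChatRequest) -> dict:
    # 1. Redis: return immediately on a cache hit
    cached = get_cached_answer(req.query)
    if cached is not None:
        return {"answer": cached, "cached": True}

    # 2. FAISS: fetch the most relevant trivia entries (ids map to rows in PostgreSQL)
    ids, _ = retrieve(req.query, k=3)
    context = fetch_rows_from_postgres(ids)  # hypothetical helper, not in the repo

    # 3. LLM: generate a grounded answer via Groq
    completion = llm.chat.completions.create(
        model="deepseek-r1-distill-llama-70b",  # model id is an assumption
        messages=[
            {"role": "system", "content": "Answer using the provided trivia context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {req.query}"},
        ],
    )
    answer = completion.choices[0].message.content

    # 4. Cache the fresh answer for subsequent identical queries
    store_answer(req.query, answer)
    return {"answer": answer, "cached": False}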

📂 Project Structure

marouf_chatbot/
├── scripts/
│   ├── api/
│   │   ├── api.py
│   │   ├── embeddings.npy
│   │   └── faiss_index.index
│   ├── cache/
│   │   └── caching.py
│   ├── chatbot/
│   │   ├── chatbot.py
│   │   ├── embeddings.npy
│   │   └── faiss_index.index
│   └── frontend/
│       └── index.py
├── data/
│   └── trivia_dataset.csv
├── docker-compose.yml
├── Dockerfile.api
├── Dockerfile.frontend
└── requirements.txt

šŸ› ļø Customization

  • Modify Model: Replace Deepseek-Llama with another LLM by updating chatbot.py.
  • Adjust Dataset: Replace trivia_dataset.csv in the data/ folder, then rebuild the embeddings and FAISS index to match (see the sketch below).
  • Change UI: Edit frontend/index.py to update the Streamlit interface.
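
After swapping the dataset, the embeddings.npy and faiss_index.index artifacts under scripts/ have to be regenerated so they match the new data. The sketch below shows one way to do that; the embedding model and the question column name are assumptions, so align them with whatever chatbot.py actually loads.

# Hypothetical rebuild script (not part of the repo): regenerate embeddings + FAISS index
# after replacing data/trivia_dataset.csv.
import faiss
import numpy as np
import pandas as pd
from sentence_transformers import SentenceTransformer

df = pd.read_csv("data/trivia_dataset.csv")
questions = df["question"].astype(str).tolist()    # column name is an assumption

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # must match the model used at query time
embeddings = encoder.encode(questions, convert_to_numpy=True).astype("float32")

index = faiss.IndexFlatL2(embeddings.shape[1])     # exact L2 search; metric choice is an assumption
index.add(embeddings)

# The repo keeps copies of these artifacts under scripts/api/ and scripts/chatbot/
np.save("scripts/chatbot/embeddings.npy", embeddings)
faiss.write_index(index, "scripts/chatbot/faiss_index.index")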

📞 Contact

For inquiries and contributions, contact Mohammad Aljermy (GitHub: Mkaljermy).


🎉 Enjoy using Marouf Chatbot! 🚀

Check out the full project on GitHub: Marouf Chatbot Repository (https://github.com/Mkaljermy/marouf_chatbot)