🧠 AI Chatbot with Flask & OpenRouter – Technical Documentation
🧩 Current State Gap Identification
Despite the growing presence of AI chat systems, many fail to offer seamless integration between frontend interfaces and powerful language models. Real-time responsiveness, accessibility through speech, and modern user-friendly interfaces are often lacking.
❓ Problem Definition
The goal of this project is to create a web-based AI chatbot that combines a powerful LLM (Mistral 7B via OpenRouter) with an interactive frontend; a minimal backend sketch follows the list below. The chatbot should:
- Accept and respond to user input in real time.
- Offer speech input/output options.
- Support dark mode for better UX.
- Be easily deployable and customizable for different use cases.
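The sketch below shows one way such a backend could look. It assumes the OpenRouter chat-completions endpoint, the `mistralai/mistral-7b-instruct` model slug, a single `/chat` route, and an `OPENROUTER_API_KEY` environment variable; the actual app.py may name and structure things differently.

```python
# Minimal sketch of a Flask backend for the chatbot (illustrative names).
import os

import requests
from flask import Flask, jsonify, request
from flask_cors import CORS

app = Flask(__name__)
CORS(app)  # allow the browser frontend to call the API from another origin

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
API_KEY = os.environ.get("OPENROUTER_API_KEY", "")  # assumed environment variable


@app.route("/chat", methods=["POST"])
def chat():
    # Each request carries only the latest user message (no long-term memory).
    user_message = (request.get_json(silent=True) or {}).get("message", "")
    if not user_message:
        return jsonify({"error": "Empty message"}), 400

    payload = {
        "model": "mistralai/mistral-7b-instruct",  # Mistral 7B via OpenRouter
        "messages": [{"role": "user", "content": user_message}],
    }
    headers = {"Authorization": f"Bearer {API_KEY}"}

    resp = requests.post(OPENROUTER_URL, json=payload, headers=headers, timeout=30)
    resp.raise_for_status()
    reply = resp.json()["choices"][0]["message"]["content"]
    return jsonify({"reply": reply})


if __name__ == "__main__":
    app.run(debug=True)  # serves on http://localhost:5000 by default
```

With this shape, the frontend would POST JSON of the form `{"message": "..."}` and read the `reply` field from the response.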
⚠️ Limitations Discussion
- 💬 No long-term memory – the chatbot does not retain context beyond the current user input.
- 🌐 Dependent on external API – requires a stable internet connection and working OpenRouter access.
- 🔊 Browser compatibility – the speech APIs may not work on all browsers or devices.
- 🔐 Security – no user authentication is included.
🖥️ System Requirements
Component | Requirement |
---|---|
Backend | Python 3.x, Flask, requests, flask_cors |
Frontend | Modern web browser (Chrome, Firefox, Edge) |
Server Access | localhost or Ngrok (for tunneling) |
API Response | Typically under 1.5 seconds |
TTS/STT APIs | Web Speech API (SpeechRecognition & SpeechSynthesis) |
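The response-time figure above can be sanity-checked with a small timing script; the sketch below assumes the backend from the earlier example is running locally on Flask's default port 5000 and exposes a `/chat` route.

```python
# Rough round-trip latency check against the assumed local /chat endpoint.
import time

import requests

start = time.perf_counter()
resp = requests.post(
    "http://localhost:5000/chat",  # assumed local address and route
    json={"message": "Hello!"},
    timeout=30,
)
elapsed = time.perf_counter() - start

print(f"status={resp.status_code} round_trip={elapsed:.2f}s")
```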
🔧 Maintenance and Support Status
- This is a self-hosted demo project, designed for educational and prototype use.
- Maintenance includes periodic updates to:
  - Mistral/OpenRouter integration (if the API changes)
  - Frontend compatibility across browsers
  - Flask dependency updates (see the example requirements file below)
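A requirements file keeps those dependency updates reproducible; the sketch below lists only the packages named in the requirements table, and any version pins would need to match the actual project.

```
# Example requirements.txt – package names from the table above; pins omitted for illustration
flask
flask-cors
requests
```

Installing with `pip install -r requirements.txt` and periodically running `pip list --outdated` is one simple way to track the Flask and flask-cors updates mentioned above.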
🌐 Access and Availability Status
- Available as open-source
- Can be hosted:
  - Locally (`python app.py`)
  - On platforms like Heroku, Render, or PythonAnywhere
- Use Ngrok for secure public tunneling to localhost (see the commands below)
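For the local + Ngrok option, the commands below show one way to run and expose the app; they assume Flask's default port 5000 and that the Ngrok CLI is installed and authenticated.

```bash
# Start the Flask app locally (serves on http://localhost:5000 by default)
python app.py

# In a second terminal, open a public HTTPS tunnel to the local server
ngrok http 5000
```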
📷 Chat UI Preview
(Screenshot: a visual demonstration of how the chatbot works.)