DeepSeek ChatHub (Build a Local AI Chatbot for a Responsive PC!)
DeepSeek ChatHub is a locally hosted AI chatbot powered by DeepSeek-R1, LangChain, and Streamlit. It provides a privacy-focused and responsive chatbot experience without relying on cloud services.
š Features
š”ļø Privacy-Focused: Runs entirely on your machine.
ā” Fast & Responsive: No cloud dependency, ensuring low latency.
š§ Powered by Ollama DeepSeek-R1: Uses a robust AI model for intelligent responses.
šØ Intuitive UI: Modern, easy-to-use chat interface.
š Prerequisites
Before running the chatbot, ensure you have the following installed:
1ļøā£ Install Ollama
Follow the installation guide at https://ollama.com.
2ļøā£ Run DeepSeek-R1 Model in Ollama
ollama run deepseek-r1:1.5b
This downloads the deepseek-r1:1.5b model (if it is not already present) and makes it available to the chatbot.
3ļøā£ Install Required Python Packages
pip install -qU langchain-ollama
pip install langchain streamlit
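To sanity-check that the packages and the model are wired together before launching the UI, you can run a quick test from a Python shell. This is a minimal sketch assuming the default local Ollama endpoint and the deepseek-r1:1.5b model pulled above; it is not part of deepseek-chathub.py itself.

```python
from langchain_ollama import ChatOllama

# Talk to the locally running Ollama server (default endpoint http://localhost:11434)
# using the DeepSeek-R1 model pulled in the previous step.
llm = ChatOllama(model="deepseek-r1:1.5b", temperature=0)

# Send one prompt and print the reply; a response here confirms the local setup works.
reply = llm.invoke("In one sentence, what is a locally hosted LLM?")
print(reply.content)
```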
š Running the Chatbot
Once all prerequisites are set up, you can launch the chatbot using:
streamlit run deepseek-chathub.py
This will start the chatbot UI in your browser, allowing you to interact with DeepSeek-R1 seamlessly.
šÆ Usage
Enter your question in the text area.
Select Explanation Type (Layman, Short, Long) for customized responses.
Click Send to receive AI-generated answers.
View Chat History to track previous conversations.
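The repository's deepseek-chathub.py is the source of truth for the UI; the sketch below only illustrates how the text area, explanation-type selector, Send button, and chat history described above could be wired together with Streamlit and langchain-ollama. The widget labels and prompt wording are assumptions for illustration, not the app's actual code.

```python
import streamlit as st
from langchain_ollama import ChatOllama

# Local DeepSeek-R1 model served by Ollama.
llm = ChatOllama(model="deepseek-r1:1.5b")

st.title("DeepSeek ChatHub")

# Persist previous turns across Streamlit reruns.
if "history" not in st.session_state:
    st.session_state.history = []

question = st.text_area("Enter your question")
style = st.selectbox("Explanation Type", ["Layman", "Short", "Long"])

if st.button("Send") and question.strip():
    # Fold the chosen explanation style into the prompt so the answer is tailored.
    prompt = f"Give a {style.lower()} explanation.\n\nQuestion: {question}"
    answer = llm.invoke(prompt).content
    st.session_state.history.append((question, answer))

# Chat history, newest exchange first.
for q, a in reversed(st.session_state.history):
    st.markdown(f"**You:** {q}")
    st.markdown(f"**DeepSeek-R1:** {a}")
```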
š¤ Contributing
Feel free to contribute by opening issues or submitting pull requests. š
š License
BSD 3-Clause License