This project presents a versatile conversational AI system that leverages Groq's high-performance API within an intuitive Streamlit interface. By integrating LangChain for conversation management and implementing customizable memory and persona features, the chatbot demonstrates key characteristics of agentic AI: maintaining context across interactions and adapting to user preferences.
The system offers exceptional response speeds through Groq's optimized inference pipeline while providing flexible configuration options that allow users to tailor the conversation experience. This combination of performance, adaptability, and user-friendly design represents a significant step forward in making advanced conversational AI accessible to a wider audience.
Despite recent advances in language models, creating truly effective conversational interfaces remains challenging. This project addresses that challenge by focusing on three key aspects:
Optimized Performance: By leveraging Groq's API, the system delivers responses with minimal latency, enabling natural conversation flow.
Contextual Awareness: Using LangChain's conversation memory management, the chatbot maintains appropriate context across interactions.
User-Centric Design: The Streamlit interface provides intuitive controls for configuring the chatbot's behavior while maintaining a clean, engaging conversation experience.
The result is a practical demonstration of how agentic principles can be applied to conversational AI, creating systems that better understand and respond to user needs.
The application supports several Groq-hosted models, including mixtral-8x7b-32768, llama2-70b-4096, and llama3-8b-8192.
The application follows a clean, modular architecture:
```
User Input → Streamlit UI → LangChain Conversation Chain → Groq API → Response Rendering
                  ↓                      ↓                       ↓
             UI Controls        Memory Management         Persona Handling
```
```python
from langchain.chains import ConversationChain
from langchain.memory import ConversationBufferWindowMemory
from langchain_groq import ChatGroq

# Initialize conversation components
memory = ConversationBufferWindowMemory(k=memory_length)
groq_chat = ChatGroq(
    groq_api_key=groq_api_key,
    model_name=model
)
conversation = ConversationChain(
    llm=groq_chat,
    memory=memory,
    prompt=get_custom_prompt()
)
```
```python
import streamlit as st
from langchain.prompts import PromptTemplate

def get_custom_prompt():
    """Get custom prompt template based on selected persona"""
    persona = st.session_state.get('selected_persona', 'Default')
    personas = {
        'Default': """You are a helpful AI assistant.

Current conversation:
{history}
Human: {input}
AI:""",
        'Expert': """You are an expert consultant with deep knowledge across multiple fields.
Please provide detailed, technical responses when appropriate.

Current conversation:
{history}
Human: {input}
Expert:""",
        'Creative': """You are a creative and imaginative AI that thinks outside the box.
Feel free to use metaphors and analogies in your responses.

Current conversation:
{history}
Human: {input}
Creative AI:"""
    }
    return PromptTemplate(
        input_variables=["history", "input"],
        template=personas[persona]
    )
```
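Under the hood, `PromptTemplate` simply substitutes the running history and the new user input into whichever persona template is selected. A minimal stdlib-only sketch of that substitution (persona texts abbreviated from the templates above; `render_prompt` is an illustrative helper, not part of the project):

```python
# Sketch of persona-template substitution, using plain str.format
# in place of LangChain's PromptTemplate. Persona texts are abbreviated.
personas = {
    'Default': "You are a helpful AI assistant.\nCurrent conversation:\n{history}\nHuman: {input}\nAI:",
    'Expert': "You are an expert consultant.\nCurrent conversation:\n{history}\nHuman: {input}\nExpert:",
}

def render_prompt(persona: str, history: str, user_input: str) -> str:
    """Fill the selected persona template with the history and new input."""
    return personas[persona].format(history=history, input=user_input)

prompt = render_prompt('Expert', "Human: hi\nAI: hello", "Explain caching.")
# The rendered prompt ends with the persona's answer cue ("Expert:").
```

Because every persona template exposes the same `{history}` and `{input}` slots, switching personas changes the model's voice without touching the surrounding conversation plumbing.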
```python
# Load chat history into memory
for message in st.session_state.chat_history:
    memory.save_context(
        {'input': message['human']},
        {'output': message['AI']}
    )

# Clear memory for new topics while preserving history
if st.button("New Topic", use_container_width=True):
    memory.clear()
    st.success("Memory cleared for new topic!")
```
Clone the repository:
```bash
git clone https://github.com/yourusername/agentic-chatbot.git
cd agentic-chatbot
```
Create and activate a virtual environment:
```bash
python -m venv venv

# On Windows
venv\Scripts\activate

# On macOS/Linux
source venv/bin/activate
```
Install required packages:
```bash
pip install streamlit langchain langchain-groq python-dotenv groq
```
Create a `.env` file in the project root:
```bash
echo "GROQ_API_KEY=your_api_key_here" > .env
```
Launch the application:
```bash
streamlit run app.py
```
or
```bash
streamlit run main.py  # For the enhanced version with additional features
```
The chatbot follows a straightforward workflow designed for both effectiveness and efficiency: capture the user's input, assemble recent history from the memory window, apply the selected persona template, send the resulting prompt to the Groq API, and render and store the response.
This process creates a seamless conversation flow while maintaining appropriate context and persona characteristics throughout the interaction.
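One request cycle implied by this workflow can be sketched in plain Python; the `Memory` class and the stub `call_llm` callable below are illustrative stand-ins (the real system uses LangChain's conversation chain and the Groq API):

```python
class Memory:
    """Minimal stand-in for the conversation memory."""
    def __init__(self):
        self.turns = []

    def history(self) -> str:
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)

    def save(self, human: str, ai: str) -> None:
        self.turns.append((human, ai))

def chat_turn(user_input, memory, call_llm):
    """One cycle: build the prompt from memory, query the model, store the turn."""
    prompt = f"{memory.history()}\nHuman: {user_input}\nAI:"
    reply = call_llm(prompt)
    memory.save(user_input, reply)
    return reply

mem = Memory()
reply = chat_turn("hello", mem, lambda p: "hi there")  # stub LLM call
```

Each turn both consumes the accumulated history and appends to it, which is what keeps later responses contextually grounded.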
Deploy as a customizable personal assistant that adapts to individual communication styles and preferences, providing a more natural and effective interaction experience.
Function as an exploratory tool for brainstorming and knowledge gathering, with the ability to switch between different expertise modes based on the topic at hand.
Serve as a learning companion that can present information in different ways (technical, creative) based on the student's learning style and needs.
Operate as a flexible consulting tool that can be configured to provide industry-specific expertise across various domains.
The implementation uses LangChain's `ConversationBufferWindowMemory` to maintain an appropriate conversation context window:
```python
memory = ConversationBufferWindowMemory(k=conversational_memory_length)

# Populate memory from existing chat history
for message in st.session_state.chat_history:
    memory.save_context(
        {'input': message['human']},
        {'output': message['AI']}
    )
```
This approach allows the system to maintain relevant context without overwhelming the model with excessive history, striking a balance between coherence and efficiency.
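The windowing behavior can be illustrated without LangChain: `ConversationBufferWindowMemory(k=n)` keeps only the last `n` exchanges, which a `collections.deque` with `maxlen` models directly. This is a conceptual sketch of that behavior, not the library's actual implementation:

```python
from collections import deque

class WindowMemory:
    """Sketch of a k-exchange window: older turns fall out automatically."""
    def __init__(self, k: int):
        self.turns = deque(maxlen=k)  # each entry is one (human, ai) exchange

    def save_context(self, human: str, ai: str) -> None:
        self.turns.append((human, ai))

    def history(self) -> str:
        return "\n".join(f"Human: {h}\nAI: {a}" for h, a in self.turns)

memory = WindowMemory(k=2)
for i in range(4):
    memory.save_context(f"question {i}", f"answer {i}")
# Only the last two exchanges remain in the rendered history.
```

Choosing `k` is the coherence/efficiency trade-off described above: a larger window preserves more context per request at the cost of a longer prompt.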
The application features two complementary interface designs:
Classic Chat Layout: `app.py` uses Streamlit's native chat components for a familiar messaging experience:

```python
with st.chat_message("user"):
    st.write(f"{get_avatar('user')} {prompt}")
```
Enhanced Experience: `main.py` offers a more detailed interface with additional features:

```python
with st.container():
    st.write("You")
    st.info(message['human'])
```
Both approaches prioritize clarity and usability while differentiating between user and AI contributions.
The application maintains conversation continuity through Streamlit's session state:
```python
def initialize_session_state():
    """Initialize session state variables"""
    if 'chat_history' not in st.session_state:
        st.session_state.chat_history = []
    if 'total_messages' not in st.session_state:
        st.session_state.total_messages = 0
    if 'start_time' not in st.session_state:
        st.session_state.start_time = None
```
This design ensures that conversations persist throughout the session while enabling easy access to interaction history for both display and context management.
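The guarded-initialization pattern generalizes beyond Streamlit: set a default only when the key is absent, so reruns never clobber an in-progress conversation. A stdlib sketch with a plain dict standing in for `st.session_state`:

```python
def initialize_session_state(state: dict) -> None:
    """Set defaults only for keys that are not already present,
    mirroring the guarded pattern used with st.session_state."""
    defaults = {'chat_history': [], 'total_messages': 0, 'start_time': None}
    for key, value in defaults.items():
        state.setdefault(key, value)

session = {'total_messages': 5}   # pretend a conversation is under way
initialize_session_state(session)
# Existing values survive; missing keys receive their defaults.
```

This matters in Streamlit specifically because the script re-executes on every user interaction; unconditional assignment would reset the chat history on each rerun.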
This implementation stands out for several key reasons:
Technical Excellence: The code demonstrates clean architecture, effective use of libraries, and proper error handling, all essential qualities for production-ready applications.
User-Centered Design: The interface prioritizes usability without sacrificing capability, making advanced AI accessible to users with varying technical backgrounds.
Thoughtful Abstraction: By leveraging LangChain for conversation management, the implementation creates a flexible foundation that can easily accommodate future enhancements.
Performance Optimization: Integration with Groq's high-speed inference API ensures exceptional response times without compromising quality.
Educational Value: The clear, well-structured code provides an excellent learning resource for developers exploring AI application development.
The Agentic AI-Driven Chatbot represents a practical implementation of key principles in modern conversational AI. By combining Groq's powerful language models with LangChain's conversation management capabilities and Streamlit's intuitive interface, it demonstrates how multiple technologies can come together to create a system that is both powerful and accessible.
The project's focus on memory management, persona customization, and user-friendly design aligns with the broader vision of agentic AI: systems that can maintain context, adapt to user needs, and provide valuable assistance across a range of scenarios. As these technologies continue to evolve, implementations like this will play an increasingly important role in bridging the gap between advanced AI capabilities and practical everyday applications.
This project serves not only as a functional tool but also as a foundation for further exploration and development in the rapidly evolving field of conversational AI.