In the era of digital transformation, conversational AI has become a cornerstone for enhancing user experiences across industries. From customer support to personalized recommendations, chatbots are revolutionizing how businesses interact with their users. This project, "Conversational AI Chatbot with Contextual Understanding and Feedback Integration," is a robust AI-powered chatbot designed to provide intelligent, context-aware responses while incorporating user feedback to improve its performance over time.
The chatbot leverages cutting-edge technologies like LangGraph, FastAPI, and Streamlit to create a seamless workflow that integrates backend processing, frontend interaction, and real-time feedback collection. By combining these technologies, the chatbot not only answers user queries but also learns from user interactions, making it a dynamic and evolving solution.
The primary objective of this project is to develop an AI-powered conversational chatbot that answers queries intelligently, maintains context across turns, and learns from user feedback.
The solution consists of a backend powered by FastAPI and LangGraph for processing user queries and a frontend built with Streamlit for user interaction. The chatbot uses LangGraph to dynamically decide whether to answer a query directly or perform a web search for additional context. It also maintains a conversation history and summarizes previous interactions to provide contextually relevant responses.
- **Contextual decision-making:** The chatbot uses LangGraph to decide whether a query requires additional context or can be answered directly.
- **Conversation memory:** It maintains a summary of previous interactions to provide contextually relevant responses.
- **Feedback collection:** Users can provide feedback (Positive or Negative) on the chatbot's responses.
- **Feedback storage:** Feedback is stored in a PostgreSQL database and used to analyze and improve the chatbot's performance.
- **Web search:** For queries requiring external information, the chatbot performs a web search using TavilySearchResults and incorporates the results into its response.
- **Dynamic workflows:** The chatbot dynamically selects the appropriate workflow (e.g., direct response, web search) based on the query.
- **User authentication:** The solution includes a login system to authenticate users and associate chat histories with specific accounts.
- **Real-time chat interface:** The frontend provides a real-time chat interface where users can interact with the chatbot and view responses instantly.
- **Persistent storage:** Chat histories and feedback are stored in a PostgreSQL database for analysis and future reference.
This project leverages a combination of cutting-edge technologies to create a robust AI-assisted conversational chatbot. Below is an overview of the technologies used and their roles in the solution:
Overview: LangGraph is a framework for building dynamic workflows in AI applications. It allows developers to create state graphs that define the flow of actions based on user inputs and system states.
Role in the Project:
Overview: FastAPI is a modern, high-performance web framework for building APIs with Python. It is designed to be fast, easy to use, and production-ready.
Role in the Project:
Overview: Streamlit is an open-source Python library for building interactive web applications. It simplifies the process of creating user interfaces for data-driven applications.
Role in the Project:
Overview: LangChain is a framework for building applications powered by language models. It provides tools for managing conversation history, generating responses, and integrating with external APIs.
Role in the Project:
Overview: TavilySearchResults is a tool for performing web searches and retrieving relevant information. It is designed to fetch up-to-date content from the web.
Role in the Project:
Overview: PostgreSQL is a powerful, open-source relational database system. It is known for its reliability, scalability, and support for advanced features.
Role in the Project:
Overview: The dotenv library is used to manage environment variables in Python applications. It allows developers to store sensitive information like API keys and database credentials in a .env file.
Role in the Project:
Overview: SQLAlchemy is a Python SQL toolkit and Object-Relational Mapping (ORM) library. It provides a high-level API for interacting with relational databases.
Role in the Project:
This is a demo video of the solution with an overview. Click on the link below.
Creating a Conversational Chatbot with LangGraph+ Postgresdb+ FAST API with Streamlit UI
The project consists of two main components: the backend and the frontend. Below is a detailed explanation of the key sections of the code, along with references to specific code blocks.
The backend is built using FastAPI and LangGraph. It handles user queries, processes them using LangGraph workflows, and integrates tools like TavilySearchResults for web searches. It also manages user authentication, chat history, and feedback storage in a PostgreSQL database.
1. Environment Setup and Database Connection
```python
from dotenv import load_dotenv
from psycopg import Connection
from langgraph.checkpoint.postgres import PostgresSaver

# Load environment variables
load_dotenv()

# Database connection (postgres is the username; fill in your own password)
DB_URI = "postgresql://postgres:<password>@localhost:<port>/<database>?sslmode=disable"
connection_kwargs = {"autocommit": True, "prepare_threshold": 0}
conn = Connection.connect(DB_URI, **connection_kwargs)
checkpointer = PostgresSaver(conn)
```
*Replace the placeholders in `<>` with your own credentials.*
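Rather than editing credentials into the connection string by hand, they can also be assembled from environment variables that `load_dotenv()` pulls in from the `.env` file. The variable names below (`DB_USER`, `DB_PASSWORD`, and so on) are illustrative, not part of the original project:

```python
import os

# Hypothetical variable names; define them in your .env file.
DB_USER = os.getenv("DB_USER", "postgres")
DB_PASSWORD = os.getenv("DB_PASSWORD", "")
DB_HOST = os.getenv("DB_HOST", "localhost")
DB_PORT = os.getenv("DB_PORT", "5432")
DB_NAME = os.getenv("DB_NAME", "chatbot")

# Same shape as the hardcoded DB_URI above, but credential-free in source control.
DB_URI = (
    f"postgresql://{DB_USER}:{DB_PASSWORD}@{DB_HOST}:{DB_PORT}/{DB_NAME}"
    "?sslmode=disable"
)
```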
2. LangGraph Workflow
```python
def chatbot_withcontext(state):
    """Node to answer a question using retrieved context."""
    question = state["question"][-1]
    # context is expected to be a [text, source] pair produced by the web_search node
    context = state.get("context", ["", ""])[0]
    source = state.get("context", ["", ""])[1]
    messages = question.content
    answer = llm.invoke(
        [SystemMessage(content=f"Answer the question, {question}, using this context, {context}.")]
    )
    return {"answer": [answer], "summary": messages}
```
```python
def should_search(state):
    """Decide whether to perform a web search."""
    question = state["question"][-1]
    system_instructions = SystemMessage(
        content=f"Decide if the question {question} needs external tools."
    )
    answer = llm.invoke([system_instructions])
    if "need tool help" in answer.content.lower():
        return "web_search"
    return "chatbot_with_nocontext"
```
```python
builder = StateGraph(State)
builder.add_node("chatbot_with_nocontext", chatbot_with_nocontext)
builder.add_node("web_search", web_search)
builder.add_node("chatbot_with_tools", chatbot_withcontext)
builder.add_conditional_edges(START, should_search)
graph = builder.compile(checkpointer=checkpointer)
```
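The nodes above read and write keys like `question`, `context`, `answer`, and `summary`, so the `State` class passed to `StateGraph` must declare them. The project's actual schema isn't shown; a minimal sketch using a plain `TypedDict` might look like the following (real LangGraph code would typically annotate `question` with a message-appending reducer such as `add_messages`):

```python
from typing import TypedDict

class State(TypedDict, total=False):
    question: list   # conversation messages; the last entry is the current turn
    context: list    # [retrieved_text, source] filled in by the web_search node
    answer: list     # model responses produced by the chatbot nodes
    summary: str     # running summary of the conversation so far

# Example of the shape a node receives:
state: State = {"question": ["Tell me more about the Eiffel Tower."], "summary": ""}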
3. FastAPI Endpoint
```python
from fastapi import FastAPI
from pydantic import BaseModel
from langchain_core.messages import HumanMessage

app = FastAPI()

class RequestState(BaseModel):
    config: str
    user_question: str

@app.post("/chat_with_AI/")
def chat_endpoint(request: RequestState):
    result = graph.invoke(
        {"question": [HumanMessage(content=request.user_question)]},
        {"configurable": {"thread_id": request.config}},
    )
    return result
```
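Once the server is running, any HTTP client can exercise the endpoint. A minimal sketch using only the standard library follows; the host, port, and email value are placeholders, not values from the project:

```python
import json
import urllib.request

API_URL = "http://localhost:8000/chat_with_AI/"  # adjust host/port to your deployment

payload = {"config": "user@example.com", "user_question": "What is quantum entanglement?"}
request = urllib.request.Request(
    API_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# Uncomment once the FastAPI server is running:
# with urllib.request.urlopen(request) as resp:
#     result = json.loads(resp.read())
#     print(result["answer"])
```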
4. Chat History and Feedback Storage
```python
def save_chat_history_with_feedback(user_id, message, response, id, sessionid, feedback):
    query = text(
        "INSERT INTO chat_history (chat_id, user_id, message, response, feedback) "
        "VALUES (:chat_id, :user_id, :message, :response, :feedback)"
    )
    session.execute(query, {
        'chat_id': f"{sessionid}_{user_id}_{id}",
        'user_id': user_id,
        'message': message,
        'response': response,
        'feedback': feedback,
    })
    session.commit()
```
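The insert above implies a `chat_history` table with at least these columns. The project's actual DDL isn't shown; the shape below is a plausible reconstruction, demonstrated with SQLite only so the snippet runs anywhere (the real project uses PostgreSQL):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE chat_history (
        chat_id  TEXT PRIMARY KEY,  -- "{sessionid}_{user_id}_{id}"
        user_id  TEXT NOT NULL,
        message  TEXT NOT NULL,
        response TEXT NOT NULL,
        feedback TEXT               -- 'Positive', 'Negative', or NULL
    )
""")
conn.execute(
    "INSERT INTO chat_history VALUES (?, ?, ?, ?, ?)",
    ("s1_u1_1", "u1", "Hi", "Hello!", "Positive"),
)
row = conn.execute(
    "SELECT feedback FROM chat_history WHERE chat_id = 's1_u1_1'"
).fetchone()
```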
The frontend is built using Streamlit. It provides a real-time chat interface for user interaction, collects feedback, and sends user queries to the backend.
1. Streamlit Interface
```python
import streamlit as st
import requests

st.title("AI Chatbot")
user_question = st.chat_input("Enter your question:")
if user_question:
    payload = {"config": st.session_state.email, "user_question": user_question}
    response = requests.post("http://<enter the server location link>/chat_with_AI/", json=payload)
    if response.status_code == 200:
        st.session_state.messages.append(
            {'role': 'assistant', "content": response.json()['answer']}
        )
```
*Replace the placeholders in `<>` with your own server address.*
2. Feedback Collection
```python
for i, message in enumerate(st.session_state.messages):
    if message['role'] == 'assistant':
        cols = st.columns([0.1, 1, 1, 6])
        with cols[1]:
            st.button(':thumbsup:', on_click=fbcb, args=('Positive', i), key=f"thumbsup_{i}")
        with cols[2]:
            st.button(':thumbsdown:', on_click=fbcb, args=('Negative', i), key=f"thumbsdown_{i}")
```
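The buttons invoke a callback `fbcb` whose definition isn't shown. A sketch of the logic it likely performs is below, written as a pure function over the message list so it is easy to test in isolation; in the real app it would mutate `st.session_state.messages` and then persist the row via `save_chat_history_with_feedback`:

```python
def fbcb(feedback, index, messages):
    """Attach feedback ('Positive' or 'Negative') to the assistant message at `index`."""
    messages[index]["feedback"] = feedback
    # In the Streamlit app, this is where the row would be persisted, e.g. by
    # calling save_chat_history_with_feedback(...) with the current user/session.
    return messages
```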
3. User Authentication
```python
def authenticate_user(email, password):
    # Use a bound parameter rather than an f-string to prevent SQL injection.
    user = session.execute(
        text("SELECT * FROM users WHERE email = :email"), {'email': email}
    ).fetchone()
    # NOTE: passwords should be stored and compared as salted hashes, not plaintext.
    if user and user[2] == password:
        return True
    return False
```
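Comparing plaintext passwords, as the login check above does, is unsafe for production. A standard-library alternative is to store a salted PBKDF2 hash in the `users` table and compare hashes at login. This is a sketch of the approach, not the project's actual implementation:

```python
import hashlib
import hmac
import secrets

def hash_password(password, salt=None):
    """Return (salt, digest) suitable for storage in the users table."""
    salt = salt or secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, stored_digest):
    """Constant-time comparison of the candidate password against the stored hash."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored_digest)
```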
4. Saving Chat History
```python
def save_chat_history(user_id, message, response):
    query = text(
        "INSERT INTO chat_history (user_id, message, response) "
        "VALUES (:user_id, :message, :response)"
    )
    session.execute(query, {'user_id': user_id, 'message': message, 'response': response})
    session.commit()
```
Scenario 1: A student asks the chatbot to explain a complex concept in physics.
User Input: "Can you explain the concept of quantum entanglement?"
Chatbot Response:
"Quantum entanglement is a phenomenon in quantum mechanics where two particles become interconnected such that the state of one particle instantly influences the state of the other, no matter how far apart they are. This is often described as 'spooky action at a distance.' For example, if two particles are entangled and one is measured to have a spin-up state, the other will instantly have a spin-down state, even if they are light-years apart."
Feedback Collection: The student provides feedback, which is stored for future improvements.
Scenario 2: A user engages in a multi-turn conversation about travel destinations.
User Input 1: "What are some popular tourist attractions in Paris?"
Chatbot Response 1:
"Some popular tourist attractions in Paris include the Eiffel Tower, the Louvre Museum, Notre-Dame Cathedral, and Montmartre. Would you like more details about any of these?"
User Input 2: "Tell me more about the Eiffel Tower."
Chatbot Workflow:
The chatbot uses the conversation history to understand the context.
It retrieves detailed information about the Eiffel Tower.
Chatbot Response 2:
"The Eiffel Tower is a wrought-iron lattice tower located on the Champ de Mars in Paris. It was completed in 1889 and stands 330 meters tall. It is one of the most iconic landmarks in the world. For ticket information, visit Eiffel Tower Tickets."
Feedback Collection: The user provides feedback on both responses.
This implementation stands out for several key reasons:
Technical Excellence: The code demonstrates clean architecture, effective use of libraries, and proper error handling—all essential qualities for production-ready applications.
User-Centered Design: The interface prioritizes usability without sacrificing capability, making advanced AI accessible to users with varying technical backgrounds.
Thoughtful Abstraction: By leveraging LangChain for conversation management, the implementation creates a flexible foundation that can easily accommodate future enhancements.
Performance Optimization: Integration with Groq's high-speed inference API ensures exceptional response times without compromising quality.
Memory Integration: Integration with a SQL database (PostgreSQL) lets the application store and retrieve chat histories and user feedback. Applying techniques such as reinforcement learning to this feedback could further improve response quality over time.
Customer Support:
While the "AI Agentic Chatbot" is a powerful and versatile solution, it has certain limitations that need to be addressed for further improvement:
Dependency on External Tools:
Advanced Analytics:
The "Conversational AI Chatbot with Contextual Understanding and Feedback Integration" is a powerful solution that demonstrates the potential of AI in enhancing user experiences. By combining LangGraph, FastAPI, and Streamlit, the project creates a seamless workflow that integrates backend processing, frontend interaction, and real-time feedback collection. With its practical applications and potential for future enhancements, this chatbot is a step towards building intelligent, user-centric AI solutions.
1. LangGraph: LangGraph Documentation
2. FastAPI: FastAPI Documentation
3. Streamlit: Streamlit Documentation