PROJECT TITLE
Agentic AI
DESCRIPTION
This project demonstrates the development of an Agentic AI system using LangChain. The agent is powered by a Retrieval-Augmented Generation (RAG) pipeline, which grounds its responses in external knowledge. For this project, the knowledge base is a set of machine learning notes that are indexed for retrieval rather than used to train the model.
The notes are chunked, embedded, and stored in a FAISS vector index ahead of time. When prompted with a question, the agent embeds the query, retrieves the most relevant chunks via FAISS similarity search, and passes them to a large language model (LLM) hosted on Groq to generate the answer. This keeps responses contextual and grounded in the notes, which can be refreshed by re-indexing rather than retraining.
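A minimal sketch of that query flow follows; the variable names vectorstore and llm are illustrative and assume the FAISS index and Groq chat model have already been built, as outlined under PROJECT WORKFLOW.

  # Assumes: vectorstore is a FAISS index built from the notes, and
  # llm is a langchain_groq.ChatGroq instance (see PROJECT WORKFLOW below).
  question = "What is the bias-variance tradeoff?"

  # 1. Retrieve the note chunks most similar to the question.
  docs = vectorstore.similarity_search(question, k=4)
  context = "\n\n".join(d.page_content for d in docs)

  # 2. Generate an answer grounded only in the retrieved context.
  prompt = f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {question}"
  print(llm.invoke(prompt).content)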
CONTENT
Implements an Agentic AI capable of reasoning with external knowledge.
Uses LangChain to orchestrate document loading, chunking, embeddings, and retrieval.
Embeddings are stored and searched efficiently using FAISS.
Answers are generated by a Groq-hosted LLM (LLaMA 3.1).
Knowledge base consists of curated machine learning notes (.txt file).
Can be extended with additional documents for broader knowledge coverage.
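A sketch of that extension path, assuming an existing FAISS index in a variable named vectorstore; the file name and chunk sizes are illustrative, not taken from the repo.

  from langchain_community.document_loaders import TextLoader
  from langchain.text_splitter import RecursiveCharacterTextSplitter

  # Load and chunk an additional notes file (path is hypothetical).
  new_docs = TextLoader("deep_learning_notes.txt").load()
  splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
  new_chunks = splitter.split_documents(new_docs)

  # Add the new chunks to the existing FAISS index so later queries can retrieve them.
  vectorstore.add_documents(new_chunks)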
TECH STACK
Python
LangChain
FAISS (vector similarity search library, used as the vector store)
HuggingFace Embeddings
Groq API (LLM backend)
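A rough map from this stack to the LangChain integration packages; import paths assume the recent split packages (langchain-community, langchain-huggingface, langchain-groq), while the repo may use older langchain.* paths.

  from langchain_community.document_loaders import TextLoader   # LangChain document loading
  from langchain_community.vectorstores import FAISS             # FAISS vector store
  from langchain_huggingface import HuggingFaceEmbeddings        # HuggingFace embedding models
  from langchain_groq import ChatGroq                            # Groq-hosted LLM client (reads GROQ_API_KEY)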
PROJECT WORKFLOW
Load documents (.txt machine learning notes).
Split into chunks with metadata.
Generate embeddings with a HuggingFace model.
Store & search embeddings in FAISS.
Answer queries using retrieval + LLM generation (see the end-to-end sketch below).
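A compact end-to-end sketch of this workflow; the file name, chunk sizes, embedding model, and Groq model name are illustrative rather than taken from the repo, and a GROQ_API_KEY environment variable is assumed.

  from langchain_community.document_loaders import TextLoader
  from langchain.text_splitter import RecursiveCharacterTextSplitter
  from langchain_huggingface import HuggingFaceEmbeddings
  from langchain_community.vectorstores import FAISS
  from langchain_groq import ChatGroq
  from langchain.chains import RetrievalQA

  # 1. Load the machine learning notes (.txt).
  docs = TextLoader("ml_notes.txt").load()

  # 2. Split into overlapping chunks; each chunk keeps its source metadata.
  chunks = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100).split_documents(docs)

  # 3. Generate embeddings with a HuggingFace sentence-transformer model.
  embeddings = HuggingFaceEmbeddings(model_name="sentence-transformers/all-MiniLM-L6-v2")

  # 4. Store and search the embeddings in FAISS, exposed as a retriever.
  vectorstore = FAISS.from_documents(chunks, embeddings)
  retriever = vectorstore.as_retriever(search_kwargs={"k": 4})

  # 5. Answer queries: retrieve relevant chunks, then generate with the Groq-hosted LLM.
  llm = ChatGroq(model="llama-3.1-8b-instant")  # reads GROQ_API_KEY from the environment
  qa_chain = RetrievalQA.from_chain_type(llm=llm, retriever=retriever)
  print(qa_chain.invoke({"query": "Explain overfitting and how to prevent it."})["result"])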
GITHUB REPO