The Deep Research Agent is a powerful AI-driven multi-agent system built with Composio and LangGraph, designed to conduct comprehensive research on user-specified topics within a chosen domain. It generates targeted research questions, performs in-depth analysis using AI-powered tools, and compiles the findings into a professional, McKinsey-style HTML report saved to Google Docs. Leveraging the `composio_langgraph` library for tool integration and `langchain_groq` for language model interactions, this tool is well suited to researchers, analysts, or anyone seeking structured, high-quality insights.

With support for follow-up questions, it enables iterative refinement of research, making it a versatile solution for professional and academic use.
- `COMPOSIO_SEARCH_TAVILY_SEARCH` for real-time web searches to gather accurate data.
- `GOOGLEDOCS_CREATE_DOCUMENT_MARKDOWN` for saving the compiled report to Google Docs.
```
.
├── notebook/deep_research.ipynb   # Notebook file
├── src/app.py                     # Main Streamlit application
├── src/graph.py                   # LangGraph workflow configuration
├── src/state.py                   # Graph state definition
├── src/nodes/nodes.py             # Agent and tool nodes for the workflow
├── src/tools/composio_tools.py    # Composio toolset configuration
├── src/tools/llm.py               # Language model setup
├── src/prompts.py                 # System prompt for the research agent
├── .env                           # Environment variables
├── requirements.txt               # Dependencies
├── License                        # License file
├── .gitignore                     # Gitignore for env
├── .gitattributes                 # text=auto for normalization
└── README.md                      # Project documentation
```
The workflow combines real-time web search (`COMPOSIO_SEARCH_TAVILY_SEARCH`) and Google Docs integration (`GOOGLEDOCS_CREATE_DOCUMENT_MARKDOWN`).

The application uses a single agent named ResearchAgent, defined in `nodes.py` as the `agent_node` function. This agent is powered by the Meta LLaMA model (`meta-llama/llama-4-scout-17b-16e-instruct`) and is responsible for:

- Generating targeted research questions for the chosen topic and domain.
- Conducting research with the web search tool to gather data, statistics, and sources.
- Compiling the findings into the final report and saving it to Google Docs.
The ResearchAgent operates within a stateful workflow, maintaining context across interactions with LangGraph's `MemorySaver` checkpointer for seamless research continuity, as sketched below.
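A minimal sketch of how such a graph can be wired together. The module paths follow the project structure above, but the exported names (`State`, `agent_node`, `tools`) and the exact wiring in `graph.py` are assumptions for illustration:

```python
from langgraph.graph import StateGraph, START
from langgraph.prebuilt import ToolNode, tools_condition
from langgraph.checkpoint.memory import MemorySaver

# Assumed project imports based on the structure above; actual names may differ.
from src.state import State                  # graph state with a messages channel
from src.nodes.nodes import agent_node       # the ResearchAgent node
from src.tools.composio_tools import tools   # Composio tools (see the next section)

builder = StateGraph(State)
builder.add_node("ResearchAgent", agent_node)
builder.add_node("tools", ToolNode(tools))                       # runs any tool calls the agent emits
builder.add_edge(START, "ResearchAgent")
builder.add_conditional_edges("ResearchAgent", tools_condition)  # route to "tools" or finish
builder.add_edge("tools", "ResearchAgent")                       # feed tool results back to the agent

# MemorySaver checkpoints per-thread state so follow-up questions keep their context
graph = builder.compile(checkpointer=MemorySaver())
```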
The application integrates the following tools via the `ComposioToolSet` from the `composio_langgraph` library, defined in `composio_tools.py`:
- `COMPOSIO_SEARCH_TAVILY_SEARCH`: A web search tool that fetches up-to-date information with configurable parameters (e.g., `search_depth`, `max_results`, `include_images`). Used to gather relevant data, statistics, and sources during research.
- `GOOGLEDOCS_CREATE_DOCUMENT_MARKDOWN`: A tool that saves the compiled HTML report as a markdown document in Google Docs, enabling easy sharing and storage of research outputs.
- Report Writing with LLM: Uses the LLM to write the report from the answers gathered via web search for the generated questions.
These tools are bound to the ResearchAgent using `llm.bind_tools(tools)` in `nodes.py` to enable function calling within the research workflow.
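A minimal sketch of this setup, assuming the Composio action names listed above and a Groq API key in the environment (the exact parameters used in `composio_tools.py` and `llm.py` may differ):

```python
from composio_langgraph import Action, ComposioToolSet
from langchain_groq import ChatGroq

# Composio toolset exposing the two actions as LangChain-compatible tools
toolset = ComposioToolSet()
tools = toolset.get_tools(
    actions=[
        Action.COMPOSIO_SEARCH_TAVILY_SEARCH,
        Action.GOOGLEDOCS_CREATE_DOCUMENT_MARKDOWN,
    ]
)

# Groq-hosted LLaMA model used by the ResearchAgent (expects GROQ_API_KEY in the environment)
llm = ChatGroq(model="meta-llama/llama-4-scout-17b-16e-instruct", temperature=0)

# Bind the tools so the model can emit tool calls during the research workflow
llm_with_tools = llm.bind_tools(tools)
```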
Input: a research topic and domain provided by the user, plus optional follow-up questions for iterative refinement.

Output: a McKinsey-style HTML report compiled from the research findings and saved to Google Docs.
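For illustration, a compiled graph like the sketch above could be driven as follows. In the repository this flow is handled interactively by the Streamlit app in `app.py`; the thread id and prompt wording here are placeholders:

```python
# One thread per research session; MemorySaver restores context for follow-ups
config = {"configurable": {"thread_id": "research-session-1"}}

result = graph.invoke(
    {"messages": [("user", "Topic: AI agents, domain: healthcare. Generate research questions and compile a report.")]},
    config,
)
print(result["messages"][-1].content)   # final agent message; the report itself lands in Google Docs

# Follow-up questions reuse the same thread_id to refine the research iteratively
followup = graph.invoke(
    {"messages": [("user", "Add a section on regulatory risks.")]},
    config,
)
```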
```bash
git clone https://github.com/Zeeshier/deep-research-agent.git
cd deep-research-agent

pip install -r requirements.txt

# Set up your environment
touch .env
# Add your GROQ_API_KEY to the .env file

# Run the app
streamlit run app.py
```
This project is actively maintained. For questions, suggestions, or issues, please open an Issue or submit a Pull Request.
For direct inquiries, you can also reach me via:
This project is licensed under the MIT License.
You are free to use, modify, and distribute this software with proper attribution.
For more details, please see the LICENSE file.