Ollama Deep Researcher is inspired by IterDRAG. That approach decomposes a query into sub-queries, retrieves documents for each one, answers the first sub-query, and then builds on that answer by retrieving documents for the next sub-query. Here, we do something similar:
- Given a user-provided topic, uses a local LLM (via Ollama) to generate a web search query
- Uses a search engine (configured for Tavily) to find relevant sources
- Uses the LLM to summarize the findings from the web search as they relate to the user-provided research topic
- Uses the LLM to reflect on the summary, identifying knowledge gaps
- Generates a new search query to address those knowledge gaps
- Repeats, iteratively updating the summary with new information from the web search
- Continues down the research rabbit hole for a configurable number of iterations (see the configuration tab)
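The loop above can be sketched in plain Python. This is a minimal illustration of the control flow only: the helper functions (`generate_query`, `web_search`, `summarize`, `reflect`) are hypothetical stand-ins for the LLM and search-engine calls, not the project's actual API.

```python
# Minimal sketch of the iterative research loop.
# Each helper is a hypothetical stand-in for an LLM or search call.

def generate_query(topic: str, gap: str) -> str:
    # The LLM would turn the topic (and any known gap) into a search query.
    return f"{topic} {gap}".strip()

def web_search(query: str) -> list[str]:
    # The search engine (e.g. Tavily) would return source snippets.
    return [f"source for: {query}"]

def summarize(summary: str, sources: list[str]) -> str:
    # The LLM would fold the new sources into the running summary.
    return (summary + " " + " ".join(sources)).strip()

def reflect(summary: str) -> str:
    # The LLM would identify a knowledge gap in the current summary.
    return "open question"

def research(topic: str, max_loops: int = 3) -> tuple[str, list[str]]:
    summary, all_sources, gap = "", [], ""
    for _ in range(max_loops):
        query = generate_query(topic, gap)     # 1. generate search query
        sources = web_search(query)            # 2. retrieve sources
        all_sources.extend(sources)            # keep every source in state
        summary = summarize(summary, sources)  # 3. update the summary
        gap = reflect(summary)                 # 4. find a knowledge gap
    return summary, all_sources
```

Each iteration feeds the reflection step's knowledge gap back into query generation, which is what lets the summary deepen rather than just widen.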
The output of the graph is a markdown file, emailed to the address of your choice, containing the research summary with citations to the sources used.
All sources gathered during research are saved to the graph state.
You can inspect them in the graph state, which is visible in LangGraph Studio.