# Local Deep Research
AI-powered research assistant that searches 10+ sources — arXiv, PubMed, web, and your private documents — then generates comprehensive, cited reports. Supports local LLMs (via Ollama) and cloud providers (OpenAI, Anthropic, Google).
## First Launch
1. **Open your domain** — the web UI loads at the root URL
2. **Configure your LLM** — set your preferred model provider in the settings (defaults to Ollama if available)
3. **Start a research query** — type a question and let the assistant search, analyze, and compile a report
4. **Review the report** — results include citations, source links, and confidence scores
## Key Features
- Multi-source research: arXiv, PubMed, Wikipedia, web search, and local documents
- Supports local LLMs (Ollama) and cloud APIs (OpenAI, Anthropic, Google)
- Per-user encrypted SQLite database — all data stays on your server
- Iterative deep research with configurable search depth
- Export reports in multiple formats
## Environment Variables
Set `LDR_LLM_PROVIDER` and `LDR_LLM_API_KEY` to configure your preferred LLM backend. See the documentation for all available settings.
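For example, to point the app at a cloud provider, you might set (the `openai` value and key format below are illustrative assumptions — check the project documentation for the exact accepted values):

```shell
# Hypothetical example: select a cloud LLM backend.
# The provider name and key value are placeholders, not verified settings.
export LDR_LLM_PROVIDER=openai
export LDR_LLM_API_KEY="your-api-key-here"

echo "LLM provider set to: $LDR_LLM_PROVIDER"
```

With a hosting platform like Zeabur, these would typically be set in the service's environment variable panel rather than a shell.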
## Persistent Data
Research data and configuration are stored in `/data` (backed by a Zeabur volume).
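If you self-host with Docker instead of Zeabur, the same persistence can be sketched as a volume mounted at `/data` (the image and volume names below are placeholders, not the project's actual identifiers):

```yaml
# docker-compose.yml sketch — image name is hypothetical
services:
  ldr:
    image: local-deep-research:latest  # placeholder; use the project's real image
    volumes:
      - ldr-data:/data  # research data and configuration live in /data
volumes:
  ldr-data:
```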
## License
MIT — GitHub