An extensible, feature-rich, and user-friendly self-hosted LLM WebUI
Open WebUI is an extensible, feature-rich, and user-friendly self-hosted WebUI designed to operate entirely offline. It supports various LLM runners, including Ollama and OpenAI-compatible APIs. For more information, be sure to check out our Open WebUI Documentation.
The OpenAI integration in this Open WebUI deployment is somewhat unreliable; use it at your own risk. If you mainly need GPT models, consider using a different UI.
Note that Zeabur services do not include a GPU. To use Ollama mode, run Ollama on a separate GPU-equipped server and set OLLAMA_BASE_URL to that server's address. See the "If Ollama is on a Different Server" section for details.
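For example, a minimal sketch of pointing this deployment at a remote Ollama instance. The hostname below is a placeholder for your own GPU server; 11434 is Ollama's default port:

```shell
# Hypothetical example: point Open WebUI at a remote Ollama server.
# "my-gpu-server" is a placeholder hostname; 11434 is Ollama's default port.
export OLLAMA_BASE_URL="http://my-gpu-server:11434"
echo "$OLLAMA_BASE_URL"
```

On Zeabur, you would set this as an environment variable on the service rather than exporting it in a shell.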
For other settings, see the full list of configurable options at https://docs.openwebui.com/getting-started/env-configuration.