Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
Features
- Workflows: Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.
- Comprehensive model support: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found in the Model Providers section.
- Prompt IDE: Intuitive interface for crafting prompts, comparing model performance, and adding features such as text-to-speech to a chat-based app.
- RAG Pipeline: Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-the-box support for text extraction from PDFs, PPTs, and other common document formats.
- Agent capabilities: You can define agents based on LLM Function Calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion, and WolframAlpha.
- LLMOps: Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.
- Backend-as-a-Service: All of Dify's offerings come with corresponding APIs, so you can effortlessly integrate Dify into your own business logic.
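As a sketch of the Backend-as-a-Service idea, the snippet below prepares a call to Dify's chat-messages endpoint using only the standard library. The base URL, API key, and user identifier are placeholders you would replace with values from your own deployment; consult the Dify API reference for the authoritative request schema.

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, query: str, user: str) -> urllib.request.Request:
    """Prepare (but do not send) a POST to Dify's /v1/chat-messages endpoint."""
    payload = {
        "inputs": {},
        "query": query,
        "response_mode": "blocking",  # "streaming" returns server-sent events instead
        "user": user,                 # any stable identifier for the end user
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat-messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Point base_url at your deployment's gateway; the key below is a placeholder.
req = build_chat_request("https://<your-domain>.zeabur.app", "app-xxxx", "Hello!", "user-123")
```

Sending the request with `urllib.request.urlopen(req)` (or any HTTP client) returns the model's answer as JSON.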
Deployment
To deploy Dify, click the "Deploy" button in the top-right corner and fill in the required domain name. Once deployed, you can access your Dify app at https://<your-domain>.zeabur.app.
App structure
- Redis, PostgreSQL, MinIO, and Weaviate for data storage and caching.
- `api`, `worker`, `web`, and `sandbox` are the microservices of Dify.
- `nginx` is the gateway of Dify. It integrates the microservices into a single host. In other words, it is the entry point of your Dify app.
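To illustrate how the gateway ties the microservices together, here is a minimal sketch of the kind of routing the `nginx` service performs. The upstream names, ports, and location paths are assumptions for illustration; the actual `nginx.conf` shipped with Dify may differ.

```nginx
# Illustrative sketch only -- not the exact config shipped with Dify.
http {
    server {
        listen 80;

        # API-facing paths are proxied to the `api` service (assumed port 5001)
        location /console/api { proxy_pass http://api:5001; }
        location /api         { proxy_pass http://api:5001; }
        location /v1          { proxy_pass http://api:5001; }
        location /files       { proxy_pass http://api:5001; }

        # Everything else is the frontend served by `web` (assumed port 3000)
        location / { proxy_pass http://web:3000; }
    }
}
```

Because all traffic enters through this single host, clients never need to know the internal addresses of the individual services.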
Configuration
Refer to https://docs.dify.ai/getting-started/install-self-hosted/environments for the configurable variables, which are primarily set in the `api` service.
Upload File Size Limit
The default upload file size limit is 15 MB. You can change this limit by modifying the `client_max_body_size` directive in the configuration file located at `/etc/nginx/nginx.conf` for the `nginx` service. The value should be a string with a unit (e.g., `15M` for 15 MB).

Additionally, you may need to set the `UPLOAD_FILE_SIZE_LIMIT` variable in the environment of the `api` service to match this value. Ensure that it matches the `client_max_body_size` directive in `/etc/nginx/nginx.conf` (e.g., `15M` for 15 MB).
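As a concrete example, raising the limit to 50 MB might look like the fragment below. This is an illustrative sketch; the surrounding contents of `nginx.conf` are omitted, and you should check the environment-variable documentation linked above for the exact value format that `UPLOAD_FILE_SIZE_LIMIT` expects.

```nginx
# /etc/nginx/nginx.conf in the `nginx` service (illustrative fragment)
http {
    client_max_body_size 50M;  # string with a unit, e.g. 50M for 50 MB
}
```

Then set `UPLOAD_FILE_SIZE_LIMIT` in the `api` service's environment to the matching value, so the gateway and the application enforce the same limit.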