Dify is an open-source LLM app development platform. Its intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
Features
- Workflows: Build and test powerful AI workflows on a visual canvas, leveraging all the following features and beyond.
- Comprehensive model support: Seamless integration with hundreds of proprietary / open-source LLMs from dozens of inference providers and self-hosted solutions, covering GPT, Mistral, Llama3, and any OpenAI API-compatible models. A full list of supported model providers can be found in the Model Providers section.
- Prompt IDE: Intuitive interface for crafting prompts, comparing model performance, and adding additional features such as text-to-speech to a chat-based app.
- RAG Pipeline: Extensive RAG capabilities that cover everything from document ingestion to retrieval, with out-of-box support for text extraction from PDFs, PPTs, and other common document formats.
- Agent capabilities: You can define agents based on LLM function calling or ReAct, and add pre-built or custom tools for the agent. Dify provides 50+ built-in tools for AI agents, such as Google Search, DALL·E, Stable Diffusion, and WolframAlpha.
- LLMOps: Monitor and analyze application logs and performance over time. You can continuously improve prompts, datasets, and models based on production data and annotations.
- Backend-as-a-Service: All of Dify's offerings come with corresponding APIs, so you can effortlessly integrate Dify into your own business logic.
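As an example of the Backend-as-a-Service APIs, a deployed chat app exposes an HTTP endpoint you can call from your own code. The sketch below builds a request against the chat-messages endpoint; the base URL, API key, and user identifier are placeholders you would replace with values from your own deployment.

```python
import json
import urllib.request

def build_chat_request(base_url: str, api_key: str, query: str, user: str) -> urllib.request.Request:
    """Build a POST request for Dify's chat-messages endpoint (blocking mode)."""
    payload = {
        "inputs": {},
        "query": query,
        "response_mode": "blocking",  # or "streaming" for server-sent events
        "user": user,  # any stable identifier for the end user
    }
    return urllib.request.Request(
        url=f"{base_url}/v1/chat-messages",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Hypothetical domain and key for illustration only:
req = build_chat_request("https://my-dify.zeabur.app", "app-xxxxxxxx", "Hello!", "user-123")
# urllib.request.urlopen(req) would send it; the response body is JSON
# containing the model's reply in an "answer" field.
```

The same pattern applies to the other app types (completion, workflow); only the endpoint path and payload fields differ.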
Deployment
To deploy Dify, click the "Deploy" button in the top-right corner, and fill in the required domain name. Once deployed, you can access your Dify app at https://<your-domain>.zeabur.app.
App structure
- Redis, PostgreSQL, MinIO, and Weaviate provide caching, relational storage, object storage, and vector search.
- api, worker, worker-beat, web, and sandbox are the microservices of Dify.
- plugin-daemon manages the execution and lifecycle of third-party plugins.
- nginx is the gateway of Dify: it combines the microservices behind a single host. In other words, it is the entry point of your Dify app.
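How these services fit together can be sketched as a simplified compose-style layout. This is illustrative only: the actual template wires the services together for you, and the names and dependencies shown here are assumptions based on the service list above.

```yaml
# Simplified sketch of the app structure; not the full template definition.
services:
  nginx:            # single entry point; routes requests to the right service
    depends_on: [api, web]
  api:              # serves the console and service APIs
  worker:           # async task execution (e.g. document indexing)
  worker-beat:      # periodic task scheduler
  web:              # web frontend
  sandbox:          # isolated code execution for workflow code nodes
  plugin-daemon:    # runs and manages third-party plugins
  postgres: {}      # relational database
  redis: {}         # cache and task queue
  weaviate: {}      # vector store for RAG retrieval
  minio: {}         # object storage for uploaded files
```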
Configuration
Refer to https://docs.dify.ai/getting-started/install-self-hosted/environments for the configurable variables, which are primarily set in the api service.
Upload File Size Limit
The default upload file size limit is 100 MB. You can change this limit by modifying the client_max_body_size directive in the configuration file located at /etc/nginx/nginx.conf for the nginx service. The value should be a string with a unit (e.g., 100M for 100 MB).
Additionally, set the UPLOAD_FILE_SIZE_LIMIT variable in the environment of the api service to match the client_max_body_size value, so the application and the gateway enforce the same limit.
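For example, to raise the limit to 200 MB (an illustrative value), both settings would change together:

```
# nginx service: /etc/nginx/nginx.conf, inside the http block
client_max_body_size 200M;

# api service environment (check the environment-variable reference linked
# above for the expected unit; some Dify versions take a bare megabyte count):
UPLOAD_FILE_SIZE_LIMIT=200
```

If the two values diverge, uploads may be rejected by whichever layer has the smaller limit.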
Changelog
v1.13.0
New
- Human-in-the-Loop (HITL) workflow node for AI-human collaboration
- Workflow streaming now runs in Celery workers for better reliability
Fixes
- Internal file URL for plugin daemon communication
v1.12.1-2
New
- Added worker-beat scheduler service
Fixes
- Increased file upload size limit from 15 MB to 100 MB
- Fixed plugin endpoint page asset loading 404
- Fixed plugin endpoint URL showing localhost
- Fixed app creation from templates failing
Improvements
- Environment variables are now managed by their respective services, reducing duplication