Free Tool Arena


How to Use Dify

Self-hosting Dify, visual workflows, agents, datasets, apps (chatbot/completion), API keys, monitoring.

Updated April 2026 · 6 min read

Dify is an open-source LLMOps platform that ships visual workflows, agents, datasets, and APIs in one stack.


Dify positions itself between “no-code builder” and “full LLMOps platform.” You design prompts and agents in a browser, attach datasets for RAG, and the platform exposes them as REST APIs with auth, rate limits, and analytics. Self-host for free or use the managed cloud.

What it is

A Python backend (Flask + Celery + Postgres + Redis + a vector store) plus a Next.js frontend. Apps come in four flavors: chat, agent, workflow, and text generation. Datasets handle ingestion and retrieval; models plug in via a provider registry with 30+ vendors supported.

Install / set up

# self-host with docker compose
git clone https://github.com/langgenius/dify
cd dify/docker
cp .env.example .env
docker compose up -d

First run

Browse to http://localhost, create the admin account, and wire up a model provider (OpenAI, Anthropic, or a local Ollama endpoint). Click Create App, pick Chatbot, write a system prompt, and publish — Dify generates a shareable URL and an API token.

$ curl -X POST http://localhost/v1/chat-messages \
  -H "Authorization: Bearer app-xxx" \
  -H "Content-Type: application/json" \
  -d '{"inputs":{},"query":"hi","user":"u1"}'
{"answer":"Hello!","conversation_id":"..."}

Everyday workflows

  • Build a RAG chatbot by creating a dataset, uploading files, and toggling retrieval on in the app settings.
  • Use the Workflow app type to chain HTTP calls, LLM nodes, code blocks, and conditionals for deterministic pipelines.
  • Ship an internal tool by embedding the generated web app URL or calling the API from your existing product.
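When calling the API from your own product, you will usually want streamed output rather than one blocking JSON blob. Dify supports a streaming response mode that emits server-sent events; the exact event field names below (`event`, `answer`) are assumptions based on its API reference, so treat this as a sketch of the parsing pattern rather than a definitive schema.

```python
import io
import json

def iter_sse_events(stream):
    """Yield parsed JSON payloads from a server-sent-event byte stream.

    Dify's streaming mode emits lines of the form `data: {...}`;
    blank lines separate events and are skipped.
    """
    for raw in stream:
        line = raw.decode().strip()
        if line.startswith("data:"):
            yield json.loads(line[len("data:"):])

# Simulated stream with the assumed event shape, for illustration:
sample = io.BytesIO(
    b'data: {"event": "message", "answer": "Hel"}\n\n'
    b'data: {"event": "message", "answer": "lo!"}\n\n'
)
answer = "".join(e["answer"] for e in iter_sse_events(sample))
# answer == "Hello!"
```

In production you would pass the HTTP response object (which is iterable line by line) instead of the `BytesIO` sample.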

Gotchas and tips

The Docker Compose stack pulls a lot of images (Postgres, Redis, Weaviate, sandbox, SSRF proxy). Budget 4–6 GB of RAM at minimum and don’t run it on a 1 GB VPS. The sandbox container executes user-supplied code and needs privileged mode or a custom seccomp profile — read the security docs before exposing Dify publicly.

Version upgrades occasionally require database migrations that aren’t automatic. Snapshot your Postgres volume before docker compose pull. The team moves quickly and breaking changes do happen between minor versions.
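One way to script that snapshot is to dump the database through the compose Postgres service before pulling new images. The service name (`db`), database name (`dify`), and user (`postgres`) below are assumptions — check them against your `docker-compose.yml` before relying on this.

```python
def backup_cmd(service="db", dbname="dify", user="postgres"):
    """Build the `docker compose exec ... pg_dump` argument list for a
    pre-upgrade snapshot. Service, db, and user names are assumptions;
    verify them against your docker-compose.yml."""
    return [
        "docker", "compose", "exec", "-T", service,
        "pg_dump", "-U", user, dbname,
    ]

# Run it with subprocess and redirect stdout to a dated .sql file, e.g.:
#   subprocess.run(backup_cmd(), stdout=open("dify-backup.sql", "wb"), check=True)
```

Restoring is then a matter of piping the dump back into `psql` in the same container, after which `docker compose pull` and the migration step are safe to retry.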

Who it’s for

Product teams that want to ship LLM features without building the platform layer. If you need prompts, datasets, auth, logs, and an API gateway in one package, Dify is the most complete open-source option today.
