Self-host Dify on your own VPS
Dify is an open-source LLM app development platform that lets you build chatbots, agents, and RAG pipelines with a visual builder instead of code. Self-hosting on a VPS means you own the data, the prompts, and the user interactions: no per-seat fees, no upload caps, no vendor reading your prompts. Servury is built for it: anonymous signup, crypto payments, full root, no logs.
Dify needs at least 4 GB RAM. Starting at $15.59/mo.
Why self-host Dify
Quick start: Dify on Servury via Docker
Tested on Ubuntu 24.04. Pick a 4 GB+ plan, deploy, and SSH in.
# 1. Install Docker + Compose plugin
curl -fsSL https://get.docker.com | sh
apt install -y git   # the get.docker.com script already installs the compose plugin
# 2. Clone Dify and start the stack
git clone https://github.com/langgenius/dify.git
cd dify/docker
cp .env.example .env
docker compose up -d
# 3. Open http://YOUR_SERVER_IP/install and finish onboarding
# 4. Recommended: put Caddy or nginx in front for HTTPS + a real domain
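The Caddy option from step 4 can be sketched like this, assuming a domain (the placeholder dify.example.com) already pointing at the server and Caddy from the Ubuntu repos. The EXPOSE_NGINX_PORT variable exists in current Dify .env files but may differ between versions:

```shell
# Free port 80 for Caddy: make Dify's bundled nginx listen on 8080
# instead (EXPOSE_NGINX_PORT in dify/docker/.env), then restart.
sed -i 's/^EXPOSE_NGINX_PORT=.*/EXPOSE_NGINX_PORT=8080/' .env
docker compose up -d

# Install Caddy and proxy the domain to Dify; Caddy obtains and
# renews the TLS certificate automatically.
apt install -y caddy
cat > /etc/caddy/Caddyfile <<'EOF'
dify.example.com {
    reverse_proxy localhost:8080
}
EOF
systemctl reload caddy
```

After the reload, https://dify.example.com serves the Dify UI with a valid certificate.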
Dify ships with Postgres, Redis, Weaviate (or pgvector), and a worker. The compose file handles all of it.
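Once `docker compose up -d` returns, you can sanity-check the stack from dify/docker. The service names below match recent Dify releases but may vary by version:

```shell
# List the containers — expect api, worker, web, db, redis,
# weaviate (or your chosen vector store), sandbox, ssrf_proxy
# and nginx, all "Up" or "healthy".
docker compose ps

# The bundled nginx should answer on port 80 once the API is ready.
curl -sI http://localhost/ | head -n 1
```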
Recommended specs
VPS-150 plan: single user, small RAG corpus, occasional traffic.
VPS-200 plan: small team, growing knowledge base, customer-facing chatbot.
VPS-250 plan or VDS: high traffic, large vector DB, multiple apps and workspaces.
Frequently asked questions
What is Dify?
Dify is an open-source LLM application development platform. You build chatbots, agents, and RAG pipelines with a visual editor, then expose them as APIs or embeddable web apps. In 2026 it is one of the most popular self-hosted alternatives to Voiceflow, Botpress, and OpenAI custom GPTs.
How much VPS do I need?
Dify ships several services in one Docker Compose stack (API, web, worker, Postgres, Redis, vector DB). 4 GB RAM is the practical minimum. For a real customer-facing app, 8 GB is the sweet spot.
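On the 4 GB minimum, a small swap file is cheap insurance against out-of-memory kills during document indexing spikes. A minimal sketch for Ubuntu:

```shell
# Create and enable a 2 GB swap file (one-time setup, as root).
fallocate -l 2G /swapfile
chmod 600 /swapfile
mkswap /swapfile
swapon /swapfile

# Persist it across reboots.
echo '/swapfile none swap sw 0 0' >> /etc/fstab
```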
Can I use my own LLM API keys?
Yes. Dify supports OpenAI, Anthropic, OpenRouter, Cohere, Hugging Face, Ollama, and self-hosted OpenAI-compatible endpoints. You manage keys in the admin panel.
Will my data stay private?
Yes, if you self-host. Dify Cloud sees your prompts and uploads; the self-hosted version stores everything in Postgres and the vector DB on your own VPS. Servury runs zero application-level logging.
Can I run a local LLM with Dify on the same server?
Possible but only on heavy specs. Combine Dify (~4 GB) with Ollama running a small model (~8 GB for a 7B quantized model) on a 16 GB+ plan. For larger models, run Ollama on a separate beefier server and point Dify at it.
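A sketch of the same-server setup, assuming a 16 GB+ plan. The model name and the host-gateway detail are assumptions; check the Dify and Ollama docs for your versions:

```shell
# Install Ollama on the host and pull a small quantized model.
curl -fsSL https://ollama.com/install.sh | sh
ollama pull llama3.1:8b

# Ollama listens on 127.0.0.1:11434 by default. Dify's containers
# reach the host as host.docker.internal; on Linux that name may
# need an extra_hosts host-gateway entry in the compose file.
# Then add Ollama as a model provider in Dify's settings with
# base URL http://host.docker.internal:11434.
```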
How do I update Dify?
Run:
cd dify/docker
git pull
docker compose pull
docker compose up -d
The Postgres data persists in the named volume. Read the upgrade notes for breaking changes between major versions.
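Before pulling a new version, it is worth snapshotting the database. A sketch assuming the default compose service name (db) and the stock credentials from .env.example (user postgres, database dify):

```shell
# Dump the Dify Postgres database to a dated file on the host.
cd dify/docker
docker compose exec -T db pg_dump -U postgres dify \
  > "dify-backup-$(date +%F).sql"
```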
Can I pay with crypto?
Yes. XMR, BTC, LTC, ETH, USDT, USDC and more. No KYC, no email, no name on file.
Where are servers located?
Montreal (owned hardware), New York, London, Paris, Frankfurt, Netherlands, and Singapore. Pick whichever is closest to your users.