Self-host LibreChat on your own VPS
LibreChat is the second most-starred self-hosted ChatGPT alternative, with deeper provider support than any competitor: OpenAI, Anthropic, AWS Bedrock, Azure, Google Vertex, Groq, Mistral, OpenRouter, plus any OpenAI-compatible endpoint including Ollama. Hosting it on your own VPS keeps prompts off third-party servers and gives you a single private interface for every LLM you use. Servury is built for it: anonymous signup, crypto payments, full root, no logs.
LibreChat needs at least 4 GB RAM. Starting at $15.59/mo.
Quick start: LibreChat on Servury via Docker
Tested on Ubuntu 24.04. Pick a 4 GB+ plan, deploy it, and SSH in as root.
# 1. Install Docker + Compose plugin
curl -fsSL https://get.docker.com | sh
apt install -y docker-compose-plugin git
# 2. Clone LibreChat and start the stack
git clone https://github.com/danny-avila/LibreChat.git
cd LibreChat
cp .env.example .env
# 3. Edit .env to add your API keys (OpenAI, Anthropic, etc.)
# Set HOST=0.0.0.0 to bind to all interfaces.
nano .env
# 4. Bring up the stack (LibreChat + MongoDB + Meilisearch)
docker compose up -d
# 5. Open http://YOUR_SERVER_IP:3080 to register the first admin
# 6. Recommended: put Caddy or nginx in front for HTTPS
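For step 6, a minimal Caddyfile sketch is often enough; it assumes you have pointed a DNS A record (here the placeholder chat.example.com) at your VPS, and that LibreChat is listening on 3080:

```
# Caddyfile — hedged example; replace chat.example.com with your domain
chat.example.com {
    # Caddy obtains and renews the TLS certificate automatically
    reverse_proxy localhost:3080
}
```

Once the proxy is live, consider firewalling port 3080 so traffic only enters through HTTPS.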
Recommended specs
Just you, cloud LLM backends, light usage: VPS-150 plan
Small team, multi-provider, growing chat history and search index: VPS-200 plan
Many users, large vector index, file uploads at scale: VPS-250 plan or a VDS
Frequently asked questions
What is LibreChat?
LibreChat is an open-source self-hosted ChatGPT alternative that supports more LLM providers than any other chat interface: OpenAI, Anthropic, AWS Bedrock, Azure, Google Vertex AI, Groq, Mistral, OpenRouter, and any OpenAI-compatible endpoint including Ollama. It is the second most-starred self-hosted chat UI after Open WebUI.
How is LibreChat different from Open WebUI?
Both are excellent. LibreChat has wider built-in provider support (especially enterprise ones like Bedrock and Vertex), more advanced multi-user/permission tooling, and richer plugin support. Open WebUI has a simpler UI and cleaner Ollama integration. Pick based on which providers you actually use.
How much VPS do I need?
LibreChat ships several services in one Docker Compose stack (the LibreChat API, MongoDB, and Meilisearch). 4 GB RAM is the practical minimum; 8 GB is the sweet spot for any team use.
Can I use my own API keys?
Yes. LibreChat is designed around bring-your-own-keys. You set them in the .env file (or mark an endpoint's key as user_provided so each user enters their own), and they live on your VPS, not in any third-party cloud.
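As a sketch, the relevant .env lines might look like this (the key values are placeholders):

```
# Provider keys live only in this file on your VPS
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...

# Or set a key to user_provided to make each user enter their own in the UI
# OPENAI_API_KEY=user_provided
```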
Does it support file uploads and document chat?
Yes. Drop in PDFs, images, code files. LibreChat handles ingestion and gives you a chat-with-document experience. Files are stored locally on your VPS.
Can I connect a local Ollama instance?
Yes. Add Ollama as a custom OpenAI-compatible endpoint in the config. Many users run LibreChat + Ollama on the same VPS for fully-local chat.
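A hedged librechat.yaml sketch for that setup (the model name is a placeholder; adjust the config version to match your LibreChat release):

```yaml
version: 1.2.1
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"        # Ollama ignores the key, but the field is required
      baseURL: "http://host.docker.internal:11434/v1"  # Ollama on the same VPS
      models:
        default: ["llama3.1"] # placeholder model name
        fetch: true           # pull the live model list from Ollama
```

On Linux, the LibreChat container may need an extra_hosts entry ("host.docker.internal:host-gateway") in docker-compose so it can reach Ollama running on the host.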
Will my data stay private?
Yes, if you self-host. All conversations, uploads, and search indexes are stored locally on your VPS, and Servury runs zero application-level logging on customer servers.
How do I update LibreChat?
Run git pull inside the LibreChat directory, then docker compose pull and docker compose up -d. Mongo data persists across upgrades in its Docker volume. Always read the release notes for breaking config changes between major versions.
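Spelled out as commands (run on your VPS, assuming the quick-start layout above):

```
cd LibreChat
git pull              # fetch the latest release
docker compose pull   # pull updated images
docker compose up -d  # recreate containers; the Mongo volume persists
```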
Where are servers located?
Montreal (owned hardware), New York, London, Paris, Frankfurt, Netherlands, and Singapore. Pick whichever is closest to your users.