Self-Host LibreTranslate

Eran Goldman-Malka · January 17, 2026

🌐 Why I Self-Host LibreTranslate on DigitalOcean (and How You Can Too)


If you’re building apps with heavy translation needs, per-character pricing can quietly eat your budget, and you’re still sending user data to a third party. Self-hosting LibreTranslate lets you control both cost and privacy by running your own translation engine on a small DigitalOcean Droplet.

1. Picking the Right Droplet (Cost vs. Performance)

LibreTranslate loads full language models into memory and then uses CPU for each translation request. That means RAM and CPU matter more than disk once it’s running.

  • Minimum setup (tests, low traffic): Shared CPU Droplet with 2 GB RAM (around 12 USD/month).
  • Recommended for real usage: 4 GB RAM General Purpose Droplet for snappier responses and multiple concurrent users.
  • OS: Ubuntu 22.04 LTS or 24.04 LTS for long-term security updates.

Real-world note: On a 2 GB instance, startup can feel slow because models need to be downloaded and loaded into memory the first time.

2. One-Command Deployment with Docker Compose

The easiest way to keep LibreTranslate maintainable is to run it in Docker and persist the models on a volume, so they don’t re-download every time.

Install Docker and Docker Compose:

sudo apt update && sudo apt install -y docker.io docker-compose

Create your project folder and Compose file:

mkdir ~/libretranslate && cd ~/libretranslate
vim docker-compose.yml

Paste (updated to enable API keys):

services:
  libretranslate:
    image: libretranslate/libretranslate:latest
    container_name: libretranslate
    restart: unless-stopped
    ports:
      - "5000:5000"
    environment:
      - LT_UPDATE_MODELS=true
      - LT_LOAD_ONLY=en,es,fr,de,it  # Load only what you need to save RAM
      - LT_API_KEYS=true             # Enable API key support for authentication
    volumes:
      - lt-models:/home/libretranslate/.local
volumes:
  lt-models:

Start the stack:

sudo docker-compose up -d
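
The first startup downloads the models you listed in LT_LOAD_ONLY, so give it a few minutes. You can follow along and confirm the API is answering:

# Watch model downloads and wait for the server to report it's listening
sudo docker-compose logs -f libretranslate

# Once it's up, this should return the list of loaded languages
curl http://localhost:5000/languages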

Enable API Tokens: Wait for the first startup (several minutes for model downloads). Then exec into the container and generate a key:

sudo docker-compose exec libretranslate /venv/bin/ltmanage keys add 120

This creates an API key with a 120 requests/minute limit (adjust as needed). List keys with ltmanage keys or remove one with ltmanage keys remove <key>, as shown below. The key is printed to the terminal; save it securely.
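
For day-to-day key management, the same ltmanage tool inside the container handles listing and revocation:

# List existing keys and their per-minute limits
sudo docker-compose exec libretranslate /venv/bin/ltmanage keys

# Revoke a key that's no longer needed
sudo docker-compose exec libretranslate /venv/bin/ltmanage keys remove <key>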

3. Calling Your API From Your App

Once the container is running, your translation endpoint is available at:

  • http://YOUR_DROPLET_IP:5000/translate

A simple Node.js example (with API key):

async function translateToEnglish(text, apiKey) {
  const response = await fetch("http://YOUR_DROPLET_IP:5000/translate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      q: text,
      source: "auto",   // Let LibreTranslate detect the language
      target: "en",
      format: "text",
      api_key: apiKey   // LibreTranslate expects the key as a request field
    })
  });

  if (!response.ok) {
    throw new Error(`Translation failed: ${response.status} ${await response.text()}`);
  }

  const data = await response.json();
  return data.translatedText;
}

LibreTranslate’s API is minimal by design: q, source, target, and optional fields like format and api_key. With LT_API_KEYS=true, include your key in the api_key field (request body or query string) so the request authenticates and picks up that key’s rate limit.
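
A quick way to sanity-check the endpoint and the key from the command line (substitute your Droplet IP and the key you generated):

curl -s -X POST http://YOUR_DROPLET_IP:5000/translate \
  -H "Content-Type: application/json" \
  -d '{"q": "Hola mundo", "source": "auto", "target": "en", "format": "text", "api_key": "YOUR_API_KEY"}'

The response is a small JSON object; the translatedText field holds the translated string.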

4. Securing Your Instance (Don’t Leave Port 5000 Open)

Out of the box, the container publishes port 5000 to the internet. That’s convenient—but not safe.

  • DigitalOcean Firewall: In the Networking → Firewalls panel, restrict inbound access on port 5000 so only your web server’s IP (or VPN) can reach it.
  • API keys: Use per-app keys with rate limits to control access and prevent abuse.
  • Defense-in-depth: Many self-hosters bind containers to 127.0.0.1 and expose them only via a reverse proxy that handles TLS and access control (see the snippet below).
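
To do the loopback binding, change the ports mapping in the Compose file above; with this in place, nothing outside the Droplet can reach port 5000 directly:

    ports:
      - "127.0.0.1:5000:5000"   # Only reachable from the Droplet itself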

If you want to go one step further, you can put Nginx or Caddy in front, add HTTPS, and route everything through https://translate.yourdomain.com.
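
As a minimal sketch (assuming Caddy is installed on the Droplet and translate.yourdomain.com already points at its IP; the hostname is just a placeholder), a Caddyfile like this proxies to the loopback-bound container and obtains the TLS certificate automatically:

# Caddyfile: Caddy fetches and renews the certificate for this hostname on its own
translate.yourdomain.com {
    reverse_proxy 127.0.0.1:5000
}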

5. A Pro Trick: “Pre-English” Filter to Save CPU

For chat-style apps, a surprising amount of user input is already in English. Adding a quick language detection step before you call LibreTranslate can dramatically cut CPU usage.

  • Use a lightweight language detection library (e.g., langdetect / similar) in your backend.
  • If the language is already English with high confidence, skip the translation call entirely.

LibreTranslate exposes a /detect endpoint as well, but doing detection client-side or in your app logic avoids unnecessary load on the Droplet.
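
Here’s a rough sketch of that filter in Node.js. It assumes the franc package for offline language detection (any similar library works) and reuses the translateToEnglish() function from section 3; treat the threshold and package choice as starting points rather than a prescription:

import { franc } from "franc";  // offline language detection, returns ISO 639-3 codes

async function maybeTranslate(text, apiKey) {
  // franc returns "eng" for English and "und" when the text is too short or ambiguous
  const lang = franc(text, { minLength: 10 });
  if (lang === "eng") {
    return text;  // Already English: skip the Droplet round-trip entirely
  }
  return translateToEnglish(text, apiKey);
}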


Need help with installation, scaling, or tech stack decisions? DM me. I’ll be happy to guide you through LibreTranslate deployment or recommend the best self-hosted translation setup for your use case.
