Your homelab is running smoothly. You’ve got Nextcloud for files, Vaultwarden for passwords, Immich for photos, and a dozen other services humming along. But here’s the uncomfortable truth: if your server dies tomorrow, can you recover everything?

Most self-hosters only think about backups after disaster strikes. Don’t be one of them.

This guide covers everything you need to build a bulletproof backup strategy for your homelab — from understanding the 3-2-1 rule to implementing automated backups with tools like Restic, Borg, and rsync.

💡 This article contains affiliate links. If you buy through them, we earn a small commission at no extra cost to you. Learn more.

Why Homelab Backups Are Different

Commercial cloud services have redundant infrastructure, automated backups, and teams of engineers ensuring your data survives. Your homelab? That’s all on you.

Here’s what makes homelab backups unique:

  • You’re responsible for everything — from strategy to execution to testing
  • Cost matters — you can’t just throw money at the problem
  • Complexity varies — from a single Docker host to multi-node Proxmox clusters
  • Downtime is acceptable — you’re not Netflix; brief maintenance windows are fine

The good news? Modern backup tools are incredibly powerful, often free, and designed for exactly this use case.

The 3-2-1 Backup Rule (And Why It Matters)

Every backup strategy should follow the 3-2-1 rule:

  • 3 copies of your data — the original plus two backups
  • 2 different storage types — don’t put all eggs in one basket
  • 1 copy offsite — protects against fire, theft, or catastrophic failure

Real-World Example

Let’s say you’re running Nextcloud:

  1. Original data — on your homelab server’s SSD
  2. Local backup — automated nightly backups to a second internal drive
  3. Offsite backup — weekly encrypted backups to Backblaze B2 or Wasabi

This protects you against:

  • Drive failure (local backup saves you)
  • Server death (local backup still accessible)
  • House fire or theft (offsite backup is your lifeline)

What to Back Up

Not everything needs the same backup strategy. Categorize your data:

Critical Data (Daily backups, offsite copies)

  • Configuration files (docker-compose.yml, .env files)
  • Application databases (Nextcloud, Vaultwarden, etc.)
  • User-generated content (documents, photos)
  • Persistent volumes for Docker containers

Important Data (Weekly backups, local copies)

  • Media libraries (if hard to replace)
  • System configurations
  • VM/CT snapshots

Replaceable Data (Optional backups)

  • Downloaded media (if easily re-downloadable)
  • Cache directories
  • Temporary files

Pro tip: Start by backing up configuration files and databases. You can rebuild almost everything else from those.

Backup Tools Comparison

Let’s look at the most popular tools for homelab backups:

Restic — Best Overall

Pros:

  • Encrypted, deduplicated, compressed backups
  • Supports tons of backends (local, S3, B2, SFTP, etc.)
  • Fast incremental backups
  • Excellent documentation
  • Single binary, easy to install

Cons:

  • Repository needs periodic maintenance (restic prune)
  • No native GUI (but web UIs exist)

Best for: Docker volumes, configuration files, databases

BorgBackup — Maximum Efficiency

Pros:

  • Extremely efficient deduplication (better than Restic)
  • Compression built-in
  • Encrypted backups
  • Stable and mature

Cons:

  • More complex than Restic
  • Limited cloud support (primarily SSH-based)
  • Requires BorgBackup on remote server

Best for: Large datasets, when you control both ends

rsync — Simple and Reliable

Pros:

  • Dead simple
  • Fast incremental transfers
  • Widely available
  • Great for file-based backups

Cons:

  • No deduplication
  • No compression by default
  • No encryption (use with SSH)
  • Not space-efficient for versioned backups

Best for: Quick local mirrors, initial testing

Duplicati — GUI-Friendly

Pros:

  • Web-based GUI
  • Supports many cloud backends
  • Encrypted, compressed backups
  • Scheduling built-in

Cons:

  • Slower than Restic/Borg
  • Database corruption issues reported
  • Less actively maintained

Best for: Users who prefer GUIs over command line
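A minimal compose sketch for trying Duplicati, using the linuxserver.io image (the paths, PUID/PGID, and timezone are assumptions to adjust for your setup):

```yaml
services:
  duplicati:
    image: lscr.io/linuxserver/duplicati:latest
    restart: unless-stopped
    ports:
      - "8200:8200"          # web GUI
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=Europe/Berlin
    volumes:
      - ./duplicati-config:/config
      - /var/lib/docker/volumes:/source:ro   # what to back up
      - /mnt/backup:/backups                 # local backup target
```

Then open http://your-server:8200 and configure jobs, schedules, and cloud backends from the GUI.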

Practical Implementation: Restic Backups

Let’s build a real backup system using Restic and Docker. This example backs up Docker volumes to both local storage and Backblaze B2.

Step 1: Install Restic

# Install Restic
sudo apt update
sudo apt install restic -y

# Verify installation
restic version

Step 2: Initialize Repositories

# Local repository
export RESTIC_PASSWORD="your-secure-password-here"
restic init --repo /mnt/backup/restic-repo

# Backblaze B2 repository
export B2_ACCOUNT_ID="your-b2-account-id"
export B2_ACCOUNT_KEY="your-b2-account-key"
restic init --repo b2:your-bucket-name:/restic-repo

Step 3: Create Backup Script

Create /usr/local/bin/backup-homelab.sh:

#!/bin/bash
set -euo pipefail  # pipefail: don't let tee mask a failed restic command

# Configuration
# Tip: a root-only password file (RESTIC_PASSWORD_FILE) is safer than hardcoding
RESTIC_PASSWORD="your-secure-password-here"
LOCAL_REPO="/mnt/backup/restic-repo"
REMOTE_REPO="b2:your-bucket-name:/restic-repo"
DOCKER_VOLUMES="/var/lib/docker/volumes"
CONFIG_DIR="/opt/docker-configs"
LOG_FILE="/var/log/restic-backup.log"

# Backblaze B2 credentials
export B2_ACCOUNT_ID="your-b2-account-id"
export B2_ACCOUNT_KEY="your-b2-account-key"
export RESTIC_PASSWORD

echo "=== Backup started at $(date) ===" | tee -a "$LOG_FILE"

# Backup to local repository
echo "Backing up to local repository..." | tee -a "$LOG_FILE"
restic backup \
  --repo "$LOCAL_REPO" \
  --exclude-caches \
  --exclude="*.tmp" \
  --tag "homelab" \
  "$DOCKER_VOLUMES" \
  "$CONFIG_DIR" 2>&1 | tee -a "$LOG_FILE"

# Backup to remote repository (every 7th backup)
# date +%j zero-pads (e.g. "008"), which bash would reject as invalid octal; force base-10
DAY_OF_YEAR=$((10#$(date +%j)))
if [ $((DAY_OF_YEAR % 7)) -eq 0 ]; then
  echo "Backing up to remote repository..." | tee -a "$LOG_FILE"
  restic backup \
    --repo "$REMOTE_REPO" \
    --exclude-caches \
    --exclude="*.tmp" \
    --tag "homelab" \
    "$DOCKER_VOLUMES" \
    "$CONFIG_DIR" 2>&1 | tee -a "$LOG_FILE"
fi

# Cleanup old snapshots (keep last 30 daily, 12 weekly, 6 monthly)
echo "Cleaning up old snapshots..." | tee -a "$LOG_FILE"
restic forget \
  --repo "$LOCAL_REPO" \
  --keep-daily 30 \
  --keep-weekly 12 \
  --keep-monthly 6 \
  --prune 2>&1 | tee -a "$LOG_FILE"

echo "=== Backup completed at $(date) ===" | tee -a "$LOG_FILE"

Make it executable:

chmod +x /usr/local/bin/backup-homelab.sh

Step 4: Automate with Cron

# Edit crontab
crontab -e

# Add this line for daily backups at 2 AM
0 2 * * * /usr/local/bin/backup-homelab.sh
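If you prefer systemd timers over cron (they log to the journal, and `Persistent=true` catches up on runs missed while the machine was off), an equivalent setup looks like this. The unit names are illustrative:

```ini
# /etc/systemd/system/backup-homelab.service
[Unit]
Description=Homelab Restic backup

[Service]
Type=oneshot
ExecStart=/usr/local/bin/backup-homelab.sh

# /etc/systemd/system/backup-homelab.timer
[Unit]
Description=Run homelab backup daily at 2 AM

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with `systemctl enable --now backup-homelab.timer`, and check upcoming runs with `systemctl list-timers`.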

Step 5: Test Your Backups

Always test your backups! Here’s how:

# List snapshots
restic snapshots --repo /mnt/backup/restic-repo

# Restore a specific file
restic restore latest \
  --repo /mnt/backup/restic-repo \
  --target /tmp/restore-test \
  --include /var/lib/docker/volumes/nextcloud_data

# Mount repository for browsing (requires FUSE; browse from a second terminal)
mkdir -p /mnt/restic-mount
restic mount /mnt/restic-mount --repo /mnt/backup/restic-repo
# Browse files, then unmount
umount /mnt/restic-mount

Docker-Compose Backup Solution

For a containerized approach, here’s a complete docker-compose setup with automated backups:

version: '3.8'

services:
  # Your application (example: Nextcloud)
  nextcloud:
    image: nextcloud:latest
    restart: unless-stopped
    volumes:
      - nextcloud_data:/var/www/html
    environment:
      - MYSQL_HOST=db
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_PASSWORD=secure_password
    depends_on:
      - db

  db:
    image: mariadb:latest
    restart: unless-stopped
    volumes:
      - nextcloud_db_data:/var/lib/mysql
    environment:
      - MYSQL_ROOT_PASSWORD=root_password
      - MYSQL_DATABASE=nextcloud
      - MYSQL_USER=nextcloud
      - MYSQL_PASSWORD=secure_password

  # Automated backup container
  restic-backup:
    image: mazzolino/restic:latest
    restart: unless-stopped
    hostname: docker-host
    volumes:
      - nextcloud_data:/data/nextcloud_data:ro
      # Note: raw MariaDB files are only crash-consistent; dump the database
      # first (see "Database Backups" below) and back up the dump instead
      - nextcloud_db_data:/data/nextcloud_db:ro
      - /opt/docker-configs:/data/configs:ro
      - ./restic-cache:/cache
    environment:
      - BACKUP_CRON=0 2 * * *
      - RESTIC_REPOSITORY=b2:your-bucket:/backups
      - RESTIC_PASSWORD=your-restic-password
      - B2_ACCOUNT_ID=your-b2-account-id
      - B2_ACCOUNT_KEY=your-b2-account-key
      - RESTIC_FORGET_ARGS=--keep-daily 7 --keep-weekly 4 --keep-monthly 6
      - TZ=Europe/Berlin

volumes:
  nextcloud_data:
  nextcloud_db_data:

This setup:

  • Runs backups daily at 2 AM
  • Backs up application data and database volumes
  • Automatically prunes old snapshots
  • Sends encrypted backups to Backblaze B2

Database Backups (The Right Way)

Never back up database files directly while the database is running. Use proper dump tools:

PostgreSQL

#!/bin/bash
# /usr/local/bin/backup-postgres.sh

docker exec postgres-container pg_dumpall -U postgres | \
  gzip > /mnt/backup/postgres-$(date +%Y%m%d).sql.gz

# Keep only last 14 days
find /mnt/backup -name "postgres-*.sql.gz" -mtime +14 -delete

MySQL/MariaDB

#!/bin/bash
# /usr/local/bin/backup-mysql.sh

# MYSQL_ROOT_PASSWORD must be set in the host environment (e.g. source your .env first)
docker exec mysql-container mysqldump -u root -p"$MYSQL_ROOT_PASSWORD" \
  --all-databases | gzip > /mnt/backup/mysql-$(date +%Y%m%d).sql.gz

# Keep only last 14 days
find /mnt/backup -name "mysql-*.sql.gz" -mtime +14 -delete

Add these to your Restic backup script or run them before your main backup.

Backing Up Docker Configurations

Your docker-compose.yml files and .env files are critical. Back them up separately:

#!/bin/bash
# /usr/local/bin/backup-docker-configs.sh

BACKUP_DIR="/mnt/backup/docker-configs"
TIMESTAMP=$(date +%Y%m%d-%H%M%S)

mkdir -p "$BACKUP_DIR"

# Tar up all docker-compose directories
tar czf "$BACKUP_DIR/docker-configs-$TIMESTAMP.tar.gz" \
  /opt/docker-configs \
  /root/.env

# Keep only last 30 backups
ls -t "$BACKUP_DIR"/docker-configs-*.tar.gz | tail -n +31 | xargs -r rm

Even better, use Git:

cd /opt/docker-configs
git init
git add .
git commit -m "Daily backup $(date +%Y-%m-%d)"
git push origin main  # Push to private GitHub/Gitea repo
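One caveat: .env files usually contain passwords and API keys. Either keep the repo strictly private, or exclude secrets from Git and let your encrypted Restic backups carry them instead:

```gitignore
# .gitignore — keep secrets out of the repo
.env
*.env
secrets/
```

With this in place, the Git history covers your compose files while credentials only ever leave the server encrypted.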

Offsite Backup Options

Cloud Storage

Backblaze B2 — $6/TB/month, free egress with Cloudflare

  • Best for: Most homelabs
  • Restic/Borg support: Excellent
  • Cost: Very competitive

Wasabi — $6.99/TB/month, free egress

  • Best for: Larger datasets
  • Restic/Borg support: Excellent via S3 API
  • Cost: Predictable, no egress fees

AWS S3 Glacier Deep Archive — $0.99/TB/month

  • Best for: Rarely-accessed archives
  • Restic/Borg support: Limited — both need to read repository metadata on demand, which cold tiers don't allow without first restoring objects
  • Cost: Cheap storage, expensive retrieval

Self-Hosted Offsite

Parents’ house / friend’s server:

# Set up SSH keys first
ssh-keygen -t ed25519
ssh-copy-id user@remote-server

# Initialize the repo once: borg init --encryption=repokey user@remote-server:/mnt/backup/borg-repo
# Then back up via SSH with Borg
borg create \
  user@remote-server:/mnt/backup/borg-repo::$(date +%Y%m%d) \
  /var/lib/docker/volumes \
  /opt/docker-configs

VPS as backup target:

  • Cheap VPS: ~$5/month for 100GB
  • Install BorgBackup or set up SFTP
  • Encrypt everything before sending

Testing Your Backups (Critical!)

A backup you haven’t tested is Schrödinger’s backup — it both exists and doesn’t exist until you verify it.

Monthly Backup Test Routine

  1. List snapshots:

     restic snapshots --repo /mnt/backup/restic-repo

  2. Restore a random file:

     restic restore latest --repo /mnt/backup/restic-repo \
       --target /tmp/restore-test \
       --include /path/to/important/file

  3. Verify database backups:

     gunzip < /mnt/backup/postgres-20260207.sql.gz | head -n 50

  4. Full recovery drill (quarterly):

     • Spin up a test VM
     • Restore from backup
     • Verify all services start
     • Check data integrity
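Beyond restoring sample files, Restic can verify the repository itself. `restic check` validates the repository structure and index, and `--read-data-subset` re-reads and cryptographically verifies a sample of the stored data without downloading everything (useful for remote repos):

```shell
# Verify repository structure and index consistency
restic check --repo /mnt/backup/restic-repo

# Also re-read and verify 10% of the stored data blobs
restic check --repo /mnt/backup/restic-repo --read-data-subset=10%
```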

Monitoring and Alerting

Set up notifications for backup failures:

Using Healthchecks.io

#!/bin/bash
# Wrap your backup script with Healthchecks.io start/success/fail pings

HEALTHCHECK_URL="https://hc-ping.com/your-unique-uuid"

# At the start
curl -fsS --retry 3 "$HEALTHCHECK_URL/start"

# Your backup commands here
/usr/local/bin/backup-homelab.sh

# On success
if [ $? -eq 0 ]; then
  curl -fsS --retry 3 "$HEALTHCHECK_URL"
else
  curl -fsS --retry 3 "$HEALTHCHECK_URL/fail"
fi

Using Uptime Kuma

If you’re running Uptime Kuma (see our upcoming guide), add a Push monitor for your backup script.
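Push monitors work in reverse: Kuma gives you a unique URL, your script pings it, and you get alerted when pings stop arriving. A small helper to build the ping URL (the base URL and token below are placeholders — Kuma shows you the exact URL when you create the monitor):

```shell
#!/bin/bash
# Build a ping URL for an Uptime Kuma push monitor.
# status is "up" or "down"; msg shows up in the Kuma dashboard.
kuma_push_url() {
  local base="$1" token="$2" status="$3" msg="$4"
  printf '%s/api/push/%s?status=%s&msg=%s' "$base" "$token" "$status" "$msg"
}

# At the end of your backup script:
#   curl -fsS "$(kuma_push_url https://kuma.example.com YOUR_TOKEN up backup-ok)"
```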

Common Mistakes to Avoid

  1. Backing up to the same physical drive — if the drive dies, both original and backup are gone
  2. Never testing restores — you don’t have backups, you have backup attempts
  3. Ignoring encryption — offsite backups without encryption are a privacy nightmare
  4. Backing up while services are running — databases especially need proper dumps
  5. No monitoring — you won’t know backups are failing until you need them
  6. Keeping backups in the same location — fire/flood/theft takes everything

Your Backup Strategy Checklist

  • Identified critical data (configs, databases, user data)
  • Chosen backup tool (Restic recommended)
  • Set up local backup repository
  • Set up offsite backup repository
  • Created automated backup scripts
  • Configured cron/systemd timers
  • Tested restore procedure
  • Set up monitoring/alerts
  • Documented recovery process
  • Scheduled quarterly full recovery drills

Conclusion

Homelab backups aren’t optional. They’re the difference between a minor inconvenience and losing months or years of work.

Start simple:

  1. Back up your docker-compose files to Git (takes 5 minutes)
  2. Set up Restic for local backups (takes 30 minutes)
  3. Add offsite backups to Backblaze B2 (takes 15 minutes)
  4. Test a restore (takes 10 minutes)

One hour of work today saves you from devastating data loss tomorrow.

Your future self will thank you.


What’s your backup strategy? Are you using Restic, Borg, or something else? Let us know in the comments!

Next up: We’ll cover setting up Uptime Kuma for comprehensive homelab monitoring — including backup monitoring.