
Homeserver

I always dreamed of having my own homeserver for all kinds of stuff. I'm happy with my current setup as it was fairly cheap and runs like a dream. However, I'm still not sure what the best practices are for a setup like mine, so I'm just fiddling around until I'm happy (enough). I am eager to hear suggestions for improvements!

But now let me show you how everything is set up.

General Purpose

As of now my homeserver is mostly used for torrenting and related media management with tools such as Sonarr, Radarr, ytdl-sub and some more. In the future I plan on adding my own cloud with Nextcloud and paperless-ngx for document management, but I'm taking it step by step. In general I try to dockerize as much as possible so that I can easily restore everything should my server burn down.

Overview

To give you a general idea of what this server does, I created a high-level diagram. Please note that this diagram doesn't include Prowlarr for simplicity's sake. I will describe that in more detail below.

[Diagram: Overview of the setup. The user adds movies/TV shows via the Radarr/Sonarr web interface; qBittorrent downloads and uploads media with its traffic routed through the gluetun VPN container, storing files in the download folder; Radarr/Sonarr move and rename finished downloads into the media folder; the Plex server watches the media folder, provides a Netflix-like interface and streams media to Plex clients (Android, iOS, Smart TV, Web).]

The flow is the following (when I mention Radarr, this also includes Sonarr, as it works basically the same way):

  1. User visits Radarr and adds a Movie/TV Show via the web interface
  2. Radarr searches the connected torrent trackers for the media in question & adds it to qBittorrent
  3. When qBittorrent finishes the download it notifies Radarr
  4. Radarr then moves* the file to the media folder & renames it
  5. Plex continuously monitors the media folders for new files
  6. User streams the file via a Plex client installed on their TV, smartphone or other devices

* = it actually hardlinks the file in order to not take up twice the hard drive space
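To see why a hardlink costs no extra space, here is a minimal sketch you can run in any shell. The file names are made up for the demo; on the real server, Radarr does this step for you:

```shell
#!/usr/bin/env bash
# Minimal hardlink demo - two names, one file on disk.
tmp=$(mktemp -d)

echo "fake movie data" > "$tmp/download.mkv"
ln "$tmp/download.mkv" "$tmp/media.mkv"   # hardlink, not a copy

# Same inode, link count 2 -> the data exists only once:
stat -c 'inode=%i links=%h' "$tmp/download.mkv"
stat -c 'inode=%i links=%h' "$tmp/media.mkv"

rm -rf "$tmp"
```

Deleting one name keeps the data alive as long as the other name exists, so qBittorrent can keep seeding after Radarr "moves" the file. Note that hardlinks only work within a single filesystem, which is one reason the downloads and media folders both live under /mnt/storage1.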

Hardware

The hardware is a used, small form factor Fujitsu Esprimo D757 Office PC with the following specs:

  • Intel Core i3 6th Gen
  • 8GB RAM
  • 250 GB SSD

I got this one off eBay for a whopping 55€. I added a 12 TB HDD, the Seagate Exos X16 12TB, which cost me 155€. Both of these work beautifully, but the HDD is loud as fuck. Luckily I have a tiny storage room where the server finds its place. Keeping this in your living room or bedroom wouldn't be possible.

Power Usage

The power usage was around 10W - 20W idle with the HDD connected. I honestly don't remember the specific number because I borrowed a power meter from a friend of mine quite some time ago. I do remember that the number was so small that I decided not to give a shit about it.

OS

I only had a little bit of experience with Linux before starting this project, which is why I went with Ubuntu Desktop 24.04.1 LTS. So far this has worked perfectly. I haven't had any issues with the OS yet.

Software

Docker Engine

At first I went with Docker Desktop but quickly realized that I don't need the Docker GUI - especially after switching to Portainer. I'm not completely against using GUIs, but when it comes to servers I try to avoid them where I can. With Portainer I can manage the containers on my machine through a nice web interface.

Rclone

Rclone is a command-line tool to copy, sync and upload files across different cloud storage providers (and your local machine, obviously). I use it to back up my configuration data to OneDrive.

Folder Structure

My folder structure is quite simple.

.
├── /home/cle/homeserver/
│   ├── docker/
│   │   ├── appdata       # all Docker application data/volumes
│   │   └── compositions  # docker compose files
│   └── scripts           # contains backup script
└── /mnt/storage1/data/
    ├── downloads/
    │   ├── movies        # qBittorrent downloaded movies
    │   ├── tv            # qBittorrent downloaded tv shows
    │   └── other         # qBittorrent other stuff I downloaded manually
    └── media/
        ├── movies        # cleanly renamed movies
        ├── tv            # cleanly renamed tv shows
        └── other         # other stuff I put there manually if necessary

The comments should explain everything. In the following section Docker Containers I will go over the different dockerized applications and their respective docker compose files. You will see that each application only has access to the directory or directories it needs. Aside from /docker/appdata, the download client for example only needs access to /mnt/storage1/data/downloads.
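If you want to recreate this layout, a few mkdir -p calls are enough. A small sketch - DATA_ROOT stands in for /mnt/storage1/data so the commands are safe to try anywhere:

```shell
#!/usr/bin/env bash
# Recreate the data layout from the tree above.
# DATA_ROOT is a stand-in for /mnt/storage1/data.
DATA_ROOT="${DATA_ROOT:-$(mktemp -d)}"

mkdir -p "$DATA_ROOT"/downloads/{movies,tv,other}
mkdir -p "$DATA_ROOT"/media/{movies,tv,other}

find "$DATA_ROOT" -type d | sort
```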

Docker Containers

In general I wanted to keep all applications on my homeserver dockerized. Should my server ever burn down or explode, I hope to recreate the system quickly this way from the docker compose files. After fiddling around with Docker Desktop for a while, I'm now running Docker Engine directly. For managing my containers I'm running Portainer, which I started with the following command:

docker run -d \
  -p 8000:8000 \
  -p 9443:9443 \
  --name portainer \
  --restart=always \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v portainer_data:/data \
  portainer/portainer-ce:latest

After that, Portainer is accessible in the web browser via https://<server-ip>:9443

If you are not familiar with Portainer: in essence, you create so-called stacks, which are docker compose files. Stacks are added via the web interface. You simply take your docker compose files, add them there, put in environment variables (or load them from a file) and voilà - your containers can now be managed with Portainer.

In Portainer I currently have two stacks:

Watchtower

Contains only the Watchtower container, which updates all my other docker containers if there is a newer image available - neat!
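For reference, a Watchtower stack can be as small as this - a rough sketch of what such a stack looks like (image name from the official Watchtower project; the socket mount is what lets it inspect and restart other containers):

```yaml
services:
  watchtower:
    image: containrrr/watchtower:latest
    container_name: watchtower
    volumes:
      # Watchtower talks to the Docker daemon through the socket
      - /var/run/docker.sock:/var/run/docker.sock
    restart: unless-stopped
```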

Media

All docker containers that are used for media downloading, processing and consumption. qBittorrent, Sonarr, Radarr, Plex and so on.

Media Stack

In the following I will go over the services in my media stack. You can find the full docker compose files, exactly as I have them in Portainer, in the git repository linked at the bottom of this page.

General

File Access: Each container has a bind mount at docker/appdata/<service-name> that persists its configuration and application data. By mounting these directories into each container's /config path, all service configurations are stored in a central location for easy backup and management.

Aside from that containers are given selective access to /mnt/storage1/ through bind mounts.

qBittorrent - Torrent Client

Most importantly we need a torrent client to do the actual downloading of media. Here is the part of the docker compose for qBittorrent:

  qbittorrent:
    image: lscr.io/linuxserver/qbittorrent:latest
    container_name: qbittorrent
    network_mode: "container:gluetun"
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=${TZ}
      - WEBUI_PORT=8080
    volumes:
      - ${DOCKER_CONFIG_DIR}/qbittorrent:/config
      - /mnt/storage1/data/downloads:/data/downloads
    restart: unless-stopped

User Interface: After running this via docker compose or, in my case, Portainer, qBittorrent is available at http://<server-ip>:8080, with the port specified via WEBUI_PORT.

File Access: qBittorrent only gets access to the downloads folder - self-explanatory, since this is the only folder it really needs.

Networking: network_mode: "container:gluetun" tells qBittorrent to route all its traffic through the gluetun VPN container. If the gluetun container doesn't exist or is stopped then qBittorrent has no internet access - exactly what we want.

gluetun - VPN Client

gluetun is a lightweight VPN client. Setting this up took me a bit, but it now works without a hiccup.

  gluetun:
    container_name: gluetun
    image: qmcgaw/gluetun
    cap_add:
      - NET_ADMIN
    devices:
      - /dev/net/tun
    volumes:
      - ${DOCKER_CONFIG_DIR}/gluetun:/gluetun
    ports:
      # expose qBittorrent's web interface
      - "8080:8080"
    environment:
      - VPN_SERVICE_PROVIDER=airvpn
      - VPN_TYPE=wireguard
      - WIREGUARD_PRIVATE_KEY=${WIREGUARD_PRIVATE_KEY}
      - WIREGUARD_PRESHARED_KEY=${WIREGUARD_PRESHARED_KEY}
      - WIREGUARD_ADDRESSES=${WIREGUARD_ADDRESSES}
      - SERVER_COUNTRIES=${SERVER_COUNTRIES}
      - FIREWALL_VPN_INPUT_PORTS=${FIREWALL_VPN_INPUT_PORTS}
      - HEALTH_VPN_DURATION_INITIAL=120s

An important aspect is to expose qBittorrent's port 8080 to make it accessible via web browser. The environment variables WIREGUARD_PRIVATE_KEY, WIREGUARD_PRESHARED_KEY, WIREGUARD_ADDRESSES, SERVER_COUNTRIES and FIREWALL_VPN_INPUT_PORTS were generated on my VPN provider's website. FIREWALL_VPN_INPUT_PORTS is the forwarded port, which I also need to set as the Listening Port in the qBittorrent connection options. Port forwarding is really important for getting good download speeds, since it allows other users to connect to your client.

Aside from that I followed the guide from my VPN provider which can be found here.

Radarr - Movie Downloading & Management

With qBittorrent running behind a VPN I can already download stuff from the internet. Quite nice but I want the fully automatic experience. This is where Radarr & Sonarr come into play. Radarr automates the downloading, renaming, copying & moving of movies and provides a nice web interface. Sonarr does the same but for TV Shows. Both applications monitor qBittorrent through its API and wait for downloads to complete. When a download finishes, they create hardlinks instead of copying the files to the media folder. This means the same file exists in both the downloads and media folder without taking up additional space. This way qBittorrent can continue seeding while Plex can stream the properly named version from the media folder.

  radarr:
    image: lscr.io/linuxserver/radarr:latest
    container_name: radarr
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=${TZ}
    volumes:
      - ${DOCKER_CONFIG_DIR}/radarr:/config
      - /mnt/storage1/data:/data
    ports:
      - 7878:7878
    restart: unless-stopped

User Interface: Port 7878 is exposed so the Radarr web interface can be reached at http://<server-ip>:7878.

File Access: Radarr gets access to the whole data folder on my hard drive. That's because it needs to move (or rather hardlink) files from the downloads to the media folders.

Plex Server - Media Streaming

If you paid attention, you already know that Plex is the application that streams the files from the media folders to the end user.

  plex:
    image: lscr.io/linuxserver/plex:latest
    container_name: plex
    ports:
      - 32400:32400/tcp
      - 3005:3005/tcp
      - 8324:8324/tcp
      - 32469:32469/tcp
      - 1900:1900/udp
      - 32410:32410/udp
      - 32412:32412/udp
      - 32413:32413/udp
      - 32414:32414/udp
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=${TZ}
      - VERSION=docker
    volumes:
      - ${DOCKER_CONFIG_DIR}/plex:/config
      - /mnt/storage1/data/media:/media
    restart: unless-stopped

User Interface: The Plex web interface can be accessed at http://<server-ip>:32400/web. You'll need to sign in with your Plex account to manage your server and access your media library. The additional ports are used for various Plex features like service discovery, streaming, and remote access.

File Access: Plex only gets access to the media folder as it doesn't need to know about the downloads - just the organized media files that are ready to stream.

Prowlarr

Prowlarr is like a search engine manager that helps Radarr and Sonarr find media across different torrent sites. Instead of configuring indexers in each application separately, Prowlarr manages them centrally and syncs them to your *arr services. So instead of having to manually add your torrent sites in both Radarr & Sonarr, this only needs to be done once at Prowlarr.

  prowlarr:
    image: lscr.io/linuxserver/prowlarr:latest
    container_name: prowlarr
    environment:
      - PUID=1000
      - PGID=1000
      - TZ=${TZ}
    volumes:
      - ${DOCKER_CONFIG_DIR}/prowlarr:/config
    ports:
      - 9696:9696
    restart: unless-stopped

User Interface: You guessed it: Prowlarr can be reached via http://<server-ip>:9696.

Backups

Figuring out a backup strategy wasn't that easy because there are many tools available. For the time being I am not backing up my 12TB hard drive. Should my server explode, I would have to redownload my media. For backing up the whole docker setup I wrote a simple script that uses Rclone to copy backups to OneDrive. As a bonus, I configured Ubuntu's built-in backup service to also create backups and store them in OneDrive. The built-in service was configured using the GUI. For backing up with Rclone, here is the bash script, which you can also find in my GitLab repository:

#!/bin/bash

# Configuration
BWLIMIT="5M"
DOCKER_LOCAL_PATH="/home/cle/homeserver/docker"
DOCKER_REMOTE_PATH="onedrive:/backup/homeserver/docker"

# Stop all containers at once and wait for them (excluding Plex)
docker stop -t 60 gluetun prowlarr radarr sonarr qbittorrent sftp

# Backup each important folder separately

# compositions
rclone copy "${DOCKER_LOCAL_PATH}/compositions" "${DOCKER_REMOTE_PATH}/compositions" \
  --exclude ".env" \
  -P --bwlimit "$BWLIMIT"

# plex
rclone copy "${DOCKER_LOCAL_PATH}/appdata/plex/backup" \
  "${DOCKER_REMOTE_PATH}/appdata/plex/backup" \
  -P --bwlimit "$BWLIMIT"

# prowlarr
rclone copy "${DOCKER_LOCAL_PATH}/appdata/prowlarr/Backups" "${DOCKER_REMOTE_PATH}/appdata/prowlarr/backups" \
  -P --bwlimit "$BWLIMIT"

# radarr
rclone copy "${DOCKER_LOCAL_PATH}/appdata/radarr/Backups" "${DOCKER_REMOTE_PATH}/appdata/radarr/backups" \
  -P --bwlimit "$BWLIMIT"

# sonarr
rclone copy "${DOCKER_LOCAL_PATH}/appdata/sonarr/Backups" "${DOCKER_REMOTE_PATH}/appdata/sonarr/backups" \
  -P --bwlimit "$BWLIMIT"

# qbittorrent
rclone copy "${DOCKER_LOCAL_PATH}/appdata/qbittorrent" "${DOCKER_REMOTE_PATH}/appdata/qbittorrent" \
  -P --bwlimit "$BWLIMIT"

# gluetun
rclone copy "${DOCKER_LOCAL_PATH}/appdata/gluetun" "${DOCKER_REMOTE_PATH}/appdata/gluetun" \
  -P --bwlimit "$BWLIMIT"

# Start containers back up in proper order
docker start gluetun
sleep 20
docker start prowlarr
docker start radarr
docker start sonarr
docker start sftp
docker start qbittorrent

As you can see I'm only backing up specific backup folders from the *arr applications. Within Radarr, Sonarr & Prowlarr you can specify a location where they should store their automatic backups; same goes for Plex. Before the backup starts, I stop all containers (except for Plex) to avoid any inconsistencies in the backup data. For qBittorrent and gluetun I'm simply backing up their whole volume since they don't have a built-in backup feature. This way, I can restore the exact state of all my containers if something goes wrong.

The script runs as a daily cron job during the night when the server isn't busy downloading or streaming anything.
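Such a job is added via crontab -e; the entry looks roughly like this (the script and log file names here are placeholders, and 3 AM is just an example of a quiet hour):

```
# m h  dom mon dow   command
0 3 * * * /home/cle/homeserver/scripts/backup.sh >> /home/cle/homeserver/scripts/backup.log 2>&1
```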

Connecting to my Homeserver

VPN

Since I don't want to expose my homeserver to the internet (aside from Plex), I'm simply using the built-in VPN service of my router should I ever need to access a service from outside my network.

SSH

If I want to change stuff on my homeserver, I usually use SSH. Combining SSH with the SSH extension of my IDE VS Code, I can comfortably browse my homeserver's folder structure just like any other local project. I can also execute commands from within VS Code - neat! See the References below for a link to the guide; it's very simple.
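A host entry in ~/.ssh/config keeps the connection one command away (the IP below is a made-up example for a typical LAN address; adjust it to your server):

```
Host homeserver
    HostName 192.168.1.50   # hypothetical LAN IP of the server
    User cle
```

After that, `ssh homeserver` works from the terminal, and VS Code's SSH integration can pick up hosts from the same file.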

Samba

Sometimes I want to access the files on my harddrive directly from my other PCs. Samba is perfect for that and can be configured easily. I won't go into detail but after installing Samba you can simply edit the file at /etc/samba/smb.conf where I added the following:

[mnt]
path = /mnt/
browseable = yes
read only = no
valid users = cle
write list = cle

This creates a Samba share that is only accessible with my own user account. Samba works perfectly for accessing files within the network from Linux, Windows & macOS - nice!

References