I’ve only ever used desktop Linux and don’t have server admin experience (unless you count hosting Minecraft servers on my personal machine lol). Currently using Artix and Void for my desktop computers as I’ve grown fond of runit.
I’m going to get a VPS for some personal projects and am at the point of deciding what distro I want to use. While I imagine that systemd is generally the best fit for servers due to its far more widespread support (and therefore better for a server’s stability needs), I have a somewhat higher threat model than most people, so I was wondering if maybe I should use something like runit instead, which is much smaller and presents less attack surface. Security needs are also the reason why I’m leaning away from using something like Debian, because how outdated the packages are would likely leave me open to vulnerabilities. Correct me if I’m misunderstanding any of that though.
Other than that I’m not sure what considerations there are to make for my server distro. Maybe a more mainstream distro would be more likely to have the software in its repos that I need to host my various projects. On the other hand, I don’t have any experience with, say, Fedora, and it’d probably be a lot easier for me to stick to something I know.
In terms of what I want to do with the VPS, it’ll be more general-purpose and hosting a few different projects. Currently thinking of hosting a Matrix instance, a Mastodon instance, a NextCloud instance, an SMTP server, and a light website, but I’m sure I’ll want to stick more miscellaneous stuff on there too.
So what distro do you use for your server hosting? What things should I consider when picking a distro?
I love Debian for servers. Super stable. No surprises. It just works. And millions of other people use it as well in case I need to look something up.
And even when I’m lazy and don’t update to the latest release oldstable will be supported for years and years.
@bjoern_tantau @communism That ‘support for years and years’ means security support. So even though the nominal version numbers stay the same, security fixes are backported. Security scans that only check versions usually give false positives: they think fixes from newer versions are not present when in fact they are.
Many other distros do exactly the same. I only chose Debian because the amount of software already packaged in the distro itself is bigger than in any other, barring 3rd-party repos.
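If you ever want to convince yourself (or a scanner) that a backported fix is actually present, the package changelog is the place to look. A quick sketch of how I’d check, using openssl purely as an example package:

```
# Installed version; Debian backports carry a +debXuY revision suffix
dpkg -s openssl | grep '^Version'

# The changelog lists which CVEs each Debian revision fixed,
# even though the upstream version number never changed
apt changelog openssl | grep -i cve | head

# The authoritative source is the Debian security tracker:
#   https://security-tracker.debian.org/tracker/source-package/openssl
```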
Debian
I run NixOS. It (or something like it, with a central declarative configuration for basically everything on the system) is imo the ideal server distro.
@communism Debian is an easy pick, but sometimes I’ll do Alpine. Generally, it’s all in containers anyway, so it doesn’t really matter.
Debian, with a Kubernetes cluster on top running a bunch of Debian & Alpine containers. Never ever Ubuntu.
Never ever Ubuntu
Why’s that?
Because Ubuntu is the worst of both worlds. Its packages are both old and unstable, offering zero benefit over always-up-to-date distros like Arch or the standard Debian.
Especially when you’re running a containerised environment, there’s just no reason to opt for anything other than a stable, boring base OS while your containers can be as bleeding edge, crazy, or even Ubuntu-based as you like.
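To make that concrete, here’s roughly what the split looks like in practice. A minimal sketch, assuming Docker is already installed on the Debian host (the image name is just an example):

```
# The host stays boring Debian Stable; anything fast-moving lives in containers.
# Run a current nginx from an Alpine-based image without touching host packages:
sudo docker run -d --name web -p 8080:80 --restart unless-stopped nginx:alpine

# Upgrading the service is a container concern, not a host-OS concern:
sudo docker pull nginx:alpine
sudo docker rm -f web
sudo docker run -d --name web -p 8080:80 --restart unless-stopped nginx:alpine
```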
Gotcha 👍 makes sense
I second this. I run fedora on my desktop and debian on the server. Docker works great on debian as well.
NixOS for my homelab that I like to tinker with, Debian as Docker host for the server people actually rely on
uCore spin of Fedora CoreOS:
https://github.com/ublue-os/ucore
- SELinux
- Supports secure boot
- Immutable root partition (can’t be tampered with)
- Rootless Podman (significantly more secure than Docker; quick sketch below)
- Everything runs in containers
- Smart and secure opinionated defaults
- Fedora base is very up-to-date, compared to something like Debian
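To illustrate the rootless Podman point: containers run under your unprivileged user, with no daemon and no root. A rough sketch (the image is just an example, and uCore/CoreOS specifics may differ):

```
# No sudo anywhere: the container runs entirely under your own UID
podman run -d --name web -p 8080:80 docker.io/library/nginx:alpine

# Processes inside the container map to your user on the host, not root
podman top web user huser

# Optional: let systemd (user session) keep it running across reboots
mkdir -p ~/.config/systemd/user
podman generate systemd --new --name web > ~/.config/systemd/user/web.service
podman rm -f web                 # hand the container over to systemd
systemctl --user daemon-reload
systemctl --user enable --now web.service
loginctl enable-linger "$USER"   # keep user services alive after logout (may need auth)
```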
My server is running headless Debian. I run what I can in a Docker container. My experience has been rock solid.
From what I understand Debian isn’t less secure due to the late updates. If anything it’s the opposite.
I have tons of experience with enterprise linux, so I tend to use Rocky linux. It’s similar to my Fedora daily driver, which is nice, and very close to the RHEL and Centos systems I used to own.
You are slightly mistaken with your assumption that Debian is insecure because of the old packages. Old packages are fine, and not inherently insecure because of their age. I only become concerned about the security implications of a package if it is dual-use/a LOLBin, known to be vulnerable, or has been out of support for some time. The older packages Debian uses, at least for things related to infrastructure and hosting, are typically the patched LTS releases of a project.
My big concerns for picking a distro for hosting services would be reliability, level of support, and familiarity.
A more reliable distro is less likely to crash or break itself. Enterprise Linux and Debian come to mind in this regard.
A distro that is well supported will mean quick access to security patches and more stable updates. It will have good, accurate documentation, and hopefully some good guides. Enterprise Linux, Debian, and Ubuntu have excellent support. Enterprise Linux distros have incredible documentation, and they are often similar enough that documentation for a different branch will work fine. Heck, I usually use RHEL docs when troubleshooting my Fedora install, since it’s close enough to get me to a point where the application docs will guide me through.
Familiarity is self explanatory. But it is important because you are more likely to accidentally compromise security in an unfamiliar environment, and it’s the driving force behind me sticking with enterprise linux over Nixos or a hardened OpenBSD.
As a fair word of warning, enterprise Linux will be pretty different compared to any desktop distro, even Fedora. It takes quite a bit of learning to get comfortable (especially with SELinux), but once you do, things will go smoothly.
You can also use a pirated RHEL certification guide to learn enterprise Linux. If anything, you can simply mess around in a local VM and try installing the tools and services needed before taking it to the cloud.
I just use Debian cause it’s rock solid, and most of what I set up is in containers or VMs anyways.
Debian has been rock solid for me.
It’s not insecure. Quite the contrary: Debian repositories only include packages that have been through extensive testing and have been found secure and stable. And of course it regularly introduces security updates.
It’s not insecure.
Here’s the inconvenient truth: the farther you are from the bleeding edge, the easier an OS is to secure, especially for enterprise use. Churn is lower, the targets move dramatically slower, and testing an install set (as a set) is markedly easier. It’s why enterprise Linux distros are ALL branched at a given version and only port security fixes in: if changing a package means restarting that extensive testing, you keep changes to security fixes and similarly drastic reasons.
So most ent-like distros aren’t insecure; not at all. Security is the goal and the reason they endure wave after yearly wave of people not understanding why they don’t surf that bleeding edge. They don’t get it.
Enterprise distros also offer a really stable platform to release stuff on; that was a mantra the sales team used for Open that we’d stress in ISV Engineering too, as we dealt with companies and people porting onto Open. But ISVs had their own inexperienced types for whom the idea of a stable platform that guaranteed a long life to their product with guaranteed compatibility wasn’t as valuable as “ooh shiny”. But that was the indirect benefit: market your Sybase or ProgressDb on the brand new release and once it’s working you don’t have to care about library rug-pulls or similar surprises for a fucking decade (or half that as you start the next wave onto the next distro release). And 5 years is a much better cadence than ‘every week’.
So while it’s easy to secure and support something that never moves, that’s also not feasible: you have to march forward. So ent distros stay a little back from the bleeding edge, market ‘RHL7’ or ‘OL31’ as a stable LTS distro, and try to get people onto it so they have a better time of it.
The trade-off is that devs now have to cope with libs and tools that are, on average, 5 years stale. For some, that’s not acceptable. And that’s always the challenge.
Debian backports security updates to most software, including popular server software. Stable also always uses an LTS kernel, which stays supported upstream. So long as you’re on the latest Debian Stable (Bookworm as of this writing), run `apt update && apt upgrade` regularly (in fact, `unattended-upgrades` is probably not the worst idea in this case), and follow common-sense security practices like a firewall and (brain is not working), you should be good.
In brief, it’s totally fine to use Debian and in fact one of the best options in my opinion.
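For what it’s worth, that whole “update regularly plus a firewall” baseline fits in a handful of commands. A sketch assuming Bookworm and ufw, with the ports adjusted to whatever you actually expose:

```
# Automatic security updates
sudo apt update && sudo apt install -y unattended-upgrades
sudo dpkg-reconfigure -plow unattended-upgrades   # enables the periodic update timer

# Basic firewall: deny inbound by default, allow only what you serve
sudo apt install -y ufw
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow OpenSSH        # or: sudo ufw allow 22/tcp
sudo ufw allow 80,443/tcp
sudo ufw enable
```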
I currently use Ubuntu for all my machines (desktops, laptops, and servers), but before that I used Void Linux for about 6 years, including on a couple of VPSes. Since you are familiar with Void Linux, you could stick with that and just use Docker/Podman for the individual services such as Matrix, Mastodon, etc.
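If you go that route, the Void side is pretty painless too. A rough sketch (package and service names follow Void’s usual conventions, so double-check against the current repos):

```
# Install Docker and enable its runit service (Void enables services by symlink)
sudo xbps-install -S docker
sudo ln -s /etc/sv/docker /var/service/

# Quick smoke test once the service is up
sudo docker run --rm hello-world

# Individual projects (Matrix, Mastodon, Nextcloud, ...) then run as containers,
# so the host packages barely matter
```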
In regards to Debian, while the packages are somewhat frozen, they do get security updates and backports by the Debian security team:
https://www.debian.org/security/
There is even an LTS version of Debian that will continue backporting security updates.
Good luck!
Debian with Docker containers works well for my needs.
Proxmox so I can run a bunch of other distros.