Can you please share your backup strategies for Linux? I’m curious to know what tools you use and why. How do you automate/schedule backups? Which files/folders do you back up? What is your preferred hardware/cloud storage, and how do you manage storage space?

  • sntx@lemm.ee · 1 hour ago

    I’m using rustic, a lock-free, Rust-written drop-in replacement for restic. Like restic (and by extension rustic itself), it supports always-encrypted, deduplicated, compressed and easy backups without you needing to worry about whether to do a full or incremental backup.

    All my machines run hourly backups of all mounted partitions to an append-only repo at borgbase. I have a file with ignore-pattern globs to skip unwanted files and dirs (e.g. **/.cache).
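
    As a rough sketch, the glob file plus the backup invocation can look like this. I’m showing restic’s command-line syntax (rustic aims to be drop-in compatible); the repository URL, paths and patterns are placeholders:

    # contents of an ignore-glob file, e.g. ~/.config/backup/ignore.globs (one pattern per line):
    #   **/.cache
    #   **/node_modules

    # back up the filesystem, skipping everything matched by those globs
    # (assumes RESTIC_PASSWORD or a password file is set up for the repo)
    restic -r sftp:user@backups.example.org:/repo backup / \
        --exclude-file ~/.config/backup/ignore.globs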

    While I think borgbase is OK, they’re just using Hetzner storage boxes in the background, which are cheaper if you use them directly. I’m thinking of migrating my backups to a handful of homelabs run by trusted friends and family instead.

    The backups have a randomized delay of 5m and typically take about 8-9s each (unless big new files need to be uploaded). They are triggered by persistent systemd-timers.
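
    For reference, a persistent hourly timer with a randomized delay like that looks roughly like the units below; the unit names and the backup command are placeholders:

    # /etc/systemd/system/backup.service (hypothetical name)
    [Unit]
    Description=Hourly backup

    [Service]
    Type=oneshot
    ExecStart=/usr/local/bin/run-backup.sh

    # /etc/systemd/system/backup.timer (hypothetical name)
    [Unit]
    Description=Run backup hourly

    [Timer]
    OnCalendar=hourly
    RandomizedDelaySec=5m
    Persistent=true

    [Install]
    WantedBy=timers.target

    Enabling it is then a matter of systemctl enable --now backup.timer.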

    The backups have been running across my laptop, PC and server for about 6 months now, and I’m at ~380 GiB of storage usage in total.

    I’ve mounted backup snapshots on multiple occasions already to either get an old version of a file, or restore it entirely.

    There is a tool called redu which is like ncdu but works on restic/rustic repos. This makes it easy to identify which files blow up your backup size.

  • _spiffy@lemmy.ca · 3 hours ago

    Dump configs to backup drive. Pray to the machine spirit that things don’t blow up. Only update when I remember. I’m a terrible admin for my own stuff.

  • fireshell@lemmy.ml · 4 hours ago

    Example of a Bash script that performs the following tasks:

    1. Checks the availability of an important web server.
    2. Checks disk space usage.
    3. Makes a backup of the specified directories.
    4. Sends a report to the administrator’s email.

    Example script:

    #!/bin/bash

    # Settings
    WEB_SERVER="https://example.com"
    BACKUP_DIR="/backup"
    TARGET_DIRS="/var/www /etc"
    DISK_USAGE_THRESHOLD=90
    ADMIN_EMAIL="admin@example.com"
    DATE=$(date +"%Y-%m-%d")
    BACKUP_FILE="$BACKUP_DIR/backup-$DATE.tar.gz"

    # Check web server availability by comparing the HTTP status code
    # (more reliable than grepping for "200 OK", which HTTP/2 responses omit)
    echo "Checking web server availability..."
    HTTP_STATUS=$(curl -s -o /dev/null -w "%{http_code}" --head "$WEB_SERVER")
    if [ "$HTTP_STATUS" = "200" ]; then
        echo "Web server is available."
    else
        echo "Warning: Web server is unavailable (HTTP status: $HTTP_STATUS)!" | mail -s "Problem with web server" "$ADMIN_EMAIL"
    fi

    # Check disk space usage of the root filesystem
    echo "Checking disk space..."
    DISK_USAGE=$(df / | awk 'NR==2 { sub(/%/, "", $5); print $5 }')
    if [ "$DISK_USAGE" -gt "$DISK_USAGE_THRESHOLD" ]; then
        echo "Warning: Disk space usage exceeded $DISK_USAGE_THRESHOLD% (currently $DISK_USAGE%)!" | mail -s "Problem with disk space" "$ADMIN_EMAIL"
    else
        echo "There is enough disk space."
    fi

    # Create the backup archive ($TARGET_DIRS is intentionally unquoted so each
    # directory is passed to tar as a separate argument)
    echo "Creating backup..."
    mkdir -p "$BACKUP_DIR"
    if tar -czf "$BACKUP_FILE" $TARGET_DIRS; then
        echo "Backup created successfully: $BACKUP_FILE"
    else
        echo "Error creating backup!" | mail -s "Error creating backup" "$ADMIN_EMAIL"
    fi

    # Send the report
    echo "Sending report to $ADMIN_EMAIL..."
    REPORT="Report for $DATE\n\n"
    REPORT+="Web server status: $HTTP_STATUS\n"
    REPORT+="Disk space usage: $DISK_USAGE%\n"
    REPORT+="Backup location: $BACKUP_FILE\n"

    echo -e "$REPORT" | mail -s "Daily system report" "$ADMIN_EMAIL"

    echo "Done."

    Description:

    1. Check web server: uses curl to verify that the site responds with HTTP 200.
    2. Check disk space: uses df and awk to check disk usage. If the threshold (90%) is exceeded, a notification is sent.
    3. Create a backup: tar archives and compresses the directories specified in the TARGET_DIRS variable.
    4. Send a report: a summary of all operations is sent to the administrator’s email using mail.

    How to use:

    1. Set the desired parameters, such as the web server address, directories for backup, disk usage threshold and email.
    2. Make the script executable:
    chmod +x /path/to/your/script.sh
    
    3. Add the script to cron to run on a regular basis:
    crontab -e
    

    Example to run every day at 00:00:

    0 0 * * * /path/to/your/script.sh
    
  • fmstrat@lemmy.nowsci.com · 5 hours ago

    All important files go in /data.

    /data is ZFS, snapshotted and sent to a NAS regularly.
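
    Roughly, the snapshot-and-send part can be sketched like this (pool, dataset, snapshot and host names are placeholders):

    # take a new snapshot of the data dataset
    zfs snapshot tank/data@2024-06-01

    # send it incrementally, relative to the last snapshot the NAS already has
    zfs send -i tank/data@2024-05-31 tank/data@2024-06-01 | ssh nas zfs receive backup/data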

    Every time I change a setting, it gets added to a dconf script. Every time I install software, I write a script.
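
    One way to capture and restore desktop settings wholesale (not necessarily the incremental script described above) is dconf’s dump/load pair; the file path is a placeholder:

    # save the current desktop settings
    dconf dump / > ~/dotfiles/settings.dconf

    # restore them on a fresh machine
    dconf load / < ~/dotfiles/settings.dconf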

    Dotfiles git repo for home directory.

    With that, I can spin up a fresh machine in minutes with scripts.

  • vortexal@lemmy.ml · 8 hours ago

    The only thing I use as a backup is a Live CD that’s mounted to a USB thumb drive.

    I used to use Timeshift but the one time I needed it, it didn’t work for some reason. It also had a problem of making my PC temporarily unusable while it was making a backup, so I didn’t enable it when I had to reinstall Linux Mint.

    • Teppichbrand@feddit.org · 2 hours ago

      Same, Timeshift let me down the one time I needed it. I still use it, though, and I’m afraid to upgrade Mint because I don’t want to set up my system again if the upgrade fails to keep my configuration and Timeshift fails to take me back.

  • gerdesj@lemmy.ml · 6 hours ago

    You have loads of options, but you also need to start from … “what if”. Work out how important your data really is. Take another look and ask the kids and others if they give a toss. You might find that no one cares about your photo collection, in which case if your phone dies … who cares? If you do care, then sync them to a PC or laptop.

    Perhaps take a look at this - https://www.veeam.com/products/free/linux.html - it’s free for a few systems.

  • shapis@lemmy.ml · 12 hours ago

    All my code and projects are on GitHub/codeberg.

    All my personal info and photos are on proton drive.

    If Linux shits itself (and it does often) who cares. I can have it up and running again in a fresh install in ten minutes.

  • somenonewho@feddit.org · 10 hours ago

    My dotfiles are in git (using stow to recreate them), and my documents folder syncs to Nextcloud (self-hosted), which also syncs to my laptop. This is of course not a “backup” per se, more “multiple copies”, but it gets the job done and also fits my workflow. To be fully happy with that, I want to set up an offsite backup of data from my server to a NAS at my parents’ place, but right now that’s just a to-do I haven’t put any work into yet ;)
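
    As a rough sketch, recreating the dotfiles with stow looks something like this (repo URL and package names are placeholders):

    # clone the dotfiles repo and symlink the selected packages into $HOME
    git clone https://codeberg.org/someuser/dotfiles ~/dotfiles
    cd ~/dotfiles
    stow --target="$HOME" zsh git nvim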