How do you backup?
-
I use an external drive for my important data, and if my system is borked (which has never happened to me) I just reinstall the OS.
External drives are more prone to damage and failure, both because they're more likely to be dropped, bumped, spilled on, etc., and because of generally cheaper construction compared to internal drives. In the case of SSDs the difference might be negligible, but I suggest you at least make a copy on another "cold" external drive if the data is actually important.
-
I recently implemented a backup workflow for myself. I rely heavily on restic for desktop backups and for a full system backup of my local server.
It works amazingly well: I always have a versioned backup without a lot of redundant data, and it is fast, encrypted, and compressed. But I wondered, how do you all do your backups? What software do you use? How often do you run them, and what workflow do you use?
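For reference, a minimal restic routine along these lines might look like the following; the repository path and password file location are placeholders, not a recommendation:

```shell
# Hypothetical repository location and passphrase file; adjust to your setup.
export RESTIC_REPOSITORY=/mnt/backup/restic-repo
export RESTIC_PASSWORD_FILE=~/.config/restic/password

# One-time: create the encrypted, deduplicated repository.
restic init

# Per run: back up home, skipping caches; unchanged data is deduplicated.
restic backup ~ --exclude ~/.cache

# Thin out old snapshots, keeping a rolling window, and reclaim space.
restic forget --keep-daily 7 --keep-weekly 4 --keep-monthly 12 --prune
```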
I use Duplicacy to back up to my TrueNAS server. Crucial data like documents is backed up a second time to my GDrive, also using Duplicacy. Sadly it's a paid solution, but it works great for me.
-
Borg daily to the local drive, then copied across to a USB drive, then weekly to cloud storage. The script is triggered by daily runs of topgrade before I do any updates.
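A sketch of that kind of daily Borg routine; the repository paths and retention numbers here are made up, and the actual script surely differs:

```shell
#!/bin/sh
# Hypothetical paths; local repo and USB mirror are placeholders.
LOCAL_REPO=/backups/borg
USB_REPO=/media/usb/borg

# Daily: create a dated archive of /home in the local repo.
borg create --stats --compression zstd "$LOCAL_REPO::home-{now:%Y-%m-%d}" /home

# Copy the repo across to the USB drive (a Borg repo is a plain directory).
rsync -a --delete "$LOCAL_REPO/" "$USB_REPO/"

# Prune old archives so the repo does not grow without bound.
borg prune --keep-daily 7 --keep-weekly 4 "$LOCAL_REPO"
```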
-
I created a script that I dropped into /etc/cron.hourly which does the following:

- Uses rsync to mirror my root partition to a btrfs partition on another hard drive (which only updates modified files).
- Uses btrfs subvolume snapshot to create a snapshot of that mirror (which only uses additional storage for modified files).
- Moves "old" snapshots into a trash directory so I can delete them later if I want to save space.

It is as follows:

```python
#!/usr/bin/env python
from datetime import datetime, timedelta
import os
import pathlib
import shutil
import subprocess
import sys

import portalocker

DATETIME_FORMAT = '%Y-%m-%d-%H%M'
BACKUP_DIRECTORY = pathlib.Path('/backups/internal')
MIRROR_DIRECTORY = BACKUP_DIRECTORY / 'mirror'
SNAPSHOT_DIRECTORY = BACKUP_DIRECTORY / 'snapshots'
TRASH_DIRECTORY = BACKUP_DIRECTORY / 'trash'

EXCLUDED = [
    '/backups',
    '/dev',
    '/media',
    '/lost+found',
    '/mnt',
    '/nix',
    '/proc',
    '/run',
    '/sys',
    '/tmp',
    '/var',
    '/home/*/.cache',
    '/home/*/.local/share/flatpak',
    '/home/*/.local/share/Trash',
    '/home/*/.steam',
    '/home/*/Downloads',
    '/home/*/Trash',
]

OPTIONS = [
    '-avAXH',
    '--delete',
    '--delete-excluded',
    '--numeric-ids',
    '--relative',
    '--progress',
]


def execute(command, *options):
    print('>', command, *options)
    subprocess.run((command,) + options).check_returncode()


execute('/usr/bin/mount', '-o', 'rw,remount', BACKUP_DIRECTORY)

try:
    with portalocker.Lock(os.path.join(BACKUP_DIRECTORY, 'lock')):
        execute(
            '/usr/bin/rsync', '/', MIRROR_DIRECTORY,
            *(OPTIONS + [f'--exclude={excluded_path}' for excluded_path in EXCLUDED]),
        )
        execute(
            '/usr/bin/btrfs', 'subvolume', 'snapshot', '-r',
            MIRROR_DIRECTORY,
            SNAPSHOT_DIRECTORY / datetime.now().strftime(DATETIME_FORMAT),
        )

        snapshot_datetimes = sorted(
            datetime.strptime(filename, DATETIME_FORMAT)
            for filename in os.listdir(SNAPSHOT_DIRECTORY)
        )

        # Keep the last 24 hours of snapshot_datetimes
        one_day_ago = datetime.now() - timedelta(days=1)
        while snapshot_datetimes and snapshot_datetimes[-1] >= one_day_ago:
            snapshot_datetimes.pop()

        # Helper function for selecting all of the snapshot_datetimes
        # for a given day/week/month
        def prune_all_with(get_metric):
            this = get_metric(snapshot_datetimes[-1])
            snapshot_datetimes.pop()
            while snapshot_datetimes and get_metric(snapshot_datetimes[-1]) == this:
                snapshot = SNAPSHOT_DIRECTORY / snapshot_datetimes[-1].strftime(DATETIME_FORMAT)
                snapshot_datetimes.pop()
                execute('/usr/bin/btrfs', 'property', 'set', '-ts', snapshot, 'ro', 'false')
                shutil.move(snapshot, TRASH_DIRECTORY)

        # Keep daily snapshot_datetimes for the last month
        last_daily_to_keep = datetime.now().date() - timedelta(days=30)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_daily_to_keep:
            prune_all_with(lambda x: x.date())

        # Keep weekly snapshot_datetimes for the last three months
        last_weekly_to_keep = datetime.now().date() - timedelta(days=90)
        while snapshot_datetimes and snapshot_datetimes[-1].date() >= last_weekly_to_keep:
            prune_all_with(lambda x: x.date().isocalendar().week)

        # Keep monthly snapshot_datetimes forever
        while snapshot_datetimes:
            prune_all_with(lambda x: x.date().month)
except portalocker.AlreadyLocked:
    sys.exit('Backup already in progress.')
finally:
    execute('/usr/bin/mount', '-o', 'ro,remount', BACKUP_DIRECTORY)
```
-
I use Pika Backup (a GUI that uses Borg Backup on the backend) to back up my desktop to my home server daily, then overnight that server makes a daily backup with Borg to a Hetzner Storage Box. It's easy to set it and forget it (other than maybe verifying the backups every once in a while), and having that off-site backup gives me peace of mind.
-
- Daily important stuff (job stuff, Documents folder, Renoise mods) is kept synced between laptop, desktop, and home server via Syncthing. A vimwiki additionally lives on the phone too. Sync happens only on the home network.
- The rest of the laptop and desktop I'll roll into a tar backup every now and then with a quick bash alias. The tar files also get synced onto the home server's big file system (a 2 TB SSD) via Syncthing.
- The clever thing is that the 2 TB SSD replaced an old 2 TB spinning disk. I kept the old disk and set up a systemd unit that keeps it spun down, but starts and mounts it once a week, rsyncs the changes from the SSD over, then unmounts it so it sleeps again for a week. With this frugal use, that old drive is likely to serve for years still.
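That weekly spin-up job could be sketched as a systemd timer/service pair; the unit names, mount points, and paths below are assumptions for illustration, not the poster's actual setup:

```ini
# /etc/systemd/system/cold-mirror.timer  (hypothetical unit name)
[Unit]
Description=Weekly rsync of the SSD onto the old spinning disk

[Timer]
OnCalendar=weekly
Persistent=true

[Install]
WantedBy=timers.target

# /etc/systemd/system/cold-mirror.service
[Unit]
Description=Mount the old disk, rsync changes over, unmount

[Service]
Type=oneshot
# Multiple ExecStart= lines run in order for Type=oneshot.
ExecStart=/usr/bin/mount /mnt/colddisk
ExecStart=/usr/bin/rsync -a --delete /mnt/bigssd/ /mnt/colddisk/
ExecStart=/usr/bin/umount /mnt/colddisk
```

Once unmounted, the drive idles and spins down again on its own (see the hdparm discussion below in the thread).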
How do you make sure the disk spins down? Is unmounting enough?
-
-
I recently bought a Storage Box from Hetzner and set up my server to run borgmatic every day to back up to it.
I've also discovered that Pika Backup works really well as a "read only" graphical browser for borg repos.
Do you use some kind of encryption on the VPS?
-
Do you use some kind of encryption on the VPS?
Yep, borgmatic encrypts it before it sends data to the server.
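For the curious: that client-side encryption is Borg's own. The repo is created with an encryption mode (e.g. repokey), and borgmatic just supplies the passphrase. A fragment in borgmatic's newer flat config format, with a placeholder Storage Box URL:

```yaml
# ~/.config/borgmatic/config.yaml (fragment; repository URL is a placeholder)
source_directories:
    - /home
    - /etc

repositories:
    # Hetzner Storage Box over SSH; data is encrypted locally before upload.
    - path: ssh://u123456@u123456.your-storagebox.de:23/./borg

# Passphrase for Borg's repokey/keyfile encryption modes.
encryption_passphrase: "example-passphrase"
```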
-
How do you make sure the disk spins down? Is unmounting enough?
Unmounting is enough if the disk has spindown configured (-S 70 tells the drive to spin down after 70 × 5 s = 350 s of idle time). I've got this in /etc/udev/rules.d/:
ACTION=="add", SUBSYSTEM=="block", KERNEL=="sd[a-z]", ENV{ID_SERIAL_SHORT}=="S2H7J9FZB02854", RUN+="/usr/bin/hdparm -S 70 /dev/%k"
-
Borg to a NAS, and that mirrored to Backblaze
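One way that mirroring step is commonly done is with rclone; the remote name, bucket, and repo path below are assumptions, not details from the post:

```shell
# Mirror the Borg repository directory from the NAS to Backblaze B2.
# "b2:" is an assumed rclone remote already configured for Backblaze;
# the bucket and paths are placeholders.
rclone sync /mnt/nas/borg-repo b2:my-backup-bucket/borg-repo --transfers 8
```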
-